Name Theory_2922-4885079-680_0
Workunit 239318375
Created 22 Feb 2026, 17:21:45 UTC
Sent 23 Feb 2026, 2:38:27 UTC
Report deadline 6 Mar 2026, 2:38:27 UTC
Received 28 Feb 2026, 22:55:14 UTC
Server state Over
Outcome Success
Client state Done
Exit status 0 (0x00000000)
Computer ID 10824905
Run time 19 hours 41 min 19 sec
CPU time 18 hours 33 min 40 sec
Priority 0
Validate state Valid
Credit 832.50
Device peak FLOPS 5.07 GFLOPS
Application version Theory Simulation v302.10 (docker)
windows_x86_64
Peak working set size 13.41 MB
Peak swap size 2.51 MB
Peak disk usage 7.06 MB

Stderr output

<core_client_version>8.2.4</core_client_version>
<![CDATA[
<stderr_txt>
home.cern.ch_lhcathome__theory_2922-4885079-680_0
program: podman
command output:
time="2026-02-28T21:48:40Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/memory.max: no such file or directory"
time="2026-02-28T21:48:40Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/pids.current: no such file or directory"
424.09% 0B / 33.31GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS      PORTS       NAMES
d3e9dc30be05  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680:latest  /bin/sh -c ./entr...  5 days ago  Up 6 hours  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
EOM
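The cycle above shows the wrapper polling `podman stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}"` and rejecting the reading ("invalid usage stats; using defaults") when memory is reported as `0B` — which happens here because cgroup files such as `/sys/fs/cgroup/memory.max` cannot be opened. The following is a hypothetical sketch of that validity check, not the actual BOINC wrapper code; the function name and unit table are assumptions for illustration:

```python
import re

# Multipliers for the unit suffixes podman/docker stats may emit.
_UNITS = {"B": 1, "kB": 1e3, "KB": 1e3, "KiB": 1024,
          "MB": 1e6, "MiB": 1024**2, "GB": 1e9, "GiB": 1024**3}

def parse_stats_line(line):
    """Parse a '<cpu>% <used><unit> / <limit><unit>' stats line.

    Returns (cpu_fraction, mem_bytes), or None when the reading is
    unusable (unparseable, or memory reported as 0B because cgroup
    stats were unavailable) -- the caller would then log
    "invalid usage stats; using defaults".
    """
    m = re.match(r"([\d.]+)%\s+([\d.]+)([A-Za-z]+)\s*/\s*[\d.]+[A-Za-z]+",
                 line.strip())
    if not m:
        return None
    cpu = float(m.group(1)) / 100.0                  # 424.09% -> 4.2409 cores
    mem = float(m.group(2)) * _UNITS.get(m.group(3), 0)
    if mem <= 0:                                     # "0B": cgroup stats missing
        return None
    return cpu, mem

print(parse_stats_line("424.09% 0B / 33.31GB"))      # invalid -> None
print(parse_stats_line("424.09% 1.2GiB / 33.31GB"))  # valid reading
```

On this host every poll returned `0B` memory, so the wrapper fell back to defaults for the whole run; the task still validated, since the defaults only affect reported usage, not the computation.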
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
program: podman
command output:
time="2026-02-28T21:48:50Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/memory.max: no such file or directory"
time="2026-02-28T21:48:50Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/pids.current: no such file or directory"
424.04% 0B / 33.31GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS      PORTS       NAMES
d3e9dc30be05  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680:latest  /bin/sh -c ./entr...  5 days ago  Up 6 hours  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
program: podman
command output:
time="2026-02-28T21:49:01Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/memory.max: no such file or directory"
time="2026-02-28T21:49:01Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/pids.current: no such file or directory"
424.00% 0B / 33.31GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS      PORTS       NAMES
d3e9dc30be05  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680:latest  /bin/sh -c ./entr...  5 days ago  Up 6 hours  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
program: podman
command output:
423.95% 0B / 33.31GB
time="2026-02-28T21:49:12Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/pids.current: no such file or directory"
time="2026-02-28T21:49:12Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/memory.max: no such file or directory"
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS      PORTS       NAMES
d3e9dc30be05  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680:latest  /bin/sh -c ./entr...  5 days ago  Up 6 hours  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
program: podman
command output:
time="2026-02-28T21:49:22Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/memory.max: no such file or directory"
time="2026-02-28T21:49:22Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/pids.current: no such file or directory"
423.91% 0B / 33.31GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS      PORTS       NAMES
d3e9dc30be05  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680:latest  /bin/sh -c ./entr...  5 days ago  Up 6 hours  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
program: podman
command output:
time="2026-02-28T21:49:33Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/memory.max: no such file or directory"
time="2026-02-28T21:49:33Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/pids.current: no such file or directory"
423.87% 0B / 33.31GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS      PORTS       NAMES
d3e9dc30be05  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680:latest  /bin/sh -c ./entr...  5 days ago  Up 6 hours  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
program: podman
command output:
time="2026-02-28T21:49:43Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/memory.max: no such file or directory"
time="2026-02-28T21:49:43Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/pids.current: no such file or directory"
423.82% 0B / 33.31GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS      PORTS       NAMES
d3e9dc30be05  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680:latest  /bin/sh -c ./entr...  5 days ago  Up 6 hours  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
program: podman
command output:
time="2026-02-28T21:49:54Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/memory.max: no such file or directory"
time="2026-02-28T21:49:54Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/pids.current: no such file or directory"
423.78% 0B / 33.31GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS      PORTS       NAMES
d3e9dc30be05  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680:latest  /bin/sh -c ./entr...  5 days ago  Up 6 hours  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
program: podman
command output:
time="2026-02-28T21:50:04Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/memory.max: no such file or directory"
time="2026-02-28T21:50:04Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/pids.current: no such file or directory"
423.74% 0B / 33.31GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS      PORTS       NAMES
d3e9dc30be05  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680:latest  /bin/sh -c ./entr...  5 days ago  Up 6 hours  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
program: podman
command output:
423.69% 0B / 33.31GB
time="2026-02-28T21:50:15Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/memory.max: no such file or directory"
time="2026-02-28T21:50:15Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/pids.current: no such file or directory"
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS      PORTS       NAMES
d3e9dc30be05  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680:latest  /bin/sh -c ./entr...  5 days ago  Up 6 hours  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
program: podman
command output:
time="2026-02-28T21:50:25Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/memory.max: no such file or directory"
time="2026-02-28T21:50:25Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/pids.current: no such file or directory"
423.65% 0B / 33.31GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS      PORTS       NAMES
d3e9dc30be05  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680:latest  /bin/sh -c ./entr...  5 days ago  Up 6 hours  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
program: podman
command output:
time="2026-02-28T21:50:36Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/memory.max: no such file or directory"
time="2026-02-28T21:50:36Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/pids.current: no such file or directory"
423.61% 0B / 33.31GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS      PORTS       NAMES
d3e9dc30be05  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680:latest  /bin/sh -c ./entr...  5 days ago  Up 6 hours  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
program: podman
command output:
time="2026-02-28T21:50:46Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/memory.max: no such file or directory"
time="2026-02-28T21:50:46Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/pids.current: no such file or directory"
423.56% 0B / 33.31GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS      PORTS       NAMES
d3e9dc30be05  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680:latest  /bin/sh -c ./entr...  5 days ago  Up 6 hours  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
program: podman
command output:
time="2026-02-28T21:50:57Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/memory.max: no such file or directory"
time="2026-02-28T21:50:57Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/pids.current: no such file or directory"
423.52% 0B / 33.31GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS      PORTS       NAMES
d3e9dc30be05  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680:latest  /bin/sh -c ./entr...  5 days ago  Up 6 hours  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
program: podman
command output:
time="2026-02-28T21:51:07Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/pids.current: no such file or directory"
time="2026-02-28T21:51:07Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/memory.max: no such file or directory"
423.48% 0B / 33.31GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS      PORTS       NAMES
d3e9dc30be05  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680:latest  /bin/sh -c ./entr...  5 days ago  Up 6 hours  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
program: podman
command output:
423.43% 0B / 33.31GB
time="2026-02-28T21:51:18Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/memory.max: no such file or directory"
time="2026-02-28T21:51:18Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/pids.current: no such file or directory"
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS      PORTS       NAMES
d3e9dc30be05  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680:latest  /bin/sh -c ./entr...  5 days ago  Up 6 hours  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
program: podman
command output:
time="2026-02-28T21:51:29Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/memory.max: no such file or directory"
time="2026-02-28T21:51:29Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/pids.current: no such file or directory"
423.39% 0B / 33.31GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS      PORTS       NAMES
d3e9dc30be05  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680:latest  /bin/sh -c ./entr...  5 days ago  Up 6 hours  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
program: podman
command output:
time="2026-02-28T21:51:39Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/memory.max: no such file or directory"
time="2026-02-28T21:51:39Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/pids.current: no such file or directory"
423.34% 0B / 33.31GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS      PORTS       NAMES
d3e9dc30be05  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680:latest  /bin/sh -c ./entr...  5 days ago  Up 6 hours  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
program: podman
command output:
time="2026-02-28T21:51:50Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/memory.max: no such file or directory"
time="2026-02-28T21:51:50Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/pids.current: no such file or directory"
423.30% 0B / 33.31GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS      PORTS       NAMES
d3e9dc30be05  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680:latest  /bin/sh -c ./entr...  5 days ago  Up 6 hours  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
program: podman
command output:
time="2026-02-28T21:52:00Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/pids.current: no such file or directory"
time="2026-02-28T21:52:00Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/memory.max: no such file or directory"
423.26% 0B / 33.31GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS      PORTS       NAMES
d3e9dc30be05  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680:latest  /bin/sh -c ./entr...  5 days ago  Up 6 hours  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
program: podman
command output:
time="2026-02-28T21:52:11Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/memory.max: no such file or directory"
time="2026-02-28T21:52:11Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/pids.current: no such file or directory"
423.21% 0B / 33.31GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS      PORTS       NAMES
d3e9dc30be05  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680:latest  /bin/sh -c ./entr...  5 days ago  Up 6 hours  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
program: podman
command output:
time="2026-02-28T21:52:21Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/memory.max: no such file or directory"
time="2026-02-28T21:52:21Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/pids.current: no such file or directory"
423.17% 0B / 33.31GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS      PORTS       NAMES
d3e9dc30be05  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680:latest  /bin/sh -c ./entr...  5 days ago  Up 6 hours  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
program: podman
command output:
time="2026-02-28T21:52:32Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/memory.max: no such file or directory"
time="2026-02-28T21:52:32Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/pids.current: no such file or directory"
423.13% 0B / 33.31GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS      PORTS       NAMES
d3e9dc30be05  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680:latest  /bin/sh -c ./entr...  5 days ago  Up 6 hours  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
program: podman
command output:
time="2026-02-28T21:52:42Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/memory.max: no such file or directory"
time="2026-02-28T21:52:42Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/pids.current: no such file or directory"
423.08% 0B / 33.31GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS      PORTS       NAMES
d3e9dc30be05  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680:latest  /bin/sh -c ./entr...  5 days ago  Up 6 hours  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
program: podman
command output:
time="2026-02-28T21:52:53Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/memory.max: no such file or directory"
time="2026-02-28T21:52:53Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/pids.current: no such file or directory"
423.04% 0B / 33.31GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS      PORTS       NAMES
d3e9dc30be05  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680:latest  /bin/sh -c ./entr...  5 days ago  Up 6 hours  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
program: podman
command output:
422.99% 0B / 33.31GB
time="2026-02-28T21:53:03Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/memory.max: no such file or directory"
time="2026-02-28T21:53:03Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/pids.current: no such file or directory"
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS      PORTS       NAMES
d3e9dc30be05  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680:latest  /bin/sh -c ./entr...  5 days ago  Up 6 hours  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
program: podman
command output:
time="2026-02-28T21:53:14Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/memory.max: no such file or directory"
time="2026-02-28T21:53:14Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/pids.current: no such file or directory"
422.95% 0B / 33.31GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS      PORTS       NAMES
d3e9dc30be05  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680:latest  /bin/sh -c ./entr...  5 days ago  Up 6 hours  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
program: podman
command output:
time="2026-02-28T21:53:24Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/memory.max: no such file or directory"
time="2026-02-28T21:53:24Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/pids.current: no such file or directory"
422.91% 0B / 33.31GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS      PORTS       NAMES
d3e9dc30be05  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680:latest  /bin/sh -c ./entr...  5 days ago  Up 6 hours  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
program: podman
command output:
time="2026-02-28T21:53:35Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/memory.max: no such file or directory"
time="2026-02-28T21:53:35Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/pids.current: no such file or directory"
422.86% 0B / 33.31GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS      PORTS       NAMES
d3e9dc30be05  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680:latest  /bin/sh -c ./entr...  5 days ago  Up 6 hours  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
program: podman
command output:
time="2026-02-28T21:53:46Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/memory.max: no such file or directory"
time="2026-02-28T21:53:46Z" level=warning msg="Failed to retrieve cgroup stats: open /sys/fs/cgroup/pids.current: no such file or directory"
422.82% 0B / 33.31GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS                    PORTS       NAMES
d3e9dc30be05  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680:latest  /bin/sh -c ./entr...  5 days ago  Exited (0) 4 seconds ago  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
EOM
running docker command: logs boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
program: podman
command output:
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1ral-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1ral-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
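Acting on the hint above usually means running a small caching proxy on the LAN so that concurrent containers share one CVMFS cache. A sketch of a minimal Squid configuration for this purpose (the network ranges, port, and cache size here are assumptions; the forum threads linked above carry the project's recommended settings):

```
# /etc/squid/squid.conf -- hypothetical minimal caching proxy for CVMFS
# Allow only hosts on the local network to use the proxy:
acl localnet src 127.0.0.1/32 10.0.0.0/8 172.16.0.0/12 192.168.0.0/16
http_access allow localnet
http_access deny all
http_port 3128
# CVMFS objects can be sizeable; raise the cacheable object limit:
maximum_object_size 1 GB
# On-disk cache: 10 GB under /var/spool/squid
cache_dir ufs /var/spool/squid 10000 16 256
```

With the proxy running, point the BOINC client's HTTP proxy setting at port 3128 of that host; the wrapper then passes it through to the container and CVMFS, and the "PROXY DIRECT" line above becomes the proxy address.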
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
===> [runRivet] Sat Feb 28 15:42:32 UTC 2026 [boinc pp z1j 13000 150 - pythia8 8.244 tune-A14-NNPDF2.3LO 100000 680]

Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                            PROXY
2.13.3.0  http://s1cern-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1cern-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
job: run exitcode=0
job: diskusage=9860
job: logsize=68 k
job: times=
0m0.017s 0m0.000s
376m54.255s 11m19.621s
job: cpuusage=23294
Job Finished
Filesystem      Used Use% Mounted on
cvmfs2          398K   1% /cvmfs/cvmfs-config.cern.ch
cvmfs2          107M   3% /cvmfs/alice.cern.ch
cvmfs2           77M   2% /cvmfs/grid.cern.ch
cvmfs2          774M  20% /cvmfs/sft.cern.ch
total           957M   6% -
boinc_shutdown called with exit code 0
sd_delay: 0

EOM
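The `job: cpuusage=23294` figure is consistent with the `job: times=` lines above: it is the user plus system CPU time, rounded to whole seconds. A quick check:

```python
# Consistency check for the job summary above:
# user time   376m54.255s and system time 11m19.621s,
# converted to seconds, should sum to cpuusage=23294.
user = 376 * 60 + 54.255
system = 11 * 60 + 19.621
print(round(user + system))   # -> 23294
```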
running docker command: container rm boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680_0
EOM
running docker command: image rm boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680
program: podman
command output:
Untagged: localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4885079-680:latest
EOM
2026-02-28 22:55:11 (13792): called boinc_finish(0)

</stderr_txt>
]]>


©2026 CERN