Name Theory_2922-4848620-688_0
Workunit 239591146
Created 5 Mar 2026, 3:23:20 UTC
Sent 5 Mar 2026, 14:20:46 UTC
Report deadline 16 Mar 2026, 14:20:46 UTC
Received 9 Mar 2026, 19:12:56 UTC
Server state Over
Outcome Success
Client state Done
Exit status 0 (0x00000000)
Computer ID 10939727
Run time 57 min 5 sec
CPU time 7 min 34 sec
Priority 0
Validate state Valid
Credit 63.22
Device peak FLOPS 7.97 GFLOPS
Application version Theory Simulation v302.10 (docker)
windows_x86_64
Peak working set size 13.77 MB
Peak swap size 2.86 MB
Peak disk usage 1.81 MB

Stderr output

<core_client_version>8.2.8</core_client_version>
<![CDATA[
<stderr_txt>
8d5f33bf5e8e  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688:latest  /bin/sh -c ./entr...  4 days ago  Up 48 minutes  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
42.85% 329.5MB / 16.63GB
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS         PORTS       NAMES
8d5f33bf5e8e  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688:latest  /bin/sh -c ./entr...  4 days ago  Up 48 minutes  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
42.92% 329.1MB / 16.63GB
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS         PORTS       NAMES
8d5f33bf5e8e  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688:latest  /bin/sh -c ./entr...  4 days ago  Up 48 minutes  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
42.99% 328.7MB / 16.63GB
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS         PORTS       NAMES
8d5f33bf5e8e  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688:latest  /bin/sh -c ./entr...  4 days ago  Up 48 minutes  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
43.06% 328.9MB / 16.63GB
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS      PORTS       NAMES
8d5f33bf5e8e  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688:latest  /bin/sh -c ./entr...  4 days ago  Paused      80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS      PORTS       NAMES
8d5f33bf5e8e  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688:latest  /bin/sh -c ./entr...  4 days ago  Paused      80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS      PORTS       NAMES
8d5f33bf5e8e  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688:latest  /bin/sh -c ./entr...  4 days ago  Paused      80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS      PORTS       NAMES
8d5f33bf5e8e  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688:latest  /bin/sh -c ./entr...  4 days ago  Paused      80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS      PORTS       NAMES
8d5f33bf5e8e  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688:latest  /bin/sh -c ./entr...  4 days ago  Paused      80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS      PORTS       NAMES
8d5f33bf5e8e  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688:latest  /bin/sh -c ./entr...  4 days ago  Paused      80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS      PORTS       NAMES
8d5f33bf5e8e  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688:latest  /bin/sh -c ./entr...  4 days ago  Paused      80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS      PORTS       NAMES
8d5f33bf5e8e  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688:latest  /bin/sh -c ./entr...  4 days ago  Paused      80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS      PORTS       NAMES
8d5f33bf5e8e  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688:latest  /bin/sh -c ./entr...  4 days ago  Paused      80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS         PORTS       NAMES
8d5f33bf5e8e  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688:latest  /bin/sh -c ./entr...  4 days ago  Up 52 minutes  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
44.07% 327.8MB / 16.63GB
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS      PORTS       NAMES
8d5f33bf5e8e  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688:latest  /bin/sh -c ./entr...  4 days ago  Paused      80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS            PORTS       NAMES
8d5f33bf5e8e  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688:latest  /bin/sh -c ./entr...  4 days ago  Up About an hour  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
46.95% 329.1MB / 16.63GB
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
docker_wrapper 18 starting
docker_wrapper config:
   workdir: /boinc_slot_dir
   use GPU: no
   create args: --cap-add=SYS_ADMIN --device /dev/fuse
   verbose: 1
Using WSL distro boinc-buda-runner
Using podman
running docker command: ps --all --filter "name=^boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0$" --format "{{.Names}}|{{.Status}}"
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0|Paused
EOM
container state: Paused

container is paused; unpausing
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS      PORTS       NAMES
8d5f33bf5e8e  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688:latest  /bin/sh -c ./entr...  4 days ago  Up 2 hours  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
35.67% 328.9MB / 16.63GB
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS                    PORTS       NAMES
8d5f33bf5e8e  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688:latest  /bin/sh -c ./entr...  4 days ago  Exited (0) 3 seconds ago  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: logs boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1ral-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1ral-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
===> [runRivet] Mon Mar  9 17:26:26 UTC 2026 [boinc pp jets 8000 150 - pythia8 8.230 tune-4cx 100000 688]
job: run exitcode=0
job: diskusage=9804
job: logsize=72 k
job: times=
0m0.000s 0m0.012s
37m38.611s 0m42.397s
job: cpuusage=2301
Job Finished
Filesystem      Used Use% Mounted on
cvmfs2          203K   1% /cvmfs/cvmfs-config.cern.ch
cvmfs2          172M   5% /cvmfs/alice.cern.ch
cvmfs2          121M   4% /cvmfs/grid.cern.ch
cvmfs2          724M  19% /cvmfs/sft.cern.ch
total          1016M   7% -
boinc_shutdown called with exit code 0
sd_delay: 0

EOM
stderr from container:
EOM
stderr end
running docker command: container rm boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688_0
EOM
running docker command: image rm boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688
program: podman
command output:
Untagged: localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4848620-688:latest
EOM
2026-03-09 20:13:52 (21180): called boinc_finish(0)

</stderr_txt>
]]>


©2026 CERN