Name Theory_2922-4835004-677_0
Workunit 239302824
Created 21 Feb 2026, 12:02:18 UTC
Sent 21 Feb 2026, 22:57:09 UTC
Report deadline 4 Mar 2026, 22:57:09 UTC
Received 9 Mar 2026, 17:46:08 UTC
Server state Over
Outcome Success
Client state Done
Exit status 0 (0x00000000)
Computer ID 10998488
Run time 10 hours 4 min 18 sec
CPU time 8 hours 52 min 58 sec
Priority 0
Validate state Task was reported too late to validate
Credit 0.00
Device peak FLOPS 6.67 GFLOPS
Application version Theory Simulation v302.10 (docker)
windows_x86_64
Peak working set size 12.66 MB
Peak swap size 6.64 MB
Peak disk usage 3.48 MB

Stderr output

<core_client_version>8.2.8</core_client_version>
<![CDATA[
<stderr_txt>
 80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
program: podman
command output:
123.54% 344.2MB / 16.73GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED      STATUS            PORTS       NAMES
d1417d07195f  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677:latest  /bin/sh -c ./entr...  11 days ago  Up About an hour  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
program: podman
command output:
123.55% 344.4MB / 16.73GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED      STATUS            PORTS       NAMES
d1417d07195f  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677:latest  /bin/sh -c ./entr...  11 days ago  Up About an hour  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
program: podman
command output:
123.55% 343.9MB / 16.73GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED      STATUS            PORTS       NAMES
d1417d07195f  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677:latest  /bin/sh -c ./entr...  11 days ago  Up About an hour  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
program: podman
command output:
123.55% 344.2MB / 16.73GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED      STATUS            PORTS       NAMES
d1417d07195f  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677:latest  /bin/sh -c ./entr...  11 days ago  Up About an hour  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
program: podman
command output:
123.55% 344.4MB / 16.73GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED      STATUS            PORTS       NAMES
d1417d07195f  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677:latest  /bin/sh -c ./entr...  11 days ago  Up About an hour  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
program: podman
command output:
123.55% 344.1MB / 16.73GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED      STATUS            PORTS       NAMES
d1417d07195f  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677:latest  /bin/sh -c ./entr...  11 days ago  Up About an hour  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
program: podman
command output:
123.55% 344.4MB / 16.73GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED      STATUS            PORTS       NAMES
d1417d07195f  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677:latest  /bin/sh -c ./entr...  11 days ago  Up About an hour  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
program: podman
command output:
123.56% 344.1MB / 16.73GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED      STATUS            PORTS       NAMES
d1417d07195f  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677:latest  /bin/sh -c ./entr...  11 days ago  Up About an hour  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
program: podman
command output:
123.57% 344.4MB / 16.73GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED      STATUS            PORTS       NAMES
d1417d07195f  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677:latest  /bin/sh -c ./entr...  11 days ago  Up About an hour  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
program: podman
command output:
123.59% 343.7MB / 16.73GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED      STATUS            PORTS       NAMES
d1417d07195f  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677:latest  /bin/sh -c ./entr...  11 days ago  Up About an hour  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
program: podman
command output:
123.67% 344.1MB / 16.73GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED      STATUS            PORTS       NAMES
d1417d07195f  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677:latest  /bin/sh -c ./entr...  11 days ago  Up About an hour  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
program: podman
command output:
123.67% 344.1MB / 16.73GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED      STATUS            PORTS       NAMES
d1417d07195f  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677:latest  /bin/sh -c ./entr...  11 days ago  Up About an hour  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
program: podman
command output:
123.67% 343.7MB / 16.73GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED      STATUS            PORTS       NAMES
d1417d07195f  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677:latest  /bin/sh -c ./entr...  11 days ago  Up About an hour  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
program: podman
command output:
123.68% 344.1MB / 16.73GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED      STATUS            PORTS       NAMES
d1417d07195f  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677:latest  /bin/sh -c ./entr...  11 days ago  Up About an hour  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
program: podman
command output:
123.68% 344MB / 16.73GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED      STATUS            PORTS       NAMES
d1417d07195f  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677:latest  /bin/sh -c ./entr...  11 days ago  Up About an hour  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
program: podman
command output:
123.69% 344MB / 16.73GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED      STATUS            PORTS       NAMES
d1417d07195f  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677:latest  /bin/sh -c ./entr...  11 days ago  Up About an hour  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
program: podman
command output:
123.69% 344.3MB / 16.73GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED      STATUS            PORTS       NAMES
d1417d07195f  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677:latest  /bin/sh -c ./entr...  11 days ago  Up About an hour  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
program: podman
command output:
123.70% 344MB / 16.73GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED      STATUS            PORTS       NAMES
d1417d07195f  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677:latest  /bin/sh -c ./entr...  11 days ago  Up About an hour  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
program: podman
command output:
123.70% 344.1MB / 16.73GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED      STATUS            PORTS       NAMES
d1417d07195f  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677:latest  /bin/sh -c ./entr...  11 days ago  Up About an hour  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
program: podman
command output:
123.70% 343.8MB / 16.73GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED      STATUS            PORTS       NAMES
d1417d07195f  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677:latest  /bin/sh -c ./entr...  11 days ago  Up About an hour  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
program: podman
command output:
123.70% 344.1MB / 16.73GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED      STATUS            PORTS       NAMES
d1417d07195f  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677:latest  /bin/sh -c ./entr...  11 days ago  Up About an hour  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
program: podman
command output:
123.69% 344.4MB / 16.73GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED      STATUS            PORTS       NAMES
d1417d07195f  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677:latest  /bin/sh -c ./entr...  11 days ago  Up About an hour  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
program: podman
command output:
123.70% 344.9MB / 16.73GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED      STATUS            PORTS       NAMES
d1417d07195f  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677:latest  /bin/sh -c ./entr...  11 days ago  Up About an hour  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
program: podman
command output:
123.78% 344.7MB / 16.73GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED      STATUS            PORTS       NAMES
d1417d07195f  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677:latest  /bin/sh -c ./entr...  11 days ago  Up About an hour  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
program: podman
command output:
123.80% 344.2MB / 16.73GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED      STATUS            PORTS       NAMES
d1417d07195f  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677:latest  /bin/sh -c ./entr...  11 days ago  Up About an hour  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
program: podman
command output:
123.81% 344.9MB / 16.73GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED      STATUS            PORTS       NAMES
d1417d07195f  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677:latest  /bin/sh -c ./entr...  11 days ago  Up About an hour  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
program: podman
command output:
123.70% 321.5MB / 16.73GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED      STATUS            PORTS       NAMES
d1417d07195f  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677:latest  /bin/sh -c ./entr...  11 days ago  Up About an hour  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
program: podman
command output:
123.75% 320.7MB / 16.73GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED      STATUS            PORTS       NAMES
d1417d07195f  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677:latest  /bin/sh -c ./entr...  11 days ago  Up About an hour  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
program: podman
command output:
123.80% 320.6MB / 16.73GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED      STATUS            PORTS       NAMES
d1417d07195f  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677:latest  /bin/sh -c ./entr...  11 days ago  Up About an hour  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
program: podman
command output:
123.86% 320.8MB / 16.73GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED      STATUS            PORTS       NAMES
d1417d07195f  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677:latest  /bin/sh -c ./entr...  11 days ago  Up About an hour  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
program: podman
command output:
123.81% 309.7MB / 16.73GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED      STATUS            PORTS       NAMES
d1417d07195f  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677:latest  /bin/sh -c ./entr...  11 days ago  Up About an hour  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
program: podman
command output:
123.54% 295.6MB / 16.73GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED      STATUS                    PORTS       NAMES
d1417d07195f  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677:latest  /bin/sh -c ./entr...  11 days ago  Exited (0) 8 seconds ago  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
EOM
running docker command: logs boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
program: podman
command output:
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1ral-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1ral-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1ral-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1ral-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1ral-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1ral-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
===> [runRivet] Mon Mar  9 16:25:55 UTC 2026 [boinc pp jets 7000 400 - pythia6 6.428 395 100000 677]

Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1ral-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1ral-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1ral-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1ral-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1ral-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1ral-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1ral-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1ral-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1ral-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1ral-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1ral-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1ral-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1ral-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1ral-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1ral-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1ral-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1ral-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1ral-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1ral-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1ral-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1ral-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1ral-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
job: run exitcode=0
job: diskusage=10576
job: logsize=64 k
job: times=
0m0.003s 0m0.019s
86m2.477s 7m21.378s
job: cpuusage=5604
Job Finished
Filesystem      Used Use% Mounted on
cvmfs2          203K   1% /cvmfs/cvmfs-config.cern.ch
cvmfs2          278M   7% /cvmfs/alice.cern.ch
cvmfs2          210M   6% /cvmfs/grid.cern.ch
cvmfs2          692M  18% /cvmfs/sft.cern.ch
total           1.2G   8% -
boinc_shutdown called with exit code 0
sd_delay: 0

EOM
stderr from container:
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1ral-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1ral-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1ral-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1ral-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1ral-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1ral-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
===> [runRivet] Mon Mar  9 16:25:55 UTC 2026 [boinc pp jets 7000 400 - pythia6 6.428 395 100000 677]

Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1ral-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1ral-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1ral-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1ral-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1ral-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1ral-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1ral-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1ral-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1ral-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1ral-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1ral-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1ral-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1ral-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1ral-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1ral-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1ral-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1ral-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1ral-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1ral-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1ral-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1ral-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1ral-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
job: run exitcode=0
job: diskusage=10576
job: logsize=64 k
job: times=
0m0.003s 0m0.019s
86m2.477s 7m21.378s
job: cpuusage=5604
Job Finished
Filesystem      Used Use% Mounted on
cvmfs2          203K   1% /cvmfs/cvmfs-config.cern.ch
cvmfs2          278M   7% /cvmfs/alice.cern.ch
cvmfs2          210M   6% /cvmfs/grid.cern.ch
cvmfs2          692M  18% /cvmfs/sft.cern.ch
total           1.2G   8% -
boinc_shutdown called with exit code 0
sd_delay: 0

EOM
stderr end
running docker command: container rm boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677_0
EOM
running docker command: image rm boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677
program: podman
command output:
Untagged: localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4835004-677:latest
EOM
2026-03-09 17:44:07 (5184): called boinc_finish(0)

</stderr_txt>
]]>


©2026 CERN