Name Theory_2922-4915467-693_0
Workunit 239640485
Created 6 Mar 2026, 11:22:39 UTC
Sent 6 Mar 2026, 22:00:48 UTC
Report deadline 17 Mar 2026, 22:00:48 UTC
Received 7 Mar 2026, 13:48:17 UTC
Server state Over
Outcome Success
Client state Done
Exit status 0 (0x00000000)
Computer ID 10870139
Run time 13 hours 48 min 0 sec
CPU time 13 hours 48 min 0 sec
Priority 0
Validate state Valid
Credit 669.15
Device peak FLOPS 5.82 GFLOPS
Application version Theory Simulation v302.10 (docker)
windows_x86_64
Peak working set size 14.10 MB
Peak swap size 2.58 MB
Peak disk usage 4.81 MB

Stderr output

<core_client_version>8.2.4</core_client_version>
<![CDATA[
<stderr_txt>
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED       STATUS      PORTS       NAMES
f2a3c1b750ef  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4915467-693:latest  /bin/sh -c ./entr...  15 hours ago  Up 2 hours  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4915467-693_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4915467-693_0
program: podman
command output:
29083.36% 0B / 8.284GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4915467-693_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED       STATUS      PORTS       NAMES
f2a3c1b750ef  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4915467-693:latest  /bin/sh -c ./entr...  15 hours ago  Up 2 hours  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4915467-693_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4915467-693_0
program: podman
command output:
29031.51% 0B / 8.284GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4915467-693_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED       STATUS                    PORTS       NAMES
f2a3c1b750ef  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4915467-693:latest  /bin/sh -c ./entr...  15 hours ago  Exited (0) 8 seconds ago  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4915467-693_0
EOM
running docker command: logs boinc__lhcathome.cern.ch_lhcathome__theory_2922-4915467-693_0
program: podman
command output:
Could not download a wpad.dat from lhchomeproxy.{cern.ch|fnal.gov}
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                            PROXY
2.13.3.0  http://s1cern-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1cern-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
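The hint block above recommends a local HTTP proxy so that concurrently running containers can share one CVMFS cache instead of each fetching over DIRECT connections. A minimal sketch of the CVMFS side of that setup, assuming a local squid already listens at the hypothetical address 192.168.1.10:3128 (substitute your own; the real client config lives in /etc/cvmfs/default.local, written here to /tmp only for illustration):

```shell
# Hypothetical example: point the CVMFS client at a local caching
# proxy, falling back to DIRECT if the proxy is unreachable.
# The proxy address is an assumption; replace it with your own squid.
PROXY="http://192.168.1.10:3128"

cat > /tmp/default.local.example <<EOF
CVMFS_HTTP_PROXY="${PROXY};DIRECT"
EOF

grep CVMFS_HTTP_PROXY /tmp/default.local.example
# -> CVMFS_HTTP_PROXY="http://192.168.1.10:3128;DIRECT"
```

The project forum threads linked in the log cover the full setup, including sizing the squid cache for LHC@home workloads.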
Could not download a wpad.dat from lhchomeproxy.{cern.ch|fnal.gov}
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                            PROXY
2.13.3.0  http://s1cern-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1cern-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not download a wpad.dat from lhchomeproxy.{cern.ch|fnal.gov}
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                            PROXY
2.13.3.0  http://s1cern-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1cern-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
===> [runRivet] Sat Mar  7 11:17:37 UTC 2026 [boinc pp zinclusive 8000 -,-,60 - herwig++ 2.7.1 UE-EE-5-CTEQ6L1 100000 693]

Could not download a wpad.dat from lhchomeproxy.{cern.ch|fnal.gov}
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                            PROXY
2.13.3.0  http://s1cern-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1cern-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not download a wpad.dat from lhchomeproxy.{cern.ch|fnal.gov}
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                            PROXY
2.13.3.0  http://s1cern-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1cern-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not download a wpad.dat from lhchomeproxy.{cern.ch|fnal.gov}
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                            PROXY
2.13.3.0  http://s1cern-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1cern-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                            PROXY
2.13.3.0  http://s1cern-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1cern-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not download a wpad.dat from lhchomeproxy.{cern.ch|fnal.gov}
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                            PROXY
2.13.3.0  http://s1cern-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1cern-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not download a wpad.dat from lhchomeproxy.{cern.ch|fnal.gov}
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                            PROXY
2.13.3.0  http://s1cern-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1cern-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not download a wpad.dat from lhchomeproxy.{cern.ch|fnal.gov}
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                            PROXY
2.13.3.0  http://s1cern-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1cern-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not download a wpad.dat from lhchomeproxy.{cern.ch|fnal.gov}
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                            PROXY
2.13.3.0  http://s1cern-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1cern-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
job: run exitcode=0
job: diskusage=11860
job: logsize=116 k
job: times=
0m0.000s 0m0.016s
92m32.651s 4m57.517s
job: cpuusage=5850
Job Finished
Filesystem      Used Use% Mounted on
cvmfs2          203K   1% /cvmfs/cvmfs-config.cern.ch
cvmfs2          107M   3% /cvmfs/alice.cern.ch
cvmfs2           62M   2% /cvmfs/grid.cern.ch
cvmfs2          856M  22% /cvmfs/sft.cern.ch
total           1.0G   7% -
boinc_shutdown called with exit code 0
sd_delay: 0

EOM
stderr from container:
(identical to the "logs" command output above)
EOM
stderr end
running docker command: container rm boinc__lhcathome.cern.ch_lhcathome__theory_2922-4915467-693_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4915467-693_0
EOM
running docker command: image rm boinc__lhcathome.cern.ch_lhcathome__theory_2922-4915467-693
program: podman
command output:
Untagged: localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4915467-693:latest
EOM
2026-03-07 13:47:54 (22068): called boinc_finish(0)

</stderr_txt>
]]>


©2026 CERN