Name Theory_2922-4802660-677_2
Workunit 239299028
Created 21 Feb 2026, 23:37:22 UTC
Sent 22 Feb 2026, 0:43:42 UTC
Report deadline 5 Mar 2026, 0:43:42 UTC
Received 2 Mar 2026, 12:08:59 UTC
Server state Over
Outcome Success
Client state Done
Exit status 0 (0x00000000)
Computer ID 10906881
Run time 4 hours 10 min 26 sec
CPU time 3 hours 55 min 0 sec
Priority 0
Validate state Valid
Credit 105.33
Device peak FLOPS 3.03 GFLOPS
Application version Theory Simulation v302.10 (docker)
windows_x86_64
Peak working set size 13.24 MB
Peak swap size 6.62 MB
Peak disk usage 2.67 MB

Stderr output

<core_client_version>8.2.8</core_client_version>
<![CDATA[
<stderr_txt>
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS      PORTS       NAMES
130049db92c3  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4802660-677:latest  /bin/sh -c ./entr...  6 days ago  Up 2 hours  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4802660-677_2
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4802660-677_2
program: podman
command output:
92436.46% 0B / 33.58GB
EOM
invalid usage stats; using defaults
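The "invalid usage stats" message is the wrapper discarding an implausible `podman stats` sample: a CPU percentage in the tens of thousands combined with 0B memory usage cannot describe a real container. A minimal sketch of that kind of sanity check, assuming hypothetical helper names and thresholds (the real wrapper's limits are not shown in this log):

```python
def parse_stats(line, ncpus=8):
    """Parse one line of `podman stats --format "{{.CPUPerc}} {{.MemUsage}}"`.

    Returns (cpu_fraction_of_one_core, mem_bytes), or None if the sample
    is implausible. Hypothetical sketch, not the wrapper's actual code.
    """
    cpu_str, mem_str = line.split(maxsplit=1)
    cpu = float(cpu_str.rstrip("%")) / 100.0       # "92436.46%" -> 924.3646 cores
    used = mem_str.split("/")[0].strip()           # "0B / 33.58GB" -> "0B"
    # Longest suffixes first, so "MB" is not matched by the bare "B" rule.
    units = {"GB": 1e9, "MB": 1e6, "KB": 1e3, "B": 1}
    for suffix, scale in units.items():
        if used.endswith(suffix):
            mem = float(used[: -len(suffix)]) * scale
            break
    else:
        return None
    # Reject samples that cannot be real: more CPU than the host has cores,
    # or a running container reporting zero memory.
    if cpu > ncpus or mem == 0:
        return None
    return cpu, mem

print(parse_stats("92436.46% 0B / 33.58GB"))   # implausible -> None
print(parse_stats("385.20% 412MB / 33.58GB"))  # plausible on an 8-core host
```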
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4802660-677_2"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS      PORTS       NAMES
130049db92c3  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4802660-677:latest  /bin/sh -c ./entr...  6 days ago  Up 2 hours  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4802660-677_2
EOM
[The stats/ps polling cycle above repeats roughly three dozen more times with identical output apart from the CPU figure, which drifts down steadily from 92298.14% to 87236.15%; memory is always reported as 0B / 33.58GB, and every sample is rejected with "invalid usage stats; using defaults".]
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4802660-677_2"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS                    PORTS       NAMES
130049db92c3  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4802660-677:latest  /bin/sh -c ./entr...  6 days ago  Exited (0) 7 seconds ago  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4802660-677_2
EOM
running docker command: logs boinc__lhcathome.cern.ch_lhcathome__theory_2922-4802660-677_2
program: podman
command output:
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                            PROXY
2.13.3.0  http://s1fnal-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1fnal-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
job: run exitcode=0
job: diskusage=23144
job: logsize=84 k
job: times=
0m0.005s 0m0.015s
158m57.784s 14m52.718s
job: cpuusage=10431
===> [runRivet] Tue Feb 24 00:26:51 UTC 2026 [boinc pp jets 13000 260 - pythia6 6.428 z1-lep 100000 677]

Job Finished
Filesystem      Used Use% Mounted on
cvmfs2          201K   1% /cvmfs/cvmfs-config.cern.ch
cvmfs2           43M   2% /cvmfs/alice.cern.ch
cvmfs2           17M   1% /cvmfs/grid.cern.ch
cvmfs2          676M  17% /cvmfs/sft.cern.ch
total           736M   5% -
boinc_shutdown called with exit code 0
sd_delay: 0

Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                            PROXY
2.13.3.0  http://s1fnal-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1fnal-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
job: run exitcode=0
job: diskusage=23140
job: logsize=84 k
job: times=
0m0.014s 0m0.014s
188m17.606s 20m46.644s
job: cpuusage=12544
===> [runRivet] Wed Feb 25 12:37:05 UTC 2026 [boinc pp jets 13000 260 - pythia6 6.428 z1-lep 100000 677]

Job Finished
Filesystem      Used Use% Mounted on
cvmfs2          398K   1% /cvmfs/cvmfs-config.cern.ch
cvmfs2           65M   2% /cvmfs/alice.cern.ch
cvmfs2           32M   1% /cvmfs/grid.cern.ch
cvmfs2          676M  17% /cvmfs/sft.cern.ch
total           773M   5% -
boinc_shutdown called with exit code 0
sd_delay: 0

Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
===> [runRivet] Fri Feb 27 19:23:55 UTC 2026 [boinc pp jets 13000 260 - pythia6 6.428 z1-lep 100000 677]

===> [runRivet] Sat Feb 28 21:42:18 UTC 2026 [boinc pp jets 13000 260 - pythia6 6.428 z1-lep 100000 677]

===> [runRivet] Sun Mar  1 02:27:58 UTC 2026 [boinc pp jets 13000 260 - pythia6 6.428 z1-lep 100000 677]

===> [runRivet] Sun Mar  1 18:04:09 UTC 2026 [boinc pp jets 13000 260 - pythia6 6.428 z1-lep 100000 677]

===> [runRivet] Mon Mar  2 09:13:39 UTC 2026 [boinc pp jets 13000 260 - pythia6 6.428 z1-lep 100000 677]

Excerpt from "cvmfs_config stat":
VERSION   HOST                            PROXY
2.13.3.0  http://s1fnal-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1fnal-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                            PROXY
2.13.3.0  http://s1fnal-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1fnal-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                            PROXY
2.13.3.0  http://s1fnal-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1fnal-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
job: run exitcode=0
job: diskusage=23144
job: logsize=84 k
job: times=
0m0.007s 0m0.013s
159m20.393s 14m45.375s
job: cpuusage=10446
Job Finished
Filesystem      Used Use% Mounted on
cvmfs2          398K   1% /cvmfs/cvmfs-config.cern.ch
cvmfs2           86M   3% /cvmfs/alice.cern.ch
cvmfs2           47M   2% /cvmfs/grid.cern.ch
cvmfs2          677M  17% /cvmfs/sft.cern.ch
total           809M   6% -
boinc_shutdown called with exit code 0
sd_delay: 0

Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                            PROXY
2.13.3.0  http://s1fnal-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1fnal-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
job: run exitcode=0
job: diskusage=23144
job: logsize=84 k
job: times=
0m0.004s 0m0.021s
176m21.503s 16m39.519s
job: cpuusage=11581
Job Finished
Filesystem      Used Use% Mounted on
cvmfs2          398K   1% /cvmfs/cvmfs-config.cern.ch
cvmfs2          129M   4% /cvmfs/alice.cern.ch
cvmfs2           77M   2% /cvmfs/grid.cern.ch
cvmfs2          677M  17% /cvmfs/sft.cern.ch
total           882M   6% -
boinc_shutdown called with exit code 0
sd_delay: 0

Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                            PROXY
2.13.3.0  http://s1fnal-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1fnal-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
job: run exitcode=0
job: diskusage=23144
job: logsize=84 k
job: times=
0m0.003s 0m0.016s
154m52.778s 13m47.438s
job: cpuusage=10120
Job Finished
Filesystem      Used Use% Mounted on
cvmfs2          398K   1% /cvmfs/cvmfs-config.cern.ch
cvmfs2          172M   5% /cvmfs/alice.cern.ch
cvmfs2           77M   2% /cvmfs/grid.cern.ch
cvmfs2          678M  17% /cvmfs/sft.cern.ch
total           926M   6% -
boinc_shutdown called with exit code 0
sd_delay: 0

Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                            PROXY
2.13.3.0  http://s1fnal-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1fnal-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
job: run exitcode=0
job: diskusage=23144
job: logsize=84 k
job: times=
0m0.009s 0m0.012s
144m30.310s 13m4.178s
job: cpuusage=9455
Job Finished
Filesystem      Used Use% Mounted on
cvmfs2          398K   1% /cvmfs/cvmfs-config.cern.ch
cvmfs2          193M   5% /cvmfs/alice.cern.ch
cvmfs2           92M   3% /cvmfs/grid.cern.ch
cvmfs2          678M  17% /cvmfs/sft.cern.ch
total           962M   7% -
boinc_shutdown called with exit code 0
sd_delay: 0

Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                            PROXY
2.13.3.0  http://s1fnal-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1fnal-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
job: run exitcode=0
job: diskusage=23144
job: logsize=84 k
job: times=
0m0.004s 0m0.015s
134m26.302s 12m15.169s
job: cpuusage=8801
Job Finished
Filesystem      Used Use% Mounted on
cvmfs2          398K   1% /cvmfs/cvmfs-config.cern.ch
cvmfs2          214M   6% /cvmfs/alice.cern.ch
cvmfs2          121M   4% /cvmfs/grid.cern.ch
cvmfs2          678M  17% /cvmfs/sft.cern.ch
total          1013M   7% -
boinc_shutdown called with exit code 0
sd_delay: 0

EOM
stderr from container:
EOM
stderr end
running docker command: container rm boinc__lhcathome.cern.ch_lhcathome__theory_2922-4802660-677_2
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4802660-677_2
EOM
running docker command: image rm boinc__lhcathome.cern.ch_lhcathome__theory_2922-4802660-677
program: podman
command output:
Untagged: localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4802660-677:latest
EOM
2026-03-02 03:18:27 (27532): called boinc_finish(0)

</stderr_txt>
]]>