| Name | Theory_2922-4893010-832_0 |
| Workunit | 240578207 |
| Created | 12 Apr 2026, 11:22:34 UTC |
| Sent | 12 Apr 2026, 19:02:36 UTC |
| Report deadline | 23 Apr 2026, 19:02:36 UTC |
| Received | 2 May 2026, 19:46:29 UTC |
| Server state | Over |
| Outcome | Success |
| Client state | Done |
| Exit status | 0 (0x00000000) |
| Computer ID | 10961136 |
| Run time | 17 hours 1 min 24 sec |
| CPU time | 13 hours 11 min 1 sec |
| Priority | 0 |
| Validate state | Invalid |
| Credit | 0.00 |
| Device peak FLOPS | 4.95 GFLOPS |
| Application version | Theory Simulation v302.10 (docker) windows_x86_64 |
| Peak working set size | 14.20 MB |
| Peak swap size | 2.73 MB |
| Peak disk usage | 5.10 MB |
<core_client_version>8.2.9</core_client_version>
<![CDATA[
<stderr_txt>
10-832_0
EOM
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4893010-832_0
program: podman
command output:
100.69% 358.4MB / 16.57GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4893010-832_0"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
c4a79a3fc40a localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4893010-832:latest /bin/sh -c ./entr... 2 weeks ago Up 3 hours 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4893010-832_0
EOM
[The stats/ps cycle above repeats for the rest of the run with near-constant readings (CPU 100.69%-100.77%, memory roughly 358MB of 16.57GB), each stats call followed by "invalid usage stats; using defaults". The intermediate cycles are omitted; the final cycle, in which the container has exited, follows.]
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4893010-832_0
program: podman
command output:
100.77% 358.4MB / 16.57GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4893010-832_0"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
c4a79a3fc40a localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4893010-832:latest /bin/sh -c ./entr... 2 weeks ago Exited (0) 3 seconds ago 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4893010-832_0
EOM
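The cycle above is the wrapper polling the container: a podman stats reading, a validity check that falls back to defaults, then a podman ps to confirm the container state. A minimal sketch of such a loop follows, assuming a 60-second interval and reusing this task's container name; this is not the BOINC wrapper's actual source, and it is only an unconfirmed guess that CPU readings above 100% are what fail its sanity check and produce the "invalid usage stats" lines.

    # Sketch of a poll-and-check cycle like the one logged above
    # (interval and validity rule are assumptions, not the wrapper's real code).
    NAME="boinc__lhcathome.cern.ch_lhcathome__theory_2922-4893010-832_0"
    while podman container exists "$NAME"; do
        usage=$(podman stats --no-stream \
            --format "{{.CPUPerc}} {{.MemUsage}}" "$NAME")
        echo "$usage"   # e.g. "100.69% 358.4MB / 16.57GB"
        # A parser that rejects CPU > 100% would fall back here, logging
        # "invalid usage stats; using defaults" (assumption).
        podman ps --all -f "name=$NAME"
        sleep 60
    done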
running docker command: logs boinc__lhcathome.cern.ch_lhcathome__theory_2922-4893010-832_0
program: podman
command output:
Could not download a wpad.dat from lhchomeproxy.{cern.ch|fnal.gov}
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION HOST PROXY
2.13.3.0 http://s1ral-cvmfs.openhtc.io DIRECT
******************************************************************
IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1ral-cvmfs.openhtc.io
CVMFS proxy: DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
[The proxy-hint and CVMFS-probe block above is printed again before each of the remaining eleven setup passes, each ending in "job: unpack exitcode=0"; those repetitions are omitted. The job start marker, which the log interleaved mid-line with one of them, is restored here on its own line:]
===> [runRivet] Sat May 2 16:36:33 UTC 2026 [boinc pp z1j 13000 55 - pythia8 8.301 CP2-CR1 100000 832]
job: run exitcode=0
job: diskusage=11272
job: logsize=72 k
job: times=
0m0.000s 0m0.005s
193m3.184s 1m13.406s
job: cpuusage=11657
Job Finished
Filesystem Used Use% Mounted on
cvmfs2 214K 1% /cvmfs/cvmfs-config.cern.ch
cvmfs2 214M 6% /cvmfs/alice.cern.ch
cvmfs2 121M 4% /cvmfs/grid.cern.ch
cvmfs2 861M 22% /cvmfs/sft.cern.ch
total 1.2G 8% -
boinc_shutdown called with exit code 0
sd_delay: 0
EOM
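The blocks above open with a proxy auto-discovery step: the job script first tries to download a wpad.dat (Web Proxy Auto-Discovery file) from the project's lhchomeproxy hosts, then looks for a local HTTP proxy, and only then falls back to DIRECT connections. A hypothetical reconstruction of that probe, assuming a curl-based fetch; the script's real logic is not shown in this log.

    # Hypothetical wpad.dat probe; hosts taken from the log, fetch method assumed.
    for host in lhchomeproxy.cern.ch lhchomeproxy.fnal.gov; do
        if curl -fsS --max-time 10 "http://$host/wpad.dat" -o /tmp/wpad.dat; then
            echo "using wpad.dat from $host"
            break
        fi
    done
    [ -s /tmp/wpad.dat ] || echo "Could not find a local HTTP proxy; CVMFS will use DIRECT"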
stderr from container:
[Identical to the "logs" command output above; omitted.]
EOM
stderr end
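The hint repeated throughout this log is that, without a local HTTP proxy, concurrently running containers cannot share a common CVMFS cache. The standard CVMFS client setting for this is CVMFS_HTTP_PROXY; a minimal sketch follows, assuming a Squid proxy already listening on 127.0.0.1:3128 (the forum threads linked in the log are the project's authoritative instructions).

    # Point the CVMFS client at a local proxy instead of DIRECT.
    # Assumes Squid is already running on 127.0.0.1:3128.
    echo 'CVMFS_HTTP_PROXY="http://127.0.0.1:3128"' >> /etc/cvmfs/default.local
    cvmfs_config reload
    cvmfs_config stat -v sft.cern.ch   # the PROXY column should no longer read DIRECT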
running docker command: container rm boinc__lhcathome.cern.ch_lhcathome__theory_2922-4893010-832_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4893010-832_0
EOM
running docker command: image rm boinc__lhcathome.cern.ch_lhcathome__theory_2922-4893010-832
program: podman
command output:
Untagged: localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4893010-832:latest
Deleted: 184108ef7090b55f59582cd455c453587e8794fdc4c61367a1c63eaf833b9b0c
Deleted: 3b9480ad90aeaa21ffa44e4a642418440d20ef985b373e816ccf7c24431ea09c
Deleted: 43fa819370b24a67903b1353bc87d760c715486bbac558bb2d37e351ce1e0518
Deleted: 96b77b4bfac302e34fbfde6e487c9d1169e4db3278ce6159830551993454afe5
Deleted: c431e493957619727ce626f3fcb93cd50c453169b96bfc0ee421658e7dc6b73a
EOM
2026-05-02 20:46:24 (27760): called boinc_finish(0)
</stderr_txt>
]]>