| Name | Theory_2922-4869901-700_1 |
| Workunit | 239774760 |
| Created | 11 Mar 2026, 8:02:46 UTC |
| Sent | 11 Mar 2026, 8:24:40 UTC |
| Report deadline | 22 Mar 2026, 8:24:40 UTC |
| Received | 17 Mar 2026, 19:48:46 UTC |
| Server state | Over |
| Outcome | Success |
| Client state | Done |
| Exit status | 0 (0x00000000) |
| Computer ID | 10549246 |
| Run time | 1 days 1 hours 59 min 57 sec |
| CPU time | 22 hours 56 min 11 sec |
| Priority | 0 |
| Validate state | Valid |
| Credit | 1,089.48 |
| Device peak FLOPS | 5.03 GFLOPS |
| Application version | Theory Simulation v302.10 (docker) windows_x86_64 |
| Peak working set size | 12.96 MB |
| Peak swap size | 6.88 MB |
| Peak disk usage | 8.30 MB |
<core_client_version>8.2.8</core_client_version>
<![CDATA[
<stderr_txt>
cathome.cern.ch_lhcathome__theory_2922-4869901-700_1
EOM
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
program: podman
command output:
103.26% 357.7MB / 8.164GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
d38313b92503 localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700:latest /bin/sh -c ./entr... 3 days ago Up 5 hours 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
EOM
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
program: podman
command output:
103.27% 357.6MB / 8.164GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
d38313b92503 localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700:latest /bin/sh -c ./entr... 3 days ago Up 5 hours 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
EOM
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
program: podman
command output:
103.28% 357.7MB / 8.164GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
d38313b92503 localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700:latest /bin/sh -c ./entr... 3 days ago Up 5 hours 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
EOM
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
program: podman
command output:
103.28% 357.6MB / 8.164GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
d38313b92503 localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700:latest /bin/sh -c ./entr... 3 days ago Up 5 hours 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
EOM
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
program: podman
command output:
103.29% 357.5MB / 8.164GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
d38313b92503 localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700:latest /bin/sh -c ./entr... 3 days ago Up 5 hours 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
EOM
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
program: podman
command output:
103.29% 357.7MB / 8.164GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
d38313b92503 localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700:latest /bin/sh -c ./entr... 3 days ago Up 5 hours 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
EOM
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
program: podman
command output:
103.30% 357.3MB / 8.164GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
d38313b92503 localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700:latest /bin/sh -c ./entr... 3 days ago Up 5 hours 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
EOM
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
program: podman
command output:
103.31% 357.4MB / 8.164GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
d38313b92503 localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700:latest /bin/sh -c ./entr... 3 days ago Up 5 hours 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
EOM
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
program: podman
command output:
103.31% 357.3MB / 8.164GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
d38313b92503 localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700:latest /bin/sh -c ./entr... 3 days ago Up 5 hours 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
EOM
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
program: podman
command output:
103.32% 357.4MB / 8.164GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
d38313b92503 localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700:latest /bin/sh -c ./entr... 3 days ago Up 5 hours 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
EOM
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
program: podman
command output:
103.33% 357.8MB / 8.164GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
d38313b92503 localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700:latest /bin/sh -c ./entr... 3 days ago Up 5 hours 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
EOM
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
program: podman
command output:
103.33% 357.3MB / 8.164GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
d38313b92503 localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700:latest /bin/sh -c ./entr... 3 days ago Up 5 hours 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
EOM
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
program: podman
command output:
103.34% 357.3MB / 8.164GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
d38313b92503 localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700:latest /bin/sh -c ./entr... 3 days ago Up 5 hours 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
EOM
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
program: podman
command output:
103.34% 357.2MB / 8.164GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
d38313b92503 localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700:latest /bin/sh -c ./entr... 3 days ago Up 5 hours 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
EOM
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
program: podman
command output:
103.35% 357.3MB / 8.164GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
d38313b92503 localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700:latest /bin/sh -c ./entr... 3 days ago Up 5 hours 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
EOM
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
program: podman
command output:
103.36% 357.5MB / 8.164GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
d38313b92503 localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700:latest /bin/sh -c ./entr... 3 days ago Up 5 hours 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
EOM
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
program: podman
command output:
103.36% 357.2MB / 8.164GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
d38313b92503 localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700:latest /bin/sh -c ./entr... 3 days ago Up 5 hours 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
EOM
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
program: podman
command output:
103.37% 357.5MB / 8.164GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
d38313b92503 localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700:latest /bin/sh -c ./entr... 3 days ago Up 5 hours 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
EOM
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
program: podman
command output:
103.37% 357.5MB / 8.164GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
d38313b92503 localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700:latest /bin/sh -c ./entr... 3 days ago Up 5 hours 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
EOM
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
program: podman
command output:
103.38% 357.5MB / 8.164GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
d38313b92503 localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700:latest /bin/sh -c ./entr... 3 days ago Up 5 hours 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
EOM
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
program: podman
command output:
103.38% 357.5MB / 8.164GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
d38313b92503 localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700:latest /bin/sh -c ./entr... 3 days ago Up 5 hours 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
EOM
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
program: podman
command output:
103.39% 357.6MB / 8.164GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
d38313b92503 localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700:latest /bin/sh -c ./entr... 3 days ago Up 5 hours 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
EOM
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
program: podman
command output:
103.39% 357.4MB / 8.164GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
d38313b92503 localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700:latest /bin/sh -c ./entr... 3 days ago Up 5 hours 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
EOM
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
program: podman
command output:
103.40% 357.5MB / 8.164GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
d38313b92503 localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700:latest /bin/sh -c ./entr... 3 days ago Up 5 hours 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
EOM
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
program: podman
command output:
103.40% 357.3MB / 8.164GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
d38313b92503 localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700:latest /bin/sh -c ./entr... 3 days ago Up 5 hours 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
EOM
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
program: podman
command output:
103.40% 357.4MB / 8.164GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
d38313b92503 localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700:latest /bin/sh -c ./entr... 3 days ago Up 5 hours 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
EOM
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
program: podman
command output:
103.41% 357.5MB / 8.164GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
d38313b92503 localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700:latest /bin/sh -c ./entr... 3 days ago Up 5 hours 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
EOM
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
program: podman
command output:
103.41% 357.5MB / 8.164GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
d38313b92503 localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700:latest /bin/sh -c ./entr... 3 days ago Up 5 hours 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
EOM
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
program: podman
command output:
103.42% 357.2MB / 8.164GB
EOM
invalid usage stats; using defaults
got quit request from client - pausing container
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
EOM
docker_wrapper 18 starting
docker_wrapper config:
workdir: /boinc_slot_dir
use GPU: no
create args: --cap-add=SYS_ADMIN --device /dev/fuse
verbose: 1
Using WSL distro boinc-buda-runner
Using podman
running docker command: ps --all --filter "name=^boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1$" --format "{{.Names}}|{{.Status}}"
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1|Paused
EOM
container state: Paused
container is paused; unpausing
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
d38313b92503 localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700:latest /bin/sh -c ./entr... 3 days ago Up 5 hours 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
EOM
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
program: podman
command output:
103.05% 357.6MB / 8.164GB
EOM
invalid usage stats; using defaults
got quit request from client - pausing container
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
EOM
docker_wrapper 18 starting
docker_wrapper config:
workdir: /boinc_slot_dir
use GPU: no
create args: --cap-add=SYS_ADMIN --device /dev/fuse
verbose: 1
Using WSL distro boinc-buda-runner
Using podman
running docker command: ps --all --filter "name=^boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1$" --format "{{.Names}}|{{.Status}}"
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1|Paused
EOM
container state: Paused
container is paused; unpausing
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
d38313b92503 localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700:latest /bin/sh -c ./entr... 3 days ago Up 5 hours 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
EOM
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
program: podman
command output:
103.00% 357.7MB / 8.164GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
d38313b92503 localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700:latest /bin/sh -c ./entr... 3 days ago Up 5 hours 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
EOM
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
program: podman
command output:
103.00% 357.8MB / 8.164GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
d38313b92503 localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700:latest /bin/sh -c ./entr... 3 days ago Up 5 hours 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
EOM
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
program: podman
command output:
103.01% 357.5MB / 8.164GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
d38313b92503 localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700:latest /bin/sh -c ./entr... 3 days ago Up 5 hours 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
EOM
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
program: podman
command output:
102.99% 371.2MB / 8.164GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
d38313b92503 localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700:latest /bin/sh -c ./entr... 3 days ago Up 5 hours 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
EOM
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
program: podman
command output:
103.01% 371.4MB / 8.164GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
d38313b92503 localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700:latest /bin/sh -c ./entr... 3 days ago Up 5 hours 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
EOM
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
program: podman
command output:
103.03% 371.8MB / 8.164GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
d38313b92503 localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700:latest /bin/sh -c ./entr... 3 days ago Up 5 hours 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
EOM
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
program: podman
command output:
103.04% 372.3MB / 8.164GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
d38313b92503 localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700:latest /bin/sh -c ./entr... 3 days ago Up 5 hours 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
EOM
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
program: podman
command output:
103.06% 372.1MB / 8.164GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
d38313b92503 localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700:latest /bin/sh -c ./entr... 3 days ago Up 5 hours 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
EOM
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
program: podman
command output:
103.07% 372MB / 8.164GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
d38313b92503 localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700:latest /bin/sh -c ./entr... 3 days ago Up 5 hours 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
EOM
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
program: podman
command output:
103.08% 372MB / 8.164GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
d38313b92503 localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700:latest /bin/sh -c ./entr... 3 days ago Up 5 hours 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
EOM
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
program: podman
command output:
103.10% 371.8MB / 8.164GB
EOM
invalid usage stats; using defaults
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
d38313b92503 localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700:latest /bin/sh -c ./entr... 3 days ago Exited (0) 5 seconds ago 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
EOM
running docker command: logs boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
program: podman
command output:
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION HOST PROXY
2.13.3.0 http://s1cern-cvmfs.openhtc.io DIRECT
******************************************************************
IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1cern-cvmfs.openhtc.io
CVMFS proxy: DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not download a wpad.dat from lhchomeproxy.{cern.ch|fnal.gov}
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION HOST PROXY
2.13.3.0 http://s1cern-cvmfs.openhtc.io DIRECT
******************************************************************
IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1cern-cvmfs.openhtc.io
CVMFS proxy: DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not download a wpad.dat from lhchomeproxy.{cern.ch|fnal.gov}
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION HOST PROXY
2.13.3.0 http://s1cern-cvmfs.openhtc.io DIRECT
******************************************************************
IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1cern-cvmfs.openhtc.io
CVMFS proxy: DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
job: run exitcode=0
job: diskusage=11492
job: logsize=84 k
===> [runRivet] Sun Mar 15 10:32:20 UTC 2026 [boinc pp top 13000 - - pythia8 8.240 tune-2c 100000 700]
===> [runRivet] Tue Mar 17 14:34:50 UTC 2026 [boinc pp top 13000 - - pythia8 8.240 tune-2c 100000 700]
job: times=
0m0.000s 0m0.036s
358m3.604s 13m46.109s
job: cpuusage=22310
Job Finished
Filesystem Used Use% Mounted on
cvmfs2 203K 1% /cvmfs/cvmfs-config.cern.ch
cvmfs2 22M 1% /cvmfs/alice.cern.ch
cvmfs2 47M 2% /cvmfs/grid.cern.ch
cvmfs2 755M 19% /cvmfs/sft.cern.ch
total 823M 6% -
boinc_shutdown called with exit code 0
sd_delay: 0
Could not download a wpad.dat from lhchomeproxy.{cern.ch|fnal.gov}
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION HOST PROXY
2.13.3.0 http://s1cern-cvmfs.openhtc.io DIRECT
******************************************************************
IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1cern-cvmfs.openhtc.io
CVMFS proxy: DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not download a wpad.dat from lhchomeproxy.{cern.ch|fnal.gov}
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION HOST PROXY
2.13.3.0 http://s1cern-cvmfs.openhtc.io DIRECT
******************************************************************
IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1cern-cvmfs.openhtc.io
CVMFS proxy: DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not download a wpad.dat from lhchomeproxy.{cern.ch|fnal.gov}
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION HOST PROXY
2.13.3.0 http://s1cern-cvmfs.openhtc.io DIRECT
******************************************************************
IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1cern-cvmfs.openhtc.io
CVMFS proxy: DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not download a wpad.dat from lhchomeproxy.{cern.ch|fnal.gov}
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION HOST PROXY
2.13.3.0 http://s1cern-cvmfs.openhtc.io DIRECT
******************************************************************
IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1cern-cvmfs.openhtc.io
CVMFS proxy: DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not download a wpad.dat from lhchomeproxy.{cern.ch|fnal.gov}
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION HOST PROXY
2.13.3.0 http://s1cern-cvmfs.openhtc.io DIRECT
******************************************************************
IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1cern-cvmfs.openhtc.io
CVMFS proxy: DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not download a wpad.dat from lhchomeproxy.{cern.ch|fnal.gov}
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION HOST PROXY
2.13.3.0 http://s1cern-cvmfs.openhtc.io DIRECT
******************************************************************
IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1cern-cvmfs.openhtc.io
CVMFS proxy: DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not download a wpad.dat from lhchomeproxy.{cern.ch|fnal.gov}
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION HOST PROXY
2.13.3.0 http://s1cern-cvmfs.openhtc.io DIRECT
******************************************************************
IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1cern-cvmfs.openhtc.io
CVMFS proxy: DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
job: run exitcode=0
job: diskusage=11492
job: logsize=84 k
job: times=
0m0.018s 0m0.018s
287m49.312s 9m34.908s
job: cpuusage=17844
Job Finished
Filesystem Used Use% Mounted on
cvmfs2 203K 1% /cvmfs/cvmfs-config.cern.ch
cvmfs2 129M 4% /cvmfs/alice.cern.ch
cvmfs2 121M 4% /cvmfs/grid.cern.ch
cvmfs2 756M 19% /cvmfs/sft.cern.ch
total 1005M 7% -
boinc_shutdown called with exit code 0
sd_delay: 0
EOM
stderr from container:
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION HOST PROXY
2.13.3.0 http://s1cern-cvmfs.openhtc.io DIRECT
******************************************************************
IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1cern-cvmfs.openhtc.io
CVMFS proxy: DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not download a wpad.dat from lhchomeproxy.{cern.ch|fnal.gov}
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION HOST PROXY
2.13.3.0 http://s1cern-cvmfs.openhtc.io DIRECT
******************************************************************
IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1cern-cvmfs.openhtc.io
CVMFS proxy: DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not download a wpad.dat from lhchomeproxy.{cern.ch|fnal.gov}
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION HOST PROXY
2.13.3.0 http://s1cern-cvmfs.openhtc.io DIRECT
******************************************************************
IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1cern-cvmfs.openhtc.io
CVMFS proxy: DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
job: run exitcode=0
job: diskusage=11492
job: logsize=84 k
===> [runRivet] Sun Mar 15 10:32:20 UTC 2026 [boinc pp top 13000 - - pythia8 8.240 tune-2c 100000 700]
===> [runRivet] Tue Mar 17 14:34:50 UTC 2026 [boinc pp top 13000 - - pythia8 8.240 tune-2c 100000 700]
job: times=
0m0.000s 0m0.036s
358m3.604s 13m46.109s
job: cpuusage=22310
Job Finished
Filesystem Used Use% Mounted on
cvmfs2 203K 1% /cvmfs/cvmfs-config.cern.ch
cvmfs2 22M 1% /cvmfs/alice.cern.ch
cvmfs2 47M 2% /cvmfs/grid.cern.ch
cvmfs2 755M 19% /cvmfs/sft.cern.ch
total 823M 6% -
boinc_shutdown called with exit code 0
sd_delay: 0
Could not download a wpad.dat from lhchomeproxy.{cern.ch|fnal.gov}
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION HOST PROXY
2.13.3.0 http://s1cern-cvmfs.openhtc.io DIRECT
******************************************************************
IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1cern-cvmfs.openhtc.io
CVMFS proxy: DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not download a wpad.dat from lhchomeproxy.{cern.ch|fnal.gov}
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION HOST PROXY
2.13.3.0 http://s1cern-cvmfs.openhtc.io DIRECT
******************************************************************
IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1cern-cvmfs.openhtc.io
CVMFS proxy: DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not download a wpad.dat from lhchomeproxy.{cern.ch|fnal.gov}
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION HOST PROXY
2.13.3.0 http://s1cern-cvmfs.openhtc.io DIRECT
******************************************************************
IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1cern-cvmfs.openhtc.io
CVMFS proxy: DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not download a wpad.dat from lhchomeproxy.{cern.ch|fnal.gov}
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION HOST PROXY
2.13.3.0 http://s1cern-cvmfs.openhtc.io DIRECT
******************************************************************
IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1cern-cvmfs.openhtc.io
CVMFS proxy: DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not download a wpad.dat from lhchomeproxy.{cern.ch|fnal.gov}
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION HOST PROXY
2.13.3.0 http://s1cern-cvmfs.openhtc.io DIRECT
******************************************************************
IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1cern-cvmfs.openhtc.io
CVMFS proxy: DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not download a wpad.dat from lhchomeproxy.{cern.ch|fnal.gov}
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION HOST PROXY
2.13.3.0 http://s1cern-cvmfs.openhtc.io DIRECT
******************************************************************
IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1cern-cvmfs.openhtc.io
CVMFS proxy: DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not download a wpad.dat from lhchomeproxy.{cern.ch|fnal.gov}
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION HOST PROXY
2.13.3.0 http://s1cern-cvmfs.openhtc.io DIRECT
******************************************************************
IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1cern-cvmfs.openhtc.io
CVMFS proxy: DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
job: run exitcode=0
job: diskusage=11492
job: logsize=84 k
job: times=
0m0.018s 0m0.018s
287m49.312s 9m34.908s
job: cpuusage=17844
Job Finished
Filesystem Used Use% Mounted on
cvmfs2 203K 1% /cvmfs/cvmfs-config.cern.ch
cvmfs2 129M 4% /cvmfs/alice.cern.ch
cvmfs2 121M 4% /cvmfs/grid.cern.ch
cvmfs2 756M 19% /cvmfs/sft.cern.ch
total 1005M 7% -
boinc_shutdown called with exit code 0
sd_delay: 0
EOM
stderr end
running docker command: container rm boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700_1
EOM
running docker command: image rm boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700
program: podman
command output:
Untagged: localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4869901-700:latest
EOM
2026-03-17 20:29:25 (1800): called boinc_finish(0)
</stderr_txt>
]]>
©2026 CERN