| Name | Theory_2922-4910695-687_0 |
| Workunit | 239585133 |
| Created | 4 Mar 2026, 22:21:20 UTC |
| Sent | 5 Mar 2026, 14:29:28 UTC |
| Report deadline | 16 Mar 2026, 14:29:28 UTC |
| Received | 9 Mar 2026, 18:14:54 UTC |
| Server state | Over |
| Outcome | Success |
| Client state | Done |
| Exit status | 0 (0x00000000) |
| Computer ID | 10939727 |
| Run time | 53 min 51 sec |
| CPU time | 3 min 20 sec |
| Priority | 0 |
| Validate state | Valid |
| Credit | 59.64 |
| Device peak FLOPS | 7.97 GFLOPS |
| Application version | Theory Simulation v302.10 (docker) windows_x86_64 |
| Peak working set size | 13.72 MB |
| Peak swap size | 2.83 MB |
| Peak disk usage | 1.75 MB |
<core_client_version>8.2.8</core_client_version>
<![CDATA[
<stderr_txt>
0695-687_0"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
ee08fabc15cd localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687:latest /bin/sh -c ./entr... 4 days ago Up About a minute 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
20.67% 326MB / 16.63GB
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
2026-03-09 18:48:35 (11376): Can't acquire lockfile (32) - waiting 35s
docker_wrapper 18 starting
docker_wrapper config:
workdir: /boinc_slot_dir
use GPU: no
create args: --cap-add=SYS_ADMIN --device /dev/fuse
verbose: 1
Using WSL distro boinc-buda-runner
Using podman
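The wrapper entries above poll the container with `podman ps`/`stats` and toggle it with `pause`/`unpause` whenever BOINC suspends or resumes the task. A rough sketch of how those command lines are composed, using hypothetical helper names (this only illustrates the commands seen in this log, not the wrapper's actual source):

```shell
#!/bin/sh
# Container name as it appears throughout this log.
name="boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0"

# Compose the status query the wrapper logs on startup.
status_cmd() {
  printf 'podman ps --all --filter "name=^%s$" --format "{{.Names}}|{{.Status}}"\n' "$name"
}

# Compose a pause or unpause command ($1 = pause | unpause).
lifecycle_cmd() {
  printf 'podman %s %s\n' "$1" "$name"
}

status_cmd
lifecycle_cmd pause
lifecycle_cmd unpause
```

Each BOINC suspend/resume event then maps to one `pause`/`unpause` pair, which is why the log above shows long runs of alternating pairs between the periodic `ps`/`stats` checkpoints.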
running docker command: ps --all --filter "name=^boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0$" --format "{{.Names}}|{{.Status}}"
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0|Paused
EOM
container state: Paused
container is paused; unpausing
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
ee08fabc15cd localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687:latest /bin/sh -c ./entr... 4 days ago Up 23 minutes 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
28.33% 347.4MB / 16.63GB
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
ee08fabc15cd localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687:latest /bin/sh -c ./entr... 4 days ago Up 24 minutes 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
28.53% 347.5MB / 16.63GB
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
ee08fabc15cd localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687:latest /bin/sh -c ./entr... 4 days ago Up 24 minutes 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
28.82% 347.5MB / 16.63GB
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
ee08fabc15cd localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687:latest /bin/sh -c ./entr... 4 days ago Paused 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
ee08fabc15cd localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687:latest /bin/sh -c ./entr... 4 days ago Paused 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
ee08fabc15cd localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687:latest /bin/sh -c ./entr... 4 days ago Paused 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
ee08fabc15cd localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687:latest /bin/sh -c ./entr... 4 days ago Paused 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
ee08fabc15cd localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687:latest /bin/sh -c ./entr... 4 days ago Paused 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
ee08fabc15cd localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687:latest /bin/sh -c ./entr... 4 days ago Paused 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
ee08fabc15cd localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687:latest /bin/sh -c ./entr... 4 days ago Paused 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
ee08fabc15cd localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687:latest /bin/sh -c ./entr... 4 days ago Up 25 minutes 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
30.75% 348.8MB / 16.63GB
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
ee08fabc15cd localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687:latest /bin/sh -c ./entr... 4 days ago Up 26 minutes 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
31.08% 347.7MB / 16.63GB
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
ee08fabc15cd localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687:latest /bin/sh -c ./entr... 4 days ago Up 26 minutes 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
31.41% 348.7MB / 16.63GB
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
ee08fabc15cd localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687:latest /bin/sh -c ./entr... 4 days ago Paused 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
Error: "exited" is not running, can't pause: container state improper
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
Error: "ee08fabc15cdfa7850b6aa358617a865953612498e5f53785334cc14e07b24bf" is not paused, can't unpause: container state improper
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
Error: "exited" is not running, can't pause: container state improper
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0"
program: podman
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
ee08fabc15cd localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687:latest /bin/sh -c ./entr... 4 days ago Exited (0) 2 minutes ago 80/tcp boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: logs boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION HOST PROXY
2.13.3.0 http://s1ral-cvmfs.openhtc.io DIRECT
******************************************************************
IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1ral-cvmfs.openhtc.io
CVMFS proxy: DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION HOST PROXY
2.13.3.0 http://s1ral-cvmfs.openhtc.io DIRECT
******************************************************************
IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1ral-cvmfs.openhtc.io
CVMFS proxy: DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION HOST PROXY
2.13.3.0 http://s1ral-cvmfs.openhtc.io DIRECT
******************************************************************
IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1ral-cvmfs.openhtc.io
CVMFS proxy: DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
job: run exitcode=0
job: diskusage=10316
job: logsize=68 k
job: times=
0m0.007s 0m0.000s
18m37.852s 0m27.413s
job: cpuusage=1145
Job Finished
Filesystem Used Use% Mounted on
cvmfs2 203K 1% /cvmfs/cvmfs-config.cern.ch
cvmfs2 65M 2% /cvmfs/alice.cern.ch
cvmfs2 32M 1% /cvmfs/grid.cern.ch
cvmfs2 709M 18% /cvmfs/sft.cern.ch
total 805M 6% -
boinc_shutdown called with exit code 0
sd_delay: 0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION HOST PROXY
2.13.3.0 http://s1ral-cvmfs.openhtc.io DIRECT
******************************************************************
IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1ral-cvmfs.openhtc.io
CVMFS proxy: DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
job: run exitcode=0
job: diskusage=10316
job: logsize=68 k
job: times=
0m0.000s 0m0.006s
18m16.987s 0m26.048s
job: cpuusage=1123
Job Finished
Filesystem Used Use% Mounted on
cvmfs2 203K 1% /cvmfs/cvmfs-config.cern.ch
cvmfs2 107M 3% /cvmfs/alice.cern.ch
cvmfs2 47M 2% /cvmfs/grid.cern.ch
cvmfs2 709M 18% /cvmfs/sft.cern.ch
total 863M 6% -
boinc_shutdown called with exit code 0
sd_delay: 0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION HOST PROXY
2.13.3.0 http://s1ral-cvmfs.openhtc.io DIRECT
******************************************************************
IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1ral-cvmfs.openhtc.io
CVMFS proxy: DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION HOST PROXY
2.13.3.0 http://s1ral-cvmfs.openhtc.io DIRECT
******************************************************************
IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1ral-cvmfs.openhtc.io
===> [runRivet] Thu Mar 5 18:50:44 UTC 2026 [boinc pp zinclusive 7000 -,-,50,130 - pythia8 8.315 default-MBR 100000 687]
CVMFS proxy: DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION HOST PROXY
2.13.3.0 http://s1ral-cvmfs.openhtc.io DIRECT
******************************************************************
IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1ral-cvmfs.openhtc.io
CVMFS proxy: DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION HOST PROXY
2.13.3.0 http://s1ral-cvmfs.openhtc.io DIRECT
******************************************************************
IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1ral-cvmfs.openhtc.io
CVMFS proxy: DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION HOST PROXY
2.13.3.0 http://s1ral-cvmfs.openhtc.io DIRECT
******************************************************************
IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1ral-cvmfs.openhtc.io
CVMFS proxy: DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
===> [runRivet] Fri Mar 6 08:49:22 UTC 2026 [boinc pp zinclusive 7000 -,-,50,130 - pythia8 8.315 default-MBR 100000 687]
===> [runRivet] Mon Mar 9 17:26:38 UTC 2026 [boinc pp zinclusive 7000 -,-,50,130 - pythia8 8.315 default-MBR 100000 687]
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION HOST PROXY
2.13.3.0 http://s1ral-cvmfs.openhtc.io DIRECT
******************************************************************
IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1ral-cvmfs.openhtc.io
CVMFS proxy: DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
job: run exitcode=0
job: diskusage=10316
job: logsize=68 k
job: times=
0m0.000s 0m0.006s
17m26.912s 0m23.664s
job: cpuusage=1071
Job Finished
Filesystem Used Use% Mounted on
cvmfs2 203K 1% /cvmfs/cvmfs-config.cern.ch
cvmfs2 193M 5% /cvmfs/alice.cern.ch
cvmfs2 121M 4% /cvmfs/grid.cern.ch
cvmfs2 710M 18% /cvmfs/sft.cern.ch
total 1.0G 7% -
boinc_shutdown called with exit code 0
sd_delay: 0
EOM
stderr from container:
EOM
stderr end
running docker command: container rm boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687_0
EOM
running docker command: image rm boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687
program: podman
command output:
Untagged: localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4910695-687:latest
EOM
2026-03-09 19:13:47 (11376): called boinc_finish(0)
</stderr_txt>
]]>
©2026 CERN