Name Theory_2922-4805300-642_1
Workunit 239152720
Created 15 Feb 2026, 9:06:56 UTC
Sent 15 Feb 2026, 11:26:08 UTC
Report deadline 26 Feb 2026, 11:26:08 UTC
Received 22 Feb 2026, 16:35:26 UTC
Server state Over
Outcome Computation error
Client state Aborted by user
Exit status 203 (0x000000CB) EXIT_ABORTED_VIA_GUI
Computer ID 10849035
Run time 2 days 22 hours 56 min 19 sec
CPU time 2 days 1 hour 26 min 26 sec
Priority 0
Validate state Invalid
Credit 0.00
Device peak FLOPS 6.95 GFLOPS
Application version Theory Simulation v302.10 (docker)
windows_x86_64
Peak working set size 14.07 MB
Peak swap size 2.61 MB
Peak disk usage 38.72 MB

Stderr output

<core_client_version>8.2.8</core_client_version>
<![CDATA[
<message>
aborted by user</message>
<stderr_txt>

command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS         PORTS       NAMES
6b67a8263081  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642:latest  /bin/sh -c ./entr...  7 days ago  Up 26 minutes  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
72.15% 334.8MB / 16.69GB
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS         PORTS       NAMES
6b67a8263081  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642:latest  /bin/sh -c ./entr...  7 days ago  Up 26 minutes  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
72.20% 335.3MB / 16.69GB
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS         PORTS       NAMES
6b67a8263081  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642:latest  /bin/sh -c ./entr...  7 days ago  Up 26 minutes  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
72.33% 334.9MB / 16.69GB
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS         PORTS       NAMES
6b67a8263081  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642:latest  /bin/sh -c ./entr...  7 days ago  Up 27 minutes  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
72.38% 335.2MB / 16.69GB
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS         PORTS       NAMES
6b67a8263081  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642:latest  /bin/sh -c ./entr...  7 days ago  Up 27 minutes  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
72.51% 335.5MB / 16.69GB
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS         PORTS       NAMES
6b67a8263081  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642:latest  /bin/sh -c ./entr...  7 days ago  Up 27 minutes  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
72.63% 335.1MB / 16.69GB
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS      PORTS       NAMES
6b67a8263081  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642:latest  /bin/sh -c ./entr...  7 days ago  Paused      80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS         PORTS       NAMES
6b67a8263081  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642:latest  /bin/sh -c ./entr...  7 days ago  Up 27 minutes  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
72.78% 335.1MB / 16.69GB
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS      PORTS       NAMES
6b67a8263081  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642:latest  /bin/sh -c ./entr...  7 days ago  Paused      80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS         PORTS       NAMES
6b67a8263081  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642:latest  /bin/sh -c ./entr...  7 days ago  Up 28 minutes  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
72.93% 336MB / 16.69GB
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS         PORTS       NAMES
6b67a8263081  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642:latest  /bin/sh -c ./entr...  7 days ago  Up 28 minutes  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
72.98% 336.6MB / 16.69GB
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS         PORTS       NAMES
6b67a8263081  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642:latest  /bin/sh -c ./entr...  7 days ago  Up 28 minutes  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
73.09% 335.6MB / 16.69GB
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS         PORTS       NAMES
6b67a8263081  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642:latest  /bin/sh -c ./entr...  7 days ago  Up 28 minutes  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
73.14% 336.5MB / 16.69GB
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS         PORTS       NAMES
6b67a8263081  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642:latest  /bin/sh -c ./entr...  7 days ago  Up 28 minutes  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
73.31% 335.7MB / 16.69GB
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS         PORTS       NAMES
6b67a8263081  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642:latest  /bin/sh -c ./entr...  7 days ago  Up 29 minutes  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
73.36% 335.6MB / 16.69GB
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS         PORTS       NAMES
6b67a8263081  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642:latest  /bin/sh -c ./entr...  7 days ago  Up 29 minutes  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
73.47% 335.3MB / 16.69GB
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS         PORTS       NAMES
6b67a8263081  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642:latest  /bin/sh -c ./entr...  7 days ago  Up 29 minutes  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
73.51% 335.9MB / 16.69GB
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS         PORTS       NAMES
6b67a8263081  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642:latest  /bin/sh -c ./entr...  7 days ago  Up 29 minutes  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
73.62% 335.3MB / 16.69GB
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS         PORTS       NAMES
6b67a8263081  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642:latest  /bin/sh -c ./entr...  7 days ago  Up 29 minutes  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
73.66% 335.9MB / 16.69GB
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS      PORTS       NAMES
6b67a8263081  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642:latest  /bin/sh -c ./entr...  7 days ago  Paused      80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS         PORTS       NAMES
6b67a8263081  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642:latest  /bin/sh -c ./entr...  7 days ago  Up 30 minutes  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
73.79% 335.9MB / 16.69GB
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS         PORTS       NAMES
6b67a8263081  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642:latest  /bin/sh -c ./entr...  7 days ago  Up 30 minutes  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
73.90% 335.2MB / 16.69GB
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED     STATUS         PORTS       NAMES
6b67a8263081  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642:latest  /bin/sh -c ./entr...  7 days ago  Up 30 minutes  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
74.00% 336MB / 16.69GB
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
got abort request from client
running docker command: kill boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: logs boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1ral-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1ral-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
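The hint banner above points to the project forum for proxy setup. For reference, the CVMFS client reads its proxy setting from /etc/cvmfs/default.local, so a minimal local-proxy configuration might look like the sketch below. This is a hypothetical example, not taken from this log; 127.0.0.1:3128 is the conventional Squid listen address and is an assumption here.

```shell
# /etc/cvmfs/default.local -- hypothetical example, not from this log.
# Point CVMFS at a local caching proxy, falling back to DIRECT if it is down.
# 127.0.0.1:3128 (the conventional Squid port) is an assumption.
CVMFS_HTTP_PROXY="http://127.0.0.1:3128;DIRECT"
```

After editing, `cvmfs_config reload` applies the change, and `cvmfs_config stat` should then report the proxy instead of DIRECT. For this containerized application, the forum threads linked above remain the authoritative guide to where the setting actually belongs.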
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1ral-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1ral-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
[... the proxy warning, CVMFS probe, and "IMPORTANT HINT(S)" block above repeats for each subsequent job cycle; the output is identical except that the CVMFS server alternates between http://s1ral-cvmfs.openhtc.io and http://s1cern-cvmfs.openhtc.io ...]
EOM
stderr from container:
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1ral-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1ral-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
[... the same proxy warning, CVMFS probe, and "IMPORTANT HINT(S)" block repeats for each job cycle in the container stderr as well, again identical except that the CVMFS server alternates between http://s1ral-cvmfs.openhtc.io and http://s1cern-cvmfs.openhtc.io ...]
EOM
stderr end
running docker command: container rm boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642_1
EOM
running docker command: image rm boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642
program: podman
command output:
Untagged: localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805300-642:latest
EOM

</stderr_txt>
]]>


©2026 CERN