Name Theory_2922-4805873-678_0
Workunit 239305120
Created 21 Feb 2026, 16:22:10 UTC
Sent 22 Feb 2026, 2:19:10 UTC
Report deadline 5 Mar 2026, 2:19:10 UTC
Received 6 Mar 2026, 1:29:51 UTC
Server state Over
Outcome Computation error
Client state Compute error
Exit status 206 (0x000000CE) EXIT_INIT_FAILURE
Computer ID 10874263
Run time 1 days 10 hours 50 min 32 sec
CPU time 13 hours 14 min 56 sec
Priority 0
Validate state Invalid
Credit 0.00
Device peak FLOPS 7.74 GFLOPS
Application version Theory Simulation v302.10 (docker)
windows_x86_64
Peak working set size 13.22 MB
Peak swap size 2.54 MB
Peak disk usage 35.40 MB

Stderr output

<core_client_version>8.2.8</core_client_version>
<![CDATA[
<message>
The filename or extension is too long.
 (0xce) - exit code 206 (0xce)</message>
<stderr_txt>
2-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED      STATUS         PORTS       NAMES
921d956ccd23  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678:latest  /bin/sh -c ./entr...  11 days ago  Up 17 minutes  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
0.19% 113.6MB / 8.269GB
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED      STATUS         PORTS       NAMES
921d956ccd23  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678:latest  /bin/sh -c ./entr...  11 days ago  Up 17 minutes  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
0.19% 114.4MB / 8.269GB
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED      STATUS      PORTS       NAMES
921d956ccd23  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678:latest  /bin/sh -c ./entr...  11 days ago  Paused      80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED      STATUS         PORTS       NAMES
921d956ccd23  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678:latest  /bin/sh -c ./entr...  11 days ago  Up 18 minutes  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
0.19% 114.9MB / 8.269GB
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED      STATUS      PORTS       NAMES
921d956ccd23  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678:latest  /bin/sh -c ./entr...  11 days ago  Paused      80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED      STATUS         PORTS       NAMES
921d956ccd23  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678:latest  /bin/sh -c ./entr...  11 days ago  Up 18 minutes  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
0.19% 113.9MB / 8.269GB
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED      STATUS      PORTS       NAMES
921d956ccd23  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678:latest  /bin/sh -c ./entr...  11 days ago  Paused      80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED      STATUS         PORTS       NAMES
921d956ccd23  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678:latest  /bin/sh -c ./entr...  11 days ago  Up 19 minutes  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: stats --no-stream  --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
0.18% 113.9MB / 8.269GB
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED      STATUS      PORTS       NAMES
921d956ccd23  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678:latest  /bin/sh -c ./entr...  11 days ago  Paused      80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
Error: "exited" is not running, can't pause: container state improper
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
Error: "921d956ccd2342f837e0bc4a8e024da8ca5c8b43f96e328c05b713ed01500584" is not paused, can't unpause: container state improper
EOM
running docker command: pause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
Error: "exited" is not running, can't pause: container state improper
EOM
running docker command: unpause boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
Error: "921d956ccd2342f837e0bc4a8e024da8ca5c8b43f96e328c05b713ed01500584" is not paused, can't unpause: container state improper
EOM
running docker command: ps --all -f "name=boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0"
program: podman
command output:
CONTAINER ID  IMAGE                                                                         COMMAND               CREATED      STATUS                      PORTS       NAMES
921d956ccd23  localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678:latest  /bin/sh -c ./entr...  11 days ago  Exited (206) 7 seconds ago  80/tcp      boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: logs boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1bnl-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1bnl-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1bnl-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1bnl-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1bnl-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1bnl-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1bnl-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1bnl-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1bnl-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1bnl-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1bnl-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1bnl-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1bnl-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1bnl-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1bnl-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1bnl-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1bnl-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1bnl-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1bnl-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1bnl-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1bnl-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1bnl-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1bnl-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1bnl-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1bnl-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1bnl-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1bnl-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1bnl-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1bnl-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1bnl-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1bnl-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1bnl-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... Failed!
Probing /cvmfs/cvmfs-config.cern.ch... Failed!
Probing /cvmfs/grid.cern.ch... Failed!
Probing /cvmfs/sft.cern.ch... Failed!
Probing CVMFS repositories failed
Filesystem      Used Use% Mounted on
cvmfs2          398K   1% /cvmfs/cvmfs-config.cern.ch
cvmfs2          343M   9% /cvmfs/alice.cern.ch
cvmfs2          240M   6% /cvmfs/grid.cern.ch
cvmfs2          778M  20% /cvmfs/sft.cern.ch
total           1.4G   9% -
boinc_shutdown called with exit code 206
sd_delay: 883
ETA: 2026-03-06 01:25:39 UTC

EOM
stderr from container:
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1bnl-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1bnl-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1bnl-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1bnl-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1bnl-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1bnl-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1bnl-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1bnl-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1bnl-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1bnl-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1bnl-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1bnl-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1bnl-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1bnl-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1bnl-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1bnl-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1bnl-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1bnl-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1bnl-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1bnl-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1bnl-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1bnl-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1bnl-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1bnl-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1bnl-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1bnl-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1bnl-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1bnl-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1bnl-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1bnl-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION   HOST                           PROXY
2.13.3.0  http://s1bnl-cvmfs.openhtc.io  DIRECT
******************************************************************
                        IMPORTANT HINT(S)!
******************************************************************
CVMFS server: http://s1bnl-cvmfs.openhtc.io
CVMFS proxy:  DIRECT
No local HTTP proxy found.
With this setup concurrently running containers can't share
a common CVMFS cache. A local HTTP proxy is therefore
highly recommended.
More info how to configure a local HTTP proxy:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5473
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5474
******************************************************************
Environment HTTP proxy: not set
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not find a local HTTP proxy
CVMFS and Frontier will have to use DIRECT connections
This makes the application less efficient
It also puts higher load on the project servers
Setting up a local HTTP proxy is highly recommended
Advice can be found in the project forum
Using custom CVMFS.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... Failed!
Probing /cvmfs/cvmfs-config.cern.ch... Failed!
Probing /cvmfs/grid.cern.ch... Failed!
Probing /cvmfs/sft.cern.ch... Failed!
Probing CVMFS repositories failed
Filesystem      Used Use% Mounted on
cvmfs2          398K   1% /cvmfs/cvmfs-config.cern.ch
cvmfs2          343M   9% /cvmfs/alice.cern.ch
cvmfs2          240M   6% /cvmfs/grid.cern.ch
cvmfs2          778M  20% /cvmfs/sft.cern.ch
total           1.4G   9% -
boinc_shutdown called with exit code 206
sd_delay: 883
ETA: 2026-03-06 01:25:39 UTC

EOM
stderr end
running docker command: container rm boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
program: podman
command output:
boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678_0
EOM
running docker command: image rm boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678
program: podman
command output:
Untagged: localhost/boinc__lhcathome.cern.ch_lhcathome__theory_2922-4805873-678:latest
EOM
2026-03-05 20:25:48 (38240): called boinc_finish(206)

</stderr_txt>
]]>


©2026 CERN