Message boards :
Number crunching :
ATLAS Native, CVMFS and Apptainer with Ubuntu 22.04.1
Joined: 2 Sep 04, Posts: 455, Credit: 209,773,883, RAC: 14,582
I wanted to set up a new system for ATLAS Native with Ubuntu 22.04.1. Somewhere I had seen several questions about whether Apptainer / CVMFS work with Ubuntu 22.04.1, but I never found an answer. So I upgraded a working machine to Ubuntu 22.04.1, and for me it looks as if everything is working fine:

- CVMFS reports ok
- Apptainer reports ok
- A HITS file is produced

So, is there more that I should check? Or can someone check my results from this box: https://lhcathome.cern.ch/lhcathome/show_host_detail.php?hostid=10813571

Thanks in advance,
Yeti

Supporting BOINC, a great concept !
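For anyone repeating this check on a fresh Ubuntu 22.04 install, the three items above can be probed from a shell. This is a minimal sketch, not the official procedure: it assumes the standard `cvmfs` and `apptainer` packages, and the repository path in the comments is the one quoted from the ATLAS task logs further down this thread. The script itself only reports what is installed, so it runs safely anywhere.

```shell
#!/bin/sh
# Minimal sanity sketch for an ATLAS native host.
# Each check degrades gracefully, so the script succeeds even on a bare box.
out=""
for tool in cvmfs_config apptainer; do
    if command -v "$tool" >/dev/null 2>&1; then
        out="${out}${tool}: installed\n"
    else
        out="${out}${tool}: NOT installed\n"
    fi
done
printf "%b" "$out"

# When both are present, probes equivalent to what the ATLAS wrapper logs are:
#   cvmfs_config probe atlas.cern.ch
#   apptainer exec -B /cvmfs \
#     /cvmfs/atlas.cern.ch/repo/containers/fs/singularity/x86_64-centos7 hostname
```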
Joined: 2 May 07, Posts: 2262, Credit: 175,581,097, RAC: 652
In production, Singularity is active, together with multiattach. Apptainer is only in use on -dev, but there is no new work there (holiday?).
Joined: 15 Jun 08, Posts: 2626, Credit: 266,266,851, RAC: 126,706
Looks fine. The most important log entry is:

HITS file was successfully produced

Beside that, there are no obvious errors at the beginning of the log. Nonetheless, if you occasionally see failed/invalid tasks with runtimes of just a few minutes, you may check whether they also fail on your wingmen's computers.
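One way to apply this advice is to scan a task's stderr output for that success marker. A sketch, using a fabricated two-line sample log (on a real host the log would be the stderr.txt of a finished task in the BOINC slot directory; the marker string is the one quoted above):

```shell
#!/bin/sh
# Build a tiny sample log, then scan it the way you would scan a real stderr.txt.
log=$(mktemp)
cat > "$log" <<'EOF'
[2022-08-30 06:15:06] Singularity works
[2022-08-30 07:02:11] HITS file was successfully produced
EOF

if grep -q "HITS file was successfully produced" "$log"; then
    result="task ok"
else
    result="no HITS file - compare with your wingmen's results"
fi
echo "$result"
rm -f "$log"
```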
Joined: 2 Sep 04, Posts: 455, Credit: 209,773,883, RAC: 14,582
> In Production is singularity active and multiattach.

From the log file in production:

[2022-08-30 06:15:06] Using singularity image /cvmfs/atlas.cern.ch/repo/containers/fs/singularity/x86_64-centos7
[2022-08-30 06:15:06] Checking for singularity binary...
[2022-08-30 06:15:06] Using singularity found in PATH at /usr/bin/singularity
[2022-08-30 06:15:06] Running /usr/bin/singularity --version
[2022-08-30 06:15:06] apptainer version 1.1.0-rc.2
[2022-08-30 06:15:06] Checking singularity works with /usr/bin/singularity exec -B /cvmfs /cvmfs/atlas.cern.ch/repo/containers/fs/singularity/x86_64-centos7 hostname
[2022-08-30 06:15:06] mannivl22
[2022-08-30 06:15:06] Singularity works

From the log file in DEV:

[2022-08-29 23:17:50] Using apptainer image /cvmfs/atlas.cern.ch/repo/containers/fs/singularity/x86_64-centos7
[2022-08-29 23:17:50] Checking for apptainer binary...
[2022-08-29 23:17:50] Using apptainer found in PATH at /usr/bin/apptainer
[2022-08-29 23:17:50] Running /usr/bin/apptainer --version
[2022-08-29 23:17:50] apptainer version 1.1.0-rc.2
[2022-08-29 23:17:50] Checking apptainer works with /usr/bin/apptainer exec -B /cvmfs /cvmfs/atlas.cern.ch/repo/containers/fs/singularity/x86_64-centos7 hostname
[2022-08-29 23:17:50] mannivl22
[2022-08-29 23:17:50] apptainer works

My boxes sporadically get 1 WU in DEV.

Supporting BOINC, a great concept !
Joined: 15 Jun 08, Posts: 2626, Credit: 266,266,851, RAC: 126,706
The check for the command is here:

[2022-08-30 06:15:06] Running /usr/bin/singularity --version

It returns:

[2022-08-30 06:15:06] apptainer version 1.1.0-rc.2

This means your system is running Apptainer. The line:

[2022-08-30 06:15:06] Singularity works

is a hardcoded string that the script prints to the log. It could also print "Wuppdibragglkennsdmined works" and it would still run Apptainer. On -dev David tests an updated script that prints the hardcoded string "apptainer works".

At the moment "/usr/bin/singularity" and "/usr/bin/apptainer" start the very same program, but since the name "singularity" will sooner or later disappear, all scripts using it need to be modified to use "apptainer".

Regarding the lack of tasks on -dev: yes, they are intentionally very small, and they are for testing the process rather than for productive work.
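The point that both names start the very same program can be illustrated with a throwaway symlink. This is a stand-in for the real compatibility link the apptainer package ships (an assumption about the packaging); the fake binary and temp directory are fabricated for the demo, and the version string is copied from the logs above:

```shell
#!/bin/sh
# Fake "apptainer" binary plus a "singularity" compat symlink in a temp dir.
tmp=$(mktemp -d)
printf '#!/bin/sh\necho "apptainer version 1.1.0-rc.2"\n' > "$tmp/apptainer"
chmod +x "$tmp/apptainer"
ln -s "$tmp/apptainer" "$tmp/singularity"

# Calling it by the old name still runs apptainer, exactly as in the task log:
ver=$("$tmp/singularity" --version)
echo "$ver"

# readlink -f reveals where the old name really points:
readlink -f "$tmp/singularity"
rm -rf "$tmp"
```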
Joined: 2 May 07, Posts: 2262, Credit: 175,581,097, RAC: 652
@David Cameron: I have reverted back to v2.87. Many tasks were failing with errors creating temporary files like this:

Failed to execute payload:mktemp: failed to create file via template '/tmp/asetup_XXXXXX.sh': Read-only file system

which may be related to the change in the way directories are mounted in the container. I'm investigating.
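That error means mktemp could not write its template into /tmp inside the container. A sketch of the equivalent call, run on a host where /tmp is writable; the bind-mount line in the trailing comment is an assumption about what the wrapper's command would look like, not the actual fix under investigation:

```shell
#!/bin/sh
# The payload runs something equivalent to this. Inside a container whose /tmp
# is mounted read-only, this exact call fails with "Read-only file system".
f=$(mktemp /tmp/asetup_XXXXXX.sh)
echo "created: $f"
rm -f "$f"

# A writable bind mount for /tmp would be requested like this (hypothetical):
#   apptainer exec -B /cvmfs,/home,/tmp IMAGE sh start_atlas.sh
```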
Joined: 15 Jun 08, Posts: 2626, Credit: 266,266,851, RAC: 126,706
<not_really_serious>
1st valid that was processed under the new container app clone:
https://lhcathome.cern.ch/lhcathome/result.php?resultid=364616724

[2022-08-30 09:23:09] CVMFS is ok
[2022-08-30 09:23:09] Using wuppdibragglkennsdmined image /cvmfs/atlas.cern.ch/repo/containers/fs/wuppdibragglkennsdmined/x86_64-centos7
[2022-08-30 09:23:09] Checking for wuppdibragglkennsdmined binary...
[2022-08-30 09:23:09] Using wuppdibragglkennsdmined found in PATH at /usr/bin/wuppdibragglkennsdmined
[2022-08-30 09:23:09] Running /usr/bin/wuppdibragglkennsdmined --version
[2022-08-30 09:23:09] wuppdibragglkennsdmined version 1.0.2-1.2
[2022-08-30 09:23:09] Checking wuppdibragglkennsdmined works with /usr/bin/wuppdibragglkennsdmined exec -B /cvmfs /cvmfs/atlas.cern.ch/repo/containers/fs/wuppdibragglkennsdmined/x86_64-centos7 hostname
[2022-08-30 09:23:09] s4
[2022-08-30 09:23:09] wuppdibragglkennsdmined works
[2022-08-30 09:23:09] Set ATHENA_PROC_NUMBER=2
[2022-08-30 09:23:09] Starting ATLAS job with PandaID=5577578530
[2022-08-30 09:23:09] Running command: /usr/bin/wuppdibragglkennsdmined exec --pwd /home/boinc4/BOINC_ATLAS/slots/0 -B /cvmfs,/home /cvmfs/atlas.cern.ch/repo/containers/fs/wuppdibragglkennsdmined/x86_64-centos7 sh start_atlas.sh
</not_really_serious>
Joined: 2 May 07, Posts: 2262, Credit: 175,581,097, RAC: 652
In English it's the same name "Kindergarten". |
©2025 CERN