1) Questions and Answers : Unix/Linux : Running virtualbox on LHC (Message 48417)
Posted 10 Aug 2023 by stratos412
Post:
@computezrmle

I had boinc-client and boinc-manager on Linux Mint 20.3 (BOINC packages installed via the terminal, following the steps on the BOINC setup page).
Those were working fine, and I also had virtualization working on the Linux PCs.
After the upgrade to Linux Mint 21, guess what: BOINC stopped working. I just followed the steps for the upgrade and didn't take any shortcuts.
This is the report from System Reports. (I don't have a clue what's going on. I tried some things from Linux Mint forum posts referring to the BOINC software, but they didn't help.)
The last thing I am thinking of doing is running BOINC in a virtual machine, trying some Linux distributions, and seeing what happens.


PID: 20254 (boincmgr)
UID: 1000 (stratos)
GID: 1000 (stratos)
Signal: 4 (ILL)
Timestamp: Sat 2023-08-05 18:27:44 EEST (36s ago)
Command Line: boincmgr
Executable: /usr/bin/boincmgr
Control Group: /user.slice/user-1000.slice/session-c3.scope
Unit: session-c3.scope
Slice: user-1000.slice
Session: c3
Owner UID: 1000 (stratos)
Boot ID: 7aee6ec52e504d82a32073f253d50ee4
Machine ID: aaa642a1efb049fba8d6ca0c8d22ff0a
Hostname: SDMINT64
Storage: /var/lib/systemd/coredump/core.boincmgr.1000.7aee6ec52e504d82a32073f253d50ee4.20254.1691249264000000.zst (present)
Disk Size: 1.3M
Message: Process 20254 (boincmgr) of user 1000 dumped core.

Found module linux-vdso.so.1 with build-id: 3da8dece167cc88c99d5493d8a0cc6c18bc24e47
Found module UTF-32.so with build-id: 62026277c3fc604541f4d1facbc5be96a0e0acc0
Found module libsasl2.so.2 with build-id: b8a924f277180f8743a0c6463a379b1a4ad1aae5
Found module libp11-kit.so.0 with build-id: a0ffe1d002de5812dc718186172efb78604ddf2c
Found module libcrypto.so.3 with build-id: af9d9ce956e6c1589bc9ceeb36c8c693efca776a
Found module libgmp.so.10 with build-id: f110719303ddbea25a5e89ff730fec520eed67b0
Found module libhogweed.so.6 with build-id: 3cc4a3474de72db89e9dcc93bfb95fe377f48c37
Found module liblber-2.5.so.0 with build-id: 3ed97b9c7019fe0f810db723481020089b14f9d5
Found module libldap-2.5.so.0 with build-id: 64c7706f152e87eb8f4d84b59eb488b8448eea0e
Found module libgnutls.so.30 with build-id: ededb83d3498ef77e1f7011b79f025efd7eb498f
Found module libnettle.so.8 with build-id: 3d9c6bf106ef53d625b7b1c8bb1300e84598a74a
Found module libssh.so.4 with build-id: 79d65cbf334a121bab20865a9612eea088c1566a
Found module librtmp.so.1 with build-id: 9517ef375cd71ea3da824b4118f1599735093d66
Found module libnghttp2.so.14 with build-id: 90a67111383c58bfff9fac96da818cc62e5b68c9
Found module libcurl-gnutls.so.4 with build-id: cf9564321cd0baf433dd5d24ec5fdcbc33082635
Found module libdebuginfod.so.1 with build-id: d29aedf4071eec65900c14478ce6f0734b37903f
Found module libresolv.so.2 with build-id: 7fd7253c61aa6fce2b7e13851c15afa14a5ab160
Found module libkeyutils.so.1 with build-id: ff27227afa5eeddccab180dd29bd7fcff94aea7c
Found module libmd.so.0 with build-id: cd2d2f71b3967ebde30e2aa43b8eb63339020c06
Found module libkrb5support.so.0 with build-id: 85c1fccae74910b1afbe878af2202ec6139d8fc2
Found module libcom_err.so.2 with build-id: ce0901f10854b3c9276066b98d9a72303206e0d5
Found module libk5crypto.so.3 with build-id: 8bc1e44d4148b2b533d5a97335114565d94197f8
Found module libkrb5.so.3 with build-id: 62434c49e8118c49a9d60a0795705c806524782d
Found module libidn2.so.0 with build-id: 45b73e0e1c46a76be22f572ee98c60af5768bf8f
Found module libunistring.so.2 with build-id: ca5149da8d5a298b8f286ffca3d6e2402ec0fe01
Found module libudev.so.1 with build-id: 9183eed17e70543d81d9cddda160df12a445f94a
Found module libGLX.so.0 with build-id: ac8b68a74f1ead77477f89bc98998ecb064e3ae5
Found module libGLdispatch.so.0 with build-id: 19c339ecd74c020f1db1342213a07114f4baf5e0
Found module libbz2.so.1.0 with build-id: e56b62c27bcc7ace8f9be36b255bd7b31bfde405
Found module libelf.so.1 with build-id: 0eaf2d056fb292c3da2d99fa16c13d0ec798f121
Found module libbrotlicommon.so.1 with build-id: 43a72967cf84155914c8b3e915926733d1e57c11
Found module libbsd.so.0 with build-id: 9a6c72469251e2feb63e175ef5cb944ce6e00df3
Found module libXdmcp.so.6 with build-id: 6b60f99504aa1d3999ea02a14366d1a39d6c5dcf
Found module libXau.so.6 with build-id: 7089b383cacbfc1760634a3be19a923e51fe3315
Found module libdatrie.so.1 with build-id: 128b6874a47f2b783d9e9060d3caaee4110bfd3d
Found module libevdev.so.2 with build-id: 453d5adc374cf78a17327783c6971a44b0e35a9c
Found module libgssapi_krb5.so.2 with build-id: a05177e3a955af79b999bbc081b0f7bf9fb21c87
Found module libpsl.so.5 with build-id: 2b1afc1a3bc8bdb016e432c50db058632e7895b9
Found module libgudev-1.0.so.0 with build-id: b8325dee54d53266618de95232d1755edea29006
Found module libX11-xcb.so.1 with build-id: f0a537068940d282177d86e6ac358fc7ba5dad97
Found module libEGL.so.1 with build-id: 236d96c92ee2914a0e90e06e01b79dfcba0f7b41
Found module libGL.so.1 with build-id: fe7c476406e1e41b511089398540d618177a7dcb
Found module liborc-0.4.so.0 with build-id: 5a67015c3a49d05abd48d44529240d3c5be7b21d
Found module libdw.so.1 with build-id: 617605522f344006b53d0ebd33b69527098c5fce
Found module libunwind.so.8 with build-id: 7535e1d6fc2959b541329a7cd113164deacf5b8c
Found module libgpg-error.so.0 with build-id: 3fbec71c67bee60d8aef00697ee187079b0fb307
Found module libbrotlidec.so.1 with build-id: 4b1f390dd6e24d49684db8b2443d082379e8e977
Found module libwoff2common.so.1.0.2 with build-id: 081c598dfcd160d49d940edd395af8a5e636829b
Found module libicudata.so.70 with build-id: b1c2496dd0543023c7a19c961bb7f3abc818f465
Found module libcap.so.2 with build-id: b4bf900abf14aabe12d90988ceb30888acb2bcb0
Found module liblz4.so.1 with build-id: a85971851cd059f1af80d553c8e7170d42ec59a1
Found module libatomic.so.1 with build-id: 7f5d0a270ff82aad3a38cd529c40c8f1353848cc
Found module libpcre2-8.so.0 with build-id: 184a841c55fb7fe5e3873fcda8368c71016cd54c
Found module libblkid.so.1 with build-id: cdf95a964e3302bb356fefc4b801fae8c4340b31
Found module libexpat.so.1 with build-id: d212d1f61d04a1e60fccad1a8c118428cfd9be42
Found module libgraphite2.so.3 with build-id: 5ffbc76fc948f6b88d766a7210c2e6a329a6c278
Found module libatspi.so.0 with build-id: 2843c68233d5ba81da3d6bc31422e49472873dba
Found module libdbus-1.so.3 with build-id: 63e8b99215502138cb63afd6d65851a5e837ed49
Found module libdeflate.so.0 with build-id: 702adff4f2f7536b32bba66ecaab25f470674927
Found module libjbig.so.0 with build-id: 5ae70eb022297d6be039f37f3005fa9be544d394
Found module liblzma.so.5 with build-id: b85da6c48eb60a646615392559483b93617ef265
Found module libzstd.so.1 with build-id: 5d9d0d946a3154a748e87e17af9d14764519237b
Found module libuuid.so.1 with build-id: 64c0d0cb22fa2bdeca075a0c0418ba5ff314b220
Found module libICE.so.6 with build-id: cf39da2f7c723f976c6e676704e218513e2b0b2b
Found module libXrender.so.1 with build-id: 7ccbfa4c24e93c42fa50dd2e42fa277630f9650c
Found module libxcb-render.so.0 with build-id: cb521131fd3b0f2ee6056cbc2014b3b8ef0d5c0e
Found module libxcb.so.1 with build-id: 1bef862a339557aa16c34c7a4b27f8f3aea90517
Found module libxcb-shm.so.0 with build-id: edb24ef4079aa423edcc50a3bb0dfb912fe8a57a
Found module libpixman-1.so.0 with build-id: 5e936cdac032b6048d9d1a8c0bb0e4a10c86d48c
Found module libthai.so.0 with build-id: afa54530349e68380815d606d15dbfda8952799f
Found module libXext.so.6 with build-id: 9fb1880e02dfa11a8c39cd1a170109de08302059
Found module libwayland-cursor.so.0 with build-id: abcd1ab467757354ac36fd0938b82d47d7aebe1f
Found module libxkbcommon.so.0 with build-id: a4b17d939092101dc8f6b2a1d70eaf1fddb2dd51
Found module libXcursor.so.1 with build-id: d936a5db46d8babb0f2cc490df36b6b18a16d8aa
Found module libXrandr.so.2 with build-id: 069f930a2b41f3908a1a92b1a51c38bd13a559e2
Found module libXinerama.so.1 with build-id: 9823bebfa26a681265db4aeb09abf44deec38401
Found module libdrm.so.2 with build-id: 9c7cb19295d20e515902cb0710326a0b8d6394c8
Found module libgbm.so.1 with build-id: 246f671fbf3db74dfff3a65f6fe724b9bbdf5596
Found module libseccomp.so.2 with build-id: 5e29725d7f0bd8cb9a04f40eb45d6b75ca6bfbd2
Found module libmanette-0.2.so.0 with build-id: 6c98dd21d30aa2213d766bf2c3a5501abb56265e
Found module libwayland-client.so.0 with build-id: a85980f4efa33fa5cadfbf257349fef99d499427
Found module libwayland-egl.so.1 with build-id: e13dfac5961e8fb58c54984fe8110a73c2f5ef99
Found module libwayland-server.so.0 with build-id: 75c4598a62c3c6e94868467866488cdfba049e7d
Found module libXdamage.so.1 with build-id: 8358b7625700dc247fa21fb4304cba877a28bb04
Found module libXcomposite.so.1 with build-id: 2c84db5342304ba1a76db7dd9b7a3bfaa8a716ce
Found module libhyphen.so.0 with build-id: ff1638c45b2dd10e898808e28b77c89b32c8121f
Found module libtasn1.so.6 with build-id: 2fde6ecb43c586fe4077118f771077aa1298e7ea
Found module libsecret-1.so.0 with build-id: f1bc90f2861b0a48efde601947460df81f47597b
Found module libenchant-2.so.2 with build-id: 6567af9845ef81015bb3eb15df79c7322ad55a08
Found module libsoup-2.4.so.1 with build-id: 3e41ed6c93570797554d1e1f59efe631a0e3c2c9
Found module libwebp.so.7 with build-id: 8abe271daab53b3f0663bd3bb99f9230cc75a2b1
Found module libwebpdemux.so.2 with build-id: fc57578bdb0f8d362687e2551f822265f252703a
Found module libopenjp2.so.7 with build-id: 8fca055b4b5787b0db2397d2dfe8de5b6fbf53f2
Found module libgstfft-1.0.so.0 with build-id: cf35c05628560c1bf60db49276de3880203c124d
Found module libgstgl-1.0.so.0 with build-id: c80520d5cadedac0de8bccea7f21f2f56af1c02f
Found module libgstvideo-1.0.so.0 with build-id: 0d8831ac2aee694b1bb4ef38721a2e74263a6ff2
Found module libgsttag-1.0.so.0 with build-id: 3a8a8cd1ad91e9362539a84e3f27044a8214321a
Found module libgstaudio-1.0.so.0 with build-id: 7df3f3fbf0095fcb07d4d963142e86cc82fe5ceb
Found module libgstpbutils-1.0.so.0 with build-id: 84da637f9861cde09ae30956225d16aaec660243
Found module libgstreamer-1.0.so.0 with build-id: ebda18376729c74f804450ea604d98752468b65a
Found module libgstbase-1.0.so.0 with build-id: 32e2a77fe22377f118bcf70c01c1e8549d78e7a1
Found module libgstapp-1.0.so.0 with build-id: 465eb331d56ea1813e7ff06b809043405f689c0f
Found module libgstallocators-1.0.so.0 with build-id: 217516c638729c7fa73bfeabe833e8e3da640834
Found module libgcrypt.so.20 with build-id: 60a5e524de0ed8323edf33e9eb9127a9eee02359
Found module libharfbuzz-icu.so.0 with build-id: a095e684d573592116a17180c207981cfd7c0c08
Found module libfreetype.so.6 with build-id: bc6c65a19e6f75fea5e74a7fd6c0b91182e1a8b0
Found module libwoff2dec.so.1.0.2 with build-id: 7ee907a44c16fed44822070c07e0dcbeb33754a4
Found module liblcms2.so.2 with build-id: 0bda30d5d03a817e234844b5414ea1dc25dc824a
Found module libxslt.so.1 with build-id: 3076207dc96a219dbe8aa3a15613f5a6814d14b5
Found module libsqlite3.so.0 with build-id: 0f2f07c3459119c3759ed803ccf46906be78bee4
Found module libxml2.so.2 with build-id: 1cf4a22fbe15a77baca28e9c824592b8b5d852ff
Found module libicuuc.so.70 with build-id: bef3ff1d70aadd68aab07d858a759360c8b919ae
Found module libsystemd.so.0 with build-id: e45f7492c0f62251620378d7224ad0371a8d1f98
Found module libicui18n.so.70 with build-id: ff2dbcdd92cbe5a63d20291e295d8fed9f87d35b
Found module libjavascriptcoregtk-4.0.so.18 with build-id: 0964d187a28ad25911e253b2bca5f49ad48c8a3e
Found module libWPEBackend-fdo-1.0.so.1 with build-id: 67724f32e7943a136bd5c419c9906a86e7e38795
Found module libwpe-1.0.so.1 with build-id: 08dbb3ef676b87569c3e0a13cc55e3848a659527
Found module libselinux.so.1 with build-id: 6fa53202ce676297de24873c886443b2759bfd8a
Found module libmount.so.1 with build-id: eeb33f2b4b9c3eb0a29575eb9932ef08663bd836
Found module ld-linux-x86-64.so.2 with build-id: 61ef896a699bb1c2e4e231642b2e1688b2f1a61e
Found module libpcre.so.3 with build-id: 3982f316c887e3ad9598015fa5bae8557320476a
Found module libffi.so.8 with build-id: 59c2a6b204f74f358ca7711d2dfd349d88711f6a
Found module libfontconfig.so.1 with build-id: 0bb435fdd5ec37178e14ea03bb36f779a4b72a94
Found module libharfbuzz.so.0 with build-id: 3b9c495c079286b8d1f55d0937a0a771593eb7e6
Found module libpangoft2-1.0.so.0 with build-id: fc7f13d8298f5e10fb1acbaa5472ec8b6fbf2bd9
Found module libfribidi.so.0 with build-id: 6e075a666e1da8ffdb948d734e75d82b1b6dc0fb
Found module libepoxy.so.0 with build-id: 5ea53a2b100e4b044eee19d5222881a724abf046
Found module libatk-bridge-2.0.so.0 with build-id: 7e8009077fbdcd7bf094c51bf78742b96f216d73
Found module libatk-1.0.so.0 with build-id: b93088667fbd06f6b72d273403d352e7c0554698
Found module libcairo-gobject.so.2 with build-id: 71b5bd37d77ea1862d2ed00e9f6ead482f307db8
Found module libXfixes.so.3 with build-id: a9c550a40b8154a3b4b5e2ac182bb50c013c3f18
Found module libXi.so.6 with build-id: 8ff5a3ac871a90fd9d0a7917c61f748a41c6b5ee
Found module libgmodule-2.0.so.0 with build-id: 8b369a368c3070d179ddf64724ffc229c3f214b5
Found module libz.so.1 with build-id: 30840b79ac329ecbf1dec0bb60180eed256d319f
Found module libtiff.so.5 with build-id: ac76776cbb1c36670c833fdf62b86ba5f86ef9ab
Found module libjpeg.so.8 with build-id: c54abff9294357e28532a76a049a4cb2542fc15b
Found module libpng16.so.16 with build-id: d58bf7c11ac793d528926238d831792b5ef792cf
Found module libSM.so.6 with build-id: 6b32192c8a8870a8fe04403f537e806da93a1dd8
Found module libpangocairo-1.0.so.0 with build-id: 65987e60b791e0eb6231575b8d5cfd33a6379b22
Found module libX11.so.6 with build-id: d1d3345a252a40a004cbd02011a651930e172ccd
Found module libcairo.so.2 with build-id: 60a39c3684e41370bd0a59ed1ecbdccf47e30069
Found module libpango-1.0.so.0 with build-id: 42c8896c53d9d22dc73ba7a78b326d61f34e4442
Found module libgdk-3.so.0 with build-id: 97b768ef80858a79741be2492754d948989e63c6
Found module libwebkit2gtk-4.0.so.37 with build-id: 3840b4b21f7b1b13449507038b6622968bd5fc79
Found module libgio-2.0.so.0 with build-id: 07bd46a1bb58e321e6aabc67135d054e6b78069d
Found module libgdk_pixbuf-2.0.so.0 with build-id: 374b383e3b68b5d8b552519094129f401596e502
Found module libc.so.6 with build-id: 69389d485a9793dbe873f0ea2c93e02efaa9aa3d
Found module libgcc_s.so.1 with build-id: e3a44e0da9c6e835d293ed8fd2882b4c4a87130c
Found module libm.so.6 with build-id: 27e82301dba6c3f644404d504e1bb1c97894b433
Found module libstdc++.so.6 with build-id: e37fe1a879783838de78cbc8c80621fa685d58a2
Found module libglib-2.0.so.0 with build-id: c74e800dfd5f72649d673b44292f4a817e45150b
Found module libgobject-2.0.so.0 with build-id: 7c47809b4e688382aab4127a2e07496450c5e6b0
Found module libgtk-3.so.0 with build-id: 9a340345f5e200f42140f3cb4bacb407e91843f8
Found module libwx_baseu-3.0.so.0 with build-id: 2c598c22915c07b7ea2a4f67cab1fb81fbb0f4ce
Found module libwx_baseu_net-3.0.so.0 with build-id: 586503861773eff97e8ecbd2fa142c6446a01c76
Found module libwx_gtk3u_core-3.0.so.0 with build-id: 63d405584aa1e3c4cfa135a633bdcb3b045b2904
Found module libwx_gtk3u_adv-3.0.so.0 with build-id: 7bfef3f10aa7cf0da88cc168376384ef2975c035
Found module libwx_gtk3u_html-3.0.so.0 with build-id: 7b898e601274c93c946a10bbe83254f73f0fd7fb
Found module libwx_gtk3u_webview-3.0.so.0 with build-id: e210c63096a5fb1dd97e2fb9b1a04ef38fa0e0bf
Found module libnotify.so.4 with build-id: 9f17eb75ce0087dc0e9e2049eeb47d37883006e3
Found module libboinc.so.7 with build-id: 80df7dbe2305c2f0831cf490c8953de704fb58e3
Found module boincmgr with build-id: 313b3a7d3ad4e5b5a9786e2b58809d7ca509109c
Stack trace of thread 20254:
#0 0x00005646f752f55d n/a (boincmgr + 0x10055d)
#1 0x00005646f7539053 n/a (boincmgr + 0x10a053)
#2 0x00007efe5d40febb call_init (libc.so.6 + 0x29ebb)
#3 0x00005646f753c745 n/a (boincmgr + 0x10d745)
===================================================================
GDB Log
===================================================================
[New LWP 20254]
[Thread debugging using libthread_db enabled]
Using host libthread_db library "/lib/x86_64-linux-gnu/libthread_db.so.1".
Core was generated by `boincmgr'.
Program terminated with signal SIGILL, Illegal instruction.
#0 0x00005646f752f55d in ?? ()

===================================================================
GDB Backtrace
===================================================================
[New LWP 20254]
[Thread debugging using libthread_db enabled]
Using host libthread_db library "/lib/x86_64-linux-gnu/libthread_db.so.1".
Core was generated by `boincmgr'.
Program terminated with signal SIGILL, Illegal instruction.
#0 0x00005646f752f55d in ?? ()
#0 0x00005646f752f55d in ?? ()
#1 0x00005646f7539053 in ?? ()
#2 0x00007efe5d40febb in call_init (env=<optimized out>, argv=0x7ffe11e36d88, argc=1) at ../csu/libc-start.c:145
#3 __libc_start_main_impl (main=0x5646f752f120, argc=1, argv=0x7ffe11e36d88, init=<optimized out>, fini=<optimized out>, rtld_fini=<optimized out>, stack_end=0x7ffe11e36d78) at ../csu/libc-start.c:379
#4 0x00005646f753c745 in ?? ()

===================================================================
GDB Backtrace (all threads)
===================================================================
[New LWP 20254]
[Thread debugging using libthread_db enabled]
Using host libthread_db library "/lib/x86_64-linux-gnu/libthread_db.so.1".
Core was generated by `boincmgr'.
Program terminated with signal SIGILL, Illegal instruction.
#0 0x00005646f752f55d in ?? ()

Thread 1 (Thread 0x7efe52db3b00 (LWP 20254)):
#0 0x00005646f752f55d in ?? ()
No symbol table info available.
#1 0x00005646f7539053 in ?? ()
No symbol table info available.
#2 0x00007efe5d40febb in call_init (env=<optimized out>, argv=0x7ffe11e36d88, argc=1) at ../csu/libc-start.c:145
j = 0
jm = <optimized out>
addrs = <optimized out>
l = <optimized out>
init_array = <optimized out>
#3 __libc_start_main_impl (main=0x5646f752f120, argc=1, argv=0x7ffe11e36d88, init=<optimized out>, fini=<optimized out>, rtld_fini=<optimized out>, stack_end=0x7ffe11e36d78) at ../csu/libc-start.c:379
No locals.
#4 0x00005646f753c745 in ?? ()
No symbol table info available.
#0 0x00005646f752f55d in ?? ()
#1 0x00005646f7539053 in ?? ()
#2 0x00007efe5d40febb in call_init (env=<optimized out>, argv=0x7ffe11e36d88, argc=1) at ../csu/libc-start.c:145
#3 __libc_start_main_impl (main=0x5646f752f120, argc=1, argv=0x7ffe11e36d88, init=<optimized out>, fini=<optimized out>, rtld_fini=<optimized out>, stack_end=0x7ffe11e36d78) at ../csu/libc-start.c:379
#4 0x00005646f753c745 in ?? ()
2) Questions and Answers : Unix/Linux : Running virtualbox on LHC (Message 48409)
Posted 9 Aug 2023 by stratos412
Post:
BOINC flatpak runs fine. It is installed via Software Manager.

The problem is why I keep having such a hard time.

A) Yesterday I could open the file /etc/systemd/system/multi-user.target.wants/boinc-client.service.
Today, for some reason:
"The link boinc-client.service is broken. The link cannot be used because its target /lib/systemd/system/boinc-client.service does not exist"
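A dangling unit symlink like this can usually be repaired by removing the stale link and re-enabling the service, so systemd recreates it from the packaged unit file. A minimal sketch, assuming the boinc-client package is (re)installed:

```shell
#!/bin/sh
# Returns success if the path is a dangling symlink: the link itself exists
# but its target does not.
is_dangling() {
    [ -L "$1" ] && [ ! -e "$1" ]
}

unit=/etc/systemd/system/multi-user.target.wants/boinc-client.service
if is_dangling "$unit"; then
    sudo rm "$unit"                      # drop the stale link
    sudo systemctl daemon-reload
    sudo systemctl enable --now boinc-client.service
fi
```

The guard makes the script a no-op on machines where the link is healthy or absent.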

B) I cannot run any LHC task


<core_client_version>7.22.1</core_client_version>
<![CDATA[
<message>
process exited with code 195 (0xc3, -61)</message>
<stderr_txt>
23:17:20 (78): wrapper (7.15.26016): starting
23:17:20 (78): wrapper (7.15.26016): starting
23:17:20 (78): wrapper: running ../../projects/lhcathome.cern.ch_lhcathome/cranky-0.0.32 ()
23:17:20 EEST +03:00 2023-08-09: cranky-0.0.32: [INFO] Detected Theory App
23:17:20 EEST +03:00 2023-08-09: cranky-0.0.32: [INFO] Checking CVMFS.
23:17:20 EEST +03:00 2023-08-09: cranky-0.0.32: [ERROR] 'which' could not locate the command 'cvmfs_config'.
23:17:21 (78): cranky exited; CPU time 0.019041
23:17:21 (78): app exit status: 0xce
23:17:21 (78): called boinc_finish(195)
</stderr_txt>
]]>
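The [ERROR] line above means cranky cannot find cvmfs_config. With the flatpak build of BOINC that is expected even when CVMFS is installed on the host, because the flatpak sandbox has its own filesystem and cannot see host binaries. A quick check (the app ID edu.berkeley.BOINC is an assumption; confirm it with "flatpak list"):

```shell
#!/bin/sh
# Compare the host's and the sandbox's view of cvmfs_config.  Host binaries
# are not visible inside the flatpak sandbox, which is why native LHC tasks
# fail there.  Guarded so it is a no-op where flatpak is not installed.
command -v cvmfs_config || echo "cvmfs_config not found on the host"
if command -v flatpak >/dev/null; then
    flatpak run --command=sh edu.berkeley.BOINC -c \
        'command -v cvmfs_config || echo "not visible inside the sandbox"'
fi
```

For the native (non-VirtualBox) LHC apps, the distro-packaged BOINC client, which runs unsandboxed, is the usual route.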


C) I tried all the instructions from those posts and still nothing. Is something misconfigured?

D) I still cannot understand why there is NO virtualization on the same PC with a different OS.
https://lhcathome.cern.ch/lhcathome/show_host_detail.php?hostid=10834344
https://lhcathome.cern.ch/lhcathome/show_host_detail.php?hostid=10822906
3) Questions and Answers : Unix/Linux : Running virtualbox on LHC (Message 48406)
Posted 9 Aug 2023 by stratos412
Post:
This is getting ridiculous..... :/

Yesterday I could open the file /etc/systemd/system/multi-user.target.wants/boinc-client.service.
Today, for some reason:
"The link boinc-client.service is broken. The link cannot be used because its target /lib/systemd/system/boinc-client.service does not exist"

I uninstalled BOINC (flatpak), rebooted the PC twice, and re-installed BOINC (flatpak) --> same problem.
However, BOINC starts fine and gets tasks.

Someone is laughing at me...

----------------------------
Just a short story.
This reminds me of my first touch with Linux Mint 18.3, and I believe it's the reason Linux is never going to win over the average Windows user.
(It is said to be a friendly distribution for a newbie who has used Windows all his life.)

I plugged a USB stick into the PC. It wasn't recognized.
After some internet searching I learned that I had to format the stick as ext4. So far so good.
After a couple of days, I got an error: "unable to mount usb stick. Not authorized to perform operation blah blah...."
I wondered, "What the hell? Two days ago it worked fine."
I searched the internet again, spent two days trying different things, and kept notes on what I had done in case I broke something.

In the end, I found a post suggesting that I kill any process related to google-remote-desktop!
I tried it and voila, the USB stick worked again!
I was yelling, "Why the f*** does google-remote-desktop interfere with my USB stick?!?! It doesn't make sense."

If you have to search the internet for silly things like these, something in the OS must be wrong.
I installed BOINC on a Windows machine and it didn't take more than 30 minutes to understand and configure it.
On the Linux machine, I spent whole days with no result, and after a major upgrade things got worse.
----------------------------
4) Questions and Answers : Unix/Linux : Running virtualbox on LHC (Message 48393)
Posted 8 Aug 2023 by stratos412
Post:
There is no file at /etc/systemd/system/boinc-client.service,
but there is a file at /etc/systemd/system/multi-user.target.wants/boinc-client.service:


[Unit]
Description=Berkeley Open Infrastructure Network Computing Client
Documentation=man:boinc(1)
After=network-online.target

[Service]
Type=simple
ProtectHome=true
ProtectSystem=strict
ProtectControlGroups=true
ReadWritePaths=-/var/lib/boinc -/etc/boinc-client
Nice=10
User=boinc
WorkingDirectory=/var/lib/boinc
ExecStart=/usr/bin/boinc
ExecStop=/usr/bin/boinccmd --quit
ExecReload=/usr/bin/boinccmd --read_cc_config
ExecStopPost=/bin/rm -f lockfile
IOSchedulingClass=idle
# The following options prevent setuid root as they imply NoNewPrivileges=true
# Since Atlas requires setuid root, they break Atlas
# In order to improve security, if you're not using Atlas,
# Add these options to the [Service] section of an override file using
# sudo systemctl edit boinc-client.service
#NoNewPrivileges=true
#ProtectKernelModules=true
#ProtectKernelTunables=true
#RestrictRealtime=true
#RestrictAddressFamilies=AF_INET AF_INET6 AF_UNIX
#RestrictNamespaces=true
#PrivateUsers=true
#CapabilityBoundingSet=
#MemoryDenyWriteExecute=true
#PrivateTmp=true #Block X11 idle detection

[Install]
WantedBy=multi-user.target
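As the comments in the unit note, the hardening options can be enabled without editing the packaged file, via a drop-in created with "sudo systemctl edit boinc-client.service". A sketch of such an override (skip it if you run ATLAS, since these options break its setuid requirement):

```ini
# /etc/systemd/system/boinc-client.service.d/override.conf
[Service]
NoNewPrivileges=true
ProtectKernelModules=true
ProtectKernelTunables=true
```

After saving, "sudo systemctl daemon-reload && sudo systemctl restart boinc-client.service" applies it; the drop-in survives package upgrades, unlike edits to the shipped unit.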



P.S. After all that, you could write a whole book about "how to configure BOINC and VirtualBox on Linux Mint 21."
5) Questions and Answers : Unix/Linux : Running virtualbox on LHC (Message 48391)
Posted 8 Aug 2023 by stratos412
Post:
grep -i boinc /etc/passwd

boinc:x:122:129:BOINC core client,,,:/var/lib/boinc-client:/usr/sbin/bash

That's the NEW PC client:
https://lhcathome.cern.ch/lhcathome/show_host_detail.php?hostid=10834344

P.S. I changed these two lines in /etc/passwd (I opened /etc as root and edited the passwd file with a text editor; hope I didn't do some bullsh*t):

1)
boinc:x:122:129:BOINC core client,,,:/var/lib/boinc-client:/usr/sbin/nologin to
boinc:x:122:129:BOINC core client,,,:/var/lib/boinc-client:/usr/sbin/bash

and
2)
_flatpak:x:126:136:Flatpak system-wide installation helper,,,:/nonexistent:/usr/sbin/nologin to
_flatpak:x:126:136:Flatpak system-wide installation helper,,,:/nonexistent:/usr/sbin/bash


root@SDMINT64:~# su -c "cvmfs_config probe alice && cvmfs_config stat alice && ls -hal /cvmfs/cvmfs-config.cern.ch/etc/cvmfs/" boinc
su: failed to execute /usr/sbin/bash: No such file or directory
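The failure above comes from the shell path itself: on Mint/Ubuntu, bash lives at /bin/bash, not /usr/sbin/bash. Rather than editing /etc/passwd by hand, the login shell can be changed with usermod. A sketch (restore nologin once debugging is done):

```shell
#!/bin/sh
# Change the boinc account's login shell without hand-editing /etc/passwd.
# Guarded so it is a no-op on machines that have no boinc user.
if id boinc >/dev/null 2>&1; then
    sudo usermod -s /bin/bash boinc             # for interactive debugging
    # sudo usermod -s /usr/sbin/nologin boinc   # revert afterwards
fi
```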


BOINC (flatpak) version 7.22.1 (x64) is installed.
6) Questions and Answers : Unix/Linux : Running virtualbox on LHC (Message 48389)
Posted 8 Aug 2023 by stratos412
Post:
@computezrmle.

Thanks for your patience and instructions. :)
There is something I am missing, though... :/

root@SDMINT64:~# su -c "cvmfs_config probe alice && cvmfs_config stat alice && ls -hal /cvmfs/cvmfs-config.cern.ch/etc/cvmfs/" boinc
This account is currently not available


cvmfs_config probe

Probing /cvmfs/atlas.cern.ch... OK
Probing /cvmfs/atlas-condb.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/cernvm-prod.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Probing /cvmfs/alice.cern.ch... OK

stratos@SDMINT64:~$ which cvmfs_config
/usr/bin/cvmfs_config

root@SDMINT64:~# which cvmfs_config
/usr/bin/cvmfs_config
7) Questions and Answers : Unix/Linux : Running virtualbox on LHC (Message 48364)
Posted 7 Aug 2023 by stratos412
Post:
OK, a few things.
The command 'which cvmfs_config' gave this for both root and my normal user account:
/usr/bin/cvmfs_config

For the boinc user I don't know how to switch. I suppose I have to type "sudo su boinc" in the terminal (????)
If so, I get "This account is currently not available."
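"sudo su boinc" fails because the boinc account's shell is /usr/sbin/nologin. The shell can be overridden for a single invocation instead, so no /etc/passwd edit is needed. A sketch:

```shell
#!/bin/sh
# Run one command as the boinc user despite its nologin shell.
# Guarded so it is a no-op on machines without a boinc account.
if id boinc >/dev/null 2>&1; then
    sudo su -s /bin/sh boinc -c 'which cvmfs_config'
    # equivalently, with sudo alone:
    sudo -u boinc sh -c 'cvmfs_config probe alice'
fi
```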


About the file /etc/cvmfs/config.d/cvmfs-config.cern.ch.local, this is what is inside:


# In order to bootstrap the network connection, we use a DIRECT proxy
# for the config repository only if no other proxy has been set.
CVMFS_HTTP_PROXY="${CVMFS_HTTP_PROXY:=DIRECT}"

CVMFS_CONFIG_REPO_REQUIRED=NO


About the file /etc/cvmfs/domain.d/cern.ch.local, this is what is inside:


# Don't edit here. Create /etc/cvmfs/domain.d/cern.ch.local.
# As a rule of thumb, overwrite only parameters you find in here.
# If you look for any other parameter, check /etc/cvmfs/default.(conf|local)
# and /etc/cvmfs/config.d/<your_repository>.(conf|local)
#
# Parameter files are sourced in the following order
#
# /etc/cvmfs/default.conf
# /etc/cvmfs/default.d/*.conf (in alphabetical order)
# $CVMFS_CONFIG_REPOSITORY/etc/cvmfs/default.conf (if config repository is set)
# /etc/cvmfs/default.local
#
# $CVMFS_CONFIG_REPOSITORY/etc/cvmfs/domain.d/<your_domain>.conf (if config repository is set)
# /etc/cvmfs/domain.d/<your_domain>.conf
# /etc/cvmfs/domain.d/<your_domain>.local
#
# $CVMFS_CONFIG_REPOSITORY/etc/cvmfs/config.d/<your_repository>.conf (if config repository is set)
# /etc/cvmfs/config.d/<your_repository>.conf
# /etc/cvmfs/config.d/<your_repository>.local
#
# Use cvmfs_config showconfig to get the effective parameters.
#

# The CVMFS_CONFIG_REPO_DEFAULT_ENV parameter is set in
# /cvmfs/cvmfs-config.cern.ch/etc/cvmfs/default.conf
if [ "$CVMFS_CONFIG_REPO_DEFAULT_ENV" = "" ]; then
# Use the configuration in this package only if the config repository is not
# mounted. Note that in this case the cvmfs client writes a warning to syslog
# because CVMFS_CONFIG_REPOSITORY is set.

# Stratum 1 servers for the cern.ch domain.
if [ "$CVMFS_USE_CDN" = "yes" ]; then
CVMFS_SERVER_URL="http://s1cern-cvmfs.openhtc.io/cvmfs/@fqrn@;http://s1ral-cvmfs.openhtc.io/cvmfs/@fqrn@;http://s1bnl-cvmfs.openhtc.io/cvmfs/@fqrn@;http://s1fnal-cvmfs.openhtc.io/cvmfs/@fqrn@"
else
CVMFS_SERVER_URL="http://cvmfs-stratum-one.cern.ch/cvmfs/@fqrn@;http://cernvmfs.gridpp.rl.ac.uk/cvmfs/@fqrn@;http://cvmfs-s1bnl.opensciencegrid.org/cvmfs/@fqrn@;http://cvmfs-s1fnal.opensciencegrid.org/cvmfs/@fqrn@"
fi

# Key chain with public signing keys for repositories in the cern.ch domain
CVMFS_KEYS_DIR=/etc/cvmfs/keys/cern.ch
CVMFS_CONFIG_REPO_REQUIRED=yes

# The cern.ch stratum 1 servers support the Geo-API
CVMFS_USE_GEOAPI=yes

fi



P.S.
I think I need a master's degree in computer science to understand and configure all these Mint 21 and BOINC pieces.
This is the third time that things have broken (first install of Mint 18.3, upgrade from Mint 18.3 to 20.3, and upgrade from 20.3 to 21).
8) Questions and Answers : Unix/Linux : Running virtualbox on LHC (Message 48362)
Posted 7 Aug 2023 by stratos412
Post:
OK. I followed the instructions.

1) Installed cvmfs_2.10.1~1+ubuntu22.04_amd64.deb + cvmfs-config-default_latest_all.deb from the links.
2) Ran "cvmfs_config setup" in the terminal. That required root privileges, so I did "sudo su" and then ran it.
It didn't seem to do anything, so I waited a while.
3) Created/modified the files: a) /etc/cvmfs/default.local b) /etc/cvmfs/config.d/cvmfs-config.cern.ch.local c) /etc/cvmfs/domain.d/cern.ch.local
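The steps above can be sketched as follows (file names as given in the post; this assumes the two .deb files from the linked pages are already downloaded to the current directory):

```shell
#!/bin/sh
# Install CVMFS from the downloaded packages, then run the one-time setup.
# Guarded so the sketch is a no-op when the packages are not present.
if [ -f cvmfs_2.10.1~1+ubuntu22.04_amd64.deb ]; then
    sudo dpkg -i cvmfs_2.10.1~1+ubuntu22.04_amd64.deb \
                 cvmfs-config-default_latest_all.deb
    sudo apt-get -f install          # pull in any missing dependencies
    sudo cvmfs_config setup          # needs root; prints nothing on success
    cvmfs_config probe               # verify that the repositories mount
fi
```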


Running cvmfs_config probe shows this:


Probing /cvmfs/atlas.cern.ch... OK
Probing /cvmfs/atlas-condb.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/cernvm-prod.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Probing /cvmfs/alice.cern.ch... OK


Running cvmfs_config showconfig -s atlas shows this:


CVMFS_REPOSITORY_NAME=atlas.cern.ch
CVMFS_BACKOFF_INIT=2 # from /etc/cvmfs/default.conf
CVMFS_BACKOFF_MAX=10 # from /etc/cvmfs/default.conf
CVMFS_BASE_ENV=1 # from /etc/cvmfs/default.conf
CVMFS_CACHE_BASE=/var/lib/cvmfs # from /etc/cvmfs/default.conf
CVMFS_CACHE_DIR=/var/lib/cvmfs/shared
CVMFS_CHECK_PERMISSIONS=yes # from /etc/cvmfs/default.conf
CVMFS_CLAIM_OWNERSHIP=yes # from /etc/cvmfs/default.conf
CVMFS_CLIENT_PROFILE= # from /etc/cvmfs/default.conf
CVMFS_CONFIG_REPO_DEFAULT_ENV=1 # from /cvmfs/cvmfs-config.cern.ch/etc/cvmfs/default.conf
CVMFS_CONFIG_REPO_REQUIRED= # from /etc/cvmfs/domain.d/cern.ch.local
CVMFS_CONFIG_REPOSITORY=cvmfs-config.cern.ch # from /etc/cvmfs/default.d/50-cern-debian.conf
CVMFS_DEFAULT_DOMAIN=cern.ch # from /etc/cvmfs/default.d/50-cern-debian.conf
CVMFS_FALLBACK_PROXY= # from /cvmfs/cvmfs-config.cern.ch/etc/cvmfs/domain.d/cern.ch.conf
CVMFS_HIDE_MAGIC_XATTRS=yes # from /cvmfs/cvmfs-config.cern.ch/etc/cvmfs/default.conf
CVMFS_HOST_RESET_AFTER=1800 # from /etc/cvmfs/default.conf
CVMFS_HTTP_PROXY='auto;DIRECT' # from /cvmfs/cvmfs-config.cern.ch/etc/cvmfs/domain.d/cern.ch.conf
CVMFS_KCACHE_TIMEOUT=30 # from /etc/cvmfs/default.local
CVMFS_KEYS_DIR=/cvmfs/cvmfs-config.cern.ch/etc/cvmfs/keys/cern.ch # from /etc/cvmfs/domain.d/cern.ch.local
CVMFS_LOW_SPEED_LIMIT=1024 # from /etc/cvmfs/default.conf
CVMFS_MAGIC_XATTRS_VISIBILITY=rootonly # from /cvmfs/cvmfs-config.cern.ch/etc/cvmfs/default.conf
CVMFS_MAX_RETRIES=3 # from /etc/cvmfs/default.local
CVMFS_MOUNT_DIR=/cvmfs # from /etc/cvmfs/default.conf
CVMFS_MOUNT_RW=yes # from /etc/cvmfs/default.local
CVMFS_NFILES=131072 # from /etc/cvmfs/default.conf
CVMFS_PAC_URLS='http://grid-wpad/wpad.dat;http://wpad/wpad.dat;http://cernvm-wpad.fnal.gov/wpad.dat;http://cernvm-wpad.cern.ch/wpad.dat' # from /cvmfs/cvmfs-config.cern.ch/etc/cvmfs/default.conf
CVMFS_PROXY_RESET_AFTER=300 # from /etc/cvmfs/default.conf
CVMFS_QUOTA_LIMIT=6144 # from /etc/cvmfs/default.local
CVMFS_RELOAD_SOCKETS=/var/run/cvmfs # from /etc/cvmfs/default.conf
CVMFS_REPOSITORIES=atlas,atlas-condb,grid,cernvm-prod,sft,alice # from /etc/cvmfs/default.local
CVMFS_SEND_INFO_HEADER=yes # from /cvmfs/cvmfs-config.cern.ch/etc/cvmfs/domain.d/cern.ch.conf
CVMFS_SERVER_URL='http://s1cern-cvmfs.openhtc.io/cvmfs/atlas.cern.ch;http://s1ral-cvmfs.openhtc.io/cvmfs/atlas.cern.ch;http://s1bnl-cvmfs.openhtc.io/cvmfs/atlas.cern.ch;http://s1fnal-cvmfs.openhtc.io:8080/cvmfs/atlas.cern.ch;http://s1asgc-cvmfs.openhtc.io:8080/cvmfs/atlas.cern.ch;http://s1ihep-cvmfs.openhtc.io:8080/cvmfs/atlas.cern.ch;http://s1swinburne-cvmfs.openhtc.io:8080/cvmfs/atlas.cern.ch' # from /etc/cvmfs/domain.d/cern.ch.local
CVMFS_SHARED_CACHE=yes # from /etc/cvmfs/default.conf
CVMFS_STRICT_MOUNT=no # from /etc/cvmfs/default.conf
CVMFS_TIMEOUT=5 # from /etc/cvmfs/default.conf
CVMFS_TIMEOUT_DIRECT=10 # from /etc/cvmfs/default.conf
CVMFS_USE_CDN=yes # from /etc/cvmfs/default.local
CVMFS_USE_GEOAPI=yes # from /etc/cvmfs/domain.d/cern.ch.local
CVMFS_USER=cvmfs # from /etc/cvmfs/default.conf



Running cvmfs_config showconfig -s cvmfs-config shows this:


CVMFS_REPOSITORY_NAME=cvmfs-config.cern.ch
CVMFS_BACKOFF_INIT=2 # from /etc/cvmfs/default.conf
CVMFS_BACKOFF_MAX=10 # from /etc/cvmfs/default.conf
CVMFS_BASE_ENV=1 # from /etc/cvmfs/default.conf
CVMFS_CACHE_BASE=/var/lib/cvmfs # from /etc/cvmfs/default.conf
CVMFS_CACHE_DIR=/var/lib/cvmfs/shared
CVMFS_CHECK_PERMISSIONS=yes # from /etc/cvmfs/default.conf
CVMFS_CLAIM_OWNERSHIP=yes # from /etc/cvmfs/default.conf
CVMFS_CLIENT_PROFILE= # from /etc/cvmfs/default.conf
CVMFS_CONFIG_REPO_REQUIRED=no # from /etc/cvmfs/config.d/cvmfs-config.cern.ch.local
CVMFS_CONFIG_REPOSITORY=cvmfs-config.cern.ch # from /etc/cvmfs/default.d/50-cern-debian.conf
CVMFS_DEFAULT_DOMAIN=cern.ch # from /etc/cvmfs/default.d/50-cern-debian.conf
CVMFS_HOST_RESET_AFTER=1800 # from /etc/cvmfs/default.conf
CVMFS_HTTP_PROXY='auto;DIRECT' # from /etc/cvmfs/config.d/cvmfs-config.cern.ch.local
CVMFS_KCACHE_TIMEOUT=30 # from /etc/cvmfs/default.local
CVMFS_KEYS_DIR=/etc/cvmfs/keys/cern.ch # from /etc/cvmfs/domain.d/cern.ch.local
CVMFS_LOW_SPEED_LIMIT=1024 # from /etc/cvmfs/default.conf
CVMFS_MAX_RETRIES=3 # from /etc/cvmfs/default.local
CVMFS_MOUNT_DIR=/cvmfs # from /etc/cvmfs/default.conf
CVMFS_MOUNT_RW=yes # from /etc/cvmfs/default.local
CVMFS_NFILES=131072 # from /etc/cvmfs/default.conf
CVMFS_PROXY_RESET_AFTER=300 # from /etc/cvmfs/default.conf
CVMFS_QUOTA_LIMIT=6144 # from /etc/cvmfs/default.local
CVMFS_RELOAD_SOCKETS=/var/run/cvmfs # from /etc/cvmfs/default.conf
CVMFS_REPOSITORIES=atlas,atlas-condb,grid,cernvm-prod,sft,alice # from /etc/cvmfs/default.local
CVMFS_SEND_INFO_HEADER=yes # from /etc/cvmfs/default.local
CVMFS_SERVER_URL='http://s1cern-cvmfs.openhtc.io/cvmfs/cvmfs-config.cern.ch;http://s1ral-cvmfs.openhtc.io/cvmfs/cvmfs-config.cern.ch;http://s1bnl-cvmfs.openhtc.io/cvmfs/cvmfs-config.cern.ch;http://s1fnal-cvmfs.openhtc.io/cvmfs/cvmfs-config.cern.ch' # from /etc/cvmfs/domain.d/cern.ch.local
CVMFS_SHARED_CACHE=yes # from /etc/cvmfs/default.conf
CVMFS_STRICT_MOUNT=no # from /etc/cvmfs/default.conf
CVMFS_TIMEOUT=5 # from /etc/cvmfs/default.conf
CVMFS_TIMEOUT_DIRECT=10 # from /etc/cvmfs/default.conf
CVMFS_USE_CDN=yes # from /etc/cvmfs/default.local
CVMFS_USE_GEOAPI=yes # from /etc/cvmfs/domain.d/cern.ch.local
CVMFS_USER=cvmfs # from /etc/cvmfs/default.conf



I still get errors for LHC ATLAS.

https://lhcathome.cern.ch/lhcathome/show_host_detail.php?hostid=10834339

Also, any suggestions on which version of VirtualBox to install with Mint 21 (kernel 5.15.0-78-generic x86_64)?
I currently have 6.1.38, but it doesn't seem to work with BOINC (flatpak).
9) Questions and Answers : Unix/Linux : Running virtualbox on LHC (Message 48361)
Posted 6 Aug 2023 by stratos412
Post:
@computezrmle

I will try your instructions step by step and post the results again.
I had forgotten some of them.
10) Questions and Answers : Unix/Linux : Running virtualbox on LHC (Message 48349)
Posted 5 Aug 2023 by stratos412
Post:
I also got errors.


Probing /cvmfs/atlas.cern.ch... Failed!
Probing /cvmfs/atlas-condb.cern.ch... Failed!
Probing /cvmfs/grid.cern.ch... Failed!
Probing /cvmfs/cernvm-prod.cern.ch... Failed!
Probing /cvmfs/sft.cern.ch... Failed!
Probing /cvmfs/alice.cern.ch... Failed!


Also this:


Warning: CernVM-FS map is not referenced from autofs maps (/etc/auto.master)
Warning: failed to access http://s1cern-cvmfs.openhtc.io/cvmfs/atlas.cern.ch/.cvmfspublished through proxy http://squid:3128
Warning: failed to use Geo-API with s1cern-cvmfs.openhtc.io
Warning: failed to access http://s1ral-cvmfs.openhtc.io/cvmfs/atlas.cern.ch/.cvmfspublished through proxy http://squid:3128
Warning: failed to use Geo-API with s1ral-cvmfs.openhtc.io
Warning: failed to access http://s1bnl-cvmfs.openhtc.io/cvmfs/atlas.cern.ch/.cvmfspublished through proxy http://squid:3128
Warning: failed to use Geo-API with s1bnl-cvmfs.openhtc.io
Warning: failed to access http://s1fnal-cvmfs.openhtc.io/cvmfs/atlas.cern.ch/.cvmfspublished through proxy http://squid:3128
Warning: failed to use Geo-API with s1fnal-cvmfs.openhtc.io
Warning: failed to access http://s1cern-cvmfs.openhtc.io/cvmfs/atlas-condb.cern.ch/.cvmfspublished through proxy http://squid:3128
Warning: failed to use Geo-API with s1cern-cvmfs.openhtc.io
Warning: failed to access http://s1ral-cvmfs.openhtc.io/cvmfs/atlas-condb.cern.ch/.cvmfspublished through proxy http://squid:3128
Warning: failed to use Geo-API with s1ral-cvmfs.openhtc.io
Warning: failed to access http://s1bnl-cvmfs.openhtc.io/cvmfs/atlas-condb.cern.ch/.cvmfspublished through proxy http://squid:3128
Warning: failed to use Geo-API with s1bnl-cvmfs.openhtc.io
Warning: failed to access http://s1fnal-cvmfs.openhtc.io/cvmfs/atlas-condb.cern.ch/.cvmfspublished through proxy http://squid:3128
Warning: failed to use Geo-API with s1fnal-cvmfs.openhtc.io
Warning: failed to access http://s1cern-cvmfs.openhtc.io/cvmfs/grid.cern.ch/.cvmfspublished through proxy http://squid:3128
Warning: failed to use Geo-API with s1cern-cvmfs.openhtc.io
Warning: failed to access http://s1ral-cvmfs.openhtc.io/cvmfs/grid.cern.ch/.cvmfspublished through proxy http://squid:3128
Warning: failed to use Geo-API with s1ral-cvmfs.openhtc.io
Warning: failed to access http://s1bnl-cvmfs.openhtc.io/cvmfs/grid.cern.ch/.cvmfspublished through proxy http://squid:3128
Warning: failed to use Geo-API with s1bnl-cvmfs.openhtc.io
Warning: failed to access http://s1fnal-cvmfs.openhtc.io/cvmfs/grid.cern.ch/.cvmfspublished through proxy http://squid:3128
Warning: failed to use Geo-API with s1fnal-cvmfs.openhtc.io
Warning: failed to access http://s1cern-cvmfs.openhtc.io/cvmfs/cernvm-prod.cern.ch/.cvmfspublished through proxy http://squid:3128
Warning: failed to use Geo-API with s1cern-cvmfs.openhtc.io
Warning: failed to access http://s1ral-cvmfs.openhtc.io/cvmfs/cernvm-prod.cern.ch/.cvmfspublished through proxy http://squid:3128
Warning: failed to use Geo-API with s1ral-cvmfs.openhtc.io
Warning: failed to access http://s1bnl-cvmfs.openhtc.io/cvmfs/cernvm-prod.cern.ch/.cvmfspublished through proxy http://squid:3128
Warning: failed to use Geo-API with s1bnl-cvmfs.openhtc.io
Warning: failed to access http://s1fnal-cvmfs.openhtc.io/cvmfs/cernvm-prod.cern.ch/.cvmfspublished through proxy http://squid:3128
Warning: failed to use Geo-API with s1fnal-cvmfs.openhtc.io
Warning: failed to access http://s1cern-cvmfs.openhtc.io/cvmfs/sft.cern.ch/.cvmfspublished through proxy http://squid:3128
Warning: failed to use Geo-API with s1cern-cvmfs.openhtc.io
Warning: failed to access http://s1ral-cvmfs.openhtc.io/cvmfs/sft.cern.ch/.cvmfspublished through proxy http://squid:3128
Warning: failed to use Geo-API with s1ral-cvmfs.openhtc.io
Warning: failed to access http://s1bnl-cvmfs.openhtc.io/cvmfs/sft.cern.ch/.cvmfspublished through proxy http://squid:3128
Warning: failed to use Geo-API with s1bnl-cvmfs.openhtc.io
Warning: failed to access http://s1fnal-cvmfs.openhtc.io/cvmfs/sft.cern.ch/.cvmfspublished through proxy http://squid:3128
Warning: failed to use Geo-API with s1fnal-cvmfs.openhtc.io
Warning: failed to access http://s1cern-cvmfs.openhtc.io/cvmfs/alice.cern.ch/.cvmfspublished through proxy http://squid:3128
Warning: failed to use Geo-API with s1cern-cvmfs.openhtc.io
Warning: failed to access http://s1ral-cvmfs.openhtc.io/cvmfs/alice.cern.ch/.cvmfspublished through proxy http://squid:3128
Warning: failed to use Geo-API with s1ral-cvmfs.openhtc.io
Warning: failed to access http://s1bnl-cvmfs.openhtc.io/cvmfs/alice.cern.ch/.cvmfspublished through proxy http://squid:3128
Warning: failed to use Geo-API with s1bnl-cvmfs.openhtc.io
Warning: failed to access http://s1fnal-cvmfs.openhtc.io/cvmfs/alice.cern.ch/.cvmfspublished through proxy http://squid:3128
Warning: failed to use Geo-API with s1fnal-cvmfs.openhtc.io
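The warnings above show the client trying to reach every Stratum 1 server through a proxy named squid on port 3128, which apparently does not exist on this host. A minimal sketch (an assumption about the cause, not a confirmed diagnosis) to check whether that proxy name resolves at all, and to suggest the usual DIRECT fallback if it does not:

```shell
# Check whether the proxy host 'squid' from the warnings resolves.
# If not, CVMFS should be told it may fall back to direct connections.
if getent hosts squid >/dev/null 2>&1; then
    proxy_status=resolves
    echo "proxy host 'squid' resolves; check that port 3128 is reachable"
else
    proxy_status=unresolved
    echo "proxy host 'squid' does not resolve on this machine"
    echo "suggestion: set CVMFS_HTTP_PROXY='auto;DIRECT' in /etc/cvmfs/default.local"
    echo "then run: sudo cvmfs_config reload && cvmfs_config probe"
fi
```

After editing /etc/cvmfs/default.local, rerunning cvmfs_config probe shows whether the repositories mount.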
11) Questions and Answers : Unix/Linux : Running virtualbox on LHC (Message 48348)
Posted 5 Aug 2023 by stratos412
Post:
Thanks for the help, guys. So I ran a command to check what's wrong with CVMFS.
If anyone can help...


Running /usr/bin/cvmfs_config atlas:
CVMFS_REPOSITORY_NAME=atlas.cern.ch
CERNVM_GRID_UI_VERSION=
CVMFS_ALIEN_CACHE=
CVMFS_ALT_ROOT_PATH=
CVMFS_AUTHZ_HELPER=
CVMFS_AUTHZ_SEARCH_PATH=
CVMFS_AUTO_UPDATE=
CVMFS_BACKOFF_INIT=2 # from /etc/cvmfs/default.conf
CVMFS_BACKOFF_MAX=10 # from /etc/cvmfs/default.conf
CVMFS_BASE_ENV=1 # from /etc/cvmfs/default.conf
CVMFS_CACHE_BASE=/var/lib/cvmfs # from /etc/cvmfs/default.conf
CVMFS_CACHE_DIR=/var/lib/cvmfs/shared
CVMFS_CACHE_PRIMARY=
CVMFS_CHECK_PERMISSIONS=yes # from /etc/cvmfs/default.conf
CVMFS_CLAIM_OWNERSHIP=yes # from /etc/cvmfs/default.conf
CVMFS_CLIENT_PROFILE= # from /etc/cvmfs/default.conf
CVMFS_CONFIG_REPO_REQUIRED=
CVMFS_CONFIG_REPOSITORY=cvmfs-config.cern.ch # from /etc/cvmfs/default.d/50-cern-debian.conf
CVMFS_DEBUGLOG=
CVMFS_DEFAULT_DOMAIN=cern.ch # from /etc/cvmfs/default.d/50-cern-debian.conf
CVMFS_DNS_RETRIES=
CVMFS_DNS_TIMEOUT=
CVMFS_EXTERNAL_FALLBACK_PROXY=
CVMFS_EXTERNAL_HTTP_PROXY=
CVMFS_EXTERNAL_SERVER_URL=
CVMFS_EXTERNAL_TIMEOUT=
CVMFS_EXTERNAL_TIMEOUT_DIRECT=
CVMFS_FALLBACK_PROXY=
CVMFS_FOLLOW_REDIRECTS=
CVMFS_HIDE_MAGIC_XATTRS=
CVMFS_HOST_RESET_AFTER=1800 # from /etc/cvmfs/default.conf
CVMFS_HTTP_PROXY=
CVMFS_IGNORE_SIGNATURE=
CVMFS_INITIAL_GENERATION=
CVMFS_IPFAMILY_PREFER=
CVMFS_KCACHE_TIMEOUT=2 # from /etc/cvmfs/default.local
CVMFS_KEYS_DIR=/etc/cvmfs/keys/cern.ch # from /etc/cvmfs/domain.d/cern.ch.conf
CVMFS_LOW_SPEED_LIMIT=1024 # from /etc/cvmfs/default.conf
CVMFS_MAX_IPADDR_PER_PROXY=
CVMFS_MAX_RETRIES=3 # from /etc/cvmfs/default.local
CVMFS_MAX_TTL=
CVMFS_MEMCACHE_SIZE=
CVMFS_MOUNT_DIR=/cvmfs # from /etc/cvmfs/default.conf
CVMFS_MOUNT_RW=
CVMFS_NFILES=131072 # from /etc/cvmfs/default.conf
CVMFS_NFS_SHARED=
CVMFS_NFS_SOURCE=
CVMFS_OOM_SCORE_ADJ=
CVMFS_PROXY_RESET_AFTER=300 # from /etc/cvmfs/default.conf
CVMFS_PROXY_TEMPLATE=
CVMFS_PUBLIC_KEY=
CVMFS_QUOTA_LIMIT=4000 # from /etc/cvmfs/default.conf
CVMFS_RELOAD_SOCKETS=/var/run/cvmfs # from /etc/cvmfs/default.conf
CVMFS_REPOSITORIES=atlas,atlas-condb,grid,cernvm-prod,sft,alice # from /etc/cvmfs/default.local
CVMFS_REPOSITORY_DATE=
CVMFS_REPOSITORY_TAG=
CVMFS_ROOT_HASH=
CVMFS_SEND_INFO_HEADER=no # from /etc/cvmfs/default.conf
CVMFS_SERVER_CACHE_MODE=
CVMFS_SERVER_URL='http://s1cern-cvmfs.openhtc.io/cvmfs/atlas.cern.ch;http://s1ral-cvmfs.openhtc.io/cvmfs/atlas.cern.ch;http://s1bnl-cvmfs.openhtc.io/cvmfs/atlas.cern.ch;http://s1fnal-cvmfs.openhtc.io/cvmfs/atlas.cern.ch' # from /etc/cvmfs/domain.d/cern.ch.conf
CVMFS_SHARED_CACHE=yes # from /etc/cvmfs/default.conf
CVMFS_STRICT_MOUNT=no # from /etc/cvmfs/default.conf
CVMFS_SYSLOG_FACILITY=
CVMFS_SYSLOG_LEVEL=
CVMFS_SYSTEMD_NOKILL=
CVMFS_TIMEOUT=5 # from /etc/cvmfs/default.conf
CVMFS_TIMEOUT_DIRECT=10 # from /etc/cvmfs/default.conf
CVMFS_TRACEFILE=
CVMFS_TRUSTED_CERTS=
CVMFS_USE_CDN=yes # from /etc/cvmfs/default.local
CVMFS_USE_GEOAPI=yes # from /etc/cvmfs/domain.d/cern.ch.conf
CVMFS_USER=cvmfs # from /etc/cvmfs/default.conf
CVMFS_WORKSPACE=

Running /usr/bin/cvmfs_config atlas-condb:
CVMFS_REPOSITORY_NAME=atlas-condb.cern.ch
CERNVM_GRID_UI_VERSION=
CVMFS_ALIEN_CACHE=
CVMFS_ALT_ROOT_PATH=
CVMFS_AUTHZ_HELPER=
CVMFS_AUTHZ_SEARCH_PATH=
CVMFS_AUTO_UPDATE=
CVMFS_BACKOFF_INIT=2 # from /etc/cvmfs/default.conf
CVMFS_BACKOFF_MAX=10 # from /etc/cvmfs/default.conf
CVMFS_BASE_ENV=1 # from /etc/cvmfs/default.conf
CVMFS_CACHE_BASE=/var/lib/cvmfs # from /etc/cvmfs/default.conf
CVMFS_CACHE_DIR=/var/lib/cvmfs/shared
CVMFS_CACHE_PRIMARY=
CVMFS_CHECK_PERMISSIONS=yes # from /etc/cvmfs/default.conf
CVMFS_CLAIM_OWNERSHIP=yes # from /etc/cvmfs/default.conf
CVMFS_CLIENT_PROFILE= # from /etc/cvmfs/default.conf
CVMFS_CONFIG_REPO_REQUIRED=
CVMFS_CONFIG_REPOSITORY=cvmfs-config.cern.ch # from /etc/cvmfs/default.d/50-cern-debian.conf
CVMFS_DEBUGLOG=
CVMFS_DEFAULT_DOMAIN=cern.ch # from /etc/cvmfs/default.d/50-cern-debian.conf
CVMFS_DNS_RETRIES=
CVMFS_DNS_TIMEOUT=
CVMFS_EXTERNAL_FALLBACK_PROXY=
CVMFS_EXTERNAL_HTTP_PROXY=
CVMFS_EXTERNAL_SERVER_URL=
CVMFS_EXTERNAL_TIMEOUT=
CVMFS_EXTERNAL_TIMEOUT_DIRECT=
CVMFS_FALLBACK_PROXY=
CVMFS_FOLLOW_REDIRECTS=
CVMFS_HIDE_MAGIC_XATTRS=
CVMFS_HOST_RESET_AFTER=1800 # from /etc/cvmfs/default.conf
CVMFS_HTTP_PROXY=
CVMFS_IGNORE_SIGNATURE=
CVMFS_INITIAL_GENERATION=
CVMFS_IPFAMILY_PREFER=
CVMFS_KCACHE_TIMEOUT=2 # from /etc/cvmfs/default.local
CVMFS_KEYS_DIR=/etc/cvmfs/keys/cern.ch # from /etc/cvmfs/domain.d/cern.ch.conf
CVMFS_LOW_SPEED_LIMIT=1024 # from /etc/cvmfs/default.conf
CVMFS_MAX_IPADDR_PER_PROXY=
CVMFS_MAX_RETRIES=3 # from /etc/cvmfs/default.local
CVMFS_MAX_TTL=
CVMFS_MEMCACHE_SIZE=
CVMFS_MOUNT_DIR=/cvmfs # from /etc/cvmfs/default.conf
CVMFS_MOUNT_RW=
CVMFS_NFILES=131072 # from /etc/cvmfs/default.conf
CVMFS_NFS_SHARED=
CVMFS_NFS_SOURCE=
CVMFS_OOM_SCORE_ADJ=
CVMFS_PROXY_RESET_AFTER=300 # from /etc/cvmfs/default.conf
CVMFS_PROXY_TEMPLATE=
CVMFS_PUBLIC_KEY=
CVMFS_QUOTA_LIMIT=4000 # from /etc/cvmfs/default.conf
CVMFS_RELOAD_SOCKETS=/var/run/cvmfs # from /etc/cvmfs/default.conf
CVMFS_REPOSITORIES=atlas,atlas-condb,grid,cernvm-prod,sft,alice # from /etc/cvmfs/default.local
CVMFS_REPOSITORY_DATE=
CVMFS_REPOSITORY_TAG=
CVMFS_ROOT_HASH=
CVMFS_SEND_INFO_HEADER=no # from /etc/cvmfs/default.conf
CVMFS_SERVER_CACHE_MODE=
CVMFS_SERVER_URL='http://s1cern-cvmfs.openhtc.io/cvmfs/atlas-condb.cern.ch;http://s1ral-cvmfs.openhtc.io/cvmfs/atlas-condb.cern.ch;http://s1bnl-cvmfs.openhtc.io/cvmfs/atlas-condb.cern.ch;http://s1fnal-cvmfs.openhtc.io/cvmfs/atlas-condb.cern.ch' # from /etc/cvmfs/domain.d/cern.ch.conf
CVMFS_SHARED_CACHE=yes # from /etc/cvmfs/default.conf
CVMFS_STRICT_MOUNT=no # from /etc/cvmfs/default.conf
CVMFS_SYSLOG_FACILITY=
CVMFS_SYSLOG_LEVEL=
CVMFS_SYSTEMD_NOKILL=
CVMFS_TIMEOUT=5 # from /etc/cvmfs/default.conf
CVMFS_TIMEOUT_DIRECT=10 # from /etc/cvmfs/default.conf
CVMFS_TRACEFILE=
CVMFS_TRUSTED_CERTS=
CVMFS_USE_CDN=yes # from /etc/cvmfs/default.local
CVMFS_USE_GEOAPI=yes # from /etc/cvmfs/domain.d/cern.ch.conf
CVMFS_USER=cvmfs # from /etc/cvmfs/default.conf
CVMFS_WORKSPACE=

Running /usr/bin/cvmfs_config grid:
CVMFS_REPOSITORY_NAME=grid.cern.ch
CERNVM_GRID_UI_VERSION=
CVMFS_ALIEN_CACHE=
CVMFS_ALT_ROOT_PATH=
CVMFS_AUTHZ_HELPER=
CVMFS_AUTHZ_SEARCH_PATH=
CVMFS_AUTO_UPDATE=
CVMFS_BACKOFF_INIT=2 # from /etc/cvmfs/default.conf
CVMFS_BACKOFF_MAX=10 # from /etc/cvmfs/default.conf
CVMFS_BASE_ENV=1 # from /etc/cvmfs/default.conf
CVMFS_CACHE_BASE=/var/lib/cvmfs # from /etc/cvmfs/default.conf
CVMFS_CACHE_DIR=/var/lib/cvmfs/shared
CVMFS_CACHE_PRIMARY=
CVMFS_CHECK_PERMISSIONS=yes # from /etc/cvmfs/default.conf
CVMFS_CLAIM_OWNERSHIP=yes # from /etc/cvmfs/default.conf
CVMFS_CLIENT_PROFILE= # from /etc/cvmfs/default.conf
CVMFS_CONFIG_REPO_REQUIRED=
CVMFS_CONFIG_REPOSITORY=cvmfs-config.cern.ch # from /etc/cvmfs/default.d/50-cern-debian.conf
CVMFS_DEBUGLOG=
CVMFS_DEFAULT_DOMAIN=cern.ch # from /etc/cvmfs/default.d/50-cern-debian.conf
CVMFS_DNS_RETRIES=
CVMFS_DNS_TIMEOUT=
CVMFS_EXTERNAL_FALLBACK_PROXY=
CVMFS_EXTERNAL_HTTP_PROXY=
CVMFS_EXTERNAL_SERVER_URL=
CVMFS_EXTERNAL_TIMEOUT=
CVMFS_EXTERNAL_TIMEOUT_DIRECT=
CVMFS_FALLBACK_PROXY=
CVMFS_FOLLOW_REDIRECTS=
CVMFS_HIDE_MAGIC_XATTRS=
CVMFS_HOST_RESET_AFTER=1800 # from /etc/cvmfs/default.conf
CVMFS_HTTP_PROXY=
CVMFS_IGNORE_SIGNATURE=
CVMFS_INITIAL_GENERATION=
CVMFS_IPFAMILY_PREFER=
CVMFS_KCACHE_TIMEOUT=2 # from /etc/cvmfs/default.local
CVMFS_KEYS_DIR=/etc/cvmfs/keys/cern.ch # from /etc/cvmfs/domain.d/cern.ch.conf
CVMFS_LOW_SPEED_LIMIT=1024 # from /etc/cvmfs/default.conf
CVMFS_MAX_IPADDR_PER_PROXY=
CVMFS_MAX_RETRIES=3 # from /etc/cvmfs/default.local
CVMFS_MAX_TTL=
CVMFS_MEMCACHE_SIZE=
CVMFS_MOUNT_DIR=/cvmfs # from /etc/cvmfs/default.conf
CVMFS_MOUNT_RW=
CVMFS_NFILES=131072 # from /etc/cvmfs/default.conf
CVMFS_NFS_SHARED=
CVMFS_NFS_SOURCE=
CVMFS_OOM_SCORE_ADJ=
CVMFS_PROXY_RESET_AFTER=300 # from /etc/cvmfs/default.conf
CVMFS_PROXY_TEMPLATE=
CVMFS_PUBLIC_KEY=
CVMFS_QUOTA_LIMIT=4000 # from /etc/cvmfs/default.conf
CVMFS_RELOAD_SOCKETS=/var/run/cvmfs # from /etc/cvmfs/default.conf
CVMFS_REPOSITORIES=atlas,atlas-condb,grid,cernvm-prod,sft,alice # from /etc/cvmfs/default.local
CVMFS_REPOSITORY_DATE=
CVMFS_REPOSITORY_TAG=
CVMFS_ROOT_HASH=
CVMFS_SEND_INFO_HEADER=no # from /etc/cvmfs/default.conf
CVMFS_SERVER_CACHE_MODE=
CVMFS_SERVER_URL='http://s1cern-cvmfs.openhtc.io/cvmfs/grid.cern.ch;http://s1ral-cvmfs.openhtc.io/cvmfs/grid.cern.ch;http://s1bnl-cvmfs.openhtc.io/cvmfs/grid.cern.ch;http://s1fnal-cvmfs.openhtc.io/cvmfs/grid.cern.ch' # from /etc/cvmfs/domain.d/cern.ch.conf
CVMFS_SHARED_CACHE=yes # from /etc/cvmfs/default.conf
CVMFS_STRICT_MOUNT=no # from /etc/cvmfs/default.conf
CVMFS_SYSLOG_FACILITY=
CVMFS_SYSLOG_LEVEL=
CVMFS_SYSTEMD_NOKILL=
CVMFS_TIMEOUT=5 # from /etc/cvmfs/default.conf
CVMFS_TIMEOUT_DIRECT=10 # from /etc/cvmfs/default.conf
CVMFS_TRACEFILE=
CVMFS_TRUSTED_CERTS=
CVMFS_USE_CDN=yes # from /etc/cvmfs/default.local
CVMFS_USE_GEOAPI=yes # from /etc/cvmfs/domain.d/cern.ch.conf
CVMFS_USER=cvmfs # from /etc/cvmfs/default.conf
CVMFS_WORKSPACE=

Running /usr/bin/cvmfs_config cernvm-prod:
CVMFS_REPOSITORY_NAME=cernvm-prod.cern.ch
CERNVM_GRID_UI_VERSION=
CVMFS_ALIEN_CACHE=
CVMFS_ALT_ROOT_PATH=
CVMFS_AUTHZ_HELPER=
CVMFS_AUTHZ_SEARCH_PATH=
CVMFS_AUTO_UPDATE=
CVMFS_BACKOFF_INIT=2 # from /etc/cvmfs/default.conf
CVMFS_BACKOFF_MAX=10 # from /etc/cvmfs/default.conf
CVMFS_BASE_ENV=1 # from /etc/cvmfs/default.conf
CVMFS_CACHE_BASE=/var/lib/cvmfs # from /etc/cvmfs/default.conf
CVMFS_CACHE_DIR=/var/lib/cvmfs/shared
CVMFS_CACHE_PRIMARY=
CVMFS_CHECK_PERMISSIONS=yes # from /etc/cvmfs/default.conf
CVMFS_CLAIM_OWNERSHIP=yes # from /etc/cvmfs/default.conf
CVMFS_CLIENT_PROFILE= # from /etc/cvmfs/default.conf
CVMFS_CONFIG_REPO_REQUIRED=
CVMFS_CONFIG_REPOSITORY=cvmfs-config.cern.ch # from /etc/cvmfs/default.d/50-cern-debian.conf
CVMFS_DEBUGLOG=
CVMFS_DEFAULT_DOMAIN=cern.ch # from /etc/cvmfs/default.d/50-cern-debian.conf
CVMFS_DNS_RETRIES=
CVMFS_DNS_TIMEOUT=
CVMFS_EXTERNAL_FALLBACK_PROXY=
CVMFS_EXTERNAL_HTTP_PROXY=
CVMFS_EXTERNAL_SERVER_URL=
CVMFS_EXTERNAL_TIMEOUT=
CVMFS_EXTERNAL_TIMEOUT_DIRECT=
CVMFS_FALLBACK_PROXY=
CVMFS_FOLLOW_REDIRECTS=
CVMFS_HIDE_MAGIC_XATTRS=
CVMFS_HOST_RESET_AFTER=1800 # from /etc/cvmfs/default.conf
CVMFS_HTTP_PROXY=
CVMFS_IGNORE_SIGNATURE=
CVMFS_INITIAL_GENERATION=
CVMFS_IPFAMILY_PREFER=
CVMFS_KCACHE_TIMEOUT=2 # from /etc/cvmfs/default.local
CVMFS_KEYS_DIR=/etc/cvmfs/keys/cern.ch # from /etc/cvmfs/domain.d/cern.ch.conf
CVMFS_LOW_SPEED_LIMIT=1024 # from /etc/cvmfs/default.conf
CVMFS_MAX_IPADDR_PER_PROXY=
CVMFS_MAX_RETRIES=3 # from /etc/cvmfs/default.local
CVMFS_MAX_TTL=
CVMFS_MEMCACHE_SIZE=
CVMFS_MOUNT_DIR=/cvmfs # from /etc/cvmfs/default.conf
CVMFS_MOUNT_RW=
CVMFS_NFILES=131072 # from /etc/cvmfs/default.conf
CVMFS_NFS_SHARED=
CVMFS_NFS_SOURCE=
CVMFS_OOM_SCORE_ADJ=
CVMFS_PROXY_RESET_AFTER=300 # from /etc/cvmfs/default.conf
CVMFS_PROXY_TEMPLATE=
CVMFS_PUBLIC_KEY=
CVMFS_QUOTA_LIMIT=4000 # from /etc/cvmfs/default.conf
CVMFS_RELOAD_SOCKETS=/var/run/cvmfs # from /etc/cvmfs/default.conf
CVMFS_REPOSITORIES=atlas,atlas-condb,grid,cernvm-prod,sft,alice # from /etc/cvmfs/default.local
CVMFS_REPOSITORY_DATE=
CVMFS_REPOSITORY_TAG=
CVMFS_ROOT_HASH=
CVMFS_SEND_INFO_HEADER=no # from /etc/cvmfs/default.conf
CVMFS_SERVER_CACHE_MODE=
CVMFS_SERVER_URL='http://s1cern-cvmfs.openhtc.io/cvmfs/cernvm-prod.cern.ch;http://s1ral-cvmfs.openhtc.io/cvmfs/cernvm-prod.cern.ch;http://s1bnl-cvmfs.openhtc.io/cvmfs/cernvm-prod.cern.ch;http://s1fnal-cvmfs.openhtc.io/cvmfs/cernvm-prod.cern.ch' # from /etc/cvmfs/domain.d/cern.ch.conf
CVMFS_SHARED_CACHE=yes # from /etc/cvmfs/default.conf
CVMFS_STRICT_MOUNT=no # from /etc/cvmfs/default.conf
CVMFS_SYSLOG_FACILITY=
CVMFS_SYSLOG_LEVEL=
CVMFS_SYSTEMD_NOKILL=
CVMFS_TIMEOUT=5 # from /etc/cvmfs/default.conf
CVMFS_TIMEOUT_DIRECT=10 # from /etc/cvmfs/default.conf
CVMFS_TRACEFILE=
CVMFS_TRUSTED_CERTS=
CVMFS_USE_CDN=yes # from /etc/cvmfs/default.local
CVMFS_USE_GEOAPI=yes # from /etc/cvmfs/domain.d/cern.ch.conf
CVMFS_USER=cvmfs # from /etc/cvmfs/default.conf
CVMFS_WORKSPACE=

Running /usr/bin/cvmfs_config sft:
CVMFS_REPOSITORY_NAME=sft.cern.ch
CERNVM_GRID_UI_VERSION=
CVMFS_ALIEN_CACHE=
CVMFS_ALT_ROOT_PATH=
CVMFS_AUTHZ_HELPER=
CVMFS_AUTHZ_SEARCH_PATH=
CVMFS_AUTO_UPDATE=
CVMFS_BACKOFF_INIT=2 # from /etc/cvmfs/default.conf
CVMFS_BACKOFF_MAX=10 # from /etc/cvmfs/default.conf
CVMFS_BASE_ENV=1 # from /etc/cvmfs/default.conf
CVMFS_CACHE_BASE=/var/lib/cvmfs # from /etc/cvmfs/default.conf
CVMFS_CACHE_DIR=/var/lib/cvmfs/shared
CVMFS_CACHE_PRIMARY=
CVMFS_CHECK_PERMISSIONS=yes # from /etc/cvmfs/default.conf
CVMFS_CLAIM_OWNERSHIP=yes # from /etc/cvmfs/default.conf
CVMFS_CLIENT_PROFILE= # from /etc/cvmfs/default.conf
CVMFS_CONFIG_REPO_REQUIRED=
CVMFS_CONFIG_REPOSITORY=cvmfs-config.cern.ch # from /etc/cvmfs/default.d/50-cern-debian.conf
CVMFS_DEBUGLOG=
CVMFS_DEFAULT_DOMAIN=cern.ch # from /etc/cvmfs/default.d/50-cern-debian.conf
CVMFS_DNS_RETRIES=
CVMFS_DNS_TIMEOUT=
CVMFS_EXTERNAL_FALLBACK_PROXY=
CVMFS_EXTERNAL_HTTP_PROXY=
CVMFS_EXTERNAL_SERVER_URL=
CVMFS_EXTERNAL_TIMEOUT=
CVMFS_EXTERNAL_TIMEOUT_DIRECT=
CVMFS_FALLBACK_PROXY=
CVMFS_FOLLOW_REDIRECTS=
CVMFS_HIDE_MAGIC_XATTRS=
CVMFS_HOST_RESET_AFTER=1800 # from /etc/cvmfs/default.conf
CVMFS_HTTP_PROXY=
CVMFS_IGNORE_SIGNATURE=
CVMFS_INITIAL_GENERATION=
CVMFS_IPFAMILY_PREFER=
CVMFS_KCACHE_TIMEOUT=2 # from /etc/cvmfs/default.local
CVMFS_KEYS_DIR=/etc/cvmfs/keys/cern.ch # from /etc/cvmfs/domain.d/cern.ch.conf
CVMFS_LOW_SPEED_LIMIT=1024 # from /etc/cvmfs/default.conf
CVMFS_MAX_IPADDR_PER_PROXY=
CVMFS_MAX_RETRIES=3 # from /etc/cvmfs/default.local
CVMFS_MAX_TTL=
CVMFS_MEMCACHE_SIZE=
CVMFS_MOUNT_DIR=/cvmfs # from /etc/cvmfs/default.conf
CVMFS_MOUNT_RW=
CVMFS_NFILES=131072 # from /etc/cvmfs/default.conf
CVMFS_NFS_SHARED=
CVMFS_NFS_SOURCE=
CVMFS_OOM_SCORE_ADJ=
CVMFS_PROXY_RESET_AFTER=300 # from /etc/cvmfs/default.conf
CVMFS_PROXY_TEMPLATE=
CVMFS_PUBLIC_KEY=
CVMFS_QUOTA_LIMIT=4000 # from /etc/cvmfs/default.conf
CVMFS_RELOAD_SOCKETS=/var/run/cvmfs # from /etc/cvmfs/default.conf
CVMFS_REPOSITORIES=atlas,atlas-condb,grid,cernvm-prod,sft,alice # from /etc/cvmfs/default.local
CVMFS_REPOSITORY_DATE=
CVMFS_REPOSITORY_TAG=
CVMFS_ROOT_HASH=
CVMFS_SEND_INFO_HEADER=no # from /etc/cvmfs/default.conf
CVMFS_SERVER_CACHE_MODE=
CVMFS_SERVER_URL='http://s1cern-cvmfs.openhtc.io/cvmfs/sft.cern.ch;http://s1ral-cvmfs.openhtc.io/cvmfs/sft.cern.ch;http://s1bnl-cvmfs.openhtc.io/cvmfs/sft.cern.ch;http://s1fnal-cvmfs.openhtc.io/cvmfs/sft.cern.ch' # from /etc/cvmfs/domain.d/cern.ch.conf
CVMFS_SHARED_CACHE=yes # from /etc/cvmfs/default.conf
CVMFS_STRICT_MOUNT=no # from /etc/cvmfs/default.conf
CVMFS_SYSLOG_FACILITY=
CVMFS_SYSLOG_LEVEL=
CVMFS_SYSTEMD_NOKILL=
CVMFS_TIMEOUT=5 # from /etc/cvmfs/default.conf
CVMFS_TIMEOUT_DIRECT=10 # from /etc/cvmfs/default.conf
CVMFS_TRACEFILE=
CVMFS_TRUSTED_CERTS=
CVMFS_USE_CDN=yes # from /etc/cvmfs/default.local
CVMFS_USE_GEOAPI=yes # from /etc/cvmfs/domain.d/cern.ch.conf
CVMFS_USER=cvmfs # from /etc/cvmfs/default.conf
CVMFS_WORKSPACE=

Running /usr/bin/cvmfs_config alice:
CVMFS_REPOSITORY_NAME=alice.cern.ch
CERNVM_GRID_UI_VERSION=
CVMFS_ALIEN_CACHE=
CVMFS_ALT_ROOT_PATH=
CVMFS_AUTHZ_HELPER=
CVMFS_AUTHZ_SEARCH_PATH=
CVMFS_AUTO_UPDATE=
CVMFS_BACKOFF_INIT=2 # from /etc/cvmfs/default.conf
CVMFS_BACKOFF_MAX=10 # from /etc/cvmfs/default.conf
CVMFS_BASE_ENV=1 # from /etc/cvmfs/default.conf
CVMFS_CACHE_BASE=/var/lib/cvmfs # from /etc/cvmfs/default.conf
CVMFS_CACHE_DIR=/var/lib/cvmfs/shared
CVMFS_CACHE_PRIMARY=
CVMFS_CHECK_PERMISSIONS=yes # from /etc/cvmfs/default.conf
CVMFS_CLAIM_OWNERSHIP=yes # from /etc/cvmfs/default.conf
CVMFS_CLIENT_PROFILE= # from /etc/cvmfs/default.conf
CVMFS_CONFIG_REPO_REQUIRED=
CVMFS_CONFIG_REPOSITORY=cvmfs-config.cern.ch # from /etc/cvmfs/default.d/50-cern-debian.conf
CVMFS_DEBUGLOG=
CVMFS_DEFAULT_DOMAIN=cern.ch # from /etc/cvmfs/default.d/50-cern-debian.conf
CVMFS_DNS_RETRIES=
CVMFS_DNS_TIMEOUT=
CVMFS_EXTERNAL_FALLBACK_PROXY=
CVMFS_EXTERNAL_HTTP_PROXY=
CVMFS_EXTERNAL_SERVER_URL=
CVMFS_EXTERNAL_TIMEOUT=
CVMFS_EXTERNAL_TIMEOUT_DIRECT=
CVMFS_FALLBACK_PROXY=
CVMFS_FOLLOW_REDIRECTS=
CVMFS_HIDE_MAGIC_XATTRS=
CVMFS_HOST_RESET_AFTER=1800 # from /etc/cvmfs/default.conf
CVMFS_HTTP_PROXY=
CVMFS_IGNORE_SIGNATURE=
CVMFS_INITIAL_GENERATION=
CVMFS_IPFAMILY_PREFER=
CVMFS_KCACHE_TIMEOUT=2 # from /etc/cvmfs/default.local
CVMFS_KEYS_DIR=/etc/cvmfs/keys/cern.ch # from /etc/cvmfs/domain.d/cern.ch.conf
CVMFS_LOW_SPEED_LIMIT=1024 # from /etc/cvmfs/default.conf
CVMFS_MAX_IPADDR_PER_PROXY=
CVMFS_MAX_RETRIES=3 # from /etc/cvmfs/default.local
CVMFS_MAX_TTL=
CVMFS_MEMCACHE_SIZE=
CVMFS_MOUNT_DIR=/cvmfs # from /etc/cvmfs/default.conf
CVMFS_MOUNT_RW=
CVMFS_NFILES=131072 # from /etc/cvmfs/default.conf
CVMFS_NFS_SHARED=
CVMFS_NFS_SOURCE=
CVMFS_OOM_SCORE_ADJ=
CVMFS_PROXY_RESET_AFTER=300 # from /etc/cvmfs/default.conf
CVMFS_PROXY_TEMPLATE=
CVMFS_PUBLIC_KEY=
CVMFS_QUOTA_LIMIT=4000 # from /etc/cvmfs/default.conf
CVMFS_RELOAD_SOCKETS=/var/run/cvmfs # from /etc/cvmfs/default.conf
CVMFS_REPOSITORIES=atlas,atlas-condb,grid,cernvm-prod,sft,alice # from /etc/cvmfs/default.local
CVMFS_REPOSITORY_DATE=
CVMFS_REPOSITORY_TAG=
CVMFS_ROOT_HASH=
CVMFS_SEND_INFO_HEADER=no # from /etc/cvmfs/default.conf
CVMFS_SERVER_CACHE_MODE=
CVMFS_SERVER_URL='http://s1cern-cvmfs.openhtc.io/cvmfs/alice.cern.ch;http://s1ral-cvmfs.openhtc.io/cvmfs/alice.cern.ch;http://s1bnl-cvmfs.openhtc.io/cvmfs/alice.cern.ch;http://s1fnal-cvmfs.openhtc.io/cvmfs/alice.cern.ch' # from /etc/cvmfs/domain.d/cern.ch.conf
CVMFS_SHARED_CACHE=yes # from /etc/cvmfs/default.conf
CVMFS_STRICT_MOUNT=no # from /etc/cvmfs/default.conf
CVMFS_SYSLOG_FACILITY=
CVMFS_SYSLOG_LEVEL=
CVMFS_SYSTEMD_NOKILL=
CVMFS_TIMEOUT=5 # from /etc/cvmfs/default.conf
CVMFS_TIMEOUT_DIRECT=10 # from /etc/cvmfs/default.conf
CVMFS_TRACEFILE=
CVMFS_TRUSTED_CERTS=
CVMFS_USE_CDN=yes # from /etc/cvmfs/default.local
CVMFS_USE_GEOAPI=yes # from /etc/cvmfs/domain.d/cern.ch.conf
CVMFS_USER=cvmfs
12) Questions and Answers : Unix/Linux : Running virtualbox on LHC (Message 48346)
Posted 5 Aug 2023 by stratos412
Post:
@meax.

About cgroups: I can see that the filesystem type is cgroup2fs.


@computezrmle

Before the Mint upgrade and the installation of the latest BOINC (flatpak), these were my PCs' clients:

1) https://lhcathome.cern.ch/lhcathome/show_host_detail.php?hostid=10822907
which changed to https://lhcathome.cern.ch/lhcathome/show_host_detail.php?hostid=10834339

2) https://lhcathome.cern.ch/lhcathome/show_host_detail.php?hostid=10822906
which changed to https://lhcathome.cern.ch/lhcathome/show_host_detail.php?hostid=10834344

Why do I see "Virtualbox (6.1.38_Ubuntur153438) installed, CPU has hardware virtualization support and it is enabled" on BOTH PCs?

These were my last LHC tasks BEFORE the Linux upgrade:
https://lhcathome.cern.ch/lhcathome/results.php?hostid=10822906
https://lhcathome.cern.ch/lhcathome/results.php?hostid=10822907

How do I configure a local CVMFS, and what should the configuration be?
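For reference, a minimal /etc/cvmfs/default.local along the lines of what the showconfig dumps earlier in this thread already contain — these values mirror the ones visible in that output and are a sketch to adapt, not a definitive configuration:

```
# /etc/cvmfs/default.local -- minimal client configuration
CVMFS_REPOSITORIES=atlas,atlas-condb,grid,cernvm-prod,sft,alice
CVMFS_QUOTA_LIMIT=6144          # local cache size in MB
CVMFS_HTTP_PROXY='auto;DIRECT'  # try WPAD auto-discovery, else connect directly
CVMFS_USE_CDN=yes               # use the openhtc.io CDN mirrors
```

Apply with: sudo cvmfs_config setup && cvmfs_config probe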
13) Questions and Answers : Unix/Linux : Running virtualbox on LHC (Message 48341)
Posted 4 Aug 2023 by stratos412
Post:
I am going to try some different versions of VirtualBox to see if I get lucky.

I also see in the system information that Oracle VM VirtualBox is there.

Running ''sudo virt-what'' returns no output... Something must be wrong.
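Note that virt-what reports whether the machine it runs on is itself a virtual machine, so empty output on bare metal is normal rather than an error. A short sketch of what to check instead — whether the CPU advertises the hardware virtualization flags VirtualBox actually needs:

```shell
# virt-what printing nothing on bare metal is expected.
# What matters for VirtualBox is VT-x (vmx) or AMD-V (svm) in the CPU flags.
if grep -qE 'vmx|svm' /proc/cpuinfo; then
    virt_flags=present
    echo "hardware virtualization flags (vmx/svm) present"
else
    virt_flags=absent
    echo "no vmx/svm flags: enable VT-x/AMD-V in the BIOS, or it is masked"
fi
```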
14) Questions and Answers : Unix/Linux : Running virtualbox on LHC (Message 48339)
Posted 4 Aug 2023 by stratos412
Post:
PC_1: https://lhcathome.cern.ch/lhcathome/show_host_detail.php?hostid=10834344

PC_2: https://lhcathome.cern.ch/lhcathome/show_host_detail.php?hostid=10834339


Edit: Virtualization is enabled in BIOS. I had the following before the Mint upgrade, on both PCs

Linux Mint 20.3 [5.4.0-150-generic|libc 2.31 (Ubuntu GLIBC 2.31-0ubuntu9.9)]
Virtualbox (6.1.38_Ubuntur153438) installed, CPU has hardware virtualization support and it is enabled
15) Questions and Answers : Unix/Linux : Running virtualbox on LHC (Message 48337)
Posted 4 Aug 2023 by stratos412
Post:
Hello all.

I am dealing with an issue after the major upgrade from Linux Mint 20.3 to Mint 21. A few things got broken, unfortunately...

I had to install BOINC (flatpak) from the Linux Mint Software Manager, since the classic BOINC packages refused to work properly...
I managed to make BOINC (flatpak) work, but I am dealing with an issue with VirtualBox.
First, I installed the latest version (7.0), but it didn't work.
Then I installed version 6.1.34, since it was the last one that worked properly with BOINC before the upgrade. I also installed the VirtualBox Extension Pack.
(link: https://www.virtualbox.org/wiki/Download_Old_Builds_6_1 // version 6.1.34 for Ubuntu 22.04)
Every time I get an error from BOINC: ''Message from server: VirtualBox is not installed''.

Any idea how to fix it?
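One plausible cause (an assumption, not something confirmed in this thread): the flatpak build of BOINC runs in a sandbox and cannot see a VirtualBox installed on the host, so the client reports VirtualBox as absent even though VBoxManage works from a normal shell. A quick check of what each context sees:

```shell
# On the host: does VBoxManage exist and report a version?
if command -v VBoxManage >/dev/null 2>&1; then
    vbox_state=found
    echo "host VBoxManage: $(VBoxManage --version 2>/dev/null)"
else
    vbox_state=missing
    echo "VBoxManage not found on PATH"
fi
# Inside the flatpak sandbox the same check typically fails; assuming the
# Flathub app ID edu.berkeley.BOINC, it can be reproduced with e.g.:
#   flatpak run --command=sh edu.berkeley.BOINC -c 'command -v VBoxManage'
```

If the host check succeeds but the sandboxed one fails, the flatpak packaging rather than VirtualBox itself is the limitation.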
16) Questions and Answers : Getting started : Theory Application- Error (Message 47692)
Posted 16 Jan 2023 by stratos412
Post:
Thanks for the reply.
Damn, that was bad. It seemed to be running fine for 10 days...
I didn't notice the minimum RAM requirement, since other projects fail to send tasks to the BOINC client if there is not enough RAM (e.g. Rosetta, Climate Prediction).
There is also a warning in the BOINC Event Log if something does not meet the requirements.
Never mind. I will try again on another PC.
17) Questions and Answers : Getting started : Theory Application- Error (Message 47690)
Posted 16 Jan 2023 by stratos412
Post:
Hello all.

Can someone please check why I got an error in this task? I finished it on time; however, it got a computation error at the end...
(P.S. I cannot post it in the Theory Application Message Board)

https://lhcathome.cern.ch/lhcathome/result.php?resultid=375454242


<core_client_version>7.16.6</core_client_version>
<![CDATA[
<stderr_txt>
2023-01-06 10:04:15 (37096): Detected: vboxwrapper 26206
2023-01-06 10:04:15 (37096): Detected: BOINC client v7.16.6
2023-01-06 10:04:26 (37096): Detected: VirtualBox VboxManage Interface (Version: 6.1.38)
2023-01-06 10:04:26 (37096): Detected: Heartbeat check (file: 'heartbeat' every 1200.000000 seconds)
2023-01-06 10:04:26 (37096): Successfully copied 'init_data.xml' to the shared directory.
2023-01-06 10:04:26 (37096): Successfully copied 'input' to the shared directory.
2023-01-06 10:04:27 (37096): Create VM. (boinc_36df9f5f16a80810, slot#2)
2023-01-06 10:04:28 (37096): Setting Memory Size for VM. (630MB)
2023-01-06 10:04:28 (37096): Setting CPU Count for VM. (1)
2023-01-06 10:04:28 (37096): Setting Chipset Options for VM.
2023-01-06 10:04:29 (37096): Setting Graphics Controller Options for VM.
2023-01-06 10:04:30 (37096): Setting Boot Options for VM.
2023-01-06 10:04:30 (37096): Setting Network Configuration for NAT.
2023-01-06 10:04:30 (37096): Enabling VM Network Access.
2023-01-06 10:04:31 (37096): Disabling USB Support for VM.
2023-01-06 10:04:31 (37096): Disabling COM Port Support for VM.
2023-01-06 10:04:32 (37096): Disabling LPT Port Support for VM.
2023-01-06 10:04:32 (37096): Disabling Audio Support for VM.
2023-01-06 10:04:33 (37096): Disabling Clipboard Support for VM.
2023-01-06 10:04:33 (37096): Disabling Drag and Drop Support for VM.
2023-01-06 10:04:34 (37096): Adding storage controller(s) to VM.
2023-01-06 10:04:34 (37096): Adding virtual disk drive to VM. (vm_image.vdi)
2023-01-06 10:04:34 (37096): Adding network bandwidth throttle group to VM. (Defaulting to 1024GB)
2023-01-06 10:04:34 (37096): forwarding host port 37059 to guest port 80
2023-01-06 10:04:34 (37096): Enabling remote desktop for VM.
2023-01-06 10:04:35 (37096): Required extension pack not installed, remote desktop not enabled.
2023-01-06 10:04:35 (37096): Enabling shared directory for VM.
2023-01-06 10:04:35 (37096): Starting VM using VBoxManage interface. (boinc_36df9f5f16a80810, slot#2)
2023-01-06 10:04:37 (37096): Successfully started VM. (PID = '37515')
2023-01-06 10:04:37 (37096): Reporting VM Process ID to BOINC.
2023-01-06 10:04:37 (37096): Guest Log: BIOS: VirtualBox 6.1.38
2023-01-06 10:04:37 (37096): Guest Log: CPUID EDX: 0x178bfbff
2023-01-06 10:04:37 (37096): Guest Log: BIOS: No PCI IDE controller, not probing IDE
2023-01-06 10:04:37 (37096): Guest Log: BIOS: AHCI 0-P#0: PCHS=16383/16/63 LCHS=1024/255/63 0x0000000002800000 sectors
2023-01-06 10:04:37 (37096): VM state change detected. (old = 'poweredoff', new = 'running')
2023-01-06 10:04:37 (37096): Detected: Web Application Enabled (http://localhost:37059)
2023-01-06 10:04:37 (37096): Preference change detected
2023-01-06 10:04:37 (37096): Setting CPU throttle for VM. (70%)
2023-01-06 10:04:37 (37096): Setting checkpoint interval to 600 seconds. (Higher value of (Preference: 480 seconds) or (Vbox_job.xml: 600 seconds))
2023-01-06 10:04:39 (37096): Guest Log: BIOS: Boot : bseqnr=1, bootseq=0032
2023-01-06 10:04:39 (37096): Guest Log: BIOS: Booting from Hard Disk...
2023-01-06 10:04:45 (37096): Guest Log: BIOS: KBD: unsupported int 16h function 03
2023-01-06 10:04:45 (37096): Guest Log: BIOS: AX=0305 BX=0000 CX=0000 DX=0000 
2023-01-06 10:05:47 (37096): Guest Log: vgdrvHeartbeatInit: Setting up heartbeat to trigger every 2000 milliseconds
2023-01-06 10:05:47 (37096): Guest Log: vboxguest: misc device minor 56, IRQ 20, I/O port d020, MMIO at 00000000f0400000 (size 0x400000)
2023-01-06 10:06:09 (37096): Guest Log: VBoxService 5.2.6 r120293 (verbosity: 0) linux.amd64 (Jan 15 2018 14:51:00) release log
2023-01-06 10:06:09 (37096): Guest Log: 00:00:00.004701 main     Log opened 2023-01-06T08:06:08.882597000Z
2023-01-06 10:06:09 (37096): Guest Log: 00:00:00.041812 main     OS Product: Linux
2023-01-06 10:06:09 (37096): Guest Log: 00:00:00.041964 main     OS Release: 4.14.76-13.cernvm.x86_64
2023-01-06 10:06:09 (37096): Guest Log: 00:00:00.042039 main     OS Version: #1 SMP Tue Oct 16 18:26:15 CEST 2018
2023-01-06 10:06:09 (37096): Guest Log: 00:00:00.042104 main     Executable: /usr/sbin/VBoxService
2023-01-06 10:06:09 (37096): Guest Log: 00:00:00.042105 main     Process ID: 3176
2023-01-06 10:06:09 (37096): Guest Log: 00:00:00.042106 main     Package type: LINUX_64BITS_GENERIC
2023-01-06 10:06:09 (37096): Guest Log: 00:00:00.113486 main     5.2.6 r120293 started. Verbose level = 0
2023-01-06 10:08:00 (37096): Guest Log: 10:07:59 EET +02:00 2023-01-06: cranky: [INFO] Detected Theory App
2023-01-06 10:08:00 (37096): Guest Log: 10:07:59 EET +02:00 2023-01-06: cranky: [INFO] Checking CVMFS.
2023-01-06 10:08:25 (37096): Guest Log: Probing /cvmfs/sft.cern.ch... Failed!
2023-01-06 10:08:25 (37096): Guest Log: 10:08:25 EET +02:00 2023-01-06: cranky: [ERROR] 'cvmfs_config probe sft.cern.ch' failed.
2023-01-06 11:43:39 (37096): Status Report: Job Duration: '864000.000000'
2023-01-06 11:43:39 (37096): Status Report: Elapsed Time: '6000.000000'
2023-01-06 11:43:39 (37096): Status Report: CPU Time: '234.240000'
2023-01-06 13:22:41 (37096): Status Report: Job Duration: '864000.000000'
2023-01-06 13:22:41 (37096): Status Report: Elapsed Time: '12000.000000'
2023-01-06 13:22:41 (37096): Status Report: CPU Time: '333.030000'
2023-01-06 15:01:42 (37096): Status Report: Job Duration: '864000.000000'
2023-01-06 15:01:42 (37096): Status Report: Elapsed Time: '18000.000000'
2023-01-06 15:01:42 (37096): Status Report: CPU Time: '431.080000'
2023-01-06 16:40:43 (37096): Status Report: Job Duration: '864000.000000'
2023-01-06 16:40:43 (37096): Status Report: Elapsed Time: '24000.000000'
2023-01-06 16:40:43 (37096): Status Report: CPU Time: '529.500000'
2023-01-06 18:19:46 (37096): Status Report: Job Duration: '864000.000000'
2023-01-06 18:19:46 (37096): Status Report: Elapsed Time: '30000.000000'
2023-01-06 18:19:46 (37096): Status Report: CPU Time: '631.700000'
2023-01-06 19:58:49 (37096): Status Report: Job Duration: '864000.000000'
2023-01-06 19:58:49 (37096): Status Report: Elapsed Time: '36000.000000'
2023-01-06 19:58:49 (37096): Status Report: CPU Time: '726.850000'
2023-01-06 21:37:51 (37096): Status Report: Job Duration: '864000.000000'
2023-01-06 21:37:51 (37096): Status Report: Elapsed Time: '42000.000000'
2023-01-06 21:37:51 (37096): Status Report: CPU Time: '818.570000'
2023-01-06 23:16:55 (37096): Status Report: Job Duration: '864000.000000'
2023-01-06 23:16:55 (37096): Status Report: Elapsed Time: '48000.000000'
2023-01-06 23:16:55 (37096): Status Report: CPU Time: '909.790000'
2023-01-07 00:55:57 (37096): Status Report: Job Duration: '864000.000000'
2023-01-07 00:55:57 (37096): Status Report: Elapsed Time: '54000.000000'
2023-01-07 00:55:57 (37096): Status Report: CPU Time: '1004.360000'
2023-01-07 02:34:58 (37096): Status Report: Job Duration: '864000.000000'
2023-01-07 02:34:58 (37096): Status Report: Elapsed Time: '60000.000000'
2023-01-07 02:34:58 (37096): Status Report: CPU Time: '1112.460000'
2023-01-07 04:13:58 (37096): Status Report: Job Duration: '864000.000000'
2023-01-07 04:13:58 (37096): Status Report: Elapsed Time: '66000.000000'
2023-01-07 04:13:58 (37096): Status Report: CPU Time: '1220.620000'
2023-01-07 05:52:58 (37096): Status Report: Job Duration: '864000.000000'
2023-01-07 05:52:58 (37096): Status Report: Elapsed Time: '72000.000000'
2023-01-07 05:52:58 (37096): Status Report: CPU Time: '1327.350000'
2023-01-07 07:31:59 (37096): Status Report: Job Duration: '864000.000000'
2023-01-07 07:31:59 (37096): Status Report: Elapsed Time: '78000.000000'
2023-01-07 07:31:59 (37096): Status Report: CPU Time: '1434.660000'
2023-01-07 09:10:59 (37096): Status Report: Job Duration: '864000.000000'
2023-01-07 09:10:59 (37096): Status Report: Elapsed Time: '84000.000000'
2023-01-07 09:10:59 (37096): Status Report: CPU Time: '1540.640000'
2023-01-07 10:50:01 (37096): Status Report: Job Duration: '864000.000000'
2023-01-07 10:50:01 (37096): Status Report: Elapsed Time: '90000.000000'
2023-01-07 10:50:01 (37096): Status Report: CPU Time: '1638.740000'
2023-01-07 12:29:04 (37096): Status Report: Job Duration: '864000.000000'
2023-01-07 12:29:04 (37096): Status Report: Elapsed Time: '96000.000000'
2023-01-07 12:29:04 (37096): Status Report: CPU Time: '1730.850000'
2023-01-07 14:08:07 (37096): Status Report: Job Duration: '864000.000000'
2023-01-07 14:08:07 (37096): Status Report: Elapsed Time: '102000.000000'
2023-01-07 14:08:07 (37096): Status Report: CPU Time: '1821.360000'
2023-01-07 15:47:10 (37096): Status Report: Job Duration: '864000.000000'
2023-01-07 15:47:10 (37096): Status Report: Elapsed Time: '108000.000000'
2023-01-07 15:47:10 (37096): Status Report: CPU Time: '1911.520000'
2023-01-07 17:26:13 (37096): Status Report: Job Duration: '864000.000000'
2023-01-07 17:26:13 (37096): Status Report: Elapsed Time: '114000.000000'
2023-01-07 17:26:13 (37096): Status Report: CPU Time: '2003.020000'
2023-01-07 19:05:16 (37096): Status Report: Job Duration: '864000.000000'
2023-01-07 19:05:16 (37096): Status Report: Elapsed Time: '120000.000000'
2023-01-07 19:05:16 (37096): Status Report: CPU Time: '2093.580000'
2023-01-07 20:44:19 (37096): Status Report: Job Duration: '864000.000000'
2023-01-07 20:44:19 (37096): Status Report: Elapsed Time: '126000.000000'
2023-01-07 20:44:19 (37096): Status Report: CPU Time: '2183.990000'
2023-01-07 22:23:23 (37096): Status Report: Job Duration: '864000.000000'
2023-01-07 22:23:23 (37096): Status Report: Elapsed Time: '132000.000000'
2023-01-07 22:23:23 (37096): Status Report: CPU Time: '2275.750000'
2023-01-08 00:02:26 (37096): Status Report: Job Duration: '864000.000000'
2023-01-08 00:02:26 (37096): Status Report: Elapsed Time: '138000.000000'
2023-01-08 00:02:26 (37096): Status Report: CPU Time: '2367.500000'
2023-01-08 01:41:29 (37096): Status Report: Job Duration: '864000.000000'
2023-01-08 01:41:29 (37096): Status Report: Elapsed Time: '144000.000000'
2023-01-08 01:41:29 (37096): Status Report: CPU Time: '2457.750000'
2023-01-08 03:20:32 (37096): Status Report: Job Duration: '864000.000000'
2023-01-08 03:20:32 (37096): Status Report: Elapsed Time: '150000.000000'
2023-01-08 03:20:32 (37096): Status Report: CPU Time: '2556.370000'
2023-01-08 04:59:32 (37096): Status Report: Job Duration: '864000.000000'
2023-01-08 04:59:32 (37096): Status Report: Elapsed Time: '156000.000000'
2023-01-08 04:59:32 (37096): Status Report: CPU Time: '2664.550000'
2023-01-08 06:38:33 (37096): Status Report: Job Duration: '864000.000000'
2023-01-08 06:38:33 (37096): Status Report: Elapsed Time: '162000.000000'
2023-01-08 06:38:33 (37096): Status Report: CPU Time: '2772.840000'
2023-01-08 08:17:33 (37096): Status Report: Job Duration: '864000.000000'
2023-01-08 08:17:33 (37096): Status Report: Elapsed Time: '168000.000000'
2023-01-08 08:17:33 (37096): Status Report: CPU Time: '2881.380000'
2023-01-08 09:56:34 (37096): Status Report: Job Duration: '864000.000000'
2023-01-08 09:56:34 (37096): Status Report: Elapsed Time: '174000.000000'
2023-01-08 09:56:34 (37096): Status Report: CPU Time: '2989.080000'
2023-01-08 11:35:37 (37096): Status Report: Job Duration: '864000.000000'
2023-01-08 11:35:37 (37096): Status Report: Elapsed Time: '180000.000000'
2023-01-08 11:35:37 (37096): Status Report: CPU Time: '3087.440000'
2023-01-08 13:14:40 (37096): Status Report: Job Duration: '864000.000000'
2023-01-08 13:14:40 (37096): Status Report: Elapsed Time: '186000.000000'
2023-01-08 13:14:40 (37096): Status Report: CPU Time: '3185.890000'
2023-01-08 14:53:43 (37096): Status Report: Job Duration: '864000.000000'
2023-01-08 14:53:43 (37096): Status Report: Elapsed Time: '192000.000000'
2023-01-08 14:53:43 (37096): Status Report: CPU Time: '3283.850000'
2023-01-08 16:32:46 (37096): Status Report: Job Duration: '864000.000000'
2023-01-08 16:32:46 (37096): Status Report: Elapsed Time: '198000.000000'
2023-01-08 16:32:46 (37096): Status Report: CPU Time: '3383.020000'
2023-01-08 18:11:49 (37096): Status Report: Job Duration: '864000.000000'
2023-01-08 18:11:49 (37096): Status Report: Elapsed Time: '204000.000000'
2023-01-08 18:11:49 (37096): Status Report: CPU Time: '3481.460000'
2023-01-08 19:50:52 (37096): Status Report: Job Duration: '864000.000000'
2023-01-08 19:50:52 (37096): Status Report: Elapsed Time: '210000.000000'
2023-01-08 19:50:52 (37096): Status Report: CPU Time: '3578.890000'
2023-01-08 21:29:55 (37096): Status Report: Job Duration: '864000.000000'
2023-01-08 21:29:55 (37096): Status Report: Elapsed Time: '216000.000000'
2023-01-08 21:29:55 (37096): Status Report: CPU Time: '3676.860000'
2023-01-08 23:08:58 (37096): Status Report: Job Duration: '864000.000000'
2023-01-08 23:08:58 (37096): Status Report: Elapsed Time: '222000.000000'
2023-01-08 23:08:58 (37096): Status Report: CPU Time: '3775.640000'
2023-01-09 00:47:59 (37096): Status Report: Job Duration: '864000.000000'
2023-01-09 00:47:59 (37096): Status Report: Elapsed Time: '228000.000000'
2023-01-09 00:47:59 (37096): Status Report: CPU Time: '3881.400000'
2023-01-09 02:27:00 (37096): Status Report: Job Duration: '864000.000000'
2023-01-09 02:27:00 (37096): Status Report: Elapsed Time: '234000.000000'
2023-01-09 02:27:00 (37096): Status Report: CPU Time: '3991.160000'
2023-01-09 04:06:00 (37096): Status Report: Job Duration: '864000.000000'
2023-01-09 04:06:00 (37096): Status Report: Elapsed Time: '240000.000000'
2023-01-09 04:06:00 (37096): Status Report: CPU Time: '4101.750000'
2023-01-09 05:45:00 (37096): Status Report: Job Duration: '864000.000000'
2023-01-09 05:45:00 (37096): Status Report: Elapsed Time: '246000.000000'
2023-01-09 05:45:00 (37096): Status Report: CPU Time: '4209.900000'
2023-01-09 07:24:01 (37096): Status Report: Job Duration: '864000.000000'
2023-01-09 07:24:01 (37096): Status Report: Elapsed Time: '252000.000000'
2023-01-09 07:24:01 (37096): Status Report: CPU Time: '4318.650000'
2023-01-09 09:03:01 (37096): Status Report: Job Duration: '864000.000000'
2023-01-09 09:03:01 (37096): Status Report: Elapsed Time: '258000.000000'
2023-01-09 09:03:01 (37096): Status Report: CPU Time: '4427.100000'
2023-01-09 10:42:02 (37096): Status Report: Job Duration: '864000.000000'
2023-01-09 10:42:02 (37096): Status Report: Elapsed Time: '264000.000000'
2023-01-09 10:42:02 (37096): Status Report: CPU Time: '4535.150000'
2023-01-09 12:21:04 (37096): Status Report: Job Duration: '864000.000000'
2023-01-09 12:21:04 (37096): Status Report: Elapsed Time: '270000.000000'
2023-01-09 12:21:04 (37096): Status Report: CPU Time: '4640.650000'
2023-01-09 14:00:07 (37096): Status Report: Job Duration: '864000.000000'
2023-01-09 14:00:07 (37096): Status Report: Elapsed Time: '276000.000000'
2023-01-09 14:00:07 (37096): Status Report: CPU Time: '4739.950000'
2023-01-09 15:39:10 (37096): Status Report: Job Duration: '864000.000000'
2023-01-09 15:39:10 (37096): Status Report: Elapsed Time: '282000.000000'
2023-01-09 15:39:10 (37096): Status Report: CPU Time: '4839.890000'
2023-01-09 17:18:13 (37096): Status Report: Job Duration: '864000.000000'
2023-01-09 17:18:13 (37096): Status Report: Elapsed Time: '288000.000000'
2023-01-09 17:18:13 (37096): Status Report: CPU Time: '4939.970000'
2023-01-09 18:57:15 (37096): Status Report: Job Duration: '864000.000000'
2023-01-09 18:57:15 (37096): Status Report: Elapsed Time: '294000.000000'
2023-01-09 18:57:15 (37096): Status Report: CPU Time: '5043.610000'
2023-01-09 20:36:15 (37096): Status Report: Job Duration: '864000.000000'
2023-01-09 20:36:15 (37096): Status Report: Elapsed Time: '300000.000000'
2023-01-09 20:36:15 (37096): Status Report: CPU Time: '5152.200000'
2023-01-09 22:15:15 (37096): Status Report: Job Duration: '864000.000000'
2023-01-09 22:15:15 (37096): Status Report: Elapsed Time: '306000.000000'
2023-01-09 22:15:15 (37096): Status Report: CPU Time: '5261.660000'
2023-01-09 23:54:16 (37096): Status Report: Job Duration: '864000.000000'
2023-01-09 23:54:16 (37096): Status Report: Elapsed Time: '312000.000000'
2023-01-09 23:54:16 (37096): Status Report: CPU Time: '5370.320000'
2023-01-10 01:33:16 (37096): Status Report: Job Duration: '864000.000000'
2023-01-10 01:33:16 (37096): Status Report: Elapsed Time: '318000.000000'
2023-01-10 01:33:16 (37096): Status Report: CPU Time: '5479.480000'
2023-01-10 03:12:17 (37096): Status Report: Job Duration: '864000.000000'
2023-01-10 03:12:17 (37096): Status Report: Elapsed Time: '324000.000000'
2023-01-10 03:12:17 (37096): Status Report: CPU Time: '5587.320000'
2023-01-10 04:51:17 (37096): Status Report: Job Duration: '864000.000000'
2023-01-10 04:51:17 (37096): Status Report: Elapsed Time: '330000.000000'
2023-01-10 04:51:17 (37096): Status Report: CPU Time: '5697.140000'
2023-01-10 06:30:18 (37096): Status Report: Job Duration: '864000.000000'
2023-01-10 06:30:18 (37096): Status Report: Elapsed Time: '336000.000000'
2023-01-10 06:30:18 (37096): Status Report: CPU Time: '5807.660000'
2023-01-10 08:09:18 (37096): Status Report: Job Duration: '864000.000000'
2023-01-10 08:09:18 (37096): Status Report: Elapsed Time: '342000.000000'
2023-01-10 08:09:18 (37096): Status Report: CPU Time: '5915.630000'
2023-01-10 09:48:21 (37096): Status Report: Job Duration: '864000.000000'
2023-01-10 09:48:21 (37096): Status Report: Elapsed Time: '348000.000000'
2023-01-10 09:48:21 (37096): Status Report: CPU Time: '6015.860000'
2023-01-10 11:27:24 (37096): Status Report: Job Duration: '864000.000000'
2023-01-10 11:27:24 (37096): Status Report: Elapsed Time: '354000.000000'
2023-01-10 11:27:24 (37096): Status Report: CPU Time: '6116.560000'
2023-01-10 13:06:27 (37096): Status Report: Job Duration: '864000.000000'
2023-01-10 13:06:27 (37096): Status Report: Elapsed Time: '360000.000000'
2023-01-10 13:06:27 (37096): Status Report: CPU Time: '6209.130000'
2023-01-10 14:45:31 (37096): Status Report: Job Duration: '864000.000000'
2023-01-10 14:45:31 (37096): Status Report: Elapsed Time: '366000.000000'
2023-01-10 14:45:31 (37096): Status Report: CPU Time: '6301.740000'
2023-01-10 16:24:34 (37096): Status Report: Job Duration: '864000.000000'
2023-01-10 16:24:34 (37096): Status Report: Elapsed Time: '372000.000000'
2023-01-10 16:24:34 (37096): Status Report: CPU Time: '6401.270000'
2023-01-10 18:03:36 (37096): Status Report: Job Duration: '864000.000000'
2023-01-10 18:03:36 (37096): Status Report: Elapsed Time: '378000.000000'
2023-01-10 18:03:36 (37096): Status Report: CPU Time: '6501.320000'
2023-01-10 19:42:37 (37096): Status Report: Job Duration: '864000.000000'
2023-01-10 19:42:37 (37096): Status Report: Elapsed Time: '384000.000000'
2023-01-10 19:42:37 (37096): Status Report: CPU Time: '6608.760000'
2023-01-10 21:21:38 (37096): Status Report: Job Duration: '864000.000000'
2023-01-10 21:21:38 (37096): Status Report: Elapsed Time: '390000.000000'
2023-01-10 21:21:38 (37096): Status Report: CPU Time: '6719.020000'
2023-01-10 23:00:38 (37096): Status Report: Job Duration: '864000.000000'
2023-01-10 23:00:38 (37096): Status Report: Elapsed Time: '396000.000000'
2023-01-10 23:00:38 (37096): Status Report: CPU Time: '6828.020000'
2023-01-11 00:39:39 (37096): Status Report: Job Duration: '864000.000000'
2023-01-11 00:39:39 (37096): Status Report: Elapsed Time: '402000.000000'
2023-01-11 00:39:39 (37096): Status Report: CPU Time: '6938.740000'
2023-01-11 02:18:39 (37096): Status Report: Job Duration: '864000.000000'
2023-01-11 02:18:39 (37096): Status Report: Elapsed Time: '408000.000000'
2023-01-11 02:18:39 (37096): Status Report: CPU Time: '7050.200000'
2023-01-11 03:57:40 (37096): Status Report: Job Duration: '864000.000000'
2023-01-11 03:57:40 (37096): Status Report: Elapsed Time: '414000.000000'
2023-01-11 03:57:40 (37096): Status Report: CPU Time: '7162.510000'
2023-01-11 05:36:40 (37096): Status Report: Job Duration: '864000.000000'
2023-01-11 05:36:40 (37096): Status Report: Elapsed Time: '420000.000000'
2023-01-11 05:36:40 (37096): Status Report: CPU Time: '7272.330000'
2023-01-11 07:15:40 (37096): Status Report: Job Duration: '864000.000000'
2023-01-11 07:15:40 (37096): Status Report: Elapsed Time: '426000.000000'
2023-01-11 07:15:40 (37096): Status Report: CPU Time: '7381.880000'
2023-01-11 08:54:43 (37096): Status Report: Job Duration: '864000.000000'
2023-01-11 08:54:43 (37096): Status Report: Elapsed Time: '432000.000000'
2023-01-11 08:54:43 (37096): Status Report: CPU Time: '7480.920000'
2023-01-11 10:33:46 (37096): Status Report: Job Duration: '864000.000000'
2023-01-11 10:33:46 (37096): Status Report: Elapsed Time: '438000.000000'
2023-01-11 10:33:46 (37096): Status Report: CPU Time: '7574.440000'
2023-01-11 12:12:49 (37096): Status Report: Job Duration: '864000.000000'
2023-01-11 12:12:49 (37096): Status Report: Elapsed Time: '444000.000000'
2023-01-11 12:12:49 (37096): Status Report: CPU Time: '7669.210000'
2023-01-11 13:51:53 (37096): Status Report: Job Duration: '864000.000000'
2023-01-11 13:51:53 (37096): Status Report: Elapsed Time: '450000.000000'
2023-01-11 13:51:53 (37096): Status Report: CPU Time: '7762.260000'
2023-01-11 15:30:56 (37096): Status Report: Job Duration: '864000.000000'
2023-01-11 15:30:56 (37096): Status Report: Elapsed Time: '456000.000000'
2023-01-11 15:30:56 (37096): Status Report: CPU Time: '7855.820000'
2023-01-11 17:09:59 (37096): Status Report: Job Duration: '864000.000000'
2023-01-11 17:09:59 (37096): Status Report: Elapsed Time: '462000.000000'
2023-01-11 17:09:59 (37096): Status Report: CPU Time: '7949.110000'
2023-01-11 18:49:02 (37096): Status Report: Job Duration: '864000.000000'
2023-01-11 18:49:02 (37096): Status Report: Elapsed Time: '468000.000000'
2023-01-11 18:49:02 (37096): Status Report: CPU Time: '8045.820000'
2023-01-11 20:28:02 (37096): Status Report: Job Duration: '864000.000000'
2023-01-11 20:28:02 (37096): Status Report: Elapsed Time: '474000.000000'
2023-01-11 20:28:02 (37096): Status Report: CPU Time: '8156.470000'
2023-01-11 22:07:03 (37096): Status Report: Job Duration: '864000.000000'
2023-01-11 22:07:03 (37096): Status Report: Elapsed Time: '480000.000000'
2023-01-11 22:07:03 (37096): Status Report: CPU Time: '8266.550000'
2023-01-11 23:46:03 (37096): Status Report: Job Duration: '864000.000000'
2023-01-11 23:46:03 (37096): Status Report: Elapsed Time: '486000.000000'
2023-01-11 23:46:03 (37096): Status Report: CPU Time: '8376.140000'
2023-01-12 01:25:05 (37096): Status Report: Job Duration: '864000.000000'
2023-01-12 01:25:05 (37096): Status Report: Elapsed Time: '492000.000000'
2023-01-12 01:25:05 (37096): Status Report: CPU Time: '8477.790000'
2023-01-12 03:04:08 (37096): Status Report: Job Duration: '864000.000000'
2023-01-12 03:04:08 (37096): Status Report: Elapsed Time: '498000.000000'
2023-01-12 03:04:08 (37096): Status Report: CPU Time: '8571.710000'
2023-01-12 04:43:12 (37096): Status Report: Job Duration: '864000.000000'
2023-01-12 04:43:12 (37096): Status Report: Elapsed Time: '504000.000000'
2023-01-12 04:43:12 (37096): Status Report: CPU Time: '8667.700000'
2023-01-12 06:22:15 (37096): Status Report: Job Duration: '864000.000000'
2023-01-12 06:22:15 (37096): Status Report: Elapsed Time: '510000.000000'
2023-01-12 06:22:15 (37096): Status Report: CPU Time: '8763.510000'
2023-01-12 08:01:18 (37096): Status Report: Job Duration: '864000.000000'
2023-01-12 08:01:18 (37096): Status Report: Elapsed Time: '516000.000000'
2023-01-12 08:01:18 (37096): Status Report: CPU Time: '8861.280000'
2023-01-12 09:40:21 (37096): Status Report: Job Duration: '864000.000000'
2023-01-12 09:40:21 (37096): Status Report: Elapsed Time: '522000.000000'
2023-01-12 09:40:21 (37096): Status Report: CPU Time: '8955.490000'
2023-01-12 11:19:24 (37096): Status Report: Job Duration: '864000.000000'
2023-01-12 11:19:24 (37096): Status Report: Elapsed Time: '528000.000000'
2023-01-12 11:19:24 (37096): Status Report: CPU Time: '9054.020000'
2023-01-12 12:58:24 (37096): Status Report: Job Duration: '864000.000000'
2023-01-12 12:58:24 (37096): Status Report: Elapsed Time: '534000.000000'
2023-01-12 12:58:24 (37096): Status Report: CPU Time: '9164.980000'
2023-01-12 14:37:25 (37096): Status Report: Job Duration: '864000.000000'
2023-01-12 14:37:25 (37096): Status Report: Elapsed Time: '540000.000000'
2023-01-12 14:37:25 (37096): Status Report: CPU Time: '9274.480000'
2023-01-12 16:16:26 (37096): Status Report: Job Duration: '864000.000000'
2023-01-12 16:16:26 (37096): Status Report: Elapsed Time: '546000.000000'
2023-01-12 16:16:26 (37096): Status Report: CPU Time: '9374.250000'
2023-01-12 17:55:28 (37096): Status Report: Job Duration: '864000.000000'
2023-01-12 17:55:28 (37096): Status Report: Elapsed Time: '552000.000000'
2023-01-12 17:55:28 (37096): Status Report: CPU Time: '9469.910000'
2023-01-12 19:34:31 (37096): Status Report: Job Duration: '864000.000000'
2023-01-12 19:34:31 (37096): Status Report: Elapsed Time: '558000.000000'
2023-01-12 19:34:31 (37096): Status Report: CPU Time: '9566.070000'
2023-01-12 21:13:31 (37096): Status Report: Job Duration: '864000.000000'
2023-01-12 21:13:31 (37096): Status Report: Elapsed Time: '564000.000000'
2023-01-12 21:13:31 (37096): Status Report: CPU Time: '9676.750000'
2023-01-12 22:52:32 (37096): Status Report: Job Duration: '864000.000000'
2023-01-12 22:52:32 (37096): Status Report: Elapsed Time: '570000.000000'
2023-01-12 22:52:32 (37096): Status Report: CPU Time: '9787.560000'
2023-01-13 00:31:32 (37096): Status Report: Job Duration: '864000.000000'
2023-01-13 00:31:32 (37096): Status Report: Elapsed Time: '576000.000000'
2023-01-13 00:31:32 (37096): Status Report: CPU Time: '9900.070000'
2023-01-13 02:10:33 (37096): Status Report: Job Duration: '864000.000000'
2023-01-13 02:10:33 (37096): Status Report: Elapsed Time: '582000.000000'
2023-01-13 02:10:33 (37096): Status Report: CPU Time: '10012.080000'
2023-01-13 03:49:33 (37096): Status Report: Job Duration: '864000.000000'
2023-01-13 03:49:33 (37096): Status Report: Elapsed Time: '588000.000000'
2023-01-13 03:49:33 (37096): Status Report: CPU Time: '10124.240000'
2023-01-13 05:28:34 (37096): Status Report: Job Duration: '864000.000000'
2023-01-13 05:28:34 (37096): Status Report: Elapsed Time: '594000.000000'
2023-01-13 05:28:34 (37096): Status Report: CPU Time: '10235.870000'
2023-01-13 07:07:34 (37096): Status Report: Job Duration: '864000.000000'
2023-01-13 07:07:34 (37096): Status Report: Elapsed Time: '600000.000000'
2023-01-13 07:07:34 (37096): Status Report: CPU Time: '10347.570000'
2023-01-13 08:46:37 (37096): Status Report: Job Duration: '864000.000000'
2023-01-13 08:46:37 (37096): Status Report: Elapsed Time: '606000.000000'
2023-01-13 08:46:37 (37096): Status Report: CPU Time: '10443.350000'
2023-01-13 10:25:40 (37096): Status Report: Job Duration: '864000.000000'
2023-01-13 10:25:40 (37096): Status Report: Elapsed Time: '612000.000000'
2023-01-13 10:25:40 (37096): Status Report: CPU Time: '10541.110000'
2023-01-13 12:04:44 (37096): Status Report: Job Duration: '864000.000000'
2023-01-13 12:04:44 (37096): Status Report: Elapsed Time: '618000.000000'
2023-01-13 12:04:44 (37096): Status Report: CPU Time: '10638.330000'
2023-01-13 13:43:45 (37096): Status Report: Job Duration: '864000.000000'
2023-01-13 13:43:45 (37096): Status Report: Elapsed Time: '624000.000000'
2023-01-13 13:43:45 (37096): Status Report: CPU Time: '10742.310000'
2023-01-13 15:22:46 (37096): Status Report: Job Duration: '864000.000000'
2023-01-13 15:22:46 (37096): Status Report: Elapsed Time: '630000.000000'
2023-01-13 15:22:46 (37096): Status Report: CPU Time: '10853.470000'
2023-01-13 17:01:48 (37096): Status Report: Job Duration: '864000.000000'
2023-01-13 17:01:48 (37096): Status Report: Elapsed Time: '636000.000000'
2023-01-13 17:01:48 (37096): Status Report: CPU Time: '10953.080000'
2023-01-13 18:40:51 (37096): Status Report: Job Duration: '864000.000000'
2023-01-13 18:40:51 (37096): Status Report: Elapsed Time: '642000.000000'
2023-01-13 18:40:51 (37096): Status Report: CPU Time: '11049.090000'
2023-01-13 20:19:52 (37096): Status Report: Job Duration: '864000.000000'
2023-01-13 20:19:52 (37096): Status Report: Elapsed Time: '648000.000000'
2023-01-13 20:19:52 (37096): Status Report: CPU Time: '11161.310000'
2023-01-13 21:58:52 (37096): Status Report: Job Duration: '864000.000000'
2023-01-13 21:58:52 (37096): Status Report: Elapsed Time: '654000.000000'
2023-01-13 21:58:52 (37096): Status Report: CPU Time: '11273.220000'
2023-01-13 23:37:52 (37096): Status Report: Job Duration: '864000.000000'
2023-01-13 23:37:52 (37096): Status Report: Elapsed Time: '660000.000000'
2023-01-13 23:37:52 (37096): Status Report: CPU Time: '11384.670000'
2023-01-13 23:39:26 (37096): Preference change detected
2023-01-13 23:39:26 (37096): Setting CPU throttle for VM. (70%)
2023-01-13 23:39:27 (37096): Setting checkpoint interval to 600 seconds. (Higher value of (Preference: 480 seconds) or (Vbox_job.xml: 600 seconds))
2023-01-14 01:16:54 (37096): Status Report: Job Duration: '864000.000000'
2023-01-14 01:16:54 (37096): Status Report: Elapsed Time: '666000.569395'
2023-01-14 01:16:54 (37096): Status Report: CPU Time: '11497.370000'
2023-01-14 02:55:54 (37096): Status Report: Job Duration: '864000.000000'
2023-01-14 02:55:54 (37096): Status Report: Elapsed Time: '672000.569395'
2023-01-14 02:55:54 (37096): Status Report: CPU Time: '11608.910000'
2023-01-14 04:34:54 (37096): Status Report: Job Duration: '864000.000000'
2023-01-14 04:34:54 (37096): Status Report: Elapsed Time: '678000.569395'
2023-01-14 04:34:54 (37096): Status Report: CPU Time: '11723.140000'
2023-01-14 06:13:55 (37096): Status Report: Job Duration: '864000.000000'
2023-01-14 06:13:55 (37096): Status Report: Elapsed Time: '684000.569395'
2023-01-14 06:13:55 (37096): Status Report: CPU Time: '11836.620000'
2023-01-14 07:52:55 (37096): Status Report: Job Duration: '864000.000000'
2023-01-14 07:52:55 (37096): Status Report: Elapsed Time: '690000.569395'
2023-01-14 07:52:55 (37096): Status Report: CPU Time: '11947.940000'
2023-01-14 09:31:57 (37096): Status Report: Job Duration: '864000.000000'
2023-01-14 09:31:57 (37096): Status Report: Elapsed Time: '696000.569395'
2023-01-14 09:31:57 (37096): Status Report: CPU Time: '12056.370000'
2023-01-14 11:11:00 (37096): Status Report: Job Duration: '864000.000000'
2023-01-14 11:11:00 (37096): Status Report: Elapsed Time: '702000.569395'
2023-01-14 11:11:00 (37096): Status Report: CPU Time: '12158.580000'
2023-01-14 12:50:03 (37096): Status Report: Job Duration: '864000.000000'
2023-01-14 12:50:03 (37096): Status Report: Elapsed Time: '708000.569395'
2023-01-14 12:50:03 (37096): Status Report: CPU Time: '12261.060000'
2023-01-14 14:29:06 (37096): Status Report: Job Duration: '864000.000000'
2023-01-14 14:29:06 (37096): Status Report: Elapsed Time: '714000.569395'
2023-01-14 14:29:06 (37096): Status Report: CPU Time: '12362.740000'
2023-01-14 16:08:09 (37096): Status Report: Job Duration: '864000.000000'
2023-01-14 16:08:09 (37096): Status Report: Elapsed Time: '720000.569395'
2023-01-14 16:08:09 (37096): Status Report: CPU Time: '12465.330000'
2023-01-14 17:47:12 (37096): Status Report: Job Duration: '864000.000000'
2023-01-14 17:47:12 (37096): Status Report: Elapsed Time: '726000.569395'
2023-01-14 17:47:12 (37096): Status Report: CPU Time: '12566.670000'
2023-01-14 19:26:15 (37096): Status Report: Job Duration: '864000.000000'
2023-01-14 19:26:15 (37096): Status Report: Elapsed Time: '732000.569395'
2023-01-14 19:26:15 (37096): Status Report: CPU Time: '12668.870000'
2023-01-14 21:05:18 (37096): Status Report: Job Duration: '864000.000000'
2023-01-14 21:05:18 (37096): Status Report: Elapsed Time: '738000.569395'
2023-01-14 21:05:18 (37096): Status Report: CPU Time: '12770.420000'
2023-01-14 22:44:21 (37096): Status Report: Job Duration: '864000.000000'
2023-01-14 22:44:21 (37096): Status Report: Elapsed Time: '744000.569395'
2023-01-14 22:44:21 (37096): Status Report: CPU Time: '12871.400000'
2023-01-15 00:23:24 (37096): Status Report: Job Duration: '864000.000000'
2023-01-15 00:23:24 (37096): Status Report: Elapsed Time: '750000.569395'
2023-01-15 00:23:24 (37096): Status Report: CPU Time: '12975.800000'
2023-01-15 02:02:27 (37096): Status Report: Job Duration: '864000.000000'
2023-01-15 02:02:27 (37096): Status Report: Elapsed Time: '756000.569395'
2023-01-15 02:02:27 (37096): Status Report: CPU Time: '13079.680000'
2023-01-15 03:41:27 (37096): Status Report: Job Duration: '864000.000000'
2023-01-15 03:41:27 (37096): Status Report: Elapsed Time: '762000.569395'
2023-01-15 03:41:27 (37096): Status Report: CPU Time: '13192.770000'
2023-01-15 05:20:27 (37096): Status Report: Job Duration: '864000.000000'
2023-01-15 05:20:27 (37096): Status Report: Elapsed Time: '768000.569395'
2023-01-15 05:20:27 (37096): Status Report: CPU Time: '13300.410000'
2023-01-15 06:59:28 (37096): Status Report: Job Duration: '864000.000000'
2023-01-15 06:59:28 (37096): Status Report: Elapsed Time: '774000.569395'
2023-01-15 06:59:28 (37096): Status Report: CPU Time: '13408.520000'
2023-01-15 08:38:28 (37096): Status Report: Job Duration: '864000.000000'
2023-01-15 08:38:28 (37096): Status Report: Elapsed Time: '780000.569395'
2023-01-15 08:38:28 (37096): Status Report: CPU Time: '13516.700000'
2023-01-15 10:17:30 (37096): Status Report: Job Duration: '864000.000000'
2023-01-15 10:17:30 (37096): Status Report: Elapsed Time: '786000.569395'
2023-01-15 10:17:30 (37096): Status Report: CPU Time: '13621.860000'
2023-01-15 11:56:33 (37096): Status Report: Job Duration: '864000.000000'
2023-01-15 11:56:33 (37096): Status Report: Elapsed Time: '792000.569395'
2023-01-15 11:56:33 (37096): Status Report: CPU Time: '13714.910000'
2023-01-15 13:35:36 (37096): Status Report: Job Duration: '864000.000000'
2023-01-15 13:35:36 (37096): Status Report: Elapsed Time: '798000.569395'
2023-01-15 13:35:36 (37096): Status Report: CPU Time: '13809.190000'
2023-01-15 15:14:39 (37096): Status Report: Job Duration: '864000.000000'
2023-01-15 15:14:39 (37096): Status Report: Elapsed Time: '804000.569395'
2023-01-15 15:14:39 (37096): Status Report: CPU Time: '13903.360000'
2023-01-15 16:53:40 (37096): Status Report: Job Duration: '864000.000000'
2023-01-15 16:53:40 (37096): Status Report: Elapsed Time: '810000.569395'
2023-01-15 16:53:40 (37096): Status Report: CPU Time: '14009.880000'
2023-01-15 18:32:40 (37096): Status Report: Job Duration: '864000.000000'
2023-01-15 18:32:40 (37096): Status Report: Elapsed Time: '816000.569395'
2023-01-15 18:32:40 (37096): Status Report: CPU Time: '14120.780000'
2023-01-15 20:11:41 (37096): Status Report: Job Duration: '864000.000000'
2023-01-15 20:11:41 (37096): Status Report: Elapsed Time: '822000.569395'
2023-01-15 20:11:41 (37096): Status Report: CPU Time: '14232.990000'
2023-01-15 21:50:41 (37096): Status Report: Job Duration: '864000.000000'
2023-01-15 21:50:41 (37096): Status Report: Elapsed Time: '828000.569395'
2023-01-15 21:50:41 (37096): Status Report: CPU Time: '14344.290000'
2023-01-15 23:29:42 (37096): Status Report: Job Duration: '864000.000000'
2023-01-15 23:29:42 (37096): Status Report: Elapsed Time: '834000.569395'
2023-01-15 23:29:42 (37096): Status Report: CPU Time: '14456.080000'
2023-01-16 01:08:42 (37096): Status Report: Job Duration: '864000.000000'
2023-01-16 01:08:42 (37096): Status Report: Elapsed Time: '840000.569395'
2023-01-16 01:08:42 (37096): Status Report: CPU Time: '14569.420000'
2023-01-16 02:47:43 (37096): Status Report: Job Duration: '864000.000000'
2023-01-16 02:47:43 (37096): Status Report: Elapsed Time: '846000.569395'
2023-01-16 02:47:43 (37096): Status Report: CPU Time: '14680.410000'
2023-01-16 04:26:43 (37096): Status Report: Job Duration: '864000.000000'
2023-01-16 04:26:43 (37096): Status Report: Elapsed Time: '852000.569395'
2023-01-16 04:26:43 (37096): Status Report: CPU Time: '14795.440000'
2023-01-16 06:05:44 (37096): Status Report: Job Duration: '864000.000000'
2023-01-16 06:05:44 (37096): Status Report: Elapsed Time: '858000.569395'
2023-01-16 06:05:44 (37096): Status Report: CPU Time: '14904.180000'
2023-01-16 07:44:44 (37096): Status Report: Job Duration: '864000.000000'
2023-01-16 07:44:44 (37096): Status Report: Elapsed Time: '864000.569395'
2023-01-16 07:44:44 (37096): Status Report: CPU Time: '15012.240000'
2023-01-16 07:44:44 (37096): Powering off VM.
2023-01-16 07:44:49 (37096): Successfully stopped VM.
2023-01-16 07:44:49 (37096): Deregistering VM. (boinc_36df9f5f16a80810, slot#2)
2023-01-16 07:44:49 (37096): Removing network bandwidth throttle group from VM.
2023-01-16 07:44:49 (37096): Removing VM from VirtualBox.
07:44:55 (37096): called boinc_finish(0)

</stderr_txt>
<message>
upload failure: <file_xfer_error>
  <file_name>Theory_2390-1112290-326_1_r350639367_result</file_name>
  <error_code>-161 (not found)</error_code>
</file_xfer_error>
</message>
]]>



