Message boards : Sixtrack Application : SIXTRACKTEST
Joined: 29 Feb 16 · Posts: 157 · Credit: 2,659,975 · RAC: 0

Hello Crystal Pellet, many thanks for the feedback. Yes, I have experienced similar problems on my Linux desktop PC. As you figured out on your own, the big beast is mem_alloc.log, a debug file we use to monitor dynamic array allocation. We forgot to put it under the DEBUG pre-processing flag, and we overused dynamic array allocation in a specific routine for beam-beam effects. We fixed both issues yesterday, but we are waiting for other things to come up before updating the executables. In the meantime, I am submitting jobs with v5.01.01 without beam-beam effects. We are also testing version pinning, which is why you may see sixtracktest tasks arriving with v4.6.30.

I am more interested in your successful tasks, e.g.
https://lhcathome.cern.ch/lhcathome/result.php?resultid=212498881
https://lhcathome.cern.ch/lhcathome/result.php?resultid=212498882
https://lhcathome.cern.ch/lhcathome/result.php?resultid=212498883
They ran for quite a while (i.e. >1h), and I don't understand how they did not hit the disk limit...
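For readers curious what the fix amounts to: debug output like mem_alloc.log is normally gated behind a compile-time flag, so production builds skip it entirely. SixTrack is Fortran with cpp-style preprocessing, but the idea can be sketched language-neutrally in Python; the names below (log_alloc, beam_beam_buffer) are purely illustrative, not actual SixTrack code.

```python
# Sketch of guarding debug output behind a build flag, as described above.
# In SixTrack this is a compile-time pre-processing flag; here we use a
# plain module-level constant for illustration.
DEBUG = False  # production builds leave this off

def log_alloc(array_name, n_elements):
    """Record a dynamic-array (re)allocation, but only in debug builds."""
    if DEBUG:
        with open("mem_alloc.log", "a") as f:
            f.write(f"{array_name}: {n_elements} elements\n")

# Called very frequently (e.g. from a beam-beam routine), this is a no-op
# in production; left unguarded, the log grew to ~740 MB in one task.
log_alloc("beam_beam_buffer", 1024)
```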
Joined: 14 Jan 10 · Posts: 1411 · Credit: 9,433,926 · RAC: 11,615

> I am more interested in your successful tasks, e.g.

I don't understand either, because I dug into the result with id 212498881 and found the mem_alloc file a few minutes before the task finished. I was monitoring files larger than 524,288 bytes for LHC-dev.

17-Dec-2018 14:03:28 [LHC@home] Starting task sixtrack5p1p1_hl13B1__1__s__62.31_60.32__1_2__5__80_1_sixvf_boinc33_2
17-Dec-2018 15:45:05 [LHC@home] Computation for task sixtrack5p1p1_hl13B1__1__s__62.31_60.32__1_2__5__80_1_sixvf_boinc33_2 finished

Big files in slot 7 (the slot for the above task):

localtime    mod time   size       file
15:40:22.14  12:50:15     4517924  ".\7\fort.16"
15:40:22.14  15:38:52      728247  ".\7\fort.6"
15:40:22.14  14:03:28     4611986  ".\7\fort.9"
15:41:27.21  12:50:15     4517924  ".\7\fort.16"
15:41:27.21  14:03:28     1177848  ".\7\fort.31"
15:41:27.21  15:38:52      728247  ".\7\fort.6"
15:41:27.21  14:03:28     4611986  ".\7\fort.9"
15:41:27.21  15:41:16   739613414  ".\7\mem_alloc.log"
15:42:34.19  12:50:15     4517924  ".\7\fort.16"
15:42:34.19  14:03:28     1177848  ".\7\fort.31"
15:42:34.19  15:41:50      729597  ".\7\fort.6"
15:42:34.19  14:03:28     4611986  ".\7\fort.9"
15:42:34.19  15:41:16   739613414  ".\7\mem_alloc.log"
15:43:37.11  12:50:15     4517924  ".\7\fort.16"
15:43:37.11  14:03:28     1177848  ".\7\fort.31"
15:43:37.11  15:41:50      729597  ".\7\fort.6"
15:43:37.11  14:03:28     4611986  ".\7\fort.9"
15:43:37.11  15:41:16   739613414  ".\7\mem_alloc.log"
15:44:41.16  12:50:15     4517924  ".\7\fort.16"
15:44:41.16  14:03:28     1177848  ".\7\fort.31"
15:44:41.16  15:41:50      729597  ".\7\fort.6"
15:44:41.16  14:03:28     4611986  ".\7\fort.9"
15:44:41.16  15:41:16   739613414  ".\7\mem_alloc.log"

Next task in slot 7:
15:45:46.17  12:51:33     4517924  ".\7\fort.16"
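The "big files" monitoring shown above can be reproduced with a small script that walks a BOINC slots directory and reports anything over a size threshold. This is a minimal sketch; the slots path is a hypothetical example, so adjust it for your own BOINC data directory.

```python
# Scan a directory tree for files larger than a threshold and print them
# in "modtime size path" form, similar to the listing above.
import os
import time

THRESHOLD = 524_288  # bytes, the limit used in the post above
SLOTS_DIR = "slots"  # hypothetical path; point at your BOINC slots/ folder

def big_files(root, threshold):
    """Return (modtime, size, path) for every file under root above threshold."""
    hits = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            try:
                st = os.stat(path)
            except OSError:
                continue  # file vanished between listing and stat
            if st.st_size > threshold:
                mod = time.strftime("%H:%M:%S", time.localtime(st.st_mtime))
                hits.append((mod, st.st_size, path))
    return hits

for mod, size, path in big_files(SLOTS_DIR, THRESHOLD):
    print(f"{mod} {size:>12} {path}")
```

Run in a loop (e.g. once a minute, as in the listing above) this catches short-lived monsters like mem_alloc.log before the slot is cleaned up at task end.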
Joined: 28 Sep 04 · Posts: 722 · Credit: 48,339,757 · RAC: 29,701

New tasks for sixtracktest v5.01.01 just arrived at two of my hosts (one each). They are still in the queue waiting to be crunched.
Joined: 18 Dec 15 · Posts: 1785 · Credit: 117,273,852 · RAC: 71,412

Although 484 such tasks should be available according to the Project Status page, my host doesn't download any :-(
Joined: 26 Oct 18 · Posts: 96 · Credit: 4,188,598 · RAC: 0

> Although 484 such tasks should be available according to the Project Status page, my host doesn't download any :-(

An hour later I see there are over 10K tasks in progress. My understanding is that the server status page doesn't get updated very often here. I would guess all the tasks had already been delivered even while the status page still showed some as available. 484 is a moderately small portion of 10K, and there are plenty of hosts that together would grab that amount instantly if it were available. Perhaps your computer was just late.
Joined: 29 Sep 17 · Posts: 14 · Credit: 5,244,196 · RAC: 0

> I don't understand either, because I dug into the result with id 212498881 and found the mem_alloc file a few minutes before the task finished.

Just an update: this was a debug file we used during development of SixTrack 5, when we added dynamic allocation of the large tracking arrays. It should not have gone into the BOINC executables, and this has been fixed in the current test executables.

SixTrack 5 Core Developer · github.com/SixTrack/SixTrack
Joined: 24 Oct 04 · Posts: 1169 · Credit: 54,074,888 · RAC: 51,458

The server status page has shown "sixtracktest 21986 Unsent" for about 24 hours (or more), but they won't send out.
Joined: 7 Apr 18 · Posts: 20 · Credit: 137,327 · RAC: 0

There are over 124k sixtracktest tasks available and none in progress. What's wrong?
Joined: 29 Feb 16 · Posts: 157 · Credit: 2,659,975 · RAC: 0

We are checking the reason for that with the IT team - we will come back to you once the situation is clarified.
Joined: 15 Jul 05 · Posts: 247 · Credit: 5,974,599 · RAC: 0

I wonder if there is something with the task deadline or similar that prevents tasks from being picked up. On the server side, sixtracktest tasks are pushed to the scheduler's shared memory. However, my BOINC client cannot get any tasks either.
Joined: 29 Feb 16 · Posts: 157 · Credit: 2,659,975 · RAC: 0

An update: there seems to be an issue with a feature recently added to the BOINC server code that was used for submitting those tasks. The tasks have been deleted, and we are debugging the feature - thanks Nils!
Joined: 28 Sep 04 · Posts: 722 · Credit: 48,339,757 · RAC: 29,701

SIXTRACKTEST tasks are available again, but once again I seem to be unable to download them.
Joined: 16 May 14 · Posts: 15 · Credit: 7,357,879 · RAC: 0

I can't download any either - why?
Joined: 24 Oct 04 · Posts: 1169 · Credit: 54,074,888 · RAC: 51,458

2/23/2019 1:56:33 AM | LHC@home | No tasks are available for sixtracktest

sixtracktest: 134332 Unsent · 385 In progress
Joined: 27 Sep 08 · Posts: 831 · Credit: 688,448,880 · RAC: 143,314

My Mac has some, so maybe they are all Mac tasks?
Joined: 1 Feb 06 · Posts: 66 · Credit: 9,723 · RAC: 0

Hi all, any update about ARM tasks for any of the applications? My Pi will arrive in a few days...
Joined: 2 May 07 · Posts: 2228 · Credit: 173,797,371 · RAC: 18,407

Are we on the way to virtualizing the LHC? Three computers each ran for 3 days and completed this sixtracktest successfully ;-)
https://lhcathome.cern.ch/lhcathome/workunit.php?wuid=117930634
Joined: 29 Feb 16 · Posts: 157 · Credit: 2,659,975 · RAC: 0

> Hi all, any update about ARM tasks for any of the applications? My Pi will arrive in a few days...

Hello Guiri-One[Andalucia], sorry for the late reply. We do have both sixtrack and sixtracktest being distributed with ARM executables - we haven't solved all the issues with (no longer so) recent Android distributions. Anyway, I think your Pi should already have received and crunched some tasks...

> Are we on the way to virtualizing the LHC?

Indeed, with 10^7 turns we can simulate ~800s of beam dynamics. That is not the typical time we keep the beam in the LHC (several hours), but we can simulate more than 50% of the energy ramp, for instance. Anyway, those were just a few tasks that ran that long - I am not sure they are efficient, and we are looking into segmenting the simulations, e.g. 10 jobs of 10^6 turns run in sequence instead of one big job with 10^7 turns. For that we have to make use of the checkpoint/restart option, send the necessary files as starting conditions for each segment and collect them at the end, and check that there are still particles to be tracked. We should be able to have this soon.

Thanks a lot for the support, keep up the good work, and happy crunching!
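A quick back-of-the-envelope check of the numbers above, plus a sketch of the segmentation idea. The revolution period follows from the LHC circumference (~26.7 km) with protons essentially at the speed of light; the track() stub and particle dictionaries are illustrative assumptions, not SixTrack's actual interface.

```python
# 10^7 turns at one LHC revolution period per turn gives the simulated time.
LHC_CIRCUMFERENCE_M = 26_659.0       # approximate ring length
SPEED_OF_LIGHT = 299_792_458.0       # m/s; top-energy protons move at ~c
revolution_period = LHC_CIRCUMFERENCE_M / SPEED_OF_LIGHT   # ~8.9e-5 s
simulated_time = 1e7 * revolution_period                   # ~890 s, i.e. O(800 s)

def track(particles, turns):
    """Stub for one tracking segment; returns the surviving particles."""
    return [p for p in particles if p["alive"]]

# 10 checkpointed jobs of 10^6 turns run in sequence, instead of one
# 10^7-turn job: each segment restarts from the previous segment's state.
particles = [{"id": i, "alive": True} for i in range(64)]
for segment in range(10):
    if not particles:          # no particles left to track: stop the chain
        break
    particles = track(particles, 1_000_000)
    # in the real workflow the end-of-segment state is checkpointed and
    # shipped as the starting condition of the next job
```

The early-exit check mirrors the "check that there are still particles to be tracked" step mentioned above: a segment whose predecessor lost all particles never needs to be submitted.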
Joined: 2 May 07 · Posts: 2228 · Credit: 173,797,371 · RAC: 18,407

Hi Alessio, splitting the long runners is a good idea. ATLAS gives us 200 beams, and it takes a lot of CPU time to do this.
Joined: 2 May 07 · Posts: 2228 · Credit: 173,797,371 · RAC: 18,407

I have a computer running only sixtracktest at the moment. The wingman had this error:

exceeded elapsed time limit 13480.39 (1920000000.00G/109981.44G)</message>

https://lhcathome.cern.ch/lhcathome/result.php?resultid=238806725
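For context, that error comes from the BOINC client's runtime check: a task is aborted when its elapsed time exceeds roughly the workunit's floating-point-operation budget divided by the host's estimated speed, which are the two figures printed in parentheses. A sketch with the numbers from the message above; note that the printed limit (13480.39 s) need not exactly equal this quotient, since the client's speed estimate can change over a task's lifetime - treat this as an illustration of the formula, not an exact reproduction.

```python
# The two numbers printed in the wingman's error message, in Giga units:
rsc_fpops_bound_G = 1920000000.00   # total floating-point operation budget
flops_estimate_G = 109981.44        # estimated host speed, Gflops

# BOINC's elapsed-time limit is (roughly) budget / speed:
limit_seconds = rsc_fpops_bound_G / flops_estimate_G
print(f"elapsed time limit ~ {limit_seconds:.0f} s ({limit_seconds / 3600:.1f} h)")
```

If a host's real throughput for a given task is much lower than the estimate (a common issue for long runners), the task can hit this limit well before finishing.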
©2024 CERN