Message boards :
ATLAS application :
Real tasks now being submitted
Send message Joined: 13 May 14 Posts: 387 Credit: 15,314,184 RAC: 0 |
We are now submitting real ATLAS tasks instead of the same test task over and over again. These are from the same pool of tasks that are sent to the ATLAS@Home project and the results will be really used! There will be a constant stream of tasks, which I will increase gradually. At some point we will ask people to move from ATLAS@Home to here and stop sending new tasks to ATLAS@Home. Happy crunching! |
Send message Joined: 9 Dec 14 Posts: 202 Credit: 2,533,875 RAC: 0 |
This is great news! Unfortunately I was not able to get any tasks in the last couple of days. Have you already increased the stream of tasks? If not, when are you planning to do so? For me it's nearly impossible to get any ATLAS tasks here, and the server status page shows 0 unsent tasks pretty much 24/7 for ATLAS. |
Send message Joined: 13 May 14 Posts: 387 Credit: 15,314,184 RAC: 0 |
I've increased the rate of tasks. Unfortunately task submission broke over the weekend, which is why nothing was available, but it's fixed now. |
Send message Joined: 17 Sep 04 Posts: 99 Credit: 30,693,528 RAC: 5,378 |
It looks like no tasks are available at the moment. Thanks. Regards, Bob P. |
Send message Joined: 2 Sep 04 Posts: 453 Credit: 193,569,815 RAC: 13,099 |
What kind of limit on WUs has been established? Hm, I switched Max_Number_of_Jobs from No-Limit to 10 and now I got 10 WUs. Supporting BOINC, a great concept! |
Send message Joined: 13 May 14 Posts: 387 Credit: 15,314,184 RAC: 0 |
I think we might be affected by the huge sixtrack queue again. The default number of jobs to download if you have no limit is 1 per CPU. EDIT: I see from the other thread you have <project_max_concurrent>2</project_max_concurrent> which I suppose gives you 2 WU if you have no limit on the project preferences. |
Send message Joined: 2 Sep 04 Posts: 453 Credit: 193,569,815 RAC: 13,099 |
I think we might be affected by the huge sixtrack queue again. The default number of jobs to download if you have no limit is 1 per CPU. So far everything is working fine since I changed my settings as described below. At the moment I'm switching my clients over to LHC, and they all got work immediately, with no delay. All seems to be fine (knock on wood, knock on wood, knock on wood). Supporting BOINC, a great concept! |
Send message Joined: 14 Jan 10 Posts: 1279 Credit: 8,484,048 RAC: 1,651 |
EDIT: I see from the other thread you have <project_max_concurrent>2</project_max_concurrent> This setting is only client-side; the server is not aware of it. So you could set Max # Jobs to 4 in your project preferences, get 4, and run a maximum of 2 locally with 2 "Ready to start". With the 200 MB ATLAS download file, a new task can start immediately after one task has finished. |
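[Editor's note: the client-side cap discussed above can be sketched as a minimal app_config.xml. The file location and the value 2 are assumptions based on the posts here; the exact project directory name may differ on your installation.]

```xml
<!-- app_config.xml: client-side only; the server does not see this cap.
     Place it in the BOINC project directory for LHC@home
     (path is an assumption and may vary per installation). -->
<app_config>
    <!-- Run at most 2 tasks from this project concurrently -->
    <project_max_concurrent>2</project_max_concurrent>
</app_config>
```

After saving, have the client reread its configuration (e.g. via Options > Read config files in the BOINC Manager's advanced view, or by restarting the client).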
Send message Joined: 17 Sep 04 Posts: 99 Credit: 30,693,528 RAC: 5,378 |
...The default number of jobs to download if you have no limit is 1 per CPU. I am only getting work for my physical CPUs, not the hyperthreaded CPUs, so I am only able to run half the number of jobs that I did on ATLAS@Home. Is there a way to correct this so I can run all of my cores? Thank you. Regards, Bob P. |
Send message Joined: 2 Sep 04 Posts: 453 Credit: 193,569,815 RAC: 13,099 |
I am only getting work for my physical CPUs, not the hyperthreaded CPUs, so I am only able to run half the number of jobs that I did on ATLAS@Home. Is there a way to correct this so I can run all of my cores? Play a little bit with Max_Number_of_Jobs: I switched Max_Number_of_Jobs from No-Limit to 10 and now I got 10 WUs. Supporting BOINC, a great concept! |
Send message Joined: 17 Sep 04 Posts: 99 Credit: 30,693,528 RAC: 5,378 |
I am only getting work for my physical CPUs, not the hyperthreaded CPUs, so I am only able to run half the number of jobs that I did on ATLAS@Home. Is there a way to correct this so I can run all of my cores? Yeti, where do I find that setting? Thanks! Regards, Bob P. |
Send message Joined: 2 Sep 04 Posts: 453 Credit: 193,569,815 RAC: 13,099 |
Yeti, where do I find that setting? Thanks! Go to https://lhcathome.cern.ch/lhcathome, open "Your Account", then "LHC@Home settings", choose the right venue (Home/Office/School or None) for your machine, and change the setting there. Supporting BOINC, a great concept! |
Send message Joined: 17 Sep 04 Posts: 99 Credit: 30,693,528 RAC: 5,378 |
Thanks! Regards, Bob P. |
Send message Joined: 17 Sep 04 Posts: 99 Credit: 30,693,528 RAC: 5,378 |
Almost out of work units! Thanks. Regards, Bob P. |
Send message Joined: 19 Feb 08 Posts: 708 Credit: 4,336,250 RAC: 0 |
LHC ATLAS is running dual-core on my oldest Linux box. Let's see if it can complete and validate like it does on my newer Windows 10 PC, where all other LHC tasks except SixTrack fail miserably. Tullio |
©2024 CERN