Getting NO Sixtrack WUs even though my prefs ARE set to accept them & test WUs, AND server status keeps saying there are THOUSANDS of Sixtrack WUs to send??
Stick · Joined: 21 Aug 07 · Posts: 46 · Credit: 1,503,835 · RAC: 18
I have 4 hosts and have the same problem on these 2: Computer 9631414 and Computer 10342612. But these 2 get Sixtrack jobs OK: Computer 9631419 and Computer 9926211. All 4 use the same LHC@home preferences.
Luigi · Joined: 7 Feb 14 · Posts: 99 · Credit: 5,180,005 · RAC: 0
Hello Life ... oPEA, in my opinion this is a common problem with the Sixtrack project. A few hundred available tasks are not much to feed all LHC hosts (yours included). The server status page is not as reliable as you think, because what you see is an hourly update; that handful of tasks is sent out in a couple of minutes. So one believes there are a lot of tasks, but most of the time there are 0 tasks ready to send. I noticed an interesting behaviour some time ago: if you set a 10/10-day work buffer and your host manages to get a couple of hundred tasks, it will keep getting new work, because it requests new tasks every time it contacts the server to report finished ones. So the probability of getting new work stays good.
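For anyone who wants to try the 10/10-day buffer Luigi describes: on a stock BOINC client those two numbers correspond to work_buf_min_days and work_buf_additional_days, set either in the Manager's computing preferences or in global_prefs_override.xml. A minimal sketch, assuming a Linux install with its data directory at /var/lib/boinc-client (that path is an assumption and varies by platform):

```sh
# Sketch only: the path and values are assumptions; adjust for your install.
cat > /var/lib/boinc-client/global_prefs_override.xml <<'EOF'
<global_preferences>
    <!-- "Store at least X days of work" -->
    <work_buf_min_days>10.0</work_buf_min_days>
    <!-- "Store up to an additional X days of work" -->
    <work_buf_additional_days>10.0</work_buf_additional_days>
</global_preferences>
EOF

# Tell the running client to re-read the override file:
boinccmd --read_global_prefs_override
```

With a buffer that size, every report of a finished task doubles as a request for more, which is what keeps a lucky host topped up.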
Stick · Joined: 21 Aug 07 · Posts: 46 · Credit: 1,503,835 · RAC: 18
Luigi, I've experienced the long-standing problem you described (when the server status is not up to date and there was usually a shortage of available Sixtrack work). But this is a new problem which (I believe) started around the time of the LHC@home consolidation. As I stated in my post, I have 4 hosts, and now only 2 of them will download Sixtrack work when it is available. The other 2 get the message from the server that no work is available for Sixtrack. In fact, I have manually updated all 4 computers (nearly simultaneously), got a bunch of Sixtrack work for the 2 which will download work while, at the same time, getting the no-work-available message on the other 2. I think there's a bug in the software somewhere.

Stick
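As an aside, the kind of simultaneous manual update Stick describes is easy to reproduce from a shell, assuming the boinccmd tool that ships with the BOINC client; this is the same thing the Manager's Update button does:

```sh
# Trigger a scheduler request for the project, as the Update button does:
boinccmd --project https://lhcathome.cern.ch/lhcathome/ update

# The server's reply ("No tasks sent", VBox warnings, etc.) appears in the
# event log, which can be read with:
boinccmd --get_messages
```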
Stick · Joined: 21 Aug 07 · Posts: 46 · Credit: 1,503,835 · RAC: 18
> I have 4 hosts and have the same problem on these 2: Computer 9631414 and Computer 10342612. But these 2 get Sixtrack jobs OK: Computer 9631419 and Computer 9926211. All 4 use the same LHC@home preferences.

I need to amend the above: Computer 10342612 did receive and process 2 Sixtrack units on 1/14/17. However, Computer 9631414 continues to receive "No Sixtrack work available" messages even when Server Status shows that work is available and Computer 9631419 receives WUs when it requests work simultaneously.
Ray · Joined: 29 Sep 04 · Posts: 281 · Credit: 11,859,285 · RAC: 0
Sixtrack work is always sporadic, usually coming in small batches. If one of your hosts is lucky enough to get, say, 5 tasks, then by definition there are 5 fewer tasks available for your other hosts. With 12,000 active hosts, all the work can be snapped up almost as soon as it is made available.
Tullio · Joined: 19 Feb 08 · Posts: 708 · Credit: 4,336,250 · RAC: 0
I got 20 SixTrack tasks on my 64-bit Linux box running SuSE Leap 42.2.

Tullio
Stick · Joined: 21 Aug 07 · Posts: 46 · Credit: 1,503,835 · RAC: 18
> Sixtrack work is always sporadic, usually coming in small batches. If one of your hosts is lucky enough to get, say, 5 tasks, then by definition there are 5 fewer tasks available for your other hosts. With 12,000 active hosts, all the work can be snapped up almost as soon as it is made available.

Ray, I am very experienced with the difficulties of getting Sixtrack work and I am convinced that this is something different. As I said in Message 28487, I believe it's a new problem related to the LHC consolidation. Computer 9631414 (which is only capable of doing Sixtrack) hasn't been able to get any work since the consolidation. Computer 9631419 is also only capable of doing Sixtrack and it gets work frequently. Today, when no Sixtrack work was available at all, I noticed something which may be a clue. As I do frequently, I manually updated Computer 9631414 and Computer 9631419 simultaneously. Since no Sixtrack work was available, both returned messages listing all the projects that did not send work. Computer 9631419 also returned a message (in red) saying that no VBox was installed. Computer 9631414 also does not have VBox installed, but it did not receive the red no-VBox message. I should note that I hadn't noticed this difference before now because the last 8 to 10 times I did the simultaneous manual update, Sixtrack work was available and Computer 9631419 received some but Computer 9631414 did not.

Stick
Joined: 28 Sep 04 · Posts: 675 · Credit: 43,629,530 · RAC: 15,968
Have you checked that both hosts are connected to the new project URL https://lhcathome.cern.ch/lhcathome/ ? |
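One quick way to check, assuming boinccmd is available on each host, is to list the attached projects and compare their master URLs:

```sh
# List attached projects; the "master URL" line is the one to compare.
boinccmd --get_project_status | grep -iE 'name:|master URL:'
```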
Stick · Joined: 21 Aug 07 · Posts: 46 · Credit: 1,503,835 · RAC: 18
Yes, I changed to the new URL several weeks ago, a few days after it was announced. Surprisingly, Computer 9631414 started receiving Sixtrack work on 1/18/17, the day after my last post (in which I noted it was not receiving the red no-VBox messages). And a few hours before it started receiving work, I did a manual update and, for the first time, the server returned a "no VBox available" message. This makes me think somebody must have done something on the LHC end, because I didn't do anything on my end.
Eric · Joined: 12 Jul 11 · Posts: 857 · Credit: 1,619,050 · RAC: 0
Well, glad you got some work (at last). We are currently running a challenge, so there should be plenty of work. I do not manage the server, but I shall contact my colleagues and try to find out what is going on.

Eric.
Stick · Joined: 21 Aug 07 · Posts: 46 · Credit: 1,503,835 · RAC: 18
Well, it's time for me to eat some humble pie. Apparently, when I changed over to the new URL, I only got it right on 3 out of 4 computers. I fixed Computer 9631414 today. Makes me think that somebody must have fixed something on the old URL server a couple of days ago. Sorry for causing a wild goose chase. |
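For anyone else who part-missed the migration: one way to point a host at the new URL, sketched with boinccmd (the old URL and account key below are placeholders, and note that detaching abandons any tasks in progress for that project):

```sh
# Placeholders: <old_project_url> is whatever URL the host is still attached
# to; <account_key> comes from your account page on the new site.
boinccmd --project <old_project_url> detach
boinccmd --project_attach https://lhcathome.cern.ch/lhcathome/ <account_key>
```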
Eric · Joined: 12 Jul 11 · Posts: 857 · Credit: 1,619,050 · RAC: 0
No need to apologise, there are certainly some problems with getting work. Eric. |