Message boards : LHC@home Science : WU's
Message board moderation

To post messages, you must log in.

Author / Message
The Ancient One

Send message
Joined: 13 Jul 05
Posts: 6
Credit: 6,139
RAC: 0
Message 15573 - Posted: 19 Nov 2006, 13:33:50 UTC

Hi,
Can anyone tell me if they're receiving any WUs? I've not received any since September, when they were doing the migration. When I visit the site's main page it just says 'up, no work units', and even the RSS feed doesn't give any clues as to what's going on.

ID: 15573 · Report as offensive     Reply Quote
Profile Morgan the Gold
Avatar

Send message
Joined: 18 Sep 04
Posts: 38
Credit: 173,867
RAC: 0
Message 15575 - Posted: 19 Nov 2006, 16:54:59 UTC

:) WUs are few and far between. I got a few last week and a few a month ago.

Across a number of PCs, that is.
ID: 15575 · Report as offensive     Reply Quote
River~~

Send message
Joined: 13 Jul 05
Posts: 456
Credit: 75,142
RAC: 0
Message 15579 - Posted: 19 Nov 2006, 17:46:53 UTC

If you have left your machines connecting regularly, you should have got some WUs twice this month. There was work for around 8 hrs on Nov 2nd and again for around 8 hrs on the 18th. We normally expect that if work is available for 4 hrs or more, there will be some for everyone who is connecting regularly. See this thread for more info on what might have stopped your machine getting work.

If you wait hoping to see some on the main page and only then reconnect your machines, you are unlikely to be fast enough, unless you spend all your life watching the main page.

You may also be interested in a thread about why this project rarely has work.

River~~
ID: 15579 · Report as offensive     Reply Quote
PovAddict
Avatar

Send message
Joined: 14 Jul 05
Posts: 275
Credit: 49,291
RAC: 0
Message 15583 - Posted: 19 Nov 2006, 20:02:36 UTC - in response to Message 15573.  

Hi,
Can anyone tell me if they're receiving any WUs? I've not received any since September, when they were doing the migration.

The project usually doesn't have work. There are around six forum threads here asking the same thing. Please search before posting.
ID: 15583 · Report as offensive     Reply Quote
Nachtmeister

Send message
Joined: 8 Feb 07
Posts: 4
Credit: 253,123
RAC: 0
Message 16589 - Posted: 21 Mar 2007, 22:39:28 UTC

Hello to all,

When will there be more WUs available? I rejoined the project, but didn't receive any work. Please add some more work. It is boring to join a project but not get any work.

regards

Nachtmeister
ID: 16589 · Report as offensive     Reply Quote
Ariel
Avatar

Send message
Joined: 7 Mar 07
Posts: 59
Credit: 7,906
RAC: 0
Message 16590 - Posted: 22 Mar 2007, 1:55:54 UTC
Last modified: 22 Mar 2007, 1:57:08 UTC

Oh, oh, oh!

It's a unique issue in the LHC forums!!

Whoopee!!!

Nachtmeister: "Please add some more work. It is boring to join a project but not get any work."
--Oh, yeah. I feel for you, buddy. Love & Peace. Ciao. Auf Wiedersehen.





Ariel: Certified "Too Cute for LHC" Cruncher!


. . . . . . . . . . . . -- Consider the lilies.
ID: 16590 · Report as offensive     Reply Quote
Proxima

Send message
Joined: 29 Aug 05
Posts: 1
Credit: 6,721
RAC: 0
Message 16810 - Posted: 3 May 2007, 21:41:25 UTC

I have some trouble with English, but perhaps someone could post whether and when work/WUs can be expected again. I have always enjoyed crunching LHC and would crunch it again immediately.
ID: 16810 · Report as offensive     Reply Quote
Collin

Send message
Joined: 8 May 07
Posts: 1
Credit: 0
RAC: 0
Message 16859 - Posted: 11 May 2007, 0:53:40 UTC - in response to Message 16590.  

Oh, oh, oh!

It's a unique issue in the LHC forums!!

Whoopee!!!

Nachtmeister: "Please add some more work. It is boring to join a project but not get any work."
--Oh, yeah. I feel for you, buddy. Love & Peace. Ciao. Auf Wiedersehen.




Was that sarcasm? lol

I did a search on this subject and found that there may eventually be some WUs for those of us willing to share our PC resources. I'll hang around for a while. As it stands right now, there are more than enough contributors to this project.

Thanks to River~~ for linking to some helpful, clarifying threads.
ID: 16859 · Report as offensive     Reply Quote
The Ancient One

Send message
Joined: 13 Jul 05
Posts: 6
Credit: 6,139
RAC: 0
Message 17341 - Posted: 13 Jul 2007, 2:56:47 UTC - in response to Message 15573.  

Hi,
Can anyone tell me if they're receiving any WUs? I've not received any since September, when they were doing the migration. When I visit the site's main page it just says 'up, no work units', and even the RSS feed doesn't give any clues as to what's going on.


Still not receiving WUs; see below:

13/07/2007 03:09:30|lhcathome|Sending scheduler request: To fetch work
13/07/2007 03:09:30|lhcathome|Requesting 1029 seconds of new work
13/07/2007 03:09:35|lhcathome|Scheduler RPC succeeded [server version 505]
13/07/2007 03:09:35|lhcathome|Deferring communication for 7 sec
13/07/2007 03:09:35|lhcathome|Reason: requested by project
13/07/2007 03:09:35|lhcathome|Deferring communication for 2 hr 11 min 32 sec
13/07/2007 03:09:35|lhcathome|Reason: no work from project
13/07/2007 03:50:29|lhcathome|Sending scheduler request: Requested by user
13/07/2007 03:50:29|lhcathome|Requesting 1029 seconds of new work
13/07/2007 03:50:34|lhcathome|Scheduler RPC succeeded [server version 505]
13/07/2007 03:50:34|lhcathome|Deferring communication for 7 sec
13/07/2007 03:50:34|lhcathome|Reason: requested by project
13/07/2007 03:50:34|lhcathome|Deferring communication for 2 hr 24 min 22 sec
13/07/2007 03:50:34|lhcathome|Reason: no work from project

I re-attached some weeks ago to ensure I have the correct URL, but still no work.
ID: 17341 · Report as offensive     Reply Quote
gmclutario

Send message
Joined: 11 Jul 07
Posts: 1
Credit: 4,039
RAC: 0
Message 17354 - Posted: 14 Jul 2007, 8:23:50 UTC

Me too... I couldn't get a bit of that chunk of WUs. For three days already I was thinking there was something wrong with my account...

I just hope we get some WUs soon.
ID: 17354 · Report as offensive     Reply Quote
Aries

Send message
Joined: 11 Jul 07
Posts: 3
Credit: 55,348
RAC: 0
Message 17365 - Posted: 15 Jul 2007, 14:24:30 UTC

I think I am a lucky person :-D

Have a look here:
15/07/2007 8:46:56 PM|lhcathome|Sending scheduler request to http://lhcathome.cern.ch/lhcathome_cgi/cgi
15/07/2007 8:46:56 PM|lhcathome|Reason: To fetch work
15/07/2007 8:46:56 PM|lhcathome|Requesting 864000 seconds of new work
15/07/2007 8:47:01 PM|lhcathome|Scheduler request to http://lhcathome.cern.ch/lhcathome_cgi/cgi succeeded
15/07/2007 8:47:01 PM|lhcathome|No work from project
.
.
.
And then... this all happened :-)

.
.
.
15/07/2007 10:58:39 PM|lhcathome|Sending scheduler request to http://lhcathome.cern.ch/lhcathome_cgi/cgi
15/07/2007 10:58:39 PM|lhcathome|Reason: To fetch work
15/07/2007 10:58:39 PM|lhcathome|Requesting 864000 seconds of new work
15/07/2007 10:58:44 PM|lhcathome|Scheduler request to http://lhcathome.cern.ch/lhcathome_cgi/cgi succeeded
15/07/2007 10:58:44 PM|lhcathome|No work from project
15/07/2007 10:59:44 PM|lhcathome|Fetching master file
15/07/2007 10:59:49 PM|lhcathome|Master file download succeeded
15/07/2007 10:59:54 PM|lhcathome|Sending scheduler request to http://lhcathome.cern.ch/lhcathome_cgi/cgi
15/07/2007 10:59:54 PM|lhcathome|Reason: To fetch work
15/07/2007 10:59:54 PM|lhcathome|Requesting 864000 seconds of new work
15/07/2007 10:59:59 PM|lhcathome|Scheduler request to http://lhcathome.cern.ch/lhcathome_cgi/cgi succeeded
15/07/2007 10:59:59 PM|lhcathome|No work from project
15/07/2007 11:00:59 PM|lhcathome|Sending scheduler request to http://lhcathome.cern.ch/lhcathome_cgi/cgi
15/07/2007 11:00:59 PM|lhcathome|Reason: To fetch work
15/07/2007 11:00:59 PM|lhcathome|Requesting 864000 seconds of new work
15/07/2007 11:01:04 PM|lhcathome|Scheduler request to http://lhcathome.cern.ch/lhcathome_cgi/cgi succeeded
15/07/2007 11:01:04 PM|lhcathome|No work from project
15/07/2007 11:02:04 PM|lhcathome|Sending scheduler request to http://lhcathome.cern.ch/lhcathome_cgi/cgi
15/07/2007 11:02:04 PM|lhcathome|Reason: To fetch work
15/07/2007 11:02:04 PM|lhcathome|Requesting 864000 seconds of new work
15/07/2007 11:02:09 PM|lhcathome|Scheduler request to http://lhcathome.cern.ch/lhcathome_cgi/cgi succeeded
15/07/2007 11:02:11 PM|lhcathome|Started download of sixtrack_4.67_windows_intelx86.exe
15/07/2007 11:02:11 PM|lhcathome|Started download of bottomOverlay_1.02_.tga
15/07/2007 11:02:16 PM|lhcathome|Finished download of bottomOverlay_1.02_.tga
15/07/2007 11:02:16 PM|lhcathome|Throughput 18829 bytes/sec
15/07/2007 11:02:16 PM|lhcathome|Started download of courier_16_bold_1.01_.tga
15/07/2007 11:02:18 PM|lhcathome|Finished download of courier_16_bold_1.01_.tga
15/07/2007 11:02:18 PM|lhcathome|Throughput 7447 bytes/sec
15/07/2007 11:02:18 PM|lhcathome|Started download of courier_16_regular_1.01_.tga
15/07/2007 11:02:19 PM|lhcathome|Sending scheduler request to http://lhcathome.cern.ch/lhcathome_cgi/cgi
15/07/2007 11:02:19 PM|lhcathome|Reason: To fetch work
15/07/2007 11:02:19 PM|lhcathome|Requesting 439926 seconds of new work
15/07/2007 11:02:24 PM|lhcathome|Finished download of courier_16_regular_1.01_.tga
15/07/2007 11:02:24 PM|lhcathome|Throughput 3935 bytes/sec
15/07/2007 11:02:24 PM|lhcathome|Started download of courier_24_bold_1.01_.tga
15/07/2007 11:02:24 PM|lhcathome|Scheduler request to http://lhcathome.cern.ch/lhcathome_cgi/cgi succeeded
15/07/2007 11:02:24 PM|lhcathome|No work from project
15/07/2007 11:02:29 PM|lhcathome|Finished download of courier_24_bold_1.01_.tga
15/07/2007 11:02:29 PM|lhcathome|Throughput 13453 bytes/sec
15/07/2007 11:02:29 PM|lhcathome|Started download of helpOverlay_1.01_.tga
15/07/2007 11:02:58 PM|lhcathome|Finished download of helpOverlay_1.01_.tga
15/07/2007 11:02:58 PM|lhcathome|Throughput 28303 bytes/sec
15/07/2007 11:02:58 PM|lhcathome|Started download of logo_back_1.01_.tga
15/07/2007 11:03:05 PM|lhcathome|Finished download of logo_back_1.01_.tga
15/07/2007 11:03:05 PM|lhcathome|Throughput 35845 bytes/sec
15/07/2007 11:03:05 PM|lhcathome|Started download of logo_flare_1.01_.tga
15/07/2007 11:03:09 PM|lhcathome|Finished download of logo_flare_1.01_.tga
15/07/2007 11:03:09 PM|lhcathome|Throughput 20356 bytes/sec
15/07/2007 11:03:09 PM|lhcathome|Started download of logo_machine_1.01_.tga
15/07/2007 11:03:13 PM|lhcathome|Finished download of logo_machine_1.01_.tga
15/07/2007 11:03:13 PM|lhcathome|Throughput 20198 bytes/sec
15/07/2007 11:03:13 PM|lhcathome|Started download of logo_machine_light_1.01_.tga
15/07/2007 11:03:17 PM|lhcathome|Finished download of logo_machine_light_1.01_.tga
15/07/2007 11:03:17 PM|lhcathome|Throughput 20231 bytes/sec
15/07/2007 11:03:17 PM|lhcathome|Started download of logo_text_glow_1.01_.tga
15/07/2007 11:03:21 PM|lhcathome|Finished download of sixtrack_4.67_windows_intelx86.exe
15/07/2007 11:03:21 PM|lhcathome|Throughput 39907 bytes/sec
15/07/2007 11:03:21 PM|lhcathome|Started download of logo_text_left_1.01_.tga
15/07/2007 11:03:23 PM|lhcathome|Finished download of logo_text_glow_1.01_.tga
15/07/2007 11:03:23 PM|lhcathome|Throughput 35790 bytes/sec
15/07/2007 11:03:23 PM|lhcathome|Started download of logo_text_right_1.01_.tga
15/07/2007 11:03:25 PM|lhcathome|Sending scheduler request to http://lhcathome.cern.ch/lhcathome_cgi/cgi
15/07/2007 11:03:25 PM|lhcathome|Reason: To fetch work
15/07/2007 11:03:25 PM|lhcathome|Requesting 439941 seconds of new work
15/07/2007 11:03:26 PM|lhcathome|Finished download of logo_text_left_1.01_.tga
15/07/2007 11:03:26 PM|lhcathome|Throughput 25365 bytes/sec
15/07/2007 11:03:26 PM|lhcathome|Started download of logo_trail_1.01_.tga
15/07/2007 11:03:29 PM|lhcathome|Finished download of logo_text_right_1.01_.tga
15/07/2007 11:03:29 PM|lhcathome|Throughput 38089 bytes/sec
15/07/2007 11:03:29 PM|lhcathome|Started download of progress_1.03_.tga
15/07/2007 11:03:30 PM|lhcathome|Scheduler request to http://lhcathome.cern.ch/lhcathome_cgi/cgi succeeded
15/07/2007 11:03:30 PM|lhcathome|No work from project
15/07/2007 11:03:32 PM|lhcathome|Finished download of logo_trail_1.01_.tga
15/07/2007 11:03:32 PM|lhcathome|Throughput 36159 bytes/sec
15/07/2007 11:03:32 PM|lhcathome|Started download of progress_bg_1.03_.tga
15/07/2007 11:03:38 PM|lhcathome|Finished download of progress_1.03_.tga
15/07/2007 11:03:38 PM|lhcathome|Throughput 52135 bytes/sec
15/07/2007 11:03:38 PM|lhcathome|Started download of proton_1.06_.tga
15/07/2007 11:03:40 PM|lhcathome|Finished download of progress_bg_1.03_.tga
15/07/2007 11:03:40 PM|lhcathome|Throughput 43077 bytes/sec
15/07/2007 11:03:40 PM|lhcathome|Started download of topGradient_1.01_.tga
15/07/2007 11:03:41 PM|lhcathome|Finished download of proton_1.06_.tga
15/07/2007 11:03:41 PM|lhcathome|Throughput 10940 bytes/sec
15/07/2007 11:03:41 PM|lhcathome|Started download of topOverlay_1.02_.tga
15/07/2007 11:03:45 PM|lhcathome|Finished download of topGradient_1.01_.tga
15/07/2007 11:03:45 PM|lhcathome|Throughput 27555 bytes/sec
15/07/2007 11:03:45 PM|lhcathome|Started download of tubeInside_1.09_.tga
15/07/2007 11:03:51 PM|lhcathome|Finished download of tubeInside_1.09_.tga
15/07/2007 11:03:51 PM|lhcathome|Throughput 35893 bytes/sec
15/07/2007 11:03:51 PM|lhcathome|Started download of wtest_btest__1__s__64.24_59.25__4_6__5__45_1_sixvf_boinc10030.zip
15/07/2007 11:03:54 PM|lhcathome|Finished download of wtest_btest__1__s__64.24_59.25__4_6__5__45_1_sixvf_boinc10030.zip
15/07/2007 11:03:54 PM|lhcathome|Throughput 10767 bytes/sec
15/07/2007 11:03:54 PM|lhcathome|Started download of wtest_btest__1__s__64.24_59.25__4_6__5__60_1_sixvf_boinc10031.zip
15/07/2007 11:03:55 PM|lhcathome|Finished download of topOverlay_1.02_.tga
15/07/2007 11:03:55 PM|lhcathome|Throughput 59832 bytes/sec
15/07/2007 11:03:55 PM|lhcathome|Started download of wtest_btest__2__s__64.21_59.22__4_6__5__30_1_sixvf_boinc10032.zip
15/07/2007 11:03:56 PM||request_reschedule_cpus: files downloaded
15/07/2007 11:03:57 PM|lhcathome|Finished download of wtest_btest__1__s__64.24_59.25__4_6__5__60_1_sixvf_boinc10031.zip
15/07/2007 11:03:57 PM|lhcathome|Throughput 10940 bytes/sec
15/07/2007 11:03:57 PM|lhcathome|Finished download of wtest_btest__2__s__64.21_59.22__4_6__5__30_1_sixvf_boinc10032.zip
15/07/2007 11:03:57 PM|lhcathome|Throughput 10961 bytes/sec
15/07/2007 11:03:57 PM|lhcathome|Started download of wtest_btest__1__s__64.2_59.21__4_6__2__30_1_sixvf_boinc10019.zip
15/07/2007 11:03:57 PM|lhcathome|Started download of wtest_btest__2__s__64.21_59.22__4_6__5__45_1_sixvf_boinc10033.zip
15/07/2007 11:03:58 PM||request_reschedule_cpus: files downloaded
15/07/2007 11:03:58 PM||request_reschedule_cpus: files downloaded
15/07/2007 11:04:00 PM|lhcathome|Finished download of wtest_btest__2__s__64.21_59.22__4_6__5__45_1_sixvf_boinc10033.zip
15/07/2007 11:04:00 PM|lhcathome|Throughput 11009 bytes/sec
15/07/2007 11:04:01 PM||request_reschedule_cpus: files downloaded
15/07/2007 11:04:01 PM|lhcathome|Finished download of wtest_btest__1__s__64.2_59.21__4_6__2__30_1_sixvf_boinc10019.zip
15/07/2007 11:04:01 PM|lhcathome|Throughput 10757 bytes/sec
15/07/2007 11:04:02 PM||request_reschedule_cpus: files downloaded
15/07/2007 11:04:31 PM|lhcathome|Sending scheduler request to http://lhcathome.cern.ch/lhcathome_cgi/cgi
15/07/2007 11:04:31 PM|lhcathome|Reason: To fetch work
15/07/2007 11:04:31 PM|lhcathome|Requesting 440606 seconds of new work
15/07/2007 11:04:37 PM|lhcathome|Scheduler request to http://lhcathome.cern.ch/lhcathome_cgi/cgi succeeded
15/07/2007 11:04:37 PM|lhcathome|No work from project
15/07/2007 11:05:37 PM|lhcathome|Sending scheduler request to http://lhcathome.cern.ch/lhcathome_cgi/cgi
15/07/2007 11:05:37 PM|lhcathome|Reason: To fetch work
15/07/2007 11:05:37 PM|lhcathome|Requesting 440619 seconds of new work
15/07/2007 11:05:42 PM|lhcathome|Scheduler request to http://lhcathome.cern.ch/lhcathome_cgi/cgi succeeded
15/07/2007 11:05:42 PM|lhcathome|No work from project

Oh well, that is good enough for me :-) At least there was something for me to receive. I hope others got some too.
ID: 17365 · Report as offensive     Reply Quote
Profile Gary Roberts

Send message
Joined: 22 Jul 05
Posts: 72
Credit: 3,962,626
RAC: 0
Message 17367 - Posted: 16 Jul 2007, 3:21:17 UTC


15/07/2007 11:02:04 PM|lhcathome|Requesting 864000 seconds of new work


Yes you were lucky to be in the right place at just the right time :).

This project rarely has work and when there is work, the deadline is usually 7 days. It's not really a good idea to set your cache to 10 days (864000 seconds) because of the possibility that you could get work that you can't even start in time, let alone return. Even if you could do it in time, much of it would be "wasted effort" because the quorum would have already been completed on many work units well before you would be able to start.

It's much more "project friendly" to set a reasonable cache size.
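Gary's arithmetic above can be checked with a quick sketch. This is not BOINC code, just an illustration of the conversion he quotes: the client's "connect about every X days" setting, multiplied out to seconds, is what shows up in the "Requesting ... seconds of new work" log line, and a cache longer than the deadline means some work could never start in time. The figures (10-day cache, 7-day deadline) are the ones from the posts above.

```python
# Hypothetical sketch of the cache-vs-deadline arithmetic discussed above.
# Not BOINC source code; just the numbers from the thread.

SECONDS_PER_DAY = 24 * 60 * 60  # 86400

def cache_seconds(cache_days: float) -> int:
    """Seconds of work the client asks for, given a cache setting in days."""
    return int(cache_days * SECONDS_PER_DAY)

def cache_fits_deadline(cache_days: float, deadline_days: float) -> bool:
    """False when the cache is longer than the deadline: the tail of the
    cache would miss the deadline before it even starts."""
    return cache_days <= deadline_days

print(cache_seconds(10))             # 864000, matching the log line quoted above
print(cache_fits_deadline(10, 7))    # False: a 10-day cache vs a 7-day deadline
```

So a request for 864000 seconds is exactly a 10-day cache, and against the usual 7-day deadline that leaves up to three days of fetched work that cannot be returned on time.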


Cheers,
Gary.
ID: 17367 · Report as offensive     Reply Quote
Aries

Send message
Joined: 11 Jul 07
Posts: 3
Credit: 55,348
RAC: 0
Message 17372 - Posted: 16 Jul 2007, 13:47:15 UTC - in response to Message 17367.  


15/07/2007 11:02:04 PM|lhcathome|Requesting 864000 seconds of new work


Yes you were lucky to be in the right place at just the right time :).

This project rarely has work and when there is work, the deadline is usually 7 days. It's not really a good idea to set your cache to 10 days (864000 seconds) because of the possibility that you could get work that you can't even start in time, let alone return. Even if you could do it in time, much of it would be "wasted effort" because the quorum would have already been completed on many work units well before you would be able to start.

It's much more "project friendly" to set a reasonable cache size.



Thank you for the heads-up :-) I'm on my last one now. The other projects I participate in normally run in batches, or don't always have WUs, so as soon as I see one that has some WUs I suspend another and run it manually :-)

So basically, I run two WUs concurrently, suspend one job, and get the smaller WU out ASAP ;-) Works great.
Besides LHC, I have proteins@home, boincsimap and a few others that always have WUs, so I won't be running on empty, e.g. Einstein@home and SETI@home (+ Beta).
My system as configured is set to "No new work" as soon as units from a project come through. Those are: Einstein, Tanpaku, SETI, SETI Beta and World Community Grid. So as soon as I am running low on WUs I choose one, get some work, and then set it to "No new work" again, and I always have Proteins, boincsimap and LHC to get work from :-) BOINC has some work, so I am happy with the "underdogs". These are all science-related, and that gets me going. Some other projects, e.g. within WCG, I have not signed up for since I do not believe in them :-( (example: AIDS, because I think my work would go into other people's pockets; same as some others, but anyway, that is an unrelated topic :-D)

Anyway, thanks again, and I hope I have answered any questions you might have had about why I do things the way I do.

PS: I used to have my PC on SETI only, with the cache set to the minimum. I found that this wasn't always best, since the project was many times left dry, due to connection issues, server issues, or other things.
ID: 17372 · Report as offensive     Reply Quote



©2024 CERN