Message boards : Number crunching : Last wu crunched - kind of a sad feeling
B-Roy

Joined: 1 Sep 04
Posts: 55
Credit: 21,297
RAC: 9
Message 4958 - Posted: 6 Nov 2004, 17:40:32 UTC

No more WUs in my queue, but I am eagerly waiting for the comeback. Should we stay connected (client trying to connect), or should we detach and reattach as soon as you are online again?

Finally, I think the whole team has done a tremendous job over the last few weeks. Well done!

As a footnote, I want to add my wish for a more representative screensaver and a server and project status page.
ID: 4958
Angstrom

Joined: 2 Sep 04
Posts: 23
Credit: 9,276
RAC: 0
Message 4960 - Posted: 6 Nov 2004, 18:07:25 UTC

Is this now officially the end then?

I know things will lift off again, but are there definitely no more LHC units for now?

I'll move my computers to another project if that's the case... Oh bu$$er, that seems to mean SETI at the moment; guess I'll have to be hypocritical and go crunch some of their units for a while.

Angs
ID: 4960
STE\/E

Joined: 2 Sep 04
Posts: 352
Credit: 1,393,150
RAC: 0
Message 4985 - Posted: 7 Nov 2004, 10:48:13 UTC
Last modified: 7 Nov 2004, 10:48:30 UTC

It looks like it, Angstrom. I haven't been able to download any new WUs for 3 or 4 days now, so they are probably winding down and letting everybody empty their caches.

I still have work to last a couple of days yet on a few computers, but I have run out on some and attached them back to SETI for now...
ID: 4985
ric

Joined: 17 Sep 04
Posts: 190
Credit: 649,637
RAC: 0
Message 4987 - Posted: 7 Nov 2004, 11:17:13 UTC - in response to Message 4985.  
Last modified: 7 Nov 2004, 11:17:32 UTC

Some of my clients could download work this morning. Some tunescan WUs are among them ;-))

The report deadline is set to November 21, 2004. I believe (and hope) this means the project will stay up at least until that date, or a bit beyond if more work is ready to download.

Interesting: many result names with _4, _5, _7 endings are among them; the highest I have seen is _9.

Could this be the "never returned" work, regenerated and put back in the download directory to complete the work cycle(s)?
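
For context: in BOINC a result name is the workunit name plus a trailing _N index, and when earlier copies go unreturned past their deadline the server issues replacements with the next free index, so high suffixes like _7 or _9 do seem consistent with resends of never-returned work. A minimal sketch of reading that index (the result name below is hypothetical):

```python
# BOINC result names end in _N, the result's index within its workunit;
# resends created for lost or late results get the next free index.
def result_index(result_name: str) -> int:
    """Return the trailing _N index of a BOINC result name."""
    return int(result_name.rsplit("_", 1)[1])

# Hypothetical name -- an index of 7 means seven earlier copies
# (_0 through _6) existed, which fits the "regenerated never-returned
# work" theory above.
print(result_index("wu_tunescan_0042_7"))  # prints 7
```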

One client got 19 WUs, but only that one client.

Obviously the final batch before hibernation is being loaded.
ID: 4987
Krunchin-Keith [USA]
Volunteer moderator
Project tester
Volunteer developer
Volunteer tester
Joined: 2 Sep 04
Posts: 209
Credit: 1,482,496
RAC: 0
Message 4999 - Posted: 7 Nov 2004, 14:59:28 UTC
Last modified: 1 Jan 2005, 15:59:16 UTC

ID: 4999

Joined: 25 Oct 04
Posts: 27
Credit: 1,205
RAC: 0
Message 5001 - Posted: 7 Nov 2004, 15:13:20 UTC

Same here, out of work. I've switched back to SETI, as 4.07 seems to have fixed the slowdown problems there. When LHC comes back online in Dec/Jan I will revert back to 100% LHC.

I did find something of a bug in the process: if you set LHC at 200 and SETI at 100, and tell BOINC to use 3 CPUs, I found that if there's no work from LHC it will only download enough from SETI to keep 2 of them busy and leave 1 CPU idle?

I thought that if there's no work from one project it would compensate and use the available capacity on another project.
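
For illustration, the failure mode would look something like this: a client that sizes each project's work request by that project's own resource share, without redistributing the share of a project that has no work, leaves CPUs idle. A hypothetical sketch in Python (not the actual BOINC scheduler code):

```python
# Hypothetical sketch of a share-weighted work fetch -- NOT the real
# BOINC scheduler, just an illustration of the uncompensated-share bug.

def work_requests(shares, has_work, ncpus):
    """Size each project's work request by its resource share alone."""
    total = sum(shares.values())
    return {
        name: (ncpus * share / total if has_work[name] else 0.0)
        for name, share in shares.items()
    }

shares = {"lhc": 200, "seti": 100}        # resource shares from the post
has_work = {"lhc": False, "seti": True}   # LHC has nothing to send

print(work_requests(shares, has_work, ncpus=3))
# {'lhc': 0.0, 'seti': 1.0} -- only SETI's own share of the 3 CPUs gets
# work requested; LHC's idle share is never redistributed, so CPUs sit
# idle instead of crunching extra SETI work.
```

The real client's work fetch is more involved (debt tracking, connect intervals, and so on), which may be why 2 CPUs stayed busy here rather than 1, but the root cause looks the same: the request is sized by share rather than by free CPUs.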

Dave

ID: 5001
sysfried

Joined: 27 Sep 04
Posts: 282
Credit: 1,415,417
RAC: 0
Message 5002 - Posted: 7 Nov 2004, 15:31:35 UTC - in response to Message 5001.  

Finished my last WU this morning... have 900+ credits pending... :-)
ID: 5002
Trane Francks

Joined: 18 Sep 04
Posts: 71
Credit: 28,399
RAC: 0
Message 5003 - Posted: 7 Nov 2004, 16:50:24 UTC - in response to Message 5001.  

> Same here, out of work. I've switched back to SETI, as 4.07 seems to have
> fixed the slowdown problems there. When LHC comes back online in Dec/

You switched back in order to get fewer credits for crunching? That's a wee bit odd. :-D

ID: 5003

Joined: 25 Oct 04
Posts: 27
Credit: 1,205
RAC: 0
Message 5007 - Posted: 7 Nov 2004, 17:19:49 UTC

Fewer credits per work unit, less time per work unit, but more work units processed.

The end result is the same credits for the same processing time, but more work is done; better for the science that way.

So no real loss to credits :)
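
A quick worked example with made-up numbers:

```python
# Made-up numbers: 4.07 pays fewer credits per WU but also takes less
# time per WU, so the credit rate per hour is unchanged.
before = {"credits_per_wu": 30.0, "hours_per_wu": 3.0}
after = {"credits_per_wu": 20.0, "hours_per_wu": 2.0}

for label, v in (("before 4.07", before), ("after 4.07", after)):
    print(label, v["credits_per_wu"] / v["hours_per_wu"], "credits/hour")
# both lines print 10.0 credits/hour -- same credits, more WUs processed
```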

Dave


ID: 5007
Granite T. Rock

Joined: 24 Oct 04
Posts: 6
Credit: 67,366
RAC: 0
Message 5012 - Posted: 8 Nov 2004, 6:52:13 UTC

You might find some WUs trickle out as deadlines expire and results need to be resent. While one of my computers has been out of work for a few days, another of my computers received 3 new units on November 7th.
ID: 5012
STE\/E

Joined: 2 Sep 04
Posts: 352
Credit: 1,393,150
RAC: 0
Message 5015 - Posted: 8 Nov 2004, 9:59:27 UTC - in response to Message 5012.  

> You might find some WUs trickle out as deadlines expire and results need to
> be resent. While one of my computers has been out of work for a few days,
> another of my computers received 3 new units on November 7th.
==========

Yes, I received 1 lone tunescan WU on 1 computer last night & 22 regular WUs & 2 tunescan WUs on another 1, also last night.

It was kinda weird really. I just happened to try & connect to the web page & got through, so I updated that computer & bam, it downloaded 24 WUs. But as soon as I did that it said the project was down again & I couldn't connect to the web page anymore.

I just hit it lucky I guess when I tried. I'm still crunching LHC WUs on 4 PCs, but 1 of them will be out of work in a few hours, 2 will be out by the end of the day, & the other 1 sometime tomorrow unless I can get more downloaded to it.

It's OK though; it's getting to be a real hassle trying to keep LHC work on them, so once they shut down I'm going to re-connect to SETI & crunch over there until LHC comes back up. I'll just leave them set to 50/50, & if any LHC WUs trickle in they'll get crunched that way... :)
ID: 5015
Trane Francks

Joined: 18 Sep 04
Posts: 71
Credit: 28,399
RAC: 0
Message 5017 - Posted: 8 Nov 2004, 10:04:01 UTC - in response to Message 5015.  

> It's OK though; it's getting to be a real hassle trying to keep LHC work on
> them, so once they shut down I'm going to re-connect to SETI & crunch over
> there until LHC comes back up. I'll just leave them set to 50/50, & if any
> LHC WUs trickle in they'll get crunched that way... :)

That's how I've always done it. I figure that the core client knows what it's doing with regard to managing its connections to the various projects. Forcing connections all the time does NOT do the project servers any favours, especially with servers as fragile as the ones currently hosting LHC@home.

ID: 5017
