Message boards : Number crunching : only 10 k WU's left

Previous · 1 · 2 · 3 · 4 · 5 . . . 9 · Next

Jim Baize
Joined: 17 Sep 04
Posts: 103
Credit: 38,543
RAC: 0
Message 8723 - Posted: 20 Jul 2005, 5:31:37 UTC - in response to Message 8722.  
Last modified: 20 Jul 2005, 5:31:56 UTC

Well, now we know where those last 10K WU's went. :P

Jim

> I had my cache set at 10 days, so I should have 3 to 4 days of work left.
> Wooohooo!
>
> Live long and crunch (if you got 'em).
>
> Paul.
>
ID: 8723
Magic Quantum Mechanic
Joined: 24 Oct 04
Posts: 1155
Credit: 52,200,387
RAC: 55,167
Message 8728 - Posted: 20 Jul 2005, 7:28:50 UTC

I knew we didn't need any help doing this.

*Samson Ben Yoseph*
Volunteer Mad Scientist For Life
ID: 8728
littleBouncer
Joined: 23 Oct 04
Posts: 358
Credit: 1,439,205
RAC: 0
Message 8730 - Posted: 20 Jul 2005, 8:06:31 UTC - in response to Message 8722.  

> I had my cache set at 10 days, so I should have 3 to 4 days of work left.
> Wooohooo!
>
> Live long and crunch (if you got 'em).
>
> Paul.
>
So did I....
We'll see who crunches longer. ;-)

greetz littleBouncer

ID: 8730
Vedran Brnjetic
Joined: 16 Jul 05
Posts: 24
Credit: 6,549
RAC: 0
Message 8733 - Posted: 20 Jul 2005, 9:51:32 UTC - in response to Message 8730.  

> > I had my cache set at 10 days, so I should have 3 to 4 days of work left.
> > Wooohooo!
> >
> > Live long and crunch (if you got 'em).
> >
> > Paul.
> >
> So did I....
> We will see, who crunchs longer...-)
>
> greetz littleBouncer
>
>

A bit of competition out here... sign me up.
ID: 8733
Vedran Brnjetic
Joined: 16 Jul 05
Posts: 24
Credit: 6,549
RAC: 0
Message 8739 - Posted: 20 Jul 2005, 13:18:24 UTC

Hey guys, there's more work coming.

Up, 19986 workunits to crunch

as of now
ID: 8739
sysfried
Joined: 27 Sep 04
Posts: 282
Credit: 1,415,417
RAC: 0
Message 8741 - Posted: 20 Jul 2005, 14:08:53 UTC - in response to Message 8739.  

Up, 25049 workunits to crunch
ID: 8741
Paul D. Buck
Joined: 2 Sep 04
Posts: 545
Credit: 148,912
RAC: 0
Message 8743 - Posted: 20 Jul 2005, 15:39:39 UTC

Up to 30,000 and change ...

Yea!!!
ID: 8743
Grenadier
Joined: 2 Sep 04
Posts: 39
Credit: 441,128
RAC: 0
Message 8745 - Posted: 20 Jul 2005, 18:17:25 UTC

88799. Guess we're good for another week or so. :-)
ID: 8745
sysfried
Joined: 27 Sep 04
Posts: 282
Credit: 1,415,417
RAC: 0
Message 8746 - Posted: 20 Jul 2005, 19:11:04 UTC - in response to Message 8745.  

> 88799. Guess we're good for another week or so. :-)
>
Breaking the 100,000 mark now.
ID: 8746
Colin Porter
Joined: 14 Jul 05
Posts: 35
Credit: 71,636
RAC: 0
Message 8747 - Posted: 20 Jul 2005, 21:00:52 UTC - in response to Message 8746.  

> > 88799. Guess we're good for another week or so. :-)
> >
> breaking the 100.000 mark now
>
125K+ now.

I like this project.

It's not the speed, but the quality - Until I get a faster computer.
ID: 8747
ric
Joined: 17 Sep 04
Posts: 190
Credit: 649,637
RAC: 0
Message 8748 - Posted: 20 Jul 2005, 21:13:11 UTC - in response to Message 8747.  
Last modified: 20 Jul 2005, 21:15:06 UTC

> I like this project.

Why?

It's much the same nearly everywhere.

You get work, you crunch it, and you return it.

And the loop never ends.

Just the screensaver and the labeling change. And the processing times.

Do you want to crunch until Christmas?

;)


(I like LHC because it's a "local" play/home run: if there were ever problems, I could print out the results and take a "walk" over to LHC to bring them back myself. So far this project runs really well. The only serious problem worth mentioning (IMHO), at least in the past, was that they suffered from the error of no work.)


(So many messages without any synergy stats? Let's change that.)

ID: 8748
David C Thompson
Joined: 13 Jul 05
Posts: 10
Credit: 992
RAC: 0
Message 8947 - Posted: 26 Jul 2005, 21:01:20 UTC - in response to Message 8747.  

Phew! Good to know that the LHC people are churning out work as fast as we can process it!


> > > 88799. Guess we're good for another week or so. :-)
> > >
> > breaking the 100.000 mark now
> >
> 125K+ now.
>
> I like this project.

-------
David Thompson: Law for Aerospace, Engineering, Biology, and IP
ID: 8947
Vedran Brnjetic
Joined: 16 Jul 05
Posts: 24
Credit: 6,549
RAC: 0
Message 8953 - Posted: 26 Jul 2005, 23:31:20 UTC - in response to Message 8947.  

Server Status

Up, 152508 workunits to crunch

I think it's a new record.
ID: 8953
cjohnston1158
Joined: 22 Jul 05
Posts: 2
Credit: 5,593
RAC: 0
Message 8985 - Posted: 28 Jul 2005, 2:12:39 UTC

7/27/2005 10:09:38 PM|LHC@home|Sending scheduler request to http://lhcathome-sched1.cern.ch/scheduler/cgi
7/27/2005 10:09:38 PM|LHC@home|Requesting 0 seconds of work, returning 0 results
7/27/2005 10:09:39 PM|LHC@home|Scheduler request to http://lhcathome-sched1.cern.ch/scheduler/cgi succeeded


I can't download work units. It's only on a couple of computers that I can't. SETI and Einstein both download fine, but I can't get LHC or Predictor to download. Any help, Paul or anyone else?
ID: 8985
John McLeod VII
Joined: 2 Sep 04
Posts: 165
Credit: 146,925
RAC: 0
Message 8989 - Posted: 28 Jul 2005, 2:49:50 UTC - in response to Message 8985.  

> 7/27/2005 10:09:38 PM|LHC@home|Sending scheduler request to
> http://lhcathome-sched1.cern.ch/scheduler/cgi
> 7/27/2005 10:09:38 PM|LHC@home|Requesting 0 seconds of work, returning 0
> results
> 7/27/2005 10:09:39 PM|LHC@home|Scheduler request to
> http://lhcathome-sched1.cern.ch/scheduler/cgi succeeded
>
>
> I can't download work units.. It is only on a couple computers that I can't.
> SETI and Einstein both download fine, but cant get LHC or Predictor to
> download. Any help Paul or anyone else?
>
Leave it alone for a bit, and LHC should start downloading work again.

Causes of a 0-second work request:

1) An overloaded host - no project will request work.
2) A project that already has enough work, on a host that has enough work.
3) A project that has used more than its share of CPU time.

There is a known bug in 4.45: a critical value is not cleared after every work period.

The only fix is to stop and restart BOINC. If you are desperate, you can open the client_state.xml file and hand-edit the LT debts (while BOINC is halted).
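For anyone tempted by the hand-edit route, the idea can be sketched in a few lines of Python. This is only a sketch: the `long_term_debt` tag name is an assumption based on 4.x-era client_state.xml files, so verify it against your own file, back it up first, and only touch it while BOINC is stopped.

```python
import re

def zero_lt_debt(state_xml: str) -> str:
    # Reset every <long_term_debt> entry to zero. The tag name is an
    # assumption taken from BOINC 4.x client_state.xml files.
    return re.sub(r"<long_term_debt>[^<]*</long_term_debt>",
                  "<long_term_debt>0.000000</long_term_debt>",
                  state_xml)

# Hypothetical fragment of a state file, for illustration only:
sample = ("<project><master_url>http://lhcathome.cern.ch/</master_url>"
          "<long_term_debt>-86400.000000</long_term_debt></project>")
print(zero_lt_debt(sample))
```

Writing a malformed state file back can lose all queued work, so run it against a copy first.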


BOINC WIKI
ID: 8989
Jayargh
Joined: 24 Oct 04
Posts: 79
Credit: 257,762
RAC: 0
Message 8990 - Posted: 28 Jul 2005, 2:54:41 UTC - in response to Message 8989.  

> >
> The only fix is to stop and restart BOINC. If you are desperate, you can open
> the client_state.xml file and hand edit the LT debts (while BOINC is halted).
>
Sorry JMVII, but that is not the only fix. The short of it: for a temporary quick fix, suspend all other workunits and/or projects and LHC will download. Stopping and restarting BOINC does not work. The other part, a longer-term fix, is changing your resource share settings.
ID: 8990
John McLeod VII
Joined: 2 Sep 04
Posts: 165
Credit: 146,925
RAC: 0
Message 9011 - Posted: 28 Jul 2005, 22:58:14 UTC - in response to Message 8990.  

> > >
> > The only fix is to stop and restart BOINC. If you are desperate, you can
> open
> > the client_state.xml file and hand edit the LT debts (while BOINC is
> halted).
> >
> Sorry JMVII but that is not the only fix.... the short of it for a temporary
> quick fix suspend all other workunits and/or projects and LHC will download...
> stopping and starting Boinc does not work...the other part of a longer term
> fix is changing resource settings
>
Stopping and restarting BOINC does clear out the array of values that is the problem, and projects that have no work should start moving in the correct direction again.
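The debt bookkeeping being discussed can be illustrated with a toy model. To be clear, this is not the actual 4.45 scheduler code, just a sketch of the general idea: a project's long-term debt rises when it receives less CPU time than its resource share entitles it to, and the client asks the highest-debt project for work first.

```python
def update_debts(debts, shares, cpu_used, total_cpu):
    # One accounting pass over all attached projects (toy model, not the
    # real BOINC implementation). All dicts are keyed by project name.
    total_share = sum(shares.values())
    new = {}
    for p in debts:
        fair = total_cpu * shares[p] / total_share   # CPU its share entitles it to
        new[p] = debts[p] + fair - cpu_used[p]       # under-served -> debt rises
    # Normalize so debts sum to zero, mirroring the client's periodic reset.
    offset = sum(new.values()) / len(new)
    return {p: d - offset for p, d in new.items()}

debts = {"LHC@home": 0.0, "SETI@home": 0.0}
shares = {"LHC@home": 100, "SETI@home": 100}
# LHC had no work to hand out, so SETI got the whole hour of CPU:
debts = update_debts(debts, shares,
                     {"LHC@home": 0.0, "SETI@home": 3600.0}, 3600.0)
print(debts)
```

In this model a project that supplies no work (like LHC during a dry spell) keeps accruing positive debt, which is why requests should start "moving in the correct direction" once work reappears.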


BOINC WIKI
ID: 9011
Desti
Joined: 16 Jul 05
Posts: 84
Credit: 1,875,851
RAC: 0
Message 9067 - Posted: 31 Jul 2005, 21:35:25 UTC - in response to Message 8953.  
Last modified: 31 Jul 2005, 21:36:13 UTC

Up, 33835 workunits to crunch
37 concurrent connections


We need some new workunits in the next two days :-)
Linux Users Everywhere @ BOINC
ID: 9067
Scottatron
Joined: 18 Sep 04
Posts: 28
Credit: 59,744
RAC: 0
Message 9081 - Posted: 1 Aug 2005, 21:44:44 UTC

Down to approx. 4,700 WUs now! Hopefully there is some more work in the pipeline...
ID: 9081
Colin Porter
Joined: 14 Jul 05
Posts: 35
Credit: 71,636
RAC: 0
Message 9082 - Posted: 1 Aug 2005, 21:57:32 UTC

Oh dear - looks like cold turkey time. Hopefully not for too long.

It's not the speed, but the quality - Until I get a faster computer.
ID: 9082


©2024 CERN