Message boards :
Number crunching :
Some discoveries (to predict how long a WU will take)
Author | Message |
---|---|
Send message Joined: 23 Oct 04 Posts: 358 Credit: 1,439,205 RAC: 0 |
I noticed the following with the WUs from 21 March. Example: v64boince6ib1-56s14_16675_1_sixvf_22230. I examined the s-number (here s14) in the WU name.
s4_ : 7 WUs, avg. 11 h 40 min/WU
s6_ : 8 WUs, avg. 11 h 20 min/WU
s8_ : 7 WUs, avg. 4 h 10 min/WU
s10_ : 8 WUs, avg. 0 h 35 min/WU
s12_ : 8 WUs, avg. 0 h 20 min/WU
s14_ : 2 WUs, avg. 0 h 10 min/WU
s16_ : 2 WUs, avg. 0 h 07 min/WU
s18_ : 3 WUs, avg. 0 h 04 min/WU
From the whole 'stock', only one WU (an s18_) returned with 0.00 CPU time. Hope for the next work! greetz littleBouncer |
Send message Joined: 23 Oct 04 Posts: 358 Credit: 1,439,205 RAC: 0 |
Some updates about predicting the time a 'boince' WU will take to be crunched. I have set the s14_ WU as the 'normal'-length WU, at 100%, so you can estimate (calculate) the prediction time for your own system. I noticed the following with the WUs from 21 March (and the same with those from 29 March). Example: v64boince6ib1-56s14_16675_1_sixvf_22230. I examined the s-number in the WU name.
s4_ : 7 WUs, avg. 11 h 40 min/WU : 100.0000 %
s6_ : 8 WUs, avg. 11 h 20 min/WU : 95.0000 %
s8_ : 7 WUs, avg. 4 h 10 min/WU : 33.3333 %
s10_ : 8 WUs, avg. 0 h 35 min/WU : 5.5556 %
s12_ : 8 WUs, avg. 0 h 20 min/WU : 2.7778 %
s14_ : 2 WUs, avg. 0 h 10 min/WU : 1.3889 %
s16_ : 2 WUs, avg. 0 h 07 min/WU : -.-
s18_ : 3 WUs, avg. 0 h 04 min/WU : -.-
Maybe this table is usable... greetz littleBouncer
The WUs >s16_ are [IMO] 'critical', and will certainly also abort sometimes with 0 CPU time. |
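As a sketch of how the percentage table above could be used in practice: the snippet below pulls the s-number out of a WU name and scales your own measured s4_ average (littleBouncer's was 11 h 40 min, i.e. 700 min; yours will differ). This is my own illustration, not an official tool, and the `estimate_minutes` helper and the 700-minute baseline are assumptions for the example.

```python
import re

# Percentages from littleBouncer's table (s4_ row = 100%);
# s16_/s18_ are omitted because they often abort with 0 CPU time.
LENGTH_PERCENT = {
    "s4": 100.0000, "s6": 95.0000, "s8": 33.3333,
    "s10": 5.5556, "s12": 2.7778, "s14": 1.3889,
}

def estimate_minutes(wu_name, my_s4_minutes):
    """Estimate runtime from the s-number embedded in a WU name.

    my_s4_minutes is the average time YOUR machine needs for an
    s4_ WU (e.g. littleBouncer's hosts averaged about 700 min).
    """
    m = re.search(r"(s\d+)_", wu_name)  # e.g. 's14' from ...56s14_16675...
    if not m or m.group(1) not in LENGTH_PERCENT:
        return None  # unknown or 'critical' s-number
    return my_s4_minutes * LENGTH_PERCENT[m.group(1)] / 100.0

print(estimate_minutes("v64boince6ib1-56s14_16675_1_sixvf_22230", 700))
```

As the later posts stress, the table is only a hint: replace the 700-minute baseline with what your own machine actually measures.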
Send message Joined: 23 Oct 04 Posts: 358 Credit: 1,439,205 RAC: 0 |
Only to edit the title. |
Send message Joined: 2 Sep 04 Posts: 352 Credit: 1,393,150 RAC: 0 |
Little Bouncer, I feel you are correct in your assumption that the higher the s#, the faster the WU will complete. But everybody will have to monitor their own computers to see just what time the WUs complete in, to get a truly accurate time for the WUs. I say this because, after monitoring my own computers, the times you gave for the WUs differ quite a bit from the times I get. About the only thing that holds true is that the higher the s#, the faster they complete. I would think the reason for this is the difference in the speed of the CPUs, whether you have an Intel or an AMD CPU, whether you are running in HT mode or not if you have an Intel, and how much you use your computer or computers to do other things. |
Send message Joined: 23 Oct 04 Posts: 358 Credit: 1,439,205 RAC: 0 |
I made the table just as a hint. Surely you must monitor your own system; the idea is then to adapt your own 'stats'. The table is really rudimentary; there are other parameters, like the TPS (if you toggle the 'F' key), which I didn't transfer to the table. greetz littleBouncer BTW: those faster WUs are very important for science, so don't abort them! (And I didn't intend the table to be used for aborting faster WUs!) |
Send message Joined: 23 Oct 04 Posts: 358 Credit: 1,439,205 RAC: 0 |
Only a 'little update'. The line setting the s14_ WU as the 'normal'-length (100%) WU is wrong: s14_ should be s4_. Correct line: I have set the s4_ WU as the 'normal' length, which a WU will take, at 100%. Sorry for this mistake. greetz littleBouncer |
Send message Joined: 2 Sep 04 Posts: 16 Credit: 15,568 RAC: 0 |
Interesting stats! I would, however, be much more interested in the ranges of times for each of the workunit types (i.e., slowest and fastest times for each) rather than just the average so that we get a clearer picture of their distributions. I suspect that s4 and s6 and the group of s14-s18 include considerably overlapping times (e.g., many of the s6 workunits take longer than the average s4 time, etc.). |
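The range table Scott Brown asks for is easy to produce once per-WU times are logged. A minimal sketch, with made-up, hypothetical times (not real measurements from this thread) chosen to show the kind of s4/s6 overlap he suspects:

```python
from statistics import mean

def range_stats(times):
    """Return (fastest, average, slowest) for a list of WU times in minutes."""
    return min(times), mean(times), max(times)

# Hypothetical logged times in minutes per s-number (illustrative only):
logged = {
    "s4": [690, 705, 712],
    "s6": [668, 700, 731],  # overlaps the s4 range, as suspected
    "s14": [9, 10, 11],
}

for s, t in sorted(logged.items()):
    lo, avg, hi = range_stats(t)
    print(f"{s}: fastest {lo} min, avg {avg:.1f} min, slowest {hi} min")
```

With real logs (e.g. from BOINCLOGX history, mentioned later in the thread) the same loop would show whether the averages hide overlapping distributions.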
Send message Joined: 23 Oct 04 Posts: 358 Credit: 1,439,205 RAC: 0 |
Some addition:
[QUOTE from Chrulle] The number in s(number), for example s16, is the amplitude of the starting conditions; the higher the amplitude, the more likely it is that particles will be lost. Chrulle, LHC@home developer [/QUOTE]
Thanks to Chrulle for this update. littleBouncer
@ Scott Brown: Sure, the table isn't 'exact' yet, and it is only meant as a hint for making your own 'stats' about the 's(numbers)', for example. You can use the data from the 'BOINCLOGX' v1.22 History (if you use this application). |
©2024 CERN