1) Message boards : LHC@home Science : Super Large Hadron Collider (Message 19823)
Posted 24 Jul 2008 by Profile westsail
Post:
Brilliant! So does this mean we might actually have work again someday?

The Super Large Hadron Collider (SLHC) is a proposed upgrade to the Large Hadron Collider to be made around 2012. The upgrade aims at increasing the luminosity of the machine by a factor of 10, to 10³⁵ cm⁻²s⁻¹, providing a better chance to see rare processes and improving statistically marginal measurements. There exist many different paths to the upgrade. A collection of different designs of the high luminosity interaction regions is being maintained at [1]. A workshop was held in 2006 to establish which are the most promising options [2]. A comprehensive press article on this workshop can be found at the CERN Courier. A summary of the possible machine parameters can be found at the Machine parameters collection.

Increasing LHC luminosity involves reduction of the beam size at the collision point and either reduction of bunch length and spacing, or a significant increase in bunch length and population. The maximum integrated luminosity increase of the existing options is about a factor of 4 higher than the LHC ultimate performance, unfortunately far below the LHC upgrade project's initial ambition of a factor of 10. However, at the latest LUMI'06 workshop [3], several suggestions were proposed to boost the LHC peak luminosity by another factor of 10 beyond nominal, towards 10³⁵ cm⁻²s⁻¹.

The resultant higher event rate poses important challenges for the particle detectors located in the collision areas[4].

Wikipedia
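
For reference, the knobs mentioned in the quoted passage all appear in the standard expression for a collider's instantaneous luminosity (in the form used in the LHC Design Report):

    L = \frac{N_b^2 \, n_b \, f_{\mathrm{rev}} \, \gamma_r}{4\pi \, \epsilon_n \, \beta^*} \, F ,
    \qquad
    F = \frac{1}{\sqrt{1 + \left( \frac{\theta_c \, \sigma_z}{2 \sigma^*} \right)^2}}

Here N_b is the bunch population, n_b the number of bunches, f_rev the revolution frequency, gamma_r the relativistic gamma factor, epsilon_n the normalized transverse emittance, beta* the beta function at the collision point, theta_c the crossing angle, sigma_z the bunch length, and sigma* the transverse beam size at the interaction point. Squeezing beta* (smaller beams at the collision point) raises L directly, shortening the bunches raises the geometric factor F, and the bunch population N_b enters quadratically; those are exactly the parameters the upgrade scenarios above are playing with.
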
2) Message boards : Number crunching : How often does LHC shut down? (Message 19485)
Posted 18 Apr 2008 by Profile westsail
Post:

I third that! lol
A high-bandwidth project would be doable even if it took days or a couple of weeks to crunch a single WU, and if need be it could be crunched in parallel across multiple CPUs (if that's supported) to get it done faster. There are many of us flexible enough with our hardware that those numbers are really not that big of a deal. If you set up the requirements, people will crunch it. You'd be surprised how many people could attach a 10, 20, or even 50 gigaflop cluster built out of old junk if you asked them to. In fact I think that would be a very popular project, because people would look at it more as "elite credits" and build up hardware to crunch it.


umm... Fourth!

It would be a very popular project indeed: l33t credits for those with sufficient horsepower. Too bad OpenMosix doesn't deal with BOINC well at the moment. In my mind it seems like it would work for a single project meant to run on clusters, but I'm not sure. As echoed by others, bandwidth and disk space are non-issues for many here.

Another route I could see being hugely successful, and one that would provide a level of stability/reliability in results not seen when using multiple hardware platforms, would be if there were a way to make an app that just needed ridiculous amounts of CPU but could handle limited RAM. The coolest thing in the world would be to have a project like LHC (one that so many people feel so passionately about) make use of all the PS3s.
Look at the enormous computing power Folding brought to the table with its PS3 app. I think a project like this could really show what PS3s can do. Look at the groups making small PS3 clusters of 6-8 machines to replace leased supercomputer time. Here is a company that makes turnkey PS3 clusters:
http://terrasoftsolutions.com/
Anyway, sorry to ramble on and on; I've just been really excited about the LHC and about the crunching power of the PS3, but not too interested in the science at ps3grid. Not enough to farm them, anyway. So I think this would be a match made in heaven.


