Questions and Answers : Unix/Linux : Verify GPU is being utilized, how to?
David E. Merchant

Joined: 11 Apr 17
Posts: 39
Credit: 7,735,161
RAC: 0
Message 41941 - Posted: 18 Mar 2020, 7:40:36 UTC

Is the LHC code able to utilize the GPU(s)? How can I tell whether they are being utilized right now or not? GPUtrip (or something like that) seemed to be working just fine.

(This post was created on the computer with the strongest GPU I have at the moment.)
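On Linux with an NVIDIA card, nvidia-smi shows per-device utilization; the same counters can also be polled from Python through the NVML bindings. A minimal sketch, assuming the nvidia-ml-py (pynvml) package is installed (this only tells you whether anything is using the GPU, not which project):

# Rough sketch: poll GPU utilization via NVML.
# Assumes an NVIDIA card and nvidia-ml-py (pip install nvidia-ml-py).
import pynvml

pynvml.nvmlInit()
for i in range(pynvml.nvmlDeviceGetCount()):
    handle = pynvml.nvmlDeviceGetHandleByIndex(i)
    name = pynvml.nvmlDeviceGetName(handle)
    if isinstance(name, bytes):  # older bindings return bytes
        name = name.decode()
    util = pynvml.nvmlDeviceGetUtilizationRates(handle)
    print(f"GPU {i} ({name}): {util.gpu}% compute, {util.memory}% memory")
pynvml.nvmlShutdown()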
computezrmle
Volunteer moderator
Volunteer developer
Volunteer tester
Help desk expert
Joined: 15 Jun 08
Posts: 2413
Credit: 226,555,663
RAC: 131,251
Message 41944 - Posted: 18 Mar 2020, 8:59:33 UTC - in response to Message 41941.  

Currently none of the LHC@home apps is doing scientific calculations on GPUs.
See the app list here:
https://lhcathome.cern.ch/lhcathome/apps.php

Last October Riccardo posted a comment about GPU app development:
https://lhcathome.cern.ch/lhcathome/forum_thread.php?id=5173&postid=40258
David E. Merchant

Joined: 11 Apr 17
Posts: 39
Credit: 7,735,161
RAC: 0
Message 41945 - Posted: 18 Mar 2020, 9:06:42 UTC - in response to Message 41944.  

Dang, that sure seems shortsighted. Given all the math-heavy aspects of the LHC project, it at least bears looking into, and it is pretty easy to move a virtual environment to a container design. If their algorithms are not good candidates for the array processing that GPUs are so outstandingly good at, then my point is moot; but I just betcha they are.

Just think how much faster (sometimes better than 1000x) the sims and analyses could progress if the GPUs were utilized.
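By "array processing" I mean element-wise math with no cross-element dependencies, which is exactly the shape of work a GPU wants. A rough illustration in NumPy (running on the CPU here; with CuPy the same expression would run unchanged on the GPU; the workload is made up for illustration, not LHC@home's actual code):

# Illustrative only: one relativistic energy per particle.
# Every element is independent, so the whole array can run in parallel.
import numpy as np

rng = np.random.default_rng(0)
px, py, pz = rng.normal(size=(3, 1_000_000))  # momentum components, GeV
m = 0.000511                                  # electron mass, GeV

energy = np.sqrt(px**2 + py**2 + pz**2 + m**2)
print(energy.mean())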
maeax

Joined: 2 May 07
Posts: 2101
Credit: 159,819,191
RAC: 123,837
Message 41947 - Posted: 18 Mar 2020, 10:13:13 UTC - in response to Message 41945.  

Accuracy before speed. You can see the same at WCG (World Community Grid).
David E. Merchant

Joined: 11 Apr 17
Posts: 39
Credit: 7,735,161
RAC: 0
Message 42075 - Posted: 6 Apr 2020, 22:24:32 UTC - in response to Message 41947.  

As long as long doubles, or at least ordinary 64-bit doubles, are used on the GPU, accuracy should still be equal to CPU runs, AIUI. We have the option (in the software) to truncate to 16 bits or even ints as needed, but why would we use shorts or ints in scientific calcs?
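To make the precision point concrete, here is a rough NumPy sketch (arbitrary step and count, not LHC@home code): naively accumulating a small step in half precision stalls once the spacing between representable values near the running total exceeds the step, while 64-bit stays on target.

# Accumulate 0.1 ten thousand times in three float widths.
import numpy as np

def accumulate(dtype, n=10_000):
    step = dtype(0.1)   # 0.1 is not exactly representable in binary floats
    total = dtype(0.0)
    for _ in range(n):
        total = total + step
    return total

print(accumulate(np.float64))  # ~1000.0, error negligible
print(accumulate(np.float32))  # close to 1000, with visible rounding drift
print(accumulate(np.float16))  # 256.0: near 256 the step rounds away entirely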
