SCIRun User Mailing List



Re: [SCIRUN-USERS] forward modeling with threads


  • From: "Darren Weber" <dweber@radmail.ucsf.edu>
  • To: "J. Davison de St. Germain" <dav@cs.utah.edu>
  • Cc: scirun-users@sci.utah.edu, Jason.Crane@mrsc.ucsf.edu, "Jeff Block" <jblock@mrsc.ucsf.edu>, "Sarang S. Dalal" <sarang@hurricane.ucsf.edu>
  • Subject: Re: [SCIRUN-USERS] forward modeling with threads
  • Date: Mon, 27 Mar 2006 16:22:33 -0800
  • Organization: UCSF Department of Radiology


Thanks, Dav.

I appreciate the clarification about MPI vs. threads. I do need to do more homework on threads and distributed computing; if you can recommend good textbooks or online material, I would be glad to hear about it.

I am guessing that the forward solution for each source location and the sensor array can be threaded. As far as I know, there are no dependencies between source locations in the forward model (the inverse is another kettle of fish). I'm not sure exactly how it would be threaded; I just assume SCIRun has already solved the details in its C/C++ code. I have a lot to learn about this, but not much time to do so.
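
To make my assumption concrete, here is a rough sketch of the kind of parallelism I have in mind (this is not SCIRun code; computeForwardSolution, Source, and NUM_THREADS are placeholder names I invented): the loop over source locations is split into slices and handed to pthreads on a single shared-memory machine.

/*
 * Each source location's forward solution is independent, so the loop
 * over sources can be divided into slices and handed to worker threads.
 * Every thread writes into its own disjoint slots of the shared output
 * vector, so no locking is needed.
 */
#include <pthread.h>
#include <vector>

struct Source { double x, y, z; };                 // placeholder dipole location

// Placeholder for the real per-source forward computation.
double computeForwardSolution(const Source& s) { return s.x + s.y + s.z; }

struct WorkerArgs {
  const std::vector<Source>* sources;              // shared, read-only input
  std::vector<double>*       gains;                // shared output, one slot per source
  size_t                     begin, end;           // this thread's slice [begin, end)
};

void* worker(void* p) {
  WorkerArgs* a = static_cast<WorkerArgs*>(p);
  for (size_t i = a->begin; i < a->end; ++i)
    (*a->gains)[i] = computeForwardSolution((*a->sources)[i]);
  return 0;
}

int main() {
  const size_t NUM_THREADS = 4;                    // e.g. one per processor
  std::vector<Source> sources(10000);              // e.g. a dipole grid over the cortex
  std::vector<double> gains(sources.size());

  std::vector<pthread_t>  threads(NUM_THREADS);
  std::vector<WorkerArgs> args(NUM_THREADS);
  const size_t chunk = sources.size() / NUM_THREADS;

  for (size_t t = 0; t < NUM_THREADS; ++t) {
    args[t].sources = &sources;
    args[t].gains   = &gains;
    args[t].begin   = t * chunk;
    args[t].end     = (t + 1 == NUM_THREADS) ? sources.size() : (t + 1) * chunk;
    pthread_create(&threads[t], 0, worker, &args[t]);
  }
  for (size_t t = 0; t < NUM_THREADS; ++t)
    pthread_join(threads[t], 0);
  return 0;
}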

We have a grid system that can schedule independent jobs (it can also run MPI code, of course). The grid consists of 32-bit and 64-bit Linux systems (including some dual-Xeon and Opteron machines). In this case, it could run forward solutions for multiple subjects in parallel. Each forward solution would be threaded and run locally on a single machine in the grid; the more processors and memory that machine has, the faster the forward solution computes.

My goal is to create a shell script that takes a single subject argument, sets all the paths, and identifies all the data elements required for the forward solution. The data elements are already preprocessed for efficient input into a SCIRun algorithm. What I do not know is which SCIRun algorithm to use and how to get my data elements into the right formats. Our systems admin guys are looking into the tricky bits of compiling SCIRun (maybe there are specific code modules that would do this particular job). If they cannot compile it, they may fall back to an RPM installation.

Thanks, Darren



J. Davison de St. Germain wrote:

Hi,

  Usually when someone says a "grid system" they mean a distributed cluster-type environment, again usually built on a message-passing system (most often MPI). If this is what you mean, then no, SCIRun does not support this type of multi-processing. SCIRun uses threads (e.g., pthreads, or sproc on SGI). Threads are 'shared memory' beasts, and thus don't help in a distributed environment. However, if you have a multi-processor computer (many new Linux boxes have 2 or more processors, and many SGIs have lots of processors), then SCIRun can fairly straightforwardly be made to take advantage of it.
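
   To make the distinction concrete (a toy sketch only, not SCIRun code): with threads, every worker writes directly into the same address space; in an MPI job, each process owns its own memory, so results have to be communicated explicitly, for example:

/*
 * Toy MPI sketch: each rank computes a private slice of (placeholder)
 * per-source values, and the results must be gathered over the network
 * because nothing is shared between processes.
 */
#include <mpi.h>
#include <vector>

int main(int argc, char** argv) {
  MPI_Init(&argc, &argv);

  int rank, nprocs;
  MPI_Comm_rank(MPI_COMM_WORLD, &rank);
  MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

  const int perRank = 100;                        // placeholder slice size
  std::vector<double> local(perRank, double(rank));

  std::vector<double> all;
  if (rank == 0) all.resize(perRank * nprocs);    // only the root holds the full result
  MPI_Gather(&local[0], perRank, MPI_DOUBLE,
             rank == 0 ? &all[0] : 0, perRank, MPI_DOUBLE,
             0, MPI_COMM_WORLD);

  MPI_Finalize();
  return 0;
}

SCIRun has no such message-passing layer, which is why its threads only buy you speed within a single box.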

   Now as to the particulars of creating a parallel algorithm for solving forward solutions, I'm not sure of the answer. Perhaps others on this list will have some insight. Note that the FAQ probably doesn't have the specific information you might need.

   Please feel free to write back with more specifics.


Darren Weber wrote:

Hi,

this looks helpful:

http://software.sci.utah.edu/doc/Installation/FAQ/faq.html

Does this point to sufficient information about SCIRun multithreading for forward solutions? I would appreciate some advice on getting scripts set up to run multithreaded forward solutions for EEG/MEG and spherical or boundary element models. As novices to threaded computations on a grid system, we can seek advice from a local expert in their use, but some startup tips and potential pitfalls specific to SCIRun would be valuable. We have a grid system that includes Solaris and Linux machines.

I assume that inverse solutions are not threaded to the same extent as forward solutions, although the BLAS may be threaded.

Take care, Darren



   Sincerely,
          J. Davison de St. Germain

-----------------------------------------------------------------------
- J. Davison de St. Germain         dav@sci.utah.edu   (801) 581-4078 -
- Chief Software Engineer           http://www.cs.utah.edu/~dav       -
- SCI Institute, SE C-SAFE          University of Utah                -
-----------------------------------------------------------------------




--

Darren L. Weber, Ph.D.
Visiting Postdoctoral Scholar

Dynamic Neuroimaging Laboratory, Department of Radiology,
University of California, San Francisco,
185 Berry Street, Suite 350, Box 0946,
San Francisco, CA 94107, USA.

Tel: +1 415 353-9444
Fax: +1 415 353-9421
www: http://dnl.ucsf.edu/users/dweber

"To explicate the uses of the brain seems as difficult
a task as to paint the soul, of which it is commonly
said, that it understands all things but itself."
--Thomas Willis (The Anatomy of the Brain and Nerves, 1664)






