Welcome to the ParR wiki!
Parallel processing in R is required for data-intensive jobs. One way to run a number of R scripts in parallel is to use MPI. MPI enables parallel processing across separate machines: the processes running on the individual processors exchange in-memory data by passing messages.
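As a concrete starting point, the sketch below shows one way to drive parallel R workers over MPI. It assumes the Rmpi package and a working MPI runtime such as Open MPI (neither is named in this wiki, so treat both as assumptions rather than the project's chosen setup).

```r
# Minimal sketch assuming the Rmpi package and an installed MPI runtime.
library(Rmpi)

mpi.spawn.Rslaves(nslaves = 2)   # start two worker R processes

# Run an expression on every worker; each reports its rank and the
# total number of processes in the communicator.
mpi.remote.exec(paste("Hello from rank", mpi.comm.rank(), "of", mpi.comm.size()))

mpi.close.Rslaves()              # stop the workers
mpi.quit()                       # finalize MPI and exit R
```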
**Message Passing Interface - MPI**
- A standard message-passing library definition
  - developed in 1993 by a group of parallel computer vendors, computer scientists, and applications developers.
- Bindings for C, C++, and Fortran 77/90/95.
- Available on a wide variety of architectures
  - supercomputers
  - clusters
  - desktop computers
- Distributed-memory paradigm
  - all inter-task communication is by message passing (see the sketch after the hello-world example below).
- All parallelism is explicit
  - the programmer is responsible for parallelism and implements it with MPI constructs.
The standard hello world program in MPI:

```c
#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>

int main (int argc, char **argv)
{
    MPI_Init (&argc, &argv);    /* initialize the MPI environment */
    printf ("Hello World!\n");  /* every rank prints independently */
    MPI_Finalize();             /* shut MPI down before exiting */
    return EXIT_SUCCESS;
}
```
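The bullets above stress that nothing is shared implicitly: every exchange between tasks is an explicit send and a matching receive, and the programmer wires both sides up. The sketch below illustrates that pattern from R. It again assumes the Rmpi package and uses its point-to-point calls `mpi.send.Robj` / `mpi.recv.Robj`; it is one possible arrangement, not the project's prescribed one.

```r
# Explicit point-to-point message passing, assuming the Rmpi package.
library(Rmpi)

mpi.spawn.Rslaves(nslaves = 1)   # one worker, rank 1; the master is rank 0

# Tell the worker to block until it receives an object from the master.
mpi.bcast.cmd(msg <- mpi.recv.Robj(source = 0, tag = 7))

# The master explicitly sends the data; nothing is shared implicitly.
mpi.send.Robj(obj = list(x = 1:5), dest = 1, tag = 7)

# Ask the worker to compute on the received object and return the result.
mpi.remote.exec(msg$x ^ 2)

mpi.close.Rslaves()
mpi.quit()
```

Because both sides must agree on the source, destination, and tag, the programmer carries the full responsibility for coordinating the parallelism, exactly as the MPI model requires.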