In this exercise you can practice using C functions in Cython modules. The code for this exercise is located under cython/c-functions. Fibonacci numbers are a sequence of integers defined by the …
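As a sketch of the idea (not the exercise's reference solution), a C-level function defined with cdef can be called cheaply from other Cython code, with a small def wrapper making it visible to Python; the file and function names below are illustrative, not necessarily those used in the course material.

```cython
# fib.pyx -- illustrative file name, not necessarily the one used in the exercise
cdef int fib_c(int n):
    # C-level recursive Fibonacci, callable only from Cython/C code
    if n < 2:
        return n
    return fib_c(n - 1) + fib_c(n - 2)

def fibonacci(int n):
    # Thin Python-visible wrapper around the C-level function
    return fib_c(n)
```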
In this exercise you can practice using static type declarations in Cython modules. The code for this exercise is located under cython/static-typing. Continue with the simple Cython module for subtracting …
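For illustration only, a hypothetical function with static type declarations might look as follows; the function name and the chosen types are assumptions, not necessarily those used in the exercise.

```cython
# Hypothetical example: a statically typed subtract function
def subtract(double x, double y):
    # The typed local variable avoids creating intermediate Python objects
    cdef double result
    result = x - y
    return result
```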
If you would like to practice your parallel programming skills further, the final parallel programming hands-on exercise is the parallelization of the heat equation solver with MPI. Source code …
We hope that you have enjoyed the fourth and final week of Python in High Performance Computing! This week, we have looked into parallel computing using MPI for Python. With …
In this exercise we test different routines for collective communication. Source code for this exercise is located in mpi/collectives/. First, write a program where rank 0 sends an array containing …
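A minimal sketch of what the first part could look like, assuming the data is a NumPy array broadcast from rank 0 to all the other tasks (the array size and contents are placeholders):

```python
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

# Placeholder data: rank 0 fills the array, the others allocate an empty buffer
if rank == 0:
    data = np.arange(8, dtype=float)
else:
    data = np.zeros(8, dtype=float)

# Broadcast the array from rank 0 to all other tasks
comm.Bcast(data, root=0)
print("Rank", rank, "has", data)
```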
Collective communication in MPI also includes routines for global communication among all the processes. Global collective communication is extremely costly in terms of performance, so if possible one should …
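As an illustration, allreduce is one such global routine: every task contributes data and every task receives the combined result. A minimal sketch with placeholder values:

```python
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

# Every task contributes its own data ...
local = np.full(4, rank, dtype=float)
total = np.zeros(4, dtype=float)

# ... and every task receives the element-wise sum over all tasks
comm.Allreduce(local, total, op=MPI.SUM)
print("Rank", rank, "sum:", total)
```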
You can use collective communication to collect data from all tasks and move it into a single task, i.e. to move data from many to one. 1. Gather: Gather …
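A minimal sketch of this many-to-one pattern with mpi4py's gather, using a placeholder per-task value:

```python
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

# Each task provides one value; rank 0 ends up with a list of all of them
data = rank ** 2
collected = comm.gather(data, root=0)

if rank == 0:
    print("Rank 0 gathered:", collected)
else:
    assert collected is None   # non-root tasks receive None
```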
In the MPI context, a communicator is a special object representing a group of processes that participate in communication. When an MPI routine is called, the communication will involve some or …
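For example, the default communicator MPI.COMM_WORLD contains all the processes started with the program; a minimal sketch of querying it:

```python
from mpi4py import MPI

# MPI.COMM_WORLD is the default communicator containing all the processes
comm = MPI.COMM_WORLD
size = comm.Get_size()   # total number of processes in the communicator
rank = comm.Get_rank()   # id of this process within the communicator

print("I am rank", rank, "out of", size)
```

Sketches like this would typically be launched with something like mpiexec -n 4 python3 example.py.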
Collective communication transfers data between all the processes in a communicator. MPI includes collective communication routines not only for data movement, but also for collective computation and synchronisation. For example, …
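One of the synchronisation routines is the barrier, in which no task proceeds until every task in the communicator has reached the call; a minimal sketch:

```python
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

print("Rank", rank, "before the barrier")
# No task continues past this line until all tasks have reached it
comm.Barrier()
print("Rank", rank, "after the barrier")
```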
In this exercise we explore non-blocking communication. Source code for this exercise is located in mpi/non-blocking/. Go back to the Message chain exercise and implement it using non-blocking communication.
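One possible shape for the non-blocking version, assuming NumPy buffers with Isend/Irecv and a wait on both requests (array size and payload are arbitrary choices):

```python
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

# Neighbours in the acyclic chain; MPI.PROC_NULL makes the end calls no-ops
dest = rank + 1 if rank < size - 1 else MPI.PROC_NULL
source = rank - 1 if rank > 0 else MPI.PROC_NULL

sendbuf = np.full(10, rank, dtype=float)
recvbuf = np.zeros(10, dtype=float)

# Start both transfers, then wait for them to complete
requests = [comm.Isend(sendbuf, dest=dest),
            comm.Irecv(recvbuf, source=source)]
MPI.Request.Waitall(requests)

print("Rank", rank, "received data from rank", source)
```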
When communication routines are blocking, the program is stuck waiting for as long as the communication is taking place. Blocking routines will exit only once it is safe to access …
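A small sketch of what this means in practice with the blocking Send and Recv (buffer contents are placeholders; run with at least two MPI tasks):

```python
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

buf = np.zeros(5, dtype=float)
if rank == 0:
    buf[:] = 1.2
    comm.Send(buf, dest=1)    # returns once buf is safe to reuse
    buf[:] = 0.0              # safe to overwrite after the call returns
elif rank == 1:
    comm.Recv(buf, source=0)  # returns only after the data has arrived
    print("Rank 1 can use the received data right away:", buf)
```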
In this exercise we explore a typical communication pattern, a one-dimensional acyclic chain. Source code for this exercise is located in mpi/message-chain/. Write a simple program where every MPI task sends …
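One possible skeleton for the chain, here using the combined send-receive call so the end tasks can simply talk to MPI.PROC_NULL; the payload is a placeholder:

```python
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

# One-dimensional acyclic chain: send to rank+1, receive from rank-1
dest = rank + 1 if rank < size - 1 else MPI.PROC_NULL
source = rank - 1 if rank > 0 else MPI.PROC_NULL

sendbuf = np.full(10, rank, dtype=float)
recvbuf = np.zeros(10, dtype=float)

# Combined send and receive avoids the deadlock risk of mis-ordered calls
comm.Sendrecv(sendbuf, dest=dest, recvbuf=recvbuf, source=source)
print("Rank", rank, "received from", source, ":", recvbuf[0])
```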
In this exercise we study sending and receiving data between two MPI processes. Source code for this exercise is located in mpi/message-exchange/. Communicating general Python objects: write a simple program …
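A minimal sketch of an exchange of general Python objects between ranks 0 and 1, with the calls ordered so that the blocking sends and receives cannot deadlock (the dictionary payload is a placeholder):

```python
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

# Placeholder payload; any picklable Python object works with send/recv
data = {'rank': rank, 'values': list(range(4))}

if rank == 0:
    comm.send(data, dest=1)          # send first ...
    incoming = comm.recv(source=1)   # ... then receive
    print("Rank 0 received:", incoming)
elif rank == 1:
    incoming = comm.recv(source=0)   # receive first ...
    comm.send(data, dest=0)          # ... then send
    print("Rank 1 received:", incoming)
```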
MPI for Python offers very convenient and flexible routines for sending and receiving general Python objects. Unfortunately, this flexibility comes at a cost in performance. In practice, what happens under …
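Concretely, the lowercase methods (send, recv, ...) serialise arbitrary Python objects behind the scenes, while the uppercase methods (Send, Recv, ...) pass buffer-like objects such as NumPy arrays directly to MPI. A small sketch contrasting the two:

```python
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

if rank == 0:
    # lowercase send: the Python object is serialised (pickled) behind the scenes
    comm.send({'a': 1, 'b': [2, 3]}, dest=1, tag=1)
    # uppercase Send: the array's memory buffer is handed to MPI directly
    data = np.arange(5, dtype=float)
    comm.Send(data, dest=1, tag=2)
elif rank == 1:
    obj = comm.recv(source=0, tag=1)
    arr = np.empty(5, dtype=float)
    comm.Recv(arr, source=0, tag=2)
    print("Rank 1 received", obj, "and", arr)
```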
Since MPI processes are independent, in order to coordinate work, they need to communicate by explicitly sending and receiving messages. There are two types of communication in MPI: point-to-point communication …
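As the simplest possible illustration of point-to-point communication, one process sends a message that another explicitly receives (the message content is a placeholder):

```python
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

if rank == 0:
    comm.send("hello from rank 0", dest=1)   # explicit send to rank 1
elif rank == 1:
    msg = comm.recv(source=0)                # explicit receive from rank 0
    print("Rank 1 got:", msg)
```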