Description
This training introduces the Message Passing Interface (MPI) standard for programming distributed-memory machines. The main MPI functionalities will be covered (environment management, point-to-point communication, collective communication, derived data types, and process topologies). An introduction to OpenMP for programming shared-memory machines will also be provided. Practical sessions with exercises will illustrate the main concepts.
🎯 Training objective
Acquire the fundamentals of parallel programming with MPI and OpenMP.
✅ Learning outcomes
By the end of the training, participants will be able to:
- Parallelize a simple C/Fortran program (around 50 lines) using the MPI library and/or OpenMP directives
- Identify and apply the appropriate OpenMP directives (work sharing, synchronization) for the problem at hand (see the OpenMP sketch after this list)
- Identify and apply the appropriate MPI functions (point-to-point and collective communication, communicators, topologies) for the problem at hand (see the MPI sketch after this list)
- Address the main challenges of parallel programming (moving from sequential to parallel code, shared memory, message passing, deadlocks)
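To give a flavor of the OpenMP side of these outcomes, here is a minimal C sketch (an illustration only, not course material) using the `parallel for` work-sharing directive together with a `reduction` clause; the array size and the computation are arbitrary choices for the example.

```c
#include <stdio.h>
#include <omp.h>

#define N 1000   /* arbitrary problem size for this illustration */

int main(void) {
    double a[N], sum = 0.0;

    /* Work sharing: the loop iterations are distributed among the threads. */
    #pragma omp parallel for
    for (int i = 0; i < N; i++)
        a[i] = 2.0 * i;

    /* The reduction clause gives each thread a private partial sum and
       combines them at the end, avoiding a race on the shared variable. */
    #pragma omp parallel for reduction(+:sum)
    for (int i = 0; i < N; i++)
        sum += a[i];

    printf("sum = %f\n", sum);
    return 0;
}
```

Compiled with an OpenMP-capable compiler (for example `gcc -fopenmp`), the number of threads can be set through the `OMP_NUM_THREADS` environment variable.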
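Likewise, a minimal MPI sketch in C (again illustrative, with toy values, and not the course's exercise code) showing a point-to-point exchange with `MPI_Send`/`MPI_Recv` and a collective reduction with `MPI_Reduce`:

```c
#include <stdio.h>
#include <mpi.h>

int main(int argc, char *argv[]) {
    int rank, size;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* Point-to-point: rank 0 sends one integer to rank 1. */
    if (size > 1) {
        int token;
        if (rank == 0) {
            token = 42;   /* arbitrary value for the illustration */
            MPI_Send(&token, 1, MPI_INT, 1, 0, MPI_COMM_WORLD);
        } else if (rank == 1) {
            MPI_Recv(&token, 1, MPI_INT, 0, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
            printf("rank 1 received %d\n", token);
        }
    }

    /* Collective: every rank contributes its rank number; rank 0 gets the sum. */
    int local = rank, total = 0;
    MPI_Reduce(&local, &total, 1, MPI_INT, MPI_SUM, 0, MPI_COMM_WORLD);
    if (rank == 0)
        printf("sum of ranks = %d\n", total);

    MPI_Finalize();
    return 0;
}
```

Such a program would typically be built with `mpicc` and launched with, for example, `mpirun -np 4 ./a.out`.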
📚 Teaching methods
The course alternates between lectures and hands-on exercises. A final multiple-choice exam is used for evaluation. Training rooms are equipped with computers, and participants may work in pairs.
👤 Lead instructor: Jeoffrey Legaux
👥 Target audience
Engineers, physicists, computer scientists, and numerical modelers wishing to acquire the basics of MPI and OpenMP parallel programming.
🔑 Prerequisites
Participants must:
- Be employed by a European company (employer certificate required)
- Hold a Master’s degree (Bac+5 or equivalent) or higher
- Know basic Linux commands
- Master one of the following programming languages: Fortran or C
- Have at least a B2 level in English (CEFR), as the training may be delivered in English or French depending on the audience
To confirm prerequisites, applicants must complete one of the following questionnaires and achieve at least 75% correct answers to be eligible:
- Fortran questionnaire: https://goo.gl/forms/IqDvVXfOYYqR0NMr1
- C questionnaire: https://goo.gl/forms/WwR3wvQVz2dYy6AX2
Registration
Registration is open to applicants who certify that they obtained at least 75% correct answers on one of the questionnaires above.
📆 Program
The training alternates between theoretical lectures and hands-on computer sessions (programming with Fortran or C). Course examples will be provided in Fortran.
Day 1
- 9:00 AM: Welcome and coffee
- 9:15 – 10:00 AM: Introduction to parallel computing and to the MPI and OpenMP programming models; OpenMP basics – shared memory
- 10:00 – 10:45 AM: Exercises
- 10:45 – 11:00 AM: Break
- 11:00 – 11:45 AM: Work sharing
- 11:45 AM – 12:30 PM: Exercises
- 12:30 – 2:00 PM: Lunch break
- 2:00 – 2:45 PM: Synchronization – Pitfalls
- 2:45 – 3:30 PM: Exercises
- 3:30 – 3:45 PM: Break
- 3:45 – 4:30 PM: Introduction to the message-passing parallel programming model – Point-to-point communication
- 4:30 – 5:30 PM: Exercises
Day 2
- 9:00 – 10:00 AM: Point-to-point and collective communications
- 10:00 – 10:45 AM: Exercises
- 10:45 – 11:00 AM: Break
- 11:00 – 11:45 AM: Collective communications
- 11:45 AM – 12:30 PM: Exercises
- 12:30 – 2:00 PM: Lunch break
- 2:00 – 2:45 PM: Derived data types
- 2:45 – 3:30 PM: Exercises
- 3:30 – 3:45 PM: Break
- 3:45 – 4:30 PM: Communicators – Topologies
- 4:30 – 5:15 PM: Exercises
- 5:15 – 5:30 PM: Wrap-up
📊 Evaluation
A final multiple-choice exam will be held at the end of the training.