Chapter 1 Introduction

Quote

All computers are now parallel. Specifically, all modern computers support parallelism in hardware through at least one parallel feature, including vector instructions, multithreaded cores, multicore processors, multiple processors, graphics engines, and parallel co-processors.

In short, parallel programming is programming.

1.1 Think Parallel

To think parallel is to identify operations that can be performed simultaneously. If an algorithm's original iterative form does not expose them, we need to restructure it into an appropriate parallel form, as in the sketch below.
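
A minimal C++17 sketch of such a conversion (illustrative; not code from the book): the serial loop threads a running total through every iteration, while std::reduce is allowed to regroup the additions into partial sums that can be combined in parallel.

    #include <execution>
    #include <numeric>
    #include <vector>

    // Serial, iterative form: iteration i needs the sum from iteration i-1,
    // so the additions must happen strictly in order.
    double serial_sum(const std::vector<double>& v) {
        double sum = 0.0;
        for (double x : v)
            sum += x;
        return sum;
    }

    // Parallel structure: a reduction. std::reduce may reorder and group
    // the additions into a tree of partial sums (note: floating-point
    // results can differ slightly, since reassociation changes rounding).
    double parallel_sum(const std::vector<double>& v) {
        return std::reduce(std::execution::par, v.begin(), v.end(), 0.0);
    }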

1.2 Performance

Two problems commonly get in the way:

  • Computation may not be the bottleneck; access to memory or communication may be instead.
  • The potential for scaling performance is constrained by the algorithm's span: the longest chain of operations that must execute one after another (the critical path); see the bound sketched after this list.
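
The work-span model, developed later in the book, makes the second point precise. Writing T_1 for the total work, T_∞ for the span, and P for the number of workers, any schedule obeys:

    T_P \ge \max\!\left( \frac{T_1}{P},\; T_\infty \right),
    \qquad
    \text{speedup} = \frac{T_1}{T_P} \le \frac{T_1}{T_\infty}

However many workers are added, the critical path still has to run in order, so T_∞ caps the achievable speedup.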

This book focuses on the shared memory machine model.

1.3 Motivation: Pervasive Parallelism

Three walls ended the era of free serial performance gains:

  • Power wall: clock frequencies can no longer rise, because power consumption and heat dissipation grow unacceptably with frequency.
  • Instruction-level parallelism (ILP) wall: hardware can extract only so much implicit parallelism from a serial instruction stream, and the returns have diminished.
  • Memory wall: memory latency and bandwidth improve more slowly than processor throughput, so computation increasingly waits on memory.

1.3.3 Need for Explicit Parallel Programming

Serial traps: the long-sustained serial illusion has built traps into our tools and ways of thinking. For example, pointers are easy to reason about in serial programs, but in parallel programs two pointers that may alias the same memory create hidden dependences and data races that block parallelization (illustrated below).
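
A small illustration of the aliasing trap (hypothetical code, not from the book): the loop looks trivially parallel, yet splitting it across workers is only safe if the two pointers refer to disjoint storage.

    // Safe to parallelize only if a[0..n) and b[0..n) do not overlap.
    void scale(float* a, const float* b, int n) {
        for (int i = 0; i < n; ++i)
            a[i] = 2.0f * b[i];
    }

    // Called as scale(a, a + 1, n - 1), iteration i reads a[i + 1], which
    // iteration i + 1 writes. Serial execution reads the old value; a
    // parallel schedule may read the overwritten one and change the result.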

So we need to:

  • Create appropriate tools.
  • Think in parallel.

Two types of parallelism:

  • Mandatory parallelism: part of the program's semantics; the activities must actually run concurrently for the program to work.
  • Optional parallelism: the program is equally correct when executed serially; parallelism exists purely for performance. (Both are contrasted in the sketch after this list.)
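
The contrast in a short sketch (assumed examples in standard C++; not code from the book):

    #include <atomic>
    #include <future>
    #include <iostream>
    #include <thread>

    int main() {
        // Optional parallelism: with the default launch policy, the task
        // may run on another thread or be deferred onto this one; the
        // program's meaning is the same either way.
        auto f = std::async([] { return 6 * 7; });
        std::cout << f.get() << '\n';

        // Mandatory parallelism: the waiting loop terminates only because
        // the producer runs concurrently; a schedule that ran this thread
        // to completion before the producer would spin forever.
        std::atomic<bool> ready{false};
        std::thread producer([&] { ready = true; });
        while (!ready) { /* spin until the producer signals */ }
        producer.join();
        return 0;
    }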

1.4 Structured Pattern-Based Programming

1.5 Parallel Programming Models

1.6 Organization of this Book

1.7 Summary
