22.1 Concurrency, Processes, and Threads
22.1.1 Concurrency
Concurrency is the ability of a system to manage multiple tasks that can make progress independently. These tasks may represent different activities, such as handling user input, reading from a file, or performing CPU-intensive computations. On a single CPU core, concurrency often relies on preemptive multitasking, where the operating system rapidly switches between tasks to give the illusion of simultaneous execution. During each context switch, the operating system saves the current task's state (e.g., registers, program counter) and restores the state of another task. Context switching introduces overhead, both from saving and restoring that state and from the loss of cached data when a new task's working set evicts the old one.
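As a minimal sketch of tasks making independent progress, the following spawns two threads that each compute part of a sum; whether the scheduler interleaves them on one core or runs them in parallel on two, joining both handles yields the same results (the names `sum_range` and `concurrent_sums` are illustrative, not from a particular library):

```rust
use std::thread;

// A small CPU-bound task.
fn sum_range(start: u64, end: u64) -> u64 {
    (start..end).sum()
}

// Run two independent tasks concurrently. The OS scheduler decides how
// their execution is interleaved; `join` blocks until each has finished.
fn concurrent_sums() -> (u64, u64) {
    let a = thread::spawn(|| sum_range(0, 1_000));
    let b = thread::spawn(|| sum_range(1_000, 2_000));
    (a.join().unwrap(), b.join().unwrap())
}

fn main() {
    let (x, y) = concurrent_sums();
    println!("partial sums: {x} and {y}");
}
```

Because neither task touches the other's data, no synchronization is needed here; the pitfalls below arise only once tasks share mutable state.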
Typical concurrency pitfalls include:
- Deadlocks: Occur when two or more threads each hold a resource that another needs, leaving all of them blocked indefinitely.
- Race conditions: Happen when the outcome of a program depends on the timing of thread execution, producing non-deterministic results.
Rust's ownership model and type system reduce these pitfalls by enforcing strict rules about how data is shared and mutated across threads. Many concurrency errors are caught by the compiler before they can lead to runtime issues.
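One way to see this enforcement, sketched below: moving data into a spawned thread transfers ownership, so the original binding can no longer be used, and any attempt to race on it is rejected at compile time (the function name `thread_sum` is illustrative):

```rust
use std::thread;

fn thread_sum() -> i32 {
    let data = vec![1, 2, 3];

    // `move` transfers ownership of `data` into the new thread, so only
    // that thread can access it; no other thread can race on it.
    let handle = thread::spawn(move || data.iter().sum::<i32>());

    // println!("{:?}", data); // compile error: `data` was moved above.
    // The borrow checker rejects this before the program can ever run.

    handle.join().unwrap()
}

fn main() {
    println!("sum = {}", thread_sum());
}
```

To share `data` between threads instead of moving it, the type system would force an explicit choice of synchronization, such as the `Arc` and `Mutex` types discussed later in this section.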
22.1.2 Processes and Threads
- Processes: A process is an independent unit of execution with its own memory space. Most operating systems isolate processes from each other, and communication usually occurs through sockets, pipes, or shared memory.
- Threads: A thread is a smaller execution unit within a process, sharing the process's memory space. While sharing data among threads within the same process can be convenient, it also introduces a higher risk of data races if synchronization is not properly managed.
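The process side of this distinction can be sketched with `std::process::Command`: the child below runs in its own address space, and the only channel back to the parent is a pipe carrying its standard output. This assumes a Unix-like system with `echo` on the `PATH`; the helper name `child_echo` is illustrative:

```rust
use std::process::Command;

// Spawn a separate process and capture what it writes to stdout.
// The child cannot touch our memory; the pipe is the only link.
fn child_echo(msg: &str) -> String {
    let output = Command::new("echo")
        .arg(msg)
        .output()
        .expect("failed to run child process");
    String::from_utf8_lossy(&output.stdout).trim().to_string()
}

fn main() {
    println!("child said: {}", child_echo("hello from a child process"));
}
```

Threads, by contrast, need no such channel: since they share the parent process's memory, passing data is as simple as a `move` closure, which is exactly why unsynchronized access becomes a risk.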
Rust helps mitigate these risks with mechanisms such as mutexes and atomic types that ensure safe concurrent access to shared data.
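Both mechanisms can be sketched with a shared counter incremented from several threads. The first version guards the count with a `Mutex`, so only one thread can mutate it at a time; the second uses a lock-free `AtomicUsize`. The function names and parameters are illustrative:

```rust
use std::sync::atomic::{AtomicUsize, Ordering};
use std::sync::{Arc, Mutex};
use std::thread;

// Each of `threads` workers increments a Mutex-guarded counter
// `per_thread` times; the lock serializes every increment.
fn count_with_mutex(threads: usize, per_thread: usize) -> usize {
    let counter = Arc::new(Mutex::new(0usize));
    let handles: Vec<_> = (0..threads)
        .map(|_| {
            let counter = Arc::clone(&counter);
            thread::spawn(move || {
                for _ in 0..per_thread {
                    *counter.lock().unwrap() += 1;
                }
            })
        })
        .collect();
    for h in handles {
        h.join().unwrap();
    }
    let total = *counter.lock().unwrap();
    total
}

// Same count, but with an atomic read-modify-write instead of a lock.
fn count_with_atomic(threads: usize, per_thread: usize) -> usize {
    let counter = Arc::new(AtomicUsize::new(0));
    let handles: Vec<_> = (0..threads)
        .map(|_| {
            let counter = Arc::clone(&counter);
            thread::spawn(move || {
                for _ in 0..per_thread {
                    counter.fetch_add(1, Ordering::Relaxed);
                }
            })
        })
        .collect();
    for h in handles {
        h.join().unwrap();
    }
    counter.load(Ordering::Relaxed)
}

fn main() {
    println!("mutex:  {}", count_with_mutex(4, 1_000));
    println!("atomic: {}", count_with_atomic(4, 1_000));
}
```

Either way the final count is exact, because every increment is synchronized; without `Mutex` or an atomic type, the compiler would refuse to share the counter mutably across threads at all.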