Parallel Computing
Using multiple processors simultaneously to solve a computational problem faster.
Definition
Parallel computing is a type of computation in which many calculations or processes are carried out simultaneously across multiple processors or cores. It exploits the concurrency inherent in many large problems by dividing the work into smaller tasks that can be executed at the same time. Parallel computing is essential for high-performance computing applications such as weather simulation, molecular dynamics, deep-learning training, and scientific research.
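A minimal sketch of this idea in Python, using the standard-library multiprocessing module: the work (squaring each number) is divided into independent tasks, and a pool of worker processes executes them in parallel. The function names `square` and `parallel_squares` are illustrative, not from any particular library.

```python
from multiprocessing import Pool

def square(n):
    # One small, independent task; each worker process runs these concurrently.
    return n * n

def parallel_squares(numbers, workers=4):
    # Divide the work across a pool of worker processes.
    # Pool.map scatters the inputs to the workers and gathers
    # the results back in the original order.
    with Pool(processes=workers) as pool:
        return pool.map(square, list(numbers))

if __name__ == "__main__":
    print(parallel_squares(range(8)))
```

For tasks this small, the overhead of starting processes outweighs the speedup; in practice, parallelism pays off when each task carries substantial computation.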
Example
“Training a large language model on billions of text examples would take centuries on a single processor; by distributing the computation across thousands of GPUs working in parallel, researchers reduce training time to weeks.”
Synonyms
- concurrent computing
- multi-processor computing
- distributed computation
Antonyms / Opposites
- sequential computing
- serial computing
Related Terms
- GPU Computing
- Distributed Systems
- Algorithm
- Machine Learning
