Edge Computing
Processing data closer to where it is generated rather than in a centralized cloud data center.
Also: fog computing
Definition
Edge computing is a distributed computing paradigm that brings computation and data storage closer to the sources of data (the edge of the network), rather than relying on centralized data centers. This reduces latency, conserves bandwidth, improves privacy, and enables real-time processing for use cases such as IoT devices, autonomous vehicles, AR/VR, and industrial automation. Edge computing complements cloud computing: time-sensitive processing happens at the edge, while non-time-sensitive analytics occur in the cloud.
Example
“A factory's quality control cameras use edge computing to detect defects in real time without sending gigabytes of video to the cloud for analysis.”
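The factory example can be sketched in a few lines. This is a minimal illustration with hypothetical function names (`detect_defect`, `process_at_edge` are not from any real library): the edge node runs inference locally and uploads only compact event records, so raw video never leaves the device.

```python
def detect_defect(frame: bytes) -> bool:
    # Placeholder for a local inference model running on the edge device;
    # a real system would run a vision model here.
    return b"defect" in frame

def process_at_edge(frames: list[bytes]) -> list[dict]:
    """Return compact event records; raw frames stay on the device."""
    events = []
    for i, frame in enumerate(frames):
        if detect_defect(frame):
            events.append({"frame": i, "event": "defect"})
    return events

# Simulated camera frames captured on the factory floor.
frames = [b"ok", b"defect:scratch", b"ok"]
uploads = process_at_edge(frames)
# Only the small `uploads` list would be sent to the cloud for analytics,
# instead of gigabytes of raw video.
```

The bandwidth saving comes from the asymmetry: the edge node consumes the full data stream but emits only a summary sized to the events of interest.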
Synonyms
- distributed computing
- fog computing
- near-user processing
Antonyms / Opposites
- centralized computing
- cloud computing
Related Terms
- cloud-computing
- iot
- latency
- cdn
