What are edge and fog data centers?
Ever-changing technology – for example, mobile phones, streaming video services, robotic surgery, and self-driving cars – requires time-sensitive data exchange. A recent Gartner report predicts that by 2025, 85% of corporate and organizational infrastructure strategies will use “on-premises, colocation, cloud, and edge delivery options, compared with 20% in 2020.”* As billions of devices connect to data-intensive applications like the Internet of Things (IoT), 5G mobile technology, and artificial intelligence (AI), cloud servers become overwhelmed with data, creating long processing queues. This data deluge, combined with the cloud’s distance from the devices, creates both bandwidth and latency issues. These challenges have given rise to edge and fog data centers: distributed cloud technology physically located close to end users in order to deliver lightning-fast service with minimal latency.
The instant exchange of information among devices is mission-critical to many services. For example, imagine this scenario:
A self-driving vehicle reaches a busy intersection just as the traffic light turns from red (STOP) to green (GO). At the same time, a pedestrian is crossing the street in front of the oncoming car, taking no heed of the changing light. The vehicle must respond at once, not only to the traffic light but also to anyone its sensors detect in its vicinity. The vehicle’s technology must process all of this information in a split second to make a proper response that ensures the safety of everyone at the intersection. Human lives depend on this technology working in real time.
Consumers are not the only ones who benefit from distributed cloud technology. Industries such as oil and gas, renewable energy, healthcare, and the military are just beginning to harness the power of edge and fog computing for operations where even a delay of one or two seconds can be detrimental – even catastrophic.
Although edge and fog data centers do not circumvent the need for origin servers, they address the cost, latency, and security concerns associated with moving massive amounts of data back and forth across a network. Both edge and fog computing:
- decrease network and internet latency
- improve system response time in remote mission-critical applications
- reduce the amount of data sent to the cloud
Comparing edge and fog
Like edge data centers, fog data centers facilitate computing that pushes data to platforms found close to the data’s origin (e.g., interactive screens, motors, pumps, sensors). In order to prevent latency issues, both edge and fog data centers perform computation tasks ordinarily conducted in a far-away cloud. Edge and fog, however, are different architectures.
The idea of edge computing – the need to compute close to the data source – has been around for decades. As cloud technology evolved, industries quickly learned that work done at the edge of the network instead of in the cloud is more efficient and lowers costs. In edge computing, relevant data is processed and stored where work is done – on the device or sensor itself without being transferred anywhere – in order to instantly complete the task at hand. Edge computing, however, produces massive volumes of data that must be transferred to the cloud for more complex processing (e.g., analytics, trend analysis). And that is where fog computing comes in. Essentially, fog acts as a bridge between the edge and the cloud.
Fog computing also operates near the edge of the network – between the edge devices and the cloud – receiving data gathered from the endpoints (the edge) and processing it in nodes situated within the local area network (LAN). A fog server is essentially a small copy of the larger cloud: data from the edge is first processed in the fog, and only then does it travel to the main cloud server. Fog computing moves any information not needed by the edge process to the cloud, which handles more complex tasks that are not necessarily mission critical.
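The tiered division of labor described above – time-critical work at the edge, local aggregation in the fog, heavyweight analytics in the cloud – can be sketched in a few lines of code. This is only an illustration of the routing idea; the class, field, and threshold names here are hypothetical, not any real API.

```python
# Illustrative sketch of edge -> fog -> cloud tiering.
# SensorReading, latency_critical, and the 100.0 threshold are all
# hypothetical names chosen for this example.

from dataclasses import dataclass


@dataclass
class SensorReading:
    device_id: str
    value: float
    latency_critical: bool  # must be acted on immediately, on the device


def route_reading(reading: SensorReading) -> str:
    """Decide which tier should process a reading.

    Edge:  time-sensitive control decisions on the device itself.
    Fog:   local aggregation and filtering within the LAN.
    Cloud: routine data for long-term storage and trend analysis.
    """
    if reading.latency_critical:
        return "edge"  # act locally; no network round trip
    if abs(reading.value) > 100.0:  # hypothetical "needs attention" threshold
        return "fog"  # aggregate nearby before anything is forwarded
    return "cloud"  # routine data for complex, non-urgent processing


print(route_reading(SensorReading("pump-7", 3.2, True)))  # -> edge
```

The key point the sketch captures is that latency, not just data volume, drives where each piece of work runs.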
For example, if millions of people around the world want to play a particular song on a music streaming service at the same time, and that song lives on a server in the United States, processing those requests would create a long queue and slow service to consumers. Fog servers located close to the requesting devices can instead stream the song immediately to the end device (the edge) with minimal latency. Each fog node maintains a copy of the song, enabling other consumers in the same geographic area to stream it instantaneously as well.
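The caching behavior in the song example follows a common pattern: serve from the fog node’s local copy if one exists, otherwise fetch from the distant origin once and keep the result for nearby listeners. A minimal sketch, in which `fetch_from_origin` is a hypothetical stand-in for a slow, long-haul request to the cloud server:

```python
# Minimal sketch of the fog-node caching pattern from the song example.
# song_cache and fetch_from_origin are illustrative names, not a real API.

song_cache: dict[str, bytes] = {}  # this fog node's local copies


def fetch_from_origin(song_id: str) -> bytes:
    # Placeholder for a slow request to the distant origin server.
    return f"audio-bytes-for-{song_id}".encode()


def stream_song(song_id: str) -> bytes:
    """Serve a song from the fog node, falling back to the origin once."""
    if song_id not in song_cache:
        # Only the first listener in this region pays the long-haul cost.
        song_cache[song_id] = fetch_from_origin(song_id)
    return song_cache[song_id]  # later nearby listeners are served locally
```

After the first request populates the cache, every subsequent request from the same geographic area is answered without touching the origin server at all – which is precisely how the fog layer keeps latency minimal.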
Challenges and transformations
Physically, because they are distributed across geographic regions and each serves a limited number of devices, edge and fog data centers have a much smaller footprint than traditional data centers. The two architectures work in unison to 1) supply instant service and 2) optimize cloud-based systems by preventing data bottlenecks and offloading the strain that a data deluge places on traditional data centers. These operations present challenges that consumers should expect edge and fog data center vendors to address:
- performance (low latency, high fidelity, and speed)
- scalability (the ability to upgrade without major architectural and capacity changes)
- endurance (surviving varying temperature ranges and vibration in the field)
- maintainability
- security (encryption of data before it leaves the edge and fog)
Advanced fiber optic technology is the lifeblood of edge and fog data centers, serving as the information highway behind transformations across industries. These transformations allow consumers, applications, and devices to perform mission-critical tasks in real time that were not possible before. Edge and fog data centers use fiber cables to push the intelligence, processing power, and communication capabilities of an edge gateway or appliance directly into devices such as PLCs (programmable logic controllers), PACs (programmable automation controllers), and especially EPICs (edge programmable industrial controllers).
To work optimally at the edge of a network, the architecture must above all be robust – fully transforming data at the edge while managing all of the various data types that exist. It should be able to work autonomously while still running seamlessly from the edge to the fog to the cloud.
For more information on our Data Center and Network Solutions, please visit Data Center Solutions | Networking Technology in Data Tech (sumitomoelectriclightwave.com).
*Gartner, The Everywhere Enterprise: A Gartner Q&A with David Cappuccio, 2020