Edge Computing for remote and real-time experiences
The term cloud computing refers to computer processing and storage happening on servers in data centres, with the results sent over the network to a user's local machine in response to the user's commands.
This means users no longer need expensive computers to complete complex tasks; instead, a more basic machine can forward inputs to the cloud and receive the results.
Edge computing is a distributed computing paradigm that moves servers closer to the user to reduce latency and save bandwidth. It is a natural progression from content delivery networks.
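To make the latency benefit concrete, here is a rough back-of-envelope sketch in Python of the propagation delay alone (the distances are illustrative assumptions of mine, not figures from any provider):

```python
# Light travels through optical fibre at roughly 200,000 km/s
# (about two-thirds of its speed in a vacuum).
FIBRE_SPEED_KM_PER_S = 200_000

def round_trip_ms(distance_km: float) -> float:
    """Minimum round-trip time in milliseconds for a given one-way distance."""
    return 2 * distance_km * 1000 / FIBRE_SPEED_KM_PER_S

# Illustrative distances (assumptions for the sketch):
print(round_trip_ms(2000))  # distant data centre: 20.0 ms
print(round_trip_ms(50))    # nearby edge node:    0.5 ms
```

Propagation delay is only a floor (routing, queuing and processing add more), but it shows why an edge node tens of kilometres away can respond an order of magnitude faster than a distant data centre.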
Internet connections are now fast enough (with both low latency and high bandwidth) to stream very large amounts of data between a user and a server.
This makes it possible for servers to run ever more complex applications and deliver the visual and audio results to users in high resolution.
An example of this technology in use at the time of writing is Google Stadia, which launched in 2019. It's a subscription gaming platform where game logic and graphics computation happen on edge servers, and the results are streamed to the user's device of choice (a smartphone, tablet, computer or TV), which becomes essentially a screen with a game controller. Visual quality currently goes up to 4K (with 8K planned), along with 5.1 surround sound. Quality degrades on slower internet connections, down to a minimum recommended connection of 10 Mbps for 720p video at 60 fps with stereo sound.
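That 10 Mbps floor for 720p60 becomes more interesting once you work out the uncompressed bitrate. The sketch below is my own arithmetic (the only figure taken from Stadia is the 10 Mbps minimum); it shows how aggressive the video compression must be:

```python
# Uncompressed bitrate of a 1280x720, 60 fps, 24-bit-colour stream,
# compared with Stadia's minimum recommended 10 Mbps connection.
width, height, fps, bits_per_pixel = 1280, 720, 60, 24

raw_bits_per_second = width * height * bits_per_pixel * fps
raw_mbps = raw_bits_per_second / 1_000_000
print(f"raw: {raw_mbps:.0f} Mbps")  # raw: 1327 Mbps

min_connection_mbps = 10
compression_ratio = raw_mbps / min_connection_mbps
print(f"compression needed: ~{compression_ratio:.0f}:1")  # ~133:1
```

Modern video codecs routinely achieve ratios like this on game footage, which is what makes streaming an interactive 60 fps picture over a consumer connection feasible at all.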
Other comparable services to Google Stadia include Sony’s PlayStation Now, Nvidia’s GeForce Now, Amazon’s Luna, and Microsoft’s xCloud.
What interests me most here is the lower hardware requirements for a user to run a wide range of complex applications in real time.
At the time of writing, high-end VR experiences need to be run on expensive PC hardware, mostly with a tethered connection to the headset for best results. Facebook's Oculus Quest series of headsets are standalone devices with onboard Qualcomm processors running a version of the open-source Android operating system. The Quest devices are capable of running impressive VR experiences, but to achieve the quality of a PC VR experience one needs to tether the device to a PC with a USB-C cable for the duration of the experience, with the PC handling the software and graphics processing and delivering the visuals and sound to the headset.
It is also possible to do this wirelessly within one's home, streaming the data from a PC to the Quest headset over a 5 GHz Wi-Fi router (it is recommended that the PC connect to the router via an ethernet cable). A popular app for achieving this is Virtual Desktop. This lets the user exploit the full power of a gaming PC in their home while experiencing VR untethered, avoiding the annoyance of becoming tangled in wires.
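One way to see why latency matters so much more for streamed VR than for streamed video: every frame must be rendered, encoded, transmitted and decoded within the display's refresh interval. The sketch below uses 72 Hz and 90 Hz as example refresh rates (assumptions for illustration, not a specification of any particular headset):

```python
# Per-frame time budget for streamed VR at a given display refresh rate,
# and what remains for rendering and encoding after network round-trip time.
def frame_budget_ms(refresh_hz: float) -> float:
    """Total time available per frame, in milliseconds."""
    return 1000 / refresh_hz

def remaining_ms(refresh_hz: float, network_rtt_ms: float) -> float:
    """Time left for render + encode + decode after the network round trip."""
    return frame_budget_ms(refresh_hz) - network_rtt_ms

print(f"{frame_budget_ms(72):.1f} ms budget at 72 Hz")   # 13.9 ms
print(f"{frame_budget_ms(90):.1f} ms budget at 90 Hz")   # 11.1 ms
print(f"{remaining_ms(72, 5):.1f} ms left with 5 ms RTT")
```

A 20 ms round trip to a distant data centre would consume the entire 72 Hz budget, whereas a few milliseconds over local Wi-Fi (or to a nearby edge node) leaves most of the frame time free, which is why home streaming works and why edge servers are the plausible path to streaming VR more widely.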
The possibility of a user not requiring a high-end gaming PC to run high-end VR experiences is very interesting and exciting. I imagine that within a relatively short time, perhaps a few years, similar VR streaming services will become available.
Further to this, 5G enables higher bandwidth and lower latency mobile connections.
As augmented reality devices become more widely available, I predict these technologies will enable lighter, cheaper devices capable of wirelessly running experiences powered by edge computing. I can really start to imagine a world augmented with complex data and experiences, available on demand and reacting in real time to user inputs.
References
Hamilton, Eric (27 December 2018). "What is Edge Computing: The Network Edge Explained". cloudwards.net. Retrieved 12 March 2020.