PC in the cloud? What challenges are streaming services facing?

Cloud streaming of games and access to powerful remote processors is becoming increasingly popular, but many technical challenges persist. Why isn’t the concept of cloud computing new, what challenges will new streaming services face, and will remote computing ever take over?

Why is the concept of cloud computing not new?

Although the concept of cloud computing may seem new, its origins actually date back to the large commercial computers of the 1960s and 1970s. Before we can understand how remote computing became commonplace during this era, we first need to understand why computers were so big and expensive.

For a computer to be practical, it must be able to perform calculations on real-world data for real-world applications. For example, early military computers were used to calculate ordnance tables that allowed ships to accurately hit targets, while some early IBM machines were used to tabulate data for the national census in the United States.

Regardless of the technology used, computers generally become practical for real-world work when they have data buses of at least 16 bits, memories of several hundred MB, and instruction speeds of several million per second. That doesn’t mean 8-bit machines can’t be used in practice; they dominated the personal computer market for years, and while they can be useful for home applications, they are certainly not suited to large-scale computing applications such as scientific research and transaction processing.

So even early computers had to scale quickly to be practical. For example, the IBM System/360, released in 1964, had 32-bit words, a 24-bit address space, and could run at up to 16.6 MIPS. Such computers were built from discrete components, which caused their physical size to grow rapidly. It therefore made economic sense to build one large computer with decent capabilities that could serve many purposes, rather than a smaller system whose reduced capabilities would be insufficient for demanding tasks.

The development of very large mainframe machines made even more sense once engineers developed the concepts of remote computing and time-sharing. It is significantly cheaper to have one large mainframe capable of handling 100 users, each working at a terminal (a screen, a keyboard, and a basic interface), than to try to give all 100 users their own dedicated machines.

This use of remote computing continued for several decades, during which companies purchased large mainframe computers to which employees logged on using small desktop terminals. As computing power improved, these terminals were gradually replaced by desktop computers capable of providing some degree of local processing, while the central computer provided users with additional processing power if necessary. Some mainframes even allowed remote access over telephone networks with the use of a modem, leading some companies to rent out their unused processing power to others under time-sharing schemes.

What challenges will the new streaming services face?

Many companies have recently attempted to bring computing back into mainframes and data centers. For example, Amazon Web Services provides online cloud computing services that scale easily, so that an application requiring more resources can be dynamically assigned them.

Another example is Microsoft Azure, which essentially provides the same services as Amazon Web Services. Google also provides cloud-based software such as Google Docs and Google Sheets, free browser-based alternatives to Word and Excel.

The benefits of cloud computing are numerous: it is accessible from anywhere, it removes the need for powerful local computers, it works across multiple platforms, and it offers the protection of data centers (an unexpected power loss on the user’s side will not result in data loss). Additionally, cloud computing places the responsibility for system maintenance on the data center, and the large number of users pooling their resources provides access to hardware that would otherwise be too expensive to own (server-grade processors, high-end GPUs, etc.).

In fact, Nvidia and Microsoft are exploring the idea of streaming computing resources. For example, Nvidia offers a game streaming service called GeForce Now that allows subscribers to play games remotely on servers hosted by Nvidia. Microsoft is also reportedly developing its own Xbox Anywhere service, which would save customers the hassle of buying a console by letting them access Xbox games remotely on Microsoft servers.

However, remote computing services such as those developed by Microsoft and offered by Nvidia face many challenges. By far the biggest challenge is the quality of the connection between the user and the server.

For non-intensive applications such as word processing, the quality of an Internet connection is rarely an issue because the amount of data exchanged between client and server is minimal. However, an intensive application such as a game requires not only high-quality video streaming but also low-latency input response.
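
To put rough numbers on that difference, here is a short back-of-the-envelope sketch in Python; the sync size, sync frequency, and stream bitrate are illustrative assumptions rather than measurements of any particular service.

```python
# Rough bandwidth comparison: cloud document editing vs. game streaming.
# All figures are illustrative assumptions, not measurements of any real service.

doc_sync_bytes = 4 * 1024        # assume ~4 KB exchanged per document sync
syncs_per_minute = 12            # assume a sync roughly every five seconds
doc_kbps = doc_sync_bytes * 8 * syncs_per_minute / 60 / 1000

stream_mbps = 15                 # assume a 1080p60 stream encoded at ~15 Mbit/s
stream_gb_per_hour = stream_mbps / 8 * 3600 / 1000

print(f"Document editing: ~{doc_kbps:.1f} kbit/s on average")
print(f"Game streaming:   ~{stream_mbps} Mbit/s sustained (~{stream_gb_per_hour:.1f} GB per hour)")
```

Even if the exact figures differ, the gap of several orders of magnitude is the point: a streamed game loads the connection continuously, while a cloud document barely touches it.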

Whereas the typical gamer might expect pings of between 20 ms and 40 ms on a good day when running gaming software locally, relying on a remote server to stream video and carry keyboard and mouse input can make the situation considerably worse. The greater the distance between the player and the streaming server, the more pronounced this effect becomes, so either latency or video quality has to suffer.
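
As a rough sketch of why that matters, the streamed path adds encoding and decoding time on top of the network round trip; the overhead figures below are assumptions chosen purely for illustration, not numbers published by GeForce Now or any other service.

```python
# Illustrative input-to-display latency budget for a streamed game.
# Encode/decode overheads are assumed values for illustration only.

def total_latency_ms(network_rtt_ms, encode_ms=10, decode_ms=10):
    """Add assumed server encode and client decode time to the network round trip."""
    return network_rtt_ms + encode_ms + decode_ms

for rtt in (20, 40, 80):
    print(f"RTT {rtt:>3} ms -> roughly {total_latency_ms(rtt)} ms from input to displayed frame")
```

Under these assumed figures, a distant player starts tens of milliseconds behind a local setup before any in-game network delay is even counted.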

So only those with excellent connections to the server will be able to enjoy these streaming services. Even then, such a service can only handle a limited number of users at a time, and users in the same geographic location are highly likely to play at the same times (i.e., after work, in the evening, etc.). This could cause performance degradation during peak hours, further reducing video quality or increasing input latency.

Will remote computing ever take over?

Although great progress has been made in the field of remote computing, it is unlikely to become the norm for intensive applications. Everyday applications such as word processing benefit greatly from cloud computing, since documents can be accessed anywhere and at any time, but those who enjoy gaming often invest considerable time and energy in their own gaming hardware. Furthermore, trying to support every existing game would be difficult for servers to manage, which could limit remote gaming to a few popular titles while smaller games continue to be played on conventional PCs.

Subscription services, however, could make remote computing more popular as companies look for alternative ways of generating revenue. By keeping all hardware and software on remote platforms, those seeking access must pay monthly fees, which can provide businesses with a healthy source of revenue. Of course, subscription models are widely resented, as they are seen as a way to prevent outright ownership of hardware and software, and abuse of such a revenue model could see customers actively refuse to support remote computing solutions.