“Isn’t the point of cloud gaming not to have a good GPU”

This was a comment I got on a recent post (here), followed by a winky smiley face – smugly suggesting that I’d missed the point.

My initial reaction was to overreact, but then I remembered that I’d agreed to always call my VP of Marketing before I replied to comments and she talked me down from my tree. It’s a system that works for us.

Being able to stream a massive game or application to a potato device is, indeed, one very small and niche use case of the concept of cloud gaming (or rather the underlying concept of remote rendering streamed to end-users as video). It can be very interesting, and it has its uses if you can afford to spend $1/hour on streaming your content. And it can be spun into a compelling sales & marketing pitch by those who don’t quite understand the realities. It is also utterly and completely dead in the water, and not where the world is going, in any number of ways.

By reducing the whole concept of reimagining why you would want to run games and other applications in the cloud to essentially “doing what you’ve always done, but with Google paying for it and you able to play Doom on a potato laptop”, you have narrowed your horizons to the point where you miss the actual point.

The whole point of Cloud Gaming is the big Whole Point

It’s doing things that you could not do with locally installed applications, at a level of scale, reach and immediacy that is unimaginable even today. Today a new free MMO set in the Call of Duty universe can get 15M people trying it in 4 days, and we will look back and think that number is cute one day. That is when we’ve unleashed the true power of the cloud.

Andreessen Horowitz didn’t lead an early investment round into cloud-only MMO start-up Mainframe Industries on a whim. And you can be sure SoftBank did not lead $500M into Improbable because ‘the whole point of cloud gaming is that you don’t need a GPU’. Epic is not doing what it is doing with Unreal Engine because it believes the future is row upon row of soon-to-be-obsolete console chipsets soldered into blade servers (which, by the way, is how Stadia, PlayStation Now and xCloud work – that kit has zero other use).

If you pay attention and join up the dots, these industry moves are all leading indicators of what’s coming. But there are other markers out there if you look closely.

A key one is the oddly quiet rise and rise of the consumer GPU. Public and even industry perception is lagging massively behind device shipments. Most people assume you need a gaming PC or console to game. That’s not true anymore – Intel have been working so hard to defend against AMD and Nvidia that they are effectively giving graphics away for free. Check out this detailed analysis: https://www.techspot.com/review/1951-intel-core-i7-1065g7-iris-plus-graphics/

Here’s a good example: the detail of last year’s integrated CPU/GPU products, like you’d find in a normal laptop in Best Buy, PC World, MediaMarkt, wherever you live. The interesting thing is the TFLOP performance at >1. That doesn’t seem like much – but the original Xbox One had ~1.3 TFLOPs, and that can still play some very, very nice games. The point? Intel just gave you a free basic Xbox One when you bought that laptop – just look at how much of the chip is physically graphics!

Things get even better in the mobile space where Apple have set the pace so, so hard that frankly miracles are happening in terms of price / performance ratios for graphical processing on low power mobile devices. This is particularly important because these ARM-based silicon designs filter through to more price sensitive BOMs like those of TVs over time.

What has generally been missed is a quiet revolution 

The upward spiral of the power of consumer GPUs. Recently Apple released a new MacBook Air – for years a byword for poor graphics performance traded off against thinness and lightness. Not any more: leading the pitch for why you should buy it was 80% better GPU performance than the last one, which only came out 18 months ago.

Short version – the potato device will very soon be a thing of the past. Consumer and manufacturer habits are changing that every day. Soon most devices with a screen will be able to render most things in real-time 3D. Sure, not everything is going to be a 4K @60FPS on ultra settings monster machine overnight, but the direction of travel is very very clear. More consumers and business devices with better GPUs, year after year after year.

So what? Our argument is – let’s use those ever-better consumer GPUs! Polystream has been working for a few years on a new technology called ‘command streaming’. It lets you run your application or game in the cloud and render it locally, making use of the GPU in the client device by streaming the graphics commands from the cloud in real-time. This lets you harness the true power of the cloud to build incredible experiences, but without the horrendous capex and opex costs of having to use cloud GPUs to stream the results to end-users as video.
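To make the contrast with video streaming concrete, here is a toy sketch of the general idea of command streaming. Polystream’s actual protocol and wire format are not public; the command names, the JSON encoding and the length-prefixed framing below are all invented for illustration, standing in for whatever a real implementation would use.

```python
import json
import struct

def serialize_commands(commands):
    """Server side: pack one frame's worth of draw commands into bytes."""
    payload = json.dumps(commands).encode("utf-8")
    # Length-prefix each frame so the client can split a continuous stream.
    return struct.pack(">I", len(payload)) + payload

def deserialize_commands(stream):
    """Client side: unpack one length-prefixed frame, return the remainder."""
    (length,) = struct.unpack(">I", stream[:4])
    payload = stream[4:4 + length]
    return json.loads(payload.decode("utf-8")), stream[4 + length:]

class LocalRenderer:
    """Stand-in for the client's local GPU driver: replays the commands."""
    def __init__(self):
        self.executed = []

    def replay(self, commands):
        for cmd in commands:
            # A real client would translate these into Direct3D/Vulkan
            # calls on the local GPU; here we just record each operation.
            self.executed.append((cmd["op"], cmd.get("args", [])))

# One simulated frame: the cloud-side application issues its draw calls...
frame = [
    {"op": "clear", "args": [0.0, 0.0, 0.0, 1.0]},
    {"op": "bind_texture", "args": ["brick_wall"]},
    {"op": "draw_indexed", "args": [36]},
]
wire = serialize_commands(frame)            # bytes sent over the network
commands, rest = deserialize_commands(wire)

renderer = LocalRenderer()
renderer.replay(commands)                   # rendered on the *client's* GPU
print(len(wire), len(renderer.executed))
```

The key property the sketch illustrates: what crosses the network is a small description of *what to draw*, not an encoded video frame of the result, so the heavy rendering work lands on the consumer GPU the previous paragraphs argue is now everywhere.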

The whole point of Polystream is that you can stream interactive real-time 3D without a GPU in the cloud!