Monday, 2 July 2012

Why game video streaming is too expensive

The news of Gaikai being acquired by Sony reminded me that I needed to write up this post.

Don't expect to be playing lots of cheap games on Gaikai after the Sony acquisition. The economics of streaming game video are tough.

You can buy time on a virtual PC cheaply from a myriad of cloud companies: at Google I/O, Compute Engine was announced offering a basic virtualised CPU at 14.5c per hour.

But this cost needs to be qualified when thinking about a game streaming service.

Firstly, a virtualised CPU is shared between lots of customers' processes. Whilst you get the average performance you paid for, peak performance varies greatly. That makes for occasional stuttering that doesn't matter in a business application, but would be bad in an FPS game.

Next up is the impact of GPU performance. These services render the 3D on the server, which means the server needs a GPU to do so, and most commodity servers just don't have GPUs with any oomph. Fortunately, Nvidia's announcement of a virtualisable GPU for servers will help, but as yet the costs and loadings aren't visible.

It's worth remembering, too, that a cloud-hosted game demands more of the CPU/GPU than the same game does when running on a home PC: it must also compress the game's video output with very low latency, so it's small enough to send to the consumer.
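
To put a rough size on that encode burden, here's a back-of-envelope sketch for a 720p60 stream. Every figure in it (frame rate, bit depth, the ~5 Mbit/s target bitrate) is an illustrative assumption of mine, not a measured number:

```python
# Back-of-envelope: the compression a 720p60 game stream needs.
# All figures are illustrative assumptions, not measurements.

width, height = 1280, 720        # 720p frame
fps = 60                         # frames per second
bits_per_pixel = 24              # uncompressed RGB

raw_bps = width * height * bits_per_pixel * fps
target_bps = 5_000_000           # assume a ~5 Mbit/s consumer stream

print(f"raw video:  {raw_bps / 1e9:.2f} Gbit/s")
print(f"target:     {target_bps / 1e6:.0f} Mbit/s")
print(f"ratio:      {raw_bps / target_bps:.0f}:1")
print(f"per frame:  under {1000 / fps:.0f} ms to stay real-time")
```

A roughly 265:1 squeeze, done in well under 17 ms per frame, is why the encoder eats into the same CPU/GPU budget as the game itself.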

So owning their own servers (or doing deals to put GPUs into third-party racks) isn't too bad?

Well, it would be, but there's a compounding issue: the speed of light. Unlike ordinary cloud services, games need fast response times, too fast for servers to be located far from their users. That means that whoever owns Gaikai needs to equip enough servers close to their customers to cover PEAK loads.
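
The physics is easy to sketch (a minimal illustration; the ~200,000 km/s figure for light in fibre is approximate, and real routes add routing and queueing delay on top of this lower bound):

```python
# Best-case round-trip propagation delay to a server over fibre.
# Assumes light travels at roughly 2/3 c in glass; real networks
# are always slower than this idealised figure.

FIBRE_KM_PER_MS = 200.0  # ~200,000 km/s, i.e. about 200 km per millisecond

def round_trip_ms(distance_km: float) -> float:
    """Lower bound on round-trip time for a server distance_km away."""
    return 2 * distance_km / FIBRE_KM_PER_MS

for km in (100, 500, 1000, 5000):
    print(f"{km:>5} km: {round_trip_ms(km):5.1f} ms before any encode/decode time")
```

A server 5,000 km away burns 50 ms on propagation alone, before encoding, decoding and display are even counted, so distant datacentres simply aren't an option.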

So, around Christmas, when perhaps 50% of PS3s might be in use during peak hours, there would need to be enough GPU-equipped server capacity close to the customers to serve them all.

It's very hard to get to a final answer, as a lot depends on the deals you can do and the scale you can reach, but in my opinion it'll cost well above $1/hr to provision the service at 720p, and maybe twice that again if you want to go to 1080p.
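
For what it's worth, here is the shape of the arithmetic behind that guess. Every single number below is an assumption I've picked for illustration (server price, streams per GPU, utilisation, overhead multiplier), so treat the output as a sketch, not a quote:

```python
# Toy model of the cost of one 720p stream-hour on owned hardware.
# Every figure here is an illustrative assumption, not a real price.

server_cost = 5000.0        # $ for a GPU-equipped server
lifetime_years = 3          # amortisation period
streams_per_server = 4      # e.g. a GRID-class GPU at 720p
utilisation = 0.12          # average load when capacity is sized for Christmas peak

lifetime_hours = lifetime_years * 365 * 24
capex = server_cost / (lifetime_hours * streams_per_server * utilisation)

overhead_multiplier = 3.0   # power, cooling, rack space, bandwidth, staff...
print(f"~${capex * overhead_multiplier:.2f} per 720p stream-hour")
```

The killer term is utilisation: capacity sized for the Christmas peak sits idle most of the year, and that idleness is what the consumer ends up paying for.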

That has implications: if the average AAA title has 20 hours of gameplay, then that's a big dent in the game's economics. Unless you believe the consumer will be happy to pay an extra $20 per game to save on buying a new console?



3 comments:

  1. I believe a lot of this is exaggerated. When running the games on the same hardware, the games can be highly optimized and even designed to share resources... for instance, if you have 20 people playing the same game off the same server, you only need to load textures and whatnot into memory once. As for compressing it, dedicated hardware can easily handle this, and Sony could easily build their own. AFAIK OnLive just encodes in H.264 for streaming, which wouldn't be all that difficult.

    As for latency, I get around 6-16 ms most of the time to any decent host. Latency correction exists and is used all the time anyway, so lag shouldn't be too much of an issue.

    Long story short: If OnLive can do this on their own, why can't Sony?

    I do hope that they just offer this as a service, though, and that it's not for all games. Although one awesome advantage would be that your console wouldn't really become outdated: your console becomes more or less a "Thin Client" to their data-centers, which can constantly be upgraded to improve game performance without the need for a new console. Too bad the companies are corrupt and will likely make you buy the game for $60+, and they can take it away if they feel like it.

  2. My point is not that it can't be done; it can. The issue that isn't talked about much is cost. If servers with big GPUs become standard and easy to buy in the cloud, then it will get cheaper, but at the moment, fitting big GPUs into existing datacentres in big enough quantities to deal with peak play is going to make the acquisition cost look small by comparison. After all, "just encoding in H.264" depends on a GPU to achieve the required latency.

    I agree that smart coding will improve utilisation on dedicated hardware as you suggest, so you can maybe get more than one player per server core. However, that shiny Nvidia Kepler GPU is only going to render maybe 4 simultaneous players at something like next-gen quality, and fewer at 1080p.

  3. I'm researching the costs of adding GPU+CPU configurations to datacenters at NAPs and making them profitable. Right now it seems pretty hard, since the choice of GPUs is limited to NVIDIA only, and their GeForce GRID GPU seems to be pretty big, expensive and limited to only 4 game streams per GPU (however, that could be optimized further with low-end games). Also, I wonder whether the GeForce GRID takes 2U in a rack or not. That would kill the expandability of the service...
    With that being said, I do think that with heterogeneous computing set to appear very soon, it could be feasible to do game video streaming (although not high end) on the cheap.
    A GPU tightly integrated into the CPU could offer even lower latency on the encoding and could save space in a server's rack units.
    Also, maybe there are other ways of doing cloud gaming that we have not yet tried, like localized cloud computing in games (i.e. only render some heavy data in the cloud, while the rest is rendered locally).
    There are a lot of interesting solutions.
