Now See Live 3D Worlds on Your Pinkie


We’re starting to see a new crop of "Mobile 3D" solutions hit the market. So let’s take a minute to see what’s up, what’s coming, and what’s still hype.

For anyone who read the "Second Life on iPhone" stories or saw a demo video, the key takeaway is that the 3D graphics are not being rendered on the iPhone itself. They’re being rendered on a beefy server somewhere else and delivered essentially as compressed video.
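That server-side loop is simple to sketch. Here is a minimal, hedged illustration of the architecture described above; the function names and string payloads are stand-ins, not anyone's real API:

```python
# Minimal sketch of the remote-rendering-as-video loop: the server renders
# and compresses every frame, and the phone only decodes video.
# All names here are illustrative, not from any real product.

def render_frame(scene, frame_no):
    # Stand-in for a full GPU render on the server.
    return f"pixels({scene}, {frame_no})"

def encode(frame):
    # Stand-in for video compression (e.g. an H.264 encoder).
    return f"compressed[{frame}]"

def serve_stream(scene, n_frames):
    """Server side: render each frame, compress it, ship it down."""
    return [encode(render_frame(scene, i)) for i in range(n_frames)]

# The client never touches the 3D scene graph at all.
stream = serve_stream("second_life_region", 3)
print(stream[0])
```

The point of the sketch is the cost structure: unlike serving a file, the `render_frame` + `encode` work has to happen continuously, per user, for as long as they play.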

Now, this is not the simplest thing in the world to pull off. Kudos are definitely due to anyone who can minimize the extra round-trip latency on your controls, improve the UI, encode that video stream cheaply and cleanly, and work out a server co-location deal that doesn’t cost an arm and a leg.

But there’s the rub. Put a single 1U quad-core rendering server in a managed hosting facility (assuming they even support your choice of hardware) and it could easily cost you $1 to $5 per hour, every hour. Without some serious R&D expenditures and some really hot hardware, you can probably serve only one rendered video stream per server at any given time. So you need at least as many servers as your peak customer load, which could be thousands or many thousands if you’re going to make a business of it, and even cheap hosting might still require you to recoup $1 per server-hour. How many people are willing to pay an extra $1 per hour to play a laggy(-er) compressed version of Second Life or World of Warcraft on their iPhone? Any guesses?
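The back-of-envelope arithmetic is worth making explicit. Taking the figures from the paragraph above (one stream per server, $1/hour at the cheap end) and a hypothetical peak of 5,000 concurrent users:

```python
# Back-of-envelope server cost, using the assumptions from the argument
# above. The 5,000-user peak is a hypothetical "many thousands" figure.

hourly_cost_per_server = 1.00   # USD, low end of the $1-$5 range
peak_concurrent_users = 5000    # hypothetical peak load
streams_per_server = 1          # one rendered video stream per box

servers_needed = peak_concurrent_users // streams_per_server
monthly_cost = servers_needed * hourly_cost_per_server * 24 * 30

print(servers_needed)   # 5000 servers at peak
print(monthly_cost)     # 3,600,000 dollars per month, before bandwidth
```

Even at the cheapest end of the range, the hosting bill alone dwarfs what mobile users have ever shown a willingness to pay.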

Keep in mind, it’s not like serving web pages, where a single server can handle thousands of simultaneous requests because the payloads are so small and relatively infrequent. It’s more like YouTube’s hosting problem, except not even YouTube has to continuously encode each video frame they send you. They do that once, ahead of time, and just serve the files up as fast as they can — they’re probably limited by bandwidth, not CPU.

This approach is not entirely new. SGI worked on remote rendering and even got a patent based on the work of a former colleague of mine (who is now at Google). People always want more rendering power than they can currently get on their local machine, and using big iron is the only way to cheat Moore’s Law (barring time travel, that is). And remote rendering is common in the hotel business, if you’ve ever seen those video games you can play on your in-room TV.

What I’d personally like to see is something really new. Let’s build the grid computing farm that speeds up key parts of our 3D rendering pipelines. I mean, why send finished compressed video at all? The mobile client can render polygons, flattened to 2D if need be. It just can’t do too many things at once, so choose carefully.

It may in fact be more efficient to send compressed polygons, pre-transformed, occlusion-culled, and lit, and treat the local graphics hardware as a cool form of device-independent 3D video decompressor. Hell, that would even work in Flash. And the benefit is that the client side could run on generic CPUs. Dedicated 3D hardware is only essential when you’re doing the full render.
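To make the split concrete, here is a toy sketch of that division of labor, under assumptions of my own (a trivial perspective projection, one directional light, a single triangle); none of these names come from any real engine. The server does the per-vertex math and bakes lighting into a flat color, so the client receives nothing but 2D points to fill:

```python
# Toy "server bakes, client rasterizes" split. The server projects and
# lights each triangle; the client gets flat 2D geometry plus a color.

def project(v, cam_z=5.0):
    # Server side: simple perspective projection of a 3D vertex to 2D.
    x, y, z = v
    s = cam_z / (cam_z + z)
    return (x * s, y * s)

def normal(tri):
    # Face normal via the cross product of two edges, normalized.
    (ax, ay, az), (bx, by, bz), (cx, cy, cz) = tri
    ux, uy, uz = bx - ax, by - ay, bz - az
    vx, vy, vz = cx - ax, cy - ay, cz - az
    n = (uy * vz - uz * vy, uz * vx - ux * vz, ux * vy - uy * vx)
    mag = (n[0] ** 2 + n[1] ** 2 + n[2] ** 2) ** 0.5
    return tuple(c / mag for c in n) if mag else (0.0, 0.0, 0.0)

def bake_triangle(tri, light=(0.0, 0.0, -1.0)):
    """Server side: transform + light, emit a flat 2D triangle + color."""
    pts = [project(v) for v in tri]
    n = normal(tri)
    brightness = max(0.0, -(n[0] * light[0] + n[1] * light[1] + n[2] * light[2]))
    return {"pts": pts, "brightness": brightness}

# The client receives only 2D points and a brightness -- no matrices,
# no culling, no per-vertex math. A generic CPU (or Flash) can fill that.
tri3d = [(0, 0, 0), (1, 0, 0), (0, 1, 0)]
packet = bake_triangle(tri3d)
print(packet)
```

The design choice is the same one video codecs make: spend the expensive math once upstream, and leave the receiver with only the cheap, uniform work.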

While we’re at it, let’s offload the shared physics calculations (which is what SL mainly represents) to the servers. Let’s offload the animation. Let’s offload whatever individual bits and pieces of the mobile app are slowing us down.
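The physics offload follows the same pattern. A minimal sketch of server-authoritative physics, with an integration step and snapshot rate that are purely my own illustrative assumptions: the server integrates the shared state, and the client only interpolates between snapshots.

```python
# Sketch of server-authoritative physics: the server integrates shared
# state; clients blend between snapshots. Euler integration and the
# tick rate here are assumptions for illustration only.

def step(pos, vel, dt, gravity=-9.8):
    # Server side: trivial Euler integration of one falling object.
    vel = vel + gravity * dt
    pos = pos + vel * dt
    return pos, vel

def interpolate(a, b, t):
    # Client side: blend between the last two server snapshots.
    return a + (b - a) * t

pos, vel = 100.0, 0.0
snapshots = []
for _ in range(3):                 # server ticks at, say, 10 Hz
    pos, vel = step(pos, vel, 0.1)
    snapshots.append(pos)

# The client renders a frame halfway between the last two snapshots,
# doing only one multiply-add instead of any physics at all.
frame_pos = interpolate(snapshots[-2], snapshots[-1], 0.5)
print(frame_pos)
```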

But whatever solution we try, let’s also make it future-proof. A Flash client that offloads geometry transformation & lighting to a server would be cool and unique today, but what about in a year or two, when Flash is fully hardware accelerated?

Remember, the same hardware we put in those servers today will be available on mobile phones in 2-4 years. In fact, some of the best "mobile 3D" solutions companies are simply riding that curve, doing their software R&D now so when the phones are ready and popular, they won’t even need the servers.

But what would happen if Intel or AMD or NVidia decided to do the "grid" thing with enough GPUs? Would they even think about cannibalizing their consumer products near-term, or might there be a way for such a service to help sell even more cheap (to them) silicon, and to more devices than they reach today?

Well, Intel seems to be partnering with Comverse on the mobile Second Life demo, so maybe they’re already heading that way.

 

  1. #1 by chris adams on June 14, 2008 - 9:23 pm

    I have to question the assumption that people wouldn’t want to use the built-in hardware GL on an iPhone. The demos in the keynote suggest that it’d be as good as current portable game systems or the desktop hardware that launched things like MMORPGs.

  2. #2 by avi on June 14, 2008 - 9:33 pm

    It’s not that people wouldn’t want to use the built-in hardware. I’m saying some people will use it, perhaps even anyone who is writing directly for those platforms.

    But different people will have different ways of adapting content that was originally made for consoles and PCs. Some will use server acceleration. Some will simplify and optimize the graphics to meet the requirements of a smaller system. And some will wait until the mobile hardware catches up.


