Do virtual globes distort the Earth?


Ogle Earth: A blog about Google Earth

Stefan raised an interesting question about whether the 3D globe on your desktop (say, Google Earth) is a truly accurate portrayal of the real world, or whether computer graphics introduce distortion of their own. The answer is that, setting aside sampling quality (e.g., of the satellite and aerial photos) and output image quality (from your 3D hardware), the 3D math forms a near-perfect reconstruction of reality, better, IMO, than any 2D map can manage.

The thing about 2D maps, useful as they are, is that they are by definition flat. The Earth is roughly spherical (bulging somewhat at the equator and flatter at the poles). Any 2D projection must therefore either stretch certain parts of the world or require cuts or white space to accommodate the curvature. Different 2D maps are better at representing different things. But one of the most common map projections, the Mercator, used in many classrooms, is often criticized for making land masses in the far north and south appear larger than they really are. It’s why we often think that North America is so much bigger than Africa, for example. But even maps that cut the image distort to some extent, because the uncut areas are still treated as flat.
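The Mercator stretch is easy to quantify: the projection scales lengths by the secant of the latitude, so areas grow by its square. A quick sketch (the function name is mine, just for illustration):

```python
import math

def mercator_scale(lat_degrees):
    """Local scale factor of the Mercator projection at a given latitude:
    lengths are stretched by sec(latitude); areas by its square."""
    return 1 / math.cos(math.radians(lat_degrees))

# At the equator there is no stretch; at 60 degrees north, lengths
# double, so areas are inflated fourfold relative to the equator.
print(mercator_scale(0))   # 1.0
print(mercator_scale(60))  # 2.0 (length), i.e. 4x the area
```

That quadrupling at 60°N is exactly why high-latitude land masses look so oversized on classroom Mercator maps.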

With 3D virtual globes, we’re mapping a roughly spherical shape to a roughly spherical shape. Those sorts of errors go away and we can get a much more faithful reproduction of reality, much as we would see if we flew to any given spot. That’s not always the point with mapping, but the question was one of distortion, not utility.

The key idea is that 3D graphics works very much like simplified holography. You can imagine there is a virtual photographic plate or screen somewhere in the virtual world, with a virtual eye sitting behind it, representing your POV. The process of “rendering” is about capturing light from the virtual world that hits this virtual plate that happens to be aimed directly at that virtual eye. As long as the spatial relationship between the virtual eye and plate matches your real eye and screen, you’re getting a perfect reconstruction, geometrically speaking.
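The plate-and-eye picture above amounts to ordinary perspective projection: trace the ray from the virtual eye to each world point and record where it crosses the plate. A minimal sketch (coordinate conventions and names are mine, not Google Earth's):

```python
def project(point, eye, screen_z):
    """Project a 3D point onto a screen plane at z = screen_z, as seen
    from an eye located behind that plane. Returns the (x, y) position
    where the eye-to-point ray crosses the plane."""
    x, y, z = point
    ex, ey, ez = eye
    # Parametric line from eye to point; find where it reaches z = screen_z.
    t = (screen_z - ez) / (z - ez)
    return (ex + t * (x - ex), ey + t * (y - ey))

# Eye at the origin, plate one unit away: a point at (2, 0, 4) lands
# a quarter of the way along the ray, at x = 0.5 on the plate.
print(project((2, 0, 4), (0, 0, 0), 1))  # (0.5, 0.0)
```

As long as the real eye sits where the virtual one is assumed to be, this captured image is geometrically indistinguishable from looking through a window at the scene itself.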

True holography goes a step further and captures all the light hitting that plate from any direction, regardless of any particular eye position, so you could later view the plate from any angle and get the correct result. Delivering one correct image per eye is called stereoscopy, which is what we commonly mean by “3D” in movies, monitors, and holographic displays.

Since the relationship between the real eye and the screen is assumed, in common 3D graphics you will get some distortion if you sit off to the side of your monitor, at an angle, or even if you sit too far away or too close. An app like Google Earth assumes a Field of View (FOV, the angle subtended at your point of view by the edges of that virtual photographic plate) of 60 degrees and an observer looking straight at the screen. This means that if you have a 20″ wide monitor running full-screen, you would need to put your eyes about 20″ away (17.3″ to be more precise) and centered to get the ideal reconstruction. Some 3D games use a FOV of 90 degrees or more, which means you’d need to sit even closer to make it right. Most people don’t bother; I, for one, get nauseated when the visuals mismatch like that, but most people don’t mind.
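The geometry behind those numbers is a right triangle: the ideal distance is half the screen width divided by the tangent of half the FOV. A quick sketch (the function name is mine):

```python
import math

def ideal_viewing_distance(screen_width, fov_degrees):
    """Distance at which a screen of the given width subtends the given
    horizontal field of view: (width / 2) / tan(FOV / 2)."""
    return (screen_width / 2) / math.tan(math.radians(fov_degrees) / 2)

# A 20-inch-wide screen with Google Earth's assumed 60-degree FOV:
print(ideal_viewing_distance(20, 60))  # ~17.32 inches
# A 90-degree game FOV demands a much closer seat:
print(ideal_viewing_distance(20, 90))  # ~10.0 inches
```

The 90-degree case shows why wide-FOV games look noticeably "fisheyed" from a normal seating distance: you would have to sit only half a screen-width away for the perspective to line up.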

Going forward

In the not-too-distant future, computers will be able to eye-track us poor users. And the good news is that common 3D graphics hardware could be told to use that information to render the correct 3D perspective no matter where we sit, and even if we move. The result would be as good as traditional holography, perhaps better considering color issues from using lasers with photographic plates, but perhaps only for one or several simultaneous users.

Objects can even be made to come through or float in front of the monitor, as long as the observer can draw an imaginary straight line from their eye through the object that lands somewhere on the screen. In other words, every time you see an advertisement showing some big virtual object floating in front of a small “3D” monitor, the advertising is misleading. You can only see the parts of the object that are directly in line with the screen.
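That line-of-sight condition is easy to state in code: extend the eye-to-object ray until it reaches the screen plane and check whether it lands inside the panel. A sketch with made-up, screen-centered coordinates (screen plane at z = 0, eye at positive z):

```python
def visible_in_front(eye, obj, screen_w, screen_h):
    """Return True if an object floating between the eye and the screen
    can actually be drawn, i.e. the eye->object ray, extended, lands
    somewhere on the physical panel."""
    ex, ey, ez = eye
    ox, oy, oz = obj
    t = ez / (ez - oz)          # ray parameter where z reaches 0
    hx = ex + t * (ox - ex)     # intersection with the screen plane
    hy = ey + t * (oy - ey)
    return abs(hx) <= screen_w / 2 and abs(hy) <= screen_h / 2

# Eye 24" from a 20" x 12" screen, object floating 6" in front of it:
print(visible_in_front((0, 0, 24), (0, 0, 6), 20, 12))   # True: dead center
print(visible_in_front((0, 0, 24), (30, 0, 6), 20, 12))  # False: off the panel
```

The second case is the misleading advertisement: the object hovers beside the monitor, so no pixel on the panel lies behind it along the viewer's line of sight, and it simply cannot be shown.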

There was a project by Michael Deering et al. at Sun Microsystems a few years back called the “Holographic Workstation” that did it more or less correctly, even accommodating the distortion introduced by a user’s eyeglasses (Michael wears coke-bottle glasses). Given the proliferation of cameras and advances in optical gesture recognition, true 3D is not too far away.

  1. #1 by Stefan on October 18, 2006 - 6:38 am

    Ooh, can I nitpick? If your horizontal FOV is 60 degrees, and your screen is 20 inches wide, then you should place yourself 17.3 inches away from the screen (half the width, 10″, times tan(60°) ≈ 1.73), because you want to create an equilateral triangle, not just any isosceles triangle.

  2. #2 by avi on October 18, 2006 - 7:52 am

    You are correct, sir.
