More on Google Acquires ImageAmerica

Google Acquires ImageAmerica – All Points Blog  

I’ve seen some head-scratching over this acquisition, especially in light of Microsoft’s purchase of other earth-content generation companies. So while I know nothing about Google’s current interests or plans, and next to nothing about ImageAmerica, I can add a few notes that may clarify some things. [I guess this could go into a "part 3" of the Google Earth series, if I can find enough patents to go into that in more depth. But let me finish part 2 first...]

After reading the patent that Adena dug up, it sounds to me like ImageAmerica has a clever camera that removes several of the more time-consuming steps from the process of producing usable aerial imagery.

What’s so time consuming? Well, consider that every photograph is taken from some angle, never straight down, which is what we’d ideally want in order to build one big global texture. Even satellites generally shoot imagery from a skewed angle; if they only shot straight down, they could only photograph the one swath of the planet directly below them, unless they carried lots of fuel to change orbits.

So this generally-tilted imagery must be generally-fixed before it can be used in something like Google Earth or other GIS applications. That’s called "orthorectification" — essentially warping a given image such that every pixel seems like it was captured from directly overhead.
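
To give a feel for what that warping involves, here’s a deliberately simplified sketch of the "indirect" method: walk an output ground grid, look up each cell’s elevation in a digital elevation model (DEM), project that 3D point through a pinhole camera model, and copy whatever pixel the camera saw there. The camera model, the names, and the parameters are all my own illustration, not anything from ImageAmerica’s patent; real pipelines also handle lens distortion, occlusion, datum shifts, and so on.

```python
import numpy as np

def orthorectify(image, dem, cam_pos, cam_rot, focal_px, cell_size):
    """Toy indirect orthorectification of a single-band aerial photo.

    image     : 2D array of source pixels
    dem       : 2D array of ground elevations (metres), one per output cell
    cam_pos   : camera position in a local east/north/up frame (metres)
    cam_rot   : 3x3 rotation taking world-frame vectors into the camera frame
    focal_px  : focal length expressed in pixels
    cell_size : ground size of one output cell (metres)
    """
    rows, cols = dem.shape
    ortho = np.zeros((rows, cols), dtype=image.dtype)
    cy, cx = image.shape[0] / 2.0, image.shape[1] / 2.0
    for r in range(rows):
        for c in range(cols):
            # the 3D ground point this output cell represents
            ground = np.array([c * cell_size, r * cell_size, dem[r, c]])
            # that point expressed in the camera's own frame
            v = cam_rot @ (ground - cam_pos)
            if v[2] >= 0:        # behind or level with the camera: not visible
                continue
            # pinhole projection (camera looks along its -z axis)
            u = cx + focal_px * v[0] / -v[2]
            w = cy + focal_px * v[1] / -v[2]
            ui, wi = int(round(u)), int(round(w))
            if 0 <= wi < image.shape[0] and 0 <= ui < image.shape[1]:
                ortho[r, c] = image[wi, ui]
    return ortho
```

The slow part in practice isn’t the projection math; it’s getting an accurate elevation model and an accurate camera pose for every frame, which is exactly where a cleverer acquisition system could save time.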

There’s more to it, of course. The terrain being viewed isn’t perfectly flat, even in Kansas. Hills will be closer to the camera, and depending on the focal length of the lens (which, together with the camera’s distance, sets the field of view), tall objects will seem to radiate away from the nadir, the vanishing point directly beneath the camera, as perspective images tend to do. Oddly enough, the imagery needs to be made to at least seem perfectly flat before it can be properly applied to rendered 3D terrain, like hills and mountains.
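
That leaning effect has a handy rule of thumb from photogrammetry textbooks, nothing specific to this patent: the top of an object of height h, sitting a distance r from the nadir and photographed from altitude H, is displaced outward by roughly r·h/H (a good approximation when h is much smaller than H). In Python:

```python
def relief_displacement(r, h, H):
    """Approximate outward lean of an object's top in a vertical photo.

    r : horizontal distance from the nadir point to the object (metres)
    h : object height (metres)
    H : flying height above the object's base (metres)
    """
    return r * h / H

# e.g. a 100 m tower, 2 km from nadir, shot from 3 km up,
# appears to lean outward by roughly 67 m at ground scale:
print(relief_displacement(r=2000, h=100, H=3000))  # ~66.7
```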

One last step: the different orthorectified images, taken from different angles and perhaps even at different times of day, must all be stitched together to make a seamless result. If you’ve ever seen a building in Google Earth that seems strangely angled in two or more directions at once, or two buildings that seem to tilt into each other, you’ve seen the result of stitching different images together. Images can be perfectly orthorectified individually, but when combined they may still leave perspective-based artifacts along the seams.
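
To see why those artifacts survive, consider the crudest form of stitching: a feathered blend across the overlap between two orthophotos. Blending hides the seam line, but it can’t undo the fact that the two images recorded different perspectives. A toy version, assuming two single-band strips of equal height sharing `overlap` columns (my own illustration, not anyone’s production mosaicking code):

```python
import numpy as np

def feather_blend(a, b, overlap):
    """Mosaic two image strips that share `overlap` columns, ramping the
    weight linearly from strip a to strip b across the shared region.
    Returns a float array; cast back to the source dtype if needed."""
    alpha = np.linspace(1.0, 0.0, overlap)            # weight for strip a
    seam = a[:, -overlap:] * alpha + b[:, :overlap] * (1.0 - alpha)
    return np.hstack([a[:, :-overlap], seam, b[:, overlap:]])
```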

This is true of both satellite and aerial photography, btw. But planes are typically worse because the motion is exaggerated: every image has a unique, and sometimes very different, perspective from the others. The perspective warping is also stronger because planes generally use wider-angle lenses.

My best guess from reading the patent abstract is that ImageAmerica’s proprietary camera uses a linear-array sensor, which means it could capture a continuous but linear stream of digital pixels as the airplane flies, rather than a series of discrete X-by-Y pixel images. That could make it easier to correct for perspective as the imagery is acquired (using GPS, inertial sensors, etc.), and the perspective changes much more smoothly from one part of a giant panorama to another. Those stitching artifacts I mentioned might as a result be minimized, at least along one axis, and that’s a big win already. The other axis could be improved by carefully choosing the flight path so that multiple passes over a city are consistent, e.g., always shooting east. The result seems like a much simpler orthorectification process for a much larger area at once, perhaps even fast enough for real-time acquisition.
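
If that guess is right, a quick sketch shows why it helps: every one-pixel-wide scanline carries its own GPS/IMU pose, so each line can be placed on the ground by itself instead of rectifying whole frames after the fact. Here’s a toy flat-ground version; the names, the pose format, and the flat-ground assumption are mine, not the patent’s:

```python
import numpy as np

def georeference_scanlines(scanlines, poses, focal_px, ground_z=0.0):
    """Toy line-by-line georeferencing for a linear-array (pushbroom-style) camera.

    scanlines : sequence of 1D pixel arrays, one per exposure
    poses     : sequence of (position, cam_to_world_rotation) per scanline,
                as you'd get from GPS/IMU logs
    focal_px  : focal length expressed in pixels
    ground_z  : elevation of the assumed flat ground plane (metres)
    """
    placed = []
    for line, (pos, rot) in zip(scanlines, poses):
        n = len(line)
        ground_xy = np.zeros((n, 2))
        for i in range(n):
            # ray for pixel i in the camera frame (sensor line along x, looking down -z)
            ray_cam = np.array([(i - n / 2.0) / focal_px, 0.0, -1.0])
            ray_world = rot @ ray_cam
            # intersect that ray with the flat plane z = ground_z
            t = (ground_z - pos[2]) / ray_world[2]
            ground_xy[i] = (pos + t * ray_world)[:2]
        placed.append((ground_xy, line))
    return placed
```

With a pose per line, the along-track perspective is essentially corrected as the data comes in, which is exactly the axis where I’d expect the stitching artifacts to shrink.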

This technology could be very interesting for Google Street View as well. Right now, they seem to use discrete photos and have some not-too-bad warping from "node to node" (a node being where a panoramic image was snapped while the van drove).

If ImageAmerica can indeed capture a more continuous panoramic stream, and if Google can find a better way to deliver it to your browser, it would improve the Street View user experience tremendously, making it more like watching a panoramic 3D movie as you drive down the street.

This is all speculation, of course, but we’ll see what happens in a year or so.