Researchers at IBM Zurich Research Laboratory claim a novel approach to accessing patient medical records — using the human body as the 3D framework in the same way that Google Earth uses the Earth as a framework to fuse and navigate geospatial information. Spin the body, click on a body part, and zoom in closer to get more information.
The Video Trace system isn’t completely automatic. It requires a modeler to mark the boundaries of surfaces in at least a few frames of video. But it does seem extremely fast and well-integrated.
This isn’t quite ready to turn, say, Google’s Street View photos or video into 3D models just yet, but it’s close. Once the system can infer surfaces without human intervention, perhaps only requiring people to go in and fix mistakes, it might be ready to scale up to cities. Right now, it would still be too labor intensive. But it’s impressive nonetheless.
I tried linking the video here, but it’s being difficult. Click the link above for an interesting demo of their approach.
If you read RealityPrime, you probably already have a good idea of the actual connection between Snow Crash and Google Earth. But if you read any of the other blog entries about the intriguing Arizona State University beta test without also coming here, you might have been led astray on at least one key point.
So I’d like to be as clear as possible about the role Snow Crash played for Google Earth. I’ve also modified the old post on my personal blog to reflect this added clarity for anyone who wanders by. Continue reading
Over the next 12 months, there will be several dozen new 2D/3D social virtual worlds launching, all vying to take Second Life’s crown as the leader in the non-gaming space. They want to be the "YouTube of 3D content," the "Real 3D MySpace," the "Facebook of 3D UGC" (User-Generated Content).
Obviously, most of them will not be huge. Most of them will actually disappear almost as quickly as they arrive, once the people come and go and the funding dries up. But some will stick around, either merged into bigger entities or grown to independence. What will make one succeed and another fail? That’s the billion-dollar question.
That’s just a highlights video.
Here’s a link to the complete media stream from the event (about an hour plus).
I’m posting this on the professional site because Randy has made some great contributions to the field of virtual reality, science, and education — and everyone should know about it.
I’m also posting it because I know Randy. I worked with him when I was at Disney. And I can attest that he’s not putting on a show or brave face. That’s exactly how he is in person: positive, smart, and genuine.
He doesn’t want pity or sadness. He’s achieved his childhood dreams. So I’m not sad for him, but for his family, friends, and, in fact, the rest of the world, who will miss him, whether they all know it or not.
Intel Snaps Up Havok | Metaversed
This is fairly big news in the game and virtual worlds industry. Havok makes some of the best physics middleware around, used in Second Life (on the simulators) and in many games. Intel is in the business of making chips. So why would they buy a software company like this?
Well, there are a few reasons. First and foremost, Havok’s competition (apart from roll-your-own solutions and a few FOSS libraries) is hardware — the PPU, or physics processing unit, analogous to the GPU for graphics. Not many people have PPUs yet, but there will be more and more of a push from marquee games and high-end gaming PC makers.
In fact, I’d expect AMD, which swallowed ATI recently, to begin offering more generalized or bundled accelerator chipsets that can handle graphics, physics, and perhaps even AI together on the same board. Havok can take advantage of ATI GPUs as well as NVIDIA’s, but with this acquisition, we can only hope that continues.
So what is Intel to do with their own "multi-core" CPU push and sub-par (but very widely installed) motherboard 3D graphics business? Well, anything that soaks up more CPU cycles, rather than pushing buyers toward specialty add-in boards, is good for Intel. And physics simulation software scales nicely to multiple cores. It’s pretty simple when it comes down to it. If Intel gave Havok away for cheap or free, Ageia would have a much harder time selling PPU hardware.
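That multi-core scaling claim is easy to see: integrating many independent rigid bodies is embarrassingly parallel, so each tick divides cleanly across workers. Here’s a toy sketch of the pattern in plain Python — this is not Havok’s API, just an illustration, and the GIL means a pure-Python version shows the structure rather than the speedup a native engine would get:

```python
from concurrent.futures import ThreadPoolExecutor

GRAVITY = -9.8     # m/s^2
DT = 1.0 / 60.0    # one 60 Hz physics tick

def integrate(body):
    """Advance one rigid body by a single step (semi-implicit Euler)."""
    height, velocity = body
    velocity += GRAVITY * DT
    height += velocity * DT
    return (height, velocity)

def step(bodies, workers=4):
    """Integrate all bodies for one tick. Each body is independent of the
    others, so the work splits cleanly across worker threads — in a native
    engine these would be OS threads running on separate cores."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(integrate, bodies))

if __name__ == "__main__":
    bodies = [(10.0, 0.0)] * 1000   # 1000 bodies at 10 m, starting at rest
    bodies = step(bodies)
```

Collision resolution is harder to parallelize than free integration (bodies interact), which is part of why dedicated hardware like the PPU was pitched in the first place — but broad swaths of a physics tick really are this independent.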
The story that Intel is getting more and more into virtual worlds is interesting. Second Life, for example, is somewhat of a "thin client" approach (though not so thin — still lots of CPU needed to make it smooth), leaving more of the heavy lifting to the grid of simulators, which could use 80-core chips yesterday. Intel would certainly like selling beefy simulator chips, but there are far more customers out there to attend to. And virtual worlds often lend themselves to optimizations that push computing work to the server, vs. the more plentiful home PC.
So games and peer-to-peer virtual worlds may be more up their alley. Intel is reportedly paying one Palo Alto company, Qwaq, for some undisclosed work, probably relating to business worlds. They’re using Croquet, which is an open source peer-to-peer world of varying levels of visual fidelity. I doubt they use Havok, if the open source physics engines are reasonably sufficient, but I imagine Intel will make more investments in this area over the next few years.