I had heard about Microsoft's Photosynth technology before, but it didn't click with me until last week, when I was in a meeting with their Icons of Imaging. In that meeting, one of Microsoft's engineers did a live demo and the lights upstairs suddenly went on.

I won't do the technology justice describing it here, but imagine a vacation to Paris. You're going to want to capture the iconic Eiffel Tower in a photograph. Yet for many of us, the resulting image won't be anything special. It will likely be exactly like millions of other shots. When you show it to your family, they'll respond, "yup, that's the Eiffel Tower," and move on. Now imagine that same vacation, but like a good photographer, you don't just take one image of the tower. Instead, you take tens or hundreds of different pictures, each from a different angle and field of view. You zoom in close for detail. You point your camera up, down, and every which way. Now you've got a library of different angles, certainly more interesting, but I definitely don't want to sit through THAT slide show.

Many of us have seen the movie The Matrix and marveled at "Bullet Time," or watched the Super Bowl several years ago when the producers were able to combine different camera angles into a 3D move-around scene that followed the action from any angle. It blew my mind to realize that they had composited a 3D scene in real time from about 10 different 2D camera angles. Microsoft has taken that same idea and made it work on still photographs.

Using my Paris vacation example, imagine taking those hundreds of photos and having them composited into a 3D scene that you can rotate and zoom in real time. Now your "what I did on my summer vacation" slide show becomes far more interesting. This makes far more sense when you see it than when you just read it described in text.

To "kick the tires," I created a synth of my ThinkPad W700, which you can find here. I set the machine up in my driveway and then took about 150 shots from all angles.
Then I uploaded them to the Photosynth site and a few minutes of plugging and chugging later, my synth was ready. The software took each one of those photos and computed a "point cloud" from all of them. The first image below shows that cloud. If you look closely, you can see the keyboard and ThinkPad logo.
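Conceptually, each point in that cloud comes from triangulating a feature that shows up in two or more photos taken from different positions. Here is a minimal sketch of just that geometric step, with all coordinates invented for illustration; Photosynth's actual pipeline also has to match features between images and solve for the camera positions, which is far more involved.

```python
# Minimal two-view triangulation: a feature seen from two known camera
# positions defines two rays; the midpoint of their closest approach is
# the reconstructed 3D point. All numbers below are made up for
# illustration only.

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def add(a, b):
    return tuple(x + y for x, y in zip(a, b))

def scale(a, s):
    return tuple(x * s for x in a)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def triangulate(p1, d1, p2, d2):
    """Midpoint of closest approach between rays p1 + t*d1 and p2 + t*d2."""
    w0 = sub(p1, p2)
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w0), dot(d2, w0)
    denom = a * c - b * b  # near zero means the rays are almost parallel
    t1 = (b * e - c * d) / denom
    t2 = (a * e - b * d) / denom
    q1 = add(p1, scale(d1, t1))
    q2 = add(p2, scale(d2, t2))
    return scale(add(q1, q2), 0.5)

# Two cameras a couple of units apart, both sighting the same feature:
point = triangulate((-1, 0, 0), (1, 0, 5), (1, 0, 0), (-1, 0, 5))
print(point)  # -> (0.0, 0.0, 5.0)
```

Repeat that for thousands of matched features across 150 photos and you get a cloud of 3D points like the one in the image above.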
It then used that 3D point cloud to create the synth in which I could zoom in and out in real time. If I wanted a different view, I simply used my mouse to rotate the image. What impressed me most is that it was FAST.
After I rotated, the intermediate navigational images would dissolve and I could clearly see my image again.
To read this makes it sound complicated. To see it in action makes it a simple thing to grasp. I encourage you to try it for yourself and see. (Note: I'm pretty sure that you need Windows, but from what I understand, a Mac OS version is in the works.)

Beyond the technology itself being cool, this has the potential to have a big effect on our business. Imagine our next product release, the ThinkPad T4000. On the day it is announced, you arrive at lenovo.com and see the usual list of specs. Then you see a link to a 3D walk-around tour. You click the link, and up comes a synth of the new product that you can explore from every angle you'd like.

I'm fully aware that 3D walk-around models have been available for years. You can find them on any number of product web sites, including ours. The difference is that any 3D walk-around you see has cost the vendor thousands of dollars to develop, so all vendors are very choosy about which products they make these 3D models for. Now this process is dirt cheap. I was able to make my synth in about 15 minutes of photography time and about 30 minutes of processing and upload time. It isn't as professional or slick, but I wasn't trying to be scientific about it. In a studio setting you could get far better results than I did, and over time this technology will certainly evolve and get better. Dirt cheap means that we could make more 3D tours available so you can clearly see all of the details of a product before you buy.

After you purchase your system, the technology still has lots of great uses. For example, I replaced the cracked display screen in my brother's ThinkPad last week. I had to use our old-fashioned, 2D, paper-based Hardware Maintenance Manual to complete the task. Thirty screws, two hours, and lots of cussing later, we succeeded. Now imagine our hardware maintenance manuals being augmented with 3D synths of each step of the process. I could have zoomed in to see WHICH screw I needed to remove.
I could have seen exactly how to route the wireless cables through the hinge. I could have seen exactly what the snap holding the bezel on the display frame looked like.

Our Lenovo Training team could use it for their PC Architecture course, letting people zoom in and out of an open PC to show all of the bits and bytes. Again, yes, they can do this today, but they don't because it is extremely expensive and time consuming to create.

Our product development team could also use this to develop the next generation of notebooks. The technology doesn't exist today, but imagine taking that same sequence of product pictures with a known reference item included in the photo, like a quarter. We know a quarter's exact dimensions. Software could conceivably use it to determine each surface's length, width, depth, and curvature based on its appearance in the synth versus that of the quarter. Take that mathematical rendering and feed it into CAD software that outputs to a milling machine. Literally within hours of seeing Vendor X's new widget, our development team could have an exact replica sitting on their desks. Who knows, next-generation software might even use the wavelengths of light reflecting off of an object to determine what materials it is made of.

I also see some possibilities in my own job. I regularly buy competitive machines and put them side by side against our own products. If I were to create a synth of our products side by side with Vendor X's, I could let our sales teams literally zoom in and see what I'm talking about as I make points about how each machine is physically constructed and about feature/function differences. (I may in fact just do that.)

As a final note, after I posted the ThinkPad W700 synth, I happened to search for "ThinkPad" to try to find the hyperlink for this blog entry. I was quite surprised to find that several people had already done synths of their own ThinkPad notebooks, including a member of our own Lenovo team.
I guess I've done the modern equivalent of searching for Ansel Adams's tripod holes in Yosemite.
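Going back to the quarter-as-reference idea for a moment: at its simplest, it boils down to scale arithmetic. Here is a hedged sketch, where the function names and pixel measurements are hypothetical; only the quarter's 24.26 mm diameter is a real published figure, and this naive version assumes the measured feature lies in the same plane as the quarter, with no perspective correction.

```python
# Estimating real-world dimensions from a photo using a known reference
# object. The quarter's diameter is a real figure; everything else here
# (pixel counts, function names) is hypothetical illustration.

QUARTER_DIAMETER_MM = 24.26  # actual diameter of a US quarter

def mm_per_pixel(quarter_pixel_diameter: float) -> float:
    """Scale factor implied by the quarter's apparent size in the image."""
    return QUARTER_DIAMETER_MM / quarter_pixel_diameter

def estimate_length_mm(feature_pixels: float, quarter_pixel_diameter: float) -> float:
    """Estimate a feature's real length, assuming it lies in the same
    plane as the quarter (no perspective correction in this sketch)."""
    return feature_pixels * mm_per_pixel(quarter_pixel_diameter)

# Hypothetical measurements: the quarter spans 120 px in a shot where a
# notebook edge spans 1650 px.
print(f"{estimate_length_mm(1650, 120):.1f} mm")
```

A real system would have to do this in 3D across the whole synth, but the principle is the same: one object of known size anchors the scale of everything else in the scene.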
By the way, if you are into photography, Lenovo and Microsoft are concurrently running a contest called Name Your Dream Assignment (on Facebook). The winner will get a ThinkPad W700 and $50,000 to make their dream assignment a reality. Terms and conditions are on the site. Good luck!