WP7 Photosynth

Michael King, the Gartner keynote presenter at the Dev Summit this morning, mentioned a good use case for a mobile augmented reality app. He described a worker using a mobile device to retrieve annotated augmented reality images indicating which steam fitting to mark up.

That brought to mind the Photosynth of Gas Works Park in Seattle. It seems like the photos in this synth, along with annotations, could be streamed to a user walking around the park. Think of how first responders at a refinery could use such an application.

This will require Microsoft to expose the position and orientation of each camera so that a Silverlight app, similar to the Silverlight Photosynth viewer, could retrieve and orient the right photo as the device's camera moves.
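
As a rough sketch of the logic such a viewer would need (in Python purely for illustration; `SynthPhoto` and `pick_photo` are hypothetical names, not any Microsoft API): given the pose Photosynth recovers for each photo, the app could pick the image whose camera best matches the device's current position and heading.

```python
import math

# Hypothetical sketch: assume each synth photo carries the camera pose
# recovered by Photosynth (a position in synth coordinates plus a unit
# viewing-direction vector).
class SynthPhoto:
    def __init__(self, photo_id, position, view_dir):
        self.photo_id = photo_id    # identifier of the image in the synth
        self.position = position    # (x, y, z) camera position
        self.view_dir = view_dir    # unit vector the camera pointed along

def pick_photo(photos, device_pos, device_dir, max_dist=10.0):
    """Return the synth photo whose camera best matches the device pose."""
    best, best_score = None, float("inf")
    for p in photos:
        diff = [a - b for a, b in zip(p.position, device_pos)]
        dist = math.sqrt(sum(d * d for d in diff))
        if dist > max_dist:
            continue  # taken too far from where the user is standing
        # angle between where the device points and where the photo's
        # camera pointed (dot product of two unit vectors)
        dot = sum(a * b for a, b in zip(p.view_dir, device_dir))
        angle = math.acos(max(-1.0, min(1.0, dot)))
        score = dist + 5.0 * angle  # weight orientation more than distance
        if score < best_score:
            best, best_score = p, score
    return best
```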

The app would also need to support annotation markup that could be attached to individual synth images.
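
Nothing official exists for such markup as far as I know, so here is a hedged sketch of what a per-image annotation record might look like; every field name is made up.

```python
import json

def make_annotation(image_id, u, v, label):
    """Pin a text label at a normalized (u, v) point inside one synth image."""
    assert 0.0 <= u <= 1.0 and 0.0 <= v <= 1.0, "u, v are fractions of image size"
    return {
        "imageId": image_id,         # which photo in the synth the note is pinned to
        "anchor": {"u": u, "v": v},  # position within that image
        "label": label,              # the text a responder would see
    }

note = make_annotation("img_0042", 0.37, 0.61, "Mark up this steam fitting")
print(json.dumps(note, indent=2))
```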

1 comment so far

  1. Nate Lawrence:

    You may enjoy this talk, from June of 2011, by Photosynth’s architect: http://bit.ly/are2010blaise

    Regarding Microsoft exposing the position and orientation of the cameras in a synth, I would point you to Christoph Hausner’s SynthExport and Henri Astre’s PhotosynthToolkit, both of which have figured out how to export the camera parameters. (Also see related work from Josh Harle and Greg Downing.)
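
    To illustrate what those exported parameters mean (this is a generic pinhole model, not the exact fields any of these tools emit): a camera’s position, rotation, and focal length determine where each 3D synth point lands in its image.

    ```python
    import numpy as np

    def project(point, cam_pos, R, focal, width, height):
        """Project a 3D synth point into one camera's image plane."""
        # move the point into the camera's coordinate frame
        pc = R @ (np.asarray(point, float) - np.asarray(cam_pos, float))
        if pc[2] <= 0:
            return None  # point is behind the camera
        # perspective divide, then scale by the focal length (in pixels)
        u = focal * pc[0] / pc[2] + width / 2.0
        v = focal * pc[1] / pc[2] + height / 2.0
        return (u, v)
    ```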

    The coordinates are relative only to the synth; however, Mark Willis, Nathan Craig, and Jorn Anke have all published workflows to convert Photosynth coordinates to real-world coordinates.
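
    One common way to do such a conversion (the published workflows may differ in detail) is to fit a similarity transform, a scale s, rotation R, and translation t, from a few control points surveyed in both coordinate systems; a minimal sketch:

    ```python
    import numpy as np

    def fit_similarity(src, dst):
        """Umeyama-style fit: src and dst are (N, 3) arrays of matched points
        (synth coordinates and their surveyed real-world equivalents)."""
        src, dst = np.asarray(src, float), np.asarray(dst, float)
        mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
        src_c, dst_c = src - mu_s, dst - mu_d
        U, S, Vt = np.linalg.svd(dst_c.T @ src_c)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])  # avoid reflections
        R = U @ D @ Vt
        s = np.trace(np.diag(S) @ D) / (src_c ** 2).sum()
        t = mu_d - s * R @ mu_s
        return s, R, t
    ```

    Every synth point p then maps to s * R @ p + t in real-world coordinates.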

    I should point out that for any synth which has been geo-aligned by its author, it is possible to retrieve a file from the Photosynth WebService which specifies the Bing Maps zoom level the point cloud is aligned to, as well as its orientation on the map. This should provide a decent head start for converting Photosynth coordinates to real-world coordinates.
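
    To show why the zoom level is such a head start (setting the exact file format aside): the standard Web Mercator ground-resolution formula turns a Bing Maps zoom level and latitude into metres per pixel, and the map orientation then rotates synth offsets into east/north metres. The heading convention and function names below are my own assumptions.

    ```python
    import math

    EARTH_RADIUS_M = 6378137.0  # Web Mercator / Bing Maps earth radius

    def ground_resolution(latitude_deg, zoom_level):
        """Metres per pixel on Bing Maps at a given latitude and zoom level."""
        lat = math.radians(latitude_deg)
        return math.cos(lat) * 2 * math.pi * EARTH_RADIUS_M / (256 * 2 ** zoom_level)

    def synth_to_east_north(x_px, y_px, latitude_deg, zoom_level, heading_deg):
        """Rotate and scale synth-plane offsets (in map pixels at the aligned
        zoom level) into east/north offsets in metres."""
        m_per_px = ground_resolution(latitude_deg, zoom_level)
        h = math.radians(heading_deg)
        east = m_per_px * (x_px * math.cos(h) - y_px * math.sin(h))
        north = m_per_px * (x_px * math.sin(h) + y_px * math.cos(h))
        return east, north
    ```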

    Also see Henri Astre’s work on registering video to point cloud reconstructions on his Visual Experiments weblog.

