On a tip-off (once again) from the inestimable Matt Jones: a "great hack of Google Maps by Jon Udell". It's really cool that Udell creates a trail from his walk and then uses Google Maps to show not only the trail, but photos and videos at certain waypoints.
What excites me, and I think this is what Matt is pointing at, is that Udell's system is like Lifeblog in that it automatically records one's life; Udell just sorts it by location rather than time. Udell could use his phone as the life recorder, as we do, automatically collecting the waypoint pictures and videos instead of gathering the photos and videos manually (I keep pestering GPS-dude Chris Heathcote to try this out).
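For the curious, here's a minimal sketch of the kind of matching a life recorder like that would do: pair each timestamped photo with the nearest point on a GPS track. All the names here (gps_track, photos, nearest_waypoint) are made up for illustration; this is not Lifeblog's or Udell's actual code.

```python
# Sketch: match timestamped photos to the nearest point on a GPS track.
# All names and data are illustrative, not from Lifeblog or Udell's tool.

from bisect import bisect_left

# A GPS track as (unix_timestamp, latitude, longitude), sorted by time.
gps_track = [
    (1117200000, 42.9336, -72.2781),
    (1117200300, 42.9341, -72.2790),
    (1117200600, 42.9350, -72.2802),
]

# Photos as (unix_timestamp, filename).
photos = [
    (1117200290, "keene_photo_1.jpg"),
    (1117200610, "keene_photo_2.jpg"),
]

def nearest_waypoint(track, when):
    """Return the track point whose timestamp is closest to `when`."""
    times = [t for t, _, _ in track]
    i = bisect_left(times, when)
    # Only the neighbors around the insertion point can be closest.
    candidates = track[max(i - 1, 0):i + 1]
    return min(candidates, key=lambda p: abs(p[0] - when))

for when, name in photos:
    t, lat, lon = nearest_waypoint(gps_track, when)
    print(f"{name} -> ({lat:.4f}, {lon:.4f})")
```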
A few things Udell said were interesting: 1) this should be simple; 2) here, finally, is a tool for collectively annotating the planet.
Cool.
Link: Jon Udell: A Google Maps walking tour of Keene, NH.
We’ve been toying with ideas like this, since we do have some location info in Lifeblog. But our granularity, for various reasons, is nowhere near Udell’s. If we had that granularity, we’d clearly connect it with maps somehow. Indeed, when you start connecting relevant meta-data to objects, the different ways of viewing them get interesting. And I don’t mean just first-level meta-data such as location, time, or date, but meta-data about handling (who saw me, who sent me, who was I sent to) or frequency (how many times was I viewed, how many times was a message sent to someone). Yada yada yada – this is nothing new in meta-data circles. 😉
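To make that concrete, here's a toy sketch of a media object carrying both first-level meta-data and meta-data about its own handling. Every field name here is invented for illustration; it's not Lifeblog's data model.

```python
# Toy model: a media object with first-level meta-data (time, location)
# plus meta-data about its own handling and use. Field names are invented.

from dataclasses import dataclass, field

@dataclass
class MediaObject:
    filename: str
    timestamp: int                               # first-level: when captured
    location: tuple                              # first-level: (lat, lon)
    sent_to: list = field(default_factory=list)  # handling: who I was sent to
    received_from: str = ""                      # handling: who sent me
    view_count: int = 0                          # frequency: times viewed

    def viewed(self):
        self.view_count += 1

    def send(self, recipient):
        self.sent_to.append(recipient)

photo = MediaObject("keene_photo_1.jpg", 1117200290, (42.9341, -72.2790))
photo.viewed()
photo.send("matt@example.com")
print(photo.view_count, photo.sent_to)
```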
When Python gets access to the camera (and microphone), we’ll rock and roll!