With my star undergrads Ian and Abe, and backend support from Ziad, we put together this mashup for the holidays! We use data from the Twitter streaming crawler we built (for our NSF-funded work) to find Instagram photos posted on Twitter whose tweets contain the word “Christmas” and whose photo location is available on Twitter. We then add the Google Street View of the photo location and, well, mash them all together.
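To give a flavor of the filtering, here is a minimal sketch (not our actual code): it keeps tweets that mention the keyword, link to Instagram, and carry exact GeoJSON coordinates, then builds a Google Street View Static API image URL for each hit. The field names follow the standard Twitter streaming-API tweet JSON; `YOUR_API_KEY` is a placeholder.

```python
def matches(tweet, keyword="christmas"):
    """Keep tweets that mention the keyword, link to Instagram,
    and carry exact GeoJSON coordinates."""
    if keyword not in tweet.get("text", "").lower():
        return False
    urls = tweet.get("entities", {}).get("urls", [])
    has_instagram = any("instagram.com" in u.get("expanded_url", "")
                        for u in urls)
    return has_instagram and tweet.get("coordinates") is not None

def streetview_url(tweet, size="600x400"):
    """Street View Static API image for the tweet's location.
    Note: GeoJSON stores [longitude, latitude]; Street View wants lat,lon."""
    lon, lat = tweet["coordinates"]["coordinates"]
    return (f"https://maps.googleapis.com/maps/api/streetview"
            f"?size={size}&location={lat},{lon}&key=YOUR_API_KEY")

# A toy tweet in the streaming-API shape:
sample = {
    "text": "Merry Christmas from the tree farm!",
    "entities": {"urls": [{"expanded_url": "http://instagram.com/p/abc/"}]},
    "coordinates": {"type": "Point", "coordinates": [-73.99, 40.73]},
}
if matches(sample):
    print(streetview_url(sample))
```

In practice you would pair each Instagram photo with the image that URL returns, which is where the juxtaposition below comes from.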
The result is an interesting juxtaposition (as one comment on my Facebook post captured well) of the “small instagram-style photos (typically close-up, indoors) against the backdrop of the (typically distant, outdoors) Google street views”. The Street View thus gives context to the Instagram photo, perhaps revealing the setting in which the activity in the photo takes place: another dimension of understanding, often much stronger than the text of the tweet itself.
The app is also an interesting (and mostly unintended) statement about privacy — I don’t know how these users would feel knowing their surroundings are exposed to all, and not just in a default, bland, zoomed-in map format.
As a side note, after all this filtering, surprisingly little data satisfied all these criteria, mostly (I suspect) because Twitter requires explicit user authorization before location information is attached to tweets. In other words, even though many (most?) Instagram photos carry location data, much of that data does not come through when they are posted on Twitter.
There are more features coming for this app (e.g., choosing your own keywords) — but more on that later.
Happy holidays and enjoy the beat!