So, I interviewed David over Skype in part one, and we talked about all manner of things, including the growth of mobile technology and how HTML5 and CSS will work on even the simplest of mobile phones (as long as it has some sort of display).
In this interview we continue our look at ‘Project Duke’ – a technology that aims to store and make retrievable the vast amount of unstructured data existing in the world today, and not just online.
There are all sorts of implications for the vast amount of unstructured data ‘out there’, including how to break it down and how to visualise it.
And in a real eye-opener (bad pun intended), David predicts that Bluetooth-enabled spectacles will give us real-time HUDs (heads-up displays) full of augmented-reality metadata. Think of it like this: already you can walk past a store and suddenly receive a text message on your phone alerting you to its current specials. Now imagine that same shop giving you visuals of those shoes, or that dress, or that jacket… complete with the regular price, the sale price, how long you have left before the sale price vanishes, and so on.
Again, nearly an hour of brain dump and stunning thinking from a man with a brain the size of a planet. Worth listening to if you want to know where we are heading online.
P.S. I've added the movie links we spoke about in the first interview to that interview's post – revisit it to see the movies.