Google Glass is one of the next big things in technology. It is disruptive, and it opens up a whole new world of possibilities and applications.
I have been waiting for something like this for many years…
Question: Comment ça va? (How’s it going?)
I am doing well. (Ça va bien)
Fine, and you? (Bien, et toi?)
Meh (Comme ci, comme ça)
It’s going badly (Ça va mal)
Then, depending on where you looked or on some interaction, you would select a response, and the device would show you how to read it out loud. Over time, it could be tuned to the way you speak or the responses you tend to give, so that they become more personal. I thought about how awesome it would have been to have a device like that: I wouldn’t have to sit in French class learning how to conjugate verbs, and during tests, I could just look down at my paper and see it all in English. I had this thought every single time I was in French class through the many years leading up to university. As a side note, I did eventually give up on something like this becoming a reality while I was still a student, so I just sucked it up and learned French. Though I seem to have forgotten a lot of it due to disuse… (Sorry, Madame!) But even so, I still thought of this universal translator in every class.
Fast forward to university: this concept of augmented reality resurfaced for me as being closer to reality when I attended one of Steve Mann’s lectures, where he demonstrated a prototype EyeTap device. Steve Mann has been working on wearable computers since what seems like the beginning of time. The thought of French class came back into my mind. Furthermore, because my vision has since deteriorated (courtesy of some late-night, low-light video game playing), I also saw the implications of a device that sends light directly to your eyes: it could automatically adjust for myopia and hyperopia (and any of the varying degrees in between). When I put on the EyeTap during the demonstration, I saw that it could compensate for my vision weakness, and that it could be tuned to shoot light at just the right misalignment (out of focus for you 20/20 people) to give me a crisp and clean image. Sadly, at the time, the prototype was monochromatic (red), and it was hard to make out the images.
Jump to Google I/O 2012, when they announced Project Glass. I was SO SO regretting not buying that (lottery) ticket. It was like a dream come true. With Google Translate and possibly other Google services integrated, I knew that what I had wished for in grade school could become a reality. After missing out on the Google I/O ticket, I had been following Glass ever since. When Google ran their #IfIHadGlass campaign, I jumped at the chance and posted my use cases. Sadly, I was not picked. At that time, I was hoping to have Glass before Evelyn was born so that I could document as much as possible.
Now, in the present day, Google has given their current Glass Explorers invites to share with friends, and I managed to contact some very nice Explorers who were making their invites available to people with good use cases. I mentioned that my primary use case would be to document my daughter growing up. I had recently seen a video of a baby’s first steps captured through Glass. It was amazing to see that the parents were recording hands-free and could do whatever was needed with their hands to make sure the baby didn’t get hurt. I thought that would be an amazing use case, especially since I travel a lot for work too. With Helen wearing Glass at home, I would be able to see Evelyn’s first steps even if I am at work. So thank you to my Google Glass “Parents”, who spoke with me and generously offered their invites, thoughts, and experiences: Valentin Mayamsin, Drew Rasmussen, and Emma Tweddell! I look forward to exploring and documenting the world through these glasses. Onward to exploring!
P.S. Do you remember The Game (Season 5, Episode 6) that everyone got addicted to in Star Trek: TNG, the one Riker picked up while visiting another planet? There’s an app for that on Glass!