Earlier this summer, I picked up a pair of Ray-Ban Stories smart glasses.
The frames look similar to my Ray-Ban Wayfarers, except there’s a camera lens in the corner, a computer in the arms and tiny speakers that can play music and podcasts.
These smart glasses aren’t meant to be a replacement for your mobile phone (yet), but they allow me to make calls, listen to podcasts and record videos when I’m outdoors.
I find them quite useful. I’m no longer digging through my pockets to answer a call. I can interact with the digital world without having to look down at my phone. It might even help my posture.
These smart glasses are also a reminder that as our technology keeps getting better and better, the way we interact with it keeps evolving too…
Back in the 1960s, giant mainframe computers crunched the data that helped land a man on the moon.
These machines took up entire rooms and cost millions in today’s dollars, yet they were instrumental in getting our astronauts home safely.
By the 1980s, chip sizes and costs had decreased enough that every household in America could own a computer. Suddenly, spreadsheets and school projects were a breeze.
I can still hear the clicking and beeping of my Apple IIc’s floppy disk drive!
A decade later, laptops allowed us to take our computers on the go.
Those early laptops were heavy, slow and expensive.
But they allowed businesspeople to take their work on the road and college students to study in coffee shops.
Then in the late 2000s, smartphones arrived.
And computers are now with us every second of the day. Ordering takeout? You don’t reach for a phonebook anymore; you open DoorDash or GrubHub. Calling a taxi? Uber.
They’ve almost become an extension of our physical forms. Sadly, when I leave my smartphone at home, I feel lost.
We rely on them to direct us to where we’re going, play our favorite songs and keep track of our life history through digital photos. According to a survey by research firm Statista, 46% of Americans spend five to six hours on their mobile phones daily.
Another 11% spend seven hours or more.
And the next evolution in how we interact with the digital world is right around the corner…
Google’s Early Vision of a Hands-Free Future
In 2014, I had the privilege of visiting Google’s New York City headquarters and trying out its smart glasses.
These hadn’t been released to the public yet, but a friend who was working on the project brought me in for a “test drive.”
At the time, the smart glasses were slow and clunky.
The hardware was still too bulky, and the voice-recognition software often missed commands, so you had to repeat yourself a few times.
On the streets of New York, onlookers might have thought you were talking to yourself while wearing science-fiction headgear. Hardly practical or fashionable.
But things have changed in eight years.
Chips have gotten much faster, and voice-recognition software more accurately “hears” commands.
That means a new way to interact with our computers and our world is finally within reach.
All Eyes Turn to Big Tech’s Next Gold Rush
Back in the 1960s, J.C.R. Licklider wrote a research paper called “Man-Computer Symbiosis.”
He was a computer scientist who had a vision for the internet long before it existed.
Licklider argued that interactions between computers and their users would grow simpler over time.
As he predicted, we’ve gone from mainframes to desktops to laptops to smartphones.
And it’s looking like the next iteration of computers will be wearable technology, like my smart glasses.
Right now, Meta Platforms Inc. (Nasdaq: META) is leading the way with its Ray-Ban Stories smart glasses.
CEO Mark Zuckerberg believes this is how we will interact with the digital world. He’s already committed $10 billion toward the metaverse this year.
Apple is moving quickly to get a competitive product on the market. The company is rumored to have a secret team of hundreds of employees working on virtual reality (VR) and augmented reality (AR) projects.
Five years ago, the company launched ARKit, an augmented reality platform for iOS devices. It enables developers to produce apps that interact with the world using the device’s cameras and sensors.
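For the technically curious, here’s a rough sketch of what a bare-bones ARKit app looks like in Swift. To be clear, this is my own illustration, not Apple’s sample code: it starts a session that tracks the world through the camera and motion sensors, then logs whenever ARKit spots a flat surface like a tabletop.

```swift
import UIKit
import ARKit

// A minimal, illustrative ARKit setup (not Apple's sample code).
// A real app would also need an NSCameraUsageDescription entry in
// Info.plist to get permission to use the camera.
class MinimalARViewController: UIViewController, ARSessionDelegate {
    let session = ARSession()

    override func viewDidLoad() {
        super.viewDidLoad()
        session.delegate = self

        // World tracking fuses camera frames with gyroscope and
        // accelerometer data to build a map of the room around the device.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal]
        session.run(configuration)
    }

    // ARKit calls this whenever it anchors new content to the real world,
    // such as a horizontal surface it has just recognized.
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for anchor in anchors where anchor is ARPlaneAnchor {
            print("Found a flat surface at \(anchor.transform)")
        }
    }
}
```

That plane-detection callback is the hook developers use to pin digital objects, a virtual chessboard or a furniture preview, onto real surfaces.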
As devices have gotten faster thanks to improved chips and 5G, augmented reality is becoming more usable.
Apple is rumored to be launching an AR/VR product sometime in 2023. When it does, it will solidify smart glasses as the next big computing device (much like the first iPhone did for the smartphone).
Of course, the next evolution could come in the form of an implantable device, like the brain chip being developed by Neuralink, the company co-founded by Tesla CEO Elon Musk.
I know which one I would choose, but I’ll put it to you:
Would you rather wear smart glasses … or take a chip in your cerebellum?
Let us know at WinningInvestorDaily@BanyanHill.com.
Regards,
Editor, Strategic Fortunes
P.S. The potential around AI and the metaverse is enormous. It’s why I recommended a stock to my Strategic Fortunes readers to capitalize on these next-gen trends. If you’d like to learn more, check out my presentation on the Next Gen Effect. Just click here.