I've always been interested in AR. My Master's supervisor was Bob Stone, probably one of the biggest heavyweights in the field. In 2014, when I was at Torr Vision Group, the whole research group was working on systems that would integrate into AR glasses with transparent OLED lenses. I was working on stereo vision. Philip Torr had a vision (no pun intended) of a HUD for his motorcycle helmet.
Fast forward to today and I'm still an "early adopter". I haven't owned a laptop in many years; instead I first used my NReal Airs as a "monitor", and now run a minimal setup via my Even Realities G1s.
Since multiple people have asked, this is a demo of how I run an SSH client on my Even Realities G1 smart glasses. It's a weird look when you're sitting somewhere in public, typing on a wireless split keyboard while staring off into space!
I connect my keyboard and my glasses to my phone, with a small web app sitting in the middle. It can send text and images to the glasses (although the images are a bit janky), and it can capture recorded audio from the glasses, although I'm still on the lookout for a good local speech-to-text model. Over SSH, I can do a ton of stuff with no extra work needed!
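One part of the middle layer is worth sketching: terminal output has to be chopped into pages small enough for the glasses' narrow text display. Here's a minimal sketch of that paging step, assuming hypothetical display limits (the `COLS` and `ROWS` values are placeholders I made up, not figures from the Even Realities spec, and `paginate` is my own illustrative helper, not part of any SDK):

```python
import textwrap

# Assumed display constraints -- placeholders, not the real G1 limits.
COLS = 40   # characters per line (assumption)
ROWS = 5    # lines per page (assumption)

def paginate(terminal_output: str, cols: int = COLS, rows: int = ROWS) -> list[str]:
    """Wrap raw terminal output and split it into glasses-sized pages."""
    lines: list[str] = []
    for raw in terminal_output.splitlines():
        # textwrap.wrap() returns [] for an empty string, so keep blank
        # lines explicitly to preserve the terminal's vertical layout.
        lines.extend(textwrap.wrap(raw, width=cols) or [""])
    # Group the wrapped lines into fixed-height pages.
    return ["\n".join(lines[i:i + rows]) for i in range(0, len(lines), rows)]

# Example: one short prompt line plus a long line that needs wrapping.
pages = paginate("$ uname -a\n" + "x" * 100)
```

Each page can then be pushed to the glasses one at a time, with the keyboard's arrow keys (or any spare key) flipping between pages.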
Just a quick disclaimer: I actually regret buying these, as I have many issues with the product and, more importantly, the company. But it was an expensive purchase, so I'm making the best of it!