>I've never come upon a compelling personal use case for smart glasses
There are tonnes; it's just that the technology isn't there yet: the glasses are too bulky and heavy, the FoV sucks, the resolution sucks, and the light transmittance sucks.
But the use cases are incredibly plentiful: stuff like this (sheet music, documentation, web browsing), getting real-time directions with a blue line or directional hints when walking around an unfamiliar place, overlays/information at tourist sites, and home automation/controlling devices.
I remember an old anime, or some show, set in a world where a digital layer is overlaid on the real world, and AIs and devices from that digital layer can be interacted with in a similar way... what was it, hmmm.
We've also got a scanning feature that does OCR for sheet music, to get music into our system. Plus there's a full-featured notation editor. A good overview is at https://www.soundslice.com/features/
I'm using tabs, not notes, but I'm assuming/hoping your solution will adapt quite easily.
I wonder if you could use a microphone to listen for the notes in order to get auto-scrolling. Because you know the general timing, you're (likely) not searching through the entire song but homing in on the exact point the person is at. An unobtrusive metronome might be nice too.
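To make that concrete, here's a rough sketch of the kind of windowed search I mean (not from the project; it assumes monophonic audio, uses plain autocorrelation for pitch, and every name and threshold here is made up):

    import numpy as np

    SAMPLE_RATE = 44100

    def estimate_pitch(frame, sr=SAMPLE_RATE, fmin=60.0, fmax=1500.0):
        """Very rough fundamental-frequency estimate via autocorrelation (monophonic only)."""
        frame = frame - frame.mean()
        corr = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
        lo = int(sr / fmax)
        hi = min(int(sr / fmin), len(corr) - 1)
        lag = lo + int(np.argmax(corr[lo:hi]))
        return sr / lag if corr[lag] > 0 else None

    def follow_score(pitch_hz, expected_midi, position, window=4, tolerance=0.5):
        """Advance position through expected_midi (notes in playing order),
        searching only a few notes ahead of where the player already is."""
        if pitch_hz is None:
            return position
        midi = 69 + 12 * np.log2(pitch_hz / 440.0)
        for i in range(position, min(position + window, len(expected_midi))):
            if abs(midi - expected_midi[i]) < tolerance:
                return i + 1  # matched this note, advance past it
        return position  # nothing matched in the window, stay put

    # Hypothetical usage: feed one ~2048-sample mic frame at a time.
    # position = follow_score(estimate_pitch(mic_frame), song_notes, position)

Feed it a fresh frame every few tens of milliseconds and scroll when the position crosses the end of the visible line; a metronome could run off the same clock.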
Congrats! One of the best projects I've seen in a long time, and particularly such a good use case for the early stage of this hardware.
-- the project uses "Even Realities G1" AR glasses (640x200, 25° FoV, 1-bit green), while the "Epson Moverio" AR glasses can have overwhelmingly superior specs (1920x1080, 34° FoV, full RGB) for possibly an even lower price;
-- software-wise, it «uses AugmentOS's SDK to communicate with Mentra servers which talk to the mobile app which talks to your ... glasses», while an Epson Moverio system would just use the glasses directly as a display for an Android device...
Both gaps between what is available and what was employed make very little sense.
One looks normal enough to wear all the time.
With that being said, although LilyPond is very intelligent about all sorts of typesetting minutiae [1], it's probably difficult to wrangle it into running on smart glasses (see the sketch below for the more realistic off-device route).
[1] https://lilypond.org/doc/v2.24/Documentation/essay/engraving...
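A rough sketch of that off-device route -- render on a phone or server, then ship a bitmap to the glasses. The flags are from memory, so check them against your installed LilyPond version:

    import pathlib
    import subprocess
    import tempfile

    LY_SNIPPET = r"""
    \version "2.24.0"
    { c'4 d'4 e'4 f'4 }
    """

    def render_snippet_to_png(ly_source: str, dpi: int = 100) -> pathlib.Path:
        """Run LilyPond off-device and return the path of the rendered PNG."""
        workdir = pathlib.Path(tempfile.mkdtemp())
        ly_file = workdir / "snippet.ly"
        ly_file.write_text(ly_source)
        # --png and -dresolution are documented LilyPond CLI options.
        subprocess.run(
            ["lilypond", "--png", f"-dresolution={dpi}",
             "-o", str(workdir / "snippet"), str(ly_file)],
            check=True,
        )
        return workdir / "snippet.png"

    # render_snippet_to_png(LY_SNIPPET)

The output would still need thresholding down to 1-bit before it suits a monochrome display like the G1's.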
I've found the display capabilities of the current-gen smart glasses pretty disappointing. Yes, they're less obtrusive, but the resolution is pitiful. I've found the Vufine a lot more useful, if more ridiculous-looking.
The Nordic MCU they use isn't actually the limiting factor; rather, it's the glasses' firmware. For bitmaps from third-party apps (like AugmentOS), they enforce 194-byte chunk sizes and do not support RLE. Their first-party app does not have these limitations. We're stuck with this problem for the G1, but we're working with hardware partners to make sure future glasses don't have these issues.
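For a sense of scale, some back-of-the-envelope math (this is not the actual G1 transfer protocol; it ignores per-chunk headers and BLE overhead entirely):

    # A full 1-bit frame at the G1's 640x200 resolution, pushed in 194-byte chunks.
    WIDTH, HEIGHT = 640, 200
    CHUNK_SIZE = 194

    frame_bytes = WIDTH * HEIGHT // 8            # 16,000 bytes packed at 1 bit/pixel
    num_chunks = -(-frame_bytes // CHUNK_SIZE)   # ceiling division -> 83 chunks per frame

    def split_into_chunks(framebuffer: bytes, chunk_size: int = CHUNK_SIZE):
        """Yield fixed-size slices of a packed framebuffer for transmission."""
        for offset in range(0, len(framebuffer), chunk_size):
            yield framebuffer[offset:offset + chunk_size]

    print(frame_bytes, num_chunks)  # 16000 83

Sheet music bitmaps are mostly blank, so even naive RLE would collapse the bulk of those 83 chunks, which is why its absence for third-party apps hurts so much.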
The ability to adapt paper music would be useful. In some genres -- I play big-band jazz -- virtually no material is available in printed form, or what exists is only in the composer's preferred format, which is typically PDF.
I've been an avid enthusiast and promoter of Meta Ray Bans since Oct 2023. They are very handy, and I think for anyone who wears sunglasses or glasses and uses their phone to take pics or vids, they make a ton of sense (both things you can do with them without needing your phone... you can also ask them for the time). Though I'm not sure even the HN population is much into them.
Although I love them, I don't think the claim you see in the media, and I guess from Zuckerberg, that they are the next computing platform is true. You cannot take selfies with smart glasses, unless they offer a tiny pop-out drone in the glasses to take pics of you, lol. Thus, I think they will be complementary to our personal pocket smartphones and/or upcoming pocket AI devices, which will be able to take the best selfies of you ever (your AI friend you see on the lock screen directs you to the best light to get the best selfie).