I’ve tried the glasses myself, and I’m convinced that wearable eyewear like this will eventually replace the mobile phone. With ongoing advances in miniaturization, it’s only a matter of time before AR and VR are fully integrated into everyday wearables.
Smartphones were a step back in a lot of ways. Typing is slower. No mouse. Fingers are fat and imprecise. The result is that most applications were severely dumbed down to work on a smartphone.
The trade-off was portability. Everyone can carry a smartphone, so it's okay that the human interaction is worse in a lot of ways. Then, when we need that richer interaction, we can reach for a laptop.
The problem with smart glasses is that they go even a step further in how poor the interaction is. Speech is perhaps the worst interface for computers. Yes, it's neat and shows up in sci-fi all the time. But if you think about it, it's a very bad interface. It's slow, it's imprecise, it's wishy-washy, it's context-dependent. Imagine, for example, trying to navigate your emails by speech only. Disaster.
Smart glasses, however, are not meaningfully more portable than phones. Everyone already has a phone. So what do we gain from smart glasses? IMO, not very much. Smart glasses may become popular, but will they replace the smartphone? In my opinion, fat chance.
What I think is more likely, actually, is smartphones replacing smart glasses. They already have cameras, so the capabilities are about the same, except smartphones can do WAY more. For most people, I imagine, the occasional "look at this thing and tell me about it" use case can be satisfied by a smartphone.
Good point, and it could be argued the user soon followed that dumbification, with the youngest generations not even understanding the file/folder analogy.
I think we can go dumber! Why need an analogy at all? It will all be there, up in your face, and you can just talk to it!
There are also touch pads on the side of the smart glasses as another input option. And I could imagine some people liking little trackball-esque handheld controllers (like the one from the Black Mirror episode "The Entire History of You").
And there are also air gestures, using cameras on the smart glasses to watch what your hands are doing.
I don't think any of these has the raw data input bandwidth that a keyboard has, and for a lot of use cases even a touchscreen could be better. But maybe that can be made up by the hands-free, augmented reality features of smart glasses.
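For a rough sense of the gap, here's a back-of-envelope comparison. It's only a sketch: the words-per-minute figures are loose assumptions for illustration, not measurements.

    # Rough input throughput for a few input methods.
    # The WPM figures are loose assumptions, purely for illustration.
    CHARS_PER_WORD = 5  # conventional definition used in WPM measurements

    rates_wpm = {
        "physical keyboard": 60,
        "phone touchscreen": 35,
        "pointer/trackball on an on-screen keyboard": 10,
    }

    for method, wpm in rates_wpm.items():
        chars_per_sec = wpm * CHARS_PER_WORD / 60
        print(f"{method:45s} ~{chars_per_sec:.1f} chars/sec")

Even granting generous numbers to the newer input methods, the keyboard's raw character throughput is hard to beat; the question is whether hands-free convenience and spatial UI make up for the difference.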
I was among the nerds who swore I'd never use a touch keyboard, and I refused to buy a smartphone without a physical keyboard until 2011. Yes, typing on a screen was awful at first. But then text prediction and haptics got better, and we invented swipe keyboards. Today I'm nearly as fast and comfortable on a touch keyboard as I am on a physical one on a "real" computer.
My point is that input devices get better. We know when something can be improved, and we invent better ways of interacting with a computer.
If you think that we can't improve voice input to the point where it feels quicker, more natural and comfortable to use than a keyboard, you'd be mistaken. We're still in very early stages of this wave of XR devices.
In the past couple of years alone, text-to-speech and speech recognition systems have improved drastically. Today it's possible to hold a nearly natural-sounding conversation with AI. Where do you think we'll be 10 years from now?
> Imagine, for example, trying to navigate your emails by speech only. Disaster.
That's because you're imagining navigating a list on a traditional 2D display with voice input. Why wouldn't we adapt our GUIs to work better with voice, or other types of input?
Many XR devices support eye tracking. This works well for navigation _today_ (see some visionOS demos). Where do you think we'll be 10 years from now?
So I think you're, understandably, holding traditional devices in high regard, and underestimating the possibilities of a new paradigm of computing. It's practically inevitable that XR devices will become the standard computing platform in the near future, even if it seems unlikely today.
AR will always be somewhat awkward until you can physically touch and interact with the material things. It’s useful, sure, but not a replacement.
Haptic feedback is probably my favorite iPhone user experience improvement on both the hardware and software side.
However, I will never be able to type faster than on my keyboard, and even with the most advanced voice inputs, I will always be able to type longer and with less fatigue than if I were to use my voice—having ten fingers and one set of vocal cords.
All options are going to be valid and useful for a very long time.
There's nothing tactile about a glass pane. It's simply a medium through which we access digital objects, and a very clunky one at that. Yet we got used to it in a very short amount of time.
If anything, XR devices have the possibility to offer a much more natural tactile experience. visionOS is already touch-driven, and there are glove-like devices today that provide more immersive haptics. Being able to feel the roughness or elasticity of a material, that kind of thing. It's obviously ridiculous to think that everyone will enjoy wearing a glove all day, but this technology can only improve.
This won't be a replacement for physical objects, of course. It will always be a simulation. But the one we can get via spatial computing will be much more engaging and intuitive than anything we've used so far.
> I will never be able to type faster than on my keyboard, and even with the most advanced voice inputs, I will always be able to type longer and with less fatigue than if I were to use my voice—having ten fingers and one set of vocal cords.
Sure, me neither—_today_. But this argument ignores the improvements we can make to XR interfaces.
It won't just be about voice input. It will also involve touch input, eye tracking, maybe even motion tracking.
A physical board with keys you press to produce one character at a time is a very primitive way of inputting data into a machine.
Today we have virtual keyboards in environments like visionOS, which I'm sure are clunky and slow to use. But what if we invent an accurate way of translating the motion of each finger into a press of a virtual key? That seems like an obvious first step. Suddenly you're no longer constrained by a physical board, and can "type" with your hands in any position. What if we take this further and can translate patterns of finger positions into key chords, in a kind of virtual stenotype? What if we also involve eye, motion and voice inputs into this?
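To make the virtual-stenotype idea concrete, here's a toy sketch. Everything in it (the chord table, the flexion threshold, the data format) is invented for illustration; it isn't any real hand-tracking API.

    # Toy sketch: quantize per-finger flexion (0.0 = straight, 1.0 = fully bent)
    # into a 5-bit chord and look it up in a table. The chord table and
    # threshold below are made up purely for illustration.
    CHORD_TABLE = {
        (0, 1, 0, 0, 0): "e",   # index finger only
        (0, 1, 1, 0, 0): "t",   # index + middle
        (1, 0, 0, 0, 0): " ",   # thumb = space
        (0, 1, 1, 1, 1): "\n",  # four fingers = newline
    }

    def to_chord(flexion, threshold=0.6):
        # flexion: (thumb, index, middle, ring, pinky) as floats from a hand tracker
        return tuple(int(f > threshold) for f in flexion)

    def decode(frames):
        # Emit a character each time the chord changes to a known pattern.
        out, last = [], None
        for frame in frames:
            chord = to_chord(frame)
            if chord != last and chord in CHORD_TABLE:
                out.append(CHORD_TABLE[chord])
            last = chord
        return "".join(out)

    # Two frames: index pressed, then thumb pressed -> "e "
    print(repr(decode([(0.1, 0.9, 0.1, 0.1, 0.1), (0.9, 0.1, 0.1, 0.1, 0.1)])))

The point isn't this particular encoding; it's that once per-finger positions are tracked in 3D, the mapping from gestures to symbols is just software, and it can be iterated on like any other input method.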
These are solvable problems we will address over time. Thinking that just because they're not solved today they never will be is very shortsighted.
Being able to track physical input from several sources in 3D space provides a far richer environment to invent friendly and intuitive interfaces than a 2D glass pane ever could. In that sense, our computing is severely constrained by the current generation of devices.
> It's practically inevitable that XR devices will become the standard computing platform in the near future
Yeah I mean I just really doubt it. I'm not seeing a whole lot of benefit over smartphones, which are already ubiquitous. At best, I'm hearing that it won't suck that much. Which... okay not really high praise.
I'm sure, like the smartphone, it will replace SOME use cases. The difference is that the use cases the smartphone replaced were really important ones that cover 80% of the common stuff people do. So now everyone has a smartphone.
Will that be the case with XR? I doubt it. The use cases it will cover will be, at absolute best, incremental compared to the smartphone. And, I presume, the smartphone will cover those use cases too. Which is why I think it's more likely smartphones swallow these glasses thingies than the other way around.
I'm not trying to convince anyone. Believe what you want to believe :)
> But I am saying that, as a programmer, if you told me I had to only use an iPhone at work I'd probably set myself on fire.
Sure, me too. But that's a software and ergonomics problem. There's no way you will ever be as productive on a 6" display, tapping on a glass pane, as you would on a much larger display (or several), with a more comfortable physical keyboard and far richer haptics. Not to mention the crippled software environment of iOS.
But like I mentioned in other threads, it would be shortsighted to think that interfaces of XR devices will not be drastically better in the future. Everyone keeps focusing on how voice input is bad, ignoring that touch, eye and motion tracking in a 3D environment can deliver far richer interfaces than 2D displays ever did. Plus voice input will only get better, as it has greatly improved over the last 2 years alone.
> I'm not seeing a whole lot of benefit over smartphones, which are already ubiquitous. At best, I'm hearing that it won't suck that much. Which... okay not really high praise.
Have you seen the user avatars in visionOS 26? Go watch some demos if you haven't.
Being able to have a conversation with someone that feels like they're physically next to you is _revolutionary_. Just that use case alone will drive adoption of XR devices more than anything else. Video conferences on 2D displays from crappy webcams feels primitive in comparison. And that is _today_. What will that experience be like in 10 years?
I'm frankly surprised that a community of tech nerds can be so dismissive of a technology that offers more immersive digital experiences. I'm pretty sure that most people here own "battlestations" with 2+ screens. Yet they can't imagine what the experience of an infinite amount of screens in a 3D environment could be like? Forget the fact that today's generation of XR displays are blurry, have limited FoV, or anything else. Those are minor limitations of today's tech that will improve over time. I'm 100% sure that once all of those issues are ironed out, this community will be the first to adopt XR for "increased productivity". Hell, current gen devices are almost there, and some are already adopting them for productivity work.
So those are just two examples. Once the tech is fully mature, and someone creates a device that brings all these experiences together in a comfortable and accessible package, it will be an iPhone-like event where the market will explode. I suspect we're less than a decade away from that event.
When somebody finally gets a clue and implements that, no typist on Earth will be able to keep up with it.
That's because the communication is going from a person to a person and both are very highly tuned to not only hear the words, but the tone, context, subtext, and undertones. There can be all kinds of information packed in a few words that have nothing to do with the words.
Machines, even LLMs, can't do this. I don't think they ever will. So typing and shortcut commands and the like are far more efficient for interacting with a computer.
Laptops, of course, have the much bigger screen and keyboard, not really replicated by smartphones. They have use cases that smartphones can’t cover well for hardware reasons. So they’ve stuck around (in a notably diminished form).
If good AR glasses become a thing… I dunno, they could easily replace monitors generally, right? Then a laptop just becomes a keyboard. That’s a hardware function that seems necessary.
What niche is left for the smartphone?
I believe that was the entire point of the comparison. Smartphones replaced SOME use cases of laptops, in the same way ubiquitous smart glasses could replace SOME use cases of smartphones.
If you are afraid of technology, Android or iPadOS is light-years ahead of Windows or macOS.
It's more than enough to handle paying bills, applying for jobs, etc. Hell, with a Bluetooth keyboard, a bit of grit, and GitHub Codespaces you can even develop applications.
You can also cast your screen to a TV or, on a handful of phones, use USB-C to HDMI.
But I also don’t think either of us is gaining anything through this interaction, so... shrug
The post I was responding to clearly meant “like smartphones replaced […] laptops,” which is to say, they don’t think AR glasses will replace smartphones (because smartphones didn’t completely replace laptops). I get that.
Then I pointed out that smartphones did more-or-less replace a number of other electronic devices. And there are some reasons they didn’t fully replace laptops. Then I went on to think about the niches that could exist should AR glasses become a major thing.
I actually read it as "it will be a replacement in some ways, but also very much not be a replacement in many other ways"
It is hard to say when the peak of laptops in circulation was, right? Because simultaneously the tech has been maturing (longer product lifetimes) and smartphones have taken some laptop niches.
I’m not even clear on what we’re measuring when we say “replace.” Every non-technical person I know has a laptop, but uses it on maybe a weekly basis (instead of daily, for smartphones).
BTW, I have to consciously turn off my cybersecurity mindset when thinking about smart glasses. It's hard not to see all the new attack vectors they introduce.
I wear my Ray-Ban Metas a lot (bought in 2023) and love them, but I can't take selfies with them. I have to pull out my phone. They are complementary to the phone, though I do enjoy not needing my phone on me now to take pics and vids and to ask for the time (add 5G and they will do even more, like stream music).
Whatever OpenAI is working on to replace the iPhone, it will need to be able to take selfies! I'm betting it's just an AI phone with the experience of the movie Her, where almost everything is done from the lock screen and it takes the best selfies of you (gets you to the best lighting) and everything under the sun.
Sounds like a value proposition for society, to me!
Me (an old millennial), I cannot even conceive of getting any real work done just on a smartphone. But I'm a power user. I need to log onto Linux servers and administer them. Or I need to crack open Excel files and use spreadsheets. Not an ordinary user.
You only really need one for doing some types of work.
Laptops and tablets replaced desktops. Nobody sits down in an office and does work on a smartphone.
Smartphones replaced phones, pagers, music players and cameras.
10 years ago all my non-tech friends and family had laptops. Now they all use their smartphones as primary computing devices. My nephew who just graduated from high school and works in IT doesn’t even own a personal laptop.
Also, mini PCs are a new trend nowadays. I wouldn't say that this is the direction things are going anymore.
I'd also say a mini PC is still a "performant desktop" in a smaller form factor, which is probably a reaction to gaming desktops becoming unnecessarily large and unwieldy. Similar to how importing Japanese kei trucks has become popular now that American pickups are sized for vanity and not work practicality.
Mini PCs have the same problem as laptops: trying to squeeze performance into a small form factor, which poses a heat dissipation issue (even more so here, because the power adapter is typically inside the case, which adds heat). And you cannot put the same high-end gaming GPU in a small form factor, which is an extra characteristic they share with laptops versus larger desktops.
Also, Macs are fine with gaming if the game actually runs. It would not be my primary choice or suggestion if looking specifically for a gaming machine, but I also don't need to look for another machine for gaming now that I have a MacBook anyway, as it runs games well enough at high-end settings. But a mini PC would not have been a suggestion as a gaming machine either.
Smart glasses will probably do the same to smartphones.
Things are rarely completely replaced, at least not quickly.
Now that we have USB-C monitors, phones have USB-C, and high-end phones have CPU performance similar to low-end desktop CPUs (A18 vs Intel 14100), we could actually start replacing laptops with phones for some use cases.
I would be glad to only have to take an external monitor to use with my phone while traveling, but there is little I can do that way, and iPhones are not very user-friendly in this regard.
Once there is an actual usable in-glasses screen, I will agree.
A few years ago I tried someone's smart glasses with a screen. They basically had similar functionality to my first Fitbit: they would show texts, notifications, caller ID.
I really want one of those and went looking, but couldn't find it.
Shameless plug: We build an open source OS for glasses that works with them. AugmentOS.org
Even Realities G1 are the best HUD glasses on the market right now. They’re the first pair (with prescription) that I can wear all-day without pain, and without looking like a dork.
My team used to main the Vuzix Z100 glasses, starting with the Vuzix Ultralite reference design that predated them. We won’t touch them these days (and recently stopped selling them on our store).
Others… Meizu StarV Air 2 and INMO GO2: both lack public SDK, GO2 is too heavy. Brilliant Labs Frame: cool prototyping toy, awful glasses.
For “AI glasses” that have a camera but no display:
You have the Ray-Bans and a number of companies making these. The only one I can recommend is our upcoming Mentra Live (https://mentra.glass). It has the same camera sensor as the Ray-Bans, but it runs open source software & has an SDK.
For more sci-fi glasses that run Android and have display + camera, see the INMO Air 3, TCL RayNeo X3. These are too heavy to be worn as regular glasses, but are fun prototyping tools.
All these companies will exist in 2026. As for a 5 year horizon, I’d place my bets on Even Realities, and Vuzix (as a waveguide supplier, not consumer HW). Meizu and TCL will stick around as Chinese megacorps, though I’m 50/50 they will continue developing consumer smart glasses. Brilliant Labs is cooked unless they turn things around with their next pair of glasses.
Google & Android XR: I don’t expect their glasses to be competitive for at least a few HW generations. In terms of public information, we know they’re monocular and heavy (>45g), which is an immediate killer for the majority of users.
Meta, Microsoft, Apple, etc. are far more likely to snitch on you to the government you actually live under.
I'm not the gp, but for me, there are several bigger concerns:
First, the possibility that access could be leveraged for intelligence gathering or industrial espionage. The goal might be geopolitics, but I still don't want my data to be fodder, nor do I want to have to explain to my employer that I'm responsible for their breach.
Second, the possibility of becoming collateral damage during an escalation of hostilities between my country and China. If I've grown dependent on a device, I face significant disruption if they block cloud services or even outright remotely brick it. The war in Ukraine demonstrated this isn't limited to the other country's exports, but those are still at the greatest risk.
So yeah, a company snitching on me to big brother I live under is just one threat I have to consider when giving access to all my data.
It certainly doesn’t compare, but not the way you think.
Do you have any experience with their progressives? The ones I'm trying are so lousy that I'm going to try multifocal contacts next week. According to the order form, their progressive lenses seem somewhat decent.
I sure as f* hope not. I already struggle with my cellphone addiction and all of its constant distractions and assaults on my attention span, the last thing I want is something from one of the largest advertising companies on the planet glued to my face.
It's sort of like blaming the obesity epidemic on lack of willpower. Yes, any individual is responsible for himself. At the same time, companies have found better and more ingenious ways to addict lots of us using food. When I look back at pictures from the 1950s and see that nearly everyone is skinny/normal weight, am I just supposed to think that they had so much more willpower than today's people?
We'll need to overhaul the concept of limited liability before we do that though, the thought of someone being left without their eyes because a company goes bankrupt and no-one is at fault is pretty horrifying.
Ray Ban does it for their Meta glasses, but Lensology can handle stronger prescription lenses.
Ehh, there is nothing special about the lenses; all the magic is in the frame, and the Ray-Ban and Oakley frames look very similar to their standard versions. Getting new lenses for sunglasses is very common.
Have you never had prescription sunglasses?
The frame will probably change slightly over time to make them incompatible.
I just sent an old pair of glasses to eyeglasses.com for new lenses. I never considered this to be a big deal.
This is probably true.
The rest of your comment is probably not true for most people.
It just depends on how strong your prescription is and how willing the shop/website is to do special orders.
If you are almost blind, then your choice of lenses/frames will be much more limited than if you are only slightly blind (most people). Any reputable eyeglass/optician shop should be able to make custom lenses for pretty much any frame. They can't always do the super sleek shades that some people like to wear.
I see this world all the time at the beach; lots of people wear sunglasses there.
Because it’s not going to ever be socially acceptable to just start talking to your glasses vs silently typing on a phone in most public places/situations.
2. How confident would you have been about predicting the smartphone’s effects on society today back in, say, 1995?
With that said, I don’t think these can replace phones until they’re quite a lot smaller and lighter. And to make it worse, you’d need at least two pairs - regular and sun. Possibly three if you’re someone who regularly uses safety glasses.
Meta says they will open it up though.
That said, I'm not sure I'd want smart glasses. Being stuck on a computer for work, I try to take some time every day to be completely free of digital things. It's hard enough to do that with a smartphone in my pocket vying for my attention. I imagine it would be only harder with smart glasses over my eyeballs.
Those things on glasses and I ditch my phone immediately.
I also don't understand how they're used to locate items around the house. Is there some sort of GPS? Or do you mean it helps by virtue of seeing (e.g. prescription)?
AR glasses will be a hit, no doubt, but I don't see what's so special about glasses with a mic, camera and speaker on them. Seems especially for an older person that it would be more useful getting a phone with a screen and pointing at things and seeing it on a display.
A phone you have to hold in your hand, whereas glasses you don't. Therefore glasses are superior for these use cases.
I’m very curious what this person did before these glasses were released.
The glasses have a camera, and small speakers near your ears. They also have a microphone, so you can give them voice commands. Like Amazon Alexa, but in the glasses.
Hey Meta, read the text on this label and tell me what it says.
Hey Meta, do you see the keys on the counter?
Hey Meta, can you tell me what is in front of me?
It projects the sound into your ear.
And mobile phones aren’t going anywhere because mobile computing has peaked: there are no use cases that require a device with a different form factor, it’s just a matter of lifestyle preference.
If we’re abandoning screen based devices, I’d rather have a small 2000s style flip phone with all the latest tech and LLM features built in, than something like glasses, which clash too much with fashion choices. Bonus if the battery life is insane.
It tends to wear on the bridge of the nose after a while. And I'm sure these e-glasses are going to be heavier than normal glasses with a battery and electronics in addition to the normal things glasses have.
I couldn't fathom whether I would use these things myself (at least not now, because I'm OK with my smartphone and don't really want to get a Meta account), but this definitely changes my perspective a little.
I returned them because I didn't like forcing a camera into everyone's face unannounced, and the photos it took weren't very good (vertical pics cut off most of my field of view, and a weird choice of focal length). Maybe with two cameras they could have a wide angle and a telephoto, but the Ray-Bans at least just had the one.
I think they've done it; this is Meta's iPod.
I would love to try these types of devices but there is no way I'd ever give money to Meta or put my personal information into their systems or encourage my friends and family to do so either.
Hopefully Meta puts in a bunch of R&D to see what works in this space and then someone else (Apple?) just copies it.
Meta running the show is a non-starter.
At least in regard to Palantir you understand their business. Meta masquerades, hides, and covers up their shady practices behind consumer-friendly products.
Toxic lollipops labeled properly as toxic vs toxic lollipops labeled with a tiny * that requires consumer research. Which one do you think most people will reach for first?
I'm yet to be convinced these are useful and not just another way to inject ads directly into eyeballs.
How are these "smart glasses" legal in places like Germany where you (supposedly) can't even have a dashcam?
It's actually quite good, but it took them two-ish years to get it into production.
But only for research, not on these glasses.
I’m personally more excited about the Mentra Live glasses, which are fully programmable with AugmentOS.
When I was testing my voice AI app with them there were no major issues: https://apps.apple.com/app/apple-store/id6737482921?pt=12710...
In a way, that's similar to how you can't change your iPhone/AirPods to stop responding to "Hey Siri" and trigger ChatGPT instead. So I still label my take as weird )
I would argue, though, that having integrated access to AI that can react to what the user is seeing is a form of digitally augmented reality.
I discovered goodr recently and they are great: $25 high-quality sunglasses that I can actually trust to have real UV ratings. Seeing people wear Ray-Bans or Oakleys is really funny.
EDIT: ok apparently anywhere else than the poorest of countries, too, really.
The difference is that Oakley makes sports glasses, which means that Meta can now start sponsoring sports events, which they couldn't do with Ray-Ban.
What's interesting is that these glasses look normal, and not like the standard dipshit magnets that Oakley normally caters for.
I won't be buying it though - I tried talking to Meta AI in voice mode from my phone, and its response to anything STEM-related is basically "tee hee I'm just an AI, I can't do that". What I currently have assigned to my phone's AI hardware button is Microsoft's. I assume it's an OpenAI model, but it lets you speed the voice up, which I value greatly.
A:
With Oakley Meta's glasses, you can:
Capture high-quality video and photos hands-free with a built-in ultra-wide 12 MP camera.
Listen to music, podcasts, and more through Bluetooth speakers seamlessly integrated into the frames.
Make and take phone calls hands-free.
Live-stream your adventures, travels, or daily life.
Use Meta AI for instant information and assistance – just say "Hey Meta".
----
If you don't need the camera then just use a smartwatch that does much more. Maybe get a camera wrapped to your forehead instead.
Didn't realise there were that many social media pickup artists :-)
Titanium glasses are lightweight because a very minimal amount of material is used. This is possible for regular glasses because you can make them with a ~1mm cross-section. When you want to put electronics inside of them, you need much more material.
I think the reason titanium glasses feel nice is primarily because they have minimal contact with your skin and very low mass.
My frames weigh 6 grams; with lenses they're 14 grams. The Meta Ray-Bans are 50 grams. If you could make the frames from pure helium they still wouldn't feel close.
I assume the poster above imagined that the inside could just be voids, like a tiny aircraft. But yes, some kind of low-density filler could also add some stability in areas where you don't want solid metal but also don't have electronics or battery "cargo".
Edit: Just checked, it does exist.
Between the weight of material and the electronics, I don't really see anything approaching the feel that someone that discerning would want.
I mean, the material is nice, but you're not making it lightweight that way.
I guess the problem is whether you can extrude and form something so small with the precision and metallurgical properties you want to maintain. You probably don't want to just cast it in the final shape, right?
I believe that if you worked at Meta when the glasses first came out, there was also a limited fully transparent frame.
What makes these FB glasses any different / special? Do they automagically obscure the views when in compromising positions?
/sarc
[1] - https://en.wikipedia.org/wiki/GI_glasses
[2] - https://www.urbandictionary.com/define.php?term=glassholes
It's kinda weird to me that this feels so dystopian.
Before smartphones it's not like I was sitting there appreciating the world around me. I was just bored and unhappy. Now I'm paying my bills, watching funny videos, looking up interesting things about something I heard about earlier in the day, etc.
But still, there's something off-putting about a group of strangers doing something mundane like waiting for a bus and staring at their phones the whole time.
I'm not sure what I would gain by sitting in a doctor's office waiting for my appointment and being bored vs. sitting there with my phone reading/watching something.
The problem (in my opinion) comes when you start replacing everything you do with just scrolling on your phone. Like if I'm bored at home I just scroll TikTok instead of playing a sport or learning a new skill.
This is exactly why, no matter how much of a breakthrough Meta's wearables are, they will never reach their full market potential: the Meta brand will always be known as a privacy nightmare. Apple is going to win here if they can get the price right.
VR, I think, most people can rationalize, since those headsets are used in very controlled settings, like at home on the couch or in the living room - but for Meta to convince people to use this in their everyday life... hard sell.
I think the biggest hurdles to widespread adoption are ease of connectivity (getting pics and videos “out” of the glasses and onto a phone/laptop/cloud), ultra-high-def images, the size of on-device storage, and battery life. Those are tough, given the form factor. But, if cracked, these will be huge.
Google Glass remembers…
In Switzerland we have a nice law that forbids filming people without their consent. And the law is actually enforced here; fines are juicy, and repeat offenses are punished harshly. I personally wouldn't outright punch a person wearing such glasses in the face, but I would ask them to stop wearing them around me and my kids, and where it goes from there depends on them.
Also, why is the design of those so weird? They look like kids' glasses for 2 bucks, and in all the photos they don't seem to sit well on the face, Zuck included.
Quick to justify violence for someone that opened with "the government will protect me with fines after the fact". Just noticing.
This escalated fast.
In my mind it was inevitable that we would reach this point. The novel Snow Crash predicted this exact phenomenon (along with the Metaverse, of course, which is where they are trying to drive this). It’s the same with companies issuing cryptocurrencies and the like.
We aren’t totally locked in to the techno-feudal state just yet, but we’re getting there. Pretty fascinating how foreseeable these last many years and decades have been.
It's a twisted kind of foresight, a lot like prophecies in ancient stories, which both predict the future and steer their subjects toward it. Neal Stephenson wrote books that appealed to (many, but not only) nerdy boys during their formative years. Lo and behold, 30 years later some of those boys have achieved wealth and power, and are trying to make random aspects of those books real.
Keep that in mind if you ever write a YA novel...
I get the sentiment, but I really enjoyed the Oakley Flak series; the fit was superb. BUT I hated the feeling that it was perceived as a statement rather than utility. I bought them originally for the high field of vision and for blocking glare when riding bikes.
I lost a couple of pairs and one got scratched really badly - and I don’t have a nice alternative in mind that fits a reasonable budget. *Am open to suggestions; I need ONE nice pair of polarized sunglasses that’ll last.
Oh for sure, they were quite good, there was a real reason they became popular. But like you said, it got to be annoying that wearing Oakleys was a statement. (Plus, at least where I lived, they were part of the "douche uniform.")
In terms of recommendations, I'm still getting lots of mileage out of my Ray-Ban Wayfarers, and they're polarized.
There are cameras everywhere at this point: every doorbell/garage, every store, every light on the street, even my friend's pet/baby monitors when I visit. I hate it.
I was visiting a museum yesterday and someone was using these to livestream/record their own (bad) tour. Security stopped someone doing this with their phone earlier, but had no idea what this guy was doing.
I asked if those were cameras; he said yes. I asked if he was recording; he said no. I told him that in any case I find it very off-putting to have cameras in my face and that I was going to go. Shook his hand and that was it.
I feel like that’s the right way to handle it. I’m sure I’ll need to keep doing it.
We already have enough mass surveillance devices. But I suppose two arguments could be made: (1) we don't need more of these, or (2) peak surveillance has already been achieved, and adding one more doesn't make any difference.
You really need young people to carry tech like this, and needing them to wear millennial fashion from 10 years ago just so the camera and compute can fit will make that much harder.
Heavy frames and large lenses tend to compensate for larger noses, and other facial issues (although they won't come out and say that). Clear glasses can really focus on the eyes.
I know a couple of women that have made large, heavy-rimmed glasses into a real fashion statement.
Not sure why theverge gets linked so much here.
But at least the last paragraph seems to be adding something, although the rest of the article is indeed just a re-hash of the press-release.
> Meta recently signed a multi-year deal with EssilorLuxottica, the parent company behind Ray-Ban, Oakley, and other eyewear brands. The Meta Ray-Bans have sold over two million pairs to date, and EssilorLuxottica recently disclosed that it plans to sell 10 million smart glasses with Meta annually by 2026. “This is our first step into the performance category,” Alex Himel, Meta’s head of wearables, tells me. “There’s more to come.”
Always-on microphone and camera sold by one of the world's sketchiest privacy invaders? Check.
Display that actually takes advantage of the glasses form factor? Nope. Sounds like this could just as easily be the Humane pin.
People were so angry in 2013.
I don't know about everyone, but I found it pretty hard to use. Caveat: I didn't get them fitted to me. I was supervising an intern working on a speculative Glass project, and they were fitted to him.
AR would be neat, but voice interfaces are achievable at an approachable cost. I'm not one to talk to a computer, and I wear prescription lenses, so these glasses don't appeal to me, but I can see there's a market there; not sure how big, or whether Meta can capture it.
Mic and speakers, too.
Glass attempted a display, but IMHO, it was unusable, so I understand why you would try the same thing with no display. Or the same thing, but mounted on your wrist (Google Wear).
The camera may or may not be always on, but it can be turned on by software activated by the always-on mic (again, demonstrated by the promo video), so it would be best to treat it as though it is.
[0] https://about.fb.com/news/2025/06/introducing-oakley-meta-gl...
I agree that the primary issue is that it's a software-controlled microphone with no off switch controlled by software written by Meta. I only emphasized the wake word listening in response to OP's claim that it's not always on when it must be.
> 15. "Loaded firearm" means any firearm loaded with ammunition or any firearm which is possessed by one who, at the same time, possesses a quantity of ammunition which may be used to discharge such firearm.
So I guess I'm using the New York definition of an always-on camera.
They have proven over and over and over and over again that they are absolutely not trustworthy.
At some point we have to come to grips with the fact that people like Zuck, Elon, Andreessen, and other tech monarchs are openly hostile and despise us when we ask for anything remotely resembling transparency from their companies, yet they repeatedly abuse us and openly scoff at our privacy.
The fact that we collectively don't understand the repercussions of this really is a bad sign.
I very well may have misunderstood your meaning, though. I hope so.
There have been at least five AR glasses that I can think of and this is only one that anyone really uses. So, no.
Actually, never mind, I saw this sick demo on Reddit of an AR putting assistant, but I think they had to strap a depth camera onto the device. So AR means mini-golf pro?
It's ridiculous and disappointing. I think Facebook is used to not providing real value-add to their users and thinking that mere exposure to cybernetics is enough of a sell. That's completely saturated now, though.
Yes. Haven't you seen his new Gen-Z midlife crisis haircut? Clearly he is a very cool and relatable guy.