Ref: https://www.amazon.com/Extension-Extender-0-65ft-Thunderbolt...
USB-C gets rid of all the stupid previous decisions on physical connectors (orientation required but not obvious, fragile clips, too large, too small). The physical side of things is now settled, and hopefully all devices, chargers and outlets will converge on USB-C.
Yes, getting the right cable can make a difference, but the situation is so much better than before, partly because phone manufacturers were forced by the EU to adopt one connector early on. I'm so glad Apple's proprietary connector is gone.
Apple made Lightning when the rest of the world was still mucking about with Micro-USB, which I would argue is just about the worst connector ever in common use. The only type of cable where I routinely kept a half dozen on hand because they failed so damn often.
I do like USB-C, but despite being superior on paper, physically it's not as robust as Lightning, and it's definitely more finicky. It has more capability though, which is important.
I have a compulsion for fixing things, so I've seen a lot of gadgets where a connector has been broken away from a circuit board due to repetitive stress on a plug. The most common have been audio plugs -- headphone jacks in cellphones, and some connectors in musical instrument gear. I'd much prefer to replace a $5 cable than an expensive phone or gadget.
But of course it's arguable that they made it too delicate.
Now that I'm on my soap box... I've also seen a lot of damaged cables where the breakage is in the wire just as it exits one of the plugs. And a common cause is the habit of coiling your cables neatly by wrapping them as tightly as possible. Since I mentioned musical gear, I'm a working musician, and I cringe when I see how people -- even engineers -- treat cables. I always advise people to watch one or two of the ubiquitous videos where some burly roadie shows the proper way of coiling and handling a cable. I'm a bassist, and I have cables that have lasted 20+ years.
I've had two devices where the MicroUSB socket has broken off the PCB. Not a huge amount considering I've probably had tens of devices with MicroUSB power over the years but a truly inconvenient amount given the impossibility of a home fix (for most people.)
Now I use those magnetic-plug cables and just leave the MicroUSB ends in whatever I might need to charge to avoid the physical stress.
Not sure if it's the connector or the build quality, but want to throw in the opposite experience.
- Pulling on the cable to unplug it; instead, pull on the solid connector at the end.
- Bending the cable right at the connector: resting the phone upright on the cable and connector when plugged in (e.g. in a car cup holder), or stretching the cable so far that it bends at the connector while plugged in.
There was a recent HN post about cable abuse which said that coiling too tight doesn't itself damage cables (I'll add that I don't like how it gives the cable a memory, so it always wants to recoil itself), but I think coiling too tightly incidentally puts more stress where the cable joins the connector.
Had every Apple device that used Lightning and consequently have had a veritable smorgasbord of cables from official to Poundland to weird keyring ones; never had a single one fail.
Then again, I've not had a MicroUSB or USB-C cable fail on me either (without obvious physical damage like the one I half-melted by injudicious aiming of a blowtorch.)
I settled on buying packs of 3rd party braided cables for myself and parents so we could switch them out more easily.
With MagSafe, I rarely use a cable at all anymore!
But to be fair I've also had many issues with Lightning. A few shorted out and became unusable and burnt on one side. And those were 100% original bought in the Apple store, as were the 5W chargers and iPhone this happened with.
Knockoffs were generally terrible and might stop working. A "genuine" cable bought from a big retailer turned out to be a knockoff once after a software update, resulting in annoying popups from Apple. And some knockoffs were so bad they didn't stay in.
Even certified MFi ones from Belkin somehow felt different, like the tolerances were slightly off. Those worked, though.
Overall, I think it's had a good run and was underrated as a connector physically, but on the whole I like USB-C and its more open ecosystem more.
Lightning females are, for all intents and purposes, basically eternal, even if they do feel a bit looser toward the end of the device's lifetime, e.g. from letting water into the connector itself.
I don't know that it's any worse with USB-C, as all the USB-C devices I own are far more sheltered from everything.
USB-C is still welcome though, because all the other connector types are barely pluggable even into a compatible device.
Apart from that, though, it was proprietary, which is awful for lots of reasons; that's the main reason I'm happy to see it gone.
Now, HDMI, on the other hand... yeesh
50% failure is an admirable and lofty bar that all electrical connectors should strive to meet.
Lightning is so awesome and universal that Apple has never even bothered fitting it to a pedestrian device like a computer, and has reserved it for only their most very-exclusive, high-tech devices (like the portable telephones and mice that were once available at astutely prestigious retail locations such as Wal-Mart).
Seriously, this Lightning connector is like the best Kool Aid ever. It's a shame that they stopped making it; it could have been everywhere, if only it had more time in a truly free market.
12 glorious years was clearly not enough time. It deserved so much more.
And by the time you revise the pinout, you effectively have a different connector. Lightning was nice-ish to plug in, but the wear component was in the expensive device, not the cheap cable, and pairing that with the shit data transfer rate makes it a terrible connector.
I'm stuck putting wire labels on every USB-C cable I own. I can't tell the difference between a 3A and 5A cable otherwise; same for USB 2.0-only cables vs 3.1 vs 3.2 vs 4, whatever the fuck.
USB-C has allowed me to grab one decent two-port charging brick, two solid 6ft cables, and charge just about everything I own just by keeping those in my backpack. If I think I'll need to move any data fast, etc., I just throw my one good USB4 cable in my bag, too.
I will admit, though, that I've had some crappy situations at work where it turned out my flaky monitor setup was due to the stupid work-provided docks coming with cables that only supported 10Gbps. Better labeling would've solved those ones.
The Steam Deck forced me to finally pay attention to the USB-C ecosystem, and I can only imagine how lost some non-tech people must get with mysteriously bad or slow charging.
I find it crazy that Apple went back to MagSafe in the M4 (maybe earlier, but that's the machine I have at work). But at least you can still charge over USB-C.
There are some weird active cables but the vast majority of USB cables you'd buy today just need a speed rating and a note of whether they're 60 or 240 watts.
this is 100% Claude-generated, and without citations I'd be very careful about trusting it. I wonder why whoever prompted this into existence would not include actual references and sources of information.
disclaimer: me -> everyday CC user, so trust me, this thing loves to spit nonsense.
I don't particularly care if it's right or not but this is ...weird. Especially from Rands.
I can't parse what the idea is here, like, what's being communicated and why. The "minimal writing" version says too little, the "throw everything and the kitchen sink version" says too much. And enough of both is slop (meaning, unneeded) that it's hard to orient yourself and find a guidepost, if there is one.
And I love using AI, and my reading comprehension scores have never been below 99.9%. Idk why I'm even sharing that. It's just, it's not me, it's not some battle I'm fighting, it really is a real problem, not just "oh it's Claude", it's bad writing in an alien way from an author I've always loved.
EDIT: After my 11th minute and 4th read on this, it has become clear to me that the idea is, you don't want to use the cable that comes with your iPhone for general USB data transmission because it is slow. The noise in the short version is USB IF, 5gbps, MacBook Neo.
The Twitter link in the footer of your website is wrong, Mr. 0.1 percentile.
Are you okay?
(why am I asking? it's in the header, the link works, aggro interaction)
Not sure what value someone generating slop like this thinks they are adding but I think it’ll become a strong social stigma to generate articles and people will later be very embarrassed by all this slop.
USB-C is in fact completely fine in normal use, and cheap cables are about the only problem with it.
I'd very much rather not have a new connector shape every time the technology improves and devices and cables gain new capabilities. The benefit of where USB-C is at, is the new stuff is backwards compatible with previous generations. The complaints in the early years - about one connector, unpredictable capabilities - were wrong. It took time for this benefit to accrue.
Also all the version numbers and brand names have been confusing, but the bandwidth is just a single number that goes up each generation and covers most of the issues now. There are just a few edge cases this doesn't cover these days.
In this way, I would be able to see (using the advanced, integrated bionic vision system that I've carried with me and used every day I've been alive) what it is that I have before me instead of plugging them in one at a time to some electronic oracle to try to discern the details of the invisible magic inside.
And yet, this requirement already misses the other thing it should state: its power rating. Even two cables with the same bandwidth can have widely different power ratings, and thus different powering capacity or charging speed for different devices.
Powering capacity sometimes matters, but are there any devices out there where the charging speed would be meaningfully different? As in, they use significantly more than 60 watts to charge? (I looked up some of those super fast charging phones and they don't seem to be following the USB standards in the first place.)
Any device which can charge at 100W or more? Like lots of laptops, as well as my e-bike batteries?
Soldering irons can benefit from being able to peak above 60W when heating.
A dainty little USB-powered Pinecil v2 can peak at ~126W with appropriate firmware and an EPR 28v PD 3.1 power supply. It's an impressive feat. :) (And, yes, it requires a USB cable that is e-marked for 240W before this is allowed to happen.)
That said: 28v EPR is a bit unusual. A more typical configuration runs on 20v USB PD at no more than ~64W, like a cheap, genuine [safe], used 65w Lenovo laptop charger cheerfully provides.
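To make those numbers concrete: the power levels fall straight out of P = V * I over the fixed PD voltage rails. A quick sketch in Python (the rail list reflects my understanding of PD 3.0/3.1; treat it as illustrative):

    # Back-of-the-envelope USB PD power levels: P = V * I.
    # SPR rails (PD 3.0) top out at 20 V; EPR (PD 3.1) adds 28/36/48 V.
    # 5 A requires an e-marked cable; the common baseline is 3 A.
    SPR_RAILS_V = [5, 9, 15, 20]
    EPR_RAILS_V = [28, 36, 48]
    MAX_CURRENT_A = 5.0  # with an e-marked 5 A cable

    for volts in SPR_RAILS_V + EPR_RAILS_V:
        print(f"{volts:>2} V x {MAX_CURRENT_A:.0f} A = {volts * MAX_CURRENT_A:.0f} W")

    # The Pinecil's ~126 W peak is consistent with the 28 V EPR rail:
    print(f"126 W / 28 V = {126 / 28:.2f} A")  # ~4.5 A, under the 5 A cap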
Also, RJ45 is terribly fragile if you keep plugging and unplugging it, eventually that latch will break. And copper can barely support 10G and is terribly power hungry when it does that. And the cables get thick and inflexible.
Nah, there's enough space for an RJ45 connector on the 0.48" thick E7270, so there's certainly enough space for one on the 0.61" Macbook Pro 14. The trick is putting the connector on the display hinge.
Laptops no longer come with ethernet ports because (a) wifi is good enough for most people, most of the time; (b) apple went USB-C-only in ~2018 and other 'premium laptops' copied it; and (c) by the time that trend reversed and laptops started re-adding hdmi and usb a ports, demand for ethernet connectors was lower than ever.
Every single time someone has provided pre-terminated cabling for one of my jobs to "save time" or to "make it easier", this provision has done neither.
Instead, it has consistently multiplied both the time required and the installation difficulty. It has done these things while also producing an inferior end result.
It is my anecdotal observation that it's NFG.
https://www.lenovo.com/us/en/p/accessories-and-software/cabl...
The adapter still has to adapt. That requires power, adds cost, and adds negligible-but-non-zero latency. I don't love the proprietary port being proprietary, but the fact remains it is native Ethernet with no caveats.
AFAIK, Thunderbolt cables are also copper, so what trickery do they use to support USB4-80? I believe both connectors use differential-pair wires for signalling.
Devices plugged into an Ethernet network are true peers, but USB is master-slave by necessity. Ethernet devices have unique addresses, but USB devices can be anonymous, only identified based on the port they're plugged into. Ethernet is best-effort with buffering and packet dropping, but USB provides guaranteed delivery with tightly bounded latency. Ethernet signals must travel up to 100 meters but USB requires the host and device to be within a few meters. You could reuse the physical wires, maybe (we already do! USB runs on twisted-pair) but nothing else, from the connector to the topology, is usable.
I have yet to see a laptop that outputs PoE. Though it would be tremendously useful for provisioning IP cameras, there are dedicated thick-tablet-shaped devices for that, which do source PoE from their batteries.
Some can even give and receive power and look the same as others that can't!
You already have a high throughput data cable between your PC and monitor. Carry USB over displayport or whatever. At least then you can use more than one PC on an entire city block.
"The plug on this device represents the latest thinking of the electrical industry's Plug Mutation Group, which, in a continuing effort to prevent consumers from causing hazardous electrical current to flow through their appliances, developed the Three-Pronged Plug, then the Plug Where One Prong is Bigger Than the Other. Your device is equiped with the revolutionary new Plug Whose Prongs Consist of Six Small Religious Figurines Made of Chocolate. DO NOT TRY TO PLUG IT IN! Lay it gently on the floor near an outlet, but out of direct sunlight, and clean it weekly with a damp handkerchief."
On the downside, it has highlighted what a cowboy industry manufacturing USB-C cables is.
Power capacity is relatively easy to measure ad-hoc via voltage drop from one end to the other...USB-PD controllers already do this and can even fine-tune the voltage to make sure that if the device receiving (sinking) power needs 20V they'll send 20.4V or 20.9V to compensate for voltage drop so that the charging device gets 20V on its end.
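A minimal sketch of that compensation, anchored to the 20.4 V / 20.9 V figures above (the cable resistances are my back-calculation, roughly 80 and 180 milliohms round trip at 5 A):

    # IR-drop compensation: the source raises its output so the sink
    # still sees the negotiated voltage after resistive losses in the cable.
    def source_setpoint(v_target: float, current_a: float, r_cable_ohm: float) -> float:
        """Voltage the source must output so the sink sees v_target."""
        return v_target + current_a * r_cable_ohm

    print(source_setpoint(20.0, 5.0, 0.08))  # 20.4 V for an ~80 milliohm cable
    print(source_setpoint(20.0, 5.0, 0.18))  # 20.9 V for a lossier ~180 milliohm cable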
But actual maximum data throughput is hard to know. The only way to really "know" how much data can flow through a cable is with an expensive oscilloscope or cable tester, because 80Gbps cables run at ~13GHz, so at minimum you need a 26GHz scope (Nyquist–Shannon sampling theorem), or more practically a 52GHz scope. And it turns out it's really expensive to measure electrical signals 52 billion times per second. The necessary devices start at $15,000 on the very low end (a cable signal-integrity tester) [0], and those only work for up to 10Gbps USB 3.2 cables; a proper 60GHz oscilloscope for 80Gbps USB4 cables runs past $270,000 [1].
On the high end, a signal-integrity test setup can actually cost $1-2 million [2]: the base unit starts at $670,000, plus additional spending on hardware-accelerated analysis, specialized active probes, and the specific PAM-3 / USB4 compliance software packages.
0: https://www.totalphase.com/products/advanced-cable-tester-v2...
1: https://www.edn.com/12-bit-oscilloscope-operates-up-to-65-gh...
2: https://www.eevblog.com/forum/testgear/uxr1104a-infiniium-ux...
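Spelling out the sampling arithmetic from above (the ~25.6 GBaud per-lane PAM-3 symbol rate is my assumption for 80Gbps USB4; the rest is just Nyquist):

    # USB4 Gen 4: two lanes of PAM-3 at ~25.6 GBaud each give 80 Gbps total.
    symbol_rate_gbaud = 25.6
    fundamental_ghz = symbol_rate_gbaud / 2  # ~12.8 GHz, the "~13GHz" above
    nyquist_min_ghz = 2 * fundamental_ghz    # ~26 GHz: bare Nyquist minimum
    practical_ghz = 2 * nyquist_min_ghz      # ~52 GHz: what you'd actually want

    print(f"fundamental ~{fundamental_ghz:.1f} GHz, "
          f"minimum scope ~{nyquist_min_ghz:.0f} GHz, "
          f"practical scope ~{practical_ghz:.0f} GHz")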
The interface IC almost certainly also estimates signal quality, but it's likely hard to get that information out of it.
If only they all did. I have a significant percentage in my pile with no e-Marker chip. They'll be the first to be culled once I get around to that, mind.
If a USB4 device can output a USB4 stream and the receiver can check that stream for errors, isn’t that sufficient?
It could be reasonable to let computers trigger a data throughput test, where the peripheral states "I support up to 40Gbps of receiving/sending" and then sends a simple pattern that can be generated on the fly. But a lot of devices can't receive/send that 80Gbps of data for long enough to perform a decent test: the storage, RAM, buffers, etc. get depleted or act as bottlenecks.
If you know enough to accurately interpret the measurements you get from that, you know enough to write your own computer program to try to send 80Gbps from one computer to another and use DMA to process it in real-time without hitting storage (which a lot of peripherals likely don't have the CPU to accomplish).
If you don't know enough to write those test applications, you probably don't know enough to interpret the results of a built-in test function and the measurements would confuse and frustrate a lot of well-meaning, nerdy, but under-educated consumers who make assumptions about why they're not actually getting the rated speed.
Idk, my opinion doesn't go one way or the other here. Perhaps I myself don't quite know enough to be a good judge of that concept.
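As a toy version of that idea, here's roughly what the software side looks like over an IP link (e.g. IP-over-Thunderbolt between two machines; the peer address is hypothetical, and this measures the whole path, CPU and network stack included, not just the cable):

    import socket
    import time

    CHUNK = b"\x5a" * (1 << 20)  # 1 MiB pattern, generated on the fly
    TOTAL = 8 << 30              # 8 GiB, enough to ride out buffering

    def send_pattern(host: str, port: int) -> float:
        """Stream TOTAL bytes of pattern and return achieved Gbps."""
        with socket.create_connection((host, port)) as sock:
            sent = 0
            start = time.perf_counter()
            while sent < TOTAL:
                sock.sendall(CHUNK)
                sent += len(CHUNK)
            elapsed = time.perf_counter() - start
        return sent * 8 / elapsed / 1e9

    # Hypothetical peer draining the stream, e.g. `nc -l 9000 > /dev/null`:
    # print(f"{send_pattern('10.0.0.2', 9000):.1f} Gbps")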
All an end user cares about is if the cable is the bottleneck, if you think you have known-good devices. If I have a MacBook and a good NVMe enclosure, I want to know if my cable is fast enough, rather than have it quietly fall back to 3.2 or worse.
Your information is out of date. You can buy 240W chargers from Framework which I assume are just rebranded Delta chargers:
https://frame.work/products/power-adapter-240w
The Framework 16 supports this 240W charging input, as well.
This is because the cross-sectional area of the conductor would create an inflexible cable, and even then the connector (even though rated) could never handle a sustained 240W in the real world.
Fires. Fires everywhere... this is why no 240W chip exists.
src: electrician
USB-IF certifies plenty of USB cables as tested safe for 240W, and 240W is delivered at 48V with the same 5A maximum as 100W cables, so the conductors don't need to get any thicker. The reason 240W chargers are so rare is cost and a chicken-and-egg problem: there's not really any demand for them yet.
Idealized, sure it'll work. But any realworld ports will be arc/fire hazards (e.g. after corrosion, wear, damage).
This was on Show HN only yesterday.
Probably can't tell you anything about the other end of the cable though.
> Is this hard to do or just something normal people never care about?
If I believed in conspiracies I'd say the USB consortium, or mafia, or whatever it's called, is pressuring software developers not to display that info. Otherwise they'd have "normal people" with torches and pitchforks at their door.
There’s a reason that Windows barely shows any errors until the system fully halts.
The problem with most of those is that either users don't care until it's too late ("I need to get this done now, I'll delete files later"), third party applications are the cause and Windows can't/shouldn't interfere (did a program memory leak or is the user pushing the boundaries of what the system can handle?), or because there's not much the user can do about it ("your GPU driver crashed", well gee, my drivers are up to date, let me spend half a month's wages on a new GPU then, shall we?).
The only "too late" errors I've seen on Windows are when something very important has crashed and the system needs to shut down for data integrity (crss.exe crashing on school computers comes to mind, though I doubt that was the fault of Microsoft), or when something unpredictable went wrong, like a file ending up corrupt because of a failing hard drive or flipped bit in memory.
Microsoft actually created a dedicated screen to monitor errors and failures of all kinds (https://www.elevenforum.com/t/view-reliability-history-in-wi...) that's been around since Vista. It used to open up automatically if you clicked a popup after certain errors, but it appears Microsoft eventually stopped doing that. Going by how many "today I learned" posts I find when I look up the feature, I'm guessing nobody who actually understands what the screen does ever used the feature.
For many if not most applications, 5V/1A power plus 480Mbps USB 2.0 data is supported on every or almost every USB cable and device, and exceeds requirements. USB-C being ubiquitous and capable of both makes it the most consolidated/universal power + data standard I have experienced in my life. It's also a small connector that's easy to plug in.
There are exceptions: charging your laptop or phone benefits from higher current, and external drives or other mass data transfer benefit from high speed. But looking at my electronic devices (computer peripherals and otherwise), most are fine with USB-C for power and data, not coming close to the limits of either.
USB-C ports aren't allowed to provide power until after configuration, but a lot of USB-C chargers provide 5V regardless. This is wrong, but it does mean you can use a dumb C-to-micro cable which doesn't include the necessary electronics. (A pull-down resistor at least.)
And of course there's no way to tell by the looks of the cable.
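For the curious, the "necessary electronics" is mostly just that one resistor. A rough sketch of what a compliant source checks before turning on VBUS (resistor values from the Type-C spec; the detection window is approximate):

    # Rough model of a source deciding whether to enable VBUS.
    # The source pulls CC up through Rp; a sink's mandatory Rd pull-down
    # drags CC into a recognizable voltage window.
    RP_OHM = 56_000  # default-current source pull-up to 5 V
    RD_OHM = 5_100   # sink pull-down the spec requires

    def cc_voltage(rp: float = RP_OHM, rd: float = RD_OHM, vdd: float = 5.0) -> float:
        return vdd * rd / (rp + rd)  # simple voltage divider

    v = cc_voltage()
    print(f"CC = {v:.2f} V")  # ~0.42 V with Rd present

    # Below ~0.2 V looks like Ra (an active/powered cable); above ~0.66 V
    # means nothing attached. Only inside the window does a compliant
    # source turn VBUS on, which is exactly the check a resistor-less
    # C-to-micro cable fails.
    enable_vbus = 0.2 < v < 0.66
    print("VBUS on" if enable_vbus else "VBUS off")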
The guy in the shop plugged it in to a USB-A port via a cheap A-to-C cable, and the mouse immediately came to life. Of course. I felt like an idiot.
I didn't get a faulty unit. Whoever designed the mouse treated the USB-C port like a newer Micro-USB port: the mouse just expected plain 5V to show up. They clearly didn't bother testing it with a proper USB-C charger.
I returned it anyway and got a mouse that wasn't broken.
Absolutely baffling, but it only happened to me for brands where I should've figured.
As a hardware engineer among other things, that was one of the first things I learned about interfacing with USB C. How do so many consumer devices keep getting this wrong in the year of our lord 2026?
I understand the technical reasons behind it, but in this case the actual expectation is to be able to use USB-C to charge other gadgets.
If you had a device that wanted 12V input on a USB-C port without negotiation (these products exist, and are dangerous because they come with chargers that just output 12V without any negotiation at all…), whose fault is it? The vendor who chooses to ignore the clearly defined spec to save a few cents and risks damaging devices, or the vendor who follows spec and prevents damaging random devices?
Neither side is wrong per se, though it's quite annoying that Apple didn't implement PPS. Then again, if you're buying Apple, you should probably expect these kinds of shenanigans and be ready to buy dedicated peripherals.
At work, our quick test for if a device implements USB PD correctly is to plug it into an Apple power supply (optionally with a PD protocol sniffer in line). If it doesn't work (either no/intermittent VBUS or the wrong VBUS), it's always been the case that the device is doing something wrong.
It can be annoying but strictly speaking their fault.
Apple, somewhat famously, build their power adapters incredibly well.
If they’re not charging something my default assumption will be: that thing doesn’t support PD.
I've been much happier since switching to Anker chargers; they work much better with my Lenovo and are drastically more portable than the Apple ones. They also fit into sockets close to the ground or the desk, where the Apple brick won't go without a bulky extension cable.
A bit of snark, but don't forget the Apple charger recall:
https://support.apple.com/ac-wallplug-adapter
(That said, I do think Apple's chargers were designed far better than most, and I loved that they put so much design thought into the world travel kit. Anker doesn't have the interchangeable heads, but it turns out their chargers are multi-region and a simple adapter head does the job just as well, in a smaller form factor than the Apple bricks. I still somewhat miss Magsafe as well, Magsafe 1 was excellent.)
That said, the only weirdness I've experienced is a device that came with a USB C-to-A cable but would not take power from a C-to-C one.
> The lie.
> The gap.
> The names.
> The age.
> The trap.
> The buy.
> The truth.
> The chain.
> The lunacy.
> The cheat sheet.
Fucking LLMs have literally ruined the word "the" for me.
Also, I encourage people not to change their writing style just to avoid patterns that AI likes to use. I'm going to keep using my em dashes.
The discussion here is much more interesting IMO.
https://web.archive.org/web/20170918052437/http://www.jerkci...
Wow he was totally replaced, how weird, here's another example and they totally changed the strip:
https://web.archive.org/web/20170918052444/http://www.jerkci... https://bonequest.com/712
"The lie, the age, the gap, the trap, the names, the buy, the ..."
I really don't come to HN to read stuff like this, and HN has been full of it for months. Please let us flag it and filter it out.
We appear to have taken a good idea and made it shit very quickly.
If the USB forum enforced their specifications, everyone would be complaining that their cables are now ten times the price, and people would still buy knock-off cables.
Same goes for chargers: I bought a 100W charger that stops delivering 100W once it overheats, about half an hour into a session. I could spend twice as much on a charger that sustains the output, but at that price I probably wouldn't have bought a charger at all.
USB-C would either be branded a bullshit expensive standard (like Apple's Thunderbolt cables are generally regarded) or an incomplete standard that gives manufacturers too much leeway.
I, for one, am quite happy that I can just buy a USB-C charger now rather than spend 180 euros on an OEM replacement, even if I occasionally need to throw a cable into the "garbage that came with an accessory" bin.
What? The USB mafia has been at it since USB 1.1, or at best 2.0...
I mean, it's dumb to charge a phone with it, since you don't need 80Gbps capability, but it'll fit your requirement of not being confusing :P
So USB 1.1 was 12Mbps (theoretical). USB 2.0 was 480Mbps (theoretical)... kind of. It got complicated because a distinction was made between USB 2.0 Full Speed and USB 2.0 Hi-Speed. "Full" Speed was just USB 1.1 (12Mbps); USB 2.0 Hi-Speed was the 480Mbps. I assume they didn't want to confuse consumers who might wonder whether they could plug USB 1.1 and 2.0 together, but they just created more confusion. Nikon famously labeled Full Speed devices as USB 2.0, as just one example.
So the version number is useless to consumers and should never be used.
This got a whole lot worse with USB 3.0+ because more capabilities got added to the standard but not all cables supported them so you could look at a cable and have no idea what it could do. Capabilities include:
- Data. This started at 5Gbps for SuperSpeed but has gone higher with subsequent versions.
- Power (max wattage varied)
- USB Alt Mode (DP, HDMI or TB over USB-C)
So how do you capture at least 5 capabilities of a cable? You can't make a cable do everything. That's prohibitively expensive and also massively limits cable length.
Whatever the case, saying things like "USB 3.2 Gen 2" was not the answer.
Which just gives two properties to care about: data rate and power. I can't remember a USB plug that didn't have the space to add 2 numbers / 8 characters.
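A hypothetical sketch of what that 8-character label could look like (not any official USB-IF marking scheme):

    def cable_label(gbps: int, watts: int) -> str:
        """Two numbers, one tiny string: the properties buyers care about."""
        return f"{gbps}G {watts}W"

    print(cable_label(40, 240))  # '40G 240W' -> exactly 8 characters
    print(cable_label(10, 60))   # '10G 60W'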