In this case the output wasn't actually used for financial modeling. If it had been, the error would have been caught immediately, the moment someone put the figure into a table and calculated the price, the supply constraints, or anything else.
In reality copper is just convenient. We use it because it's easy to work with, a great conductor, and (until recently) quite affordable. But for most applications there's no reason we couldn't use something else!
For example, a 1.5mm2 copper conductor is 0.0134kg/m, which at current prices is $0.17 / meter. A 2.4mm2 aluminum conductor has the same resistance, weighs 0.0065kg/m, which at current prices is $0.0195 / meter!
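A quick way to sanity-check those numbers (a minimal Python sketch; the material constants are handbook values, and the per-kg prices are just the ones implied by the figures above, not live quotes):

```python
# Back-of-envelope copper vs. aluminum cost per meter.
RHO_CU, RHO_AL = 1.68e-8, 2.65e-8   # resistivity, ohm*m (handbook values)
DENS_CU, DENS_AL = 8960, 2700       # density, kg/m^3
PRICE_CU, PRICE_AL = 12.7, 3.0      # assumed $/kg, implied by the figures above

def per_meter(area_mm2, density, price_per_kg):
    """Mass (kg) and cost ($) of one meter of conductor."""
    mass = area_mm2 * 1e-6 * density      # cross-section * 1 m length
    return mass, mass * price_per_kg

# Aluminum cross-section with the same resistance as 1.5 mm^2 copper:
al_area = 1.5 * RHO_AL / RHO_CU           # ~2.4 mm^2

print(per_meter(1.5, DENS_CU, PRICE_CU))      # ~(0.0134 kg, $0.17)
print(per_meter(al_area, DENS_AL, PRICE_AL))  # ~(0.0064 kg, $0.019)
```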
Sure, aluminum is a pain to work with, but with a price premium like that there's a massive incentive to find a way to make it work.
Copper can't get too expensive simply due to power demands, because people will just switch to aluminum. The power grid itself has been using aluminum for decades, after all; some internal datacenter busbars should be doable as well.
Residential aluminum is a Really Bad Idea because DIY Dave will inevitably do something wrong - which then leads to a fire hazard. Copper is a lot more forgiving.
But a large scale datacenter, solar farm, or battery storage installation? Those will be installed and maintained by trained electricians, which means they actually know what a "torque wrench" is, and how to deal with scary words like "corrosion" and "oxidation".
Like I said: it's what's used for most of the power grid. With the right training it really isn't a big deal.
For commercial installs, it shouldn't be a problem as long as it's planned for.
That said, there is no reason we can't design better connectors that can withstand the expansion and shrinkage cycles, like spring loaded or spring cage connectors.
Aluminum bus bars (solid, often exposed) would be designed for the required power levels and installation criteria.
The biggest reason is that aluminum oxidizes, and unlike copper, the oxide layer has high resistivity. In theory that shouldn’t be an issue in datacenters hiring expert technicians.
Aluminum has a higher resistivity, which means a conductor of the same diameter will run hotter than copper. Make the cable thicker and its resistance drops, which means it runs cooler.
Want more amps at the same temperature? Ohm's law still applies: just use a thicker cable.
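To put rough numbers on that (an illustrative Python sketch, not a sizing calculation; the heating scales as I²R with R = ρL/A):

```python
# How dissipation per meter falls as the conductor gets thicker, at fixed current.
RHO_AL = 2.65e-8                     # aluminum resistivity, ohm*m

def watts_per_meter(current_a, area_mm2, rho=RHO_AL):
    r_per_m = rho / (area_mm2 * 1e-6)    # ohms per meter of conductor
    return current_a ** 2 * r_per_m      # I^2 * R heating

for area in (2.4, 4.8, 9.6):             # doubling the cross-section each step
    print(f"{area} mm^2: {watts_per_meter(20, area):.1f} W/m")
# 2.4 mm^2: 4.4 W/m, 4.8 mm^2: 2.2 W/m, 9.6 mm^2: 1.1 W/m at 20 A
```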
Look at the aluminum-wiring electrical fires of the 1960s and 1970s as an example, and that was at household levels of current.
Aluminum is used, but everything accounts for the insane coefficient of linear expansion and other annoying properties.
Each feeder can be aluminum if you put the special goop (oxide-inhibiting compound) on any copper connections. Breakers accept it just fine, etc.
You should avoid it for smaller wiring, though. There's special 8000-series aluminum alloy if you're serious about Al feeders.
> "Tat sounds like the ultimate catalyst for the commodities market and copper has been hitting records."
"Tat" should be "That", imo.
The history is quite interesting and well worth checking out.
I can't recommend a book on the subject, but I do heartily recommend "Longitude", which is about the challenges of inventing the first maritime chronometers for the purpose of accurately measuring longitude.
It's not the most aesthetic one, but at the time it was the one that could be measured most accurately.
https://developer.nvidia.com/blog/nvidia-800-v-hvdc-architec...
Quickly doing such "back of an envelope" calculations, and calling out things that seem outlandish, could be a useful function of an AI assistant.
Sure, using or not using your brain makes a negligible energy difference, so if you aren't using it you really should, for energy efficiency's sake. But I don't think the claim that our brains are more energy efficient is obviously true on its own. The issue is more about induced demand from having all this external "thinking" capacity at your fingertips.
I agree with your point about induced demand. The “win” wouldn’t be looking at a single press release with already-suspect numbers, but rather looking at essentially all press releases of note, a task that generally isn't valuable enough to devote people to.
That being said, we normally consider it progress when we can use mechanical or electrical energy to replace or augment human work.
Also, while a body itself uses only 100W, a normal urban lifestyle uses a few thousand watts for heat, light, cooking, and transportation.
Add to that the tier-n dependencies of this urban lifestyle: massive supply chains sprawling across the planet, with thousands upon thousands of people and goods involved in making your morning coffee happen.
And that's ignoring sources like food from agriculture, including the food we feed our food.
To be fair, AI servers also use a lot more energy than their raw power demand if we use the same metrics. But after accounting for everything, an American and an 8xH100 server might end up in about the same ballpark.
Which is not meant as an argument for replacing Americans with AI servers, but it puts AI power demand into context.
1: https://www.nature.com/articles/s41598-024-54271-x?fromPaywa...
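For scale, a crude order-of-magnitude comparison (both figures below are my own rough assumptions: ~300 GJ/year of primary energy per American, and a fully loaded 8xH100 box drawing on the order of 10 kW):

```python
# Order-of-magnitude only; not a lifecycle analysis.
SECONDS_PER_YEAR = 365 * 24 * 3600

US_PRIMARY_ENERGY_GJ_PER_CAPITA = 300          # rough assumption, GJ/year
avg_watts_per_american = US_PRIMARY_ENERGY_GJ_PER_CAPITA * 1e9 / SECONDS_PER_YEAR

server_watts = 10_000                          # assumed 8xH100 server, order of magnitude

print(round(avg_watts_per_american))           # ~9500 W
print(server_watts)                            # 10000 W -> same ballpark
```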
Whether you're talking weight or bulk, a decimal place is roughly the difference between needing a wheelbarrow, a truck, a semi truck, a freight train, or a ship.
Even among engineering fields, those that routinely handle diverse and messy unit systems (e.g. chemical engineering) are relatively uncommon. If you work in one of these domains, there is a practiced discipline for detecting unit conversion mistakes. You can do it in your head well enough to notice when something seems off, but it requires encyclopedic knowledge that the average person is unlikely to have.
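If you want to offload part of that discipline to the machine, a units-aware library catches the grosser mistakes; a minimal sketch (assuming Python with the pint package):

```python
import pint

ureg = pint.UnitRegistry()

# Carry units through the arithmetic instead of tracking them mentally.
mass_per_meter = 0.0134 * ureg.kilogram / ureg.meter
run_length = 2.0 * ureg.kilometer

total_mass = (mass_per_meter * run_length).to(ureg.kilogram)
print(total_mass)                              # 26.8 kilogram

# A dimensionally nonsensical conversion fails loudly instead of passing through.
try:
    (mass_per_meter * run_length).to(ureg.watt)
except pint.DimensionalityError as err:
    print("unit error:", err)
```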
A common form of this is a press release that suggests a prototype process can scale up to solve some planetary problem. In many cases you can quickly estimate that planetary scale would require some part of the upstream inputs to be orders of magnitude larger than exists or is feasible. The media doesn't notice this part and runs with the "save the planet" story.
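A toy example of that kind of estimate (every prototype number here is hypothetical; the ~37 Gt/year figure is the commonly cited global CO2 emissions total):

```python
# Hypothetical pilot plant vs. a "solve 10% of global emissions" headline.
GLOBAL_CO2_TONNES_PER_YEAR = 37e9     # approx. global emissions

pilot_capture_tonnes = 1_000          # hypothetical prototype, tonnes/year
claimed_share = 0.10                  # headline claim: offset 10% of emissions

plants_needed = GLOBAL_CO2_TONNES_PER_YEAR * claimed_share / pilot_capture_tonnes
print(f"{plants_needed:,.0f} plants needed")   # 3,700,000 plants -> not happening
```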
This is the industrial chemistry version of the "in mice" press releases in medicine. It is an analogue to the Gell-Mann amnesia effect.
Source: we benchmark this sort of stuff at my company and for the past year or so frontier models with a modest reasoning budget typically succeed at arithmetic problems (except for multiplication/division problems with many decimal places, which this isn't).
ChatGPT 5.2 has recently been churning through unsolved Erdős problems.
I think right now one is partially validated by a pro, and the other one I know of is "AI-solved" but not yet verified. As in: we're the ones who can't quite keep up.
https://arxiv.org/abs/2601.07421
And the only reason they can't count Rs is that we don't show them the Rs in the first place, thanks to a performance optimization (tokenization).
Those of us who don't base our technical understanding on memes are well aware that the tooling at the disposal of all modern reasoning models gives them the capability to do such things.
Please don’t bring the culture war here.