If an entire nation trips offline then every generator station disconnects itself from the grid and the grid itself snaps apart into islands. To bring it back you have to disconnect consumer loads and then re-energize a small set of plants that have dedicated black start capability. Thermal plants require energy to start up and renewables require external sources of inertia for frequency stabilization, so this usually means turning on a small diesel generator that produces enough power to bootstrap a bigger generator, and so on, until there's enough electricity to start the plant itself. With that plant back online, its power can be used to re-energize other plants that lack black start capability, in a chain, until you have a series of isolated islands. Those islands then have to be synchronized and reconnected, whilst simultaneously bringing load online in large blocks.
The whole thing is planned for, but you can't really rehearse for it. During a black start the grid is highly unstable. If something goes wrong then it can trip out again during the restart, sending you back to the beginning. It's especially likely if the original blackout caused undetected equipment damage, or if it was caused by such damage.
In the UK contingency planning assumes a black start could take up to 72 hours, although if things go well it would be faster. It's one reason it's a good idea to always have some cash at home.
Edit: There's a press release about a 2016 black start drill in Spain/Portugal here: https://www.ree.es/en/press-office/press-release/2016/11/spa...
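If it helps to see the shape of it, here's a toy sketch of that sequencing in Python. Every plant name and number is made up, and it ignores everything that makes the real thing hard (frequency, voltage, reactive power, protection); it's only meant to show the "crank a small thing to crank a bigger thing, then add load in blocks" structure.

    # Toy black start sequencing: each plant needs "cranking power" before it
    # can come online and add its own capacity; load is restored in blocks.
    plants = [
        {"name": "diesel (black start unit)", "cranking_mw": 0,  "output_mw": 5},
        {"name": "hydro",                     "cranking_mw": 2,  "output_mw": 300},
        {"name": "gas turbine",               "cranking_mw": 20, "output_mw": 400},
        {"name": "coal / steam",              "cranking_mw": 60, "output_mw": 800},
    ]
    load_blocks_mw = [50, 150, 300, 500]   # consumer load, reconnected in large blocks

    gen, load = 0.0, 0.0
    for p in plants:
        if p["cranking_mw"] > gen - load:
            raise RuntimeError(f"not enough spare power to crank {p['name']}")
        gen += p["output_mw"]
        print(f"{p['name']} online: {gen:.0f} MW generation, {load:.0f} MW load")
        # bring consumer load back only while generation keeps comfortable headroom
        while load_blocks_mw and load + load_blocks_mw[0] < 0.8 * gen:
            load += load_blocks_mw.pop(0)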
A full grid black start is orders of magnitude more complex. You’re not just reviving one machine — you’re trying to bring back entire islands of infrastructure, synchronize them perfectly, and pray nothing trips out along the way. Watching a rig wake up is impressive. Restarting a whole country’s grid is heroic.
The words "it's a miracle it works at all" routinely popped up in those conversations, which is... something you don't want to hear about any sort of power generation - especially not nuclear - but it's true. It's a system basically built to produce "common accidents". It's amazing that it doesn't on a regular basis.
Funny thing is, those are the exact words I use when talking to people about networking. And realistically anytime I dig deep into the underlying details of any big enough system I walk away with that impression. At scale, I think any system is less “controlled and planned precision” and more “harnessed chaos with a lot of resiliency to the unpredictability of that chaos”
Components aren’t reliable. The whole thing might be duct tape and popsicle sticks. But the trick for SRE work is to create stability from unreliable components by isolating and routing around failures.
It’s part of what made chaos engineering so effective. From randomly slowing down disk/network speed to unplugging server racks to making entire datacenters go dark - you deliberately introduce all sorts of crazy failure modes to break things and make sure the system remains stable.
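For anyone who hasn't seen it in practice, the core of it is embarrassingly small. A rough sketch of the idea (not any particular chaos tool): wrap a dependency so it randomly slows down or fails, then check the caller still keeps its promises.

    import functools, random, time

    def chaos(p_fail=0.05, max_delay_s=0.5):
        """Decorator that injects random latency and occasional failures."""
        def wrap(fn):
            @functools.wraps(fn)
            def inner(*args, **kwargs):
                time.sleep(random.uniform(0, max_delay_s))   # simulated slow disk/network
                if random.random() < p_fail:
                    raise ConnectionError("chaos: injected failure")
                return fn(*args, **kwargs)
            return inner
        return wrap

    @chaos(p_fail=0.1)
    def fetch_user(user_id):
        return {"id": user_id, "name": "example"}

    def fetch_user_resilient(user_id, retries=3):
        # the system under test: it has to survive the injected chaos
        for _ in range(retries):
            try:
                return fetch_user(user_id)
            except ConnectionError:
                continue
        return {"id": user_id, "name": "fallback"}

    print(fetch_user_resilient(42))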
Seek only to understand it well enough to harness the chaos for more subtle useful purpose, for from chaos comes all the beauty and life in the universe.
The synchronisation of a power grid ... Wow.
Or the U.S. financial system. Or civilization in general.
The reason people work together is fundamentally the same reason you go to work - self interest. You're rarely there because you genuinely believe in the mission or product - mostly you just want to get paid and then go do your own thing. And that's basically the gears of society in a nutshell. But you need the intelligence to understand the bigger picture of things.
For instance Chimps have intricate little societies that at their peak have reached upwards of 200 chimps. They even wage war over them and in efforts to expand them or control their territory. This [1] war was something that revolutionized our understanding of primate behavior, which had been excessively idealized beforehand. But they lack the intelligence to understand how to bring their little societies up in scale.
They understand full well how to kill the other tribe and "integrate" their females, but they never think to e.g. enslave the males, let alone higher order forms of expansion with vassalage, negotiated treaties, and so on. All of which over time trend towards where we are today, where it turns out giving somebody a little slice of your pie and letting him otherwise roam free is way more effective than just trying to dominate him.
Citation needed on that one.
> Consider that Sparta and Athens were separated by only 130 miles, yet couldn't possibly have been further apart!
They spoke the same language, shared the same literature, practiced the same religion, had a long history of diplomatic ties. When the Persians razed Athens, they took refuge with the Spartans.
> For instance Chimps have intricate little societies that at their peak have reached upwards of 200 chimps.
Again, I don't think this claim stands to evidence. The so called chimp war you mention is about a group of about a dozen and a huge fight that broke out among them. That doesn't support the idea that they are capable of 200-strong 'intricate' groupings.
Put another way, you're arguing against an example and not a fundamental premise. Proving the example is correct doesn't really get us anywhere since presumably you disagree with the fundamental premise.
That sounds very much like "Just believe me." or even more "The rules were that you guys weren’t gonna fact-check"
> I have no idea what you're trying to argue.
Presumably you know what you are trying to argue. That is what the questions were about.
> Proving the example is correct doesn't really get us anywhere
You would have solid foundations to build your premise from. That is what it would get us.
First we check the bricks (the individual facts), then we check if they were correctly built into a wall (do the arguments add up? are the conclusions supported by the reasoning and the facts?). And then we marvel at the beautiful edifice you have built from it (the premise). Going the other way around is ass-backwards.
> you're not really formulating any argument or contrary view yourself.
I don't know what viewpoint namaria has. I know that "Sparta and Athens [..] couldn't possibly have been further apart" is ahistorical. They were very similar in many regards. If you think they were that different you have watched too many modern retellings, instead of reading actual history books. That's my contrary view.
> For instance Chimps have intricate little societies that at their peak have reached upwards of 200 chimps.
Here the question is what do we believe to be "societies". The researchers indeed documented hundreds of chimps visiting the same human made feeding station. Is that a society now? I don't think so, but maybe you think otherwise. What makes the Chimps' behaviour a society as opposed to just a bunch of chimps at the same place?
I'd much rather focus on "prepping" by building social resiliency, instead. The local community I'm plugged into is much stronger together than anything I could possibly build individually.
Computer networking is not the same. Our networks will not explode. I will grant you that they can be shite if not designed properly, but then they just run slowly or not at all; they will not combust or explode.
If you get the basics right for ethernet then it works rather well as a massive network. You could describe it as an internetwork.
Basically, keep your layer 2 domains to around 200-odd devices per VLAN maximum - that works fine for IPv4. You might have to tune MAC tables for IPv6 for obvious reasons.
Your fancier switches will have some funky memory for tables mapping one address to another, e.g. MAC to IP and VLAN and the like. That memory will be shared with other databases too, perhaps iSCSI, so you have to decide how to manage that lot.
EVPN uses BGP to advertise MAC addresses in VXLAN networks, which solves looping without magic packets (no spanning tree or flood-and-learn), scales better, and is easier to introspect.
And we didn't even get into the provider side which has been using MPLS for decades.
A problem with high-bandwidth networking over fiber is that light can take multiple paths of different lengths through the core (modal dispersion), so some of it arrives later than the rest; if the symbol window is too short and the spreading too large, you start dropping packets.
So hopefully someone doesn't bend your 100G fiber too much - if that isn't finicky I don't know what is. DAC cables with twinax solve it for cheaper at short range, however.
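Back-of-envelope for the path-length effect, assuming worst-case step-index multimode fibre and 25 Gb/s per lane; real 100G multimode links use graded-index OM3/OM4, which shrinks the spread by a couple of orders of magnitude (hence reaches of roughly 70-100 m), and long runs use single-mode fibre where modal spread doesn't apply at all:

    # Modal delay spread vs. bit period, step-index multimode as the worst case.
    n1 = 1.48              # core refractive index
    delta = 0.01           # fractional index difference (n1 - n2) / n1
    c = 3.0e8              # speed of light in vacuum, m/s

    spread_per_m = n1 * delta / c        # ~49 ps of arrival-time spread per metre
    bit_period = 1 / 25e9                # 40 ps per bit at 25 Gb/s per lane

    print(f"spread: {spread_per_m * 1e12:.0f} ps/m, bit period: {bit_period * 1e12:.0f} ps")
    print(f"spread equals one bit after ~{bit_period / spread_per_m:.1f} m of fibre")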
What’s your source?
Perhaps the safest assumption is that system reliability ultimately depends on quite a lot of factors that are not purely about careful engineering.
I wonder, however, how being part of the "continental Europe synchronous grid" affects this, and how the outage stays isolated to Portugal and Spain like this.
But yeah, there are a lot of capacitors that want juice on startup, which happily kills any attempt to restore power. My father had "a lot" of PA speakers at home, and when we tripped the 3680W breaker (16A at 230V) we had to kill some gear to get it back up again. I'm also very sure we had 230V because I lived close to the company I worked for and we ran small-scale DC operations, so I could monitor input voltage and frequency over SNMP; through work I had "perfect amateur" monitoring of our local grid. Just for fun I got notifications if the frequency dropped more than 0.1 Hz, and it happened, but rarely. Hardly ever above, though, since that's calibrated over time, like Google handles NTP leap seconds.
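If anyone wants to replicate the "perfect amateur" setup, the polling side is only a few lines. A sketch using the net-snmp CLI; the OID below is a placeholder because the real one depends entirely on the rectifier/UPS vendor's MIB, and the tenths-of-Hz scaling is likewise an assumption you'd check against your device.

    import subprocess, time

    HOST = "192.0.2.10"                      # example address for the rectifier/UPS
    COMMUNITY = "public"
    FREQ_OID = "1.3.6.1.4.1.99999.1.2.3"     # placeholder: vendor-specific "input frequency" OID
    NOMINAL_HZ = 50.0

    def read_frequency():
        out = subprocess.check_output(
            ["snmpget", "-v2c", "-c", COMMUNITY, "-Oqv", HOST, FREQ_OID], text=True)
        return float(out.strip()) / 10.0     # assume the device reports tenths of Hz

    while True:
        hz = read_frequency()
        if abs(hz - NOMINAL_HZ) > 0.1:
            print(f"ALERT: grid frequency {hz:.2f} Hz")   # swap for mail/pager of choice
        time.sleep(10)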
I love infrastructure
I realized the tech must have been winding up a flywheel, and then the pilot engaged a clutch to dump the flywheel's inertia into the engine.
The engineer in me loves the simplicity and low tech approach - a ground cart isn't needed nor is a battery charger (and batteries don't work in the cold). Perfect for a battlefield airplane.
---
I saw an exhibit of an Me-262 jet fighter engine. Looking closely at the nacelle, which was cut away a bit, I noticed it enclosed a tiny piston engine. I inferred that engine was used to start the jet engine turning. It even had a pull-start handle on it! Again, no ground cart needed.
---
I was reading about the MiG-15. American fighters used a pump to supply pressurized oxygen to the pilot. The MiG-15 just used a pressurized tank of air. It provided only for a limited time at altitude, but since the MiG-15 drank fuel like a drunkard, that was enough time anyway. Of course, if the ground crew forgot to pressurize it, the pilot was in trouble.
Again, simple and effective.
point of trivia: Messerschmitt, yes, but Bf-109, produced by Bayerische Flugzeugwerke.
you don't want to get your flugzeug works confused
BTW, since we are Birds of a Feather, I bet you'd like the movie "The Blue Max". It's really hard to find on bluray, but worth it! The flying sequences are first rate, and no cgi.
Similarly, the US Navy maintains banks of pressurized air flasks to air-start emergency diesels. Total Capacity being some multiple of the required single-start capacity
Random fact: Those starters are a plot point in the 1965 film The Flight of the Phoenix, where the protagonists are trying to start a plane that’s stranded in the Sahara, but only have a small supply of starter cartridges left.
Is that what Dr. Sattler is doing in this scene from Jurassic Park?
Nice attention to detail by the filmmakers.
There will be costs/losses by the various power companies which weren’t generating during all this of course, but also fixing this is by definition outside of their control (the grid operators are the ones responsible).
I’m sure public backlash will cause some changes of course. But the same situation in Texas didn’t result in the meaningful changes one would expect.
That’s because there is no effective regulation of the state’s power industry. Since they’re (mostly) isolated from the national grid, they aren’t required to listen to FERC, who told them repeatedly that they should winterize their power plants. And at the state level, the regulators are all chosen by the Governor, who receives huge contributions from the energy industry, so he’s in no rush to force them to pay for improvements.
The real irony was the following summer during a heatwave, when they also experienced blackouts. Texas energy: not designed for extreme cold, not designed for extreme heat. Genius!
I miss the food in Texas, but that’s about it.
Small diesels could be an option but they're harder to pull start for a given size.
I once needed to jump-start a small marine diesel, many miles from land...
There was a small lever that cuts compression. You have to get it spinning really fast before restoring compression! It's definitely a lot of work!
EDIT - Here is a cheap modern small marine diesel [1]. The operation manual suggests that you don't have to do anything to get it spinning quickly, you just have to crank it 10 times, put away the crank handle, and then flip the compression switch. That's progress!
[1] https://www.yanmar.com/marine/product/engines/1gm10-marine-d...
Cranks and decompression levers have been gone for at least 30-40 years now tho.
They're my kryptonite, but I accept it's mostly my ignorance.
Air compressors have more valves and gaskets that are vulnerable to oxidation, especially in salty environments, so I'd have thought that, of the two, the two-stroke would be easier to keep up.
Having good, fresh fuel on an oil rig. They need an engine that can run on crude.
It’s not the type of thing that's economically feasible to use directly, even in emergency situations.
Maybe there are other concerns for an oil rig.
The hand-pumped air compressor is the tool of last resort. You can try an engine start if there's someone there who's able to pump it. You don't have to worry about how much charge is left in your batteries or whether or not the gasoline for the 2-stroke pump engine has gone stale. It's the tool that you use as an alternative to "well, the batteries are dead too, guess we're not going to start the engine tonight... let's call the helicopters and abandon ship"
Could the batteries be dead and the generators not start? I guess but it's very unlikely. I get that on an oil rig it might be a matter of life and death and you need some kind of manual way to bootstrap but there's not much that's more reliable than a 12V lead-acid battery and a diesel engine in good condition.
I think I'd take Lithium Ion batteries over lead acid for almost every conceivable use-case. They are superior in almost every way. Lighter, less likely to leak acid everywhere, better long term storage (due to a low self-discharge) and better cold weather discharge performance. The only drawback would be a slightly increased risk of fire with Lithium.
In a real black start, the guys might very well grab a portable generator and just use that instead. But having the option to hand crank something rather than rely on batteries that might run flat is good.
That tends to be for very large engines, where the extra plumbing isn’t a problem.
And then phase will align itself a couple times a minute so what's difficult about that part?
Most vessels will experience a blackout periodically and the emergency generator start fine, normally on electric or stored air start, and then the main generators will come up fine. It's really not delicate, complex or tricky - some vessels have black outs happen very often, and those that don't will test it periodically. There will also be a procedure to do it manually should automation fail.
There are air starters on some emergency generators that need hand pumping. These will also get tested periodically.
The most complex situation during black out restoration would be manual synchronisation of generators but this is nothing compared to a black start.
A rare but sobering opportunity to reflect on something we usually take for granted: electricity.
We live in societies where everything depends on the grid — from logistics and healthcare to communications and financial systems. And yet, public awareness of the infrastructure behind it is shockingly low. We tend to notice the power grid only when it breaks.
We’ve neglected it for decades. In many regions, burying power lines is dismissed as “too expensive.” But compare that cost to the consequences of grid collapse in extreme weather, cyberattacks, or even solar storms — the stakes are existential. High-impact, low-frequency events are easy to ignore until they’re not.
That's 20 years without any significant problems in the grid, apart from small localized outages.
It's not hard to start taking things for granted if it works perfectly for 20 years.
Many people don't even have cash anymore, either in their wallet or at home. In case of a longer power outage a significant part of the population might not even be able to buy food for days.
Even if you have cash many shops would not sell anything in case of a mass outage because registers are just clients which depend on a cloud to register a transaction. Not reliable but cheap when it works.
The real question is how long can some of the smaller banks' datacenters stay up.
Lest we also forget the CrowdStrike drama, where many supermarkets simply went dark, in some instances for nearly 24 hours, despite working communication links. But I digress.
But both major supermarkets nearby worked on diesel generators and payment by card worked flawlessly. I guess they had satellite connection.
It might have been more complicated in small villages, but people living in rural areas usually still use a lot of cash.
Literally true. However:
- If it takes them 10 minutes to fire up the generator, then 5 minutes to restart the network and registers, that is no big issue (in a many-hour outage)
- At least in my part of the USA, many supermarkets do have generators - because storm damage causes local outages relatively often, and they'd lose a lot of money if they couldn't keep their freezers and refrigerators powered. Since the power requirements of the lighting and registers are just (compared to the cooling equipment) a rounding error, those are also on generators.
However, I assume this can work offline with the data being uploaded later, as basically all the small supermarkets and shops were still open here (_incredibly_ chaotic though), and in the big supermarkets card payments were working (TBF, even the free wifi was working there; I guess they probably have some satellite connection).
So, what's really interesting is that these sorts of social collapses have happened. In fact, they often happen when natural disasters strike.
When they do happen, mutual aid networks just sort of naturally spring up and capitalism ends up taking a backseat. All of a sudden, worrying about the profits of Walmart is far less important than making sure those around you don't starve.
As it turns out, most people, even managers of stores, aren't so heartless as to let huge portions of the population starve. Everyone expects "mad max" but that scenario simply hasn't played out in any natural disaster. In fact, it mostly only ends up being like that when central authority arrives and starts to try and bring "order" back.
You can read about this behavior in "A Paradise Built in Hell" [1].
Looting only ever happens when areas have started being evacuated and most shop owners + law enforcement are elsewhere.
I often wonder if we should leave energy/telecommunications in a state where they can and do fail with some degree of frequency that reminds us to have a back up plan that works.
I had thought that the (relatively) recent lockdowns had taught us how fragile our systems are, and that people need a local cache of shelf stable foods, currency, and community (who else discovered that they had neighbours during that time!)
For something like this, a local electricity generation system (solar panels, wind/water turbines, or even an ICE generator) would go a long way to ensuring people continue to have electricity for important things (freezers)
If we're talking about a situation where the grid goes down, the mobile internet is most definitely not working.
0:49 and light came back, and woke us for a moment.
So yeah, you need local first POS applications.
And whose fault is that? Why did Europe allow this?
Why will the US allow this, eventually?
In Spain it's now illegal to pay with cash for transactions over 1000EUR. Absurd.
In Norway they recently made it mandatory in most circumstances to accept cash for transactions up to 20,000kroner (~1700EUR): https://www.norges-bank.no/en/topics/notes-and-coins/the-rig...
I don't know how true the relationship between the cashless lifestyle and safety actually is, but it works and I feel ok; I'm not sure that the prospect of a few hours of national blackout once in 20 years will make me change my mind significantly.
As an added benefit, no bank knows where I bought and when, which I find is a great advantage over the alternatives. (I also use Gpay; this comes from someone who just found a good middle point without forgetting about the more reliable, physical and privacy friendly option)
I didn't mean literally zero cash, but once the bulk of your transactions are by card, you don't need to constantly go to the ATM and replenish your cash reserves
Of course I get that carrying coins and notes is cumbersome, but if we've managed to live all through the 80's and 90's with it, I think we can manage to keep doing it. 100% digital money is giving up a huge level of self-determination and privacy that I wouldn't feel comfortable with, but I guess as newer generations grow up already pre-indoctrinated and not being able to compare the before-and-after, in the end society will end up giving up.
I don't think it's just a generational divide.
I do understand the privacy and self-determination problems of a cashless society but I have to admit I'm just too weak-minded to care about that in practice; the practicality of just paying even for just coffee with my phone is just too big for me to care for it.
Not sure I understand how that's different than today? You set a time and place, then you meet there, are people doing more than that today? Seems the youngsters understand this concept as well as older people, at least from the people I tend to meet like that.
> the practicality of just paying even for just coffee with my phone is just too big for me to care for it.
Interestingly enough, no matter if you had cash or card yesterday you couldn't get a coffee anywhere, as none of the coffee machines had power and even in the fancier places where they could have made the coffee without power, they didn't have electricity for the grinder itself, so no coffee even for them.
no, today people are continuously updating you about their whereabouts and assume you can just change time and place continuously, and if you don't have the phone people get lost and panic. Ok, I'm exaggerating of course, but there is a grain of truth in this.
I once used an aggregator app to summon a handyman to my place. My request was simple: move two pieces of furniture around my very small apartment.
So I find a reputable service within the app, I schedule it, and they send a guy. He shows up to my door breathless, with some kind of sob story about a vehicle breakdown. I dismiss that out of hand and he gets to work. He did a fine job and it didn't take very long.
Then we get to the point of settling up, so I announce I'm going to pay in the app. He looks really disappointed and says he usually takes cash. I realized at this point that he was ready to shake me down, and also he would incidentally be discovering where I stashed my cash, when I reached for it with him there in the room. So disappointed. So I send the money out in the app and I show the confirmation screen to the guy. And I felt so bad that I followed up with a tip in the same fashion.
But at the end of the day he was just a garden-variety cash-in-hand scammer and I had no reason to feel guilty, because I had unwittingly outwitted him by trusting the app. And the company had no qualms about it.
Another time, I had a very short cab ride to the laundry. And it did not take long for the driver to spin a gigantic tale about his auntie who, addicted to gambling, had used up all their savings, and how they were really hurting for money. I was shifting uncomfortably, wondering why I was hearing this. So the cabbie parks the car and his POS machine shuts off. He's like "oh, it's out of order", so here he is, shaking me down and expecting me to go fetch cash to put in his grubby hands.
I stared him up and down, started taking photos, and got out of there. I discussed with dispatch. They said if he's not accepting cards and I intended to pay by card, I owe him nothing.
So again a cash-in-hand sob-story scammer was foiled. The cab service was crazy enough to assign him to pick me up additional times. This is why I ride Waymo, folks!!!
The laborer was simply trying to actually get paid vs. deal with the overhead of the app. Somewhat shady perhaps - since it routes around the company taking their cut for finding him the work, and likely avoids taxes. I've paid these sorts of guys cash every single time I've used such a service and exactly zero of them have "shook me down" or cared where I stored the money. They make so little already I'm happy to help them out with a smile.
Cabbies simply want cash for pretty much the same reason. They get charged an astronomical "service fee" by the cab company, and likely are avoiding taxes as well. I agree that such a situation is more shady in general, but I've actually had (NYC) cops side with cabbies on this topic and force me to go get cash at the ATM or get arrested. I also use car services now over cabs whenever possible due to this reason - mostly for convenience, never out of fear of being robbed though.
The chances of you getting mugged/stolen from for using cash are just the same as the chances of you getting mugged for no apparent reason walking home. Perhaps the collective dis-use of cash has reduced these odds, but your individual choice specifically is utterly irrelevant to them.
Yes, well, I choose not to participate in shady shit like that. Is that OK that I prefer to make transactions as laid out by their employer and not every random guy?
> happy to help them out with a smile
So you choose to be knowingly complicit in tax-avoidance schemes. That's fine; you do you, but some of us steer clear of shady shit, just on principle, you know? Perhaps the company deserves their cut as well -- they get paid so little already, amirite?
Also if there was nothing unusual about their choice of payment, then why must they regale me with these shitty sob stories? Am I supposed to be moved to tears at their hardship and heroism at making it to my door, that I must promptly cover their expenses? They are not panhandlers, they are service providers.
No, I ordered a service and I pay for the value of the service, according to the Company's rates. The cab company was clear about it: either I pay how I want to or I don't owe them. Nobody's arresting me for refusing to fork over cash. That's a scam.
That's not the only time I was cash-scammed by a cab driver. They will pull every trick in the book, and surely they compare and trade notes on their marks.
It gets even worse: my simple insistence on transacting with the cab company earned me fake receipts. Yes, they faked every receipt that they sent me in email. The totals were all fudged down to be much smaller than what I paid, including a $0 tip. It was very very obvious, especially when the rides booked in the app were generating duplicates showing different calculations. I reported it twice to their backend developers and they said that there were some coding errors in device drivers; please stand by for a fix. LOL!
That's a scam to ensure that taxpayers can't get reimbursed for out-of-pocket medical expenses. Most/all cab companies provide NEMT services as well, and they can't stand when people go outside of insurance companies. So they falsified my receipts.
And again, that's why I trust Waymo.
You can do as many electronic transactions as you wish without internet or electricity, provided you have something with a charged battery. The problem is that the transactions cannot be verified without internet, but when internet gets restored, all transactions can be applied.
That technology exists for more than a decade, so banks will implement it in 20 or 50 years. Most sane people will not wait patiently for half a century till some software engineer implements electronic transactions with COBOL, and we will use some kind of blockchain much sooner than that.
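For what it's worth, the store-and-forward part really is simple. A toy illustration (emphatically not a real payment protocol; real cards use chip-level asymmetric crypto, floor limits and fraud rules): the terminal authenticates and queues transactions locally while offline, then ships them for verification once connectivity returns.

    import hashlib, hmac, json, time

    TERMINAL_KEY = b"terminal-secret-issued-by-the-bank"   # stand-in for real card/terminal crypto
    offline_queue = []

    def make_offline_txn(card_id, amount_cents):
        txn = {"card": card_id, "amount": amount_cents, "ts": time.time()}
        payload = json.dumps(txn, sort_keys=True).encode()
        txn["mac"] = hmac.new(TERMINAL_KEY, payload, hashlib.sha256).hexdigest()
        offline_queue.append(txn)            # stored locally while the network is down
        return txn

    def settle_when_back_online(verify_key=TERMINAL_KEY):
        for txn in offline_queue:
            payload = json.dumps(
                {k: v for k, v in txn.items() if k != "mac"}, sort_keys=True).encode()
            ok = hmac.compare_digest(
                txn["mac"], hmac.new(verify_key, payload, hashlib.sha256).hexdigest())
            print("accepted" if ok else "rejected", txn["amount"])
        offline_queue.clear()

    make_offline_txn("card-123", 450)        # sell a coffee during the blackout
    settle_when_back_online()                # verify and apply once the internet is back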
Nahhh - some banks have some parts of the infrastructure in COBOL. Specifically, larger retail banks often have their ledgers in COBOL. Most of them want rid and are actively getting rid. Most places have had programs to root COBOL out since before 2000, but residual implementations remain. The ledgers are the hardest place to deal with because of the business case as well as the awkwardness. Basically there's not much of an advantage (or at least there hasn't been) in modernizing, so keeping the thing going has been the option. Now people want to have more flexible core systems so that they can offer more products, although I'm not so sure that customers want this or can consume it. Still, it supports the idea of modernisation, so not many people are keen to challenge it.
The most common big implementations I come across are in Java.
Anyway point remains, electronic transactions with no internet or electricity is a solved problem, and banks don't want to solve it or they can't due to incompetency or maliciousness.
Currency transactions being worth their weight in gold, it is of utmost importance for transactions to always be published to a central authority right away. If they don't have to be published, they should not exist at all. Imagine people buying stuff without anyone knowing right away! That should never, ever exist, for any reason.
[1] https://thenextweb.com/news/ancient-programming-language-cob...
I can imagine it and it happens all the time. Your version sounds very dystopian.
Most of our modern economy and systems are built to reduce redundancy and buffers - ever since the era of “just in time” manufacturing, we’ve done our best to strip out any “fat” from our systems to reduce costs. Consequently, any time we face anything but the most idealized conditions, the whole system collapses.
The problem is that, culturally, we’re extremely short-termist- normally I’d take this occasion to dunk on MBAs, and they deserve it, but broadly as a people we’re bad at recognizing just how far down the road you need to kick a can so you’re not the one who has to deal with it next time and we’ve gotten pretty lazy about actually doing the work required to build something durable.
This is a solution that a teenager put in a management position would think of (along with hiring more people as the solution to inefficient processes), not a paid professional.
Systems like electric grid, internal water management (anti-flood) shouldn't be lean, they should be antifragile.
What's even more annoying is that we have solutions for a lot of those problems - in the case of electric grids we have hydroelectric buffers, and we have types of power plants that are easier to shut down and start up than coal, gas or wind/solar (which cannot be used for a cold start at all).
The problem is that building any of this takes longer than one political term.
How do you make those systems antifragile rather than simply highly resilient?
I postulate the grid as a whole is antifragile, but not enough for the renewable era. We still don't know the root cause of the Spanish blackout, almost 24h after it happened.
> This is a solution that a teenager put in a management position would think of (along with hiring more people as the solution to inefficient processes), not a paid professional.
What kind of comment is this? Toyota has been using and refining it for decades. It wasn’t invented yesterday by some “teenagers”. Such a state of HN’s comment section.
JIT is definitely not perfect as exposed during the Covid period, but it isn’t without merits and its goal isn’t “reducing safety margin”.
Then we have JIT in computing, such as JVM.
Sure it is. That's exactly how it achieves the higher profitability. Safety margin costs money. Otherwise known as inefficiency.
Slack in the system is a good thing, not a bad thing. Operating at 95% capacity 24x7 is a horrible idea for society in general. It means you can't "burst mode" for a short period of time during a true emergency.
It's basically ignoring long tail risk to chase near-term profits. It's a whole lot of otherwise smart people optimizing for local maxima while ignoring the big picture. Certainly understandable given our economic and social systems, but still catastrophic in the end one day.
Of course not, they're optimising shareholder profit.
We've had substantial disruptions, but they've not been particularly irrecoverable or sustained.
The chips shortage has been difficult, but it's also been little more than an inconvenience when you look at it in terms of goods being available to consumers or whatever.
I feel like to many technologists, the internet is still "the place you go to to play games and chat with friends", just like it was 20 years ago. Even if our brains know it isn't true, our hearts still feel that way.
I sometimes feel like the countries cutting off internet access during high school final exams have a point. If you know the internet will be off and on a few days a year, your systems will be designed accordingly, and if anything breaks, you'll notice quickly and during a low-stakes situation.
But I wonder from a reliability (or lack of cascading failures) point of view whether synchronous islands interconnected with DC interconnects is more robust than a large synchronous network?
Also I suspect there is far more renewables on the grid now than in 2016.
This is potentially the first real black start of a grid with high renewable (solar/wind) penetration that I am aware of. Black starts with grids like this I imagine are much more technically challenging because you have generation coming on the grid (or not coming on) that you don't expect and you have to hope all the equipment is working correctly on "(semi)-distributed" generation assets which probably don't have the same level of technical oversight that a major gas/coal/nuclear/hydro plant does.
I put in another comment about the 2019 outage, which happened because a trip on a 400kV line caused a giant offshore wind farm to trip: its voltage regulator detected a problem that it shouldn't have tripped the entire wind output over.
Eg: if you are doing a black start and then suddenly a bunch of smallish ~10MW solar farms start producing and feeding back in "automatically", you could then cause another trip because there isn't enough load for that. Same with rooftop solar.
The South Australia System Black in 2016 would count - SA already had high wind and rooftop solar penetration back then. There's a detailed report here if you're interested:
https://www.aemo.com.au/-/media/Files/Electricity/NEM/Market...
Non-grid-tied solar won't affect the grid at all. So this is a non-issue.
Grid tie requires the grid to tie to, otherwise it can't synchronize. So it stays disconnected.
Why would that prevent you from being grid-tied? I have 53 panels (~21kW) grid-tied and pushing to the grid, but in the event of grid failure my panels will still operate and push into my 42kWh battery array, which will power the entire house (the batteries take over as the 'virtual grid' source). I can then augment the batteries with a generator and run fully off-grid for an extended amount of time (weeks in my case).
I really thought that sentence was going to end with "it makes it a lot easier to handle that segment".
Yeah you have some big problems if it's a complete surprise, but your status quo monitoring would have to be very strangely broken for it to be a complete surprise. Instead it should be a mild complicating factor while also being something that reduces your load a lot and lets you get things running quicker.
You need to calculate for it but I don't think this would be a problem
Does this really qualify as "black start" when they can rely on the bigger EU grid?
It really depends on the region though because almost all large hydroelectric dams are designed to be primary black-start sources to restore interconnects and get other power plants back up quickly in phase with the dam. i.e. in the US 40% of the country has them so it’s relatively easy to do. The hardest part is usually the messy human coordination bit because none of this stuff is automated (or possibly even automatable).
* the load spike from everyone’s motors and compressors booting up at the same time
The power plants with direct connections have hard lines and black-start procedures that get power out to the most important customers like telecom infrastructure, which provides the rest of the comms. In a real world full restart it’s going to mean organizing workers at many substations to babysit old infrastructure so cellular is pretty much mandatory.
Instead, there are literally hundreds of smaller wind/solar installations. Some of which depend on rapidly fading cellular communication to restart. And some might need an actual site visit to throw on the physical breakers.
For Spain the external power and synchronization can come from France rather than generators which will help, but the process and complexities are still mostly the same. Call it a dark start, perhaps.
If Portugal (on the West) had to wait for that, it would probably have taken even longer.
> A black start is the process of restoring an electric power station, a part of an electric grid or an industrial plant, to operation without relying on the external electric power transmission network to recover from a total or partial shutdown.[1]
Only the first power plant in a black-start (like a hydroelectric dam or gas plant started by a backup generator) is truly "black started." The rest don't fit that definition because they depend on an external power source to spin up and synchronize frequency before burning fuel and supplying any energy to the grid. If they didn't, the second they'd turn on they'd experience catastrophic unscheduled disassembly of the (very big) turbines.
Only the first power plant can come online without the external transmission network.
In fact, if you’re not sure which will start first, you might go that way. They’re all disconnected from the grid at that time anyway.
Then again they might be less prepared precisely because the euro grid is available.
If your Factory uses too much power, there's not enough energy to run the power plants' generation, which decreases your power production. Death spiraling until there's no power.
You have to disconnect the factory, and independently power your power plants back up until you have enough energy production to connect your factory up again.
Another "trick" is those burner inserters are black start capable. They can pick up fuel and feed themselves to keep running without an electrical network.
I also tend to put Schmitt triggers in low priority areas. They've got a battery on the main grid next to them and if the battery drops below 50% power they remain off until it goes back above 75% power.
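For anyone who hasn't built one: the accumulator latch above is plain hysteresis, something like this (thresholds as in the comment):

    # Hysteresis latch for a low-priority feeder: cut it when the accumulator
    # falls below 50%, re-enable only once it has recovered above 75%.
    def update_feeder(charge_pct, feeder_on):
        if feeder_on and charge_pct < 50:
            return False
        if not feeder_on and charge_pct > 75:
            return True
        return feeder_on            # in between: keep the previous state

    state = True
    for charge in [90, 60, 49, 60, 74, 76, 80]:
        state = update_feeder(charge, state)
        print(charge, "-> on" if state else "-> off")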
Starting back up from zero is significantly easier, as you are completely isolated and have zero load. You turn the power plant on, and start slowly adding local load to ramp up. Synchronize with neighboring plants where possible to build the grid back up. The only issue is that a power plant needs a significant amount of power to operate, so you need something to provide power before you can generate power. In most cases you can just piggyback off the grid, but in an isolated black start it means you need a beefy local independent generator setup. That costs money and it's rarely needed, so only a few designated black start plants have them, paid for by the grid as a whole.
Man I need to go play some more.
It's far more problematic for the UK because all the interconnects are DC.
That sounds like someone explaining why the solution is so bad, before describing what the solution is.
Electricity markets and electricity networks are designed by the regulator.
Incentives are planned by the regulator so that individual stations or companies have the correct incentives to have capabilities that the network grid needs.
One example is financial incentives to provide black start capabilities. Another example is incentives to provide power during peak loading (peaker plants). There are many more examples of incentives designed so that the needs of the whole network are met.
If an operator is incentivised to act selfishly in such a way that the grid will fail, then that is a failure of the regulator (not the individual operator).
Blaming individual people or companies for systemic faults is generally a bad thinking habit to form. There are too many examples where I see individuals get blamed. Fixing our systems is hard but casting blame in the wrong places is not helpful. It's difficult to find a good balance between an individual's responsibilities and society's responsibilities.
Not quite. They are _influenced_ by the regulators.
And Europe has been incentivizing trash-tier low-quality solar and wind power, by making it easy to sell energy (purely on a per-Joule basis) on the pan-European market.
Meanwhile, there is no centralized capacity market or centralized incentives for black start and grid forming functionality.
There absolutely is. Look up terms like "Frequency Containment Reserve" and "automatic Frequency Restoration Reserve". The European energy market takes transport capacity in account, and there is separate day-ahead trading to supply inertia and spare generating capacity. Basically, power plants are being paid to standby, just in case another plant or a transmission line unexpectedly goes offline.
Similarly, grid operators offer contracts for local black start capacity. The technical requirements are fixed, and any party capable of meeting those can bid on it.
It's quite a lucrative market, actually. If during the summer a gas plant is priced out of the market by cheap solar, it can still make quite a bit of money simply by being ready to go at a moment's notice - and they'll make a huge profit if that capacity is actually needed.
There certainly is in New Zealand, although the dollar amounts are quite small. If your country's regulator doesn't incentivise the capability, I believe that is a fault of your regulator.
Transpower (NZ) says:
We may enter into black start contracts with parties who can offer the black start service compliant with our technical requirements and the Code. Black start is procured on a firm quantity procurement basis (via a monthly availability fee and/or a single event fee for specified stations). Black start costs are allocated to Transpower as the Grid Owner (see clause 8.56 in the Code for details)
FYI: here's a list of other contracted services that are procured to benefit network reliability/restart/resiliency: https://www.transpower.co.nz/system-operator/information-ind...

The UK keeping its own time just makes things easier for it IMO.
To synchronize the isolated grids, they all need to operate with an exact match of supply and demand. Any grid with an oversupply will run fast, any grid with an undersupply will run slow. When it comes to connecting, the technical source-of-truth doesn't matter: you just need to ensure that there will be a near-zero flow the moment the two are connected - which means both sides must individually be balanced.
And remember: if you are operating a tiny subgrid you have very little control over the load (even a single factory starting up can have a significant impact), and your control over the supply is extremely sluggish. Matching them up can take days, during which each individual subgrid has very little redundancy.
On the other hand, the interconnect essentially acts as a huge buffer. Compared to the small grid being connected, it essentially has infinite source and sink capacity. For practical purposes, it is operating at a fixed speed - any change is averaged out over the entire grid. This makes it way easier to connect an individual power plant (it just has to operate at near-zero load itself, move to meet a fixed frequency target [which is easy because there is no load to resist this change], and after connection take on load as desired) and to reconnect additional load (compared to the whole grid, a city being connected is a rounding error).
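A crude way to put numbers on that, using the usual rule of thumb that frequency drifts in proportion to the power imbalance divided by the stored rotating inertia (all figures below invented for illustration):

    # Rate of change of frequency ~ f0 * (P_gen - P_load) / (2 * H * S),
    # with H the inertia constant (seconds) and S the rated capacity of the
    # synchronised machines behind the imbalance.
    def rocof_hz_per_s(p_gen_mw, p_load_mw, inertia_h_s, rated_mva, f0=50.0):
        return f0 * (p_gen_mw - p_load_mw) / (2 * inertia_h_s * rated_mva)

    step_mw = 200   # a single factory starting up
    # on a small island running one 500 MVA plant:
    print(rocof_hz_per_s(0, step_mw, inertia_h_s=5, rated_mva=500))        # about -2 Hz/s
    # on a continental interconnect with hundreds of GVA of machines behind it:
    print(rocof_hz_per_s(0, step_mw, inertia_h_s=5, rated_mva=300_000))    # about -0.003 Hz/s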
I imagine you can get close enough by syncing to a shared time source like GPS or the DCF77 signal, as long as you communicate how the phase is supposed to match up to the time source. Or at least you could get close enough that you can then quickly sync the islands the traditional way.
The question is if it's worth the effort and risk. Cold starting a power grid is a once in a lifetime event (at least in Europe, I imagine some grids are less stable) and Spain seems to plan to have everything back up again in 10 hours. Maybe if the entire European grid went down we would attempt something like that by having each country start up on their own, then synchronize and reconnect the European grid over the following week.
https://en.wikipedia.org/wiki/Baltic_states_synchronization_...
The harder part is this: To pump power into the grid you lead the cycle ever so slightly, as if you were trying to push the cycle to go faster. If instead you lag the phase the grid would be pumping power into you.
That lead is very, very small, and probably difficult to measure and synchronize on. I would imagine that when the two grids connect everything jumps just a little as power levels equalize; it probably generates a lot of torque and some heat, and I would assume it's hard on the generator.
From a physics point of view, leading the cycle creates a small voltage difference across the connection; divide that by the tiny impedance of the tie and you get a large current, and that current times the grid voltage is how many watts (power) you are putting into the grid.
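More precisely, for a mostly inductive tie the textbook approximation is P ≈ V1·V2·sin(δ)/X, where δ is the lead angle, which shows how little lead is needed to move a lot of power. A quick sketch with made-up but plausible transmission-level numbers:

    import math

    V = 400e3       # 400 kV at both ends (volts)
    X = 100.0       # tie reactance in ohms (made up, plausible order of magnitude)

    def power_mw(delta_deg):
        # real power transferred across the tie for a given lead angle
        return (V * V / X) * math.sin(math.radians(delta_deg)) / 1e6

    for d in [0.1, 0.5, 1.0, 5.0]:
        print(f"lead {d:>4} deg -> {power_mw(d):7.1f} MW")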
Would this suggest the grid hasn't snapped apart, or is it just not possible to tell from the data?
Coal, pumped hydro, and nuclear generation all went to 0 around the same time, but presumably that's those sources being disconnected from the grid to balance demand? https://transparency.entsoe.eu/generation/r2/actualGeneratio...
https://x.com/RedElectricaREE/status/1916818043235164267
We are beginning to recover power in the north and south of the peninsula, which is key to gradually addressing the electricity supply. This process involves the gradual energization of the transmission grid as the generating units are connected.
I see load dropping to zero on that graph, or rather, load data disappears an hour ago.
If the grid frequency goes too far out of range then power stations trip automatically, it's not an explicit decision anyone takes and it doesn't balance load, quite the opposite. A station tripping makes the problem worse as the frequency drops even further as the load gets shared between the remaining stations, which is why grids experience cascading failure. The disconnection into islands is a defense mechanism designed to stop equipment being too badly damaged and to isolate the outage.
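A toy model of that cascade, just to show the mechanism (thresholds and numbers invented; in reality automatic under-frequency load shedding and islanding step in well before everything is lost):

    # Toy cascading trip: when a station drops out, the shortfall grows,
    # frequency sags further, and the next under-frequency relay fires.
    stations_mw = [1000, 800, 600, 400, 200]
    load_mw = 2700
    TRIP_BELOW_HZ = 48.0

    def frequency(gen, load, f0=50.0, stiffness=10.0):
        # crude proxy: frequency sags in proportion to the relative shortfall
        return f0 + stiffness * (gen - load) / load

    stations_mw.pop(0)                       # the initial fault: biggest unit trips
    while True:
        f = frequency(sum(stations_mw), load_mw)
        print(f"{len(stations_mw)} stations, {sum(stations_mw)} MW online, {f:.2f} Hz")
        if f >= TRIP_BELOW_HZ or not stations_mw:
            break
        stations_mw.pop(0)                   # next station's under-frequency relay trips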
Last actual load value for Spain at 12:15: https://transparency.entsoe.eu/load-domain/r2/totalLoadR2/sh...
Last actual load value for France at 12:00: https://transparency.entsoe.eu/load-domain/r2/totalLoadR2/sh...
https://transparency.entsoe.eu/generation/r2/actualGeneratio...
Everything dropped to zero except wind and solar, which took huge hits but not to zero. I expect those have been disconnected too, as they cannot transmit to the grid without enough thermal plant capacity being online, but if the measurement at some plants of how much they're generating doesn't take into account whether or not they were disconnected upstream they may still be reporting themselves as generating. You can't easily turn off a solar plant after all, just unplug it.
Either that, or they're measuring generation and load that's not on the grid at all.
Rooftop solar for example just shows as a reduction in demand, not 'generation' per se.
None of this gear is suited to a black start. If you had total grid loss for a month you could doubtless rewire it to power the farm when it's windy despite no grid, maybe even run some battery storage for must-have services like a few lights so they keep working on still days but you could not start the grid from here.
It's not just about the power. System components cannot be brought to operating temperatures, speeds and pressures faster than mechanical tolerances allow. If a thermal plant is cold & dark, it can take days to ramp it to full production.
In some cases yes. Modern combined cycle plants can take as little as 30 minutes to ramp to full output. Older designs can take upward of 4-6 hours.
If you have steam as an indirection, that's when things take a really long time. Natural gas turbines are a more direct cycle.
"Luckily", France is at an historically high level of production capacity at the moment and the connection between the two countries was reestablished fast.
According to RTE (French network manager), the interconnection was maxed yesterday at around 3GW of power.
Sadly, while Spain is part of CESA, it's not very well connected. I wouldn't be surprised if one of the takeaways from the whole incident is that more interconnections are needed.
More than cash it was important yesterday to have the following in case it would have lasted longer:
- a battery powered am/fm radio with spare new batteries
- some candles and matches
- food reserves for a few days that don't need refrigeration: bread, anything in can, pasta, rice...
- some kind of gas or alcohol stove, dry wood or bbq charcoal: you can always make a fire in the middle of the street where there is no risk of burning things around.
- a water reserve (I always have like 24L of drinking water), and since I hate waste I regularly fill jerrycans while waiting for hot water in the shower, which I use for manual washes (kitchenware or gear).
1. The grid has to fully collapse with no possibility of being rescued by interconnection
2. As a result, a generation asset has to be started without external power or a grid frequency to synch to
3. An asset capable of this is usually a small one connected to a lower voltage network that has to then backfeed the higher voltage one
4. Due to the difficulty of balancing supply/demand during the process, the frequency can fluctuate violently with a high risk of tripping the system offline again
None of this applies in yesterday's case:
The rest of the European synchronous grid is working just fine.
News reports stated Spain restored power by reconnecting to France and Morocco.
By reestablishing the HV network first, they can directly restart the largest generation asset with normal procedures.
As they bring more and more load or generation online, there's little risk of big frequency fluctuations because the wider grid can absorb that.
But with solar, how is the synchronization provided? In like a giant buck converter? Or in software somehow? Does the phase shift matter as much as in the electromechanical systems?
My intuition is that solar would make the grid harder to keep stable (smaller mass spinning in sync) but also may offer more knobs to control things (big DC source that you can toggle on/off instantly.. as long as sun is out). But I don’t actually know.
Currently the main driver of battery deployments is not so much energy price time arbitrage as "fast frequency response": you can get paid for providing battery stabilization to the grid.
(for the UK not Spain: https://www.axle.energy/blog/frequency )
So if you have a smarter solar panel, or a smart battery, you can stabilize the grid. I’m assuming that all of the traditional software complexity things in distributed systems apply here: you want something a little bit smart, to gain efficiency benefits, but not too smart, to gain robustness benefits.
My intuition is that bringing the market into it at small timescales probably greatly increases the efficiency significantly but at the cost of robustness (California learned this “the hard way” with Enron)
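Mechanically, that frequency-response service is usually just a droop curve with a deadband; a rough sketch (parameters invented, not any particular grid code):

    # Droop response for a battery: inject power when frequency is low, absorb
    # when it is high, do nothing inside a small deadband around nominal.
    def battery_power_mw(freq_hz, p_max_mw=10.0, f_nom=50.0,
                         deadband_hz=0.015, full_response_hz=0.5):
        error = freq_hz - f_nom
        if abs(error) <= deadband_hz:
            return 0.0
        response = -(error / full_response_hz) * p_max_mw    # linear up to full output
        return max(-p_max_mw, min(p_max_mw, response))

    for f in [50.00, 49.95, 49.70, 49.50, 50.20]:
        print(f"{f:.2f} Hz -> {battery_power_mw(f):+5.1f} MW")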
> Phase matching is still required, wherever the phase difference is not zero there is a deadweight loss of power as heat
If the electronic controller is “ahead of” (leading) the grid, then that heat would come from the solar plant; if it is “behind” (following) then that heat comes from the grid. Is that right? And likely, solar plants opted for the simplest thing, which is to always follow, that way they never need to worry about managing the heat or stability or any of it.
I wonder if the simplest thing would be for large solar plants to just have a gigantic flywheel on site that could be brought up via diesel generators at night…
If you mean how does solar detect phase and synchronize to the grid: https://en.wikipedia.org/wiki/Phase-locked_loop
If you mean how does solar act to reinforce the grid: search for terms like "grid forming inverter vs. grid following inverter" though not all generators are the same in terms of how much resilience they add to the grid, esp. w.r.t. the inertia they do or do not add. See e.g. https://www.greentechmedia.com/squared/dispatches-from-the-g...
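To make the PLL idea concrete, here's a minimal single-phase toy (gains and sample rate invented, and real inverters use fancier quadrature/SOGI structures): multiply the measured voltage by the estimated quadrature, low-pass that to get a phase error, and let a small PI loop drag the estimated frequency and phase onto the grid's.

    import math

    DT = 1e-4                        # 10 kHz sampling
    F_GRID = 50.2                    # actual grid frequency, unknown to the PLL

    theta_hat = 0.0                  # estimated phase
    integ = 0.0                      # integrator state (tracks the frequency offset)
    err_filt = 0.0
    KP, KI, LPF = 60.0, 800.0, 0.005 # toy loop gains and low-pass smoothing factor

    for n in range(30000):           # simulate 3 seconds
        t = n * DT
        v = math.sin(2 * math.pi * F_GRID * t + 0.7)       # measured grid voltage, per-unit
        # phase detector: the average of v*cos(theta_hat) is ~0.5*sin(phase error)
        err_filt += LPF * (v * math.cos(theta_hat) - err_filt)
        integ += KI * err_filt * DT
        omega_hat = 2 * math.pi * 50.0 + KP * err_filt + integ
        theta_hat = (theta_hat + omega_hat * DT) % (2 * math.pi)

    print(f"locked frequency estimate: {50.0 + integ / (2 * math.pi):.3f} Hz")   # ~50.2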
The main difficulty is that the software of grid-following inverters tends to make them trip out very suddenly if the grid parameters get too far out of spec (they will only follow the grid so far), but once the grid is good they basically instantly synchronise.
But all large solar farms are likely to be mandated to switch from grid-following to grid-forming inverters eventually, which will make them beneficial for grid security because they will help provide 'virtual inertia' that looks exactly the same to the rest of the grid as spinning mass does.
Low grid frequency & voltage can cause an increase in current & heating of transmission lines and conductors, and can damage the expensive things; this is why these systems trip out automatically at low frequency or low voltage, and why load shedding is necessary.
I'm not saying you're wrong, but this isn't obviously correct to me.
Since solar going to a grid is completely dependent upon electronic DC->AC conversion, I would expect that it could follow a lot greater frequency deviation for a lot longer than a mechanical system that will literally rip itself apart on desync.
The real reason that small scale solar PV is grid following (i.e. it depends on an external voltage and frequency reference) is that this ensures power line safety during a power outage. That's it.
An inverter can be programmed to start in the absence of an external reference and it can operate at a wide range of frequencies.
Here's why generators were running here despite the grid being available. A generator has a very short lifetime, and in order to prolong it some owners have learned to run them on an optimal schedule, which sometimes requires a minimum amount of run time in a single cycle. Thus, once you've started it, you are committed to running it for X hours.
I took down the servers though, so you probably can't easily try it. I don't know if I added a way to configure the lobby server. I should have! It's open source though. And there is a video about that thing on my YouTube: https://www.youtube.com/watch?v=6TPgfa7LbiI
The game is bad and nothing of what we planned on doing actually made it into the game. The video is long and boring too. But maybe someone finds this cool and is inspired by this and makes a game like this.
The first 15 minutes of the game were actually about getting the ship moving, first by reading the manuals of half a dozen different ship systems and then following some procedure outlined in those manuals (parts of which were simply incorrect), maybe having to do some things in sync with your other players and stuff like that. I think it would have been cool to add multiple reactors and start them up in sync and stuff. The different ship systems were actually Lua programs that interacted via a message bus. So kind of an unknown computer architecture?
- Cause of event not known yet.
- They noticed power oscillations from the Spanish grid that tripped safety mechanisms in the Portuguese grid. At the time, due to the cheaper prices, the Portuguese grid was in a state of importing electricity from Spain.
- They are bringing up multiple power systems and the Portuguese grid is able to supply 100% of needs if required. It was not configured in such a state at the moment of the event.
- They had to restart the black start more than once, since while starting, noticed instabilities in some sectors that forced them to restart the process.
- Time for full recovery unknown at this time, but it will take at least 24 hours.
Most of continental Europe runs on one synchronised grid, so from that perspective a single 'province' went offline, not the whole grid.
The complex process of configuring the transmission network to bring grid power to each power plant in succession is the same.
https://en.wikipedia.org/wiki/Continental_Europe_Synchronous...
> have some cash at home
For maybe the first 24 hours at a grocery store, and then not so sure. Would your neighbors sell you supplies and food? Maybe not? And so many places now depend on cashless transactions and doubtful they have pen, paper, lockbox, and safe as a contingency plan.
That said, lots of people hit the cafés and had to resort to cash payments. There were also lots of people buying bottled water at the shops.
So basically, you could divide people in two groups. Those that took it like an extra Sunday, and those that took it like the beginning of a war or something :-D
If you try to connect another generator to the grid, it needs to be at the same point (phase) in the sine-wave cycle, so that its power contribution is added, not subtracted.
If it's not in sync, huge currents can flow, causing damage. Sort of like connecting jumper cables backwards.
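The jumper-cable analogy can be made quantitative with a couple of lines of Python: the instantaneous voltage across the breaker is the difference of the two sides, and for equal magnitudes it peaks at 2·V·sin(φ/2). The numbers below assume a 230 V single-phase equivalent purely for illustration.

    import math

    V_PEAK = 230 * math.sqrt(2)    # peak of a 230 V RMS sine

    def voltage_across_breaker(phase_error_deg):
        """Worst-case instantaneous voltage between two equal-magnitude
        sources that differ only by a phase angle."""
        phi = math.radians(phase_error_deg)
        return 2 * V_PEAK * math.sin(phi / 2)     # |V1 - V2| peak for equal magnitudes

    for err in (1, 10, 60, 180):
        print(f"{err:3d} deg out of phase -> up to {voltage_across_breaker(err):4.0f} V across the breaker")

At 180 degrees out of phase the closing breaker sees double the grid voltage across a near-zero impedance, which is where the huge currents come from.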
Ukraine went through many black starts in the first winter of Russian strikes against its energy system. I guess they built up the skill of recovering quickly, so each restart got faster and easier.
Most places are so dependent upon electricity that they can't even take cash during a blackout. And they don't even have the mechanical machines to take a credit card imprint anymore.
There is precedent for major power outages, a huge majority of which are not malicious: https://en.wikipedia.org/wiki/List_of_major_power_outages
I remember the day when the Swiss railway power network went down for a day (in 2005) because one power line was down for maintenance and someone pressed the wrong button and produced a short circuit somewhere else. It's a bit like the incidents in planes where one engine has a problem and the crew shuts down the other one by mistake.
https://edition.cnn.com/2025/02/13/asia/sri-lanka-power-outa...
I saw a video the other day of some human running and jumping on a transformer after hopping a fence, dancing on the transformer in a distribution site.
It ended as you'd expect, a bright light, a lot of curse words from the camera operator who was probably blinded temporarily.
Electricity does not care.
I can't believe and I'm horrified someone actually published such a video.
Trying to stick to the facts: this incident is most likely accidental, but some people, even workers at energy companies, could be trying to send a message for, I don't know, a pay raise?
Yes, you can try to hold the country hostage for your salary by going on strike, but that's the sort of thing that results in very energetic union-busting.
Actually sabotaging the infrastructure would result in terrorism charges, or at the very least the JSO treatment.
https://en.wikipedia.org/wiki/Moore_County_substation_attack
Unsolved to this day.
Not that type of hospital.
A tree falling can be addressed by vegetation management and trimming. A power line sagging because of excess heat is operator error.
These are not remotely the same.
I hate this term, and look forward to using it all the time.
As you have identified, a wider right of way costs more.
Usually for lines above some voltage, perhaps 200kV, the cost of an outage due to a tree strike outweighs the cost of additional vegetation management so they will clear the right of way wide enough that no tree can fall and hit the power line.
Around here for 130kV the right of way is still as narrow as it can be and we annually take down the riskiest trees as this is the best for our budget, which is not unlimited.
They go through and remove damaged trees near the easements of the highway lines, as well as branches that could break into lines.
As an aside, we lived on the same section of grid as the sheriff, and our power was rock solid for a few years; then he left office, and now our power is merely better than average (at least better than our neighbors, whose power line comes from the other direction).
Residential distribution voltage varies by utility but it’s usually in the medium voltage range, 5kV to 35kV, with 13.8kV being common.
In Texas, the electric providers cut staff and maintenance to maximize shareholder value. They will not have redundant systems and redundant plants out of the goodness of their hearts. The Texas marketplace actually allows them to charge astronomical spot prices in the rare event of an outage, the thinking being that this would incentivize them to build redundant systems. This was a foolish fantasy.
Now in Texas, discussions of how to cost-share redundancy have taken place. But no one wants to pay for it. https://www.texastribune.org/2023/03/01/texas-power-market-p...
If you want a no fail grid you need to incentivise a no fail grid.
Solar PV/thermal + wind: ~78%
Nuclear: 11.5%
Co-generation: 5%
Gas-fired: ~3% (less than 1GW)
This is (a) incredibly impressive to achieve and (b) definitely the point at which the battery infrastructure needs to catch up in order to reduce the risk of such incidents.
This non-existent technology will surely catch up very soon, I wonder what takes them so long.
The only battery available at this scale is hydro and it doesn't do very well in Spain because of droughts.
Makes the case for favouring flywheels over batteries.
http://claverton-energy.com/active-power-article-flywheel-en... as a smaller scale example. $330/kW but at that price it only has 15 seconds of carry-over, basically just enough to get the diesel generator fired up.
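The arithmetic behind "15 seconds of carry-over" is simple enough to sketch: a flywheel UPS rated for P kW over t seconds only needs to store about P·t kJ of usable energy. The numbers below are illustrative, not specs of the linked product.

    def usable_energy_kj(rated_kw, carryover_s):
        """Usable energy a flywheel must hold to carry rated_kw for carryover_s."""
        return rated_kw * carryover_s

    def carryover_at_load(usable_kj, load_kw):
        """Seconds of ride-through at a given load."""
        return usable_kj / load_kw

    e = usable_energy_kj(rated_kw=250, carryover_s=15)          # 3,750 kJ usable
    print(f"usable energy: {e:,.0f} kJ (~{e/3600:.1f} kWh)")    # ~1 kWh
    print(f"at half load (125 kW): {carryover_at_load(e, 125):.0f} s")
    print(f"cost at $330/kW: ${330 * 250:,.0f} for a 250 kW unit")

Roughly a kilowatt-hour of storage per 250 kW unit, which is why these things are a bridge to the diesel generator rather than a substitute for it.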
Spinning reserve in the grid is equipment that is capable of ramping to sustained generation very quickly. In the case of hydroelectric dams, they will often cut off the water supply to some of the turbines and use air pressure to push the water out of the way; the generator attached to the turbine essentially turns into a motor and keeps it spinning. If you need to bring it online, you open the water valve and let the air out.
Similar situation with natural gas-fired simple cycle turbines. They’re sitting there running at low output. Need more? Just add fuel. For combined cycle it might take a bit for the boiler to warm up for full output but having the first stage running full tilt will get it warmed up fast.
In a solar-heavy grid you probably have milliseconds instead of seconds, which could be why the automation failed in this case.
https://www.proactiveinvestors.co.uk/companies/news/1057463/...
That's insane, imagine if it let go.
Or if you consider the Irish grid (average consumption around 5 GW) that's enough energy to power the grid for about 0.8 seconds (obviously it's not going to have enough instantaneous power output to do that, but again for a sense of scale).
If Ireland had 10 of them, that'd be 8 grid-seconds worth of energy. Although, of course, actual disturbances aren't going to be that large. A few percent imbalance perhaps?
So if the whole grid had an instantaneous 10% imbalance, one of those units could carry it for 8 seconds.
(EDIT: changed energy numbers to fit the appropriate power grid)
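To make that scale comparison concrete, here's the same arithmetic as a few lines of Python, assuming an average Irish demand of 5 GW and a unit holding 0.8 grid-seconds of energy (roughly 4 GJ, or about 1.1 MWh).

    GRID_GW = 5.0                            # assumed average Irish demand
    unit_energy_gj = 0.8 * GRID_GW           # 0.8 grid-seconds -> 4 GJ (~1.1 MWh)

    def carry_time_s(energy_gj, imbalance_gw):
        """How long the stored energy can cover a given power imbalance."""
        return energy_gj / imbalance_gw

    imbalance = 0.10 * GRID_GW               # a 10% instantaneous imbalance
    print(f"one unit:  {carry_time_s(unit_energy_gj, imbalance):.0f} s")        # ~8 s
    print(f"ten units: {carry_time_s(10 * unit_energy_gj, imbalance):.0f} s")   # ~80 s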
See also: https://en.wikipedia.org/wiki/Aeolian_harp
Edit: might be a completely different kind of oscillation than I was thinking of. https://news.sky.com/story/spain-portugal-power-outage-lates...
So I would also like to understand how this works :)
They already became a laughing stock once for promising the "strongest possible response" for the Nord Stream 2 sabotage [1].
[1] https://www.reuters.com/business/energy/eu-sees-sabotage-nor...
I only have a layman's understanding of power grids, but I thought they were incredibly hardened, with backups and contingencies in depth
Are the grids at this scale really this brittle? Would there be a death toll from this?
I also wouldn't blame malice without corroborating evidence
Some are more hardened than others, and some have latent flaws which nobody can really predict.
Spain is in the middle of a transition to renewables, so it's possible there are flaws because the process is incomplete, or because this is something that has never happened before and is unknown territory. Also, Spain had some economic problems in the last decade; maybe someone built too cheaply or even cut corners somewhere.
> Are the grids at this scale really this brittle? Would there be a death toll from this?
Hospitals should have backup systems. Traffic should be able to stop in time. I guess the most problematic parts are people stuck in elevators and other spaces which only open electrically, as well as the loss of cellular connections for calling for help.
All the mobile phone installations that I saw had power for at least 24-72hrs depending on how far from civilization they were. The carriers have backups and everything.
The problem in these kind of situations is the saturation of the mobile network, not its availability.
Very poorly explained right now by Space Weather News. I am waiting for an updated explanation.
So frequent it even has its own wikipedia page: https://en.wikipedia.org/wiki/Category:Airliner_accidents_an...
I think we should prepare for the worst though. It's wrong to assume it's not an attack too, and until we can conclude it's not an attack we should be prepared to deal with the possible consequences and act accordingly.
We're a remote business so it seemed like I'd just rudely dropped off the call, but as everything was down I couldn't let people know what'd happened.
Apparently it was caused by botched maintenance work affecting 30,000 houses, but the timing was so perfect I can't help thinking it was because our AGI overlords really didn't want me to deliver that talk for some reason.
Three quarters of production disconnected from the grid between 12:30 and 13:00, with only a bit of solar and onshore wind sticking around.
I don't think we're able to tell from the data if one is the cause of the other, are we? If production was lost, load would have to be shed to balance the grid, and if load was lost (e.g. due to a transmission failure), production would have to be disconnected to balance the grid.
That started from a combination of a lightning strike and generator trip, but turned into a local cascade failure as lots of distributed generation noticed that the frequency was under 49Hz and disconnected itself. I suspect the Spanish situation will be similar - inability to properly contain a frequency excursion, resulting in widespread generator trips.
(I suspect this is going to restart a whole bunch of acrimony about existing pain points like grid maintenance, renewables, domestic solar, and so on, probably with the usual suspects popping up to blame renewables)
Renewables were a factor in the blackout here in Brazil a couple of years ago: the models used by the system operator did not correspond to reality, many solar and wind power plants disconnected on grid disturbances quicker than specified. That mismatch led the system operator to allow a grid configuration where a single fault could lead to a cascade (more power was allowed through a power line than could be redistributed safely if that power line shut off for any reason), and that single fault happened when a protection mechanism misbehaved and disconnected that power line. The main fix was to model these solar and wind power plants more conservatively (pending a more detailed review of their real-life behavior and the corresponding update of the models), which allowed them to correctly limit the power going through these power lines.
If you want an excruciating level of detail, the final 614-page report is at https://www.ons.org.br/AcervoDigitalDocumentosEPublicacoes/R... (in Portuguese; the main page for that incident is at https://www.ons.org.br/Paginas/Noticias/Ocorr%c3%aancia-no-S...).
If you have a large spinning inertial mass like a factory motor or a power generation turbine, it's extremely important. Imagine a manual car transmission, but there's no slip-clutch, you need to perfectly align engine with the wheels rotating at 300mph, and the inertial mass you're up against if it's not perfectly synchronized is a freight train.
That's why generators trip offline in a blackout cascade if the frequency deviates out of spec. The alternative is your turbine turns into a pile of very expensive shiny scrap metal.
Frequency coordination is absolutely critical, via phase coordination. A large generator must not get significantly out of phase. So frequency going out of spec triggers the generator to "trip" (disconnect).
I don't know what specific threat is addressed by tripping generators offline when the frequency deviates by 1 Hz. Are they so mechanically fragile that is already damaging to them, or is it a precautionary measure because that kind of instability is likely to precede sudden frequency or phase jumps that are damaging?
Both, actually. A frequency mismatch between what the grid has and what the turbine is supplying causes significant thermal losses, so you have to trip the generator sooner rather than later anyway. But a significant frequency deviation is also a warning sign that something is Massively Broken and requires immediate attention to find the cause: too low a frequency means you need to shed load immediately, too high means you need to shed generating capacity immediately.
It's like a three-legged race: you and your partner have to run in sync. If either of you slows down or speeds up, the other can trip and fall over, taking both of you out.
If a high-power transformer goes too far out of spec it melts or blows up (or both), so it will try to shut itself down before that. Getting it back on becomes a problem if other transformers do the same thing, which is apparently what happened across the whole country.
Might even give you clues about something big tripping offline.
Probably lots of false alarms, but if an outage is particularly bad for you, it's good to know as soon as the system operators do.
Obviously over the internet could work too, but who wouldn’t love their own box?
When the phase gets pulled down hard like what nearly happened in Texas and what probably happened here, it'll go from looking like the background noise of phase changes to catastrophic in just a few seconds. It isn't like you'll get warning an hour ahead of time. You'll probably notice your computer monitor going dark before your grafana graph refreshes.
Yes, it's not that hard. There's smart meters and plugs that have frequency measurement built in.
You can even do it with an audio cable: https://halcy.de/blog/2025/02/09/measuring-power-network-fre...
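As a toy version of the audio-cable trick, you can estimate mains frequency from a sampled hum by timing zero crossings. The linked post does this for real; the sketch below just runs on a synthetic 49.95 Hz tone (real recordings would need a band-pass filter around 50 Hz first).

    import numpy as np

    FS = 44_100                       # audio sample rate, Hz

    def estimate_frequency(samples, fs=FS):
        """Average frequency from the spacing of rising zero crossings."""
        signs = np.sign(samples)
        crossings = np.where((signs[:-1] <= 0) & (signs[1:] > 0))[0]
        if len(crossings) < 2:
            return float("nan")
        periods = np.diff(crossings) / fs
        return 1.0 / periods.mean()

    t = np.arange(0, 2.0, 1 / FS)             # 2 seconds of synthetic "hum"
    hum = np.sin(2 * np.pi * 49.95 * t)
    print(f"estimated: {estimate_frequency(hum):.3f} Hz")   # ~49.95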
In reality, power generation equipment will disconnect itself if the frequency is too low/high to avoid catastrophic failure.
[1] https://transparency.entsoe.eu/load-domain/r2/totalLoadR2/sh...
[1] https://transparency.entsoe.eu/transmission-domain/physicalF... [2] I'm not necessarily blaming the engineers, but the politicians who force those engineers to put square pegs in round holes. For example, I can imagine politicians making a short term decision to skimp on energy storage while increasing renewable penetration. Surely renewable systems must be less reliable without storage given the lack of rotational inertia?
[1]: https://www.spiegel.de/wissenschaft/stromversorgung-in-spani...
News travelled extremely slow: phone coverage was just barely enough to receive a couple text messages every 15 minutes or so. News spread on the street, I even saw a group of 20 people hunched around someone owning a hand-held radio in the streets.
Just before power was restored, things started to get worse, as the phone coverage went completely out (presumably batteries were depleted). People were somewhere between enjoying the work-free day and starting to worry about what tomorrow would look like if power didn't come back.
I stopped by a friends house and we then went on a walk. Some stores were open and cash was accepted. We hung out later that night and had a few beers. The sky was amazing as there was next to no light pollution. Next day was totally in the dark as well and again, no panic. More beers were enjoyed.
The choice to move to electronic everything without having to give a shit about reliability is a failure of modern government. Move fast and break society for a dollar.
Yeah, they don't need to do that anymore. Around me, enough towers have battery backups that I can count on 2 hours of coverage when utility power goes out (if it goes out late at night or early morning, there's usually coverage until 6-7 am when people start waking up and use up the rest of the power). I don't have a real landline, but the telco DSL would drop instantly with utility power so I don't have big hopes and I wasn't willing to pay $60/month to find out.
Around when I moved, stores would pull out the credit card imprint machines, but those don't work anymore because cards are flat. Cash might work, and I've got some, but I don't think many people in my community do; people don't have cash for the snack shack we run at my kid's sports, so I doubt they have it for restaurants and stores either. And we get frequent 2-4 hour power outages, at least one, usually two or three per year; and ~ 24 hour outages every few years. The snack shack runs during summer where electricity is most reliable, but I doubt people stock up on cash in the fall and use it all up before spring/summer; they probably just don't have any.
It's the other way around: Cards are flat because a carbon imprint doesn't afford the merchant any payment guarantee by the card issuer anymore anyway. (In other words, the "floor limit" above which cards require electronic authorization is now zero.)
The people operating these imprinters are sales clerks and waitstaff, not graphologists or experts in detecting altered physical credit cards. The sophistication of fraudsters has also advanced, and as a result, a system that might have been good enough in a pinch 20+ years ago isn't necessarily good enough today.
That said, in my view there's no excuse to not leverage the physical chip present on effectively all credit and debit cards these days, which is technically capable of making limited autonomous spending decisions even with both the issuer and terminal offline in scenarios like this. It probably won't happen without regulatory pressure, though.
Unfortunately, too many cards, and all mobile wallets, don’t support offline authorization for that to be viable.
In fact, even verifying the signature is no longer required in at least the US.
Signature verification also only solves cardholder authentication, not card authorization (i.e. figuring out if the card is funded, still valid etc.)
Maybe it depends on the country, but my memory of 2003 is that almost every non-elderly adult I knew (in my own upper-middle class milieu) already had a mobile phone. Not a smartphone as we understand the term today, but a lot of phones back then had primitive smarts that are now largely forgotten: WAP/WML browsers (which maybe not many people used, but I certainly remember using one), Java ME applets (I vaguely remember using them too, maybe post-2003, but higher-end 2003 phones could definitely run them), and vendor-specific mobile app formats such as Symbian's.
It's interesting to think about and realize how much things have changed now though, and how reliant people are on everything, and especially their tiktoks etc. working all the time.
Some of the panic is likely related to the war in Europe too, and especially the general talk about war
We were just two years removed from 9/11 so terror talk was the first thing that happened. We got that news from AM radio in our cars. Still no panic.
All gas stations closed because they could not sell gasoline/diesel. Today there are lines at all gas stations, people filling their car tanks and bottles.
Oh, let me tell you about electric cars! Many people had to spend the night somewhere away from home because they could not charge their cars. My sister (with her job's electric car) had to stay the night some 200km away from home, and since the ATMs (Multibanco) didn't work, she didn't have physical money to pay for food. Luckily a stranger paid for the food (yogurt and some cookies). Petrol cars, because of their range, had better luck!
Pure fear and panic.
I can only blame the authorities (Portuguese/European) for not having contingency plans for keeping people informed, and thus letting fear spread like wildfire.
Electricity went down (something fairly frequent here). My UPS kept the PC up, and the alarm system, with a SIM card and a small UPS, kept the wifi up for an hour or so.
The scary moments started when people I was on a call with in Portugal texted 'Grid is out'. Later there was no phone signal or data at all.
At first, people running to supermarkets might look like an overreaction to being without TikTok for a couple of hours, but you have to experience how scary it is, given Europe's current political situation, to have 60 million people (plus industry) in three countries off the grid.
If you've seen the Snowden film (which may not be the most trustworthy source), this is exactly how the CIA agent describes a feasible attack on these countries. Again, not a valid source, but I'd love to understand whether that would actually be feasible.
I wonder what similar solutions exist in the iOS ecosystem.
I hope this comes back ASAP.
I've got to give massive props to Antena 1 too, which is the national broadcaster's main radio station, who stopped all normal programming and did an all-day massive report on this situation to keep people informed. From what I could tell they didn't even run any ads during that period, just all-day reporting continuously repeating key information for people who'd just tuned in.
Indeed, visited the local hardware store and some bigger stores like Decathlon, and they were out of all batteries, gas-powered stoves and anything else related. Seems they ran out of it just hours after the power was cut too.
> I've got to give massive props to Antena 1 too
RNE did a great job, and together with the response to the previous crisis, Covid, I feel relatively safe as a Spain resident during a crisis. People around you are so caring as well: when I tried to figure out how to open my garage door without power, a young guy passing by stopped to ask if I needed help. Simple stuff, but it gives a stronger feeling that we'll survive together no matter what.
I lost some years today.
My son is fine, thanks to a random person (“the man with a rabbit”) who just decided to give a lift to my son and his friend to the edge of Barcelona.
The atmosphere was quasi-festive and most people were quite relaxed, enjoying an unexpected afternoon off. Younger people filled the bars, which were serving everything they could. There were long lines at supermarkets and an occasional fellow toting a box of supplies, but mostly there were just huge numbers of people in the street and completely collapsed traffic flow (the police were out in force almost immediately, directing traffic). In the part of Madrid I was in, about 1/4-1/3 of the population is from South America and I suspect most of them have seen this all before anyway. The only real stress I saw was from people who needed a train to get home (because the trains weren't running) and had a walk of more than 2-3 hours ahead of them.
I got cell phone signal when I was near two hospitals which were fully operational.
It was interesting that almost immediately, while I was still at work, everyone said power was out in Portugal and France too. After an hour or two some were claiming problems in Germany, but those already seemed to be unfounded rumors.
Some younger people couldn't walk home because they didn't have google maps ...
For instance, one reporter asked one of the government flunkies whether it could be a cyberattack and they turned his noncommittal “maybe, we don’t know” into “government says cyberattack may be ongoing”.
Be careful of idiot reporters out there.
Edit: I’m listening to another radio interview where they are outlining the plans to bring online Portuguese dams and thermal generators over the next few hours, progressively unplugging from the Spanish supply (fortunately we have enough of those, apparently).
It should take 3-4 hours to get everything balanced with only national supplies, and they will restore power from North to South.
Key points that started it were (you can see the chain of events in the doc):
2.4.1. At 16:52:33 on Friday 9 August 2019, a lightning strike caused a fault on the Eaton Socon – Wymondley 400kV line. This is not unusual and was rectified within 80 milliseconds (ms)
2.4.2. The fault affected the local distribution networks and approximately 150MW of distributed generation disconnected from the networks or ‘tripped off’ due to a safety mechanism known as vector shift protection
2.4.3. The voltage control system at the Hornsea 1 offshore wind farm did not respond to the impact of the fault on the transmission system as expected and became unstable. Hornsea 1 rapidly reduced its power generation or ‘deloaded’ from 799MW to 62MW (a reduction of 737MW).
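For a feel of how little time there is once that much generation disappears, here's a rough swing-equation sketch. The system inertia constant and synchronised capacity are assumed round numbers, not figures from the report, and primary frequency response is ignored, so treat the output as order-of-magnitude only.

    F0 = 50.0      # nominal frequency, Hz
    H = 4.0        # assumed system inertia constant, s
    S = 30.0       # assumed synchronised capacity, GVA

    def rocof_hz_per_s(lost_gw):
        """Initial rate of change of frequency after losing lost_gw of generation."""
        return -lost_gw * F0 / (2 * H * S)

    lost = 0.737 + 0.150      # GW, the two losses quoted above (later trips added more)
    r = rocof_hz_per_s(lost)
    print(f"ROCOF ~ {r:.2f} Hz/s")
    print(f"~{(50.0 - 48.8) / -r:.0f} s to reach a 48.8 Hz load-shedding threshold "
          f"if nothing responds")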
In my head, I'm thinking of generators/plants, connected by some number of lines, to some amount of load, where there are limited disconnection points on the lines.
So how do grid operators know what amount of load will be cut if they disconnect point A123 (and the demand behind it) vs point B456?
Is this done sort-of-blind? Or is there continual measurement? (e.g. there's XYZ MW of load behind A123 as of 2:36pm)
This is wild. From an amateur technical perspective, it would only take a cheap hall sensor inside the transformer to get a pretty good estimate of how much current has been flowing to the load.
Hell, put the hall sensor on a board with a microcontroller and a LoRa transmitter and stick it to the outside of the feed line. Seems like an incredibly cheap upgrade to get real-time load data from every substation.
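Sketching just the signal-processing half of that idea: turn raw hall-sensor samples into an RMS current reading and a compact payload for a low-bandwidth link. The sensor scaling and payload format are assumptions, and no real LoRa driver is used here.

    import math, struct, time

    MV_PER_AMP = 2.5          # assumed hall-sensor sensitivity, mV per amp
    ADC_OFFSET_MV = 1650      # assumed mid-rail offset of the sensor output

    def rms_current_a(samples_mv):
        """RMS of the AC component of the sensed waveform, converted to amps."""
        ac = [s - ADC_OFFSET_MV for s in samples_mv]
        return math.sqrt(sum(v * v for v in ac) / len(ac)) / MV_PER_AMP

    def telemetry_payload(substation_id, amps):
        """Tiny fixed-size packet suitable for a LoRa-class link."""
        return struct.pack("<HIf", substation_id, int(time.time()), amps)

    # Fake one mains cycle of samples around the offset (peak ~400 A)
    samples = [ADC_OFFSET_MV + 400 * MV_PER_AMP * math.sin(2 * math.pi * k / 64)
               for k in range(64)]
    print(f"{rms_current_a(samples):.0f} A rms")           # ~283 A
    print(telemetry_payload(42, rms_current_a(samples)))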
If you're monitoring real time power consumption you then need a whole extra infrastructure to communicate this info back and forth. Of course you then have to consider how you're going to keep that extra infra online in the event of power issues.
If you find yourself in the middle of a black swan event, and 15 GW have tripped offline, you have milliseconds to dump pretty much exactly 15 GW of load, otherwise more generating capacity is going to trip offline very quickly.
If you only dump 14 GW because you used historical data (which happens to be imprecise, because today's cloud cover reduced rooftop solar output), you're still going to be in trouble. A detector scheme with sensors at every substation would allow you to do just that.
I also wonder what the realtime requirement is. Data from a minute ago is fine .. except in this kind of situation, when things are changing very quickly.
The estimates we get from seasonal studies are usually close enough, especially since load shedding isn't a finesse exercise.
The situations that require load shedding usually give operators only a few minutes to react, where analyzing the issue and determining a course of action takes the lion's share. Once you're there, you want the actual action to be as simple as possible, not factor in many details.
This has changed a lot though, as even home batteries afaik will start discharging if they start noticing the frequency dropping to provide some support on generation. But if it's dropping too fast and too quickly it won't help.
But yes they do have very granular info on all the HV sources and how much load is on them.
In this case, we are dealing with a widespread grid incident. The various grid protection mechanisms have been triggered to prevent interconnection overload. In addition, the generators are trying to correct the grid frequency to exactly 50Hz. At 49Hz, more power must be generated; at 51Hz, less power must be generated. However, if the frequency varies too much, there are also protection mechanisms to prevent the turbines from overspeeding or amplifying frequency variations.
The grid is complex, and normally this type of incident is limited to one cell of the electricity distribution grid. A blackout is a domino effect, when a minor event triggers a chain reaction that disconnects more and more elements from the grid.
The grid operator will have to restart or reconnect the power plants one by one and restore power to stations and substations. All of this must be done in a specific order before power can be restored to consumers. All of this takes time, requires resources (you need men on the ground), and the slightest error can lead to further outages.
Some consumers are prioritised, such as hospitals, transport infrastructure, telecoms and water networks. Many critical pieces of equipment have UPS systems, but these are not always designed for such long outages or have not been tested for years. There are patients with home equipment who will struggle.
This is why rotating load shedding is preferable. The outages are not too long and vital infrastructure is not affected (or less so).
When the time comes, you just shed enough "buckets" to stabilize. Load shedding is not a precise task; once you're at that point, you'd rather shed a few megawatts too many and be safe than play with the limits and be sorry.
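A toy version of "shed enough buckets" might look like the sketch below: given a shortfall, open pre-defined feeder groups, biggest first, until enough load is gone, erring on the side of shedding a little too much. The feeder data is made up.

    BUCKETS = {                     # feeder group -> current load in MW (illustrative)
        "B1_industrial": 420,
        "B2_suburb_north": 310,
        "B3_suburb_south": 290,
        "B4_rural": 150,
        "B5_mixed": 220,
    }

    def pick_buckets(shortfall_mw):
        """Greedy selection, biggest buckets first, until the shortfall is covered."""
        shed, total = [], 0
        for name, load in sorted(BUCKETS.items(), key=lambda kv: -kv[1]):
            if total >= shortfall_mw:
                break
            shed.append(name)
            total += load
        return shed, total

    buckets, mw = pick_buckets(900)
    print(buckets, f"-> {mw} MW shed for a 900 MW shortfall")   # overshoots, on purpose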
> So how do grid operators know what amount of load will be cut if they disconnect point A123
Opening a line within the grid isn't used to shed load, as the grid is mostly redundant. It's used 1) to protect the line itself (either by letting it trip, or by opening it preventively), or 2) to force power to flow differently through the network by modifying its impedance.
In this totally random example [1], opening the line A-B increases impedance in the right part of the network, forcing power to re-route through the left part, and reduces the load on another line in the right path that was overloaded.
[1]: https://excalidraw.com/#json=5l8OS96Wdke6l9YEClQt8,NJ-r2PtiE...
> Or is there continual measurement? (e.g. there's XYZ MW of load behind A123 as of 2:36pm)
The network is, indeed, monitored continually, and we run simulations of potential equipment failures every few minutes, with contingency plans also simulated to make sure that they work in that particular case. This way, when shit hits the fan, we at least have a recently tested plan to start from.
(apologies for singling out these specific groups of people - my point is that it might be worth to put down news sources like xitter, and read AP/translated local Portuguese news)
REN said: “Due to extreme temperature variations in the interior of Spain, there were anomalous oscillations in the very high voltage lines (400 kV), a phenomenon known as ‘induced atmospheric vibration’. These oscillations caused synchronisation failures between the electrical systems, leading to successive disturbances across the interconnected European network.”
https://www.theguardian.com/business/2025/apr/28/spain-and-p...
A cold start from zero generation took 6 to 10 hours. No casualties; hospitals and critical infrastructure kept working all day.
Whatever the cause, this has been a great test of the emergency protocols.
Even your minor power outages take 30 minutes to an hour for your local power company to determine what failed.
https://radar.cloudflare.com/es?dateRange=1d
https://radar.cloudflare.com/pt?dateRange=1d
Portugal nearly reached zero.
https://www.euronews.com/my-europe/2025/04/28/spain-portugal...
The similarity between that event and this early-on report is striking.
[es language]: https://www.lavozdegalicia.es/noticia/espana/2021/07/24/aver...
"Le gestionnaire français souligne par ailleurs que cette panne n’est pas due à un incendie dans le sud de la France, entre Narbonne et Perpignan, contrairement à des informations qui circulent."
I suppose it makes sense that it was an automatic shutdown rather than infrastructure failing on such a wide area. And then once it's shut down, a black-start is a logistical challenge as other comments have explained.
I'm also seeing some reports about it being more likely that something happened on the east side, somewhere like the Ebro valley or north across the Pyrenees. Catalonia seems to have been particularly affected, and it's on the path of important lines coming from France. High heat at noon could have caused a line to fail and short against a tree, which would be similar to the 2003 nation-wide outage in Italy.
In theory [a flawed one] you have enough spare capacity to survive N failures, and N+1 simultaneous failures are statistically unlikely because p^(N+1) is close to zero.
In practice [or with a better theory] you can't just multiply probabilities in a grid system because the random variables aren't independent. 30% spare capacity can go to -100% in a second.
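A tiny Monte Carlo makes that point about non-independence visible: with independent failures the chance of losing several lines at once is negligible, but add a common-mode stressor (a heat wave, a frequency excursion) and it isn't. All probabilities below are made-up illustrative numbers.

    import random

    N_LINES, P_FAIL, TRIALS = 10, 0.02, 200_000
    random.seed(1)

    def failures(common_mode: bool) -> int:
        """Number of lines failing on one simulated day."""
        stress = common_mode and random.random() < 0.05   # 5% of days are "bad days"
        p = 0.30 if stress else P_FAIL                    # bad days raise every line's risk
        return sum(random.random() < p for _ in range(N_LINES))

    for label, cm in (("independent", False), ("correlated", True)):
        big_events = sum(failures(cm) >= 3 for _ in range(TRIALS))
        print(f"{label:11s}: P(>=3 simultaneous failures) ~ {big_events / TRIALS:.4f}")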
Called N-1 criterion. See https://en.wikipedia.org/wiki/Contingency_(electrical_grid)#...
And it depends. During https://en.wikipedia.org/wiki/2006_European_blackout N-1 criterion was supposed to be holding, in practice not even N-0 was holding and network crashed.
A few years ago the European network spent nearly an entire day at N-0 due to multiple issues in Poland, caused by a heat wave and deeper root causes. There were many power plants and power lines where any further issue would have caused a Europe-wide blackout.
To me this indicates that in many energy markets where renewables are sufficiently built out, the only factor for why we aren't using them more is the storage capacity and grid infrastructure to handle their variability -- and instead just running stable but dirty energy systems.
There are a lot of cool mechanical grid storage systems, like gravity batteries (e.g. weights on a pulley system) [2] or compressed or liquid air storage systems [3], which provide cheaper storage at higher capacities and durations than electric batteries.
1. https://reneweconomy.com.au/a-near-100-per-cent-renewable-gr... 2. https://www.youtube.com/watch?v=trA5s2iGj2A 3. https://www.youtube.com/watch?v=fjERw-Ol-_s
There's a map at [2]
> The Spanish electricity system is currently connected to the systems of France, Portugal, Andorra and Morocco. The exchange capacity of this interconnection is around 3 GW, which represents a low level of interconnection for the peninsula. The international interconnection level is calculated by comparing the electricity exchange capacity with other countries with the generation capacity or installed power.
[1] https://www.ree.es/en/ecological-transition/electricity-inte...
https://www.rte-france.com/eco2mix/les-echanges-commerciaux-...
https://transparency.entsoe.eu/transmission-domain/physicalF...
There seems to be some kind of recurrent daily pattern where the French - Spanish interconnect switches from Spain -> France imports to France -> Spain exports at around that time, and then back again in the late afternoon.
https://gridradar.net/en/blog/post/underfrequency_january_20...
https://www.linkedin.com/pulse/day-europes-power-grid-almost...
https://www.acer.europa.eu/news/continental-europe-electrici...
I remember it because power went out in at least 1/3 of Romania back then.
Definitely felt surreal to first lose power to the degree that even traffic lights were no longer working, and then to hear it's also happening across the region just before mobile networks also went offline.
https://www.reuters.com/world/europe/power-blackout-hits-mon...
~90 page report: https://eepublicdownloads.blob.core.windows.net/public-cdn-c... (beware: PDF)
I was in an adjacent area at the time, and IIRC we were saved by our nuclear operator releasing some insane amount of steam to bring the supply down and avoid more overloading.
The first night, we had 2 grills with 10ft high flames roaring but I would say in a controlled fashion. The cops quickly asked us to not do something dumb like that anymore and we agreed.
Memories of the local party stores selling 40oz's for pennies on the dollar, as the outage would last days but the booze would not. It didn't really turn into anything bad for us in SE Michigan other than a fun story I can tell my internet pals about 20 years on.
If I recall correctly, we were also somehow the first house in the neighborhood to get power restored...I remember playing a Dreamcast with a hacked NES emulator running Rampart 2 player. I never made good on it, but I always said I wanted to make a shirt that read "I blacked out in the black out of '03"...which is probably for the best.
https://www.youtube.com/watch?v=0ccTzHBUsYQ
> In January 1998, Montreal and the region surrounding it was hit by the most disastrous ice storm ever recorded: more than four inches of ice entombed an area larger than the State of Florida, causing trees and power lines to collapse on an unprecedented scale, leaving millions in the dark without heat (some for up to four weeks). 35 people died and damages totalled more than $5 billion making it the worst natural disaster in Canadian history.
4 out of 5 power lines to Montreal went down
Full grid failures are almost certainly never down to a single cause.
(No relation to the other infamous Signal chat :))
There should be 4-8 hours of battery backup on every site - at least.
It's always fascinated me during disasters how independent telecomm can be. Kudos for all the engineering that went into it!
I.e. even when any other conceivable dependency is down, the networks keep running.
The telephone network was designed from the ground up to be completely independent of _everything_ except fuel deliveries. If grid power is up, that's convenient, but it's totally not required.
In many places, that's because telegraph and telephone lines got there before the power grid did. Lines running along railroads connected communities that had no centralized power generation. Delco-light plants at individual farms might be the only electric power for miles, aside from the communications lines themselves. Even if the only phone was at the rail depot, it still had to power itself somehow. As those communities sprouted their own telephone offices and subscriber lines branching throughout town, the office had its own batteries for primary power, and eventually generators to recharge them. (Telegraph networks largely ran from just batteries, recharged chemically rather than with generators, for years.)
Fast-forward a century and there was just never a need to depend on anything else. As long as the diesel bowser can get down the driveway, the office can run indefinitely.
Among old AT&T/Bell/WECo hands, the devotion to reliable service goes far beyond fanatical. Many offices built during the cold-war have showers in the basement and a room of shelf-stable food, though these are no longer maintained. The expectation was that whoever was in the office when the bombs dropped, would keep things running as long as they could. And when they couldn't anymore, well, there was probably nobody left to call anyway.
Or if you’re AT&T, grid natural gas backup, so your CO goes down if electrical and natural gas both go out once the batteries die. Did I mention how they didn’t build in roll-up generator connection points and had to emergency install those?
Spain's demand: https://transparency.entsoe.eu/load-domain/r2/totalLoadR2/sh...
Spain's generation: https://transparency.entsoe.eu/generation/r2/actualGeneratio...
Spain's import/export with France: https://transparency.entsoe.eu/transmission-domain/physicalF...
The filters can be used to see similar data for Portugal
I definitely had no problems with electricity all of today (on the eastern side). And there was nothing in the news about local outages either.
Funny enough, there were news before the Easter holidays that they're preparing for extremely reduced demand by shutting down facilities.
In any case, if I recall correctly from a YouTube video I can't find (it was either Wendover or Real Engineering), if the grid is fully down it takes quite a lot of effort and time to bring it back online, because it has to be done in small steps to avoid over- or under-loading.
Very good video. Very good channel.
(plus 11 million of Portugal for a total of 60 million people in the Iberian Peninsula)
I mean, what else are you gonna do without power?
Sex. At least that's what everyone believes
The legend is that hip hop and sampling started after the blackout in NYC. Some electronic and music stores were looted and the recording equipment eventually ended up in the hands of musicians looking to make a new sound: https://www.rollingstone.com/music/music-features/new-york-c...
All essential infrastructure has to. Heck, if you have a landline you can probably siphon off some power from the DC component.
But I would guess the network equipment as a whole draws quite a bit, especially modern infrastructure.
It's funny to think how the moment it goes off you feel nothing, but hearing that many people make noise and express happiness makes your body notice instantly, a sensation we often describe as electric.
Growing up in Spain I've never experienced anything like this (not there at the moment, but friends have told me over WhatsApp).
The whole European power grid is somewhat interconnected; I won't be surprised if this knock-on effect starts knocking out surrounding countries.
[1] https://en.wikipedia.org/wiki/Continental_Europe_Synchronous...
Ex Electronic Engineer interest.
Is a grid built on renewables and batteries somehow more resilient? Solid state things tend to be less fiddly, hence my question.
I remember reading at one point in the past that renewables were actually worse for the grid due to less predictable power generation or something, but that was a long time ago, certainly pre-battery storage.
Yes, if it is sunny or windy, they can be scaled in minutes, but only if conditions are met. The inverse is true for nuclear/coal - they cannot be scaled up & down in minutes.
So this is a complicated subject in itself, and a full answer won't fit inside this textbox. Some bullet points:
- Grid stability is maintained by "batteries", though not literal ones. The "batteries" in question are typically rotating generators, i.e. turbines, wind, literally anything where you have an electrical coupling to a lot of physical inertia. That's what keeps the grid running second-to-second; while a power plant might pretend it's outputting a constant 4MW, it actually shifts noticeably from moment to moment. The kinetic energy of the generator helps balance that out.
- Going up from the sub-second range, an overload of the generator obviously would cause the shaft to slow down, dropping the frequency and causing brownouts. Brownouts are bad and can damage the grid, so typically breakers will disconnect if it falls below 49Hz; a 2% drop.
- Baseload plants can't cope with this, as they take multiple minutes to spool up, minimum; for something like a coal power plant, where you have to shove in additional coal and wait for it to catch fire, it's going to be quite a few minutes. This is what defines 'baseload'.
- Peaker power plants can increase (or decrease) their mechanical power production in a matter of seconds. These days that typically means gas turbines, though hydroelectric power is even better, and nuclear power could be used for peaker plants -- but isn't; most nuclear reactor designs outside of the navy are baseload designs. France does have some load-following designs, and we need more of those.
- Wind turbines can't increase their output, flat out, but they can decrease it (by feathering, or by using brakes). This is good enough, except this would turn them into 'peaker plants' that can't help with peaks. If we had enough wind turbines to cover 100% of the load then we'd technically be fine, but economically speaking that doesn't work; they'd be at less than 10% power most of the time.
- Wind turbines have rotating shafts, but a lot of the time they produce DC power, linked through inverters, which removes that benefit and makes them act like solar panels in effect. However, this is a purely economic issue; they can trivially be upgraded to support grid stability if the pricing scheme will pay for it.
- Solar panels are worse: They have no inertia! There is no rotating shaft there to cover sub-second usage spikes. That's where complaints about 'renewables causing reduction of grid stability' come from, along with issues like domestic solar needing to backfeed power through distribution lines and transformers that aren't necessarily designed for that.
- But batteries can absolutely help. The kinetic energy of a rotating turbine isn't actually that big; it's not that expensive to pair a solar panel with a battery to build a grid-forming system that acts the same way a kinetic power plant would.
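A rough back-of-the-envelope supports that last point: the kinetic energy of a synchronous machine is roughly its inertia constant H times its rating, which even for a big unit is well under a megawatt-hour. The H value and rating below are generic textbook-style assumptions.

    def rotor_energy_mwh(h_seconds, rated_mva):
        """Kinetic energy of a synchronous machine's rotating mass, in MWh.
        H * S gives MW-seconds, i.e. MJ; divide by 3600 for MWh."""
        return h_seconds * rated_mva / 3600.0

    turbine = rotor_energy_mwh(h_seconds=5.0, rated_mva=500)   # a large steam unit (assumed)
    print(f"500 MVA unit: ~{turbine:.2f} MWh of kinetic energy")
    # A single grid-scale battery container stores a few MWh, so a modest battery
    # behind a grid-forming inverter can emulate the inertia of a large generator.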
Some wind turbines are also internally a hybrid design that can dynamically adjust the frequency difference angle, both to minimise losses in production and to provide frequency shifting and even artificial demand (i.e. essentially using the turbine as a brake).
I believe (but I am 0% expert) these are mostly but not exclusively used for adding inertia to the system, rather than real energy storage.
Monero is the favorite payment coin when ordering real cyber attacks.
It is common to use Monero as reward for specific groups performing cyberattacks. Please inform yourself on the topics and reflect afterwards on what was written.
the talk discusses further aspects of the European energy grid which are relevant for the current situation.
No conspiracy or whatever, just thinking about the complex system which is also vulnerable for some parts.
You don't think the omission of encryption and the lack of authentication imply a criminal absence of due diligence?
I guess some huge contracts are in the offing to fix those necessary features, which should have been built in in the first instance.
An estimated 305 million USD of monero were purchased this morning before the grid shutdown. That was a +50% increase on the monero price, something that never happened before in the last 11 years of the coin.
Quite a record. Quite a coincidence.
So some or most cellular towers will have generators, and their fibers will backhaul through repeaters, some or most of which will have their own generators. When it gets back to the MTSO, that will definitely have large diesel turbines on site and at least 24 hours of fuel with priority refueling contracts.
I'd expect there to be a lot of outages, for instance where all the towers in a region end up backhauled through a site where the generator fails or was never installed for some reason. But there will also be a lot of places that stay up in some capacity because, more or less by happenstance, all the fuel tank permits got approved and all the equipment actually worked.
Data, cellular etc everything kept working. But at some point I guess the generators and batteries started to fail and capacity degraded.
Things like Air Crash investigations don't take a year because of paperwork FFS. Investigating things takes immense time.
For that incident, an expert panel was set up in July, the interim report was published in November, and the final report in February 2025: so it'll take a few months.
- "The risks posed to electrical systems by big variations in atmospheric temperatures are well known in the industry, even if it is rare for problems to manifest on this scale."
- "“Due to the variation of the temperature, the parameters of the conductor change slightly,” said Taco Engelaar, managing director at Neara, a software provider to energy utilities. “It creates an imbalance in the frequency.”"
https://www.theguardian.com/business/2025/apr/28/spain-and-p... ("Spain and Portugal power outage: what caused it, and was there a cyber-attack?")
It's crazy how momentum can carry a business.
To use a potentially controversial example, Microsoft products (Office, Windows) are still extremely entrenched despite the overwhelming majority of knowledgable people agreeing that they're on a steep downward trajectory and the alternatives have long since surpassed them.. leading to this[0] recent video from Pewdiepie...
According to local newspapers, the metro network, airport and traffic lights are all down.
Maybe a tech company with servers will pay extra for full backup, but it’s not typical.
Without fail, during every grid outage, some will fail to start and there will be elevator rescue calls throughout the city.
Would be interesting to see if it will register here.
EDIT: typos
Ouch, that sucks. Bet their technicians are on overtime already too :/ I'm guessing you've already tried the tried-and-tested "turn it off, wait five minutes, turn it on again"?
Usually we've had lots of issues with our fiber from Vodafone (like random ~70% packet drops from time to time), but happily Vodafone seems to have recovered quickly in this case; no GPON issues here as far as I can tell.
I know it may be rare, but I think some day we really need to mandate that every single flat / home / apartment / living place have a 12-24 hour backup battery included. Something with 10K+ cycles, durable and non-flammable. Not only would it keep our modern lives free of sudden interruption, it would also help solve the renewable energy storage problem.
And even if we do want to invest in a large amount of grid storage (which we would need to anyway, if we want to transition to renewables), I'm not sure pushing this down to the individual house is sensible. It's a great way to limit economies of scale and make maintenance/inspection harder.
In a longer outage it'd be a massive benefit to have even just a few houses here and there that retain power.
Why not? Having distributed buffers ought to make the system more resilient in most cases, won't it? Not to mention that these home-level batteries can be used to smooth out power usage and lower peak loads.
In fact, having an EV car act as this same battery would be an even more efficient use of resources.
I'd prefer for the power coming to my house to just not go out. The grid operator can install batteries in places other than my home. The grid operator can maintain them for me. The grid operator can get cheaper loans than I can for installing them, they're staffed with supposed experts in this stuff, just have them handle it on their premises.
I do have a 35 kW propane-fired generator onsite though, which does provide for outages regardless of where the break in the line is.
And once again, another point for growing the wealth gap. Poor people who can't afford the thousands of dollars for installing the batteries get shafted by such things instead of us assuming the grid operators will just smooth these prices out for us.
Once again in the end I'd just prefer if there were actors on the grid running massive batteries able to arbitrage the extreme spot prices and sell me electricity at a reasonable rate all the time instead of me having to actively min/max every dang thing every moment in my life.
Personally, I'd rather toss that $10k into my kids' education funds and sign up for a fixed rate electricity plan. Hopefully that'll be better ROI.
Grid-scale batteries have pretty different economics than some batteries in my garage. Their real-estate cost is probably significantly lower than the cost for halfway decent residential. They're buying a much larger order of batteries, so their per-kWh cost is probably a good bit lower. Also, their installation cost per-kWh is also a good bit cheaper. They're probably completely fine buying/selling purely on open spot markets, meanwhile in the end I'm going to need to run my pool pump and I'm going to want to do dishes and laundry and charge my car and all the other things on some reasonable schedule so I'd like some amount of price protection on whatever time-of-use plan I get (a day of $9/kWh electricity prices will surely wipe out whatever gains you might have made). They're probably able to get cheaper loans than me, so it's easier for them to arrange the big capital investment instead of high interest loans or having to invest existing capital.
I'm not arguing the economics of batteries don't make sense in today's power markets. Far from it, I've been toying with getting into it in North Texas. But buying a few kWh of batteries and putting them in my garage probably isn't going to break even and almost certainly isn't going to make me money.
Let's say you're in California, which has a pretty decent swing in energy prices available to the home consumer. You'll see prices swing between $0.27 to $0.65/kWh. You've got a 10kWh battery pack, so you can arbitrage something like $0.38/kWh * 10 kWh = $3.8/day, assuming you get a full charge cycle and ignoring efficiency for simplicity's sake. $114/mo. So assuming you didn't have any loans and you're ignoring opportunity costs on that $10k, your break-even is in ~88 months or ~7.3 years.
After a decade of ~$114/mo, you'll have offset $13,680, assuming you always used a full charge and you experienced no other maintenance costs and bought it cash on hand and that energy prices didn't change and had a perfectly efficient inverter to charge and discharge the batteries and the batteries still had 100% charge capacity the whole time. The 10-year CD, which you could just forget about for a decade, stands at $17,310.
And this is for one of the few markets in the US that really offers that big of a time of use plan other than "free nights and weekends***" (with pages and pages of fine print and other fees) For instance, I just looked at a time of use plan available to me here in North Texas. 9AM-4PM I'm billed at $0.068 for the generation, $0.050029/kWh plus a flat $4.23/mo for delivery. Outside of that window I only pay the delivery cost. But that means I'm only really able to arbitrage $0.68/day with a 10kWh setup. That's 490 months or a hair over forty years to break even.
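Redoing that payback arithmetic as a function, so the two cases can be compared directly; same simplifying assumptions as above (one full cycle per day, no losses or degradation, battery paid in cash).

    def payback_months(capital_usd, pack_kwh, price_low, price_high, cycles_per_day=1.0):
        """Months to recover the battery cost from daily price arbitrage."""
        spread = price_high - price_low                    # $/kWh arbitraged per cycle
        per_month = spread * pack_kwh * cycles_per_day * 30
        return capital_usd / per_month

    print(f"California-ish: {payback_months(10_000, 10, 0.27, 0.65):.0f} months")    # ~88
    print(f"North Texas TOU: {payback_months(10_000, 10, 0.00, 0.068):.0f} months")  # ~490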
Also, this further just makes having stable electricity yet another thing in the wealth gap. Only those wealthy enough to afford the high upfront capital costs, the ongoing maintenance cost, and the space to store it get reliable electricity, fuck everyone else! Or we can just focus on investing in a stable and clean grid and share that cost with everyone, all you need is to just be connected.
But hey if I get a massive battery bank I'll have power for when the end times come. I won't be able to go get groceries anymore and eventually the fiber line and radios around me will go quiet but at least I'll be able to play videogames. For a few hours at least.
Don't get me wrong, I'm an Eagle Scout, be prepared and all that. I've got a big pile of charcoal, a chunk of propane, a camping stove, several day's supply of water and canned/non-perishable foods, some batteries for lanterns, etc. The cars all get topped off when big storms seem possible. If the big outage comes this will be more worthwhile than being able to turn on the TV for a few hours, and cost significantly less than several dozen kWh of batteries. And if the power is out for more than a week or two I'll have far bigger concerns than being able to post on Hacker News.
They can still do so for other reasons, like a short circuit in the wiring.
And that said, I do have lots of other somewhat beefy batteries around the house. They do a lot of useful things for me such as power my tools including my lawn mower, string trimmer, hedge trimmer, saws, etc. There is a massive one in the car parked in the garage. In these cases it is a useful trade off of that slight risk as I'm actually getting something normally useful out of it.
You could distribute this capacity to each house and feed it back to the grid during peak times. But is the TCO of 1000 × 100 kWh home batteries the same as 100 MWh of utility-scale capacity?
If you're going to have the battery anyway (a car), it's hard to compete with, but once you need more capacity I'm not sure it makes sense to distribute it quite that much.
I've set this up on my "smart" thermostat to speed up just before rates go up and kick off for a while once they do (and it was a pain to set up; somehow this functionality was not baked in).
And if someone is dumb enough to do high load stuff on their own battery during a blackout, then it's less of a problem for others. Also individual failures will cover less homes.
Incentives and consequences will be different and differently spread.
And on individual level, you can also chose whether you want this or are fine with outage. (I'm against mandating this)
I bought a generator for just this situation.
[1] https://www.wedistrict.eu/interactive-map-share-of-district-...
The thing you said doesn't really make sense to me; I'm not sure it's an apt analogy.
That is just shifting someone else's mistake into being my responsibility.
I know so many people who invested in overly expensive battery storage systems for their solar. Power in Germany is expensive, but even with that expensive power many of those battery systems will never hit a positive ROI. But they're still happy for the feeling of being "independent".
I’ll turn 26 in a few months. The first time I experienced a power outage in my life was two weeks ago when a construction worker in our basement drilled into the wrong wall…
I predict that home batteries will become a "no brainer" from a financial perspective (for anyone who has the upfront capital to purchase them) within the next 10 years.
Battery storage may become a bit of a no-brainer from a purchase-price point of view, but I don't think it will actually be more beneficial for people, especially if BEV adoption continues at a similar rate and bi-directional charging becomes widespread.
20 kWh of just storage is about $10k. Every one of these companies also has standard DC input ports for bring-your-own solar panels, because their panels are overpriced.
The 8-Bit Guy has been using such a system to maintain his studio. It is reliable and uses cheap solar panels. This stuff is commodity.
Like it or not but we are a species of social animals, you cannot live without relying on others. That's just delusional.
If electricity is that important to you, buy your _own_ resilience. Tesla powerwalls and non-tesla equivalents have been available for ages.
And then he chooses the one big corporation which has a guy in the government.
power cuts affect more than just your home.
https://news.sky.com/story/large-parts-of-spain-and-portugal...
The wording in the article makes it look like Seville, Barcelona and Valencia are in France.
Just a typical cascade failure because it means everything's now running with lower tolerances.
Also, France has a massive nuclear baseload and is usually an electricity exporter, so their grid will be a bit more resilient (the spinning mass of the massive turbines in nuclear plants provides inertia to the grid itself) than the Spanish/Portuguese grid, which would have had a decent renewable mix.
At the same time, electricity grids will have various measures to prevent instability spreading. Whether that's load shedding (dropping parts of the grid), removing production, or what France likely did, dropping the interconnects. This is usually fully automated.
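To make the inertia point above concrete, here's a rough back-of-the-envelope using the standard swing-equation approximation; all the numbers are illustrative, not figures for the Spanish or French grid.

```python
# Rate of change of frequency (RoCoF) right after a sudden generation loss,
# from the aggregate swing equation: df/dt = -dP * f0 / (2 * H * S_base).
# All numbers are illustrative.

F0 = 50.0          # nominal frequency, Hz
S_BASE = 100e9     # total rated capacity online, VA (assumed)
POWER_LOST = 2e9   # sudden generation deficit, W (assumed)

def rocof(h_seconds: float) -> float:
    """Initial frequency slope in Hz/s for aggregate inertia constant H."""
    return -POWER_LOST * F0 / (2 * h_seconds * S_BASE)

for h in (6.0, 3.0, 1.5):   # heavy spinning plant vs. a renewable-heavy mix
    print(f"H = {h:>3} s  ->  {rocof(h):+.3f} Hz/s")
```

Halving the inertia doubles how fast the frequency falls, which is why protection and frequency response have less time to react on a low-inertia grid.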
If there were more connections to France it _maybe_ could have spread further. Nuclear power plants can be twitchy if things go south, so if it had spread enough to knock 2-3 French plants offline, that could potentially have toppled France, which, as a big exporter, could then have swept across much of the rest of mainland Europe (and maybe even the UK, which might struggle to adapt to the loss from France through its HVDC interconnects if all three were maxed out).
That said, this is all very unlikely.
Several people have said the amount of power transferred between France and Spain is very low compared to the amount of power consumed inside either France or Spain, which seems as good a reason as any that it didn't affect France. The grid in France was presumably able to absorb the sudden change in power flow without itself breaking.
"In an update, Spanish power grid operator Red Electrica says it's beginning to recover power in the NORTH and SOUTH of the country."
I'd imagine something similar applies here. You'd have some number of deaths specifically attributable to loss of power, plus countless other deaths caused or prevented in non-obvious ways. This might be visible at a high level as a statistical outlier in the total number of deaths during the time period of an outage.
https://www.euronews.com/my-europe/2025/04/28/spain-portugal...
Edit: BBC reports it now also https://www.bbc.com/news/live/c9wpq8xrvd9t
OP is joking, I think, that La Liga got the power turned off to protect their revenues.
These things have relatively small costs, but make the system much more resilient.
You should probably just keep some cash on hand, along with an FM radio and a few batteries.
The postal services are worse now. Railway is more expensive and a disaster. Energy prices are sky high and apparently now we also see unreliability.
Calling attention to how fragile many of our critical systems are is almost certainly a net positive in the long run.
Would have been a very different story if it were a week rather than a day, but I was left with a sense that this complex community of locals and foreigners is stronger than I previously suspected.
Not the best news source, but it’s the only one I’ve found so far. HN moderators, feel free to replace it later with a better one.
Supposedly also France is affected (unconfirmed)
Translated version: https://elpais-com.translate.goog/economia/2025-04-28/apagon...
Also, there are currently four different submissions re this on the front page. I'd suggest we don't need any more.
Should have paid the extra €€ to put the solar panels in backup mode…
https://www.elconfidencial.com/espana/2025-04-28/directo-cor...
Edit: However, I can't get the energy provider's outage map to load; maybe too many people are accessing it.
The sunny weather is very inviting outside for someone with the day off :-)
> A fire in the south-west of France, on the Alaric mountain, which damaged a high-voltage power line between Perpignan and eastern Narbonne, has also been identified as a possible cause.
I guess news networks don’t have comparable access to live metrics
Sounds like a major infrastructure risk given that it is possible for more than one country to experience a full loss of power.
EDIT: Andorra is also affected, so that is three.
[0] https://www.lavanguardia.com/vida/20250428/10624908/caida-ge...
Not something that's easy to test for.
Mon 28 Apr 2025 07:22:00 EDT
Mon 28 Apr 2025 11:22:00 UTC
This is breaking just minutes ago:
* https://www.bbc.com/news/live/c9wpq8xrvd9t (rolling updates)
* http://archive.is/https://www.bloomberg.com/news/articles/20...
* https://www.rte.ie/news/world/2025/0428/1509881-spain-portug...
The Spanish national operator:
It happens every winter. When I was a kid, we once went 2 weeks without electricity.
Wood stove becomes an obvious necessity. Not just for staying warm. For cooking.
Chickens in the backyard too.
Those moments show you that being self-sufficient is an extremely important skill, and while we're not totally self sufficient in those moments, we get pretty darn close.
Really makes you appreciate the pre-electric era.
People cut down forests with axes. Pulled the stumps out with oxen. Cut the lumber with hand saws. Chiseled foundations with pickaxes. Etc, etc.
With the constant threat of Nuclear War, there's a sword of Damocles hanging over our heads that we will return to this era.
That makes it all the more important to learn these skills.
Global warming/climate change strikes again
Using Twitter has the huge advantage that a spike in users in Spain checking this stuff is a rounding error in the normal traffic, so it is very unlikely to take down the status page.
This should show people just how powerful network effects are. They are legitimately a force of nature.
Maybe outright lies are more effective at getting followers than you suggest.
That being a distasteful reality doesn't make it incorrect.
The key is: will the bulk of the people, who are in the center, go for it?
And the answer is: if they perceive the idea as being cynical, or purposely deceptive, then no. Even if politicians temporarily get away with lies, every lie does long-term damage to a movement.
The truth will out.
That said, twitter should allow for official profiles and organizations to have their tweets (xs?) made public.
"115 batshit stupid things you can put on the internet in as fast as I can go by Dan Tentler" - https://youtu.be/hMtu7vV_HmY
But important/urgent updates only via Twitter is definitely a huge no-no.
It's far easier to use Twitter, but that doesn't mean it should be used: it fences out people like the OP and me, who do not have Twitter and do not want Twitter. I don't want to be forced into using a private corporation's service to get status updates from the electrical network where I live. It's quite a simple proposition and very reasonable; I'm not sure why you are so incensed by a quite reasonable expectation.
> Also, why do you assume that website wouldn't crash under the sudden 10000x load? It is an utterly useless solution, that wastes time and solves nothing.
Because it can be cached very easily, it's 2025 where setting up this kind of cache is extremely easy compared to 2005.
> Like does your engineering skills suddenly magically evaporate the moment Elon's name is mentioned?
Please, stop, you are too irrational to understand a very simple and reasonable thing, no need to start throwing Elon into this bullshit, just stop here with the rabid lunacy. I was against major corporations only posting updates on Twitter waaaay before Elon bought it, I still stand by it.
from: https://news.sky.com/story/large-parts-of-spain-and-portugal...
This is literally what the whistleblowers about the cashless society have been warning everyone about for well over a decade now.
Using the term whistleblower in this manner is inappropriate; actual whistleblowers are individuals who bring to light illicit acts by organizations or governments at great personal risk.
This is how humans are with all catastrophes–there isn't enough money until after something really, really bad happens and suddenly there is enough money to fix the issue.
NYC is extremely vulnerable to a 9/11-style attack on its fresh water aqueducts. Fuller wrote about this all the way back in the 60s in Operating Manual For Spaceship Earth:
Thus under lethal emergencies vast new magnitudes of wealth come mysteriously into effective operation. We don't seem to be able to afford to do peacefully the logical things we say we ought to be doing to forestall warring, by producing enough to satisfy all the world needs. Under pressure we always find that we can afford to wage the wars brought about by the vital struggle of "have-nots" to share or take over the bounty of the "haves." Simply because it had seemed, theretofore, to cost too much to provide vital support of those "have-nots." The "haves" are thus forced in self-defense suddenly to articulate and realize productive wealth capabilities worth many times the amounts of monetary units they had known themselves to possess and, far more importantly, many times what it would have cost to give adequate economic support to the particular "have-nots" involved in the warring and, in fact, to all the world's "have-nots."
(Samurai wallet is defunct, but the principle holds up)
Sure you can transfer the private key from one device to another, but (a) you can't know the other person didn't retain a copy of it and (b) you would be limited to spending the exact amount you have in an existing transaction because you couldn't send a transaction to the chain that splits it.
Transacting bitcoin private keys without going to the network and trusting the other party to not scam you defeats the whole purpose.
and so, being located in the middle of a city, you do not want tens of thousands of liters of diesel in tanks there.
The same applies to luxury high-rises in Europe; almost all, if not all, 20+ story buildings built in the last 30 years have them on their roofs.
Why? Every gas station stores that capacity.
It also depends on the data center; some data centers are right in the middle of a European city, in its densest part, and dense in Europe or Asia is something else than dense in America.
> and so, being located in the middle of a city, you do not want tens of thousands of liters of diesel in tanks there.
This is based on FUD. This is the case all over the US and they don’t cause problems.
It’s interesting that Europe has taken such a brittle approach to infrastructure.
No need for imaginary scenarios.
Indeed but it stands to reason that this outage will last maybe a few hours until the grid has recovered. A nationwide full blackout is a scenario that's on a "once a quarter century" level, and the last one in 2006 was resolved after two hours. It's Europe, not the US - our grids operate on much, much stricter requirements and audits on resiliency, hell since last year we got an active warzone in the ENTSO-E grid and it hasn't been too much of an issue!
Not much of value will have been lost in the meantime. The only ones who are truly and beyond screwed by such events are large smelters and similar factories where any prolonged downtime leads to solidification of the products which, in extreme cases, require a full reconstruction.
As for "I can't buy eggs in a supermarket now"... lol. People need to learn to chill down a bit. You won't die from having to wait a few hours to be able to buy the eggs.
I think you've left out a few things. I remember doing on-site work at a pharma company that required some downtime on one of their lines, and if we went over the allotted time they would have charged us up to 2 million EUR an hour. Hospitals and critical services SHOULD have backup generators etc., but depending on how long this lasts, a lot of things can become a major problem.
The majority of the cases will be fine, but when there's mass confusion and interruption like this, there's always horrible stories that come out.
edit: and Europe almost always has at least half a year of whole-country supply of natural gas in caverns and other storage.
None of that changes the difficulty of a black start. If there is a full outage, it will take a while to get going.
That's the beauty of the European grid: it is not a black start event for Spain, at least as long as even a single link to any of the neighbouring countries is available.
It might be faster to instead black start several independent power islands in parallel, and connect them together as a final step. At least in my country (Brazil), that's how it's done for large-scale blackouts, even when some of the country still has power; it was done that way for the partial blackout in 2023, and there's a written procedure on how to do it (which is available on the operator website, if you know where to look). In 2023, some areas failed to black start for one reason or another, and had to wait for power from the outside; other areas managed to black start as expected, and were then synchronized with other areas until everything came back together.
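For a flavour of what "synchronized" means when tying two restored islands back together, here's a toy synchro-check in the spirit of what a relay enforces before a tie breaker is allowed to close. The tolerance values are invented for illustration, not any operator's real settings.

```python
from dataclasses import dataclass

@dataclass
class IslandState:
    freq_hz: float      # measured frequency
    volt_pu: float      # voltage magnitude, per unit of nominal
    phase_deg: float    # voltage phase angle, degrees

def ok_to_close(a: IslandState, b: IslandState,
                max_df=0.1, max_dv=0.05, max_dphase=10.0) -> bool:
    """Classic synchro-check: only close the tie breaker when frequency,
    voltage magnitude and phase angle across it are nearly matched."""
    dphase = abs((a.phase_deg - b.phase_deg + 180) % 360 - 180)
    return (abs(a.freq_hz - b.freq_hz) <= max_df
            and abs(a.volt_pu - b.volt_pu) <= max_dv
            and dphase <= max_dphase)

# e.g. two restored islands drifting toward each other:
print(ok_to_close(IslandState(50.02, 1.01, 12.0), IslandState(49.98, 0.99, 7.0)))
```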
But honestly, dark starts are the kind of boomer self-made problem that we'll just have to work around.
Whoever built a solar grid inverter without the capacity for dark start needs a stern talking to
As long as even a single link to any of our neighbours is up and running, it can be used to start the rest of the grid - which is exactly what was done in the 2006 outage and why that one took barely two hours to be resolved. The only truly screwed country at the moment is Portugal because all their grid links run through Spain.
It's tempting to think of the grid as something grid operators control, feeding power from point A to point B, but the grid is actually largely uncontrolled - the power just flows wherever it wants to - and the only controls they have are turning on and off generators, adjusting their throttle, disconnecting loads (rolling blackouts) and sometimes opening circuit breakers (though this is not normally useful). They don't even have precise real-time monitoring of the whole grid - only specific measurements in specific locations, from which the rest is estimated using lots of maths (which is how you would design it too, if measurement devices cost $100,000 apiece). That's why it's not a trivial task to keep it working.
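The "lots of maths" is essentially weighted least-squares state estimation: infer the full grid state from a sparse but redundant set of measurements. Here's a stripped-down linear (DC-style) toy, just to show the shape of the computation; real estimators are nonlinear and vastly larger.

```python
import numpy as np

# Toy DC state estimation: estimate bus voltage angles x from a handful of
# redundant measurements z = H @ x + noise, weighted by how much each meter
# is trusted. Real grids do this with thousands of measurements and a
# nonlinear AC model; this only shows the shape of the computation.

H = np.array([[ 1.0, -1.0,  0.0],    # flow bus1 -> bus2
              [ 0.0,  1.0, -1.0],    # flow bus2 -> bus3
              [ 1.0,  0.0, -1.0],    # flow bus1 -> bus3
              [ 1.0,  0.0,  0.0]])   # angle reference at bus1
z = np.array([0.52, 0.29, 0.83, 0.0])   # noisy measurements (per unit)
W = np.diag([1.0, 1.0, 1.0, 100.0])     # trust the reference the most

# Weighted least squares: x_hat = (H^T W H)^-1 H^T W z
x_hat = np.linalg.solve(H.T @ W @ H, H.T @ W @ z)
print("estimated bus angles:", x_hat)
```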
However, you're able to have your own, private miniature grid, on which you can power your own loads from your own generators. It's even possible to do this with solar inverters! You will need to specifically seek out this capability, and get extra hardware installed, which is probably why you don't have it. You need a "transfer switch" to definitively disconnect your private grid from the main grid when you're using your private grid capability - it's not allowed (and not safe, and will blow up your equipment anyway if you force it) to just feed power onto your local unpowered section of the grid.
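The transfer switch plus island-capable inverter basically automates a decision like the following sketch. The voltage/frequency thresholds and reconnect delay here are illustrative, not values from any standard.

```python
import enum

class Mode(enum.Enum):
    GRID_TIED = "grid-tied"
    ISLANDED  = "islanded"

def next_mode(mode: Mode, grid_volt_pu: float, grid_freq_hz: float,
              stable_seconds: float) -> Mode:
    """Open the grid connection when the mains looks dead or badly out of
    spec; only reconnect after the grid has been healthy for a while."""
    grid_healthy = 0.9 <= grid_volt_pu <= 1.1 and 49.5 <= grid_freq_hz <= 50.5
    if mode is Mode.GRID_TIED and not grid_healthy:
        return Mode.ISLANDED      # disconnect; the inverter forms its own island
    if mode is Mode.ISLANDED and grid_healthy and stable_seconds >= 300:
        return Mode.GRID_TIED     # resynchronise only after sustained recovery
    return mode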
And while there are ways to maintain inertia https://www.renewableenergyworld.com/solar/grid-inertia-why-... I don't see why a solar farm can't do it through smart syncing of inverters (or maybe they do some measure of it)
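On the "smart syncing of inverters" idea: one common framing is synthetic (virtual) inertia, where the inverter injects extra power in proportion to how fast frequency is changing, plus a droop term for the steady error. A sketch with invented gains:

```python
# Virtual (synthetic) inertia sketch: an inverter boosts its output in
# proportion to how fast frequency is falling, mimicking a heavy spinning
# rotor. The gains and the power headroom are illustrative assumptions.

F_NOMINAL = 50.0     # Hz
K_INERTIA = 8.0      # kW per (Hz/s) of frequency change, assumed tuning
K_DROOP   = 2.0      # kW per Hz of steady deviation, assumed tuning
P_MAX     = 5.0      # kW of headroom the inverter keeps in reserve

def extra_power_kw(freq_hz: float, dfreq_dt: float) -> float:
    """Additional active power injected on top of the scheduled output."""
    inertial = -K_INERTIA * dfreq_dt              # opposes fast frequency swings
    droop    = K_DROOP * (F_NOMINAL - freq_hz)    # helps arrest the steady error
    return max(-P_MAX, min(P_MAX, inertial + droop))

print(extra_power_kw(49.85, -0.2))   # falling frequency -> inject more power
```

The catch is that this only works if the inverter keeps headroom (or has a battery behind it), which is exactly the capacity a pure solar farm sells by default.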
Talking about "national" in the sense Spain (pop. 48M, 506,030 km²) is roughly equivalent to a few US states. A similarly (population/area) sized outage occurred a couple of decades ago:
* https://en.wikipedia.org/wiki/Northeast_blackout_of_2003
North America is organized in regional grids:
* https://en.wikipedia.org/wiki/North_American_power_transmiss...
Texas, on the other hand, which is easily the size of a country...
It is a known fact that in general the US power grid is orders of magnitude less reliable than in Europe. And the excuse that "the weather is more extreme" is just that: a lame excuse.
Just count the number of American households that have generators and/or batteries vs the Europeans if you really have an honest desire to know anything about anything.
CA of course has rolling blackouts for other reasons.
A few more interconnects with the rest of the country and it wouldn't have even made the news.
this is after decades of Texans bragging about their independent power supply. Many Texans still believe outright lies about the blackout, like it being "caused" by green energy sources, which was false.
It was caused by free market participants not spending capital to harden their network. Solar panels and Wind Turbines work great in the cold climate of Canada.
The storm that caused such a problem is a once every ten years storm. The grid companies all should have foreseen this with even minimal investment in planning. They didn't, because that's less profitable, and the "regulator" in Texas has no ability to punish them for pinching pennies on reliability and resilience.
Free Market at work baby!
This is incorrect. That storm set multiple records, most notably the longest freezing streak the state has ever experienced [1]
Houston, San Antonio, Austin and Waco hit 30 year lows while Dallas set 80 year lows.
It also hit the entire state at the same time.
Maybe there's validity to some of the rest of your post, but that storm was absolutely not a regular occurrence.
1. https://www.ncei.noaa.gov/access/monitoring/streaks/mapping/...
As was pointed out, the USA has three independent grids (east, west, and Texas) and EU countries are roughly comparable to states (except with less federal power). The equivalent of a European nationwide blackout would be a US statewide blackout, and those HAVE happened, definitely within your lifetime if you're old enough to use Hacker News, mostly in Texas.
I had a long blackout as a kid during a hurricane in 1985. Once it was safe it was repaired rather quickly.
The cash registers, though, had backup power, so the store could still take their money.
Apparently when this had been done in the past shoppers were generally honest & relatively accurate.
This is actually exactly the case that I had on one trip to Andorra: the power was down for 2 hours while we were choosing equipment for skiing. The shop had no issues getting our orders done, though, because they just manually filled the orders with pen and paper and took the payment with a credit card terminal connected to a smartphone.
If your city has an extended power outage, the cell nets could easily be down as well.
And I am not saying that you shouldn't accept money as backup, of course you should. But what I am saying is that you can still accept credit cards even during most power outages.
Same as Software Engineer, it is impossible to have perfect, 100% reliability, but it doesn't mean we can't improve from 99% to 99.9%, for example, to have a better service.
Credit cards and payment networks have always explicitly supported "Offline" processing like that.
The kind of fraud that system enables isn't really common.
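Roughly, offline processing amounts to a floor limit plus store-and-forward. A toy sketch of the idea; the limit and structure here are made up for illustration, not any card scheme's actual rules.

```python
from dataclasses import dataclass, field
from typing import List

OFFLINE_FLOOR_LIMIT = 50.00   # assumed per-transaction cap while offline

@dataclass
class StoredTransaction:
    card_token: str
    amount: float

@dataclass
class OfflineTerminal:
    queue: List[StoredTransaction] = field(default_factory=list)

    def sale(self, card_token: str, amount: float) -> bool:
        """Accept small transactions without online authorization and queue
        them for later submission; decline anything over the floor limit."""
        if amount > OFFLINE_FLOOR_LIMIT:
            return False
        self.queue.append(StoredTransaction(card_token, amount))
        return True

    def settle(self, send) -> None:
        """Once connectivity/power is back, forward everything for clearing."""
        while self.queue:
            send(self.queue.pop(0))
```

The merchant carries the risk for whatever doesn't clear later, which is why the floor limit is kept small.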
Without electricity the water system depressurized, which contaminates it. After about a week the sewage pumping stations have backed up so the sewer system is starting to fail.
Modern cities cannot operate without electrical power given their scale and density.
It is bizarre to think the biggest problem is "how do we keep a transaction of value?"
Like, just declare an emergency and let business owners be reimbursed by the government.
I know someone who works at a supermarket, and (some of?) their point of sale (POS) systems have a small UPS that can run for a couple of hours to ride through smaller outages.
PoS systems aren't particularly power-hungry, but store owners never want to spend an extra cent, so they go with the smallest UPS they can manage. (And arguably if they went with a big overkill UPS, its after-outage recharging power would be larger so you'd be able to put fewer registers on a single circuit, so it's not as simple as just dropping in a bigger UPS.)
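For scale, the runtime arithmetic is simple; these numbers are guesses for a small register-plus-printer load, not measurements.

```python
# Back-of-the-envelope UPS runtime for one point-of-sale register.
# All figures are assumptions; check the real nameplate numbers.

BATTERY_WH = 180      # usable energy in a small consumer UPS (assumed)
LOAD_W = 60           # register + receipt printer + scanner, idle-ish (assumed)
INVERTER_EFF = 0.85   # typical small-inverter efficiency (assumed)

runtime_hours = BATTERY_WH * INVERTER_EFF / LOAD_W
print(f"~{runtime_hours:.1f} hours at {LOAD_W} W")   # ~2.6 hours
```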
That's insane to me; in the EU, anyway, it's not permitted to only accept electronic payments.
> Retailers cannot refuse cash payments unless both parties have agreed to use a different means of payment. Displaying a label or posters indicating that the retailer refuses payments in cash, or payments made in certain banknote denominations, is not enough.
That's not the case. There are individual laws in each country that govern this.
https://fullfact.org/online/UK-not-only-europe-country-legal...
Either way, there should be no reason grocery stores don't accept cash imo.
https://www.ecb.europa.eu/euro/cash_strategy/faqs/html/index...
Only in a handful of cities and states. There is no federal law requiring businesses to accept cash for goods and services.
But in this case, an emergency, I would assume someone would still know how to take a manual payment receipt!
When the power is out one cannot pay with cash either - because the cash register is offline.
Most retail workers are GenZ and struggle to understand what this would even look like because they’ve never conducted any transaction without POS computers (for looking up prices, for tallying them, for figuring tax and total, and computing change), so even if a dusty manual in the stockroom technically spells out a method of ringing sales using nothing but pen and paper and maybe a solar calculator, I would be surprised if any of the clerks working any given day would have the initiative to initiate an offline protocol. Most likely the store manager would usher customers out, lock up the store, keep the staff for 30 minutes to see if it came back on, and then go home.
It's not like you can (could?) keep a block "just in case", and thus many shopkeepers wouldn't even bother in case of outages.
Depending on where you live, good old trust can be a currency. Humans are great when it comes to adaptation; I bet I could just write my name and CC number on a piece of paper, leave it for the shopkeeper, and everything would resolve just fine.
I've only seen a few, but I believe they have springs on the inside and roll on little wheels, similar to how desk drawers roll. Most can be opened with a key to trigger that event.
(And in many cases you cannot legally pay large amounts of money in cash, it has to be electronic)
So it is perfectly legal to use pen and paper and a cash box.
https://www.lexware.de/wissen/buchhaltung-finanzen/neue-rege... https://www.lexware.de/wissen/buchhaltung-finanzen/kassenbon...
Many other EU countries have similar regulations, and in some cases had them for a long time.
Receipts or invoices are the basis for a firm's whole economic activity, including the underpinning of their financial reporting, their tax burdens etc. And businesses failing to provide receipts erodes not only the tax base, but also any rights a consumer may have.
Also, a fuel station can probably run its own backup power successfully ;)
Cash registers can be connected to small UPSes to ride through smaller outages. You wouldn't need a larger battery if all you want to do is ride through a few-hour outage, or even a whole business day (8-12 hours?).
Maybe just my perception, but power outages seem to be getting rarer with time, though when they do happen they seem to be far larger.
Are you suggesting to attack Russia, based on absolute thin air?
Except for the mountain of evil, violent, underhanded and illegal stuff Russia keeps getting caught doing, and has so far escaped scot-free due to Western cowardice?
I agree that Russia is evil, but saying that it got away with it scot-free is just insane. In fact it even plays into Russian propaganda to say that nothing really harmed them, even after the past 3 years of international sanctions and consequences.
And Western cowardice? Apart from nuclear war (which I'm glad people are being "cowardly" about), what do you suggest? It's also funny when war mongers online talk about cowardice. Why don't you go volunteer in Ukraine if you're such a non-coward?
> It's also funny when war mongers online talk about cowardice
> Why don't you go volunteer in Ukraine if you're such a non-coward?
This is an impressive feat. You've managed to package not one -- not two -- but THREE Russian propaganda talking points into a single paragraph. I believe I've just witnessed the second coming of Alexander Pushkin.
Are you going to actually dismiss the fact that yes, nuclear war would be a very likely possibility if we declared war against Russia? Much more likely than... Not doing that?
Or are you going to use the Reddit argument that somehow it just won't happen because "trust me bro", and that only a Russian shill would care about getting nuclear-bombed?
And yes, it's cowardly to call people cowards because they won't push for a war that you yourself won't fight. I didn't bring up cowardice first by the way, but I guess it's fine to do it when you agree with the person lol.
By the international community of course. The same one that has heavily sanctioned Russia.
> And as another comment said, are you volunteering? You can even wage war against Russia now if you want to, volunteers in Ukraine are always welcome.
That's just a cheap deflection. Me dying in the trenches won't change anything; a coordinated properly aggressive response might teach the war criminals at the Kremlin that they will not be let off. It's just a repeat of Munich/Ethiopia, and no individual at our level can do anything about it but advocate for change in policy to our governments.
And it's funny to say that it's a repeat of Munich / Ethiopia. Again, war mongers will always prefer comparing to the few times where war was the only solution. The only issue is that not only do both situations have nothing in common (not that anyone cared about Ethiopia, since European colonialism in Africa was nothing new), but it also erases the fact that we are talking about nuclear powers here. If Hitler had nuclear weapons pointed at London and New York back in 1938, then obviously the equation wouldn't be as simple as "stop him now" even with hindsight lol.
Other people are dying, and more people will until the aggressor is stopped. The aggressor has shown loud and clear that he has no intention of stopping.
> Again, war mongers will always prefer comparing to the few times where war was the only solution
Because the situation is comparable. Putin has clear and overt ambitions for further expansion (one of his minions already published the plans, with Moldova next). If he isn't stopped, like he wasn't in Georgia and Crimea, he'll just continue. So the comparison is very apt. Yes, nuclear weapons add a serious new dimension to the problem. That doesn't mean we should let Putin get away with war crimes.
Edited to add: https://kyivindependent.com/russia-military-building-up-at-f...
Speaking strictly for myself, given that I want to avoid having people think I'm apologising for the Putin regime, I do not ever use these arguments in good faith.
It's because the modus operandi of war mongers, since before world war 1 has always been to agitate for wars that they don't want to fight for themselves. We are talking about nuclear war here, the bad faith argument is to imply that being against it is a sign of being a paid Russian shill. I guess the Pentagon is full of said shills, since they have never even hinted towards a direct war against Russia, or wanted to get directly (by means of troops on the ground) involved in one.
Now I agree that sometimes it's simply not realistic to go all "well do it yourself then". But in this case, Ukraine accepts, trains and equips volunteers. It's not a hypothetical.
If so, then the question would be whether Russia did plant that tree. We should look out for more suspicious trees in our immediate areas.
See some tree squatting where it shouldn't? Walnut or Vatnik? You can never be sure...
The Western style of life is not only the EU but also the USA. I do not know how people can even doubt this lol.
If they wanted Western life in Russia, then the establishment would make changes to have it there, no? Russia is NOT a democracy; it is a tyranny, an autocracy. Again, it is not a narrative, it is what they do there lol
Hating the West is just an ideology given to the plebs.
`any != all` after all.
Your argument is essentially: because some Russian people send some of their children to be educated in the West, or buy some property there (as a proportion of how many?), the claim that the Russian state dislikes the EU holds no water.
To me, it's hardly evidence of anything, just like how some people in the UK fetishise Russia, yet the UK government is actively hostile to Russia and condemns its actions towards Ukraine without hesitation.
The "hate west" narrative is pushed because it makes sense during the war. If Putin decides now praising the west will let him keep the power the propaganda machine will do a 180 turn
So they HATE the West no matter what they say; you are correct in that.
But you are drawing the wrong conclusion:
it is not that the machine is the bad thing BUT the people are good.
They ARE bad actors, whether they use the propaganda machine that way, any other way, or not at all. They are bad actors, period. The propaganda machine is a separate thing.
Contrary to the West, in Russia you get beaten by the police because your children in the West posted something on X/Twitter...
Alexei Navalny, Boris Nemtsov, Boris Berezovsky, Sergei Magnitsky, Stanislav Markelov, Anastasia Baburova, Natalia Estemirova, Anna Politkovskaya, Yuri Shchekochikhin, Vasily Melnikov, Vladislav Avayev, Sergey Protosenya, Yevgeniy Palant, Yuri Voronov, Ravil Maganov, Vladimir Sungorkin, Anatoly Gerashchenko, Vadim Boyko, Vladimir Makei, Grigory Kochenov, Vladimir Bidenov + Pavel Antov, and thousands of others.
The most spectacular was Pyotr Kucherenko, where two men held him and a third put a shopping bag over his head, and no one saw anything on the whole plane... except for the three photos that were taken of the incident...
All the money, humanitarian aid, weapons, intelligence, training and geopolitical backing beg to differ.
The only positive aspect of this is that after the root cause is found, the grid will become more resilient in the long term (but these kinds of changes typically take a long time).
The goal would be to create enough pressure from people frustrated by problems like power cuts that governments must withdraw their support for Ukraine.
Any "WW III" fearmongering is similar : intimidate everyone into withdrawing support.
Many European countries have created emergency guides to help citizens prepare for crises like this one. [2] This, I guess, has the underlying goal of maintaining trust in European governments.
[1] : https://www.politico.eu/article/russia-increasing-hybrid-att...
[2] : https://www.reuters.com/world/europe/eu-commission-urges-sto...
... Wait, how are you defining that? Much of the EU is about as close as it is possible to be to being at war with Russia without actually sending in troops.
So the blackout is an attack from Russia. So stop spreading lies of the terrorist Russian state.
But Russia is an aggressive authoritarian state that was already caught in (smaller) acts of sabotage in the EU, some of them quite dangerous. Why are they doing this? Who knows; the war in Ukraine was not rational either. Perhaps some people want to be evil just for the sake of being evil.
As a Russian emigrant, I long ago stopped trying to rationalize Kremlin decisions. Why are authoritarians authoritarian? Who knows. Mad with power or something.
You cannot control stable governments, so you destabilise them with various tools for prolonged periods of time and then you end up with a country which is much easier to influence.
Same with the undersea cables.
Some Western companies banned Russia by IP, like Intel, but in general my list of websites to tunnel through a VPN is rather short, like a dozen, and mostly to unblock YouTube, as Meta and Twitter are cancer anyway.
After the multiple sabotages, killings, corruption, as well as the invasion of a neighbor country, we have some reasons to think Russia is a bad state actor.
"Putin channels ultranationalist discourse, such as the Izborsk Club and the neo-fascist Alexander Dugin, in calling for quasi-religious rebirth of Russian dominance, an agenda that seeks to swallow “Little Russia” into a renewed Russian empire that stretches from “Lisbon to Vladivostok,” a phrase popularized by Dugin and repeated by Putin."
https://brill.com/view/journals/joah/4/1-2/article-p126_10.x...
>renewed Russian empire that stretches from “Lisbon to Vladivostok,” a phrase popularized by Dugin and repeated by Putin."
This is a direct lie. Putin has never said this.
And one of the greatest lies being spread about Putin is that he intends to conquer Europe and recreate the Russian Empire.
Moreover, they are unable to just live and let live, and actively go out of their way to make other people's lives miserable. This is due to pervasive zero-sum thinking in Russian strategy. They are fixated on the idea that in order for Russia to 'win', others must suffer and lose.
I would surmise that the Russians think that Spain and Portugal are cowed, and want to keep them intimidated and prevent them from increasing their aid to Ukraine.
They have already assassinated people in Spain last year
https://www.politico.eu/article/maxim-kuzminov-russia-ukrain...
I do not really think that this needed to be the Russians' work, though. Spain and Portugal are really rather far away, and it would be a massively idiotic move even for them.
Where? This is the first I have heard of it.
Local power outages are probably the most common "disaster" one should prepare for.
But a whole country’s grid can go down like this in an instant?
On top of that, Ukraine inherited a lot of nuclear power plants, and despite losing the largest one in Zaporozhie region, still operates all the other as Russia doesn’t attack them in any serious way.
No miracles, just pumping money non stop.
That's why you still need a strong diesel/diesel-electric locomotive fleet. Imagine if Spain had been right in the middle of a military mobilization and military materiel transport; an event like this one would have stopped them dead in their tracks had they been relying only on electric locomotives.
https://www.outono.net/elentir/2021/10/23/the-spanish-army-r...
> For journeys outside the base, *the Army uses Renfe locomotives*.
which, in my understanding, means that in order to move military materiel (to the border with France, say, or to the nearest seaports, most probably) and tens to hundreds of thousands of mobilised men, the Spanish Army does indeed rely on Renfe locomotives, i.e. not on its own.
This is the absolute worst thing to do when there is a shortage of power: you immediately make the shortage worse and cause more grid disconnects.
The real fix is a grid with second-by-second pricing based on system frequency, with every individual user allowed to set a daily 'spend cap' in euros/dollars, letting them choose how much they are willing to pay for reliability.
Such a market has a huge stabilizing effect on demand, meaning a major incident would probably have only fairly small impacts on system frequency, and embedded solar wouldn't disconnect.
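To be clear about what's being proposed, here's a toy model of it, not an endorsement; the price curve, gains, and cap logic are invented purely for illustration.

```python
# Toy model of the proposal above: price electricity off the instantaneous
# grid frequency and let each household cap its daily spend. The curve and
# thresholds are invented purely for illustration.

F_NOMINAL = 50.0
BASE_PRICE = 0.20     # EUR/kWh when the grid is balanced
PRICE_SLOPE = 2.0     # extra EUR/kWh per Hz of under-frequency (assumed)

def spot_price(freq_hz: float) -> float:
    """Price rises as frequency sags (shortage) and falls as it climbs (surplus)."""
    return max(0.0, BASE_PRICE + PRICE_SLOPE * (F_NOMINAL - freq_hz))

def allow_load(freq_hz: float, spent_today: float, daily_cap: float,
               load_kw: float, interval_h: float = 1 / 3600) -> bool:
    """A household controller sheds flexible load once its daily cap is hit."""
    cost = spot_price(freq_hz) * load_kw * interval_h
    return spent_today + cost <= daily_cap

print(spot_price(49.9), spot_price(50.1))   # shortage vs. surplus pricing
```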
Solar PV is great but is mostly grid-following, so it cannot operate on its own. As I understand it, you need a minimum fraction of power generation to come from large spinning turbines.
I think this problem can be mitigated with add-on rotational-mass kinetic energy storage or something like that. I don't think variable energy pricing will help; if it's an issue of over-demand, the grid managers can do rolling blackouts to manage while fixing the supply problems. The grid is just broken at the moment, and solar can't maintain the grid alone.
Only "small stuff" IBRs need a leading frequency from the grid and disconnect outside their safety corridor because those usually aren't controllable from some central grid authority. Thus the stupid-but-safe behaviour mandated for them.
We do not have a solid understanding of how inverter-based fast frequency response works with an existing grid that uses physics-based inertia.
I've seen some papers saying that they help stabilise grids.
That made sense before technology became available for everyone to make their own choice - but that is no longer the case.
Let's skip the technical problems in your theory and focus on the social.
People need power to survive. You know, food, hot water, light, work, internet, mobile phones, entertainment, etc. This requires stability, not second by second pricing.
When you put a chicken in an oven, you want to cook that chicken and eat it, feed your family. Electricity price rising in the next few minutes would mean that you either have to risk disease (chicken staying in the dangerous temperatures until the electricity price drops) or being hungry and throwing food away. This is not how you want society to function.
Believe it or not, maintaining an electricity grid is a massive undertaking, and the people in charge of it know the topic much better than you do.
The problem isn't a market problem, it's a physics problem: having a synchronized grid of AC current with many producers over a wide area is a real challenge, even when the underlying issue is resolved it takes a lot of time to add the power plants (or renewable equivalent) to the grid because they must be synchronized.
Also, nobody in the field disagrees that in the more distributed grid we are seeing today, more endpoint communication and control could lead to more resilience. Whether pricing signals are the best path is a more open question, but they certainly appear to be a feasible option.
No it doesn't. The fact that it's being said in a comment full of nonsense tells me that they don't have “above-average understanding”. They probably have read something, once, and now thinks they are an expert, that's literally what Dunning-Kruger is about.
They seem to believe that the equilibrium of supply and demand is all that matters, when it's just one piece of the puzzle, and among the easiest to manage. Large, nation-scale failures like this one are very unlikely to be caused by a lack of supply alone, and markets are nowhere near fast enough to help prevent them.
Like, what can you do, use some 1,000 MW to melt iron rods or something to give the power stations time to slow down? Flywheels?
Don't you realize that the smaller the grid, the more important the instantaneous load variations are in relative terms, and the harder it is to keep things running smoothly? It's not a theoretical concern; it's why electric networks on islands are much harder to work with and much more prone to collapse than bigger networks.
The bigger the grid, the more efficient and resilient it is (and managing electric grids on islands is a nightmare), but it comes with a significant complexity and means restarting from zero is harder.
I thank the heavens that the people who run the electricity system do not share your opinions.