From Wikipedia:
"Some of the mined uranium was found to have a lower concentration of uranium-235 than expected, as if it had already been in a nuclear reactor. When geologists investigated they also found products typical of a reactor. They concluded that the deposit had been in a reactor: a natural nuclear fission reactor, around 1.8 to 1.7 billion years BP – in the Paleoproterozoic Era during Precambrian times, during the Statherian period – and continued for a few hundred thousand years, probably averaging less than 100 kW of thermal power during that time. At that time the natural uranium had a concentration of about 3% 235U and could have reached criticality with natural water as neutron moderator allowed by the special geometry of the deposit."
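As a sanity check on the quoted "about 3% 235U" figure, you can run today's abundances backwards through the two half-lives; a minimal Python sketch (half-lives in billions of years and present-day abundances are standard values, assumed here):

```python
import math

# Back-calculate the U-235 fraction 1.8 billion years ago from
# today's ~0.72% abundance, using both isotopes' half-lives.
HL_U235 = 0.704   # U-235 half-life, Gy
HL_U238 = 4.468   # U-238 half-life, Gy
t = 1.8           # time of the Oklo reactor, Gy ago

# Undo 1.8 Gy of decay for each isotope separately.
n235 = 0.0072 * math.exp(math.log(2) / HL_U235 * t)
n238 = 0.9928 * math.exp(math.log(2) / HL_U238 * t)
frac = n235 / (n235 + n238)
print(f"{frac:.1%}")  # roughly 3%
```

The shorter-lived U-235 decays faster, so its share of natural uranium was much higher then, which is why ordinary water could moderate the deposit into criticality.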
Even if it were, over time safety increases and such costs decrease, more so with further development.
More broadly, nuclear looks expensive not because it's unproductive, but because standard asset pricing discounts its most valuable feature: time. Dense, stable power for centuries yields a low net present value because of the long duration of use and high up-front cost, but this is more a flaw in how future value is discounted in common economic models, which punish rather than reward long life.
I'm not against nuclear energy in principle; it just seems to be a technology that becomes more expensive instead of cheaper, has enormous costs beyond energy production (decommissioning, waste management), and is subject to extremely rare failures that threaten to evaporate any gains from the previous decades or centuries. I don't even think it's that dangerous for people: the victims of Chernobyl and Fukushima have been a tiny number. It just seems economically not worth it.
If you’re arguing the chosen discount rate is too high in some models, we can have a productive discussion about that.
If you’re arguing the methodology is wrong, you’ll need to explain more before I understand your point of view, or perhaps you’ll be interested in lending me $1M today and I’ll pay you $100/day for the next 55 years, by which time you’ll have more than doubled your money.
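To make the loan analogy concrete, here is a minimal sketch of the discounting at work (the 5%/year rate is purely an illustrative assumption): the $100/day stream totals over $2M nominally, yet is worth well under the $1M principal once discounted.

```python
# Hypothetical loan: lend $1M today, receive $100/day for 55 years.
def npv_daily(payment, years, annual_rate, payments_per_year=365):
    """Present value of a fixed payment stream, discounted per payment."""
    r = (1 + annual_rate) ** (1 / payments_per_year) - 1  # per-payment rate
    n = years * payments_per_year
    return payment * n if r == 0 else payment * (1 - (1 + r) ** -n) / r

nominal  = npv_daily(100, 55, 0.00)  # undiscounted total: just over $2M
npv_5pct = npv_daily(100, 55, 0.05)  # discounted at 5%/yr: under $1M
print(f"${nominal:,.0f} nominal, ${npv_5pct:,.0f} at 5%/yr")
```

So "you'll have more than doubled your money" is true only in nominal terms; at any ordinary discount rate the lender loses.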
Upper Midwest USA / Middle Canada? probably pretty darn safe.
But I think it's a fallacy to claim that natural phenomena should inherently be considered "environmentally safe" in human terms. There are coal seam fires that have been burning for centuries, and their pollution is just as bad as that generated by human-caused coal mine fires (and that's truly awful: a significant source of carbon pollution).
[1] The problem with nuclear reactors isn't that their pollution couldn't be disposed of with ideal methods, but that when they're run by for-profit corporations, you will always have the company skirting the edge of what's safe, because corporations just go bankrupt after catastrophic events, so their risk-reward behavior isn't the risk-reward optimum for humanity.
Has CO2 fire suppression been unsuccessfully attempted in these seams? Since nobody is underground and we know how to inject CO2 into underground deposits at various pressures, it seems like it would be a good candidate. Plus, with rotary steerable drilling, we could come in laterally (from a safe location above ground) to as many depths of injection as necessary.
2) transportation to the site: https://static.ewg.org/files/nuclearwaste/plumes/national.pd...
3) exploding waste barrels due to corner cutting in kitty litter selection exposing surface workers and contaminating the work area - only 1/2 mile down but this type of accident is depth independent https://www.latimes.com/nation/la-na-new-mexico-nuclear-dump...
4) fires
5) lack of a safety culture
6) communicating to future peoples not to mine here
7) long term structural stability and management (ex: Morsleben radioactive waste repository and Schacht Asse II)
3) if a waste barrel somehow explodes underground, how does the waste make its way through a mile of bedrock?
4) Again, how does a fire bring the waste up through a mile of bedrock?
5) This is just a vague statement.
6) So the concern is that future society will forget that this is a waste site, mine a mile deep and retrieve the waste, and never figure out that the waste is bad for them? This is a rather specific hypothetical that IMO demonstrates just how hard it is for a nuclear waste site to result in contamination.
Furthermore, naturally occurring uranium exists in groundwater and needs to be filtered out in places where levels exceed safe limits. So it's not like burying waste is creating a new problem: https://www.kqed.org/stateofhealth/120396/uranium-contaminat...
So what happens if uranium from nuclear waste somehow works its way into the water supply? We'll detect it and remove it in water treatment, just like how we remove contamination from naturally occurring uranium.
5) Industry term. Operationalizing any significant system will involve human beings, and with them their workplace culture. You can read about it here: https://mshasafetyservices.com/fostering-a-culture-of-safety.... Many mining safety rules were written in blood.
6) No, the concern is that people may be harmed. You see we've lost track of radioactive waste in the past. And humans are remarkably curious. Often we've figured it out before anyone was harmed. Sometimes sadly not. But the harm is the concern, not the lack of knowledge of harm.
And again, the question remains: how could people be harmed by nuclear waste buried in bedrock half a kilometer underground? Even if a buried waste canister spontaneously combusts, how does the waste make it through half a kilometer of rock? In order for an unknown harm to occur, harm first has to actually occur.
This kind of appeal to an unknown harm can be used to arbitrarily object to anything.
"We need to stop building solar panels and wind turbines because they have the potential to cause an unknown harm. You disagree that these systems have the potential to cause harm? Well of course you can't know this, because it's an unknown harm that we're trying to prevent. How can you possibly disprove the existence of an unknown harm?"
> Care to elaborate on what you mean by this? Because even if you include Chernobyl, nuclear power is one of the safest forms of energy generation: https://ourworldindata.org/safest-sources-of-energy. It's 100x safer than dams. Include only Western plants and it's the safest form of energy generation.
I should also add that on average nuclear power releases less radioactivity than coal.
I grew up in a place and time where nuclear waste was routinely dumped, records lost, EPA government consultants lied, and people got sick. Nobody was held accountable other than token fines.
Can you provide even one example where nuclear waste from power generation - not nuclear weapons production - got people sick in the United States?
https://www.ncronline.org/earthbeat/government-workers-were-...
https://www.kansas.com/news/local/article49479255.html
The local uranium mills were primarily weapons related: fuel for breeder reactors.
For the power industry we have to drive to the other side of the state, over to Hematite, where each time a former employee comes down with any rare cancer from a long list, it's assumed to be from working at the plant.
What about mining waste causing increased cancer and largely poisoning a river? https://en.m.wikipedia.org/wiki/Church_Rock_uranium_mill_spi...
"Pre-burnup doesn't count" is exactly what an abusive ex would say.
> What about mining waste causing increased cancer and largely poisoning a river?
What about it? Mining copper and rare earth minerals for magnets is polluting too. Producing aluminum to build transmission lines is also polluting. Mining, in general, is a pretty dirty industry. But surely nobody is suggesting we stop building electric motors or transmission lines? Uranium mining is not an exception in this regard.
You've given three examples, and none of them involves contamination from spent nuclear fuel from power generation.
I have no more energy to give people who cannot be precise with their requirements. I get enough of that at work.
Please note that these are both chemically and radioactively harmful to people.
> Sure, there were plenty of bad nuclear waste disposal programs in the early cold war, but this has quite limited relevance to nuclear power generation.
That's what they said in the 00s, 90s, 80s, 70s...
> In order for an unknown harm to occur, harm first has to actually occur.
Nuclear power is an incredible technology, but understand that the nuclear industry has done little to earn trust. It just feels like an abusive ex plastered on the porch shouting "it'll be different this time, I've changed", and that doesn't inspire confidence.
Again, the point is that your link is about disposal of plutonium from nuclear weapons production, not spent uranium fuel from power generation.
> Nuclear power is an incredible technology, but understand that the nuclear industry has done little to earn trust. Just feels like an abusive ex plastered on the porch shouting "it'll be different this time, I've changed" and doesn't inspire confidence.
Care to elaborate on what you mean by this? Because even if you include Chernobyl, nuclear power is one of the safest forms of energy generation: https://ourworldindata.org/safest-sources-of-energy. It's 100x safer than dams. Include only Western plants and it's the safest form of energy generation.
It's not like an abusive ex promising to have changed. It's a lot more like a very respectful partner that your hippie friends hate for incoherent reasons.
It seems like a pretty obvious solution to this would be to purposely do the reaction under controlled conditions before transporting it, so then you're transporting stable cesium compounds instead of elemental cesium metal.
> Cesium will be the primary radionuclide released in a nuclear waste accident because it is present in what is called the fuel-clad gap. This gap is the space between the fuel pellets and the inside wall of the metal tube that contains the fuel. This “gap cesium” can be released in any event where the cladding is breached. Cesium is a highly reactive metal and even a small break in the seal will release significant amounts of it. Cesium burns spontaneously in air, and will explode when exposed to water.
Obviously the "highly reactive" applies to elemental cesium and is meant to imply that a collision would be a serious problem because exposing it to air would cause a big fire and release a plume of radioactive material. If that isn't the case then it seems like the thesis of the paper is rubbish?
Cesium is extremely reactive, as is noted. In particular, it will readily reduce U(+4) to U(+3). Nuclear reactor fuel is primarily uranium dioxide, so there is ample material there for this putative metallic cesium to react with. Cesium is the most electropositive element, so it will give electrons to (reduce) almost anything.
The state of cesium in the vapor gap will be relatively volatile cesium compounds, like cesium iodide. The core temperature of a uranium dioxide fuel pellet greatly exceeds the normal boiling point of this salt.
And what you linked is still under construction. We don't yet know if it is really safe long term, or whether there will be future costs.
Now I believe it can be done safely, but only if monitored all the time with good care. But that is expensive and humans tend to skimp.
The only real scenarios are deliberate excavation, and a meteor impact directly on the waste repository. Neither of which are particularly likely scenarios.
Are we supposed to hold off on developing the only geographically independent and non-intermittent form of clean energy because of some vague nebulous fear that waste buried half a kilometer deep in bedrock will come back up to the surface and harm people... somehow?
Or rather we do know that the initial promises of reactor safety were also quite overconfident. So people assume the same of permanent storage of the waste.
It's not even open yet.
There is a difference between “something can be done correctly” and “something is likely to be done correctly.” Nuclear advocates I’ve read tend to argue the former - it’s possible to have safe reactors, it’s possible to keep the waste sequestered safely, there’s not a technical reason why nuclear is inherently unsafe. Skeptics tend to be making a different argument - not that it’s not possible to do things safely and correctly, but that in our current late-capitalist milieu, it’s almost impossible that we _will_. It’s not an argument about capability, it’s an argument about will and what happens in bureaucracies, both public and private.
Whether it's technology, economics, or politics, clearly the state of the art is deficient because we currently have persistent deficiencies.
Digging a shaft half a kilometer into bedrock and sealing it is not state of the art.
It's not even a matter of mundane human error when executing procedures over and over again.
It's that the entire managerial pyramid gradually erodes.
The Asse II site used an existing mine to avoid having to excavate a new tunnel, which subsequently flooded.
It would be like having a discussion about green energy and insisting that people should assume dams will fail or that blades are going to fly off of turbines.
Chernobyl was state run.
(I'm pro-nuclear but that's a hilariously bad argument.)
Which is kind of a problem for future burials because humans exist now and want and know how to find uranium.
The time between humans cracking the atom and the excavation of this nuclear waste is only a few decades. It took less than a hundred years for humans to find this nuclear waste in the ground.
Your argument is not well-founded. Burying nuclear waste for it to be discovered and excavated in less than a century is not nearly long enough.
We at least have pretty good evidence that nuclear fission products can be exposed to groundwater/hydrothermal fluids for a pretty long time.
I only know (or knew) high school physics, and when entering this into Claude I get an answer but am unable to verify it. Claude says 680 kWh gained per 0.03 grams of U-235 lost due to fission. I am left wondering what the U-235 fizzed into (sorry, pun) and whether I should take that into account.
Edit: There we go with modernity. I went to Claude instead of Wikipedia. Wikipedia at least has the answers. Thanks u/b800h. 100 kW of heat on average. I can start filling in the blanks now.
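For what it's worth, the ~680 kWh figure checks out against the standard rule of thumb of roughly 200 MeV released per U-235 fission. The atoms "fizz" into fission products (barium, krypton, and friends), and only about 0.1% of the fissioned mass is converted to energy. A minimal sketch:

```python
# Check "~680 kWh per 0.03 g of U-235 fissioned" using the
# rule-of-thumb ~200 MeV released per fission event.
AVOGADRO = 6.022e23
MEV_TO_J = 1.602e-13      # joules per MeV
E_PER_FISSION_MEV = 200.0

grams = 0.03
atoms = grams / 235.0 * AVOGADRO            # atoms of U-235 fissioned
energy_j = atoms * E_PER_FISSION_MEV * MEV_TO_J
energy_kwh = energy_j / 3.6e6               # 1 kWh = 3.6 MJ
print(f"{energy_kwh:.0f} kWh")              # roughly 680 kWh
```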
There is an entire scientific publication on the topic if it interests you:
https://www.sciencedirect.com/science/article/abs/pii/S00167...
With that in mind, is it really surprising that you don’t get the ‘right’ answer out? Any more than if you compress an image with JPEG, a given pixel isn’t the ‘right’ color anymore either?
They’re both close (kinda) at least, which is the point. If you wanted the exact right answer, don’t use lossy compression - it’ll be expensive in other ways though.
When I'm in research/discovery mode, I use Perplexity. Its search/analysis is a lot slower than a Google search, but saves me time overall and generally gives me solutions that I'd have to spend time sorting through a Google search to find, in less time than it takes to do so.
However, uranium ores are often formed by redox processes, since U(VI) is much more soluble than U(IV). So maybe concentrated deposits wouldn't have been as common before the Great Oxygenation Event about 2.4 Gya. Still, that leaves ~600 My between that point and this reactor, which is not quite one half-life of U-235.
Heh. The garbage web software developer me would have just called it good enough
Would be really interesting to know what the error bars on those figures look like
[1] https://physics.nist.gov/cgi-bin/Compositions/stand_alone.pl...
That's related to the material of our solar system all coming from the same supernova explosion or similar, right? Does this apply to our entire Milky Way or just the solar system? What if parts collided with material of _other_ origins and some of that is on Earth? Then there could be different mixes, right?
https://world-nuclear.org/information-library/nuclear-fuel-c...
We can calculate the abundances of U-235 and U-238 at the time the Earth was formed. Knowing further that the production ratio of U-235 to U-238 in a supernova is about 1.65, we can calculate that if all of the uranium now in the solar system were made in a single supernova, this event must have occurred some 6.5 billion years ago.
This 'single stage' is, however, an oversimplification...
The really interesting thing is that phrase "the production ratio of U-235 to U-238 in a supernova is about 1.65"; the now-rare U-235 is actually more abundant than U-238 in the fresh debris of a supernova. Prolonged aging has preserved more U-238 (half life 4.47 billion years) than U-235 (half life 0.704 billion years) to the point that U-238 is now much more terrestrially abundant. If Earth had been formed with uranium that rich in U-235, there would have been Oklo events all over the place. Uranium wouldn't need isotopic enrichment to be used as fuel in light water reactors. Nuclear fission would probably have been discovered early in the 19th century, soon after the element itself was recognized, because any substantial quantity dissolved in aqueous solution would have reached criticality.
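That 6.5-billion-year figure falls out of the same two half-lives; a minimal sketch, assuming the quoted production ratio of 1.65 and today's ~0.72% U-235 abundance:

```python
import math

# Solve for how long ago a single supernova, producing U-235 and U-238
# in a 1.65 ratio, would have exploded for decay to bring the ratio
# down to today's ~0.0072/0.9928.
HL_U235, HL_U238 = 0.704, 4.468       # half-lives, Gy
lam235 = math.log(2) / HL_U235        # decay constants, per Gy
lam238 = math.log(2) / HL_U238
ratio_now = 0.0072 / 0.9928

# ratio_now = 1.65 * exp(-(lam235 - lam238) * t)  =>  solve for t
t = math.log(1.65 / ratio_now) / (lam235 - lam238)
print(f"{t:.1f} Gy")                  # roughly 6.5
```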
The fact that we see the same U-235/U-238 ratio everywhere, or very close to it (Oklo), strongly implies either a single source (one supernova) or multiple sources at roughly the same time (~6.5 billion years ago). The latter seems [to me] less likely, so a single source around 6.5 billion years ago is what makes sense, unless there were many supernovae whose remnants mixed quite well in the corner of the galaxy where our sun was born.
If the Uranium came from multiple supernovae, then why is it shocking that earth has different concentrations of U235? Moreover, how is it proof of a past fission reaction?
What if that "part" of U235 came from a separate supernova which is a little older and some more of its U235 had already decayed?
After a U-235 atom undergoes fission, one of the outcomes is it releases Barium and Krypton (and some neutrons), which then eventually decay to stable/semi-stable elements. If one of those stable elements is common in the deposit but otherwise rare naturally, it would point to a nuclear reaction having occurred.
Also note that the U-235 decay chain generally looks different from the decay chain following a fission reaction of U-235.
These numbers are probably only for the local corner of the galaxy. It depends on when the supernova(s) that created the uranium exploded.
In order to know whether or not the AI was wrong, you'd need to do some research. Otherwise it's about as reliable as any "fact" some random person on the internet claims to be true.
Effective tool use is kind of a big deal.
um, stars?
https://en.wikipedia.org/wiki/Natural_nuclear_fission_reacto...