I'm not so sure that's really the case; it's more that for many animals there simply isn't any pressure to evolve (or retain) this trait.
It's not like the natural selection process has a feature list it can tick off. It operates with zero foresight and an incredibly dumb principle: whatever helps procreation.
Cows are not dying due to tetrodotoxin poisoning in significant numbers, as far as I know, so there is no reason for them to evolve resistance to it. The same applies to most animals, including the snakes outside that area.
Your dog can synthesise their own vitamin C and will never develop scurvy. Most animals can do this – humans and some other primates are the exception. An ancestor lost the trait for vitamin C synthesis by chance, and because these primates were living in trees eating lots of fruit with vitamin C, evolution simply didn't notice. There is no disadvantage to being able to synthesise vitamin C, and no advantage in dropping the trait. It didn't affect procreation (at the time). Now we're all stuck with it.
Now, maybe all of this does have a cost for the snakes. But it's far from a given that there is one.
You are assuming there is but one cause for development and/or loss of resistance.
There may not be much pressure for most species to develop resistance to tetrodotoxin. At the same time, retaining it might carry a higher metabolic cost for some species than for others. It's also possible that resistance with low cost is very rarely lost, which is why we carry resistance to toxins we don't often see, while population bottlenecks in ancestral lines can cause loss of a trait to propagate - even by accident. And much like the vitamin C loss, if it doesn't matter, the loss sticks. We should not forget that there are multiple resistance mechanisms as well: an immune system generally primed to fight certain common causes of mortality can, entirely by accident, also be primed to recognize and destroy certain toxin proteins, conferring resistance to some toxins and not others.
I have barely scratched the surface above. The random walk of evolution and its constant hoarding tendencies should make everyone skeptical of simplistic mechanisms of action, as well as of "just so" explanations of evolutionary history.
FWIW most things are multi-causal. I previously made the same argument about house prices. People who claim it is caused by foreign money, low interest rates, restrictive zoning, etc all want their pet theory to be The One True Reason. In reality the market is complex and many of the proposed causes are merely contributing factors.
I made no assumptions. As I pointed out to another commenter, you might be in too much of a hurry to play the contrarian. It might be more useful to pay closer attention to what you're objecting to.
Evolutionary game theory demonstrates that evolution is a matter of fitness payoffs. If the cost of a trait increases, fitness is reduced. The prevalence of a trait in a fit population indicates that, at best, the trait increases fitness and, at worst, it doesn't hinder it. In both cases the genes tend to be passed on and the game is allowed to continue. When carrying the trait becomes costly, there's pressure to get rid of it (through the usual evolutionary means).
The above model encompasses all the unnecessary specificity you tried to bring into the matter. If you object to it, address your concerns to the scientists that are leading us all astray.
For now, let's circle right back to the author's original argument. The absence of a trait that would actually increase fitness (i.e. protecting some animals from their food sources and others from predators) might indicate a hefty tax to pay for carrying it.
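If it helps to see the payoff logic in the abstract, here's a minimal replicator-dynamics sketch in Python (the fitness numbers are invented purely for illustration): a trait with a negligible cost hangs around for ages, while a costly one gets purged quickly.

```python
# Minimal replicator-dynamics sketch: a trait's share of the population
# grows or shrinks according to its fitness relative to the population mean.
# All fitness numbers are invented purely for illustration.

def trait_share(share, fit_with, fit_without, generations=200):
    for _ in range(generations):
        mean_fit = share * fit_with + (1 - share) * fit_without
        share = share * fit_with / mean_fit
    return share

# A nearly-free trait lingers for a very long time...
print(trait_share(0.5, fit_with=0.999, fit_without=1.0))   # still ~0.45
# ...while a costly one is purged from the gene pool quickly.
print(trait_share(0.5, fit_with=0.90, fit_without=1.0))    # effectively 0
```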
(No offense; I hope you simply don't realize how you're coming across, or, if you do, that this comment triggers some introspection.)
Behave yourself
> we carry around only...
Not true. We can carry resistance to some ancestral pressure which isn't part of the current environment.
> sooner or later...
Yes, sooner when it's costly, later when it's less so, through normal evolutionary pressure (entropy and all).
The point is, most species at time T do carry traits that aren't that useful to them anymore. The costlier ones yield enough negative fitness points in evolutionary game theory to rid the gene pool of them quicker. It brings us right back to the author's original argument.
In fact, looking at related newts whose ancestors were toxic (assuming the trait is not novel in these ones) would give us some idea as well.
It’s funny how often this sort of thing comes up. I’ve always felt that “biology” as a field was unique in the way that it is often taught. Bio 101, etc. - most of undergraduate biology - is often taught with this sort of sweeping worship of the process of evolution in a way that leads to it transcending rational thought. Natural selection is very real, and it’s also such a sorry excuse for an evolutionary algorithm :D
It’s been a long time since my first bio classes so I can’t remember how I was first taught it, but I do remember that in all the more advanced bio classes we were literally told to unlearn what we had already been taught.
Is that because resistance to those toxins was strongly selected for in humans, or because the source of those toxins did not strongly select for effectiveness in humans?
People can handle significantly more of a wide range of plant toxins like theobromine and caffeine (both found in chocolate), which harm more purely carnivorous predators like dogs at very low doses but were rare for our primate ancestors.
Cattle, deer, etc., however, can handle many of those at much higher doses.
You’re right that there was an energy trade-off, but it was being able to run faster and longer that was more important than strength for our ancestors, who still had quite small brains (the brain of an Australopithecus is only 35% the size of a modern human's).
Brain size developed later, probably in a feedback loop with our diet - as we began to eat more meat our brains got bigger, which made us better hunters. And hominids actually got bigger and stronger as their brains grew.
Humans evolved weaker jaw muscles to gain brain mass, source: https://www.nationalgeographic.com/culture/article/140527-br...
> Humans owe their big brains and sophisticated culture to a single genetic mutation that weakened our jaw muscles about 2.4 million years ago, a new study suggests.
_A new study suggests_
I don’t think you can treat these claims as categorically true. It’s plausible and probably warrants further study, like most things in biology.
Edit: I could not read the second article you linked as it was behind a paywall, but I found the full text of the original paper[1]. The paper appears to make a much weaker claim: that a weakening of jaw muscles in humans coincided with acceleration in brain size. This is certainly intriguing, but correlation does not imply causation.
Huh? Where did I say that?
I was just pointing out you presented two claims as facts and the sources do not support your claims. Maybe there are other sources that do, but the two studies you cited make much weaker claims.
The news article titles misrepresent the findings in the typical way that news articles sensationalize and misrepresent science etc etc.
Once you reach a particular point, things might tend to play out in ways that look more deterministic in specific places, but fate is still hard to predict. Consider the Vaquita. A species that has thrived for ages has been nearly wiped out of existence because a random primate species evolved to invent plastic fishing nets, and now that same primate species might altruistically manage to govern itself out of destroying the Vaquita. Really, nothing in that story was guaranteed to happen based on the frontier of the search process 1,000,000 years ago, it was just how the dice landed.
The Vaquita’s survival or loss is playing out in some ways as an international diplomacy story, where the first Trump administration saw declines in Vaquita numbers and the Biden administration took steps to improve their chances, and then Trump was re-elected, with some people believing that came down to Biden—a single human man who does not live in Mexico—experiencing age-related declines, and with others believing oligarchs bought the election to support causes like Russia’s pursuit of Ukraine. Really, this is about as random and as divorced from survival of the fittest as it gets. But it is no more random than the fact that the person who might have ushered in a 1,000,000-year era of world peace and cured all suffering in all species would be just as likely to die as a child in a car accident as anyone else.
In ML, evolutionary algorithms are classified under randomized optimization, due to the way that they take random steps to forge random paths into vast combinatorial spaces that could never be completely understood or completely explored.
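To make that concrete, here's roughly the simplest member of that family, a (1+1)-style evolutionary algorithm in Python; the objective function is an arbitrary stand-in I picked for illustration (maximize the number of 1-bits).

```python
import random

# Bare-bones (1+1) evolutionary algorithm: take a random step, keep it only
# if it doesn't hurt. The objective here (count of 1-bits) is an arbitrary
# stand-in chosen for illustration; real search spaces are far less forgiving.

def fitness(bits):
    return sum(bits)

def one_plus_one_ea(n=50, generations=2000):
    parent = [random.randint(0, 1) for _ in range(n)]
    for _ in range(generations):
        # Flip each bit independently with probability 1/n - no foresight at all.
        child = [bit ^ (random.random() < 1 / n) for bit in parent]
        if fitness(child) >= fitness(parent):
            parent = child
    return parent

print(fitness(one_plus_one_ea()))  # usually at or near the maximum of 50
```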
Has anybody modeled what percent of a population has to die from something for the protective gene to become widespread?
https://revive.gardp.org/resource/minimal-selective-concentr...
The question is incoherent. The gene spreads if the organisms carrying it average more children. It unspreads if they average less. All of them could be dying of the same thing, and it wouldn't matter.
The rate of spread is given by the selection coefficient (https://en.wikipedia.org/wiki/Selection_coefficient), but the cause of death isn't relevant.
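If it helps, a toy haploid model shows the point: the allele's frequency trajectory depends only on the selection coefficient (the value of s below is chosen arbitrarily), and the cause of death never enters the calculation.

```python
# Toy haploid selection model. p is the protective allele's frequency and s
# is the selection coefficient (relative reproductive advantage of carriers).
# The value of s here is arbitrary, just to show the shape of the sweep.

def allele_trajectory(p0=0.01, s=0.05, generations=400):
    p = p0
    traj = [p]
    for _ in range(generations):
        # Carriers have relative fitness 1 + s, non-carriers 1; nothing in the
        # model says *why* non-carriers average fewer offspring.
        p = p * (1 + s) / (p * (1 + s) + (1 - p))
        traj.append(p)
    return traj

traj = allele_trajectory()
# The allele sweeps from rare toward fixation.
print(round(traj[0], 3), round(traj[100], 3), round(traj[200], 3), round(traj[-1], 3))
```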
Whenever a species winds up isolated in a cave, it loses eyesight really quickly in evolutionary terms because making and maintaining an eye is so metabolically expensive. So, while the mutations are random, any of them that can save the energy of developing vision get selected for very quickly.
So, even though the mutations are random, it really looks like "cause-and-effect" from the outside: get isolated in cave->lose vision; get exposed to outside light again->regain vision.
By the same token, changes that aren't very expensive metabolically will have very weak "cause-and-effect" because there is no particular pressure to carry the mutations forward or clean them up.
The fact that guinea pigs, fruit bats, and passerines (almost half of all bird species!) also have a mutated GULO gene suggests that there is in fact some pressure to get rid of it as soon as vitamin C is bioavailable from the diet.
Synthesizing vitamin C takes energy, energy that could be used for other biological processes. It's also possible excess vitamin C has some minor deleterious effect. For example, it's an antioxidant, and antioxidants render immune cells somewhat less effective against certain threats (which they use oxidizing chemicals to destroy). It's been found that larger doses of the A, C, and E vitamins cause increased growth of lung cancer, probably due to reduced immune attack.
Some have argued against this idea, though, although I'm not convinced by the argument (see if you can spot the problem.)
So why did the trait of that mutant primate spread throughout the entire population? There should instead be a mixture of those who can and those who can’t synthesize vitamin C.
(Indeed, one should perhaps not so blithely assume that there was sufficient fruit for everyone and so vitamin C didn’t matter… for it is precisely the ability to survive in times of drought and scarcity that drives evolution, and there is no reason to suspect a population that could synthesize their own vitamin C was less fit than a population that couldn’t. The issue of vitamin C is far from simple…)
> There should instead be a mixture of those who can and those who can’t synthesize vitamin C.
Probably was, for a long time. All of this happened about 60 million years ago. It's been a while.
Turns out, it's the water-lily.
Second-order effects are so cool
https://crookedtimber.org/2025/03/14/occasional-paper-the-in...
>Newts with weaker poison? They get eaten. Snakes with less resistance? Have trouble finding newts they can choke down, and don’t get to steal their poison. So the arms race continues.
How does a snake know that the newt has weaker/stronger poison? Is it leaving some newts alone and eating others, or is it eating any newt it runs across? Does a strong-poison newt survive snake consumption attempts?
> And it explains why the newts keep evolving to be more toxic: the snake may want to eat newts generally, but if an individual newt packs enough of a wallop, the snake may just retch it up and go after a different one. Newts with weaker poison? They get eaten. Snakes with less resistance? Have trouble finding newts they can choke down, and don’t get to steal their poison. So the arms race continues.
That's got to be an extremely weak effect. No snake gets an individual benefit from eating the newts. They get a collective benefit, that predators recognize the species as poisonous, in which all snakes, poisonous and delicious alike, share equally.
The problem is large enough that actually-poisonous animals routinely have delicious mimics of entirely different species who free-ride off of the work the originals do to be poisonous.
You can't explain why snakes apparently need to avoid sending a dishonest signal with a theory that predicts that mimics don't exist.
Here is a slightly more in-depth piece where a wildlife biologist mentions other possible forcing functions that cause the snakes to eat the newts: https://baynature.org/2022/04/06/the-bay-area-is-the-center-...
From the article:
> “When garter snakes are born in the late summer, they often live under mats of drying pond vegetation … That happens to be where the newly metamorphosed newts come out in the fall, and we suspect there could be a lot of interaction between predator and prey just because of this overlap in microenvironment. That could have led to strong selection in the past that resulted in such high levels of resistance.”
Let me see if I follow: once the snake population has the warning coloration, and predators know not to eat them, then individual snakes being successful at eating poisonous newts is unrelated to the snakes living long enough to pass their genes (i. e. being successful in terms of natural selection). So a snake which has the right colors will be successful, regardless of its diet.
I wonder, is there a point where mimicry can fail? Can predators at some point start to eat the mimics?
The same sources of variation that give evolution the raw material to produce "avoid eating things that look like this" behavior will also give it the material to evolve back toward "do eat things that look like this". How often eating causes death, versus how much not eating costs in missed nutrition, decides which wins.
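A back-of-the-envelope way to see that trade-off, with numbers I've invented purely for illustration: compare the predator's expected payoff from eating anything with the warning pattern against skipping it. Once enough of the population is harmless mimics, "eat" starts winning again.

```python
# Back-of-the-envelope payoff for a predator facing warning-coloured prey.
# All numbers are invented for illustration only.

def eat_payoff(frac_truly_toxic, meal_value=1.0, poisoning_cost=20.0):
    # Expected value of eating: a meal if it's a harmless mimic,
    # a large penalty if it's the genuinely toxic original.
    return (1 - frac_truly_toxic) * meal_value - frac_truly_toxic * poisoning_cost

for frac in (0.5, 0.2, 0.05, 0.01):
    payoff = eat_payoff(frac)
    strategy = "eat" if payoff > 0 else "avoid"
    print(f"{frac:.0%} truly toxic -> expected payoff {payoff:+.2f} -> {strategy}")
```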
As you note, the behavior you end up with is determined by how much stress the mimics place on you.
To your point, I wonder if eventually some truly toxic species could become victims of their own success.
Sort of. Whether the snake is poisonous is unrelated to its success, because it dies upon being eaten whether there are consequences to its predator or not. (The article takes some pains to show that this is untrue of the newts, but not the snakes.)
However,
> the snakes living long enough to pass their genes (i. e. being successful in terms of natural selection)
This does not reflect a good understanding. Success means having more children, not having any children.
I never said "any children", so your remark doesn't reflect a good understanding of what I wrote.
Try not being pedantic for pedantry's sake, and engage with a question asked in good faith without playing games of one-upmanship.
When cargo culting goes right.
> Successful predation of the rough-skinned newt by the common garter snake is made possible by the ability of individuals in a common garter snake population to gauge whether the newt's level of toxin is too high to feed on. T. sirtalis assays toxin levels of the rough-skinned newt and decides whether or not the levels are manageable by partially swallowing the newt, and either swallowing or releasing the newt.
This might be a total tangent, but every time I see “newts”, I think about how Karel Capek actually coined the word robot in his 1920 play R.U.R., and then later gave us War with the Newts...really smart amphibians. Thanks for sharing.
[0]https://www.discovermagazine.com/planet-earth/a-beautiful-we...
> It’s so toxic that the poison from a single newt can easily kill several adult humans. You could literally die from licking this newt, just once.
TBF there is one death reported in Oregon from someone eating an entire newt in 1979, but they aren’t as bad as the article would have you believe. Many of us have handled these newts. There would be a lot more dead people if licking is all it took.
> A 29-year-old man drank approximately 150 mL of whiskey at about 11 AM July 9, 1979. At 6 PM he swallowed a 20-cm newt on a dare. Within ten minutes he complained of tingling of the lips. During the next two hours he began complaining of numbness and weakness and stated that he thought he was going to die. He refused to be transported to a hospital and was left alone for 15 minutes and then experienced cardiopulmonary arrest
It's great in German - 3 syllables for 3 letters, but in English/French it's NINE syllables for 3 letters. I always thought it should have been web.domain.org.
Kind of absurd to use multiple syllables for a single letter if you think about it.
> I always thought it should have been web.domain.org.
It should have just been "domain.org" - the web part is already specified in the protocol. And if you are concerned about domains only having a single IP that could have been (and for many protocols has been) solved with SRV records.
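For anyone who hasn't bumped into SRV records: HTTP never adopted them, but protocols like XMPP and SIP do use them, and the record lets a domain advertise which host and port actually serve a given protocol, which is the multi-host concern mentioned above. A quick sketch using the third-party dnspython library (the domain below is just an example of one that publishes such a record):

```python
import dns.resolver  # third-party "dnspython" package

# SRV lookup: the record maps a service+protocol at a domain to one or more
# (priority, weight, port, target host) entries. HTTP never adopted this, but
# XMPP did; the domain here is only an example that publishes such a record.
answers = dns.resolver.resolve("_xmpp-client._tcp.jabber.org", "SRV")
for record in answers:
    print(record.priority, record.weight, record.port, record.target)
```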
Thanks for taking the time to find out for the rest of us.
I can already think of ways to use this word jokingly in a people context.
But this doesn't seem as immediate as the newt's defense where it's on the skin and thus causes potential predators to spit them out or even to seize up - meaning that at least some attacked newts survive the encounter. Eating the liver means the snake is dead. And since it's going to be impossible to tell if a particular snake is immune (and is thus potentially toxic) how would this deter predators? (Especially given the limited range of snakes with this immunity and the probability that there are predators of the snakes that don't necessarily have this same limited range - ravens, raptors, etc.)
Maybe the predator's carcass next to a half eaten garter snake is meant to serve as a lesson to other potential predators.
Or perhaps the aim is not to deter but to simply take one natural predator down with them for the good of their species.
And higher predators (like mammals) also have food preferences. They don't always eat stuff indiscriminately, so predators that don't _like_ snakes will preferentially survive. Eventually, this can get established as a genetic trait.
Or as a behavioral one, if parents don't teach cubs to hunt snakes.
I used to keep native snakes and lizards (and inadvertently breed them!), and couldn’t keep newts because I wasn’t sophisticated enough to create the right environments for them. This is one species I kept (and killed, unfortunately). I’ve learned to do it far better since then, but haven’t tried keeping newts again. They’re beautiful little creatures.
I do think the article plays up their toxicity some. There's only one reported human fatality I could find, from some dipshit who ate one on a dare. If you handle them gently and don't stuff your entire hand in your mouth immediately after, I suspect you're fine.
They swarm all over the PNW, in season. Don't step on them if you can help it. They're not death newts. I'd be a dead commenter if they were death newts.
They swarm all over the trails in spring, and then they're gone for the rest of the season. That's my recollection of it.
I don't live there anymore, maybe they have evolved into these dangerous death newts. One can hope.
Perhaps interesting; https://pubmed.ncbi.nlm.nih.gov/34816428/ (Still need to read it)
I am not aware of any species besides humans that do this, though.
Same deal with spiders, they're obligate carnivores. If there are spiders around, that's because there are other bugs about.
Honestly I understand wolf culls a lot more. I don't agree with them, but I understand the rational motive of people who have livestock, pets, or even children.
After preparing dinner, one girl got very ill, as did I, while other people who ate the dinner were fine. I was so worried I'd mis-identified some mushrooms.
But it turns out she had handled one of these newts and the bacteria had transferred to the mushrooms she picked. I contracted it from washing the mushrooms. I threw up several times that night.
In hindsight, had we not washed the mushrooms as thoroughly as we did, things could have gone much worse.
Leafy greens also have very low calories per pound. We eat them for the nutrients, not for the calories. Because of mushrooms and wild greens, I buy very few vegetables; all I need are relatively cheap (per calorie) foods to go with the wild stuff.
There is also a risk of food poisoning with food from restaurants or the store... not to mention the chronic poisoning from eating food grown with excessive pesticides etc.
For the most part the abundant edible mushrooms look very different from the dangerous ones. But yes you do need to know ID thoroughly if you go for certain species.
That said not everyone lives where edible mushrooms are abundant, I'm not trying to suggest everyone should do it.
All the significant calories come from the oil or butter they're cooked in.
I'm not sure it was ever about avoiding starvation, but rather just a different flavor to eat sometimes. When you're always eating the same local ingredients, food can get boring pretty quick. It's the same appeal of spices -- you got a new flavor!
You wouldn't choose to do manual typesetting (or copying books by hand!!!) today either versus alternatives.
If you're _just_ looking to add umami flavor to a dish today, you'd be crazy to pick foraging for wild mushrooms over Aji no Moto.
That looks like... an intensely conservative estimate. Deer use salt.
That's not the appeal of spices. People don't stop using the spices they like in quest of newer, worse-tasting ones. By far the most common case when a person is eating spices is that there's nothing new about the flavor.
And this often fueled increased trade and increased cultivation volumes and increased prices and tariffs and wars and cruel laws. In antiquity.
And often, the actual medicinal benefits became overhyped, and crept from their scope, and each nation's crown jewel of a spice became a miracle cure-all, and cue the trade wars and sword-wielding knights defending their spice.
Basically the "Snake Oil Salesmen" of the Wild West were white hucksters who diluted the actual snake oil down so much, or didn't bother adding any in the first place, then sold the elixirs on Main Street between the saloon and the whorehouse. So the Native Americans were nonplussed that their shamanistic remedies had been subverted as a trope of quacks and hoaxers.
Most of all, these spices and mushrooms have been gradually enshittified, perhaps literally, and many of them are a shadow of their former selves, bred for mass-production. And Americans sit there and dust our burger and fries with gray sand that doesn't even taste like black pepper anymore. Not to mention the salt that's been refined until there's nothing but sodium in it.
Perhaps mushrooms are the least likely food to be enshittified or deliberately commercialized, except for about 4 types in the grocery stores. From what I've learned about mushroom foraging, it's never worth it; just go buy mushrooms in the store, I mean for crying out loud. The risk is too great, and aficionados can claim "easy identification" all they want, but "easy" is relative and not for you to judge, because there's a fine line between tasty and fatal.
Calls to poison control concerning mushrooms: 8,294. Of those calls, 4,862 were of unknown origin, only around 300-400 were confirmed dangerous wild-grown mushrooms, and 2k+ were psilocybin. 300-400 is probably <1% of the number of people who forage, so it's a lot safer than driving a car, I'm guessing.
(This was a quick scan)
https://piper.filecamp.com/uniq/dPhtQdu6eCQnIQ5R.pdf [page 174-175]
I've eaten many hundreds of pounds of wild mushrooms in the last eight years and have not had any food poisoning at all.
I'm not sure that's true, I know that we had Cantharellus cibarius ("golden chanterelle") growing up everywhere in the woods, but I don't think the Douglas fir even exists there, never heard that name before. The forest was mostly oak I think.
The calorific value of a meal is one of the least important aspects - you might as well complain that the mushrooms don’t come in sufficiently varied colours to make it worthwhile.
It’s not about the calories. It’s about the experience - the taste, the texture, the satisfaction of knowing you did it yourself.
Also, for many mushrooms, the risk of consuming a toxic variety is extremely low if you know what you're doing. People love to bring up examples of "But the head of the Mycological Society of XYZ died of misidentified mushrooms!!", but a while back I dug into those examples and found 0 evidence for any of them - they're just popular Internet old wives' tales that people love to regurgitate.
The number of hospitalizations is somewhere around 10k a year. For ~1500 of those it's at least life threatening. ~100ish end up with organ failure or permanent neurological problems. ~10 of them die. That's every year.
That might be mildly dangerous compared to other hobbies, but if you restrict the denominator to actual practitioners of the hobby, suddenly those numbers look extremely dangerous.
Foraging for mushrooms is not dangerous if you know what you're doing and stick to easily identified mushrooms that aren't easily confused with poisonous varieties.
Also I said plants and mushrooms. Not specifically mushrooms. AAPCC doesn't track mushrooms separately and I would consider the CDC to not be the authority on poisoning -- their specialty is diseases.
https://i.imgur.com/vIXenG8.png
8294 case mentions, 3039 hospitalizations.
for outcomes check the table.
Looks like that includes the hallucinogenic mushrooms, which leads to 2139 case mentions and 1146 hospitalizations a year.
When your boss starts pushing Return To Office, ask if the company has a worthwhile kitchen in the office; at least a burner and plenty of room in the refrigerator for ingredients; it should be feasible to cook breakfast and lunch, but also dinner, in case you need to work late.
Essentially nobody is starving in the US for lack of calories (unless it's a case of mental illness or something similar). In fact, in the US, usually the opposite is true. From the Wikipedia page on food insecurity in the US:
> Reliance on food banks has led to a rise in obesity and diabetes within the food insecure community. Many foods in food banks are highly processed and low in nutritional value leading to further health effects. One study showed 33% of American households visiting food pantries had diabetes.
Makes me wonder if a) these toxicity stories are exaggerated, b) it's really regionally specific, c) toxicity has radically increased in the past ~40 years since I was playing with newts, or d) we got dumb lucky.
I loved this article. I didn't know anything about the newt / snake interaction; I wonder if my dad did.