I'm surprised Terry Sejnowski isn't included, considering it seems to be for Hopfield Nets and Boltzmann machines, where Terry played a large role in the latter.
Using the main entry webpage makes it much easier for people to find relevant info, including but not limited to: various articles about this specific award, other info about this year's Nobel, and info about the Nobel in general.
This is exactly where the Internet (or WWW) beats traditional print media. Linking to a PDF (which can itself easily be found from the entry point) defeats that.
Also, I think the process looks fairly decent:
https://www.nobelprize.org/nomination/physics/
Gather nominations, make a shortlist, research the shortlist with actual field experts, present candidates, discuss, and vote.
And in 50 years you'll be able to find out who the other candidates were!
I recently watched this video on the subject: https://youtu.be/zS7sJJB7BUI?feature=shared and found it quite enjoyable.
His acceptance speech was about general relativity.
So in this case they picked something that might be viewed as only having a tangential connection to the field, but the impact has been so immense that they probably went outside their regular comfort zone (and how many prizes can we give for LHC work that really don't touch regular human lives in the foreseeable future anyhow?).
The descriptions I have read were all mathematical, focusing on the computational graph with the magical backpropagation (which frankly is just memoizing intermediate computations). The descriptions also went out of their way to discourage terms like "synapses" and rather use "units".
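That "memoizing intermediate computations" view can be made concrete in a few lines. This is a toy sketch of my own (the one-neuron example and all names are invented, not from any of the descriptions discussed): the backward pass reuses values cached during the forward pass instead of recomputing them.

```python
import math

def forward(w, b, x):
    # Forward pass: cache every intermediate value of the computational graph.
    z = w * x + b                    # linear step
    a = 1 / (1 + math.exp(-z))       # sigmoid activation
    return a, {"x": x, "a": a}

def backward(cache):
    # Backward pass: reuse ("memoize") the cached values instead of recomputing.
    a = cache["a"]
    da_dz = a * (1 - a)              # sigmoid derivative, from the cached output
    return da_dz * cache["x"], da_dz # dL/dw and dL/db for the loss L = a

a, cache = forward(w=2.0, b=-1.0, x=0.5)
grad_w, grad_b = backward(cache)
```

Real autodiff frameworks generalize exactly this: a forward sweep that records intermediates, then a backward sweep over the recorded graph.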
I remember in 2012 for my MS thesis on Deep Neural Networks spending several pages on Boltzmann Machines and the physics-inspired theories of Geoffrey Hinton.
My undergraduate degree was in physics.
So, yes, I think this is an absolutely stunning award. The connections to statistical entropy (inspired by thermodynamics) and, of course, to the biophysics of human neural networks should not be lost here.
Anyways, congratulations to Geoffrey Hinton. And also, since physics is the language of physical systems, why not expand the definition of the field to include the "physics of intelligence"?
From their official explanation page (https://www.nobelprize.org/uploads/2024/09/advanced-physicsp...): "With ANNs the boundaries of physics are extended to host phenomena of life as well as computation."
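One concrete version of that connection (my own example, not from the Nobel document): the softmax used all over ML is exactly the Boltzmann distribution from statistical mechanics, with energies playing the role of negative logits.

```python
import math

def boltzmann(energies, T=1.0):
    # p_i ∝ exp(-E_i / T): the Gibbs/Boltzmann distribution from statistical mechanics.
    weights = [math.exp(-e / T) for e in energies]
    Z = sum(weights)                 # the partition function
    return [w / Z for w in weights]

def softmax(logits):
    # The ML softmax is the same formula with E_i = -logit_i and T = 1.
    return boltzmann([-l for l in logits], T=1.0)
```

Lowering T concentrates probability on the lowest-energy state, which is why "temperature" is also the name of the sampling knob on language models.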
> I have never heard that Hopfield nets or Boltzmann machines were given any major weight in the history.
This is mostly because people don't realize what these are at more abstract levels (it's okay; ironically, ML people frequently don't abstract). But Hopfield networks and Boltzmann machines have been quite influential in the history of ML. I think you can draw a pretty good line from Hopfield networks to LSTMs to transformers. You can also think of a typical artificial neural network (easiest to see with linear layers) as a special case of a Boltzmann machine (compare linear layers/feed-forward networks to Restricted Boltzmann Machines and I think it'll click). Either way, these had a lot of influence on the early work, which permeates the modern stuff. There's a belief that all the old stuff is useless, and I just think that's wrong. A lot of hand-engineered stuff is no longer needed, but much of the theory and underlying principles are still important.
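For anyone who hasn't seen one, a Hopfield network really is just a few lines. A toy sketch (my own; sizes and the pattern are invented): store one ±1 pattern with the Hebbian rule, then recover it from a corrupted copy with energy-descending updates.

```python
import numpy as np

def train_hopfield(patterns):
    # Hebbian rule: sum of outer products of the ±1 patterns, zero diagonal.
    n = patterns.shape[1]
    W = sum(np.outer(p, p) for p in patterns).astype(float)
    np.fill_diagonal(W, 0.0)
    return W / n

def recall(W, state, sweeps=5):
    # Asynchronous updates; each flip never increases the energy
    # E = -0.5 * s @ W @ s, so the state settles into a stored minimum.
    s = state.copy()
    for _ in range(sweeps):
        for i in range(len(s)):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

pattern = np.array([1, -1, 1, -1, 1, -1, 1, -1])
W = train_hopfield(pattern[None, :])
noisy = pattern.copy()
noisy[0] = -1                       # corrupt one bit
restored = recall(W, noisy)
```

That "settle into the nearest energy minimum" behavior is the associative-memory idea the prize citation leans on.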
Each layer was trained similarly to the encoder part of an autoencoder. This way the layerwise transformations were not random but roughly preserved some of the original data's properties. Up to this point, training was done without labelled data. After this pretraining stage, you had a very good initialization for your network and could train it end to end for your task and target labels.
If I recall correctly, the neural layers' outputs were probabilistic. Because of that you couldn't simply use backpropagation to learn the weights. Maybe this is the connection to John Hopfield's work, but here my memory is a bit fuzzy.
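The greedy layer-wise scheme described above can be sketched in numpy (a toy version with tied-weight autoencoders; all sizes, data, and hyperparameters here are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

def train_autoencoder_layer(X, hidden, epochs=300, lr=0.05):
    """One greedy layer: learn W so that X ≈ sigmoid(X @ W) @ W.T (tied weights)."""
    W = rng.normal(0, 0.1, size=(X.shape[1], hidden))
    for _ in range(epochs):
        H = 1 / (1 + np.exp(-X @ W))     # encode
        X_hat = H @ W.T                  # decode with the same (tied) weights
        err = X_hat - X                  # reconstruction error
        # Gradient of 0.5*||err||^2 w.r.t. W, through both encode and decode paths
        dH = (err @ W) * H * (1 - H)
        W -= lr * ((X.T @ dH) + (err.T @ H)) / len(X)
    return W

def pretrain_stack(X, layer_sizes):
    # Train layers one at a time; feed each layer's code into the next.
    weights, H = [], X
    for h in layer_sizes:
        W = train_autoencoder_layer(H, h)
        weights.append(W)
        H = 1 / (1 + np.exp(-H @ W))
    return weights   # used to initialize the deep net before end-to-end fine-tuning

X = rng.uniform(0, 1, size=(20, 6))      # toy unlabeled data
stack = pretrain_stack(X, [4, 3])        # two pretrained layers
```

The point is only the shape of the procedure: unsupervised, one layer at a time, so the final supervised backprop starts from a sensible place instead of random weights.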
From the people dissing the award here, it seems like even a particularly benign internet community like HN has little notion of ML with ANNs before Silicon Valley bought in for big money circa 2012. And media reporting since then hasn't exactly helped.
ANNs go back a good deal further still (as the updated post does point out), but the works cited for this award really are foundational for the modern form in a lot of ways.
As for DL and backpropagation: maybe things could have been otherwise, but in the reality we actually got, optimizing deep networks with backpropagation alone never got off the ground on its own. Around 2006, Hinton started getting it to work by building networks up layer-wise by optimizing Restricted Boltzmann Machines (the lateral connections within a layer are eliminated from the full Boltzmann Machine), resulting in what was termed a Deep Belief Net. This basically did the job already, but could then be fine-tuned with backprop for performance once it had been initialized with the stack of RBMs. An alternative approach with layer-wise autoencoders (also a technique essentially created by Hinton) soon followed.
Once these approaches had shown that deep ANNs could work, though, analysis soon showed that the random weight initializations used back then (especially combined with the historically popular sigmoid activation function) scaled the gradients of deep nets very poorly, all but eliminating the flow of feedback. Optimization might eventually have succeeded anyway, but only after a far longer wait than was feasible on the computers of the day. Once the problem was understood, people tweaked the weight initialization, the activation function, and the optimization itself, and then in many cases it did work to go directly to supervised backprop. I'm sure those tweaks are usually taken for granted to the point of being forgotten today, when one's favourite highly optimized dedicated Deep Learning library will silently apply the basic ones without so much as being asked. But take away the normalizations and the Glorot (or whatever) initialization, and it could easily mean a trip back to rough times getting your trained-from-scratch deep ANN to start showing results.
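The scaling problem is easy to demonstrate numerically. A toy sketch (my own; the depth, width, and the "tiny" std are invented for illustration): push a feedback signal back through a stack of sigmoid layers and compare a too-small naive initialization against a Glorot-style scale of 1/sqrt(fan_in).

```python
import numpy as np

rng = np.random.default_rng(0)

def gradient_scale(depth, width, init_std):
    """Norm of a feedback signal backpropagated through `depth` sigmoid layers."""
    x = rng.normal(size=width)
    layers = []
    for _ in range(depth):                    # forward pass
        W = rng.normal(0, init_std, size=(width, width))
        x = 1 / (1 + np.exp(-(W @ x)))
        layers.append((W, x))
    g = np.ones(width)                        # backward pass
    for W, a in reversed(layers):
        g = W.T @ (g * a * (1 - a))           # sigmoid derivative is at most 0.25
    return np.linalg.norm(g)

tiny_init = gradient_scale(depth=20, width=64, init_std=0.01)
glorot_ish = gradient_scale(depth=20, width=64, init_std=(1 / 64) ** 0.5)
# glorot_ish preserves many orders of magnitude more gradient signal than tiny_init
```

Even the well-scaled case shrinks (the 0.25 sigmoid-derivative ceiling guarantees that), which is part of why ReLUs won out later.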
I didn't expect this award, but I think it's great to see Hinton recognized again. And precisely because almost all modern coverage is too lazy to track down history earlier than the 2010s, not least Hopfield's foundational contribution, I think it is all the more important that the Nobel Foundation did.
So going back to the original question above: there are so many bad, confused versions of neural network history going around that whether or not this one is widely accepted isn't a good measure of quality. For what it's worth, to me it seems a good deal more complete and veridical than most encountered today.
Here are some examples:
https://snntorch.readthedocs.io/en/latest/tutorials/tutorial...
Would the multi-head attention value projections (W_V) not be equivalent to the chemical gradient changes?
(There are multiple value matrices in multi-head attention, one per attention head, which is what I imagine would be the equivalent of different gradients. This allows each attention head to learn different representations and focus on different aspects of the input sequence.)
And then would the output produced after applying the concatenation and output projection (W_O) be equivalent to the different electrical outputs, such as spikes, passed to the next neuron equivalent or attention head?
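For concreteness, here is what those matrices do mechanically (a minimal numpy sketch with invented shapes; no claim about any biological correspondence): each head gets its own Q/K/V projections, the head outputs are concatenated, and W_O maps the concatenation back to model width.

```python
import numpy as np

rng = np.random.default_rng(0)

def multi_head_attention(X, heads, d_head):
    """Minimal sketch: per-head Q/K/V projections, concat, then output projection W_O."""
    d_model = X.shape[1]
    outs = []
    for _ in range(heads):
        # Each head has its own Wq, Wk, and (the Wv asked about above)
        Wq, Wk, Wv = (rng.normal(0, 0.1, size=(d_model, d_head)) for _ in range(3))
        Q, K, V = X @ Wq, X @ Wk, X @ Wv
        scores = Q @ K.T / np.sqrt(d_head)
        weights = np.exp(scores - scores.max(axis=1, keepdims=True))
        weights /= weights.sum(axis=1, keepdims=True)    # softmax over positions
        outs.append(weights @ V)
    concat = np.concatenate(outs, axis=1)                # (seq, heads * d_head)
    W_O = rng.normal(0, 0.1, size=(heads * d_head, d_model))
    return concat @ W_O                                  # the "W_O" output projection

X = rng.normal(size=(5, 16))                             # 5 tokens, d_model = 16
Y = multi_head_attention(X, heads=4, d_head=4)
```

Note that everything here is a deterministic matrix product per forward pass; there is no spiking or thresholding, which is one reason the analogy to neuronal outputs is loose at best.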
Hopfield networks led to Boltzmann machines. Deep learning started with showing that deep neural networks were viable in Hinton's 2006 Science paper, where he showed that by pre-training with a Restricted Boltzmann machine (essentially a stacked self-supervised auto-encoder) as a form of weight initialization, it was possible to effectively train neural networks with more than 2 layers. Prior to that finding, people found it was very hard to get backprop to work with more than 2 layers due to the activation functions people were using and problematic weight initialization procedures.
So long story short, while neither of them are in widespread use today, they led to demonstrating that neural networks were a viable technology and provided the FIRST strategy for successfully training deep neural networks. A few years later, people figured out ways to do this without the self-supervised pre-training phase by using activation functions with better gradient flow properties (ReLUs), better weight initialization procedures, and training on large datasets using GPUs. So without the proof of concept enabled by Restricted Boltzmann Machines, deep learning may not have become a thing, since prior to that almost all of the AI community (which was quite small) was opposed to neural networks except for a handful of evangelists (Geoff Hinton, Yoshua Bengio, Yann LeCun, Terry Sejnowski, Gary Cottrell, and a handful of other folks).
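For the curious, the contrastive-divergence (CD-1) procedure used to train those RBMs is short enough to sketch. This is a toy version of my own (biases and many practical details omitted, and the 4-bit data is invented):

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def cd1_step(W, v0, lr=0.1):
    """One contrastive-divergence (CD-1) update for a binary RBM, biases omitted."""
    h0_prob = sigmoid(v0 @ W)                          # infer hidden from the data
    h0 = (rng.random(h0_prob.shape) < h0_prob) * 1.0   # sample binary hidden states
    v1_prob = sigmoid(h0 @ W.T)                        # one-step reconstruction
    h1_prob = sigmoid(v1_prob @ W)                     # re-infer hidden
    # Positive phase (data statistics) minus negative phase (reconstruction statistics)
    return W + lr * (v0.T @ h0_prob - v1_prob.T @ h1_prob) / len(v0)

# Toy data: the same 4-bit pattern repeated; the RBM should learn to prefer it.
data = np.tile([1.0, 1.0, 0.0, 0.0], (32, 1))
W = rng.normal(0, 0.1, size=(4, 2))
for _ in range(500):
    W = cd1_step(W, data)
recon = sigmoid(sigmoid(data @ W) @ W.T)   # mean-field reconstruction of the data
```

The "stack RBMs, then fine-tune" recipe from the 2006 paper is just this update applied layer by layer, each layer trained on the hidden activations of the one below.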
"As an Academy member I could publish such a paper without any review (this is no longer true, a sad commentary on aspects of science publishing and the promotion of originality)."
One of the really refreshing things about reading older research is how there used to be all these papers which are just stray thoughts that this or that scientist had, sometimes just a few paragraphs of response to some other paper, or a random mathematical observation that might mean nothing. It feels very healthy. Of course, there were far fewer scientists then; if this were allowed today, it might be too crowded to be useful. Back then everyone mostly knew about everyone else, and it was more based on reputation. But dang, it must have been nice to have such an unrestricted flow of ideas.
Today the notion of a paper is that it is at least ostensibly "correct" and able to be used as a source of truth: cited in other papers, maybe referred to in policy or legal settings, etc. But it seems like this wasn't always the case, at least in physics and math which are the fields I've spent a lot of time on. From reading old papers you get the impression that they really used to be more about just sharing ideas, and that people wouldn't publish a bad paper because it would be embarrassing to do so, rather than because it was double- and triple-checked by reviewers.
I go to forums like Hacker News and Reddit and regularly see software engineers who are outraged about having to have their code reviewed and even more outraged about actually having to implement feedback from their reviewers rather than receiving a rubber stamp.
I go to work and see the effects on product, team, and world of what would happen if those coders were allowed to bypass supervision.
So no, even someone who is intelligent and good at what they do should have peer review.
What's stopping them from doing so?
So sure, you could publish but the chance of having an impact was low. Thankfully that's changed a bit with arxiv.
Physics as a discipline hasn't really stalled at all. Fundamental physics arguably has, because no one really has any idea how to get close to making experimental tests that would distinguish the competing ideas. But even in fundamental physics there are cool developments like the stuff from Jonathan Oppenheim and collaborators in the last couple of years.
That said "physics" != "fundamental physics" and physics of composite systems ranging from correlated electron systems, and condensed matter through to galaxies and cosmology is very far from dead.
I don't know exactly what they hope to gain by jumping on that bandwagon though; neither the physicists nor the computer scientists are going to value this at all. And dare I say, the general populace associated with the two fields isn't going to either - case in point, this hn post.
If there weren't any Nobel-worthy nominations for physics, maybe skip it? (Although that hasn't happened since 1972 across any field.)
But yeah, they could have passed. That would have been cool.
Also, there's a ton of extremely amazing shit in astronomy, or even photolithography, or simulations of physics (though that's basically what the chemistry prize was this year).
Another guess is maybe they're trying to divert some of the insane attention in CS/AI to physics to get more people to join that field.
But it's still a really bizarre decision.
AI/ANN/CS != Physics
Now there are some theoretical considerations that lead to a theoretical model that can't be tested. It didn't work for Aristotle and it doesn't work for string theorists (and similar).
We know (for example) silver atoms have mass, and that massive objects exert gravity (which we understand as warping of space-time according to GR).
We know that we can put silver atoms in quantum superpositions of being in different positions (for example in a sequential Stern-Gerlach type experiment).
We have (essentially) absolutely no theoretical understanding of what is going on to space-time when a thing with mass is in such a superposition. Quantum mechanics does not successfully model gravity, and general relativity contains no superpositions, so the situation is completely beyond our theoretical understanding. This isn't a theoretical consideration, this is something real that you can do in an undergrad physics lab experiment pretty easily.
Now the problem is that the models we have developed so far to deal with this situation turned out to be (wildly) too difficult for us to test. I think it is very far from clear that the Oppenheim & co model falls into this category; imo it's completely reasonable for them to be spending theoretical effort working out what is needed to test their model.
And I readily admit that it would be interesting to know what would happen. But many decades of more or less convoluted hypotheses have proved unfruitful. We need a new way to do fundamental physics, or if possible to go back to the old way, because the current one clearly doesn't work.
My point was that physics is a big and active field, stagnation at the smallest and largest scales notwithstanding. Note also that the Nobel committee is not in any way limited to "newsworthy" stuff and has in many cases awarded prizes decades after the fact.
From what I've read (not a professional physicist) string theory is not testable unless we can either examine a black hole or create particle accelerators the size of the Moon's orbit (at least). Many other proposed theories are similar.
There is some speculation that the hypothetical planet nine -- a 1-5 Earth mass planet predicted in the far outer solar system on the basis of the orbits of comets and Kuiper Belt / TNO objects -- could be a primordial black hole captured by the solar system. A black hole of that mass would be about the size of a marble to a golf ball, but would have 1-5g gravity at the distance of Earth's radius.
If such an object did exist it would be within space probe range, which would mean we could examine a black hole. That might get us un-stuck.
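The figures quoted above roughly check out with standard constants; a quick back-of-the-envelope script (mine):

```python
# Back-of-the-envelope check of the "marble to golf ball" and "1-5 g" figures.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m/s
M_earth = 5.972e24   # kg
R_earth = 6.371e6    # m

results = {}
for n in (1, 5):
    M = n * M_earth
    r_s = 2 * G * M / c**2        # Schwarzschild radius of the black hole
    g = G * M / R_earth**2        # gravitational acceleration one Earth radius away
    results[n] = (r_s, g / 9.81)
    print(f"{n} M_earth: r_s = {100 * r_s:.1f} cm, gravity at R_earth = {g / 9.81:.1f} g")
```

A 1 Earth-mass hole has a centimeter-scale Schwarzschild radius, a 5 Earth-mass one a few centimeters, and the gravity at one Earth-radius distance scales directly with the mass, consistent with the claim.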
If we can't do something like that, maybe we should instead focus on other areas of physics that we can access and that have immense practical applications: superconductivity, condensed matter physics, plasmas / fusion, etc.
How can we know? For the past decades, theoretical high-energy physics has studied made-up mathematical universes that don't tell us much about our real universe. We haven't really given it that much of a try yet.
Personally, I'm not very optimistic.
Regarding the primordial black hole: "Konstantin Batygin commented on this, saying while it is possible for Planet Nine to be a primordial black hole, there is currently not enough evidence to make this idea more plausible than any other alternative."
Regarding planet 9 in general: "Further skepticism about the Planet Nine hypothesis arose in 2020, based on results from the Outer Solar System Origins Survey and the Dark Energy Survey, with the OSSOS documenting over 800 trans-Neptunian objects and the DES discovering 316 new ones.[94] Both surveys adjusted for observational bias and concluded that of the objects observed there was no evidence for clustering.[95] The authors go further to explain that practically all objects' orbits can be explained by physical phenomena rather than a ninth planet as proposed by Brown and Batygin.[96] An author of one of the studies, Samantha Lawler, said the hypothesis of Planet Nine proposed by Brown and Batygin "does not hold up to detailed observations" pointing out the much larger sample size of 800 objects compared to the much smaller 14 and that conclusive studies based on said objects were "premature". She went further to explain the phenomenon of these extreme orbits could be due to gravitational occultation from Neptune when it migrated outwards earlier in the Solar System's history.[97]"
Max Planck was told by his professor to not go into Physics because "almost everything is already discovered". Planck said he didn't want to discover anything, just learn the fundamentals.
Second, even if it obviously wasn't true when Planck was told that almost everything had been discovered, that doesn't say anything about the state today.
I see no reasons to expect steady progress. Nobody knows how long it would take to prove Riemann hypothesis, for example.
We need to think seriously whether our collective hallucinations (pun) have got us to some sort of tipping point, undermining our very ability to act according to our best long-term interests.
ps. not to imply anything negative about the worthiness of the awardees in general
Welcome news that he finally got there.
1. The prize itself makes zero sense as a prize in _physics_. Even the official announcement by the Nobel Prize Committee, taken at face value, reads as a huge stretch in trying to link neural networks to physics. When one starts asking questions about the real impact on physics and whether the most important works of Hinton and Hopfield were really informed by physics (which is a dubious link to the Nobel prize anyway), the argument stops holding water at all.
2. Some of the comments mention that giving the prize for works in AI may make sense because physics is currently stalled. This is wrong for several reasons:
2.1. While one can argue that string theory (which is, anyway, only a part of high-energy theoretical physics) is having its "AI winter" moment, there are many other areas of physics which develop really fast and bring exciting results.
2.2. The Nobel Prize is often awarded with quite some delay, so there are many very impactful works from the 80s which haven't been awarded a Nobel Prize (atomic force microscopy is a nice example).
2.3. It is wrong to look at the recent results in some sub-field and say "okay, there was nothing of value in this field". For example, even if one completely discards string theory as bogus, there were many important results in theoretical physics, such as the creation of conformal field theory, which was never recognized with a Nobel Prize (which is OK if the Nobel Prize is given to other important physics works, but is quite strange in light of today's announcement).
To finish on a lighter mood, I'll quote a joke from my friend, who stayed in physics: "Apparently the committee has looked at all the physicists who left academia and decided that anything they do is fair game. We should maybe expect they will give a prize for crypto or high-frequency trading some time later".
Even if it's not completely true, maybe some introspection is required?
I understand developing new theories is important and rewarding, but most physics for the last three decades seems to fall within two buckets. (1) Smash particles and analyze the data. (2) Mathematical models that are not falsifiable.
We can be pretty sure that the next 'new physics' discovery that gives us better chips, rocket propulsion, etc etc is going to get a nobel prize pretty quickly similar to mRNA.
This is the kind of physics we might need more of.
Smashing particles helps test existing theories and hypotheses. We do it with particle accelerators because that's one of the ways of getting to the uncharted territory, which is where you need to be if you want to push the boundaries. And maybe remember that the sexy stuff that makes it into the news isn't the whole of the thing. The LHC is also, for example, doing practical climate science: https://en.wikipedia.org/wiki/CLOUD_experiment
And building new mathematical models is part of figuring out how to make sense of observations that don't quite fit the current models. That is a messy process, and I think that our retrospective perspective on what that process is like might be colored by survivor bias. We remember Einstein and his theory of special relativity. We mostly don't remember the preceding few decades' worth of other attempts by other physicists to resolve conflicts between existing non-unified theories (in this case Newton's and Maxwell's models) or making sense of things like the Michelson-Morley experiment. I don't really know that history myself, but I would not at all be surprised if many of those efforts were also having trouble figuring out how to produce testable hypotheses.
And also, big picture, I think that it's important for any lover of science to remember to celebrate the entire enterprise, not just its headline successes. Expecting consistent results is tacitly expecting scientists to have some way of knowing ahead of time which avenues of inquiry will be most fruitful. If we had access to an oracle that could tell us that, we wouldn't actually need science anymore.
From some aspects, it was late. Gravitational waves were predicted decades before. It's almost unfair to predict something and then have engineering take decades to catch up before the theory can be proved or disproved. This is just commentary on being right decades before the world is ready for it. Of course, it can go the other way, where one is assumed to be right but then isn't (e.g., many components of string theory).
That's an interesting definition of "most physics". I mean, I find high-energy physics as fascinating as the next guy but there are other fields, too, you know, like astrophysics & cosmology, condensed-matter physics, (quantum) optics, environmental physics, biophysics, medical physics, …
The Nobel committee seems to think so
Too often there is near zero intuition for why research in AI yields such incredible results. They're mainly black boxes that happen to work extremely well with no explanation and someone at a prestigious institution just happened to be there to stamp their name on top of the publication.
Big difference between research in AI and any non-computational/statistical/luck-based science.
Acceptance speech might be something.
Jokes aside, physics is a bit stuck because it’s hard to do experiments at the boundaries of what we know, as far as I understand. So then it makes sense I guess to award people who made useful tools for physics.
This applies primarily to fundamental physics. There are many areas of applied physics (materials, fusion, biophysics, atmospheric physics, etc.) where the main constraint is understanding complex systems. These areas are quite crucial for society.
https://www.nobelprize.org/uploads/2024/09/advanced-physicsp...
"Highly sought-after fundamental particles, such as the Higgs boson, only exist for a fraction of a second after being created in high-energy collisions (e.g. ~10⁻²² s for the Higgs boson). Their presence needs to be inferred from tracking information and energy deposits in large electronic detectors. Often the anticipated detector signature is very rare and could be mimicked by more common background processes. To identify particle decays and increase the efficiency of analyses, ANNs were trained to pick out specific patterns in the large volumes of detector data being generated at a high rate." (emphasis mine)
It concerns me reading stuff like this (one can find similar statements for the original LIGO detection of gravitational waves) without accompanying qualification, because I want to hear them justify why it shouldn't sound like 'we created something that was trained to beg the question ad nauseam'. Obviously, on a social-trust basis I have every reason to believe these seminal discoveries are precisely as reported. But I'd just like to see what the stats look like, even if I'm probably incapable of really understanding them, that are able to guarantee the validity of an observation when the observation is by definition new, has therefore never been detected before, and therefore cannot have produced an a priori test set (outside of simulation) to use as a baseline.
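For what it's worth, the usual answer is that the classifier's false-positive rate is measured on background-only control samples (simulation and data sidebands), not on the new signal itself. A toy sketch of that validation loop (all numbers invented; logistic regression standing in for an ANN):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins: "background" and "signal" events as 2-D feature vectors.
background = rng.normal(0.0, 1.0, size=(5000, 2))
signal = rng.normal(1.5, 1.0, size=(5000, 2))

X = np.vstack([background, signal])
y = np.concatenate([np.zeros(5000), np.ones(5000)])

# Logistic regression trained by gradient descent (a minimal stand-in for an ANN).
w, b = np.zeros(2), 0.0
for _ in range(300):
    p = 1 / (1 + np.exp(-(X @ w + b)))
    w -= 0.1 * X.T @ (p - y) / len(y)
    b -= 0.1 * np.mean(p - y)

# The validation step in question: score held-out background-only events and
# measure how often the classifier falsely calls them "signal".
holdout_bg = rng.normal(0.0, 1.0, size=(5000, 2))
fpr = np.mean(1 / (1 + np.exp(-(holdout_bg @ w + b))) > 0.5)

holdout_sig = rng.normal(1.5, 1.0, size=(5000, 2))
tpr = np.mean(1 / (1 + np.exp(-(holdout_sig @ w + b))) > 0.5)
```

The measured false-positive rate on background-only data is what feeds the significance calculation, which is how a detection can be quantified even though the signal itself has never been observed before.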
It is also concerning that lately the Nobel Committee seems to be ignoring fundamental broad theoretical contributions. In this case, backpropagation, where Seppo Linnainmaa could have been one of the awardees. It is a bit sad he and others who have already passed away get little credit for something so fundamental.
I agree that Hopfield networks and Boltzmann machines are surprisingly arbitrary choices. It is as if they wanted to give a prize to someone for neural networks but had to pick people from inside their own field to represent the development, which limited the range of options. There is also the aspect of the physics community wanting to give somebody they liked a Nobel, and then trying to fit them in. (The prize isn't handed out by a shadowy committee of Swedes; there's an involved and highly bureaucratic process for nomination that requires your colleagues to take up your case.)
It has definitely been awarded for both theoretical and experimental contributions throughout its history. Many theoretical physicists have received the prize for their conceptual breakthroughs, even without direct experimental verification at the time.
Think of this as a Nobel prize for systems physics – essentially "creative application of statistical mechanics" – and it makes a lot more sense why you'd pick these two.
(I am a mineral physicist who now works in machine learning, and I absolutely think of the entire field as applied statistical mechanics; is that correct? Yes and no: it's a valid metaphor.)
Lots of ML is heavily influenced by fundamental research done by Physicists (eg. Boltzmann Machines), Linguists (eg. Optimality Theory / Paul Smolensky, Phylogenetic Trees/Stuart Russell+Tandy Warnow), Computational Biologists (eg. Phylogenetic Trees/Stuart Russell+Tandy Warnow), Electrical Engineers (eg. Claude Shannon), etc.
ML (and CS in general) is very interdisciplinary, and it annoys me that a lot of SWEs think they know more than other fields.
I love how folks from different backgrounds can interpret it in so many ways.
As for focusing on Hopfield networks and Boltzmann machines, I get where you're coming from. They’re just a couple of architectures among many, but they’re pretty foundational. They’re deeply rooted in statistical mechanics and have had a huge impact, finding applications across a range of fields beyond just machine learning.
How lately? Einstein never got one for special and general relativity, for example.
The Nobel prize fields and criteria are a bit random, they're essentially just whatever Alfred Nobel wrote in his will.
Within their respective fields, not in general. What makes the Nobel so unique and desirable is that everybody knows what it is and is impressed by it. Mentioning that you've won a Nobel prize will impress people and open doors in virtually any circumstance. Saying you have a Turing award will mostly lead to blank stares from anybody outside the field.
Related reading
https://hsm.stackexchange.com/questions/24/why-isnt-there-a-...
Giving the prize to something that has essentially nothing to do with physics is just a slap in the face to the physics community.
The physics community could use a few more slaps in the face, according to many physicists.
The latter makes a bit more sense: computer science wasn't really a thing when Alfred Nobel was around, but mathematics certainly was! It seems perfectly reasonable to add a category for math, and I think neural nets would fit there considerably better.
A different institution could step up and create a prize of its own, though I'm not sure what institution would have that amount of prestige without weaponizing it for commercial purposes.
I'd prefer a nicer prize for mathematics that includes a bit of computer science over a prize for computer science. I don't think there is much room for a "society-changing innovation" within CS that isn't either an engineering feat (Linux, Docker, FFmpeg) or an algorithm that could fit under a mathematics prize (FFT, Navier-Stokes).
I'm not sure I agree with that. There's plenty of theoretical computer science that isn't really "engineering" and would fall into a pretty different category than stuff like FFT or Navier Stokes.
If you look at something like Concurrency Theory, for example, and work with stuff like Pi Calculus or CSP or Petri Nets, those aren't "engineering feats", but they also fall into a different category than the rest of math, or at least pretty different from Navier-Stokes. I think you could make a pretty strong argument that CSP has been a big innovation with regard to the academic state of the art while not simply being engineering.
I would still consider CSP, Petri Nets, and Pi-Calculus mathematical enough to be wrapped under a mathematics prize if they're influential enough. The first true computer scientists were mathematicians, and I still feel that much of the theoretical work in the field is closer to "mathematics useful for computers" than a separate field.
In the spirit of the Nobel Prize, "mathematics with the greatest humanitarian impact" leaves plenty of room for the inclusion of influential pieces from theoretical computer science, especially as the prizes within mathematics that do exist already include loads of mathematics that requires computers to prove or solve.
I guess, but they certainly feel categorically different than something like Runge-Kutta. They're more about the study of algorithms, which is generally where I've drawn the line of "computer science vs math".
Still kind of weird that the prestigious award that everyone has heard of doesn't have a mathematics category.
What do you think could have reasonably been awarded?
Some people still alive who made important contributions to this are Rees and Sunyaev.
The measurement of the Hubble constant using delay times between multiple images of lensed supernovae.
The first transit spectrum of an exoplanet atmosphere.
The first directly imaged exoplanet.
(They could hand out Nobel Prizes in the field of exoplanets like candy.)
"conferred the greatest benefit to humankind"
So while those things are cool and groundbreaking, I'd say they have yet to cross the threshold into "greatest benefit to humankind"
Detecting gravitational radiation from the merger of two black holes was an incredible step forward for our understanding of the universe. It will not practically change your life in any way.
The most important factor tends to be the positive impact on society, as that's one of the prize's core tenets.
One could also argue that in 1895, applied computer science and information theory would be considered physics.
Making it even more baffling that this won then
What about:
- Improved weather forecasts
- Protein folding
- Medical imaging and diagnostics
- Text-to-speech and voice recognition
- Language Translation
- Finance fraud detection
- Supply chain and logistics optimization
- Natural disaster prediction
Also, it's a price based on a will from over 100 years ago, managed by a private institution. The institution can be as arbitrary as it wants. They don't need to answer to anyone.
Coincidentally, Giorgio Parisi got the 2021 Nobel Prize for his work on spin glasses
- 2018 was for chirped pulse amplification, which is most commonly used in medicine (LASIK surgery for example)
- 2014 was basically for LED lights
- 2010 was for a method for producing graphene
- 2009 was for both charge-coupled device, which is a component for digital imaging (including regular consumer digital cameras), and fibre-optic cables
The idea that academic disciplines are in any way isolated from each other is nonsense. Machine learning is computer science; it's also information theory; that means it's thermodynamics, which means it's physics. (Or, rather, it can be understood properly through all of these lenses).
John Hopfield himself has written about this; he views his work as physics because _it is performed from the viewpoint of a physicist_. Disciplines are subjective, not objective, phenomena.
I would prefer if there was an actual Nobel Prize for Mathematics (not sure if the Fields would become that, or a new prize created).
In 1968, the Nobel Foundation accepted a donation from the Swedish central bank to establish a prize in economics, but in hindsight that was a pretty bad idea, and the probability of them accepting future donations to establish prizes in other fields is very slim.
What makes the Nobel Prize unique is that almost anyone, even the general public or pioneers in other fields, can hear you received one and be very impressed. You'll generally be met with blank stares if you told anyone not in computing (or an enthusiast) that you'd won a Turing Award. Even if you then said, "It's the most prestigious award in computing!", it wouldn't hit the same.
Awards like these are basically only really worth their social recognition, so it's no surprise people would still want a Nobel in Computing/Mathematics etc even with the Turing/Fields etc existing.
This is overtly driven by expediency or business interests and ignores all societal problems.
Be happy that you get your billions for CERN and keep a low profile.
If I was the awardee I'd consider declining just out of respect to the field.
"The Hopfield network utilises physics that describes a material’s characteristics due to its atomic spin – a property that makes each atom a tiny magnet. The network as a whole is described in a manner equivalent to the energy in the spin system found in physics, and is trained by finding values for the connections between the nodes so that the saved images have low energy"
"Hinton used tools from statistical physics, the science of systems built from many similar components."
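Spelled out, the energy both quotes gesture at is the standard Ising-style form (written here from general knowledge of Hopfield networks, not quoted from the Nobel material):

```latex
E = -\frac{1}{2} \sum_{i \neq j} w_{ij}\, s_i s_j, \qquad s_i \in \{-1, +1\}
```

with Hebbian weights $w_{ij} = \sum_{\mu} \xi_i^{\mu} \xi_j^{\mu}$ chosen so that each stored pattern $\xi^{\mu}$ sits at a local minimum of $E$.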
Given the line of reasoning this has now opened up, Alan Turing should be awarded one posthumously in every field.
(1) Newton would be a strong contender on a "for all time" basis, but even he would've probably needed to share it with Leibniz, which would have driven him absolutely ~b o n k e r s~, like wet hornet in a hot outhouse mad, LOL.
Like if there weren't many good movies they wouldn't start giving the Oscar for Best Picture to a videogame.
edit: s/rewarded/awarded
However, from memory the list of biggest awards for CS/Math are:
Fields medal
Abel prize
Turing award
Gödel Prize
[1] You have to be under 40. https://www.fields.utoronto.ca/aboutus/jcfields/fields_medal...
The Turing Award is considered by most to be the highest award in computer science.
I hope he turns it down, but it's a monetary prize too and it takes a lot of dedication to science to turn it down.
If he truly believes what he says he believes.
If he’s a socialist, he should donate all his property to the government, including the entire prize money.
The Sherrington–Kirkpatrick model of spin glass is a Hopfield network with random initialization.
A Boltzmann machine is a Sherrington–Kirkpatrick model with an external field.
This is a prize in physics given for a novel use of stochastic spin-glass modelling. Unexpected, but saying it is not physics is not correct.
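The connection is concrete enough to fit in a few lines. A toy sketch of a Hopfield network (my own minimal code, not from any of the cited papers): Hebbian weights define an Ising-style energy, and asynchronous updates roll the state downhill into a stored pattern.

```python
def train(patterns):
    """Hebbian rule: w_ij accumulates x_i * x_j over stored patterns (zero diagonal)."""
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j]
    return w

def energy(w, s):
    """Ising-style energy E = -1/2 * sum_{i,j} w_ij s_i s_j; updates never raise it."""
    n = len(s)
    return -0.5 * sum(w[i][j] * s[i] * s[j] for i in range(n) for j in range(n))

def recall(w, s, sweeps=5):
    """Deterministic asynchronous updates: flip each unit to align with its local field."""
    s = list(s)
    for _ in range(sweeps):
        for i in range(len(s)):
            h = sum(w[i][j] * s[j] for j in range(len(s)))
            s[i] = 1 if h >= 0 else -1
    return s

pattern = [1, 1, -1, -1, 1, -1, 1, -1]
w = train([pattern])
noisy = list(pattern)
noisy[0] = -noisy[0]  # corrupt one bit
print(recall(w, noisy) == pattern)  # True: the net falls back into the stored minimum
```

Swap the deterministic update for a stochastic (temperature-dependent) one and you are essentially sampling a spin glass, which is the bridge to Boltzmann machines.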
There is so much more than fundamental physics, and there is much more in physics than breakthrough discoveries. Medical physics, to name a fun field everybody always forgets about, has been studying and using neural networks for about forty years. Applied physics, biophysics, atmospheric physics. Even particle physics is mostly data science these days.
This idea that physics should only be about fundamental theories and discoveries is really detrimental to the field and leads to the false idea of stagnation that permeates this whole thread.
However, it is weird for the committee to give a prize for theoretical physics without an experiment. It is doubly weird when they already made this "mistake" in 2021 with Parisi, who was the odd one out among the geophysicists, and are giving another prize in spin glass/stat phys... why?
Why did David Sherrington and Scott Kirkpatrick not share the prize for the Sherrington–Kirkpatrick model? Hopfield references their work, doesn't he?
Multiple theoretical physicists working on black holes (Hawking and others) didn't get a Nobel, because black holes were not confirmed or the theory could not be tested.
It's mathematical/CS work. The connection to actual physical laws or phenomena is even more tenuous than the prize for exoplanets a few years ago.
The Nobel prize physics committee has made itself a joke, and probably destroyed the credibility of the prize.
Neural networks are used in tons of data pipelines for physics experiments, most notably with particle accelerators.
The Nobel Prize is also occasionally awarded to engineers who develop tools that are important parts of experiments. 2018 for example was awarded for chirped pulse amplification, which is probably best known for being used in LASIK eye surgery, but it is also used in experimental pipelines.
With this argument you could even say Bill Gates should get an award for inventing Windows and popularizing the desktop computer... Or at least Linus Torvalds, since those pipelines are probably running Linux...
From now on I'll always see it as just another nobel peace prize.
This is beyond ridiculous.
It feels like the Nobel committee’s decision is an indictment on the lack of impact of modern physics. They had to stretch definitions and go into AI, to get something that they found impactful enough for receiving a Nobel prize. This is impact outside of physics, impact on a broader societal sense that physics in the past had in spades and AI has now.
Why does it have to be? It's a different field. You would not expect the Nobel in chemistry to be awarded to Linus Torvalds, impact or no impact.
And the connection to physics is beyond tenuous here. The toy neural networks they cite in the document, including the Boltzmann machine, have very little to do with the power of ANNs to learn complex patterns that made such a splash recently. That is basically what the Bitter Lesson is all about. The interesting stuff does not arise from clever theory, but from practical tinkering and loooooots of compute.
but we haven't found new physics with or without ML, making this prize a little weird.
"With their breakthroughs, that stand on the foundations of physical science, they have showed a completely new way for us to use computers to aid and to guide us to tackle many of the challenges our society face. Simply put, thanks to their work Humanity now has a new item in its toolbox, which we can choose to use for good purposes. Machine learning based on ANNs is currently revolutionizing science, engineering and daily life. The field is already on its way to enable breakthroughs toward building a sustainable society, e.g. by helping to identify new functional materials. How deep learning by ANNs will be used in the future depends on how we humans choose to use these incredibly potent tools, already present in many aspects of our lives"
Whether or not the original Higgs discovery decay channels used ML, confirming that it was in fact the Higgs required measuring the decay to b-quarks, which has used ML since the LHC started taking data.
Over the lifetime of the LHC, the backgrounds got around 10x smaller for the same "efficiency" (fraction of true b-quarks tagged), if you want to be pedantic about the definitions. We've used NNs in b-tagging for decades now, so it was always possible to dial in a threshold for tagging that was e.g. 70% efficient.
Transformers gave us a factor of a few smaller backgrounds in the last few years though [1].
[1]: https://atlas.web.cern.ch/Atlas/GROUPS/PHYSICS/PLOTS/FTAG-20...
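The "dial in a threshold" step is just a cut on a classifier score. A toy sketch with invented Gaussian score distributions (not ATLAS's actual tagger): pick the cut that keeps 70% of signal, then measure how much background sneaks past it.

```python
import random

random.seed(0)
# Toy score distributions: b-jets (signal) tend to score higher than light jets.
signal = [random.gauss(1.0, 0.5) for _ in range(10_000)]
background = [random.gauss(0.0, 0.5) for _ in range(10_000)]

def threshold_for_efficiency(scores, eff):
    """Choose the cut that keeps a fraction `eff` of the signal scores."""
    ranked = sorted(scores, reverse=True)
    return ranked[int(eff * len(ranked)) - 1]

cut = threshold_for_efficiency(signal, 0.70)  # a "70% efficient" working point
mistag = sum(b >= cut for b in background) / len(background)
print(f"cut = {cut:.2f}, background mistag rate = {mistag:.1%}")
```

A better classifier (like the transformer-based taggers mentioned above) separates the two distributions further, shrinking the mistag rate at the same fixed efficiency.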
Kissinger was one of the most prominent disrupters of world peace in the postwar era but that didn't stop him winning the peace prize. Churchill won the literature prize for defeating Hitler. The blue led guys a few years back didn't do much except make a thing that would go on every single consumer gadget and disrupt my sleep but they won the physics prize.
Even when they get it right they often get it wrong. For example I believe Einstein supposedly won for "especially his work on the photoelectric effect" rather than relativity.
It’s no exaggeration that Einstein’s work on the photoelectric effect was as important as special or general relativity, and it had the advantage of strong experimental verification by 1921.
The main reason that prize is remarkable is that Einstein himself hated quantum mechanics - but that doesn’t dispute the work’s importance.
I'm not saying that photoelectric effect didn't deserve a Nobel Prize. But relativity completely supplanted Newtonian Physics, and Einstein played a much greater role in this revolution than he did in that of Quantum Mechanics.
Also, I believe historical records have made it clear that relativity, even if it was still considered controversial in the '20s (and so not mentioned specifically), was indeed part of the reason he was awarded the prize.
Also, consider WHY it was still controversial, despite evidence piling up even for relativity. It was seen as such a fundamental shift away from common-sense understanding of the physical world that people refused to believe it, despite evidence.
Just like how many people to this day do not believe it's possible to build AI out of regular computers, as their intuition tells them that some magic voodoo needs to be there for "true" intelligence.
Just adding to this: this is because relativity hadn't been experimentally verified (i.e. it wasn't certain it reflected reality) at the time.
Besides that, Einstein received the prize in 1921, whereas the Eddington experiment in 1919 generally counts as the first experimental verification of GR.
Today we could argue about it due to the importance of solar panels, but that was hard to forecast in 1921. Also, without GR there would be no GPS so it's not like it doesn't bring benefits to humanity.
On the other hand, it would be impossible to make those adjustments without someone coming up with GR :-)
"for his services to Theoretical Physics, and especially for his discovery of the law of the photoelectric effect"
https://www.nobelprize.org/prizes/lists/all-nobel-prizes-in-...
Einstein won the physics prize for the photoelectric effect because it was observable and had real-world applications; if GPS had actually existed while he was alive (yes, I know this is a stretch) he would have gotten it for relativity.
Blue LEDs allow you to access more of the color spectrum for LEDs in general, and they were not easy to make.
For this year it does feel like a very large leaning into practical applications instead of physics though. Did we run out of interesting physics in the last year?
So I agree that the peace prize committee has made some bad choices, but they do have an impossible job.
[1] Note to future law-enforcement: I am honestly kidding. I wouldn't hurt a fly, officer.
> They trained artificial neural networks using physics
Here's from Nobel Prize official website: https://www.nobelprize.org/prizes/physics/2024/press-release...
[1] https://github.com/milosgajdos/gopfield [2] https://cybernetist.com/2017/03/12/hopfield-networks-in-go/
When was the last time we gave someone a Nobel in physics who hasn't bothered writing a book? We have professors dying without a hint of recognition for their work. The whole ordeal is terrible; it's like giving Einstein a Nobel in medicine because his research on the photoelectric effect opened a new domain in biotechnology, and because that's the new cool thing in the market, we'll go with that.
A lot of outsiders think "physics is dead", but they never dare look into the research going on inside it. It is not at all dead. And arguing that failing to have definitive answers to the Big questions means being 'dead' is a terrible way to look at the field. Math still doesn't have a definite way to look at primes, and for centuries we didn't have a definite way to look at algebraic equations of higher degrees and their general solutions. That didn't make math die; that's what keeps it alive. I am fine with Hopfield for once, maybe, but seriously, why Hinton?
J/K'ing. That said, Jurgen has done a lot of important work, and may well be a bit under-appreciated.
Also, arguing that NNs are used in physics so a Nobel Prize is okay is like asking for Stephen Wolfram to be awarded the Nobel Prize for Mathematica, which is used much more in physics as a tool. And he is a physicist and made contributions to the field of numerical relativity (the reason he created Mathematica in the first place).
The royal science academy fucked up so much with this choice.
[0] https://en.wikipedia.org/wiki/Alfred_Noble_Prize
EDIT: typos
https://en.wikipedia.org/w/index.php?title=John_Hopfield&dif...
[citation needed]
I suppose some might argue that being awarded the Nobel Prize in Physics is enough to call yourself a physicist.
…it does have the unfortunate implication, however, that nominations need not be restricted to physicists at all since any winner becomes a physicist upon receipt of the prize.
It’s sort of like the No True Scotsman but inverted, and with physicists instead of Scotsmen.
Or semiconductor manufacturers.
All the math and CS needed for AI can fit on a napkin, and had been known for 200+ years. It's the extreme scaling enabled by semiconductor science that really makes the difference.
(When the required math was invented is a different question, but I doubt all of it was known 200 years ago.)
There were also plenty of "hacks" involved in making the networks scale, such as dropout regularization, batch normalization, semi-linear activation functions (e.g. ReLU) and adaptive stochastic gradient descent methods.
The maths for basic NNs is really simple but the practice of them is really messy.
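Two of those "hacks" really do fit on the napkin. A sketch of ReLU and inverted dropout as they are commonly described (plain Python, not any particular framework's implementation):

```python
import random

def relu(xs):
    """Semi-linear activation: pass positives through, zero out negatives."""
    return [max(0.0, x) for x in xs]

def dropout(xs, p=0.5, training=True):
    """Inverted dropout: zero each unit with probability p during training and
    rescale survivors by 1/(1-p) so the expected activation is unchanged;
    do nothing at test time."""
    if not training or p == 0.0:
        return list(xs)
    return [0.0 if random.random() < p else x / (1.0 - p) for x in xs]

print(relu([-2.0, 0.5, 3.0]))  # [0.0, 0.5, 3.0]
```

The messy part is exactly what the comment says: none of this is deep math, but knowing when and where to apply each trick took years of empirical tinkering.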
[1] https://scholar.google.com/citations?view_op=view_citation&h...
This isn't really true. If you read a physics textbook from the early 1900s, they didn't really have multivariate calculus and linear algebra expressed as concisely as we do now. It would take several napkins. Plus, statistical mechanics was quite rudimentary, which is important for probability theory.
This is completely incorrect.
Computer science wasn’t even a thing 100 years ago.
If Newton had the machinery to fit large models to data, he would have done so. No doubt.
Again, I’m unsure that calculus existed at sufficient level 200 years ago — it didn’t appear in modern form from either Leibniz or Newton.
Reminds me of the old classic Physics and Politics by Walter Bagehot.
Well, you’re talking about it.
I know we are joking around here, but damn, just for that alone I'm happy that he got it.
For whether it is actually physics? That I'll leave for another discussion.
Speaking as a onetime physicist now in ML...
Statisticians would never have done that due to parsimony and something something Bayesian.
Engineers would never have done it, nor mathematicians either.
It took Computer Scientists because it is computation.
This has changed my point of view to where math is kind of derived from physics, as the axioms (and even the derivation rules, like modus ponens) are chosen because they respect what feels intuitively 'logical'. But this intuition cannot be disentangled from physics, as it was a product of physics.
Modelling learning as entropy and heat in his Boltzmann machines was genius, as was the 1980s backprop paper.
Geoff tirelessly evangelised neural nets and machine learning right from his 1970s PhD days at Edinburgh.
Despite being in the academic wilderness during the many decades of symbolic AI.
Moore’s law (and parallel processing via GPUs) finally caught up with Geoff’s vision and proved him right.
Well deserved !
Consider that in 1900 the atom wasn't discovered yet, within around 25 years the basic principles of quantum physics were established, to say nothing about discoveries in cosmology (GR + big bang).
Also, many of the underlying theories in machine learning display deep analogy with physical laws we are already familiar with, e.g., thermodynamics.
Machine learning is much bigger than chatbots.
I don’t hate physics, but if you stretch terms like that they tend to lose meaning.
Is statistical mechanics not a physics discipline?
>the adage that there are now three pillars to science, the third being simulation (after theory and experiment)
Any specific place I could learn more about this? (aside from Google obv.)
I do simulation for a living, so that is mega-interesting to me.
Putting aside the fact that it's also entirely reasonable to say that Musk, Bezos etc, while having changed the world, have not really personally made breakthroughs in fundamental science of the level as to deserve a Nobel prize; I wonder if the Nobel Foundation avoids figures like that because of the parallels.
He's also a professor though, so maybe that doesn't meet your criteria.
You can continue looking yourself: https://www.nobelprize.org/prizes/lists/all-nobel-prizes-in-...
Or maybe you can ask ChatGPT for a better summary?
Physics is the study of the physical world, and learning, imagination, creativity, are all phenomena that we observe in the physical world but have only a primitive understanding of. It's a staggering advancement that we can now simulate key aspects of each.
The simplest cases will have been long enabled by simpler regressions and such, of course, but the more complex pattern recognition appears to be appreciated.
Machine learning research is the logical entry point as the ‘particle physics’ of cognition and consciousness.
I think in retrospect we will say it was obvious why so many physics PhDs were working on ML during this era.
Are experimental Physicist just Engineers?
Are String Theorists just Mathematicians?
Is John von Neumann not a Physicist because he also worked with Computers?
Awful lot of nit-picking in this thread.
These are Mathematics/CS techniques and nothing whatever to do with core Theoretical/Experimental Physics notwithstanding that they may have been inspired from Physics. There are plenty of Physics Researchers toiling away at real hard problems of the Physical World and instead of recognizing them the Committee has gone with "market fads" which themselves were only realizable due to Hardware advances at scale over the past decade. With this award they have disheartened and demotivated all true Physics Researchers which is a huge disservice to the Hard Science Community.
This is not to say that AI/ML researchers/community are not worthy of recognition. But they should not be folded under Physics rather a new category should have been created and they then awarded under it.
Some think measurement is engineering. So are the physicists who focus on building an apparatus to test a theory engineers? Are only the theoretical people doing math physicists? Even though at that point they are only doing math?
Is Information Theory and Entropy a Computer Science subject or a Physics subject?
Maybe it isn't 'machine learning' but definitely the lines are blurring between physics and other fields of information.
https://writings.stephenwolfram.com/2024/10/on-the-nature-of...
Interviewer: "Take you more seriously when you warn of future dangers?"
Hinton: "Yes."
Schmidhuber: Studied cs and math, developed algorithmic theory of the computable universe.
Hinton: wins the Nobel Prize in physics.
I see a lot of people saying "physics has stalled" etc., which is not the case. It may be the case for high energy physics (I would not even make that statement myself with any confidence), but there is a lot of other physics being done.
Recommended reading: Lindsay, G. W. (2021). Convolutional neural networks as a model of the visual system: Past, present, and future. Journal of cognitive neuroscience, 33(10), 2017-2031. https://direct.mit.edu/jocn/article-abstract/33/10/2017/9740...
In fact it is the reverse: the recent success of deep learning has sparked a race in neuroscience to find processes in the nervous system that might mimic deep learning, and in particular to build biologically plausible models of how the brain might implement gradient descent or, more generally, credit assignment.
“Neural” networks are about as close to an actual nervous system as the “Democratic” Republic of Korea is to democracy.
In psychiatry, there is a certain amount that we continue to study social standards of normalcy in other (including historic) societies to determine what should count as a mental disorder, but more to make sure we aren't doing a 21st century equivalent of labeling something as a demon possession because it contrasts with our current deeply held social norms.
So what is the meaning of to do with and nothing to do with? Inspiration seems to be a relationship.
Consider a different relationship between cellular biology and the Cells at Work anime. Clearly any relationship is unidirectional. Any cellular biology learns nothing from the anime, but the anime wouldn't exist without cellular biology.
Do we say the show has nothing to do with cellular biology? That doesn't seem right to me, given it depends upon it despite taking an amazing degree of artistic freedom.
When I see someone trying this hard to be smart I just hear "REEEEEEEEE" or "Well actually......"
So much about computer science has been inspired from other fields such as biology. Polymorphism and object oriented programming, reification, neural networks and in particular convolutional neural networks, genetic algorithms...
If anything, it teaches the value in learning a topic and then applying it directly within computer science. The strength of computer science lies in its ability to adapt and incorporate ideas from other domains to push the boundaries of technology.
Seemingly because, even if the math or algorithms came from a physicist solving physics problems, since it didn't involve some theoretical particles it isn't physics-y enough to get a Nobel in Physics.
EDIT: add minor clarification.
As expected iconoclast Sabine Hossenfelder is quick out of the gate with sarcasm and commentary on this one.
https://en.wikipedia.org/wiki/Nobel_Memorial_Prize_in_Econom...
It was not one of the original five, but it was endowed by a bank.
SO: you tech billionaires, why don't you endow a Prize in Computer Science? That would end the dispute about whether ML is "really" physics.
Statistical physics itself is hardly real physics, I also like how they avoided the term computational physics, which is what it's commonly known as. I suppose that might have given it away too quickly.
Like, let's survey what our PhD students plan to do next in their careers. Oh wow, 62% say they are going into... AI? Well damn, why don't we just give the Nobel Prize for that, if it's such a hot field in physics right now?
Uhm, no. This has negatively and retroactively impacted my appreciation of the award. If I were in academic CS I would have blatantly rejected it. This is ridiculous.
I get the importance of AI development, but Physics?
> So they gave the Nobel Physics prize to AI bros before honoring another woman. Five women have ever been so honored, versus 221 men.
> This is your regular reminder Wikipedia refused an attempt to create a page for Donna Strickland with "This submission’s references do not show that the subject qualifies for a Wikipedia article" not half a year before she became a Nobel laureate. Katalin Kariko's page was not created until April 27, 2020.
#EverydaySexism #SystemicMisogyny
It got a decent amount of favs and retoots - and no angry responses. Now, when I posted this here, it got flagged.
That tells a hell of a lot about the people who visit this site. It's a great opportunity to check your own biases. That's why I am reposting it with this note.
With astrophysics, we're probably going to need the more sensitive gravitational wave detectors that are in development to become operational for new big breakthroughs. With high energy physics, many particle colliders and synchrotron light sources seem to be undergoing major upgrades these days. While particle colliders tend to get the spotlight in the public eye and are in a weird spot regarding the expected research outcomes, light sources are still doing pretty well afaik.
This Nobel I think is mainly because AI has overwhelmingly dominated the public's perception of scientific/technological progress this year.
AFAIK synchrotron light sources are tools for materials science and other applied fields, not high energy physics. Did I miss something?
I am also puzzled by the "many particle colliders". There is currently only one capable of operating at the high energy frontier. It's getting a luminosity upgrade [1] which will increase the number of events, but those will still be the 14 TeV proton-proton collisions it's been producing for years. There is some hope that collecting more statistics will reveal something currently hidden in the background noise, but I wouldn't bet on it.
[1] https://home.cern/science/accelerators/high-luminosity-lhc
When you put it like that, yeah, I was kinda being stupid. During my stint doing research at a synchrotron light source I was constantly told to focus on thinking like a physicist (rather than as a computer engineer) and most of the work of everyone who wasn't a beamline scientist was primarily physics focused, which is what led me to think that way. But you're right in that it might not make much sense for me to say that makes them high energy physics research tools first.
>I am also puzzled by the "many particle colliders". There is currently only one capable of operating at the high energy frontier. It's getting a luminosity upgrade [1] which will increase the number of events, but those will still be the 14 TeV proton-proton collisions it's been producing for years. There is some hope that collecting more statistics will reveal something currently hidden in the background noise, but I wouldn't bet on it.
The RHIC is also in the process of being upgraded to the EIC. But overall, yes, that's why I said they were in a 'weird' spot. I too am not convinced that the upgrades will offer Nobel-tier breakthroughs.
Edit: Expanded a few times.
Oh and the death of string theory!
"Theoretical physics" is such a big and ambiguous concept that physicists tend not to use the term in discussions. Theoretical work often involves a lot of numerical simulation on supercomputers these days, which are kind of their own "experiments". And it is usually more productive to just mention the specific field, e.g. astronomy, condensed matter, AMO etc, and you can be sure there are always a lot of discoveries in each area.
It brings the award into disrepute, or at least in a Feynman way, exposes the inherent disreputability of awards themselves: who are they to award such a prize on behalf of physics?
Awards committees: self-serving self-appointed cliques of prestige chasers
https://www.kva.se/en/prizes/nobel-prizes/the-nomination-and...
https://www.kva.se/en/about-us/members/list-of-academy-membe...
> who are they to award such a prize on behalf of physics?
They're not awarding anything in the name of physics, they're awarding a prize in the name of the Nobel committee.
Laypeople need a simple way to know who's who in advanced research fields; without Nobel Prizes (or any other committee's awards) we don't get to have that.
If people get to ignore such topics (more than they already do), it's likely politicians and universities react accordingly and funnel funds to other enterprises.
All these prizes (I'd say the writing prizes are much worse) are typically super corrupt, but at least they keep the field in people's minds.
I think first you're underestimating "laypeople" which seems to include many scientists who are not physicists, and second you are forgetting that many of the scientists the "lay" public knows as the greatest of all times never received a Nobel, or any other famous prize: Einstein, Newton, Kepler, Copernicus, Galileo, etc etc.
Where I live, in my estimation the 'educated "lay" public' would probably have heard of all the names mentioned, but with even worse notions of what their actual contributions were for Kepler.
As a point of comparison, there are ~540 Premier League football players, with an average salary of 3.5 million pounds. (Yes, that's average, not median, but fewer than 20 of them earn under 200k.) It's not _that_ exclusive of a club, and the remuneration is insanely disproportionate compared to academics; I highly doubt there are hundreds of researchers earning millions.
So, yes, it's pretty odd to have some random people dish out these prizes, and they are a drop in the pond. However, I personally feel it's way too little, and that the targets of the prizes are far more deserving - even if it's a popularity contest - than random entertainers (even if they are quite entertaining). But, it's up for argument, and the markets obviously don't seem to agree with me.
> the remuneration is insanely disproportionate
I once pointed out that Kevin De Bruyne, on his own, gets paid almost half as much (~21M) as the entire salary cap of the Rugby Union Premiership (~2022, 50M) (to make the point there's much more money in football than rugby.)
Even fusion is high school science fair stuff.
Spallation, antiprotons, quark gluon plasmas? Now you're talking.
What need of a layperson does knowing "who's who" in advanced research fields fill?
Here's another good question related to that: Who is qualified to simplify that so that the need is filled?
You can hand out the MJBurgess awards for non-NN-related physics today!
So it is an honor bestowed by your peers, the ones who would most appreciate the quality of the work and the work that went into it.
But honestly, I’d still prefer cash.
Nobel prize jumping on the bandwagon, just like they did for mRNA after covid. At least that was related to medicine.
The first 2 paragraphs of the linked pdf read like a joke. Like it’s a parody announcement.
Of course the prize is really for MLP backpropagation (or how it found applications), but I guess that's not physics enough.
Hopfield networks never really found use either, but they are sort of related to Ising models and NNs, so I guess it's physics then.
One could argue it's closer to mathematics than physics, but if you'd say to him that someone made sand think like a human he might even put it under the medicine category.
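For reference, the "MLP backpropagation" in question is small enough to write out by hand. A toy one-hidden-layer network learning XOR by manual chain rule (a sketch with made-up hyperparameters, not the historical 1986 setup):

```python
import math
import random

random.seed(1)
X = [(0.0, 0.0), (0.0, 1.0), (1.0, 0.0), (1.0, 1.0)]
Y = [0.0, 1.0, 1.0, 0.0]  # XOR targets

H = 4  # hidden units (arbitrary choice)
w1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(H)]
b1 = [0.0] * H
w2 = [random.uniform(-1, 1) for _ in range(H)]
b2 = 0.0

def sig(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward(x):
    """Forward pass; the hidden activations are cached for the backward pass."""
    h = [sig(w1[i][0] * x[0] + w1[i][1] * x[1] + b1[i]) for i in range(H)]
    return h, sig(sum(w2[i] * h[i] for i in range(H)) + b2)

def mse():
    return sum((forward(x)[1] - y) ** 2 for x, y in zip(X, Y)) / len(X)

before = mse()
lr = 0.5
for _ in range(5000):
    for x, y in zip(X, Y):
        h, out = forward(x)
        # backward pass: chain rule on squared error, sigmoid derivative s*(1-s)
        d_out = (out - y) * out * (1 - out)
        for i in range(H):
            d_h = d_out * w2[i] * h[i] * (1 - h[i])  # gradient w.r.t. hidden unit i
            w2[i] -= lr * d_out * h[i]
            w1[i][0] -= lr * d_h * x[0]
            w1[i][1] -= lr * d_h * x[1]
            b1[i] -= lr * d_h
        b2 -= lr * d_out

print(f"mean squared error: {before:.3f} -> {mse():.3f}")
```

The trick (caching the forward-pass intermediates so each gradient is computed once) is exactly the memoization point; everything else is calculus.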
All the other prizes are awarded in Sweden: by the Swedish Academy (literature), the Royal Swedish Academy of Sciences (physics and chemistry), and the Karolinska Institute (physiology), all professionally established organizations at the time of Nobel's death, with the other activities professional organizations have.
The Norwegian Nobel Committee, while in theory independent, is just people appointed by the parliament, with no need for professional standing in the field in which they are supposed to award the prize, and it always shows.
Obama’s prize is hardly the first egregious one, or even the most outrageous: Henry Kissinger got one.
So the debate over the Swedish technical prizes comes down to changed standards, while the peace prize has been a disaster for half a century or more, with a non-transparent process run by a committee selected by ex-MPs.
Unless the process for selecting the committee (unqualified, political) and the process for selecting the winner (opaque and inconsistent criteria) are fixed, we will have poor candidates winning it in the future too.
Unlike the economics prize, the Peace prize is one that Alfred Nobel himself actually wanted to be awarded.
One could also argue the Peace prize should be the most important of them all. Nobel wished to mitigate the military applications of his invention, and the Peace prize should be what achieves that most.
The continued poor judgment of the parliament in selecting the committee, and of the committee in selecting winners, devalues the prize for future winners and devalues the good work the Swedish institutions have put into making it the premier award in their fields.
Nor did Jack Kilby's invention of the IC, but they still gave it to him.
You could probably argue the same for the invention of the transistor...
But I agree, this feels like a stretch.
i.e. Hinton has already won a Turing Award in 2018, and there is no Nobel for computer science
And this work was already recognized to have impact ~12 years ago, when he auctioned his company of 2 grad students to Google/Microsoft/Baidu/Facebook, for over $40M, ultimately going with Google [1]
---
i.e. IMO it feels a little late / weird / irrelevant to be giving this award in physics to machine learning research – it doesn’t feel like that would have happened without the news cycle
At least IMO the scientific awards are more interesting when they're leading indicators, not trailing ones -- when they are given by peers, acknowledging impact that may happen in the future.
Because it often takes decades to have impact, and it may occur after the researcher has passed away
---
[1] https://www.amazon.com/Genius-Makers-Mavericks-Brought-Faceb... - good book if you’re interested in how technology transfer happened in the last 10-15 years
>scientific awards are more interesting when leading indicators
Peter Higgs waited 50 years, the Nobel is not a "leading indicator." If it was, it would be given out on the basis of the "hype cycle," which would not be very helpful to anybody.
Sometimes science/engineering turns out like that
e.g. I think Claude Shannon is like that -- his impact continues to rise, and he's viewed as more important after he died
He apparently never won a Turing Award or Nobel Prize, probably because there was and is no Nobel in computer science
https://en.wikipedia.org/wiki/Claude_Shannon#Awards_and_hono...
So I guess I mean "drawing attention to something that would have not otherwise had attention", and based on the consensus of people working in the field
Brilliant theorizing can be both brilliant and wrong.
I do think work on neural networks does rise to the level of a Nobel Prize. So I don't have any problem with this work getting such high-level recognition. But I really struggle with the physics classification and the side effect of omitting an award to physicists this cycle.
https://www.nobelprize.org/prizes/physics/2018/summary/
Peter Shor also got an award (albeit non Nobel) for something that overlaps Physics with Math + CS.
https://news.mit.edu/2022/shor-spielman-breakthrough-prize-0...
I'd rather see a Nobel Prize in Math/CS/IS, but if I had to choose where this type of work would be shoehorned into an existing Nobel prize category, physics would be it.
The Nobels are grossly overrated and the idea that one can follow the most important scientific developments of the past century by just listing off the Nobel Prizes since 1905 is one best abandoned.
In this case, there's a good argument that Hopfield had conducted strong work as a physicist and in physics, but Geoffrey Hinton has never worked as a Physicist, at best adopting some existing things from physics into cognitive science use cases. In any case, what they've been given the prize for is work where they've not contributed to the understanding of the world of physics - it's not even really an arguable case where this is work that crosses over between Physics and another field either. It'd be like if Black or Scholes had been given the Physics prize rather than Economics because their famous equation can be re-written in Schrodinger equation form.
He uses words, and his lyrics have meaning, like any literature. You cannot say poetry is not literature. Then why not poetry with music? That is probably even more traditional, as many poems were songs. In some cultures, poetry must be singable.
These works use physics, but they are not in the field of physics. Otherwise anyone who uses QM could get a Nobel prize, and all chemists could get one, since they all use physics. The work really needs to be in the physics field; you can use other methods along the way, like computing or mathematics.
So what if an energy function lets you approximate the number of macro-states it can capture? Should every mathematics paper with Lagrange multipliers be put up for nomination? Every poll that uses the law of large numbers, and thus, entropy? Surely the computer scientists building the internet need to be included as well, since their work is based in information theory.
Or maybe, hear me out, we reserve the Nobel Prize in physics for advances in the physical sciences, understanding physical reality or how to bend it to our will.
The prize was awarded for "AI", and the tenuous links to physics of some irrelevant models are just an excuse.
But let's not forget that the brain is a physical system and that neural networks are part of the reason we understand the brain as well as we do.
There was a long period where people like Chomsky thought the brain couldn't learn fast enough and that knowledge had to be innate.
We don’t understand much then.
“These artificial neural networks have been used to advance research across physics topics as diverse as particle physics, materials science, and astrophysics,” Ellen Moons, chair of the Nobel Committee for Physics, said at a press conference this morning.
It may have been state-of-the-art in the 1980s, but now it is a bit late.
Very smart people in their time though.
In current times, a global prize for the transformers folks would at least make more sense considering the context (despite it not being physics).
You're completely incorrect to say RBMs were of theoretical interest only. They have had plenty of practical use in computer vision/image modelling up to at least a few years ago (I haven't followed them since). Remember the first generative models of human faces?
Edit: Wow, Hinton is still pushing forward the state of the art on RBMs for image modelling, and I am impressed with how much they've improved in the last ~5 years. Nowhere near diffusion models, sure, but "reasonably good". [2]
[1] G.E. Hinton and R. Salakhutdinov, 2006, Science. "Reducing the Dimensionality of Data with Neural Networks"
[2] "Gaussian-Bernoulli RBMs Without Tears" https://arxiv.org/pdf/2210.10318
In any case, I'm anticipating a long blog post from Schmidhuber about this soon.
I just don't see how this can be claimed at this point.
At best you could argue that they're the same phenomenon, but then you might equally well argue electrification is just the consequence of steam engines.
I mean a Nobel prize category for advances in computing makes a lot of sense and I can easily name a whole list of people who could qualify. We'll need to be quick if we don't want to award some of them posthumously though.
(and, working in the field, I completely disagree with the qualification as "most ...." - it has well known deficiencies and has not yet stood the test of time)
That said, your idea would make physicists less outraged.
What matters for an award is that people recognise it as a prestigious accolade.
The economics prize, while not “official”, is still recognised by everyone in economics as the highest honour in the field. Who cares if it’s “official” or not?
Awards and prizes derive their value from their social recognition, which it has a solid amount of, at the very least.
You may not care about the distinction, and if so that's your prerogative, but this Memorial prize in Economics, despite sharing in the festivities, is not in the same category and that's what you keep running into seeing pointed out.
Nobody but a few nitpickers cares about your distinction, because it's not a real one. Might as well say "money is not valuable because the material it's made of has little intrinsic value". Well no, money is valuable because society has decided it is.
Sure
>You claimed you come across this a lot and don't get it. I just told you. Take it or leave it.
I'm not OP. And i don't think a few comments on Nobel prize threads is a lot in the first place. Nothing for me to "take".
It's awarded by the Royal Swedish Academy of Sciences, who also award the Physics and Chemistry prizes. Its winner is announced with the winners of the original prizes. The winner is included in the annual Nobel Prize award ceremony in Stockholm, and receives a medal, diploma, and monetary award document from the King of Sweden at that ceremony. The Nobel Foundation counts it when it says there are six prize categories, and includes its winners on its lists of Nobel laureates.
It only differs from say the Chemistry prize in that it was established in memory of Nobel instead of by Nobel and the prize money doesn't come from Nobel's estate.
Perhaps if the ACM renamed the Turing Award to "The Alfred Nobel Memorial Prize in Computer Science", the Nobel Foundation would let them get away with it.
Like medical doctors did with the term doctor and psychiatrists by claiming they were doing medicine.
The list is almost endless.
Nuclear weapons have not been used since 1945. Do you think that systems like Lavender won't be used in the future? Zero chance.
I disagree fundamentally with that, and don't see how we could reach a mutual understanding working from that axiom.
The first page of google for "Hopfield Networks" is "Hopfield Networks is all you need". No kidding...
https://www.reddit.com/r/LinkedInLunatics/comments/13tbfqm/w...
"Jet substructure at the Large Hadron Collider: A review of recent advances in theory and machine learning"
Does this mean if I'd use a deep understanding of birds to design way more aerodynamic airplanes, I could get the Nobel prize in physiology/medicine? Don't get me wrong, their work is probably prize worthy, but shouldn't the Nobel prize in physics be awarded for discoveries in the _physical world_?
I would strongly disagree with you there. It's the exact same idea as the least squares approximation or conjugate gradient method: create an energy function from a quadratic and minimize it.
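To make that concrete: least squares literally defines a quadratic "energy" E(x) = ||Ax - b||² and minimizes it, and plain gradient descent on that energy recovers the solution. The data below is a toy example chosen for illustration:

```python
import numpy as np

# Least squares as energy minimization: E(x) = ||Ax - b||^2.
# Three points on the line y = 1 + x, so the exact minimizer is [1, 1].
A = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 2.0, 3.0])

x = np.zeros(2)
for _ in range(500):
    grad = 2 * A.T @ (A @ x - b)  # dE/dx
    x -= 0.05 * grad              # descend the energy landscape

print(x)  # converges to [1, 1], the same answer np.linalg.lstsq gives
```

Conjugate gradient just descends this same quadratic energy along smarter directions; the "energy function" framing is shared plumbing across optimization, not physics per se.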
Obviously physicists take great interest in models of the brain or models of intelligence. All of physics is modeling, after all.
Influence and consideration of the Zeitgeist is also nothing new. Einstein got his prize for the discovery of the Photoelectric Effect and not Relativity.
[1] I know that some people have interpreted this quote in favor of the other sciences but I think that is far fetched.
There is only one similarly named prize, given in the name and memory of Alfred Nobel, which somehow is allowed to be part of the Nobel prize celebration.
I guess my opinion is in the minority, but I don't like that another prize hijacks the Nobel prize.
Neural networks and physical systems with emergent collective computational abilities https://www.ncbi.nlm.nih.gov/pmc/articles/PMC346238/
Accepting wrong arguments in support of positions you hold is not a good way to live your life. It leads to constipation.
The Society for Birdology now has the pleasure of jointly awarding posthumously Plato and Diogenes with the Distinguished Birdologist Award. Their findings on human anatomy used insights from birdology at critical points. Well done, lads!
Yes I think it does. But those planes would have to create one hell of a buzz!
Considering this, it feels odd not to allow a similar thing to happen in physics.
We have the Turing Award, the Fields Medal, and thousands of other awards for achievements that can't be categorized as Physics/Biology/Economics/Chemistry.
Also makes me sad when I think about all the physicists and engineers who have made the chips that can train multi-billion parameter neural networks possible. I mean the so-called “bitter lesson” of AI is basically “don’t bet against the physicists at ASML et al”. No prize for them?
(*) I have a humble masters in engineering physics, but work in ML and software.
The academics can have their awards, we smile seeing the world change a bit at a time.
And BTW, is the same not true for machine learning? I don’t think many have even read the Boltzmann machine paper. It’s basically a footnote in the history of deep learning. It has no practical significance today.
1. https://en.m.wikipedia.org/wiki/Extreme_ultraviolet_lithogra...
Despite all of the talk surrounding AI in the workforce/business world, I think it is actually most important in science.
Also, as a tool, it has not been as useful or as influential as they make it out to be, at least less influential than the work of Aharonov in terms of increasing our understanding.
This reinforces the reduction of ML to LLMs, just like the use of the term AI.
The dude who invented the MAD doctrine did not get the award, despite nuclear deterrence doctrine being related to the lowest level of war of any era since WW2.
But his platform of de-escalation and his plans for American foreign diplomacy were rewarded. He ultimately failed to reach those goals (especially with the escalation in Afghanistan and the emergence of groups like ISIS), but tbh the Iran agreement and the Pacific trade agreement, killed and buried by the next administration, would have created a massive buffer and solution for the two hotspots we currently experience: the Middle East (where terrorism is largely sponsored by Iran) and a Taiwan takeover by the CCP (which would also have been partially neutralised by the Pacific trade talks).
He was naive, in the way the world was naive about how much prosperity some leaders are capable of sacrificing. He underestimated how dumb and suicidal Putin could be, he underestimated how much China would be willing to sacrifice in terms of potential, and he underestimated how much latent violence the Middle East was capable of. But his Nobel Peace Prize was due to his campaign running on nuclear non-proliferation treaties and closer relationships with the Muslim world, which had been entirely antagonistic since Bush.
It’s called a Nobel prize and it was established by the will of Alfred Nobel. So yes it’s the same
That is nothing compared to past controversies.
People left the assembly and resigned when it was awarded to Kissinger and Arafat in the past. Regret is way milder than calling the recipient a terrorist on the floor of the award ceremony.
And if the real Nobel prize doesn't want the confusion around its name to happen... it should do something about it?
which is why he got it based on his plans and not his actions
> and Libya does actually cancel every point you mention by the way.
It really doesn't. Let's begin with the main reasons: he was awarded the prize for nuclear proliferation agreements and a new American policy in the Middle East. Libya is not a nuclear power, and it's in North Africa, not the Middle East.
Secondly, the military intervention in Libya came at the behest of a UN Security Council resolution that put NATO in charge of securing the no-fly zone to prevent Gaddafi from bombing his own citizens, after he had shot protestors during the Arab Spring. The NATO mission was led by France. The USA's involvement ended the day the UN Security Council ended the mission, despite the new Libyan government wanting them to remain. It is not Obama's fault that half the Arab world exploded in protests in 2011, or that the UN voted to intervene, or that the French-led mission was a bit of a clusterfuck. So no, Libya does not affect any point I mentioned, or any of the reasons for the committee to vote for him years earlier.
> it's actually not hard to have presidents not start wars at all- both presidents since Obama did just that.
Trump started a war; Iran just didn't follow through. Killing Soleimani was a casus belli, and Iran had every right to retaliate against America. The fact they didn't does not somehow exonerate Trump from his actions. That was way more belligerent than any action taken in Obama's 8 years.
Biden did not start any wars, but he 100% would have intervened if ISIS had begun under his presidency, the same way Obama did. Obama did not start any war against any country; he just had missions in countries America was already in, like Afghanistan, or contributed to international efforts like the Syrian civil war, or the Libya intervention after the UN resolution on Gaddafi.
His reputation as a warmonger is artificial, designed by the same people who told Trump that if you don't test for COVID you get fewer cases. America started reporting drone strikes less often, but carried them out more often under Trump, for example. It's the same sleight of hand people use to say Sweden is worse off because it has more rape cases: they simply report them more often. Obama was more open than later administrations about his interventions; that does not mean they happened more or less often.
> it should do something about it?
They did not award it to Gandhi and gave it to Kissinger. The fact people still care about that award is bonkers
Losing the TPP (minus the IP parts)/Asia pivot and the shift of focus away from nuclear non-proliferation are terrifying. Obama is directly the reason why Myanmar had its democracy for as long as it did, and most people in South East Asia have not found anyone nearly as inspirational as him from America since 2016, and likely won't for a while longer.
Obama was awesome, and his legacy has been unfairly maligned. He was not the "warmonger" president that revisionists like to portray him as.
It's deliberate. Conservative PACs designed that legacy and pushed it hard. Trump quickly stopped reporting drone strikes, so that he could pretend Obama was the big bad shooting at everyone. Not reporting != not happening.
> Losing the TPP (Minus the IP parts)
I actually see the point of the IP parts. It's a complicated mess, but China has abused IP in the past, so being able to sue governments has its uses. For example, when Lenovo was accused of stealing IP from HP, the CCP bought stock in Lenovo and made it impossible to take them to trial. Those kinds of abuses are an issue when you try to promote fair competition given high R&D costs.
Obviously the can of worms it opens is huge and an issue in itself, but I see the point in why it was added to the TPP agreement and can't imagine how hard it was to put that in, before Trump came and broke the whole thing.
> Obama was awesome
Dealing with the worst recession in a century, passing the largest US healthcare change in history, preventing the Arab Spring from exploding everywhere, stopping ISIS, the shift to the Pacific, etc. There are so many achievements that it's hard to single them out, especially when after all that came a circus clown who would salute North Korean generals.
:/
Also, WW2 being so utterly destructive, back to back after an arguably even worse global war, skews the stats a little.
But it triggered WW2, because the treaty was too hard on Germany, and crazy people had their soil prepared for their madness.
No, he didn't win the award, because MAD doctrine (aside from it being immoral) doesn't actually work in the real world.
It's an idealized model based on game theory, which doesn't deal with pesky complexities such as irrationality, salami tactics, short-range CBMs, anti-missile defenses, tactical nukes and so on. (That's why many of these things used to be banned by treaties, to continue to pretend that MAD is actually required for peace. In reality many nations do not have nukes and live in peace.)
Not many of them are superpowers, or strategic interests of superpowers. See Taiwan, a country that until recently felt safe and at peace, and which no longer feels unthreatened.
Most studies show that MAD allows for strategic peace for large superpowers and more regional wars for smaller countries. Ultimately it still decreases overall violence under all empirical studies on the subject.
The point I was making, though, was that the achievements of MAD are not measured when giving the award. However, Israel and Palestine sitting down to talk in the 90s was, despite the talks ultimately going nowhere and things being worse now than before the Nobel Peace award.
If anything it stains the reputation of the Nobel prize to me. How seriously can you take the Nobel committee after this?
Just one of the many things Obama did that upsets me so much. The precedent he set with that is criminal.
Of course I’m against terrorism, but our government MUST NOT have the right to classify Americans as terrorists and just execute them without a trial—via drone strikes!
Most Americans likely don’t even know about what happened to the al-Awlaki’s, which is unfortunate.
A bit different from "started a war".
https://m.youtube.com/watch?v=zmIUm1E4OcI
so you can arguably add ukraine crisis to that list
I don't think he started any new wars, but he inherited some and continued. Anyway, the point here should be the absurdity of a lot of Nobel awards and that stands - especially in his case.
I mean Trump was nominated for the award for fuck's sake! More than 2 or 3 times iirc. So anyway.
Obama specifically won the Nobel Peace Prize for talking about his "vision of a world free from nuclear weapons" as a candidate. As President, he initiated a massive program to upgrade the US' nuclear arsenal. It made a complete mockery of the Nobel Peace Prize, though Kissinger also won the Nobel Peace Prize, so it's not as if the prize has any credibility anyways.
Either way Libya operation was spearheaded by France with Obama joining only reluctantly later.
The Syrian Civil war was clearly (in parts) engineered by the west. Here is some evidence.
- Western government spokesperson in 2003: https://wikileaks.org/clinton-emails/emailid/18328
- In 2014, the West officially intervened in the Syrian civil war: https://en.wikipedia.org/wiki/US_intervention_in_the_Syrian_...
- Western government spokesperson in 2018: https://www.washingtonpost.com/news/global-opinions/wp/2018/...
- As of 2024 the West still has at least 1000 military personnel in Syria: https://theconversation.com/us-military-presence-in-syria-ca...
See the sibling comment for human toll perspective.
A direct American intervention in Syria probably would have made things even worse. Droning Assad, as you suggest, probably would have led to an even greater amount of chaos (besides being totally illegal). It's bad enough as it is that the US funded Sunni extremists in Syria.
Don't see the point arguing with you further. Some day both Putin and Assad are going to be dead and I hope they suffer in their last minutes. I will be cheering while you will be mourning your tyrants.
Just imagine the chaos in Syria if the Sunni extremist groups that the US supported had won. How would the various religious minorities, like the Shiites, Alawites and Christians, have fared? What's the chance that the Sunni extremists would have carried out genocide against religious minorities? It's one thing to say that Assad is a tyrant, but another to say that everything would be better if the US toppled him.
In Iraq, supporters of a US invasion made the exact same argument. "Saddam is a tyrant? Why don't you want to get rid of him?" The US toppled him, and half a million people died as a result.
Your analysis - everything will be better if the US topples tyrants (and realistically, empowers people who might be even worse) - is very simplistic, and has a terrible track record in the real world.
https://en.m.wikipedia.org/wiki/Withdrawal_of_United_States_...
> The United States completed its prior withdrawal of troops in December 2011, concluding the Iraq War.[9] In June 2014, the United States formed Combined Joint Task Force – Operation Inherent Resolve (CJTF-OIR) and re-intervened at the request of the Iraqi government due to the rise of the Islamic State of Iraq and the Levant (ISIL).
> On 9 December 2017, Iraq declared victory against ISIL, concluding the 2013–2017 War in Iraq and commencing the latest ISIL insurgency in Iraq.
Perhaps those troops should have been withdrawn for the second time in early 2018. Alas, it took place after messier circumstances.
> On 31 December 2019 through 1 January 2020, the United States Embassy in Baghdad was attacked in response to the airstrikes.[6] On 3 January 2020, the United States conducted an airstrike that killed Iranian Major General Qasem Soleimani and Kata'ib Hezbollah commander Abu Mahdi al-Muhandis.[6] Iraq protested that the airstrike violated their sovereignty.[13] > > In March 2020, the U.S.-led coalition, Combined Joint Task Force – Operation Inherent Resolve (CJTF–OIR), began transferring control over a number of military installations back to Iraqi security forces, citing developments in the multi-year mission against the Islamic State of Iraq and the Levant (ISIL).
Or perhaps the second withdrawal has never actually completed.
> In February 2021, NATO announced it would expand its mission to train Iraqi forces in their fight against ISIL,[14] partially reversing the U.S.-led troop withdrawals. In April 2021, U.S. Central Command stated that there were no plans for a total withdrawal of U.S. forces from Iraq, citing continued threats posed by the ISIL insurgency and Iran-backed militias.[3]
The only Nobel prize that is separate is the Economics one, which was established much later and has no connection to Alfred Nobel (it is paid for by Sweden's central bank instead of the Nobel estate). But even that one is administered by the same Nobel foundation.
Being nominated only means that one of the thousands of people allowed to nominate candidates wrote your name on a piece of paper and mailed it in. There is at least one right-wing Swedish politician who has been sending in Trump's name every year for a while now.
The Nobel peace prize committee is not really responsible for nominating candidates[1], only for selecting a winner from the list of nominated candidates.
[1] Although I believe they are allowed to suggest names.
Bertrand Russell got the Nobel prize in literature.
Daniel Kahneman got the Nobel in economics.
- Wikipedia
(Admittedly, even more broadly, the prize was the Nobel Committee wanting to acknowledge his leadership in WW2, but still.)
Not against nor in favor, it was just an unexpected awardee.
Content of his character indeed.
The list of war crimes I can pin on US during that time is mostly indefinite imprisonment in Guantanamo if you allow for the efforts Obama made to reduce torture.
https://www.hrw.org/news/2017/01/09/barack-obamas-shaky-lega...
Drone strikes are not war crimes according to this definition:
Bit early for this very Hacker News type blurt.
Eg: Personalized medicine, predictive medicine, protein folding, climate modelling, smart grids, fraud detection, disaster response, food production modelling, etc.
So then wait until those promises have been fulfilled, as has so often been the case in Nobel prizes. Remember Higgs?
But the negative effects have been clear. Might just as well give the Nobel Peace Prize to Zuckerberg.
Always? Like atomic bombs?
> His work in ML sets the course for a better future
That very much remains to be seen, doesn't it? So far, the negatives outweigh the positives.
I disagree that the negatives outweigh the positives. Spellcheck, Google Maps traffic, and electricity distribution are three applications I've used this morning. We don't tend to think about the successful applications, instead focusing solely on negative uses like adtech.
Not really. There's no need to extract Pu-239, and it's quite a step to actually do that and then build a device that can create a large yield.
> Spellcheck
I wrote (several) spell checkers 30 years ago. One of them might still be in use. It's not deep learning.
> We don't tend to think about the successful applications
That doesn't take away the negatives, and certainly doesn't outweigh them at this moment to warrant a Nobel prize. In physics.
We like to act like it's a new abstraction entirely, but everything about code is predicated on physics and earlier associated works.
This conspiracy theory makes no sense. Nobel prizes are awarded based on someone's life's work.
It's the textbook definition of a conspiracy theory, isn't it? I mean, a group conspiring to not awarding the most prestigious prize in science to someone who deserved it because of who their employer was, and suddenly awarding it once he switched employers?
> But I don't get the impression there are lot of fans of Google in the Prize Committee.
This is a conspiracy-oriented line of reasoning. Who anyone's employer was is something that never surfaced when discussing Nobel prizes. Suddenly it became the basis of a theory on how people conspired to first not award it and afterwards award it, and somehow the guy's accomplishments don't even register in the discussion.
That's what these conspiracy theories bring to the table.
Discovering breakthroughs in machine learning is a profound achievement and deserves to be recognized. Wielding powerful tools against humanity for the sake of money, not so much. But, like I said, I could be dead wrong, and this is probably why I wouldn't be a good person to serve on one of these committees.
It's the company that didn't see the potential of Transformers, and that presented a half-assed Bard when LLMs were already in production in other companies.
Thanks to AI, you now only have to ask any GPT for the source code of the universe to get the code. Since physics is now a solved problem, we should recenter ourselves on more important questions, like why did AI create the universe?
Hopefully AI will have an answer soon.
ChatGPT-4o wasn't keen about it either.
Perplexity.ai told me it is what it is, it's a great idea and it's fine with it.