What's amazing to me is just how long it took to get to first photo: I was working on the design of the LSST scope well over 10 years ago, and the project had been underway for some time before that. It's hard to keep attention on projects for that long when a company can IPO and make billions in just a few years.
Both modes of observation - surveys and targeted observations of individual objects - are necessary for astronomical research. Often, large surveys are used to scan the sky, and then targeted observations are used to follow up on the most interesting objects.
0. https://en.wikipedia.org/wiki/Sloan_Digital_Sky_Survey
Note that "seeing" means something very specific in astronomy: https://en.wikipedia.org/wiki/Astronomical_seeing.
It is surreal to see LSST/Rubin finally get first light.
Even more interesting to see who is still working on LSST, and who is not.
> "Probably within the first year we’re going to see if there’s something there or not,” says Pedro Bernardinelli, an astronomer at the University of Washington."
https://www.nationalgeographic.com/science/article/is-there-...
It is generally recommended to upvote a comment you appreciate rather than replying with a comment that doesn't add substance. It helps keep the signal-to-noise ratio higher.
Among many other uses: https://m.youtube.com/watch?v=h6QYjNjivDE
(And amazing production of the actual video as well)
Pretty sure you can see some kind of masking for satellites in some of the frames of the asteroid videos.
Doing some extremely rough math along these lines to double check myself:
* Gemini says that a dinosaur-extinction-level asteroid hits Earth about once every 100 million years. So in any given year that's a 0.000001% chance.
* Economists say a human life is worth about 10 million dollars. There are about 8 billion people on Earth. So the total value of all human life is $80,000,000,000,000,000 (or 8e+16).
* So in any given year, the expected value of asteroid protection is $800,000,000 (the likelihood of an impact that year times the value of the human life it would wipe out).
* The Guardian says the Vera Rubin telescope cost about $2,000,000,000 (2 billion).
By that measure, assuming the Rubin telescope would prevent a dinosaur-extinction-level asteroid impact, it pays for itself in about two and a half years.
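The back-of-envelope numbers above can be sanity-checked in a few lines (same assumed figures as the bullets, purely illustrative):

```python
# Same rough assumptions as above, purely illustrative.
p_impact_per_year = 1 / 100_000_000    # extinction-level impact ~once per 100 Myr
value_per_life = 10_000_000            # ~$10M statistical value of a life
population = 8_000_000_000

total_value = value_per_life * population                # $8e16
expected_annual_loss = p_impact_per_year * total_value   # $800M/year

telescope_cost = 2_000_000_000                           # ~$2B for Rubin
years_to_break_even = telescope_cost / expected_annual_loss
print(f"${expected_annual_loss:,.0f}/year, break-even in {years_to_break_even:.1f} years")
```

Which puts the break-even at 2.5 years under these assumptions.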
So you have to put the monetary value somewhere (although you're completely within your rights to question this specific amount).
My immediate reaction, though, is to doubt the mapping from dollars to value - e.g., the $10 million valuation of a human life, but also the valuation of all the things that yearly dollar cost could otherwise buy. Many of those things probably don't map well between true value and dollar cost (my go-to example is teachers, who fill one of the most critical roles in ensuring a functioning society, yet whose labor is typically paid far less than most other jobs).
And indeed, accounting for externalities (unmeasured or unmeasurable) is a tough economic proposition. If it weren't hard to account for every single variable, creating a planned economy would be easier (ish).
FWIW, there's a whole sub-field just dedicated to determining the value of life for various purposes (a starting link: https://en.wikipedia.org/wiki/Value_of_life). You may disagree with any specific assessment, but then you have to argue how that value should be calculated differently.
So you could actually make the argument that, to a country like the US, fully (100%) reliable asteroid protection is only worth something like $50M/year (even if an impact means full extinction)?
So if upkeep for a detection/deflection system costs more than that, we'd be "better off" just risking it?! That's insane. I would have expected this number to be much higher than $50M/year.
I also think that fully accounting for multi-generational consequences is murky/questionable and not really something we do even in much more obvious cases: eligible people deciding against having children are not punished for depriving future society of centuries of expected work-years, and neither are mothers/fathers rewarded for the reverse.
But even if you accounted for losing 3 full generations and some change (for biodiversity loss), that still leaves you in the ~$200M/year range.
We currently don't have reliable asteroid-deflection capability at any price (though it would be technically somewhat within reach). But just imagine a future NASA budget discussion that goes "we're going to have to mothball our Asteroid Deflector 3000 because it eats 5% of the yearly NASA budget and that's just not worth it" -- that could be the mathematically correct choice, which confounds me.
All this math assumes that the probabilities follow historic trends and are relatively static. With single-digit event counts, we really have no way of knowing what the actual likelihood of impact is. It could be 1 in 100 million, or it could actually be 1 in 1 million and we've been rolling a bunch of nat 20s.
Before we build out the Asteroid Blaster 9000, the first step is detection. With that in place, we get actually good risk and probability calculations. If the detector tells us "there's no object that will strike Earth in the next 1000 years", we can safely put no budget into asteroid defense. If, on the other hand, the detector shows "Chicxulub 2.0 will hit in the next 100 years", then the probability of an impact is 1 and the budget actually worth spending is going to be much closer to that $8e+16 number calculated earlier.
We can already say that we have a very complete catalogue of near-Earth objects anywhere near extinction-event / Chicxulub size (~10 km), have a majority of catastrophic / country-killer (~1 km) objects, and are digging deeper and deeper into regional / city-killer (~100 m) bodies.
What we don't have is comets. Comets on long period orbits just aren't readily detectable with this sort of survey unless they're quite close in to the Sun, and I don't think we have great statistics on frequency vs size, size being something that requires very specific radar cross-checking to establish with any confidence. A long-period comet or hyperbolic body has a potential impact velocity much higher than inner system asteroids, and impact energy scales with impact velocity squared.
You can fight this a bit by working in the thermal infrared, which really requires a specific sort of space telescope. But long-period comets and hyperbolic impactors will remain a probabilistic threat for the foreseeable future. I would say "be thankful that they're so rare", but data from observatories like Rubin on these bodies, taken during the parts of their orbits where they're close enough to the Sun to actually be detected, is necessary to statistically characterize their population with any confidence.
I also believe the approximate bounds we have on impact probability are good enough for this estimate and quite unlikely to be off by a factor of 100, because we can estimate both the size distribution and the impact likelihood from craters (on Earth and the Moon): if the impact rate for >10 km objects were higher than once per million years, we would expect to see a hundred times more craters of the corresponding size.
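The velocity-squared point from a couple of comments up is easy to illustrate. A rough sketch, with illustrative sizes, densities, and speeds (none of these are measured values):

```python
import math

# Illustrative numbers only: a "typical" rocky near-Earth asteroid impact
# at ~20 km/s vs a long-period comet at ~60 km/s. KE = 1/2 m v^2.
def impact_energy_megatons(diameter_m, density_kg_m3, velocity_m_s):
    radius = diameter_m / 2
    mass = density_kg_m3 * (4 / 3) * math.pi * radius**3   # sphere
    joules = 0.5 * mass * velocity_m_s**2
    return joules / 4.184e15                               # 1 Mt TNT = 4.184e15 J

asteroid = impact_energy_megatons(1000, 3000, 20_000)  # 1 km rocky asteroid
comet = impact_energy_megatons(1000, 600, 60_000)      # 1 km icy comet
print(f"asteroid: {asteroid:,.0f} Mt, comet: {comet:,.0f} Mt")
```

At the same diameter, three times the speed gives nine times the energy per unit mass, so even at a fifth the assumed density the comet still carries roughly 1.8x the impact energy.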
We already have tens of years of certainty with the current observations. Most of the uncertainty comes from interactions with unknown objects; as the mapping of objects improves, our predictions will become much better.
The other thing to consider is that large objects will have much better certainty. A 10 km asteroid won't be deflected (much) by colliding with a hundred 1 m asteroids; it will only be significantly perturbed if it hits or swings by something like a 1 km asteroid.
Rubin should in a pretty short timeframe (a few years) give us an orbital mapping of all the >1km asteroids, which is pretty exciting.
To be honest, I think the 1 km diameter range might still be a major fraction of the actual risk, because the estimates around "human extinction every 100 Ma" are probably much too pessimistic.
Tracking large near earth objects is wise for several global and domestic security reasons.
Have a great day =3
The image of the woman holding the model of the sensor is nice because it includes a moon for scale.
A question I was curious about: whether or not the focal plane is flat. (It is.)
This is an interesting tidbit:
> Once images are taken, they are processed according to three different timescales, prompt (within 60 seconds), daily, and annually.
> The prompt products are alerts, issued within 60 seconds of observation, about objects that have changed brightness or position relative to archived images of that sky position. Transferring, processing, and differencing such large images within 60 seconds (previous methods took hours, on smaller images) is a significant software engineering problem by itself. This stage of processing will be performed at a classified government facility so events that would reveal secret assets can be edited out.
They are estimating 10 million alerts per night, which will be released publicly after the previously mentioned assessment takes place.
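The core operation behind those prompt alerts, image differencing, can be sketched in a few lines. This is toy data with none of the real pipeline's alignment, PSF matching, or noise modelling; it just shows the idea of subtracting an archived template and flagging pixels that changed:

```python
import numpy as np

rng = np.random.default_rng(0)
template = rng.normal(100.0, 3.0, size=(64, 64))    # archived image of this field
new_image = template + rng.normal(0.0, 3.0, size=(64, 64))
new_image[20, 30] += 200.0                          # inject a transient brightening

diff = new_image - template                         # difference image
sigma = np.std(diff)
alerts = np.argwhere(np.abs(diff) > 5 * sigma)      # pixels that changed at >5 sigma
print(alerts)                                       # -> [[20 30]]
```

The hard engineering problem in the quote is doing the equivalent of this on gigapixel images, with real-world distortions, inside the 60-second budget.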
>This stage of processing will be performed at a classified government facility so events that would reveal secret assets can be edited out.
Interesting, I'm guessing secret spy satellites?
https://en.wikipedia.org/wiki/United_States_Space_Surveillan...
1: https://en.wikipedia.org/wiki/Space_Surveillance_Telescope
The thing that really saddens me is that the military gets to filter the data first and scientists only get to see the already manipulated data instead of a raw feed from their own instrument.
Both because they can't be made invisible, and because you need to avoid collisions.
However, what we strive for is being accurate to "if your eyes COULD see like this, this is what it would look like" - to the best of our ability, of course. We did a lot of research into human perception to create this, and tried to map the color and intensity information similarly to how your brain constructs an image from that information.
Let me tell you, I did not appreciate how deep a topic this was before starting, or how limited our file formats and electronic reproduction capabilities are for this. The data has such a range of information (in color and intensity) that it is hard to encode into existing formats most people can display. I really want to spend some time doing this in modern HDR (true HDR, not tone-mapping), where brightness can actually be encoded separately from the RGB values. The documentation on these (several competing) formats is a bit all over the place, though.
Edit: I wanted to edit to add, if anyone reading this is an expert in HDR formats and/or processing, I'd love to pick your brain a bit!
[0] https://aladin.cds.unistra.fr/AladinLite/?target=12%2026%205...
[1] https://rubinobservatory.org/gallery/collections/first-look-...
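For reference, the kind of brightness encoding discussed above looks roughly like the SMPTE ST 2084 "PQ" transfer function used by HDR10-style formats, which maps absolute luminance (up to 10,000 nits) to a nonlinear code value independently of any display tone-mapping. A sketch using the constants from the spec:

```python
# SMPTE ST 2084 (PQ) forward transfer function: absolute luminance in nits
# (up to 10,000) -> nonlinear 0..1 code value. Constants are from the spec.
def pq_encode(nits):
    m1, m2 = 1305 / 8192, 2523 / 32
    c1, c2, c3 = 107 / 128, 2413 / 128, 2392 / 128
    y = (nits / 10000) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

for nits in (0.1, 100, 1000, 10000):
    print(f"{nits:>7} nits -> code {pq_encode(nits):.3f}")
```

The curve spends most of its code values on the dim end, which is exactly the property that makes it interesting for astronomical data with a huge intensity range.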
https://aladin.cds.unistra.fr/AladinLite/?baseImageLayer=CDS...
Here's the canonical example: https://home.cern/science/computing/grid and a lab that didn't have enough horsepower using a different grid: https://osg-htc.org/spotlights/new-frontiers-at-thyme-lab.ht...
Personally, I have pointed the grid folks (I used to work on grid) towards cloud, and many projects like this have a tier 1 in the cloud. The data lives in S3, metadata in some database, and use cloud provider's notification system. The scientists work in adjacent AWS accounts that have access to those systems and can move data pretty quickly.
In my experience, there are many tradeoffs using cloud but I think when you consider the entire context (people-cost-time-productivity) AWS ends up being a very powerful way to implement scientific infrastructure. However, in consortia like this, it's usually architected in a way that people with local infrastructure (campus clusters, colo) can contribute- although they tend to be "leaf" nodes in processing pipelines, rather than central players.
That capability is coming with Starlink laser modules. They've already tested this on a Dragon mission, and they have the links working between some satellite shells. So you'd be able to offload data from pretty much anywhere Starlink has presence.
Second this, but other areas are of great interest too. Kuiper Belt discoveries and surveys FTW!
(For those who haven't noticed, you can simply paste 186.66721+8.89072, or whichever target you're curious about, into an astronomy database like Aladin[0], then right-click and choose "What is this?")
[0] https://aladin.cds.unistra.fr/AladinLite/?target=12%2026%204...
https://skyviewer.app/embed?target=186.66721+8.89072&fov=0.2...
https://skyviewer.app/embed?target=185.46019+4.48014&fov=0.6...
https://skyviewer.app/embed?target=188.49629+8.40493&fov=1.3...
https://skyviewer.app/explorer?target=187.69717+12.33897&fov...
They look like they're roughly in the same plane, but is that safe to assume, or could they be widely separated along the line of sight? Their similar sizes make me think they're close together, but I don't have any reason to be confident in that judgment.
https://noirlab.edu/public/images/iotw2421b/ ("thought to be right next to each other — both at a distance of about 50 million light-years")
In this particular case, RSCG 55 means a group of galaxies[3], of which NGC 4410 is one member. Apparently RSCG is the "Redshift Survey Compact Groups" (https://cds.unistra.fr/cgi-bin/Dic-Simbad?RSCG) so 55 is just an index number.
That's also the case for the 4410 after NGC, which in that case stands for "New General Catalogue". In contrast, the Sloan Digital Sky Survey gave NGC 4410 the name SDSS J122628.29+090111.4, where the numbers indicate its position in the sky.
The "index number" and the "position of the sky" are the two most popular naming strategies.
[1] NGC 4410 has 37, but the NGC objects are among the more popular: https://simbad.u-strasbg.fr/simbad/sim-id?Ident=+NGC+4410&Nb...
[2] https://simbad.u-strasbg.fr/simbad/sim-id?Ident=M87&submit=s...
[3] https://simbad.u-strasbg.fr/simbad/sim-id?Ident=RSCG+55&NbId...
Incredible.
The interesting thing about the spikes in our images is that they stay fixed in image-plane coordinates, not sky coordinates. So as the night sky moves (the Earth rotates), the spikes rotate relative to the sky, leading to a starburst pattern over multiple exposures.
There was a livestream presentation and press conference up on YouTube
https://www.youtube.com/live/Zv22_Amsreo?si=zQLeGfJokZoCPkji
At time 1:38:19 - one hour 38 minutes 19 seconds - into the livestream presentation, there's a slide that shows RGB streaks of fast-moving objects that were removed for the final image.
Those streaks are apparently asteroids.
Perhaps it is indeed a glitch or cosmic ray event.
(Is there a better URL for the slide deck?)
* the only objects in space are stars
* all stars are equally bright
* the average brightness is one that can be seen
(unless you roll all this into "homogeneous"?)
For observatories like Rubin, is there a plan for keeping them open after the funding ends? Is it feasible for Chile to take over the project and keep it going?
On a practical note, what happens to a facility like this if one day it's just locked up? Will it degrade without routine maintenance, or will it still be operational in the event someone can put together funding?
Arecibo was about 60 years old, for comparison, when it collapsed, but there are lots of facilities that are effectively ships of Theseus, with new instruments coming in over time to refresh the facility (when that stops happening, then you get concerned).
* https://petapixel.com/2025/06/23/hands-on-at-the-vera-c-rubi...
Not super technical, but a little higher level (with decent analogies to photography, for their traditional audience).
TL;DR: VCRO is capable of imaging spy and other classified US satellites. An automated filtering system (involving routing through a government processing facility) is in place to remove them from the freshly captured raw data used for the public transient-phenomena alert service. Three days later, unredacted data is made available (by then, the elusive, variable-orbit assets are long gone).
[1] https://www.theatlantic.com/science/archive/2024/12/vera-rub...
(via https://news.ycombinator.com/item?id=44352455, but no comments there)
What's that faint illuminated tendril extending from M61 (the large spiral galaxy at the bottom center of the image) upwards towards that red giant? It seems too straight and off-center to be an extension of the spiral arm.
EDIT: The supposed "Tidal tail" on M61 was evidently known from deep astrophotography, but only rarely detected & commented upon.
https://docushare.lsstcorp.org/docushare/dsweb/Get/LSE-163/L...
I'll see myself out.
This one's extra-special! The pattern is multiple + shapes, rotated and superimposed on top of each other. And they're different colors! That's this telescope's signature scanning pattern. I don't know the details, but it evidently takes multiple exposures, in different color filters, with the image plane rotated differently relative to the CCD plane in each exposure. I assume there's some kind of signal-processing rationale behind that choice.
edit: Here's one of the bright stars, I think it's HD 107428:
https://i.ibb.co/HTmP0rqn/diffraction.webp
This one has asteroid streaks surrounding it (it's a toggle in one of the hidden menus), which gives a strong clue about the timing of the multiple exposures. The asteroids are moving in a straight line at a constant speed, so the spacing and colors of the dots show what the exposure sequence was.
I think this quote explains the reason they want to rotate the camera:
> "The ranking criteria also ensure that the visits to each field are widely distributed in position angle on the sky and rotation angle of the camera in order to minimize systematic effects in galaxy shape determination."
https://arxiv.org/abs/0805.2366 ("LSST [Vera Rubin]: from Science Drivers to Reference Design and Anticipated Data Products")
LSST is an alt/az telescope. The Earth rotates. The sensor plane must rotate during the exposure to prevent stars from streaking, which it accomplishes via this platform: https://docushare.lsstcorp.org/docushare/dsweb/Get/Document-...
The fact that the sensor rotates without the spider rotating also spreads out the diffraction spikes.
But that rotation is limited, so between different exposures with different filters the image plane will be rotated relative to the sky.
As the quote says, the change in orientation has benefits for controlling systematics.
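For a sense of how fast an alt/az field rotates, the standard field-rotation formula can be sketched like this (illustrative numbers; this is not LSST's actual rotator control law):

```python
import math

# Standard alt/az field-rotation rate in degrees per hour:
# rate = omega_sidereal * cos(latitude) * cos(azimuth) / cos(altitude),
# with azimuth measured from north. Illustrative only.
def field_rotation_deg_per_hour(lat_deg, alt_deg, az_deg):
    omega = 15.04  # sidereal rate, deg/hour
    return (omega * math.cos(math.radians(lat_deg))
            * math.cos(math.radians(az_deg))
            / math.cos(math.radians(alt_deg)))

# Rubin sits at roughly -30 deg latitude; a field at alt 60, az 20:
rate = field_rotation_deg_per_hour(-30.2, 60.0, 20.0)
print(f"{rate:.1f} deg/hour")
```

At roughly 24 deg/hour, even a 30-second exposure picks up a couple tenths of a degree of rotation, which at the edge of a field several degrees wide is tens of arcseconds of streaking if uncompensated.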
However, the brightness of the diffraction effects is much lower than the light of the focused image itself. Where the image is itself dim, the diffraction effects might not add up to anything noticeable. Where the image supersaturates the detector (as can happen with a 1-pixel-wide star), the "much lower" fraction of that intensity can still be annoyingly visible.
There are projects (Dragonfly and Huntsman are the ones I know of) which avoid using mirrors and instead use lenses (which have their own issues) to reduce this scattered light.
My favourite fact about these in relation to astronomy is that you can actually get rid of the diffraction spikes if your support vanes are curved, which ends up smearing out the diffraction pattern over a larger area [2]. However this is often not what you want in professional astronomy, because the smeared light can obscure faint objects you might want to see, like moons orbiting planets, planets orbiting stars, or lensed objects behind galaxies in deep space. So you often want sharp, crisp diffraction spikes so you can resolve these faint objects next to or behind the bright object that's up front.
[1] https://www.celestron.com/blogs/knowledgebase/what-is-a-diff...
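The perpendicular-spike behavior is easy to reproduce numerically: in the Fraunhofer approximation the PSF is |FFT(aperture)|^2. A toy sketch with an assumed pupil geometry (not any real telescope's):

```python
import numpy as np

N = 512
y, x = np.mgrid[-N // 2:N // 2, -N // 2:N // 2]
aperture = ((x**2 + y**2) < (N // 4) ** 2).astype(float)  # circular pupil
aperture[np.abs(y) < 2] = 0.0  # thin horizontal spider vane

# Far-field (Fraunhofer) PSF is the squared magnitude of the pupil's FFT.
psf = np.abs(np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(aperture)))) ** 2

c = N // 2
vertical = psf[:, c].sum() - psf[c - 5:c + 6, c].sum()    # cut along the spike
horizontal = psf[c, :].sum() - psf[c, c - 5:c + 6].sum()  # perpendicular cut
print(vertical > horizontal)  # the spike lands perpendicular to the vane
```

Curving the vane effectively varies its orientation across the pupil, which is why it smears that concentrated spike into a broad, fainter halo.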
All the dim fuzzy objects are galaxies much further away.