I do agree that the original looks better, but the author of the post clearly prefers the modified version.
Pro tip: Digital sensors are much less forgiving than negative film when it comes to exposing highlights. With a bit of foresight they are best tackled at shooting time. Highlights from water/glass reflections can be tamed with a fairly cheap polarizing filter, and if you shoot raw you should do the opposite of what you'd do with negative film and always underexpose a scene with bright highlights (especially if the highlights are large or are in your subject of interest). Let it be dark: you will have more noise, but noise is manageable, and you won't have to invent what doesn’t exist in the parts that are most noticeable to the human eye.
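As a rough illustration of the tradeoff, here is a toy sensor model (all numbers invented, not from any real camera):

```python
import numpy as np

# Toy sensor: values are linear scene radiance, 1.0 = correctly exposed diffuse
# white; the sensor saturates ("clips") at full_well. Numbers are illustrative.
full_well = 4.0                                # ~2 stops of headroom above white
scene = np.array([0.05, 0.18, 1.0, 12.0])      # shadow, mid-grey, white, sun glint

def capture(scene, ev):
    exposed = scene * 2.0 ** ev                # exposure compensation in stops
    return np.minimum(exposed, full_well), exposed > full_well

for ev in (0, -2):
    signal, clipped = capture(scene, ev)
    print(f"EV {ev:+d}: signal={np.round(signal, 3)}, clipped={clipped}")

# At EV 0 the sun glint clips; at EV -2 it still fits, and the darker parts can
# be brightened later at the cost of visible noise.
```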
Huh? This used to be true twenty years ago, but modern sensors in prosumer cameras capture a lot more dynamic range than can be displayed on the screen or conveyed in a standard JPEG. If you shoot raw, you absolutely have the information needed to rescue nominally clipped highlights. You get 2-4 stops of latitude without any real effort.
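For what it's worth, pulling those highlights back is nearly a one-liner in most raw converters. A minimal sketch using the rawpy (LibRaw) package; the filename and the exact parameter values are placeholders, not a recipe:

```python
import rawpy

# Develop a raw file while reconstructing/preserving nominally clipped highlights.
with rawpy.imread("IMG_0001.CR3") as raw:
    rgb = raw.postprocess(
        use_camera_wb=True,
        no_auto_bright=True,
        output_bps=16,
        highlight_mode=rawpy.HighlightMode.Blend,  # blend/reconstruct clipped channels
        exp_shift=0.5,                             # pull exposure down ~1 stop
        exp_preserve_highlights=1.0,
    )
```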
The problem here is different. If the rest of the scene is exposed correctly, you have to choose one or another. Overexposed highlights or underexposed subject. The workaround is to use tone mapping or local contrast tricks, but these easily give your photos a weird "HDR" look.
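The simplest version of that workaround is a global tone curve that compresses highlights, something like a Reinhard-style operator (sketch only; pushed too hard, or applied locally per region, this is exactly what produces that over-processed look):

```python
import numpy as np

def reinhard(linear_rgb):
    # Map linear values in [0, inf) into [0, 1), compressing highlights.
    return linear_rgb / (1.0 + linear_rgb)

scene = np.array([0.02, 0.18, 1.0, 6.0])   # shadows, mid-grey, white, bright highlight
print(np.round(reinhard(scene), 3))        # highlights squeezed, shadows barely moved
```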
— If you think “clipping” and “nominal clipping” are somehow different things.
Clipping is binary. Either your camera’s pixel clipped (i.e., accumulated enough photons to get fully saturated, and therefore offers no useful information for debayering), or it did not. If it clipped, then you lost colour data in that part of the image. If you lost colour data, and you want a colour photo, then in almost all cases one way or another you will have to rescue it by conjuring up information for the photo to look authentic and aesthetically good. (A lot of raw development software does it for you by default.) If the clipped bit is small, it is easy to do subtly. If it is big, which is a real danger with sun reflection bokeh, then… whoopsie.
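If you want to check rather than guess, clipping is easy to measure on the raw mosaic itself, before any reconstruction has happened. A sketch again using rawpy; attribute names follow LibRaw, the filename is a placeholder:

```python
import numpy as np
import rawpy

# Count photosites that hit the sensor's saturation value.
with rawpy.imread("IMG_0001.CR3") as raw:
    mosaic = raw.raw_image_visible.astype(np.uint32)
    clipped = mosaic >= raw.white_level
    print(f"{clipped.mean():.4%} of photosites are clipped")
```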
— If you think modern sensors are safe from clipping.
This applies to any digital camera, from consumer to top professional models. Sensor bit depth is not even remotely enough to expose a scene with extreme highlights (e.g. sun or its reflections) without taking special measures regarding exposure at capture time. If you don’t tame highlights with a polarizing filter, you must dramatically underexpose the scene or they will clip.
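Back-of-the-envelope arithmetic (all figures assumed for illustration): normal metering leaves only a few stops of headroom above diffuse white, while a specular reflection of the sun can sit far higher than that.

```python
# Illustrative figures only, not measurements from any particular camera.
headroom_above_white = 3       # typical raw headroom over diffuse white, in stops
sun_glint_above_white = 10     # plausible brightness of a specular sun reflection
stops_lost = sun_glint_above_white - headroom_above_white
print(f"~{stops_lost} stops of that highlight clip unless you underexpose "
      "or cut the reflection with a polarizer")
```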
1. Clipping occurs in the scene-referred raw camera sensor pixel data at capture time.
2. Raw processing software fills in the highlight by supplying the missing colour information. Often this happens by default.
3. When you obtain a display-referred JPEG, the missing information for clipped highlights has already been supplied. Tone curves and other adjustments are applied after highlight reconstruction, making the final result look more organic.
In other words, with modern digital photography processing workflows you will never see any clipping on a histogram of the final JPEG (unless something went really wrong or it was done intentionally for a look), and that histogram is a poor basis for assumptions about whether clipping occurred at capture time.
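A toy example of why the JPEG histogram tells you nothing here: once the converter has invented values for the clipped glint, the tone curve places them a little below pure white, so nothing pins at 255 (numbers made up):

```python
import numpy as np

reconstructed_linear = np.array([0.93, 0.96, 0.98])  # filled-in values for a clipped glint
jpeg = np.round(np.clip(reconstructed_linear, 0, 1) ** (1 / 2.2) * 255)
print(jpeg)  # [247. 250. 253.] so the histogram shows no clipping at all
```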
What you can safely assume, however, is that a normally exposed digital photo featuring any sun reflections from water or glass will almost certainly have blown highlights.
Hence my original advice: don’t leave it up to ML when a little forethought can do wonders in terms of preserving information.
Regardless, it’s not generative AI making up details. The differentiated pixels are there in the camera jpeg even if traditional image processing techniques have to be used to make them visible onscreen. The complete structure of the feathers that isn’t visible in the camera jpeg, for example, is plainly visible just by adjusting the contrast.
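You can verify this with a plain levels stretch on the out-of-camera JPEG, no generative model involved. A Pillow/numpy sketch; the filename is a placeholder:

```python
import numpy as np
from PIL import Image

img = np.asarray(Image.open("bird.jpg").convert("L"), dtype=np.float32) / 255.0
lo, hi = 0.85, 1.0                          # stretch only the brightest tones
stretched = np.clip((img - lo) / (hi - lo), 0.0, 1.0)
Image.fromarray((stretched * 255).astype(np.uint8)).save("bird_highlights.png")
# If the feather structure shows up here, the detail was in the JPEG all along.
```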
Edit: I'll note some new models (SD3 and Flux) have a wider latent dim and seem to suffer from this problem less.
AI-generated images are also biased strongly towards medium lightness. The photographer's tone-curve adjustments may simply give it that "look".
I’ve used Photoshop’s generative fill many times on singular images and there’s no loss on the ungenerated parts.
What I'd like to know though... is how is the model so bad that when you tell it to "remove this artifact" ... instead of it looking at the surroundings and painting over with some DoF-ed out ocean... it slaps an even more distinct artifact in there? Makes no sense.
Ironically, some older SD1/2-era models work a lot better for complete removal.
In this case there are better tools for the job anyways. Generative fill shines when it’s over something that’d be hard to paint back in - out of focus water isn’t that.
As for whether it can be compromised... Probably? It sends all or some of your photo to a remote server, so that data can certainly be taken.
I must be missing something obvious but I don't see it mentioned in the submission or comments, or perhaps I'm not making the connection
That’s just how AI generative fill works. You keep running it until it looks how you want.
Anything but the original. That the original might have shimmered (or otherwise looked) like this Bitcoin icon should have no bearing on what it chooses to put there.
https://www.reddit.com/r/photoshop/comments/1e5nyt7/generati...
I also expect there's a bug involved: the user selects a circular mask, but the outer edge is slightly blurred (feathered), which, when multiplied with the white background, leaves a light circle-shaped contour. After that, Photoshop fills not a "hole in the sky" but a "light circle-shaped object" in the sky.
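The suspected mechanism is easy to reproduce with a couple of arrays (toy numbers): a feathered circular selection blended toward white over a slightly off-white sky leaves a faint bright disc, so the fill ends up conditioned on "light round thing" instead of empty sky:

```python
import numpy as np

size, radius, feather = 101, 30, 6
yy, xx = np.mgrid[:size, :size]
dist = np.hypot(yy - size // 2, xx - size // 2)
mask = np.clip((radius + feather - dist) / feather, 0.0, 1.0)  # 1 inside, soft edge

sky = np.full((size, size), 0.92)             # slightly off-white sky
composited = sky * (1.0 - mask) + 1.0 * mask  # selection multiplied toward pure white
print(f"brightening left behind: up to {float((composited - sky).max()):.2f}")
```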
Discussed 4 years ago: https://news.ycombinator.com/item?id=24196650
1. It makes people feel good because bad AI can’t take jobs away
2. It makes people feel bad because it’s further enshittification of experiences that are already getting worse
Web pages are universally accessible. Everyone has a browser on their device of choice. The web is 35 years old. Access to information is a solved problem.
Guess I'm just complaining. There are just so many links being submitted to HN these days that require accounts or specific apps installed. Feels like there should be a rule around this.
I don't have a lot of familiarity with app deep links, but my understanding is that deep links originally required registering a special non-domain identifier, separate from a normal web URL.
Somewhere along the way that changed, and now regular web links like this default to opening in apps if the user has them installed.
You might be thinking of protocol handlers, like oacon:// to open something in software installed to handle that (in this case, launching openarena to connect directly from a dpmaster mirror with such application links included). I don't think they were called deep links back then, just different protocols: the http in http://example.org is a protocol your browser is configured to handle, and ftp:// used to be as well.
These are still in relatively common use today, but on mobile devices it has become the norm to hijack specific domain names or even a path (e.g. F-Droid will try to handle repositories on third-party domains for you by trying to hook¹ any URL that contains */fdroid/repo/* -- so far, this has always been useful to me, but I can see the flip side). This link hijacking is often a pain for me as anyone linking to any Google product will make my phone try to open some Play Services component, which is largely not functional. I can't get rid of the system component (e.g. replace it with microG) without installing a custom ROM, which I can't do without getting rid of half the device's special features (no point having this phone then), but I also don't want it pinging back to the mothership so... a pain it shall be
As for your problem: reset the app's settings and it'll re-prompt you, the next time you click one of these links, to choose which app they should open in. It should do that any time there is (newly) more than one app that can handle a given URL.
¹ https://github.com/f-droid/fdroidclient/blob/be028d71c2a25b9...
I understand the confusion (and frustration), but this is just a normal URL. The .app is just an ordinary TLD (but one of the newer ones).
There's also the ability to register for a URL "scheme" (the bit that replaces "https"), which I believe is what you're thinking of; it does predate the https-based approach, but both have been around for a while. I'm guessing companies have just gotten more aggressive about using the https one.
Edit: and yes it is annoying, I've uninstalled the GitHub app because of this.
[1] https://developer.apple.com/documentation/xcode/defining-a-c...
Having an app installed but logged out is such a rare case, and one that’s easily solved.
I had the official bsky app installed, but I ended up with Graysky at some point and just forgot about it, and somewhere along the way my browser vendor (Apple) and the app vendor (bsky) decided that I would probably tolerate this horseshit. Reddit pulls the same shit but that’s extra special because it’s broken, so it goes to the App Store screen for Reddit and you just can’t load it at all without a laptop.
If you’re one of the people arguing this is cool you’re ugly and stupid like people who disagree with Linus.
This wouldn't even be CLOSED WONTFIX, there is literally nothing that anyone but you could do to fix this.
We're complaining about a completely normal URL that links to a normal webpage. This is very, very silly.
I have the Bluesky app installed and did not see a pop-up. It’s not “gaslighting”. It’s people having different experiences due to different OSes and settings.
It’s amazing how meaningless the word gaslighting has become.