Otherwise everyday photography in public spaces would become legally risky or impractical, especially in crowded areas where avoiding all faces is nearly impossible and where the focus clearly isn't on the individuals but the landmark or scene itself.
If a deepfake is made of someone, that person is clearly the subject of the image/video, so the deepfake violates their privacy. This extra legislation would help in cases where the original image/video was taken with consent (so there was no privacy issue to begin with).
There's one photographer, François Brunelle, who has a project where he takes pictures of doppelgängers: http://www.francoisbrunelle.com/webn/e-project.html
Some more examples
https://www.wbur.org/hereandnow/2024/10/14/francois-brunelle...
https://www.reddit.com/r/BeAmazed/comments/1cimhns/canadian_...
AFAIK, copyright allows for independent creation (unlike patents), so unless one person had deliberately copied the other's appearance, there should be no problem.
I was just recently trying to find an associate from my past with an unfortunately common full name in his language, and was rather surprised at how many of the people depicted online under his name looked extremely similar to him, but on closer inspection were surely not him. How do you establish that a "deepfake" (what a dumb term) is similar to you specifically, and not just similar to anyone else?
Also, what if an AI is simply trained on images of you? The resulting image is likewise only inspired by you; it is not you, and it is not the same as using your images to graft a very similar facial feature onto a picture or map it into a video.
This is in fact what artists do in physical media: they look at something or someone, are inspired by it, and create an illusion that gives the impression of similarity, but it is not that thing or person. Will this new law possibly make art illegal too, because people have not thought this through?
On a digital screen, it is of course also not you at all; it is individual pixels that fool the mind, an illusion. It is really a pernicious muddling of reality and logic we have allowed to emerge, where the impression of a depiction is someone's property even though it is not that person, but only if it is a means of control, i.e. money. Mere peasants have no control over their image taken in public.
The Sphere in Vegas is another good example of this at large scale: each "pixel" is roughly 6" from the next and about 2" in diameter, for all intents and purposes a separate object, each just one cluster of colors in a matrix of individual LEDs. Up close it looks no different from any colored LED matrix; only when you stand sufficiently far away is your mind tricked into believing you see something that is not really there.
Frankly, these moves to "protect" are very much a direct assault on free expression, and may even create unintended consequences if art exceptions no longer apply. Is it now illegal for me to paint a nude? How about from an image that I took of someone? What if I do it really well from my own memory? What if I use a modeling tool to recreate such a nude as a digital 3D object from images, or even from memory? Is AI not also simply a tool? Or is it something more?
Presumably the only reason to use a deepfake of a specific person is to produce things specifically in relation to that person. Otherwise, why bother? So “is this about the individual or just coincidence?” isn’t likely to be a factor in any complaint made. This seems like a hypothetical rather than something that is likely to need answering in practice.
You presume both too much and not enough.
How are you going to do that unless it actually looks like you?
Then we see how they’re doing and decide - hey let’s not be like them.
Imagine I drew a Coca Cola logo in paint. Now I own the copyright to my picture of the Coca Cola logo. Next I stick it on my new brand of soda. That’s not allowed.
Coca-Cola owns the rights to their logo. You should own the rights to your face and voice.
Let's say, as an example, a married politician is having an affair with someone. Generally, news sites will publish photos with the face of the politician visible but will blur the other person. The former is clearly a person of public interest; the latter is not. Even if it's a photo taken in a public space.
Then, once someone uses their face to promote something, someone else can reuse that face together with whatever it promoted.
So I think the whole thing actually works in this particular case.
What specific behaviors does this forbid that weren't already forbidden?
But me (not really) on my website (I don't have one) where I trash politicians (I don't) and post a photo of said politician eating poop, that should be 'frowned upon'. (Or worse to shame an ex-gf or a colleague that 'won't yield to my sexual advances').
While reading the article, though, I thought of the cases where a paparazzo takes a photo of CelebrityA, then CelebrityA posts said photo to her Insta (without getting permission from the agency) and the agency sues her. Now (in Denmark) CelebrityA could sue the paparazzo for taking her photo in the first place (right?). This would protect people from uncomfortable photos.
What we'll probably see is, celebrity look-alikes will be contacted to license out their own "features".
Andy Warhol drew images of Campbell's soup cans in paint. https://en.wikipedia.org/wiki/Campbell%27s_Soup_Cans
He controlled the copyright to that painting. That's transformative, and the result does not meaningfully affect Campbell's ability to trade.
Quoting that Wikipedia link: "Although Campbell's never pursued litigation against Warhol for his art, United States Supreme Court justices have stated that it is likely Warhol would have prevailed.", with two quotes from two Supreme Court cases.
The second such quote is from Neil Gorsuch: "Campbell's Soup seems to me an easy case because the purpose of the use for Andy Warhol was not to sell tomato soup in the supermarket...It was to induce a reaction from a viewer in a museum or in other settings."
On the other hand, were Warhol to stick his copyrighted images on a new brand of soup, that would violate trademark law as it would confuse buyers.
While in the West people have no respect for other people and don't bother to blur anything. I think it would be better for everyone if you couldn't post photos of other people without their permission, and if annoying YouTubers went to jail.
Also when talking about some celebrity on TV they often show a drawing if they could not obtain rights to a photo.
Am I missing something, or is this just plain racism? There are lots of Japanese people who don't look Japanese, foreigners who are permanent residents, and Japanese-looking people who aren't Japanese. How is it respectful to protect only a certain ethnic group's privacy?
Businesses don't exist to respect or care about people; they exist to generate profit, so the idea of a business "respecting" something is not even realistic.
The law is what outlines the limits and guarantees basic rights to everyone.
Big overgeneralization. Here in Germany the "Recht am eigenen Bild" (literally, the right to your own image) has existed for decades, and similar to Japan, publishing images of others carries some pretty big limitations; without consent it is usually restricted to places or persons of public interest. To the chagrin of Google Street View and Twitch streamers.
Every burglar's wet dream. I have no idea what crime is like in Japan, but in the EU this is not an option.
"If, for example, you use a continuous recording of the road in which other vehicles' license plates are visible to defend yourself against a traffic ticket, you could be violating data protection, a serious offense that could be punishable by a fine of up to 300,000 euros."
https://rinckerlaw.com/name-image-and-likeness-how-to-protec...
But Jakob Engel-Schmidt has been talking about this in the Danish news since April/May, back when two opposing political parties created a computer-generated fake video depicting Mette Frederiksen saying things that would have outraged voters.
(I'm talking philosophically by the way, not legally)
For someone like Cormac McCarthy, whose sparse punctuation, biblical cadences, and apocalyptic imagery create an unmistakable "voice," the argument seems strong. His style is as identifiable as vocal timbre: readers recognize McCarthy's prose instantly, just as they'd recognize his speaking voice.
Is a Donald Trump impersonator (example[1]) copying the creative performance of Donald Trump (president)? What if someone did intend to create deepfakes of Donald Trump (president) and, instead of using images or audio of Donald Trump (president) as source material, used the Donald Trump impersonator as the source material?
I guess you worry about cases like: person A looks like celebrity person B and sells their image for, say, frosty frootloop commercials. As long as A is not impersonating B, i.e. claiming to be B, I can't see a problem. "Hi, my name is Troy McClure, you may know me for looking like Serena Williams." I guess it will be the decade of the doppelgänger agencies, like in Double Trouble ;) [1]
[1] https://www.imdb.com/title/tt0087481/?ref_=nv_sr_srsg_1_tt_8...
Same situation as today: if you have a lookalike out there who does pornography, and somebody you know runs across it, they'll think it's you and not much you can do about that except explain.
Dollars to doughnuts that this law is used against people not misrepresenting themselves, who happen to look like famous people.
There have been many cases where a company wanted to hire, say, actor X to voice their commercial, the actor refused, so they hired someone else with a nearly identical voice; the original actor sued and won (!!!!!) because apparently it's their "signature" voice.
I disagree, because that obviously means the other person now has no right to make money using their own voice, through no fault of their own?
But yeah, I'd imagine you'd have the same problem here: you can't generate a picture of, say, Brad Pitt even if you say "well, actually this isn't Brad Pitt, it's just a person who happens to look exactly like him" (which is obviously entirely possible and could happen).
(In music, some other cases have been about suspected misuse of actual recordings, e.g. a cover band being sued because the original musician believes they actually used one of their recordings, and disproving that can be tricky. I don't think that can as easily happen with look-alikes)
From Wikipedia: "Public figures can be photographed as part of their function or professional activity... A photograph of a public figure taken as part of his private life therefore still requires explicit authorization for publication. Thus, the Prime Minister cannot oppose a journalist photographing him at the exit of the Council of Ministers or during an official lunch, but he can prohibit the publication of photographs representing him at an event in his private life, such as a family reunion.”
https://fr.wikipedia.org/wiki/Droit_à_l%27image_des_personne...
Is it a moral right which cannot be transferred? Is there a time limit? Does it expire upon death? If not, who inherits the right?
Or is it more like an economic right, which may be transferred?
Or is the author using "copyright" in a very broad and non-legal sense?
California, for example, has laws concerning the misappropriation of likeness, but these are not copyright laws.
Does the proposed Danish law allow deepfake use by consent, and what counts as consent? If clause §123/43.b of the Microsoft MacGoogleMeta user agreement says "by agreeing to this service you allow us to make and distribute deepfakes" - does that count as consent?
There is no creativity involved whatsoever. Plenty of people look similar enough that they would share "copyrighted" features. Are cartoons of prominent people now copyright infringement? (Europe has a long history of judgments and precedents that prominent people can be parodied, etc.; how will that square with a fancy copyright protection?) You can in principle make money on your copyright, so if one twin "sells" their face rights and the other twin demands a share, then what?
Just make deepfakes a specific crime and do not mess with IP any further. It is already a mess.
Public photography? Does this mean your image can't be sold if taken in public? I'm sure there are many other scenarios that would be interesting to argue about as well.
Like how Arnie wouldn't allow his likeness in the C64 Predator game (which also had a backstory not in the movie; it blew my mind that games could build on movies, and that actors had rights to the likeness of a movie character they played).
Does this mean corporations can't CCTV me, like I can't film in a theater?
There are a lot of problems with this, and the real privacy benefits won't be enforced; we will see what happens.
Or if I get a tattoo with a logo, is that "my own feature," and do I now have copyright?!
This is like giving copyright to a name: there will be collisions and conflicts.
Punitive damages are very rare or non-existent depending on the country and the loser of the case usually has to pay the winning party's legal fees. There just isn't the incentive to sue someone over something silly like what you've mentioned.
I sure as hell don't.
It's also why the idea that "code is law," popular in certain circles, was always misguided.
We moved past content scarcity decades ago and we are squarely in the attention scarcity regime. We use copyright against itself to have open source. We prefer interactivity and collaboration, as in open source, social networks or online games. Copyright stands in the path of collaboration and interaction.
Will companies now need to license "the likeness" of people too? Will "likeness" be property to be sold or rented?
- either the famous person cannot use their look if a lookalike refuses to agree
- or they have to pay all lookalikes to use their own image
- or the lookalikes get less protection under this law
- a person might lose their look-rights if they change their appearance to look like someone else
- someone who wants to go into acting might not get hired if they look too much like a famous actor
They already do.