Teachers in classrooms around the world have been sounding the alarm for decades about the loss of discipline, the loss of basic manners, the loss of respect for authority, the loss of empathy, the attention issues (both attention seeking and attention impairment), the increase in entitlement, the inability to cope with setbacks, the increase in illiteracy, etc. No one has listened. Then things go wrong and we blame the teachers.
It is ok for kids to be mischievous. It is ok for young adults to take the piss out of each other in a healthy way. But this looks to me like an education problem. That kind of value-based education where parents used to teach kids to be compassionate to each other, to respect each other, and to f**ng understand that if a moment of fun with some friends could ruin someone else's life, maybe it is not worth being the cool dude for 5 minutes. We have lost that. Most kids do not have these values these days.
But what did we expect? We have been systematically ditching those values. We have an older generation now that is selfish, egotistical, careless, and dismissive of anything that is not them and their belief framework, whatever that is. We have polarised to the point of hatred. And this is showing in society. It is showing in our kids.
I don't think tech is to blame. I don't think kids are to blame. This might be our fault.
No, I think it absolutely is "tech". Clicks, upvotes, and attention are the reward and coin of social media. It looks as though the more debasing the content, the higher the reward; tech is encouraging bad behavior.
In other words, people are raising winners and winners get to do what they want, like Ricky Bobby and Donald Trump.
Talladega Nights (2006) exposed this change in US culture twenty years ago. https://www.youtube.com/watch?v=eY5VNDvea1M&t=189s
No, we used to teach them to hide their cruelties and told them which targets were acceptable. Ask anybody who experienced being gay before 2000.
> ...
> That kind of value-based education where parents used to teach kids to be compassionate to each other, to respect each other, and to f*ng understand that if a moment of fun with some friends could ruin someone else's life, maybe it is not worth being the cool dude for 5 minutes. We have lost that. Most kids do not have these values these days.
> ...
> I don't think tech is to blame. I don't think kids are to blame. This might be our fault.
I don't think just one thing is to blame, and it's a fool's errand to try to pin it on one thing.
But here's a cause I think you missed: corporate capitalism, and the rise of dual-income families. Before you blame the parents, realize they're probably working for "the man" instead of being around their kids when teachable moments happen. The kids spend most of their days from infancy to adulthood in some kind of institutional setting (daycare to school) with the legal minimum adult-to-child ratio (business is business!), and the family all comes together exhausted at the ass-end of the day. We're in the second or third generation of that.
This situation is good for business. You take a working solution and split it into two problems that business can make money "solving": business gets more of the parents' labor, and more of their money, because they have to pay for childcare while they work.
Part of the story of the last 40+ years is business cannibalizing society (families, community, civil society) for private gains. And they protect those gains because the social problems are somewhat multifactorial, so they can always pay for propaganda and redirect the blame away from themselves. There's a reason why "blame the parents, make them fix it" is so tempting and "blame business, make them fix it" is taboo.
In the world we are in today, we should concentrate on the tactical wins we can use to reduce harm.
This comes down to adding basic guardrails on the generation of sexual material in foundation models (e.g., no sexual imagery can be created without age verification), as well as updating revenge porn and CP laws to close the now-pertinent loophole around digitally generated imagery.
Also, this is a global issue. Similar incidents have happened in South Korea, India, Thailand, Japan, etc.
Educating children has always been tough, but it used to be the case that for a parent to fight against the other influences their kids had growing up meant contending with a few friends. And making sure that their kids resonated with the parents' values instead of those of the "bad influences" used to be a full-time job.
It would have been UNTHINKABLE for me and my peers to do to another kid some of the shit teenagers find acceptable today. And that was thanks to the values our parents gave us. We were not special. We were the average kid being raised 30 years ago.
In the age of social media, a modern parent is up against hundreds of hours of video from the Andrew Tates of the world, the manosphere, the crypto grifters, the gym bros, the makeup divas, the get-rich-quick scams, etc. All examples of a society rewarding rubbish ideas. Hundreds of thousands of examples of people doing the wrong things, staying on top, and being granted Lambos, and money, and fame.
If you are a parent (not you personally, generic you), your job is to scream louder than all that influence. To spend every single minute, every single ounce of energy telling your kids that all that shit is not ok.
That doing the wrong thing is not ok, even if it nets out positive for you. And lead by example.
That, independently of your political views, the recent behaviour of elected officials is not acceptable. That those influencers are in the wrong. That they are not examples to follow. You need to stay on top of all that. You need to make clear to them that the people they are constantly exposed to are despicable human beings, and that this behaviour at scale would be our global doom.
If they get all that from 4 hours of Instagram and Facebook a day, 30 minutes of daily parenting is not going to cut it. You need to be the biggest influence.
And if you are not doing that, sorry, you are a shitty parent, and I blame you for the current state of our younger generations' principles. If you are too busy with your life to scream louder than the rest of the influences, you should have thought twice about what kind of resources (time, in this case) you had before having a kid.
This is an education problem. And I think we are beyond tactical wins to reduce harm. Either we start taking responsibility collectively as a society, or this cannot be fixed with laws, and rules, and age-checking OSes, and porn ID checks.
In a society where your average fellow citizen is a Karen, or a gym bro, or a political or religious fanatic who cannot name 3 continents, it follows that their kids are going to have the same level of literacy, critical thinking, empathy, and ideas.
It seems more about power and money than anything else, and the moral grandstanding/outrage is the manipulative icing on top.
EDIT: I am not saying any of this is ok. I am saying it's crazy that social media has been destroying young people's lives by encouraging them to sexualize themselves for an audience for decades, and nobody is willing to do something that would maybe actually help these kids, like banning social media for under-18s. I'm glad the tides are turning, though.
I think you might misunderstand the issue.
Adults posting their own nudes on social media, even to profit from it? At a glance, that seems OK. It might raise some moral/ethical considerations if you're religious, but those should be about "Should I post my nudes?" rather than "Should others be allowed to post nudes?" Alas, religious people often don't think like that.
People posting generated/fake nudes of others (sometimes children) on social media, even just for fun? At a glance, obviously not OK and obviously something that has to be addressed in some manner; at the very least, make it illegal if it isn't already.
Obviously OK, and the puritans have to learn to live with it.
Children are being sexualized and exploited left and right on places like Insta and YouTube, often even encouraged by their parents, as it can be lucrative. And this is totally okay and encouraged.
But when kids start sexualizing each other it's news and a big problem.
I'm saying there is a strange standard applied where some forms of exploitation are encouraged. And I don't get it. But I get why it's confusing the kids.
Huh, why would that be OK and encouraged? I haven't heard a single person, either on HN or around me AFK, say they're OK with children being sexualized and exploited on social media. Who on earth would even try to make such an argument?
https://www.latimes.com/entertainment-arts/story/2024-10-10/...
https://www.nbcnews.com/pop-culture/celebrity/piper-rockelle...
> In the 147-page complaint, the plaintiffs described Smith as a “mean-spirited control freak” and alleged she made comments about children’s genitalia, shouted obscene and sexually graphic remarks at them, encouraged them to be “sexy” and “sexually aggressive” in videos, and inappropriately touched the children on their legs, thighs and buttocks. One plaintiff said Smith told her she was mailing Rockelle’s underwear to a man who liked to “sniff” it.
"Society" is not a basis for morality. Consent or consensus are not a basis for morality. "Society", which means every one of us, has a duty to obey objective moral principles and learn to apply them within the concrete situations that make up our lives. Man is not free to confect his own moralities to taste, because he does not confect his own nature.
So, the first thing we must do is accept the objectivity of moral principles. Without doing that, we would only be left to debate matters of taste.
[0] Before the rabid dogs commence their barking, allow me to say that I do maintain that racism and bigotry are immoral, but I have a sound basis for maintaining their evil, unlike the moral relativist with his vacuous emotional ranting.
Or its simpler corollary: Don't do to others what you would _not_ want done to you.
Everything else derives from this.
And repressing bad behavior is a good thing.
The crux of the issue is that revenge porn laws do not extend to digitally manipulated images in most jurisdictions.
It's extremely easy to do all sorts of weird things with AI, all locally. Controlling that means controlling the hardware, something none of us wants, and it will only get easier, I suppose. So doing what I said earlier becomes even easier as this stuff gets automatically devalued by commoditization.
I'm not saying I know better than the people who want restrictions, and I'm not trying to offer yet another "ban all bans" opinion; I just don't see any other realistic solution. There are, however, many people much more knowledgeable than me in these matters, so maybe I'll be positively surprised.
> In 2017, Kekilli blocked her Instagram account from being accessed in Turkey, saying that users from that country had sent a multitude of abusive and threatening messages.
and this:
> A discussion has been trending on X (formerly Twitter) after a post featuring side-by-side images of Sibel Kekilli from the early 2000s and her later look in the popular series Game of Thrones. The caption read, “She was once a p** star, but HBO gave her the role anyway,” which has garnered close to 10 million views.
from https://www.yahoo.com/entertainment/celebrity/articles/inter...
Why should that even matter? So you can see her naked online... So what?! I'm reading "Sluts" by Beth Ashley, and she identifies these patterns in today's world perfectly, even when they're not obvious.
Males too: they value being tough too much, so they don't report abuse. One example is Chester Bennington (may he rest in peace):
> Bennington was afraid to ask for help, not wanting people to think he was gay or a liar, and the abuse continued until age 13
I have a son, and I would be devastated if I couldn't give my son the courage to report something like this. I'm thankful that in this day and age, the "gay" stigma is much less pronounced (like 0 in this wonderful country called Germany). We still have a lot to do, though!
So how do we solve this without making porn and nudity nothing sacred? I remember first coming to Germany and people being confused because I was too shy to change my clothes in the male changing room with everyone else. Then I realized... Everyone is naked, why should I be ashamed?
Wait, you buried the lede here! Instagram can block countries? How can I do that?
Of course I agree with teaching kids that people may have various views about nudity, but I think teaching them that taking nude photos of themselves is the end of the world and will permanently damage their reputation, as a means of preventing it, is absurd.
I think if anything the opposite would be the better solution – to teach kids that it's perfectly normal and respectable in this day and age for people to share nudes with each other, but that it's important to trust those you share the nudes with if you don't want them getting out.
Similarly with deepfakes: I don't think we should be telling kids how awful it is for them to be deepfaked, that they are a victim, etc., but rather that this is just something that's likely to happen these days, and that while it's disrespectful, and while they have a right to be angry, it's also not something to get overly worked up about.
I just think we have to be pragmatic about this. The only reason there's any shame in any of this is because we have a societal stigma around nudity. You're not going to get rid of deepfakes and leaked nudes, but you might be able to change attitudes such that it doesn't really matter.
These fakes depict young women with what looks like cum all over them, or posed to give a blowjob or be penetrated. Devaluing nudity does not change how people interpret porn.
I don't think the genie is easily returned to the bottle and the cure may be worse than the disease.
Well, I guess the argument goes that regardless of how much you lock down centralized platforms like Grok, these tools can run locally on a PC, so as long as people can do local inference with this tooling, it won't fully go away. That said, limiting the centralized platforms from generating nudes from uploaded images/photos feels like an obvious limitation they should implement, if they haven't already.
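(For a sense of how low the local bar already is: a minimal sketch of generic text-to-image inference using the open-source Hugging Face diffusers library. The checkpoint name and the GPU assumption are illustrative, and this is ordinary image generation, not any specific nudify tool.)

```python
# Generic local text-to-image inference; runs entirely on one's own machine.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # any locally available checkpoint
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")  # assumes a consumer GPU is available

image = pipe("a landscape photo at golden hour").images[0]
image.save("out.png")
```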
So if we consider that we could probably limit most of the "off-hand" stuff that happens on the platforms, but we cannot fully limit the offline ones, I'm guessing that the best solution here is merely "education", together with laws. AFAIK, it's already illegal to create deepfakes and share them with others, but the education around this probably isn't great, as it's such a new thing that hardly any adults understand it, much less younger folks.
It's a societal problem that I'm afraid doesn't have a technological solution, as far as I can tell, because the cat is out of the bag; whatever "solution" we come up with has to go beyond just technical capabilities/limitations.
One thing people might be missing is the whole "private" vs "sharing" part of this. People have been fantasizing about, and drawing pictures of, others (even strangers) for as long as humans have been around. What's new is that it's effortless to share those now, and they spread far, wide, and fast. I don't think we could possibly stop the whole "I don't like people fantasizing about me" part; it's just too human to get rid of. What we need to get rid of is the sharing part, which is what actively harms people.
It's also a matter of scale. If a dozen teenage boys in a class are generating nudes, they feel safety in numbers. If it's only the weird computer kid who can do it, it's far easier to address and far more likely that it'll blow back on him.
The two options are either the people in power standing up for the girls or giving the girls the power to deal with it themselves.
People who have power generally are benefiting from the structure in place and so don't want to change it and/or they don't want to do any more work. Expelling all the kids who create deepfakes would cause a lot of arguing from parents and people who are on the boys' side, and they just don't want to deal with that. It's easier to tell the girls to be quiet.
The other option is setting up a system that rebalances the power. For example, if a kid gets caught making deepfakes, give their victim access to every single thing on their devices: private messages, Discord chats, images, etc., and let the victim decide who and what to release that private information to. Not going to happen.
Another reason nothing is going to be done: if we teach 11-year-old girls that it's not acceptable for people to do this to them, they'll carry that forward through their lives, and a lot of people who find it gross to create fake porn of children are fine with doing it to older women, and they don't want to create women who make a fuss about it. There are a lot of people who think it's disgusting that I was sexually propositioned when I was 10/11 but think it's fine that I can't go for a walk in my neighborhood without being bothered now that I'm older.
Make it a crime. Prosecute it. This isn’t hard, unless you have a legislature that is incapable of passing laws. People are becoming fed up with this stupidity. Give it a few years and a few congressmen’s kids getting targeted as it becomes more mainstream, and things suddenly change. We’re at the leading edge of the stupidity; voters, families, and politicians aren’t angry enough yet. It will change; legislation is cyclical.
And I agree that giving up that personal information would be a nightmare. That's one reason it won't happen. It's just probably the only analogue to what the perpetrators do that would actually scare them.
We're likely just going to put up with this state of affairs.
It also highlights HN's demographics. What younger women feel is problematic is viewed as trifling by a number of younger or middle-aged men on HN (especially those without kids).
No healthy man of good will wishes to see women get hurt or disrespected.
1: Unintended consequences
2: That power-hungry people latch on to issues like this to further political agendas that have severe negative consequences; mostly by using "think of the children" to stifle important debate and discussion of unintended consequences.
If you think it’s a really big issue, why don’t you own the problem?
You could just go turn off the trillions in AI spending and destroy computing as we know it. No cop-outs, remember.
Instead I think a) kids shouldn't be on the internet and b) the public school system is a barely supervised dumpster fire.
If society wasn't so puritanical there would be no harm from it.
What is "puritanical" then, in your opinion? Is it being uptight? Do we abandon our efforts to uphold modesty and dignity of the human person? Do we stop teaching our children about boundaries and propriety, and their responsibilities as they mature? How should boys and young men conduct themselves around women? Should they learn about the concept of consent, and respecting others' wishes?
It's time we got back to sexual ethics, as it is absolutely not the case that anything goes or even that "consent" suffices to make something okay.
1. This has always been possible, but the bar has been lowered to barely above typing “her, but nude!”, as opposed to a talented Photoshop or pencil artist doing this previously. Is the issue scale?
2. Lots of things have been illegal and immoral without tools. Assault, for example. We don’t really have to deal with those things on a large scale. The difference here is that it’s effectively thought crime until distribution takes place, and then it’s just another form of assault, right? We already have laws on bullying and assault, no?
Like I said… skipping ahead, let me guess, For The Children; we must block local models, AI except for The NYSE Chosen Ones, encryption, and must have digital ID to use everything. Tell me this article is in favor of anything else.
As soon as you figure out that everything you read at large outlets, especially those owned by Condé Nast, is written directly to affect a stock price somewhere, it becomes a little exhausting.
Someone correct me if I'm wrong, but I don't think the comment section is meant for others to educate you about what the article says; you're supposed to read the article yourself first, then comment.
Everything you've said might be true, but are you actually adding insights to the conversation here, when you admit to not even knowing what the article is trying to say in the first place?
The article has a beyond-obvious take even from the headline: “IT’S SO BAD, We Must Do Something (tm)”. I read the article first; show me where my take is incorrect.
Most of us know what this is, and it’s chock-full of the usual moral pleas wrapped up in a nice package of “don’t mind the side effects”.
> Like I said… skipping ahead, let me guess, For The Children; we must block local models, AI except for The NYSE Chosen Ones, encryption, and must have digital ID to use everything. Tell me this article is in favor of anything else.
Please quote the parts of the article where they mention banning/blocking local models, saying that any AI except "The NYSE Chosen Ones" is OK, anything about encryption, or anything about digital IDs?
No, delisting online nudify apps will take care of 99% of this. There's no reason for them to exist.
Once you can run decent image generation models on your phone, what shall we do? Prohibit apps that allow you to run custom models? Add a mandatory safety check?
It seems like a pointless exercise to me; better to punish the actual crime than to try to regulate the tech.
There is significant societal value in preventing crimes rather than just punishing them.
And even if you outlawed the distribution of uncensored models, we face the next question when you can do the fine tuning directly on the phone and someone makes an accessible app.
Do we then prohibit apps that allow you to fine-tune a model, do we mandate safety scans of the images in the dataset?
> when you can do the fine tuning directly on the phone
Well you'd need millions of on/off nude samples to start with, so I don't think that's likely. And that's assuming we get a billionfold increase in mobile CPU performance or a billionfold decrease in fine tuning compute requirements.
You're reaching for things that, if they happen, are a decade or more out. It's ok to solve today's problems today. Perfect is the enemy of good.
Sure it's distasteful, but it's no different from cutting and pasting heads onto porn stars' bodies, as was done plenty of times before.
I'm not saying there's entirely no harm in that, it's obviously a form of bullying, but AI does not make it novel, or a crisis.
(Alas I guess that might be true for this lesson too!)
"hey kids, get used to being exploited sexually, as it would be too expensive to require massive multinational corporations to bother to regulate AI"
It is bullying.
But AI does not make it more than that, or a crisis.
edit: Oh I get it now, we have some lobby group posting here.
A bullying machine that makes this sort of extreme bullying trivial and leads directly to widespread use of the bullying machine is a massive problem.
That said, there's the question of where the line is drawn. If I clip from a biology textbook an image of a nude woman, and I clip the Gerber baby's head and paste it onto the nude woman's head, what is that? If I generate the face of a young boy, then draw a stick figure under it and give the stick figure a stick erection, what is that? I don't know. I mean these things are weird and offensive, but to what degree?
There were a few really big stinks about this too, with the school director getting involved.
I mean, this is not new. Just evolved.
You must not have kids. If you did, I’m speculating that you’d get it.
If you do have kids (or if you can empathize having kids), would you be ok with tech that super easily allows your kids’ peers to share/laugh about nudes of your kids?
You can kill with a gun, a knife, a fork. Removing the tool does not change the situation of being around a person who wants to kill you.
We homeschool our kids and they have grown up to be respectful people.
Furthermore, most people are not technical in nature and cannot tell the difference between deepfakes and real photos and videos in short bursts.
Basically, the friction needed to develop revenge porn has been dramatically reduced.
They don't need to. The point is that eventually, everyone will just assume it's a fake, exactly because you can't tell, and fakes are easier to produce and thus more common than real leaked nude photos and videos. At that point, a deepfake shouldn't be more socially damaging than a rumor.
Rumors can be exceedingly damaging - to the point of death - even without convincing photographic evidence.
After social media became common, it was also hypothesized that embarrassing stories dug up from somebody’s past would not be harmful anymore, but that future has never materialized.
this is not a good thing, nor should it ever be an inevitable one
It should be treated as CP and revenge porn, but the issue is that in a number of cases the legislation surrounding these doesn't treat digitally altered images as within the scope of CP or revenge porn.
Additionally, platforms like Grok are taking advantage of this ambiguity by arguing that they do not need to add guardrails.
Adding basic guardrails like not generating a sexualized image without identity verification or preemptively blocking questionable prompts would dramatically reduce this problem.
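As a rough illustration only, here is a minimal sketch in Python of what such a hosted-side gate could look like: block sexualized prompts that target an uploaded photo, and require verification for the rest. Everything here (the keyword list, `is_age_verified`, the request fields) is hypothetical and not any real platform's API; a production system would use trained classifiers, not a regex.

```python
import re

# Illustrative only; real systems use trained content classifiers,
# not a keyword list.
BLOCKED_TERMS = re.compile(
    r"\b(nude|naked|undress|nudify|topless)\b", re.IGNORECASE
)

def is_age_verified(user_id: str) -> bool:
    """Placeholder for a real identity/age-verification flow."""
    return False  # assume unverified until a verification flow completes

def may_generate(user_id: str, prompt: str, has_uploaded_photo: bool) -> bool:
    """Return True if the image-generation request may proceed."""
    sexualized = bool(BLOCKED_TERMS.search(prompt))
    if sexualized and has_uploaded_photo:
        # Never sexualize an uploaded photo of a real person.
        return False
    if sexualized and not is_age_verified(user_id):
        # Sexual content only behind identity/age verification.
        return False
    return True
```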