Conversation
-
Peep (peep@annihilation.social)'s status on Friday, 12-Apr-2024 00:31:08 JST Peep @mattskala @feld Most people never become "adults." They just end up with adult responsibilities in their lap.
-
Matthew Skala (mattskala@mstdn.io)'s status on Friday, 12-Apr-2024 00:31:09 JST Matthew Skala @feld I think a big part of the problem is failing to recognize that young adults really are exactly that, which means they *are* fit to assume some adult responsibilities long before they're 18. But the world is moving in the opposite direction from such recognition, especially, as you say, in North America.
-
feld (feld@bikeshed.party)'s status on Friday, 12-Apr-2024 00:31:10 JST feld @mattskala We're going to have to end the shame of nudity and sexuality, or we will have to give up our right to privacy from the government. The conservative/puritanical crowd seems to think suppressing and shaming children is a good thing, and that these suicides are the cost of keeping society under their control.
I'm pretty sure they're going to lose this battle. They can't keep the kids away from information anymore, and that information also includes sexual expression.
Perhaps if we just... taught them what's appropriate for their age instead of trying to shelter them until they're 18+... we might actually get somewhere.
This can be accomplished without sexualizing children. Europe doesn't really have these problems, and it seems their children grow up to be much more level-headed adults than we get in the USA...
The trick is for the USA not to get caught up in overcorrecting for this problem, which would only lead to a setback. But I don't even know where we begin when they've successfully convinced half the country that teachers are all evil and that we can't possibly homeschool all kids -
Matthew Skala (mattskala@mstdn.io)'s status on Friday, 12-Apr-2024 00:31:11 JST Matthew Skala @feld Maybe. But enough white girls (and far more boys, but they don't count) have now killed themselves after being blackmailed over nude images in private messaging that politicians think it is a problem that must be solved. And the networks want to show themselves to be acting responsibly, in a way that will prevent the politicians from regulating them.
It's a mess, and most of the things the various actors are doing are understandable, but I don't like it.
-
Matthew Skala (mattskala@mstdn.io)'s status on Friday, 12-Apr-2024 00:31:12 JST Matthew Skala @feld Now imagine the fun if they try to apply it to private messages and there are humans in the review loop...
-
feld (feld@bikeshed.party)'s status on Friday, 12-Apr-2024 00:31:12 JST feld @mattskala It's not unimaginable that someone would try to do that, but I don't see the point of policing private messages because ... they're private. If they were published to a large group or to the general public they should probably be moderated, but private messages? That's pretty extreme IMO, and will just push people to use DMs to share other forms of direct contact (Signal, WhatsApp, whatever) -
Matthew Skala (mattskala@mstdn.io)'s status on Friday, 12-Apr-2024 00:31:13 JST Matthew Skala @feld Even full-powered humans can't agree on what should "count" as nudity, let alone whether there should be exceptions to blurring and, if so, exactly where those exceptions are. So even if one managed to train a model to give 100% correct answers by somebody's standard - which I think is less possible, even today, than many people seem to think - it would be doomed anyway, because there is no correct, non-problematic answer to give.
-
feld (feld@bikeshed.party)'s status on Friday, 12-Apr-2024 00:31:13 JST feld @mattskala OnlyFans is special in this regard -- which most people aren't aware of -- because it uses AI for classification and prioritization of content in the publishing queue but has humans doing reviews.
Yes, there are literally tons of people whose full-time job is to review posts on OF and approve or deny them for publication. So when you post content on OF, it doesn't go live without some level of human review first.
Instagram will never pay to do that. It doesn't fit their business model, so they'll rely on AI alone, and the false positives and false negatives will reflect very poorly on them. -
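[Editor's note: the AI-triage-plus-human-review pipeline described above can be sketched roughly as follows. This is a hypothetical illustration, not OnlyFans' actual system; the classifier here is a trivial keyword stand-in for a real ML model, and all names are invented.]

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class QueuedPost:
    """A post waiting for human review; ordered by priority only."""
    priority: float                      # lower number = reviewed sooner
    post_id: str = field(compare=False)  # not part of the ordering

def classify_risk(content: str) -> float:
    """Stand-in for an ML classifier: returns a risk score in [0, 1]."""
    flagged_terms = {"nudity", "explicit"}
    hits = sum(term in content.lower() for term in flagged_terms)
    return hits / len(flagged_terms)

def enqueue(queue: list, post_id: str, content: str) -> None:
    """AI step: score the post and prioritize it in the review queue.

    Higher-risk posts get lower priority numbers, so reviewers see
    them first.
    """
    risk = classify_risk(content)
    heapq.heappush(queue, QueuedPost(priority=1.0 - risk, post_id=post_id))

def human_review(queue: list, approve) -> list:
    """Human step: nothing publishes without an explicit reviewer decision.

    `approve` is a callable standing in for the human reviewer; it
    returns True to publish a post and False to reject it.
    """
    published = []
    while queue:
        post = heapq.heappop(queue)
        if approve(post.post_id):
            published.append(post.post_id)
    return published
```

The key property, matching the description above, is that the AI only orders the queue; publication always passes through `human_review`.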
feld (feld@bikeshed.party)'s status on Friday, 12-Apr-2024 00:31:14 JST feld @mattskala why is that not an adequate answer? OnlyFans has an AI model that can detect if your vagina changes and flags the posts to make sure you aren't posting photos of someone else on your account. It has actually caused problems for people who had cosmetic surgery on their vulvas and then had to go through a new verification process to get their accounts back. -
Matthew Skala (mattskala@mstdn.io)'s status on Friday, 12-Apr-2024 00:31:16 JST Matthew Skala The glaring hole in this story is: how are they going to *know* which images "contain nudity"? It's all very well to say they'll blur those, but that's cart before horse.
One could as well say "we'll make our social network safe by banning all the bad people," which in fact they are also saying, and we know how well that always works.
Very likely, the purported answer is going to be two letters long, and that is not an adequate answer for obvious reasons.