The glaring hole in this story is: how are they going to *know* which images "contain nudity"? It's all very well to say they'll blur those, but that's putting the cart before the horse.
They might as well say "we'll make our social network safe by banning all the bad people," which in fact they are also saying, and we know how well that always works.
Very likely, the purported answer is going to be two letters long, and that is not an adequate answer for obvious reasons.
@feld Even full-powered humans can't agree on what should "count" as nudity, let alone whether there should be exceptions to blurring and, if so, exactly where those exceptions lie. So even if someone managed to train a model to give 100% correct answers by somebody's standard - which I think is less possible, even today, than many people seem to think - it would be doomed anyway, because there is no correct, non-problematic answer to give.
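To make that concrete, here's a toy sketch in Python (with a hypothetical `score_nudity` standing in for whatever model a network would actually deploy). Even a model with perfect scores only answers a numeric question; the contested questions, where the threshold sits and which exceptions apply, remain human policy choices.

```python
# Toy sketch: the classifier emits a score, but the blur decision
# depends on policy knobs that no model can choose for you.
# `score_nudity` is a hypothetical placeholder, not a real API.

def score_nudity(image_bytes: bytes) -> float:
    """Hypothetical model: returns a 'probability of nudity' in [0, 1]."""
    return 0.73  # placeholder; a real model would inspect the pixels


def should_blur(image_bytes: bytes, threshold: float = 0.8,
                exempt: bool = False) -> bool:
    """The model answers a numeric question; the policy questions
    (threshold, and what counts as an exemption, e.g. medical or
    artistic content) are decided by humans."""
    if exempt:
        return False
    return score_nudity(image_bytes) >= threshold


# Two sites using the "same" model reach opposite decisions purely
# through their policy settings:
img = b"..."  # stand-in for real image data
print(should_blur(img, threshold=0.8))  # False: permissive site shows it
print(should_blur(img, threshold=0.5))  # True: cautious site blurs it
```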
@feld Maybe. But enough white girls (and far more boys, but they don't count) have killed themselves after being blackmailed over nude images in private messaging that politicians now think it is a problem that must be solved. And the networks want to be seen acting responsibly, in a way that will keep the politicians from regulating them.
It's a mess, and most of the things the various actors are doing are understandable, but I don't like it.
@feld I think a big part of the problem is failing to recognize that young adults really are exactly that, which means they *are* fit to assume some adult responsibilities long before they're 18. But the world is moving in the opposite direction from such recognition, especially, as you say, in North America.