So what have they been doing to nuke the csam images, editing the database directly?
Often just nuking all image uploads made during a certain time period. Which is why old image threads in Lemmy have time periods littered with broken images.
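The "nuke a time window" approach amounts to something like this sketch (purely illustrative, not an official tool; the media directory path and using file mtime as the upload time are assumptions for the example):

```python
"""Illustrative sketch: delete every file in a media directory whose
modification time falls inside the attack window.  Treating mtime as
the upload time is an assumption for this example."""
from datetime import datetime
from pathlib import Path


def nuke_window(media_dir: str, start: datetime, end: datetime) -> list[str]:
    """Remove files last modified between start and end; return their names."""
    deleted = []
    for f in Path(media_dir).rglob("*"):
        if not f.is_file():
            continue
        mtime = datetime.fromtimestamp(f.stat().st_mtime)
        if start <= mtime <= end:
            f.unlink()
            deleted.append(f.name)
    return sorted(deleted)
```

The obvious downside, as noted above, is collateral damage: every innocent image uploaded in the same window is destroyed too, which is why old threads end up littered with broken images.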
I don’t understand why Lemmy needs to have a built-in image server at all. Reddit didn’t have one for the longest time and it was fine. Sure, I don’t think anyone would be particularly happy with going back to Imgur etc., but it doesn’t seem worth the trouble.
It’s a trade off for us.
You risk CSAM, and have to shoulder the storage costs.
But you also help to reduce link rot, as the images are kept on the site, rather than an external image host that might explode/go VC one day.
Some instances do just disable the image server part (I think lemm.ee used to, and it still only allows small images?)
I mean I don’t know why we need images at all, this stuff worked fine when it was just a BBS
Uphill both ways.
They definitely should remove it, at least until moderation tools are available.
Yes
Often they delete all images uploaded during the time frame of a CSAM attack, as that has been the only really feasible way to ensure no images are left behind. That said, I think a few instances have started using AI detection methods to remove images like that automatically (read up on that here and here). Also, pict-rs now has a log linking uploaded images to the user who uploaded them, so images can now be purged along with the user.
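To show why that upload log matters, here's a minimal sketch (the log format and all names are made up for illustration; the real pict-rs log will differ): once every image id maps to its uploader, purging a user can take all of their uploads with it.

```python
"""Sketch: with an upload log mapping image id -> uploader, a per-user
purge becomes a simple lookup instead of a time-window nuke.
The dict-based log format here is an assumption for the example."""


def purge_user(upload_log: dict[str, str], banned_user: str) -> set[str]:
    """Return the image ids to delete along with the banned user."""
    return {img for img, user in upload_log.items() if user == banned_user}
```

Compare that with the old situation, where the only selector available was "everything uploaded between time A and time B".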
Admins can purge posts manually, which actually deletes them. Or they can use tools like db0's lemmy-safety, which tries to automatically search for CSAM and wipe it.
I think the problem here is that the user didn't finish their post, which means the photo was uploaded but never associated with a post, and therefore not purgeable that way.
That last problem was fixed a few versions ago. If you upload but don't post, the image will now be deleted after a while.
You can test this pretty easily by just leaving your browser open with an image uploaded and trying to post it later.
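The cleanup logic described above can be sketched like this (the 24-hour retention window and the data shapes are assumptions for the example, not the software's actual implementation):

```python
"""Sketch of orphaned-upload expiry: an upload never attached to a post
gets deleted once it is older than some retention window.  The TTL and
data shapes are assumptions for this example."""
from datetime import datetime, timedelta


def expire_orphans(uploads: dict[str, datetime],
                   referenced: set[str],
                   now: datetime,
                   ttl: timedelta = timedelta(hours=24)) -> set[str]:
    """Image ids unreferenced by any post and older than the TTL."""
    return {img for img, uploaded_at in uploads.items()
            if img not in referenced and now - uploaded_at > ttl}
```

That matches the browser test: an uploaded-but-unposted image survives only until the retention window runs out.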