• 0 Posts
  • 47 Comments
Joined 2 years ago
Cake day: June 16th, 2023

  • If you want an unsettling thing to think about, look into Calhoun’s rats.

    Social media has essentially overcrowded our functional distance with one another.

    You could be on a ranch with no one around for miles, and yet hundreds if not thousands of other people are directly interacting with you, giving and competing for dopamine hits.

    And just like the rats we now have people not leaving their domiciles, being apathetic, hedonistic, etc. We’re mentally falling apart because we’re just too overcrowded.

    “Tankie!” “Fascist!” “Heathen!” “Religious nut!” “Zionist!” “Antisemite!”

    (Eventually the rats end up eating each other.)

    Yet here we both are. That dopamine drip sure is nice…

    Don’t forget to like and subscribe!


  • A lot of victim blaming in this thread.

    I don’t agree with their theological views, and I don’t love that indoctrination is so often tied to humanitarian efforts.

    But these are people who were trying to help children in the middle of the second worst humanitarian crisis in the world right now, and one that very few people are paying much attention to as the other dominates news cycles and most of the Western world has just written off Africa as a whole (oops). (Also, I think this may be the only story about Haiti on Lemmy in recent history, in fact.)

    They didn’t ‘deserve’ to be brutally murdered for sticking around and not abandoning the children’s schools and homes they spent the past decades cultivating.

    They did a lot more to help people than I ever have, even if a key factor in their doing so was what I might consider delusional thinking.

    And so even if I’m not a fan of some aspects of their lives, I respect what they did do, and think it’s a bit fucked up to be making light of their deaths.



  • Not necessarily. There have been a lot of advances in watermarking AI outputs.

    As well, there’s the opposite argument.

    Right now, pedophile rings have very high price points to access CSAM, or they require users to upload original CSAM content, adding a significant motivator to actually harm children.

    The same way rule 34 artists were very upset with AI being able to create what they were getting commissions to create, AI generated CSAM would be a significant dilution of the market.

    Is the average user really going to risk prison, pay a huge amount of money, or harm a child (with an even greater prison risk) when effectively identical material is available for free?

    Pretty much overnight the CSAM dark markets would lose the vast majority of their market value and the only remaining offerings would be ones that could demonstrate they weren’t artificial to justify the higher price point, which would undermine the notion of plausible deniability.

    Legalization of AI generated CSAM would decimate the existing CSAM markets.

    That said, the real question that needs to be answered from a social responsibility perspective is what net effect CSAM access has on pedophiles’ proclivity to offend. If there’s a negative effect, then it’s an open and shut case that it should be legalized. If there’s a positive effect, then we should probably keep it very much illegal, even if that continues to enable dark markets for the real thing.