Mark Zuckerberg says sorry to families of children who died by suicide — after rejecting a suggestion to set up a compensation fund to help the families get counseling

CEOs of Meta, TikTok, Snap, Discord, and X testified at a hearing on child safety.

  • 31337@sh.itjust.works · 5 months ago

    Meta could’ve done a lot to prevent this. Internal documents show Zuckerberg repeatedly rejected suggestions to improve child safety. Meta lobbies Congress to prevent any regulation. Meta controls its algorithms and knows they promote harmful behavior such as dog-piling, but that behavior increases “engagement” and revenue, so Meta refuses to change them. (Meta briefly changed its algorithms for a few months around the 2020 election to reduce the promotion of disinformation and hate speech, because it was under more scrutiny, then changed them back after the election.)