The ESRB has added:

“To be perfectly clear: Any images and data used for this process are never stored, used for AI training, used for marketing, or shared with anyone; the only piece of information that is communicated to the company requesting VPC is a “Yes” or “No” determination as to whether the person is over the age of 25.”

Sure, ok…

I don’t know what else to say about this; it will obviously turn into something else.

      • SatanicNotMessianic@lemmy.ml · 1 year ago

        From the description, it sounds like you upload a picture, then show your face to a video camera. It’s not like they’re going through FaceID, which has anti-spoofing hardware and software. If they’re supporting normal webcams, they can’t check for things like 3D markers.

        Based on applications that have rolled out for use cases like police identifying suspects, I would hazard a guess that

        1. It’s not going to work as well as they imply.
        2. It’s going to perform comically badly in a multi-ethnic real-world scenario, with unfortunate headlines following.
        3. It will be spoofable.

        I’m betting this will turn out to be a massive waste of resources, but that has never stopped something from being adopted. Several municipalities even had to ban police from using facial recognition, because the police liked being able to identify and catch suspects even when it was likely the wrong person. In one case I read about, researchers had to demonstrate that the software the PD was using identified several prominent local politicians as robbery and murder suspects.

            • Shikadi@lemmy.sdf.org · 1 year ago

              Yes, it does work like that in some cases. My comment is technically wrong, the best kind of wrong.

              As another commenter pointed out, the way they intend to do it sounds very much like the old way, which surprises me, because the hold-up-a-photo spoof has been a solved problem for a while.

      • Umbrias@beehaw.org · 1 year ago

        Plaster sculpt, add fake skin, add a small linear actuator for breathing simulation and small twitch motors under the skin, and run it under some alternating LEDs to simulate blood-flow coloration. Should defeat almost all facial recognition software. Might need some fake eyes.

        Or just wear makeup to an insane degree. Or return to the forests and live a much happier life.

      • Radium@sh.itjust.works · edited · 1 year ago

        I would bet every dollar I’ve ever made that you know absolutely nothing about how it works. You seem like someone who is barely technically proficient and likes to pretend that means they know how things work.

        I’m a software engineer and can confirm that you are absolutely fucking wrong on this one.

        • Shikadi@lemmy.sdf.org · 1 year ago

          I’m a software engineer and I work in machine vision hardware. I may have been lazy with my response, but I do know what I’m talking about. On some level I’m probably in a bubble, because I work close enough to the cutting edge that I wouldn’t expect any modern company to be applying such basic algorithms to a solved problem.

          I’m a software engineer and I can confirm that you are absolutely fucking rude on this one.

  • mindbleach@lemmy.world · 1 year ago

    Drink verification can.

    “Any images and data used for this process are never stored”

    Anyone who believes this deserves it.

    • Jamie@jamie.moe · 1 year ago

      Since it determines if you’re over the age of 25, maybe instead they could get a more accurate measure by having you drink a verification beer.

  • Jeena@jemmy.jeena.net · edited · 1 year ago

    I don’t think your face looks much different on the day before your 18th (or 25th) birthday than it does on the day itself.

  • Mozami@kbin.social · 1 year ago

    “To be perfectly clear: Any images and data used for this process are never stored, used for AI training, used for marketing, or shared with anyone; the only piece of information that is communicated to the company requesting VPC is a “Yes” or “No” determination as to whether the person is over the age of 25.”

    I’d have a hard time coming up with a better lie than this.

  • coffeeguy@lemmy.world · 1 year ago

    Pay: ESRB facial recognition + Denuvo system monitor + custom launcher with system privileges + game

    Pirate: game

    This type of stuff only punishes paying customers.

  • PlatypusXray@feddit.de · 1 year ago

    Can anybody actually remember voting for totalitarian control freaks who seem to be scared of people who are not under constant surveillance?

      • exohuman@kbin.social · 1 year ago

        While true, unfortunately the latest government spy bill is bipartisan. It would make end-to-end encryption for texts and chat illegal, using drug enforcement as the excuse.

    • reversebananimals@lemmy.world · 1 year ago

      Nearly a quarter of Americans say that a strong leader who doesn’t have to bother with Congress or elections would be “fairly” or “very good” and 18 percent say that “army rule” would be “fairly” or “very good.” More than a quarter of respondents show at least some support for either a “strong leader” or “army rule.”

      https://www.voterstudygroup.org/publication/follow-the-leader

      A disturbing minority of human beings unironically prefer being under a boot.

  • rickrolled767@ttrpg.network · 1 year ago

    Can people stop trying to throw tech at things where it clearly doesn’t belong? Seems like every time I turn around, people are trying to use AI for things with the expectation that it’s some flawless innovation that can do no wrong.

    And that’s not even getting into the privacy nightmare that comes with things like this.

  • RagingNerdoholic@lemmy.ca · edited · 1 year ago

    we won’t ever ever keep your pictures and stuff for the juiciest possible marketing fodder, we super duper pinky swear


  • SokathHisEyesOpen@lemmy.ml · 1 year ago

    “To be perfectly clear: Any images and data used for this process are never stored, used for AI training, used for marketing, or shared with anyone”

    Does anyone have some bridges for sale? I suddenly feel an urge to buy a bridge.

  • liara@lemm.ee · 1 year ago

    Because this strategy worked so well for determining individuals’ assigned sex at birth. What could possibly go wrong?

    • Dojan@lemmy.world · edited · 1 year ago

      It already has gone wrong.

      There’s a story about a gay couple here in Sweden. One of the men lived with his mother.

      One morning, around 3-4AM I think, a group of masked men went into his apartment and woke him up violently. They physically abused him, before they took him away.

      Eventually he was taken to an interrogation room where he was questioned about a child he had supposedly sexually assaulted.

      At some point they showed him pictures of him and this purported child, only said child was his very much adult, twink-ass boyfriend.

      He and his boyfriend had shared the images with one another over a chat service, like Kik or something, which some American organisation had gotten their hands on, and then forwarded to Swedish police.

      Swedish police then swatted him, and when they stood there with egg on their face the investigation was dropped. No repercussions for the police. None of the people who brutally assaulted the man got any sort of punishment, because he wasn’t able to identify any of them, since they were masked and he shockingly didn’t have X-ray vision, and the police had magically lost all records of who they sent out to bring him in.

      Thinking back on this still fills me with rage. I’ve always thought our police were fairly chill and approachable, nothing like the gun toting cowards in the US, but no. It seems like ACAB holds true everywhere.

      Here’s a source. It’s in Swedish though so you might need to use Google Translate.


      Edit: I got some stuff wrong. I re-read the article and got really angry, so despite it being 3AM, I got out of bed and moved to my computer so I can correct the information and translate parts I find important.

      The man, Babak Karimi, lived with his mother, a nurse, and his partner, whom he had met in Malaysia ten years prior. His partner is thirty years old. The police enter the apartment and wake his mother, pointing flashlights and weapons at her; they pull his partner, undressed, out into the living room and interrogate him, but do not answer any questions in turn.

      They scream at Babak, demanding to know where his laptop is, looking to confiscate all electronics. They get up on his bed and hit him, then pull him out of bed onto the floor and tase him. At this point he believes that they’re being robbed, and that he is about to die.

      After locating electronics and obtaining passwords, they cuff him and take him away.

      At 7:54 they let him know what he’s suspected of: sexual assault of a child, sexual exploitation of a child, and creation and distribution of child pornography. They tell him his rights, offer food and healthcare, etc. He requests to see a nurse about his head injury from the earlier assault, but none is provided.

      At 13:21 they begin interrogating him. They ask him about mail addresses, phone numbers, a street address, whether he’s lived there, etc. He agrees with everything except the charges, because the events they are charging him for haven’t happened.

      As I previously stated, 21 files were uploaded via Yahoo mail, and we suspect that you uploaded them. Three of these are considered explicit, and we suspect that they were produced by you. What are your thoughts?

      - That’s even stranger. What do you mean?
      - It’s not true, not something I’ve done.

      Jenny Rosdahl clarifies further. She says two of the images are of a young boy showing his bottom and being touched by the hand of an adult.

      This is about 48 images that are considered abusive, 98 images that are considered explicitly abusive, and 3 films that are considered explicitly abusive. What do you think about this?

      - Strange.

      He suggests that perhaps the images are of his partner, but the interrogator states that they’re not. Eventually, Babak and his lawyer are allowed to view the images, which obviously are of his 30-year-old partner.

      According to the preliminary investigation material, which Kontext has gained access to, the suspicions against Karimi are based on image files that have come to the attention of the police through the organization National Centre for Missing and Exploited Children (NCMEC). NCMEC is funded by the U.S. Department of Justice and assists authorities and individuals in cases involving, for example, child abuse. NCMEC monitors the Yahoo email service using algorithms that analyze photographs with nudity. The images of Dennis have been categorized as child pornography by NCMEC. On March 6th, the Swedish police’s own investigator in the South region examines the photographs and reaches the same conclusion.

      Less than a week later, a search is conducted at Babak Karimi’s residence.

      The police examine Dennis. They want to confirm that he has the same birthmark as the man in the images, who during the preliminary investigation was described as prepubescent.


      So basically. Because a U.S. government entity is using an AI model to spy on private emails, a gay man in Sweden was physically abused by Swedish police, not to speak of the “inspection” his partner had to go through, and the trauma the entire family must’ve suffered from it.

      In the end there was no justice for them. No repercussions for the police. The mother moved to Canada. The partner moved to Malaysia. When the article was written, Babak himself was in the process of closing up all unfinished ends here and moving as well.

    • TiredSpider@slrpnk.net · 1 year ago

      Saw an app try exactly this. It was run by TERFs who wanted to lock out anyone who wasn’t a cis woman. Instead it labelled almost every black woman a man, and many trans women got through the filter anyway.