• roux [he/him, they/them]@hexbear.net
    14 days ago

    I can imagine. We have a firm “No Mr. Beast” rule in our house, but my kid still learned about him through school, and I’m having a hard time explaining the nuance of Mr. Beast helping people when it’s all profit-motivated. But I’ve also long suspected Beast was a sham, so this year’s revelations have been “fun”.

    • Ericthescruffy [he/him]@hexbear.net
      14 days ago

      I have a 9-year-old I share 50/50 custody of. We don’t do much YouTube, if any, in our house, but his mom is extremely laissez-faire with electronic devices, so I think he’s basically freebasing YouTube over there.

      Both logistically and ethically, you can’t really keep your kid in a perfect bubble, and it’s unavoidable that they’re going to get exposed to stuff like this, so unironically: do talk to your kids about it. Especially with the future of AI, I believe media literacy is going to be INSANELY important to start grasping as soon as possible.

      My partner and I have used Mr. Beast as a starting point for conversations about how people can be playing themselves as a character or presenting things as reality even though they’re scripted. We’ve also used it to talk about how liking/enjoying certain content isn’t a value judgement on the people creating it, and how we all enjoy things written and created by people who weren’t super great humans. I encourage checking out “making of” docs too. He’s 9, so I can’t say how well this all plays out long term…but I have observed him noticing recurring tropes in some of his media, so I feel like there’s something to it.

      • Belly_Beanis [he/him]@hexbear.net
        14 days ago

        Adding on to this, you should show them videos on YouTube that were proven false. One example I can think of is Joey Salads’ video about Trump cars being vandalized in black neighborhoods. People who weren’t part of the production filmed the setup from other angles, and in one clip you can see Salads talking to the group of people who all ended up being the “vandals” in his video. It turns out he hired a bunch of black actors under false pretenses, got footage of them smashing cars with Trump bumper stickers, then edited it all together to look like they were random people.

        The whole thing got exposed, and there are videos explaining it. You can show your kids how footage can be edited to reinforce biases (in this example: “black people are violent and will attack you or your property for any reason”) and how sites like YouTube use content like this to generate revenue through controversy.

        Project Veritas is another. It was also an early adopter of deepfakes, using them to make people say things they never said. Deepfakes of politicians saying stuff would be another example. The Biden–Vaporeon copypasta is a hoot (probably not age-appropriate for 9-year-olds, though. I dunno. I’m only an uncle).