Given that language is an important lens through which we see the world, AI could subtly alter our perceptions and beliefs over the coming years, decades, and centuries.

  • rodbiren · 10 points · 11 months ago

    Would the opposite not be true? AI models work by predicting the next likely text. If we start changing language out from under them, that actually makes them worse at predicting as time moves on. If anything, I would expect a language model to stagnate our language and try to freeze our usage of words to what it can “understand”. Of course this is subject to continued training and data harvesting, but just as older people eventually have a hard time understanding the young, it could be similar for AI models.

      • rodbiren · 2 points · 11 months ago

        Could be, though I am reminded of the early 2000s, when the government did research on shorthand texting and whether it could be used for encoded messaging. Things like lol, brb, l8r, etc. If there is one thing I know about AI, it is that garbage data will make the models worse. And I cannot think of a better producer of confusing data than young people, especially if they are given an excuse to do so.