Google apologizes for ‘missing the mark’ after Gemini generated racially diverse Nazis

Google says it’s aware of historically inaccurate results for its Gemini AI image generator, following criticism that it depicted historically white groups as people of color.

  • FooBarrington@lemmy.world · 10 months ago

    > They can’t decide how they represent data that’d require T3 (like us), which puts them, in your terms, at the level of memory, not intelligence.

    Where do you get this? What kind of data requires a T3 system to be representable?

    I don’t think I’ve made any claims related to T2 or T3 systems, and I haven’t defined “memory”, so I’m not sure how you’re putting this in my terms. I wouldn’t define memory as an adaptable system, so T2 would, by my definition, count as intelligence as well.

    > Actually it’s quite intuitive: Ask StableDiffusion to draw a picture of an accident and it will hallucinate just as wildly as if you ask a human to describe an accident they’ve witnessed ten minutes ago. It needs active engagement with that kind of memory to sort the wheat from the chaff.

    I just did this:

    [image: Stable Diffusion output for the prompt “accident”]

    Where do you see “wild hallucination”? Yeah, it’s not perfect, but I also didn’t do any tuning: no negative prompt, and the positive prompt is literally just “accident”.
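
    For reference, this kind of untuned generation is trivial to reproduce. Below is a minimal sketch using the Hugging Face diffusers library; the checkpoint name (runwayml/stable-diffusion-v1-5) is an assumption, since the comment doesn’t say which model or frontend was actually used:

    ```python
    # Minimal sketch, assuming the Hugging Face `diffusers` library and the
    # runwayml/stable-diffusion-v1-5 checkpoint. The comment does not say
    # which model was actually used, so this is illustrative only.
    import torch
    from diffusers import StableDiffusionPipeline

    pipe = StableDiffusionPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5",
        torch_dtype=torch.float16,
    ).to("cuda")

    # No tuning at all: no negative prompt, and the positive prompt is
    # literally just "accident".
    image = pipe(prompt="accident").images[0]
    image.save("accident.png")
    ```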