These experts on AI are here to help us understand important things about AI.

Who are these generous, helpful experts that the CBC found, you ask?

“Dr. Muhammad Mamdani, vice-president of data science and advanced analytics at Unity Health Toronto”, per LinkedIn a PharmD, who also serves in various AI-associated centres and institutes.

“(Jeff) Macpherson is a director and co-founder at Xagency.AI”, a tech startup which does, uh, lots of stuff with AI (see their wild services page) and which appears to have been announced on LinkedIn two months ago. Apart from J.M.’s “over 7 years in the tech sector”, the founders section lists other details which are interesting to read in light of J.M.’s own LinkedIn page.

Other people making points in this article:

C. L. Polk, award-winning author (of Witchmark).

“Illustrator Martin Deschatelets” whose employment prospects are dimming this year (and who knows a bunch of people in this situation), who per LinkedIn has worked on some nifty things.

“Ottawa economist Armine Yalnizyan”, per LinkedIn a fellow at the Atkinson Foundation who used to work at the Canadian Centre for Policy Alternatives.

Could the CBC actually seriously not find anybody willing to discuss the actual technology and how it gets its results? This is archetypal hood-welded-shut sort of stuff.

Things I picked out, from article and round table (before the video stopped playing):

Does that Unity Health doctor go back later and check these emergency room intake predictions against actual cases appearing there?

Who is the “we” who have to adapt here?

AI is apparently “something that can tell you how many cows are in the world” (J.M.). Detecting a lack of results validation here again.

“At the end of the day that’s what it’s all for. The efficiency, the productivity, to put profit in all of our pockets”, from J.M.

“You now have the opportunity to become a Prompt Engineer”, from J.M. to the author and illustrator. (It’s worth watching the video to listen to this person.)

Me about the article:

I’m feeling that same underwhelming “is this it” bewilderment again.

Me about the video:

Critical thinking and ethics and “how software products work in practice” classes for everybody in this industry please.

    • self@awful.systems · 1 year ago
      I’m so glad I started adding this shit to Zotero to use as references in future long form articles, cause it turns out it’s also a pretty good bookmark manager

      • froztbyte@awful.systems · 1 year ago
        haha, I was wondering (and planning to ask)

        it’s still an unsolved problem in my life, and none of the solutions or frameworks I’ve come across yet have matched up to my needs. I might be doomed to have to write a software.

        • self@awful.systems · 1 year ago
          god they fucking would wouldn’t they. flashing back to MDN implementing a bunch of LLM bullshit and the two people responsible for sneaking it into the codebase getting increasingly passive aggressive (in a very cryptobro-reminiscent way) with the hundreds of developers who had a problem with it

          • froztbyte@awful.systems · 1 year ago (edited)
            shit what happened with that, I got too busy with life

            did they just hunker down for a while and hope people would stop being mad? that’s my first suspicion/expectation of what they’d do

            • froztbyte@awful.systems · 1 year ago
                that’s as close to diametrically opposite the right thing as one can manage to do

                impressive, I guess