College professors are going back to paper exams and handwritten essays to fight students using ChatGPT

The growing number of students using the AI program ChatGPT as a shortcut in their coursework has led some college professors to reconsider their lesson plans for the upcoming fall semester.

  • FernandaKipper@infosec.pub · 1 year ago

    ChatGPT is free, so a lot of students use it to do their homework the easy way. We need stricter control over this.

  • Doplano@sh.itjust.works (banned) · 1 year ago

    Essay writing was always my Achilles’ heel until I discovered a professional writing service online. Hiring their team of skilled writers has completely transformed my approach to assignments. Now, every nursing essay example I submit is crafted to perfection, which has notably boosted my academic standing. This service has not only provided me with high-quality essays but has also given me peace of mind and the freedom to focus on other critical aspects of my studies.

  • Rozz@lemmy.sdf.org · 2 years ago

    Am I wrong in thinking students can still generate an essay and then copy it by hand?

    • CrimsonFlash@lemmy.ca · 2 years ago

      Not during class. Most likely a proctored exam. No laptops, no phones, teacher or proctor watching.

      • Syrc@lemmy.world · 2 years ago

        …then why can’t you do that with a school laptop that can’t access the web…?

          • Syrc@lemmy.world · 2 years ago

            And so do colleges. If they don’t want to invest $2,000 every five or six years in a hundred dumpster-tier Windows 95 PCs, it shouldn’t be the paying students who suffer.

    • drekly@lemmy.world · 2 years ago

      Sounds like effort. I’m making a font out of my handwriting and getting a 3D printer to write it.

      • Rozz@lemmy.sdf.org · 2 years ago

        Obviously that is the next step for the technically inclined, but even the less inclined may be capable of generating then copying to save time and brain effort.

  • MaggiWuerze@feddit.de · 2 years ago

    has led some college professors to reconsider their lesson plans for the upcoming fall semester.

    I’m sure they’ll write exams that actually require an understanding of the material rather than regurgitating the seminar PowerPoint presentations as accurately as possible…

    No? I’m shocked!

    • OhNoMoreLemmy@lemmy.ml · 2 years ago

      We get in trouble if we fail everyone because we made them do a novel synthesis, instead of just repeating what we told them.

      Particularly for an intro course, remembering what you were told is good enough.

      • zigmus64@lemmy.world · 2 years ago

        The first step to understanding the material is exactly just remembering what the teacher told them.

        • Hemingways_Shotgun@lemmy.ca · 2 years ago

          Meh. I haven’t been in Uni in over 20 years. But it honestly seems kind of practical to me.

          Your first year is usually when you haven’t even settled on a major. Intro classes are less about learning and more about finding out if you CAN learn, and if you’ll actually like the college experience or drop out after your first year.

          The actual learning comes when the crowd has been whittled to those who have the discipline to be there.

            • Hemingways_Shotgun@lemmy.ca · 2 years ago

              I would love to have that time and money back.

              One of the disadvantages of being of an age that straddles the line between the world without the internet and the world with it is that you get to watch the 20,000 dollars you spent on learning in the 90s suddenly become available for free in the present.

              Seriously, there isn’t a single thing I learned in my Near Eastern Classical Archaeology degree that I couldn’t just go learn from Wikipedia today.

          • Dark Arc@social.packetloss.gg · 2 years ago

            if you CAN learn

            I always found this argument completely unsatisfactory…

            Imagine someone coming up to you and saying “you must learn to juggle otherwise you can’t be a fisherman” and then after 14 years of learning to juggle, they say “you don’t actually need to know how to juggle, we just had to see if you CAN learn. Now I can teach you to fish.”

            You’d be furious. But, because we all grew up with this nonsense we just accept it. Everyone can learn, there’s just tons of stuff that people find uninteresting to learn, and thus don’t unless forced; especially when the format is extremely dry, unengaging, and you’ve already realized… You’re never going to need to know how to juggle to be a fisherman… ever.

            The show “Are you smarter than a fifth grader?” (IMO) accurately captures just how worthless 90% of that experience is to the average adult. I’ve forgotten so much from school, and that’s normal.

            The actual learning comes when the crowd has been whittled to those who have the discipline to be there.

            Also this is just ridiculous, “Everyone is a genius. But if you judge a fish by its ability to climb a tree, it will live its whole life believing that it is stupid.”

            • Tavarin@lemmy.ca · 2 years ago

              You do realize you get to choose which courses to take in undergrad right? Universities aren’t forcing you to take any of the courses, you choose ones in subjects you are interested in, and first year is to get you up to speed/introduce you to those subjects, so you can decide if you want to study them further.

              Once you have a major or specialization, then yeah, you have some required courses, but they do tend to be very relevant to what you want to do.

              • Dark Arc@social.packetloss.gg · 2 years ago

                You do realize you get to choose which courses to take in undergrad right? Universities aren’t forcing you to take any of the courses, you choose ones in subjects you are interested in, and first year is to get you up to speed/introduce you to those subjects, so you can decide if you want to study them further.

                That’s not true at all; every degree has a required core curriculum at every university I’ve ever heard of (e.g., humanities, some amount of math, some amount of English, etc.). It also says nothing about the K-12 years.

                • Tavarin@lemmy.ca · 2 years ago

                  In my university you had breadth requirements, but it was 1 humanities course, 1 social science, and 1 science, and you could pick any course within those areas to fulfill the requirement. So you had a lot of choice within the core curriculum. Man, if other unis aren’t doing that, that sucks.

    • ipkpjersi@lemmy.ml · 2 years ago

      IME, a lot of professors liked to write exams that specifically didn’t cover anything from the PowerPoint presentations lol

    • Aurenkin@sh.itjust.works · 2 years ago

      My favourite lecturer at uni actually did that really well. He also said the exam was small and could be done in about an hour or two but gave us a 3 hour timeslot because he said he wanted us to take our time and think about each problem carefully. That was a great class.

  • Mtrad@lemm.ee · 2 years ago

    Wouldn’t it make more sense to find ways to utilize AI as a tool and set up criteria that incorporate its use?

    There could still be classes / lectures that cover the more classical methods, but I remember being told “you won’t have a calculator in your pocket”.

    My point is, they should be prepping students with the skills to succeed using the tools they will have available, and then give them the education to cover the gaps that AI can’t solve. For example, you basically need to review what the AI outputs for accuracy. So maybe a focus on reviewing output and better prompting techniques? Training on how to spot inaccuracies? Spotting possible bias in a system that’s skewed by its training data?

    • Revv@lemmy.blahaj.zone · 2 years ago

      Training how to use “AI” (LLMs demonstrably possess zero actual reasoning ability) feels like it should be a separate pursuit from (or subset of) general education to me. In order to effectively use “AI”, you need to be able to evaluate its output and reason for yourself whether it makes any sense or simply bears a statistical resemblance to human language. Doing that requires solid critical reasoning skills, which you can only develop by engaging personally with countless unique problems over the course of years and working them out for yourself. Even prior to the rise of ChatGPT and its ilk, there was emerging research showing diminishing reasoning skills in children.

      Without some means of forcing students to engage cognitively, there’s little point in education. Pen and paper seems like a pretty cheap way to get that done.

      I’m all for tech and using the tools available, but without a solid educational foundation (formal or not), I fear we end up a society of snake-oil users in search of the blinker fluid.

    • settxy@lemmy.world · 2 years ago

      There are some universities looking at AI from this perspective, finding ways to teach proper usage of AI. Then building testing methods around the knowledge of students using it.

      Your point on checking for accuracy is on point. AI doesn’t always puke out good information, and ensuring students don’t just blindly believe it NEEDS to be taught. Otherwise you end up like these guys… https://apnews.com/article/artificial-intelligence-chatgpt-courts-e15023d7e6fdf4f099aa122437dbb59b

    • Atomic@sh.itjust.works · 2 years ago

      That’s just what we tell kids so they’ll learn to do basic math on their own. Otherwise you’ll end up with people who can’t even do 13+24 without having to use a calculator.

      • Overzeetop@lemmy.world · 2 years ago

        people who can’t even do 13+24 without having to use a calculator

        More importantly, you end up with people who don’t recognize that 13+24=87 is incorrect. Math->calculator is not about knowing the math, per se, but knowing enough to recognize when it’s wrong.

        I don’t envy professors/teachers who are having to figure out novel ways of determining the level of mastery of a class of 30, 40, or 100 students in the era of online assistance. Because, really, we still need people who can turn out top-level, accurate, well-researched documentation. If we lose them, who will we train the next-gen LLM on? ;-)

        • InEnduringGrowStrong@sh.itjust.works · 1 year ago

          end up with people who don’t recognize that 13+24=87 is incorrect

          I had a telecom teacher who would let you use a calculator, but then you had to get everything exactly right.
          Or you could go without and get away with rougher estimates.

          Doing stuff like decibels by hand isn’t too bad if you can get away with a ballpark and it’s a much more useful skill to develop than just punching numbers in a calculator.
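
          For the curious, the ballpark rules are just the log identities: +10 dB is exactly a factor of 10 in power, and +3 dB is almost exactly a factor of 2. A rough sketch of the mental process (plain Python, function names my own, assuming power ratios):

              import math

              def db_exact(power_ratio: float) -> float:
                  # Exact value: decibels are 10 * log10 of a power ratio.
                  return 10 * math.log10(power_ratio)

              def db_ballpark(power_ratio: float) -> float:
                  # Mental math: peel off factors of 10 (+10 dB each),
                  # then factors of 2 (+3 dB each), and ignore the remainder.
                  db = 0.0
                  while power_ratio >= 10:
                      power_ratio /= 10
                      db += 10
                  while power_ratio >= 2:
                      power_ratio /= 2
                      db += 3
                  return db

              for ratio in (2, 8, 40, 500):
                  print(ratio, db_ballpark(ratio), round(db_exact(ratio), 1))
              # 40x -> ballpark 16 vs exact 16.0; 500x -> ballpark 26 vs exact 27.0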

      • Arthur_Leywin@lemmy.world · 2 years ago

        When will people need to do basic arithmetic in their head? The difficulty between 13+24 and 169+742 rises dramatically. Yeah, it makes your life convenient if you can add simple numbers, but is it necessary when everyone has a calculator?

        • Atomic@sh.itjust.works · 2 years ago

          Like someone said. It’s not just about knowing what something is, but having the ability to recognize what something isn’t.

          The ability to look at a result and be skeptical if it doesn’t look reasonable.

          169+742. Just by looking I can tell it has to be pretty close to 900 because 160+740 is 900. That gives me a good estimate to go by. So when I arrive at 911. I can look at it and say. Yeah. That’s probably correct, it looks reasonable.
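
          That front-end estimation is mechanical enough to write down. A tiny sketch of the same sanity check (my own naming, purely illustrative):

              def estimate_sum(a: int, b: int) -> int:
                  # Drop the ones digits, like you would in your head.
                  return (a // 10) * 10 + (b // 10) * 10

              print(estimate_sum(169, 742))  # 900, so an answer of 911 looks reasonable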

        • Mtrad@lemm.ee · 2 years ago

          That sounds like it could be a focused lesson. Why try to skirt around what the desired goal is?

          That could also carry over to detecting when something is wrong with AI output. Teach people the things that help them spot these errors.

          In my experience, it’s so much more effective to learn how to find the answers and spot the issues than to memorize how to do everything. There’s too much now to know it all yourself.

    • Snekeyes@lemmy.world · 2 years ago

      It’s like the calculator in the 80s and 90s. Teachers would constantly tell us, “no job’s just gonna let you use a calculator, they’re paying you to work”…

      I graduated, and really thought companies were gonna make me do stuff by hand, cause calculators made it easy. Lol.

  • Four_lights77@lemm.ee · 2 years ago

    This thinking just feels like moving in the wrong direction. As an elementary teacher, I know that by next year all my assessments need to be practical or interview based. LLMs are here to stay and the quicker we learn to work with them the better off students will be.

    • SamC@lemmy.nz · 2 years ago

      Good luck doing one on one assessments in a uni course of 300+

    • pinkdrunkenelephants@sopuli.xyz · 2 years ago

      And forget about having any sort of integrity or explaining to kids why it’s important for them to know how to do shit themselves instead of being wholly dependent on corporate proprietary software whose accessibility can and will be manipulated to serve the ruling class on a whim 🤦

      • jarfil@lemmy.world · 2 years ago

        wholly dependent on corporate proprietary software

        FLOSS would want a word with you.

        • pinkdrunkenelephants@sopuli.xyz · 2 years ago

          The way we have allowed corporations to take over the internet as a whole is deeply problematic for those reasons too, I agree with you. And it’s awful seeing what we’ve become.

      • Not_Alec_Baldwin@lemmy.world · 2 years ago

        It’s insane talking to people that don’t do math.

        You ask them any mundane question and they just shrug, and if you press them they pull out their phone to check.

        It’s important that we do math so that we develop a sense of numeracy. By the same token it’s important that we write because it teaches us to organize our thoughts and communicate.

        These tools will destroy the quality of education for the students that need it the most if we don’t figure out how to rein in their use.

        If you want to plug your quarterly data into GPT to generate a projection report I couldn’t care less. But for your 8th grade paper on black holes, write it your damn self.

        • pinkdrunkenelephants@sopuli.xyz · 2 years ago

          Putting quarterly data into ChatGPT is dangerous for companies because that information is being fed into the AI and accessible by its creators, which means you’re just giving away proprietary information and trade secrets by doing that. But do these chucklefucks give one single shit? No. Because they’re selfish, lazy assholes that want robots to do their thinking for them.

          • joel_feila@lemmy.world · 2 years ago

            Well, with more and more data and services in the cloud, companies don’t seem to care about data sharing.

    • JTode@lemmy.world · 2 years ago

      In what ways do you envision working with LLMs as an educator of children?

      I have used ChatGPT to explain to myself a number of fairly advanced technical and programming concepts; I work in Animation through my own self-study and some good luck, so I’m constantly trying to up my skills in the math that relates to it. When I come up against a math or C++ term or concept that I do not currently understand, I can generally get a pretty good conceptual understanding of it by working with ChatGPT.

      So at one point I wanted to understand what Linear Algebra specifically meant, and it didn’t stick but I do remember asking it to expand on things it said that weren’t clear, and it was able to competently do so. By asking many questions I was able - I think - to get clearer on a number of things which I doubt I ever would have learned, unless by luck I found someone who knows the math to teach me.

      It also flubbed a lot of basic arithmetic, and I had to mentally look for and correct that.

      This is useful to an autodidact like myself who has learned how to learn at a University level, to be sure.

      I cannot, however, think of a single beneficial way to use this to educate small children with no such years of mental discipline and ability to understand that their teacher is neither a genius nor a moron, but rather, a machine that pumps out florid expressions of data that resemble other expressions of similar data.

      Please, tell me one.

      • jarfil@lemmy.world · 2 years ago

        Devise a physical problem that can be tested, have everyone in class pull a ChatGPT answer to it, have them read the answers out loud and vote on which one is right, then apply it to the physical version and see it fail. Show them how tweaking the answer just a bit solves the problem.

        years of mental discipline and ability to understand that their teacher is neither a genius nor a moron

        Ta-da! Just taught them that without all your years.

        I cannot, however, think of a single beneficial way to use this to educate small children

        Then you’re not a teacher. Please don’t ever teach small children.

  • TimewornTraveler@lemm.ee · 2 years ago

    Can we just go back to calling this shit Algorithms and stop pretending it’s actually Artificial Intelligence?

    • chicken@lemmy.dbzer0.com · 2 years ago

      Maybe machine learning models technically fit the definition of “algorithm” but it suits them very poorly. An algorithm is traditionally a set of instructions written by someone, with connotations of being high level, fully understood conceptually, akin to a mathematical formula.

      A machine learning model is a soup of numbers that maybe does something approximately like what the people training it wanted it to do, using arbitrary logic nobody can expect to follow. “Algorithm” is not a great word to describe that.
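
      To make the contrast concrete, here’s a toy sketch (entirely my own illustration): the first function is an algorithm in the traditional sense, explicit rules someone wrote and can explain; the second makes a similar “decision” from a bag of learned numbers that explain nothing by themselves.

          def is_spam_algorithm(subject: str) -> bool:
              # A traditional algorithm: explicit, human-readable rules.
              return "free money" in subject.lower() or subject.isupper()

          # A trained model, by contrast, is just numbers. These four are made up
          # for illustration; a real model has millions, and the "rule" being
          # applied lives nowhere you can point to.
          WEIGHTS = [0.83, -1.27, 2.41, 0.05]

          def is_spam_model(features: list[float]) -> bool:
              # Weighted sum of input features, thresholded.
              score = sum(w * x for w, x in zip(WEIGHTS, features))
              return score > 0.5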

    • WackyTabbacy42069@reddthat.com · 2 years ago

      It actually is artificial intelligence. What are you even arguing against, man?

      Machine learning is a subset of AI and neural networks are a subset of machine learning. Saying an LLM (based on neural networks for prediction) isn’t AI because you don’t like it is like saying rock and roll isn’t music.

      • over_clox@lemmy.world · 2 years ago

        If AI was ‘intelligent’, it wouldn’t have written me a set of instructions when I asked it how to inflate a foldable phone. Seriously, check my first post on Lemmy…

        https://lemmy.world/post/1963767

        An intelligent system would have stopped to say something like “I’m sorry, that doesn’t make any sense, but here are some related topics to help you”

        • XTornado@lemmy.ml · 2 years ago

          Whether it should be called AI or not, no idea…

          But let’s be clear, some humans are “intelligent” and say crazier things…

        • jarfil@lemmy.world · 2 years ago

          If “making sense” was a requirement of intelligence… there would be no modern art museums.

          • over_clox@lemmy.world · 2 years ago

            Instructions unclear, inflates phone.

            Seriously, if it was actually intelligent, yet also writing out something meant to be considered ‘art’, I’d expect it to also have a disclaimer at the end declaring it as satire.

            • jarfil@lemmy.world · 2 years ago

              That would require a panel of AIs to discuss whether “/s” or “no /s”…

              As it is, it writes anything a person could have written, some of it great, some of it straight from Twitter. We are supposed to presume at least some level of intelligence for either.

        • Jordan Lund@lemmy.one · 2 years ago

          Inflating a phone is super easy though!

          Overheat the battery. ;) Phone will inflate itself!

        • WackyTabbacy42069@reddthat.com · 2 years ago

          AI doesn’t even require that a machine be capable of stringing the complex English language into a series of steps toward something pointless and unattainable. That it can is remarkable in itself, however naive it was in believing you that a foldable phone can be inflated. You may be confusing AI with AGI, which is when the intelligence and reasoning level is at or slightly above human.

          The only real requirement for AI is that a machine take actions in an intelligent manner. Web search engines, dynamic traffic lights, and Chess bots all qualify as AI, despite none of them being able to tell you rubbish in proper English

          • TimewornTraveler@lemm.ee · 2 years ago

            The only real requirement for AI is that a machine take actions in an intelligent manner.

            There’s the rub: defining “intelligent”.

            If you’re arguing that traffic lights should be called AI, then you and I might have more in common than we thought. We both believe the same things: that ChatGPT isn’t any more “intelligent” than a traffic light. But you want to call them both intelligent and I want to call neither so.

            • throwsbooks@lemmy.ca · 2 years ago

              I think you’re conflating “intelligence” with “being smart”.

              Intelligence is more about taking in information and being able to make a decision based on that information. So yeah, automatic traffic lights are “intelligent” because they use a sensor to check for the presence of cars and “decide” when to switch the light.
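
              By that definition the light’s entire “intelligence” fits in a couple of lines. A hypothetical sketch of the sense-then-decide loop (the 30-second minimum green is my invention):

                  def traffic_light_decision(cars_waiting: int, seconds_green: int) -> str:
                      # Sense: an induction loop counts cars waiting at the red.
                      # Decide: switch once someone is waiting and the minimum green has elapsed.
                      if cars_waiting > 0 and seconds_green >= 30:
                          return "switch"
                      return "stay"

                  print(traffic_light_decision(cars_waiting=3, seconds_green=45))  # switch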

              Acting like some GPT is on the same level as a traffic light is silly though. On a base level, yes, it “reads” a text prompt (along with any messaging history) and decides what to write next. But that decision it’s making is much more complex than “stop or go”.

              I don’t know if this is an ADHD thing, but when I’m talking to people, sometimes I finish their sentences in my head as they’re talking. Sometimes I nail it, sometimes I don’t. That’s essentially what chatGPT is, a sentence finisher that happened to read a huge amount of text content on the web, so it’s got context for a bunch of things. It doesn’t care if it’s right and it doesn’t look things up before it says something.
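
              That “sentence finisher” idea can be sketched in a few lines. This toy bigram model (my own simplification; real LLMs use neural networks over far more context) just continues with whichever word most often followed the previous one in its training text, with no idea whether the result is true:

                  from collections import Counter, defaultdict

                  training_text = "the cat sat on the mat and the cat slept on the mat".split()

                  # Count, for each word, which words tended to come next.
                  following: defaultdict[str, Counter] = defaultdict(Counter)
                  for word, next_word in zip(training_text, training_text[1:]):
                      following[word][next_word] += 1

                  def finish_sentence(start: str, length: int = 5) -> str:
                      words = [start]
                      for _ in range(length):
                          options = following.get(words[-1])
                          if not options:
                              break
                          # Pick the likeliest next word; truth never enters into it.
                          words.append(options.most_common(1)[0][0])
                      return " ".join(words)

                  print(finish_sentence("the"))  # "the cat sat on the cat" - plausible, unchecked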

              But to have a computer be able to do that at all?? That’s incredible, and it took over 50 years of AI research to hit that point (yes, it’s been a field in universities for a very long time, with people saying for most of that time that it’s impossible), and we only hit it because our computers got powerful enough to do it at scale.

              • ParsnipWitch@feddit.de · 2 years ago

                Intelligence is more about taking in information and being able to make a decision based on that information.

                Where does that come from? A better gauge for intelligence is whether someone or something is able to resolve a problem that they did not encounter before. And arguably all current models completely suck at that.

                I also think the word “AI” is used quite a bit too liberally. It confuses people who have zero knowledge on the topic. And when an actual AI comes along, we will have to make up a new word because “general artificial intelligence” won’t be distinctive enough for corporations to market their new giant leap in technology…

                • throwsbooks@lemmy.ca · 2 years ago

                  I would suggest the textbook Artificial Intelligence: A Modern Approach by Russell and Norvig. It’s a good overview of the field and has been in circulation since 1995. https://en.m.wikipedia.org/wiki/Artificial_Intelligence:_A_Modern_Approach

                  Here’s a photo, as an example of how this book approaches the topic, in that there’s an entire chapter on it with sections on four approaches, and that essentially even the researchers have been arguing about what intelligence is since the beginning.

                  But all of this has been under the umbrella of AI. Just because corporations have picked up on it, doesn’t invalidate the decades of work done by scientists in the name of AI.

                  My favourite way to think of it is this: people have forever argued whether or not animals are intelligent or even conscious. Is a cat intelligent? Mine can manipulate me, even if he can’t do math. Are ants intelligent? They use the same biomechanical constructs as humans, but at a simpler scale. What about bacteria? Are viruses alive?

                  If we can create an AI that fully simulates a cockroach, down to every firing neuron, does it mean it’s not AI just because it’s not simulating something more complex, like a mouse? Does it need to exceed a human to be considered AI?

            • sin_free_for_00_days@sopuli.xyz · 2 years ago

              I’m with you on this and think the AI label is just stupid and misleading. But times/language change and you end up being a Don Quixote type figure.

      • TimewornTraveler@lemm.ee · 2 years ago

        I am arguing against this marketing campaign, that’s what. Who decides what “AI” is, and how did we come to decide what fits that title? The concept of AI has been around a long time, since the Greeks even, and it has always been the concept of a man-made man. In modern times, it’s been represented as a sci-fi fantasy of sentient androids. “AI” is a term with heavy associations already cooked into it. That’s why calling it “AI” is just a way to make it sound like high-tech, futuristic dreams come true. But a predictive text algorithm is hardly “intelligence”. It’s only being called that to make it sound profitable. Let’s stop calling it “AI” and start calling out their bullshit. This is just another cryptocurrency-style scam: a concept that could theoretically work and be useful to society, but that is not being implemented in a way that lives up to its name.

        • BigNote@lemm.ee · 2 years ago

          The field of computer science decided what AI is. It has a very specific set of meanings and some rando on the Internet isn’t going to upend decades of usage just because it doesn’t fit their idea of what constitutes AI or because they think it’s a marketing gimmick.

          It’s not. It’s a very specific field in computer science that’s been worked on since the 1950s at least.

    • Venia Silente@lemm.ee · 2 years ago

      Please let’s not defame Dijkstra and other algorithms like this. Just call them “corporate crap”, like what they are.

  • Mugmoor@lemmy.dbzer0.com · 2 years ago

    When I was in College for Computer Programming (about 6 years ago) I had to write all my exams on paper, including code. This isn’t exactly a new development.

    • whatisallthis@lemm.ee · 2 years ago

      So what you’re telling me is that written tests have, in fact, existed before?

      What are you some kind of education historian?

      • Eager Eagle@lemmy.world · 2 years ago

        He’s not just pointing out that handwritten tests exist; he’s pointing out that using handwritten tests instead of typed ones to reflect the student’s actual abilities is nothing new.

    • Eager Eagle@lemmy.world · 2 years ago

      Same. All my algorithms and data structures courses in undergrad and grad school had paper exams. I have a mixed view on these but the bottom line is that I’m not convinced they’re any better.

      Sure, they might reflect some of the student’s abilities better, but if you’re an evaluator interested in assessing a student’s knowledge, a more effective way is to ask directed questions.

      What ends up happening a lot of the time are implementation questions that ask too much of the student at once: interpretation of the problem; knowledge of helpful data structures and algorithms; abstract reasoning; edge case analysis; syntax; time and space complexities; and a good sense of planning, since you’re supposed to answer in a few minutes without the luxury and conveniences of a text editor.

      This last one is my biggest problem with it. It adds a great deal of difficulty and stress without adding any value for the evaluator.

    • lunarul@lemmy.world · 2 years ago

      I had some teachers ask for handwritten programming exams too (that was more like 20 years ago for me) and it was just as dumb then as it is today. What exactly are they preparing students for? No job will ever require the skill of writing code on paper.

      • Dark Arc@social.packetloss.gg · 2 years ago

        What exactly are they preparing students for? No job will ever require the skill of writing code on paper.

        Maybe something like, a whiteboard interview…? They’re still incredibly common, especially for new grads.

        • lunarul@lemmy.world · 2 years ago

          A company that still does whiteboard interviews is one I have no interest in working for. When I interview candidates I want to see how they will perform in their job. Their job will not involve writing code on whiteboards, solving weird logic problems, or knowing how to solve the traveling salesman problem off the top of their heads.

          • pinkdrunkenelephants@sopuli.xyz · 2 years ago

            And what happens when you run into the company that wants people who can prove they conceptually understand what the hell it is they’re doing on their own, which requires a whiteboard?

            I program as a hobby and I’ll jot down code and plans for programs on paper when I am out and about during the day. The fuck kind of dystopian hellhole mindset do you have where you think all that matters is doing the bare minimum to survive? You know that life means more than that, don’t you?

            • lunarul@lemmy.world · 2 years ago

              The ability to conceptually understand what they’re doing is exactly what I’m testing for when interviewing. Writing a full program on a whiteboard is definitely not required for that. I can get that from asking them question, observing how they approach the problem, what kind of questions they ask me etc.

              I definitely don’t want them to do just the bare minimum to survive or to need to ask me for advice at every step (had people who ended up taking more of my time than it would’ve taken me to do their job myself).

              I’ve never needed to write more than a short snippet of code at a time on a whiteboard, slack channel, code review, etc. in my almost 20 years in the industry. Definitely not to solve a whole problem blindly. In fact I definitely see it as a red flag when a candidate writes a lot of code without ever stopping to execute and test each piece individually. It simply becomes progressively more difficult to debug the more you add to it, that’s common sense.

          • MNByChoice · 2 years ago

            Their job will not involve writing code on whiteboards,

            Really? We do that frequently when problem solving. No one expects the written code to be perfect, but it shouldn’t demand impossible things.

          • Dark Arc@social.packetloss.gg · 2 years ago

            That’s a valid opinion, and I largely share it. But, all these students need to work somewhere. This is something the industry needs to change before the school changes it.

            Also, I’ve definitely done white board coding discussions in practice, e.g., go into a room, write up ideas on the white board (including small snippets of code or pseudo code).

            • lunarul@lemmy.world · 2 years ago

              I’ve definitely done white board coding discussions in practice, e.g., go into a room, write up ideas on the white board (including small snippets of code or pseudo code).

              I’ve done that too back before the remote work era, but using a whiteboard as a visual aid is not the same thing as solving a whole problem on a whiteboard.

              • Dark Arc@social.packetloss.gg · 2 years ago

                It’s a close enough problem; a lot of professors I’ve known aren’t going to sweat the small stuff on paper. Like, they’re not plugging your code into a computer and expecting it to build, they’re just looking for the algorithm design, and that there’s no grotesque violation of the language rules.

                Sure, some are going to be a little harsher (“you missed a semicolon here”), but even then, if you’re doing your work, that’s not a hard thing to overcome, and it’s only going to cost you a handful of points (if that sort of stuff is your only mistake, you get a 92 instead of a 100).

        • Eager Eagle@lemmy.world · 2 years ago

          Which is equally useless. In the end you’re developing a skill that will only be used in tests. You’re training to be evaluated instead of to do a job well.

        • lunarul@lemmy.world · 2 years ago

          I personally never had a problem performing well in those tests; I happen to have the skill to compile code in my head, and it is a helpful skill in my job (I’ve been a software engineer for 19 years now), but it’s definitely not a required skill and should not be considered as such.

  • Chickenstalker@lemmy.world · 2 years ago

    If your exams can be solved by AI, then your exams are not good enough. How do you get around this? Simple: oral exams, aka viva voce. Anyone who has defended their thesis knows the pants-shitting terror this kind of exam inflicts. It takes longer, but you can truly determine how well the student understands the content.

  • rab@lemmy.ca · 2 years ago

    As an anecdote, I used to do K-12 sysadmin work, and we once had a major issue during a grade 12 English exam, so the students had to handwrite their essays. Almost everyone failed; most did not even finish. Most students can hardly handwrite anymore, and it makes sense.