• tony@lemmy.hoyle.me.uk · +56/−1 · 1 year ago

    I was taught how punch cards work and that databases used direct disk access. In 1990.

    In college (1995) we learned COBOL and assembler, and pre-object-oriented Ada (closer to early Pascal than anything I can see on wiki today). C was the ‘new thing’ that was on the machines, but we weren’t allowed to use it.

    The curriculum has always been 20 years behind reality, especially in tech. Lecturers teach what they learned, not what is current. If you want to keep up you teach yourself.

    • glad_cat@lemmy.sdf.org · +15 · 1 year ago · edited

      I learned how “object-oriented databases” worked in college. After 20 years of work, I still don’t know if such a thing exists at all. I read books regularly instead.

      • tony@lemmy.hoyle.me.uk · +5 · 1 year ago

        Wiki says they existed, and may still do… I’ve never come across one. I thought MongoDB might be one, but apparently not.

        • DrDeadCrash@programming.dev · +4 · 1 year ago

          Eplan, an electrical controls layout tool, used an object-oriented database as its file format; it still may. I saw recently that they entered some sort of partnership with SolidWorks, so they’re still kicking.

      • Paradox@lemdro.id · +3 · 1 year ago

        I’ve used one before. Maglev is a Ruby runtime built atop GemStone/S, which is an object db. It gives Ruby some distributed powers, like the BEAM languages (Elixir and Erlang) have.

        In practice, all it meant was that you didn’t have to worry about serializing Ruby objects to store them in your datastore, and they could be distributed across many systems. You didn’t have to use message buses and the like. It worked, but not as well as you’d hope.
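To make the serialization point concrete, here is a minimal Python sketch of the explicit round-trip a conventional datastore forces on you (the `Order` class and field names are hypothetical; an object database like GemStone/S makes this step implicit):

```python
import json

class Order:
    """A plain in-memory object we want to persist."""
    def __init__(self, item, qty):
        self.item = item
        self.qty = qty

# Without an object database: serialize explicitly on the way in...
blob = json.dumps({"item": "widget", "qty": 3})

# ...and reconstruct explicitly on the way out.
data = json.loads(blob)
order = Order(data["item"], data["qty"])
print(order.item, order.qty)  # widget 3
```

With transparent object persistence, the object graph itself is the datastore, so both steps above disappear.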

        Amusingly, BEAM languages have access to tools a lot like OODBMSes right out of the box: ets, dets, and Mnesia loosely fit the definition of an OODB. BEAM is functional and doesn’t have objects at all, though, so the comparison can be a tad strained.

        Postgres also loosely satisfies the definition, with jsonb columns having first-class query support.
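The "first-class query support" means you can filter on fields inside stored documents with plain SQL. A rough sketch of the idea, using Python's bundled SQLite and its `json_extract` function as a stand-in for Postgres's jsonb operators (assumes an SQLite build with JSON support, which modern Python ships with):

```python
import sqlite3

# Documents stored as JSON text, queried structurally with SQL.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE docs (body TEXT)")
con.execute("""INSERT INTO docs VALUES
               ('{"kind": "hen", "eggs": 4}'),
               ('{"kind": "fox", "eggs": 0}')""")

# Filter on a field *inside* the JSON document, no deserialization in
# application code. Postgres would spell this body->>'eggs' on a jsonb column.
rows = con.execute(
    "SELECT json_extract(body, '$.kind') FROM docs "
    "WHERE json_extract(body, '$.eggs') > 0"
).fetchall()
print(rows)  # [('hen',)]
```

The design win is the same one the commenter describes for Maglev: the database, not your application, understands the structure of what you stored.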

    • Davin@lemmy.world · +4/−1 · 1 year ago

      I had to take a COBOL class in the early 2000s. And one of the two C/C++ courses was 90% talking about programming and taking quizzes about data types and what functions do, and 10% making things just beyond “hello world.” And I’m still paying the student loans.

    • PlexSheep@feddit.de · +3/−1 · 1 year ago

      I’m currently studying cybersecurity and I can speak positively about that. We’re taught C and Java in the programming course (Java is still ew, but C is everywhere and will be everywhere). I know a course two of my friends took taught Rust (I learned it at work; it’s great).

      The crypto we learn is current stuff, except no EdDSA or post-quantum stuff.

    • oats@110010.win · +2 · 1 year ago

      A course in college had an assignment that required Ada; this was 3 years ago.

      • CoderKat@lemm.ee · +1 · 1 year ago · edited

        If it was something like a language theory class, that’s perfectly valid. Honestly, university should teach heavily about various language paradigms and less about specific languages. Learning languages is easy if you know a similar language already, and you will always have to do it. For my past jobs, I’ve had to learn Scala, C#, Go, and several domain-specific or niche languages. All of them were easy to learn because my university taught me the general concepts and similar languages.

        The most debatable language I ever learned in university was Prolog. For so long, I questioned whether I would ever have a practical use for it, but then I actually did, because I had to use Rego for my work (which is more similar to Prolog than any other language I know).

    • scarabic@lemmy.world · +1/−1 · 1 year ago · edited

      What everyone would LIKE to learn is the exact skill that’s going to be rare and in high demand right after you graduate. But usually what’s rare and in high demand is also new, and there are no qualified teachers for it. Anyone who knows how to do the hot new thing is making bank doing it, just like all the college grads want to do. My advice is to get out of college and then spend the next four working years learning as much as you can. You’re not going to hit the jackpot as a recent grad. You’re maybe going to get in the door as a recent grad.

    • fusio@lemmy.world · +5 · 1 year ago

      this, and also nothing is 100% new - knowledge in similar areas will always help

    • Aceticon@lemmy.world · +5 · 1 year ago · edited

      I got an Electronics Engineering degree almost 30 years ago (and, funnily enough, I then made my career as a software engineer and barely used the EE stuff professionally except when I was working in high-performance computing, but that’s a side note) and back then one of my software development teachers told us: “Every 5 years half of what you know becomes worthless.”

      All these years later, from my experience, he was maybe being a little optimistic.

      Programming is something you can learn by yourself (by the time I went to uni I was already coding stuff in assembly purely for fun, and whilst EE did have programming classes they added up to maybe 5% of the whole thing, though nowadays with embedded systems it’s probably more). But the most important things uni taught me were the foundational knowledge (maths, principles of engineering, principles of design) and how to learn, and those have served me well in keeping up with that loss of relevance of half of what I know every 5 years, sometimes in unexpected ways: obscure details of microprocessor design I learned there turned out to be useful when designing high-performance systems; the rationale for past designs sped up my learning of new things, purely because the reasons stuff is designed the way it is are still the same; and even trigonometry turned out to be essential decades later for game development.

      So even in a fast-changing field like software engineering, a degree does make a huge difference, not from memorizing bleeding-edge knowledge but from the foundational knowledge you get, the understanding of the thought processes behind designing things, and the learning to learn.

      • Dark Arc@social.packetloss.gg · +3/−2 · 1 year ago · edited

        Also a software engineer… I look back on my undergrad fondly but was it really that helpful? … nah.

        I also put no stock in learning how to learn. If people want to learn something they do; if they don’t, they don’t. Nobody has to go to school to fish, play video games, or be a car guy, but all of those things have crazy high ceilings of knowledge and know-how.

        If you go into an industry you’re not interested in, it doesn’t matter how well you learned to learn, you’re not going to learn anything more than required. For me, I’m constantly learning things from blogs, debates, and questions I find myself asking both for personal projects and professional projects.

        Really all a university is, is a guided study of what’s believed to be the foundational material in a field + study of a number of things that are aimed at increasing awareness across the board; that’s going to be more helpful to some than others.

        If you graduate and work in a bunch of Python web code … those fundamentals aren’t really that important. You’re not going to write quicksort or bubble sort, very few people do; you’re going to just call .sort().

        You’re also probably not going to care about Big-O, you’re probably just going to notice organically “hey this is really bad and I can rearrange it to cache the results.” A bunch of stuff like that will probably come up that you’ll never even pay any mind to because the size of N is never large enough for it to matter in your application.
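The "cache the results" rearrangement described above is usually a one-liner in practice. A hedged sketch in Python (the `shipping_cost` function and its lookup table are hypothetical stand-ins for any expensive computation):

```python
from functools import lru_cache

# Memoize an expensive function: compute each input once, then serve
# every repeat call from the cache.
@lru_cache(maxsize=None)
def shipping_cost(region: str) -> float:
    # Stand-in for an expensive lookup or computation.
    return {"EU": 4.5, "US": 6.0}.get(region, 10.0)

print(shipping_cost("EU"))  # first call: computed
print(shipping_cost("EU"))  # second call: served from cache
```

This is exactly the kind of fix people reach for organically, Big-O analysis or not; the theory just explains why it helps when N grows.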

        … personally I think our education system needs to be redone from the ground up. It creates way more stress than it justifies. The focus should be on teaching people important lessons that they can actually remember into adulthood, not cramming brains with an impossible amount of very specific information under the threat of otherwise living a “subpar” life.

        Older societies I think had it right with their story form lessons, songs, etc. They made the important lessons cultural pieces and latched on to techniques that actually help people remember instead of just giving them the information with a technique to remember it and then being surprised when a huge portion of the class can’t remember.

        Edit: To make a software metaphor, we’ve in effect decided as a society to use inefficient, prefrontal-cortex-driven learning functions instead of the much more efficient intrinsics built into the body by millions of years of evolution to facilitate learning. We’re running bubble sort to power our education system.

  • serial_crusher@lemmy.basedcount.com · +43 · 1 year ago

    because of AI

    Oh look, a bullshit article.

    You need to learn the fundamentals of how things work, and how to apply those fundamentals, not rote specifics of a particular technology.

      • profdc9@lemmy.world · +6 · 1 year ago

        Crypto and AI have rewritten the book on the laws of physics. Now you can defy gravity with AI!

        • ram@bookwormstory.social · +2 · 1 year ago

          Text version of meme

          A linkedin post by Technology Management Consultant Sreekanth Kumbha that reads: "I can suggest an equation that has the potential to impact the future:

          E = mc2 + AI

          This equation combines Einstein’s famous equation E=mc2, which relates energy (E) to mass (m) and the speed of light (c), with the addition of AI (Artificial Intelligence). By including AI in the equation, it symbolizes the increasing role of artificial intelligence in shaping and transforming our future. This equation highlights the potential for AI to unlock new forms of energy, enhance scientific discoveries, and revolutionize various fields such as healthcare, transportation, and technology."

          MIT PhD in Physics and CS Taosif Ahsan replies: “What”

  • Hazdaz@lemmy.world · +32/−2 · 1 year ago

    Mostly bullshit, because the ultimate goal of college isn’t to make you learn basic facts you need to graduate; the ultimate goal is to teach you how and where to learn about new developments in your field, or where to look up information you don’t know or don’t remember.

    • English Mobster@lemmy.world · +9 · 1 year ago

      One thing I found especially dumb is this:

      Jobs that require driving skills, like truck and taxi drivers, as well as jobs in the sanitation and beauty industries, are least likely to be exposed to AI, the Indeed research said.

      Let’s ignore the dumb shit Tesla is doing. We already see self-driving taxis on the streets. California allows self-driving trucks already, and truck drivers are worried enough to petition California to stop it.

      Both of those involve AI - just not generative AI. What kind of so-called “research” has declared 2 jobs “safe” that definitely aren’t?

    • Dark Arc@social.packetloss.gg · +4/−3 · 1 year ago · edited

      I mean, to some degree … yes. Day to day, I do very little math. If it’s trivial I do it in my head; if it’s more than a few digits, I just ask a calculator, because I always have one and it’s not going to forget to carry the 1 or whatever.

      Long division, I’ve totally forgotten.

      Basic algebra, yeah I still use that.

      Trig? Nah. Calc? Nah.

      • SpacetimeMachine@lemmy.world · +11/−1 · 1 year ago

        You’re not taking college-level math to do basic calculations. You’re taking college-level math because you need to learn how to actually, fully understand and apply mathematical concepts.

        • Dark Arc@social.packetloss.gg · +2/−2 · 1 year ago · edited

          I hear this all the time that there’s some profound mathematical concept that I had to go to college to learn … what exactly is that lesson? What math lessons have changed your life specifically?

          Also the comment I was replying to was about doing math. Mathematical “concepts” aren’t exactly “doing math.”

  • Bobby Turkalino@lemmy.yachts · +23/−1 · 1 year ago

    Learning professional skills? In college? My guy, that’s not what it’s about. Especially at universities, it’s less about learning professional skills than about networking and earning a piece of paper that proves you can commit to something.

    E.g. my university was still teaching the 1998 version of C++ in the late 2010s. The first job I had out of school used C++17. Was I fucked? No, because the languages I learned were far less important than how I learned to learn them.

    • Khotetsu@lib.lgbt · +4/−1 · 1 year ago

      I think the real issue is with schooling before college, and this article seems to treat college as the same sort of environment as the previous 12 years of school, which it isn’t. So much of everything through high school has become about pressuring teachers to hit good grades and graduation percentages that actually teaching kids how to learn and how to collaborate with others has become a tertiary goal, behind simply having them regurgitate information on tests to hit those 2 metrics.

      I have taught myself a number of things on a wide range of subjects (from art to 3D printing to car maintenance and more; city planning and architecture are my current subjects of interest), and when people ask about learning all this stuff I’ve always said that I love to learn new things, despite the school system trying to beat it out of me. I dropped out of college despite loving my teachers and the college itself, both because I didn’t like my major (the school was more like a trade school; we chose our majors before we even got to the college) and because I had never learned how to learn in the previous 12 years of school. I learned how to hold information just long enough to spit it out on the test and then forget it to make room for the next set for the next test. Actually learning how to find information and internalize it through experience came after I left school.

  • fubo@lemmy.world · +21/−1 · 1 year ago

    GPT is not equipped to care about whether the things it says are true. It does not have the ability to form hypotheses and test them against the real world. All it can do is read online books and Wikipedia faster than you can, and try to predict what text another writer would have written in answer to your question.

    If you want to know how to raise chickens, it can give you a summary of texts on the subject. However, it cannot convey to you an intuitive understanding of how your chickens are doing. It cannot evaluate for you whether your chicken coop is adequate to keep your local foxes or cats from attacking your hens.

    Moreover, it cannot convey to you all the tacit knowledge that a person with a dozen years of experience raising chickens will have. Tacit knowledge, by definition, is not written down; and so it is not accessible to a text transformer.

    And even more so, it cannot convey the wisdom or judgment that tells you when you need help.

  • ISometimesAdmin@the.coolest.zone · +19/−1 · 1 year ago

    Joke’s on you, the stuff that my college tried to teach me was obsolete a decade before I was even born thanks to tenured professors who never updated their curriculum. Thank fuck I live in the Internet age.

  • cyd@lemmy.world · +14/−1 · 1 year ago

    Claiming modern day students face an unprecedentedly tumultuous technological environment only shows a bad grasp of history. LLMs are cool and all, but just think about the postwar period where you got the first semiconductor devices, jet travel, mass use of antibiotics, container shipping, etc etc all within a few years. Economists have argued that the pace of technological progress, if anything, has slowed over time.

    • thallazar@lemmynsfw.com · +6 · 1 year ago

      I don’t think that latter statement is right, and if you’ve got some papers I’d love to read them. I’ve never heard an economist argue that. I have heard them argue that productivity improvement is declining despite technological growth, though; more that it’s decoupling from the underlying technology.

      • cyd@lemmy.world · +3/−2 · 1 year ago

        Robert Gordon and Tyler Cowen are two economists who have written about the topic. Gordon’s writings are based on a very long and careful analysis, and have influenced and been cited by people like Paul Krugman. Cowen’s stuff is aimed at a more non-academic audience. You should be able to use that as a starting point for your search.

  • eestileib@sh.itjust.works · +9 · 1 year ago

    In theory, you go to college to learn how to think about really hard ideas and master really hard concepts, to argue for them honestly, to learn how to critically evaluate ideas.

    Trade schools and apprenticeships are where you want to go if you want to be taught a corpus of immediately useful skills.

  • geosoco@kbin.social · +8 · 1 year ago

    This has arguably always been the case. A century ago, it could take years to get something published into book form such that it could be taught, and even then it could take an expert to interpret it for a layperson.

    Today, an expert can not only share their research, they can do interviews and make TikTok videos about a topic before it has even been published. If it’s valuable, 500 news outlets will write clickbait, and students can do a report on it within a week of it happening.

    A decent education isn’t about teaching you the specifics of some process or even necessarily the state-of-the-art, it’s about teaching you how to learn and adapt. How to deal with people to get things accomplished. How to find and validate resources to learn something. Great professors at research institutions will teach you not only the state-of-the-art, but the opportunities for 10 years into the future because they know what the important questions are.

  • AutoTL;DR@lemmings.world (bot) · +1 · 1 year ago

    This is the best summary I could come up with:


    In an essay, Hyams shared his top concerns around AI — one of which is how technologies like OpenAI’s ChatGPT will affect the job market.

    “With AI, it’s conceivable that students might now find themselves learning skills in college that are obsolete by the time they graduate,” Hyams wrote in the essay.

    “The higher the likelihood that a job can be done remotely, the greater its potential exposure is to GenAI-driven change,” the researchers wrote, referring to generative artificial intelligence.

    The CEO’s thoughts on AI come as labor experts and white-collar workers alike become increasingly worried that powerful tools like ChatGPT may one day replace jobs.

    After all, employees across industries have been using ChatGPT to develop code, write real estate listings, and generate lesson plans.

    For instance, Hyams said that Indeed’s AI technology, which recommends opportunities to its site visitors, helps people get hired “every three seconds.”


    The original article contains 463 words, the summary contains 148 words. Saved 68%. I’m a bot and I’m open source!

  • SatanicNotMessianic@lemmy.ml · +1 · 1 year ago

    Ultimately we’re running into uptake as the limiting factor. It’s not going to be a question of how quickly new techniques get turned out, but rather how quickly they can be effectively applied by the available communities. It’s going to start with the new tech being overhyped: less-than-savvy managers demanding that the new product have X as a feature, VCs piling in because of FOMO, and then the gap between promise and capability running straight into the fact that no one knows what they’re doing with the new tech, especially when it comes to integrating with existing offerings.

    It’s going to cause a demand shock.