Oh my god it’s “visual programming” all over again. This is such a monumentally stupid take that fundamentally misunderstands and denigrates what software engineering is even about, I can’t even–
Actually, I like job security. Yeah, that’s right, coding is easy and dumb, don’t bother to learn kids! In unrelated news here’s my business card…
I could see a situation where logic and math classes were all that was needed, because the AI could generate the code from whatever the logic required.
If you can gather requirements and lay out a detailed and perfectly accurate flowchart, then congratulations: you’ve just programmed. It’s done. The first difficult part is over. Translating that flowchart into machine code is easy, tons of tools already do that visually or however you want it and LLMs are just an additional tool for this.
Then there’s the second difficult part of a project’s lifecycle: debugging, maintenance, and support. Where, again, AI can help at specific points as part of the toolbox, but most of those tasks don’t require writing (a lot of) code.
All the senior software engineers I know spend, optimistically, 20% of their time actually “writing code”. That’s your upper limit on the efficiency gains of LLMs for higher-level software engineering. Saying LLMs will replace programmers is like saying CAD software will replace architects.
Yeah, this is what I mean though, and took it to be what he meant: code itself becomes irrelevant. I know tons of people who take coding classes, and students who ask if you need to learn math to code and are told no. I don’t consider myself a coder because, to me, I script, and I always move the goalposts for deciding when I’m really coding.
I wonder if we could replace jumped up CEOs with LLMs
Didn’t a company in ~~Korea~~ Poland do just that?… Looking for a link
Edit: Polish company: https://www.businessinsider.com/humanoid-ai-robot-ceo-says-she-doesnt-have-weekends-2023-9?op=1
How does an AI go golfing with Senators in exchange for special favors? How does it host lavish yacht parties for Supreme Court justices?
A future where literally no one knows how their software works sounds terrible
My takeaway from this whole thing is “Become dependent on us.”
That’s literally the entire end goal of supra-human organizations like governments and corporations, and why the American constitution is built upon reining in their powers. Obviously their system has become corrupted in recent times, but that was foreseen and accounted for.
The Apple way… I think.
Senior dev here: it’s already like that.
I know how my code, and the code of everyone under me, works. I have a good idea of how the systems I run the end result on work.
But under that layer there are several more layers between that and hardware that I only have a faint idea of.
Everybody in tech relies on some level of abstraction. Crazy-smart hardware engineers are no exception, they’re not dealing with fundamental electricity stuff. They wouldn’t be able to deliver the advanced chip designs we take for granted today if they were.
This was already true before LLMs.
Don’t worry about intellectual work, the best way to support your family today; you’re now free to toil in the fields, doing jobs we give migrants for minimum wage plus board in shitty shared bunkhouses!
toil in the fields
I am unironically building myself a woodshop so I can retire from tech into a life of making real tangible objects, with no JIRA tickets, no kanban boards, and no stand-ups.
My plan is to raise pigs. Because I’ve raised them before and they’re so much easier to deal with than users.
Wilbur, did you email the CEO directly to complain that you lost all your progress when you were using the search bar as a notepad, and now you’ve lost a day’s work and need us to recover it?
The neat part is when they get too annoying you can kill and eat them.
My supervisors discouraged this sort of thing with users.
My plan is to use my technical know-how to build stuff that actually matters (which is all very subjective).
It is my opinion that it’s a shame for capitalist societies that people spend a lifetime building strong skills in something, only to throw them all away at the first opportunity.
I so so wish people with a strong VFX expertise here in Canada would be able to keep doing excellent VFX work instead of going for mediocre middle-manager careers because it’s the only way they can ever hope to live a life worth living.
But I respect folks for whom software engineering is just a way to earn a living. All the power to you guys. We desperately need you anyway, even if it’s not a lifelong commitment.
… On another note, though, I can tell you that I will gladly throw away my Jira skills as soon as I can.
Factories are hiring for $21+ an hour for basic work, inside, usually climate controlled. Also, my health insurance is $25/mo with good coverage and includes a free gym membership. Where I live you can buy a decent house for $100k–$200k. Line mechanics make $60k–$80k with basic knowledge of how a PLC works. Management is going to get dropped for AI; fuck those guys.
In other news: The CEO of Nvidia doesn’t understand how programming works.
I remember when Bill Gates said: 512K of RAM will be more than enough forever.
And even then, you can probably share it between all seven computers.
I like how the takeaway is not “Once AI exceeds our ability to understand or compete with it, humans will not be the ones building or controlling it anymore. We’re in for a very very different future than what currently exists, so we better be pretty fuckin responsible with what trajectory is dialed in at the moment that that happens, because once the line is crossed there’s no going back. I mean, we could maybe have a conversation about whether doing this is even a good idea in the first place, but the possibility of preventing it seems more vanishingly remote with every passing year, so at least we could make an emergency crash priority out of AI safety, like a couple of years ago ideally but definitely right now.”
No, the takeaway is “Hey guys here’s some career advice for the short term. I will not be taking questions concerning anything after that. Hey we made a new chip BTW.”
The underlying message is “buy our stock because our GPUs power AI”
I agree with you wholeheartedly. Powerful CEOs are not the almighty visionaries they want people to believe they are.
Sometimes, it’s just hyperbole meant to prop their business up.
I’ve seen this episode of Star Trek
Star Trek is the best case scenario.
We’ve already blown past Wall-E and are currently trending toward Robocop, Idiocracy, or Max Headroom, with an eye to achieving Terminator or The Matrix.
Or the Butlerian Jihad
I think Dune is what we get when we come out the other side of the Terminator/Matrix future.
The first time I saw The Matrix I was also reading Dune, and I always thought it was a prequel, especially to the David Lynch film with all the black rubber.
I think we’re on an accelerated timeline to Judge Dredd
CEO / management stupidity and ignorance at its finest
Keep in mind that he represents a company that hopes to dominate a market where AI runs on their GPUs. So he’s not exactly an unbiased source of information. That said, AI is poised to heavily impact 80% of ALL jobs on the planet within the next 10 years. It isn’t just coders that corpos want to replace with AI, it’s everyone. We’re going to see some crazy shit in the years to come.
It will replace junior level devs but you will still need people supervising it and doing systems level design and integration. And those people will need to know how to code. Software tools which abstract core knowledge and first principles don’t actually negate the need to know these things.
If anything this will make deep engineering and domain knowledge even more valuable, as AIs will replace a lot of the amateurish side of dev work these days, but the humans who do remain in that loop will require a much greater level of expertise.
Guy who benefits from spreading misinformation about AI that runs on his GPUs spreads misinformation.
Is there a future where AI could probably code itself? I wouldn’t be surprised, but I highly doubt even then you would completely wipe out coding as a field because I am pretty sure you would need people who knew how to code to update new syntax and logic for the AIs. Not to mention, I don’t see a chance in hell that CEOs would actually be fine with not controlling how an AI codes and develops, and would have a human eye on it to ensure it’s doing what they want.
But maybe AI could legitimately replace CEOs, since it’s already geared toward digesting large datasets; just give it the ability to determine the day-to-day operations for the company based on the numbers.
Is there a future where AI could probably code itself?
If we actually achieve AI that understands what it is doing, then absolutely yes. And then, the gloves come off. They’ll make improvements to make themselves smarter, and it’ll cascade until they get to a point where they look at us and wonder if we are intelligent by their higher standards.
However, mark my words: most CEOs and other senior execs could be replaced, if not today, then in the near future, by today’s ML, and well before programmers can be replaced. I believe this is going to be a surprising and sharp lesson for them, because any board that recognizes an AI can give them similar ROI without the high cost and golden parachutes won’t hesitate to replace the C-suite. It’s more cost effective to ditch Bezos and keep several hundred developers than the other way around. All it’ll take is one high-profile case, and it’ll start a cascade. If I were a senior exec, I’d be sweating about the near-to-mid future of my career right now.
Google laid off most of its AI engineers 1-2 years ago because they taught AI to code AI. So the future is now, old man!
Professionals will say that AI can’t replace coding because they’ll never be able to replace the human element of critical thinking and interpreting client needs. I say AI will replace coding because CEOs are dipshits.
This is why there is a fundamental difference between coding and engineering, even though that line has gotten pretty blurry recently. AI will absolutely replace coding. It’s already doing so. It will be a bigger lift to actually replace human engineers doing domain specific R&D work. The AIs will just become another tool in that workflow, like calculators or CAD software.
It actually hasn’t. IDK which sector you’re working in, but ChatGPT has been useless in the overwhelming majority of cases for me.
Hallucinating functions that don’t exist, generating broken shit… And I gave it a fair shot. I could use a docs synthesizer right now. But it feels like even that is too advanced for the current tech.
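For what it’s worth, that “functions that don’t exist” failure mode often comes from models porting an API across languages. A minimal sketch of how you can mechanically sanity-check a suggested name before trusting it (the hallucinated name here is hypothetical, modeled on C#’s `Path.Combine` leaking into Python, where the real function is `os.path.join`):

```python
import os.path

# A suggestion from a model (hypothetical example of a hallucinated name):
suggested = "combine"   # doesn't exist in Python's os.path
real = "join"           # the actual stdlib function

# hasattr() tells you whether the module really exposes that attribute.
print(hasattr(os.path, suggested))  # False: the suggested function doesn't exist
print(hasattr(os.path, real))       # True
```

Obviously this only catches nonexistent names, not subtly wrong logic, which is the harder half of the problem.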
The other use case I can foresee for it is as a fancy static analyzer. But the tech is not there yet. I guess we’ll have to rely on formally proven stuff for at least a bit more.