Technical debt is the number one cause of developer frustration. Working with imperfect systems demoralizes programmers, making it difficult to do quality work.
I’d wager not being given time to tackle technical debt is indeed frustrating…
It’s hilarious when the identified problems come back around to bite the organization, when the priorities have been to work on poorly specc’d features instead.
But then it is the developers’ fault, never management
Seen a lot of that too. Execs who think all the devs are idiots and would be lost without their genius guidance, phoned in from a luxury remote location while all of us have to return to the office full time. Then stuff fails and we “pivot” to the next badly thought out fiasco. I guess it pays the bills.
I don’t care what your fancy RAMrod doohickeys say, Johnson! We need that system up tomorrow so we can reach our quarterly earnings projections for the shareholders’ meeting!
Yeah, that’s probably more the issue. We’ve seen too many times throwaway code become production code because “it works already, we need to move forward”.
The secret is just to do it anyway. I have yet to work in a job where anyone actively stopped me fixing technical debt, even if they never asked me to do it.
Depends on the workload. The company should make time for that and you should get paid for it.
Use overestimation padding, eh?
I keep seeing a pattern of SRE/devops/sysadmin tasks being given back to developers and canning the SREs. Hard to understand why. Then some of the SWEs get stuck basically focusing on infra SRE stuff and become unwilling SREs, more or less. Circle of life? Do the old devops folks get made into glue or something?
Do the old devops folks get made into glue or something?
If I interpreted the “trend” correctly, “devops” was bastardized away from its original meaning to now mean “sysadmin”, at least in most cases.
Yeah. A “DevOps” is just a “sysadmin” who can pretend they don’t hate all developers for stretches of 20 minutes at a time. (I’m kidding. I know our SysAdmins love us… In their own secret ways.)
Well, I imagine there’s a continuum of experiences and definition differences across the industry. Similar to how “product manager” is different at each place. What I saw back in the early 2000s or so was that “SRE”, and the word “engineer” in general, used to be handed out sparingly. An “SRE” was a sysad/devops who had the ability to commit code to fix a product instead of just opening a bug and waking an engineer. An “engineer” committed compiled code, not short scripts. But then everyone and their cousin became “SRE” whether they deserved it or not, and everyone got an “engineer” title. I’ve seen manual QA folks who were unironically called “engineers”. QA is dead now and QE is barely hanging on, and SRE seems to be dying too. Not sure what’s next, maybe just overpriced cloud GUI tools and that’s the end of it. And SREs can go be high school comp sci teachers. And SWEs can wake up and fix their own bugs and hate their lives.
Devops is going the way of qa too.
If I interpreted the “trend” correctly, “devops” was bastardized away from its original meaning to now mean “sysadmin”, at least in most cases.
I don’t think I agree. The role of a sysadmin involved a lot of hand-holding and wrangling low-level details required to keep servers running. DevOps is something completely different: they handle specific infrastructure such as pipelines and deployment scripts, and are in the business of not getting in the way of developers.
The original intent of “devops” was that you don’t have an “operations” department separate from the teams “developing” your product/software, because the two have competing incentives. “Dev” wants to push new stuff out faster; “ops” wants to keep things stable. Or “dev” needs more resources, but “ops” blocks the request or doesn’t scale at the same pace. The idea was to combine both “dev” and “ops” people onto projects to balance these incentives.
Then managers and cloud clowns repurposed it to apply to every person in a project, so now every member is expected to perform both roles (badly). Or overloaded it even further to somehow refer to “developer infrastructure” teams.
It is. Source: We’ve had the same issues for years, but never get any time allotted to fix them.
My boss legit says that he will give me some time to work on it every 2-3 months and then drops a “customer requires X feature and I promised that we will deliver in one week”. And mind you, we have to patch up to 3 major versions into the past to backport the new feature because clients haven’t upgraded and won’t in the near future… which means sometimes our major releases are 60-70% the same as our minor patches for old versions. Semvering much?
I feel burnt out on professional development, but at least for me tech debt is not the issue. Everything is imperfect after a while, because requirements change all the time and overall it’s not me accruing the debt. That’s why I don’t care.
I would say 80% of employees are unhappy, but I don’t have any data to back this up.
Can confirm. Was quite unhappy in my mechanical engineering job, had an opportunity to develop something nice in python, was told we’d do it in excel/vba instead, still unhappy.
was told we’d do it in excel/vba instead, still unhappy.
I just threw up in my mouth a little. Fifteen years ago, “I’ll stick to Excel” was a (bad, but) defensible position in data automation. Today that’s just insanity.
I’m still in a mechanical engineering world so just saying INT and FLOAT has people running away. Excel is the “safe zone” for them, sadly it means that I’ll just be doing the VBA part and oh gawd please get me out of here…
Yeah. I get that. Gotta do what you gotta do!
I’ve made some progress at organizations like that by setting up a private workflow in Python “just to check my work”.
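Something as small as the following is usually enough to start with. This is just a minimal sketch of that kind of “check my work” script; the file name and the Quantity/UnitPrice/Total columns are made up, and it assumes pandas with openpyxl installed:

```python
# Hypothetical cross-check of a spreadsheet calculation in Python.
# File name and column names are invented for illustration.
import pandas as pd  # pip install pandas openpyxl

df = pd.read_excel("quotes.xlsx", sheet_name="Sheet1")

# Recompute the value the workbook derives with a formula...
expected_total = df["Quantity"] * df["UnitPrice"]

# ...and flag the rows where the spreadsheet disagrees with the recomputation.
mismatches = df[(df["Total"] - expected_total).abs() > 0.01]

if mismatches.empty:
    print("Spreadsheet matches the Python check.")
else:
    print("Rows that don't match:")
    print(mismatches)
```

Nothing fancy, but once the numbers agree a few times, it tends to open the door to doing more of the real work in Python.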
oh no…
Nice. You can put that on your resume so you can get more of those kinds of jobs.
(/s. I like excel to a point but I really feel your pain too-- and fuck vba)
Excel has Python support now! You may still get away with it.
It’s cloud based though… Not ideal. I get why they had to do that (they didn’t want to expose people to the Python infra shit show) but it’s still kind of a shame.
Would be better if they added Typescript support IMO.
…like the js infra stuff isn’t its own special nightmare?
It’s significantly less of a nightmare and Deno is downright pleasant.
Every job lately seems to have been infected by Meta/Google “data driven” leadership. It’s so painful and wasteful sometimes.
Every job lately seems to have been infected by Meta/Google “data driven” leadership. It’s so painful and wasteful sometimes.
It’s cargo cult mentality. They look at FANGs and see them as success stories, and thus they try to be successful by mimicking visible aspects of how FANGs do things, regardless of whether they have the same context or whether it even makes sense.
I once interviewed for a big-name non-FANG web-scale service provider whose recruiter bragged about their 7-round interview process. When I asked why on earth they need 7 rounds of interviews, the recruiter said they had optimized the process down from the 12 rounds of interviews they did in the past, and they do it because that’s what FANGs do. Except FANGs typically do 4, with the last being an on-site.
But they did 7, because FANGs. Disregard “why”.
20 years ago it was the people who worshipped Jack Welch, not realizing (or not caring) that he was running GE into the ground.
The Behind the Bastards podcast covered Jack Welch, definitely worth a listen.
Yeah. I, like most leaders, spent some time learning all that crap. It was awful and worse than useless.
Google and Meta’s secrets are recruiting top talent for top dollar, and then buying every startup that threatens their empire. There’s no secret to great management to be had there.
I just threw out my copy of “product engineering at Google”.
Or maybe 80% of people are unhappy. No data here either
80% seems too high, but the US Surgeon General declared a loneliness epidemic https://www.hhs.gov/sites/default/files/surgeon-general-social-connection-advisory.pdf
And Gallup claims that 29% of Americans have been diagnosed with depression at one point: https://news.gallup.com/poll/505745/depression-rates-reach-new-highs.aspx
So… That is not good. It is almost like humans evolved to live in tight-knit, walkable communities.
And Gallup claims that 29% of Americans have been diagnosed with depression at one point:
That really doesn’t mean anything. The only requirement for succumbing to depression is being alive, because all it takes is something bad happening in your life (loss of a friend, loved one, even a pet, etc.) to fall into a pit of despair.
Joke’s on Gallup, I’ll be dead before I’m formally diagnosed with depression
80% of beings in the multiverse
Will AI steal their jobs? 70% of professional programmers don’t see artificial intelligence as a threat to their work.
If your job can be replaced with GPT, you had a bullshit job to begin with.
What so many people don’t understand is that writing code is only a small part of the job. Figuring out what code to write is where most of the effort goes. That, and massaging the egos of management/the C-suite if you’re a senior.
If your job can be replaced with GPT, you had a bullshit job to begin with.
This one’s funny to me, because the people who WILL try to replace you with GPT don’t care if they CAN replace you with GPT. They just will.
Look at how it’s haphazardly shoved into everything for no reason whatsoever already.
Business fails, next business pops up.
Yep! And we’re in the big tech era, so it can also be:
Business fails to produce any value and uses its influence to prevent the next business from popping up.
Business without value has influence?
Yes.
Google, Microsoft, Netflix, Amazon. None would still be in business after their recent decisions, if not for their market-dominating capital size.
That is, their recent decisions provide no value to anyone else, and are made solely because they can, due to their size and the anti-competitive practices they have been allowed to get away with.
Automation is always incremental.
I’m an accountant. Components of the job have been getting automated or systemised for many decades. Most of the tasks that occupied a graduate when I was one 20 years ago don’t exist anymore.
Not because AI is doing those tasks, but just because everything became more integrated; we configure and manage the flow of data rather than making the data, you might say.
If you had to hire 100 professional programmers in the past, but then AI makes programmers 10% more efficient than previously, then you can do the same work with 91 programmers.
That doesn’t mean that 9 people were doing something that an LLM can do, it just means that more work is being completed with fewer programmers.
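To spell out the arithmetic in that hypothetical (same total output, each programmer producing 10% more):

```python
# Purely the hypothetical numbers from the comment above, nothing measured.
headcount = 100
efficiency_gain = 0.10  # each programmer now produces 10% more

# The same total output therefore needs fewer people:
needed = headcount / (1 + efficiency_gain)
print(round(needed))  # 91 (100 / 1.1 ≈ 90.9)
```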
If you had to hire 100 professional programmers in the past, but then AI makes programmers 10% more efficient than previously, then you can do the same work with 91 programmers.
You’ve nailed the root of the misunderstanding by non-programmers. We’re already optimized past that target.
Some people think we type all day. We don’t. We stare at our screens saying “what the fuck?!” for most of the day. That is especially true for the best programmers doing really interesting work.
There’s maybe three living humans who actually know how to correctly build a Windows installer. One of those three is paid to sell software to automate the task for everyone else. The other two retired already. (One is hiding out as a bartender and claims to not speak any English if recognized from their MSI days.)
Pick an interesting topic in programming, and you’ll find similarly ludicrous optimization.
There’s a few hundred programmers building all banking automation, selling it to millions of bank employees.
It’s possible that AI will force a dozen people to stop doing banking automation. It’s a lot more likely that the backlog of unmet banking automation need will instead just get very slightly smaller.
Now, the reality of the economics won’t stop CIOs from laying off staff and betting that AI will magically expand to fill the gap. We’re seeing that now. That’s called the “fuck around” phase.
But we’ve seen “this revolutionary technology will make us not need more programmers” before (several times). The outcomes, when the dust settles, are:
- The job is now genuinely easier to do, at least for beginners. (Senior professionals had access to equivalent solutions, before everyone else got excited.)
- More people are now programmers. (We laid a bunch of them off, and we meant to not hire any back, but it turned out that our backlog of cool/revolutionary/necessary ideas was more important to leadership than pinching pennies.)
- A lot of work that was previously ignored completely now gets done, but done very badly by brand new programmers. (We asked the senior developers to do it, but they said “Fuck you, that’s not important, make the new kid do it.” I think they’re just still cranky that we spent three years laying off staff instead of training…)
- The average quality of all software is now a bit worse, but there’s a lot more variety of (worse) software now available.
To add to this, this doesn’t necessarily mean that there are fewer programming jobs in total. If people work 10% more efficiently, the cost of labor is only 91% of what it was before, meaning that people might be able to afford to finance more programming projects. What does matter is, for example, things like entry-level jobs disappearing or the nature of the work changing. Doing less boring gruntwork can make the job more fun, but otoh digitization sometimes results in the worker having less agency in what they do, since they have to fit everything into a possibly inflexible digital system.
But that is always happening. Software that can now be built by two programmers needed IBM a few decades ago, just because of hardware, languages, available libraries and shared knowledge.
But we still have so many “app ideas” that there is more work to be done. I would be happy to have AI write all those apps that I need but have no time or money to make myself.
My conclusion is that it is only about money and the economy. We are in an unofficial recession, so everyone is cutting costs; as soon as the money comes back, we will go back into a bulking/exploration phase.
If all you bring to the job is looking shit up and telling me yes or no instead of actually trying to help me find solutions, or explaining to me what I did wrong, you’re just a glorified robot. You’re in line for replacement and you’ll fucking deserve it. At least that’s what I wanna say to “the computer said” people.
There’s a lot of management being like “we gotta hit this deadline (that we made up)” combined with “if I hit all my targets and put in some overtime, the boss can buy another sports car this year”
I don’t want to work extra to make someone else richer. Maybe if I had a shitload of shares. Maybe. But I don’t. So I do my job with professional standards, but I’m not doing 12-hour days
Indeed, professional standards and 12h days are not compatible.
Maybe it is just my experience, but in the last decade, employers stopped trying to recruit and retain top developers.
I have been a full time software engineer for more than a decade. In the 2010s, the mindset at tech giants seemed to be that they had to hire the best developers and do everything they could to keep them. The easiest way to do both was to be the best employer around. For example, Google had 20% time, many companies offered paid sabbaticals after so many years, and every office had catering once a week (if not a free cafeteria). That way, employees would be telling all of their friends how great it is to work for you and if they decide to look for other work, they would have to give up their cushy benefits.
Then, a few years before the pandemic, my employer switched to a different health insurance company and got the expected wave of complaints (the price of this drug went up, my doctor is not covered). HR responded with “our benefits package is above industry averages”. That is a refrain I have been hearing since, even after switching employers. The company is not trying to be the best employer that everyone wants to work at, they just want to be above average. They are saying “go ahead and look for another employer, but they are probably going to be just as bad”.
Obviously, this is just my view, so it is very possible that I have just been unlucky with my employers.
I’ve kinda checked out of the private sector for this reason. I’ve been having a great time at a government job. Great benefits, union, etc… pay is about 80 percent of what others make but it’s more than enough to get by.
Man, I’d be happy with 80% of what I get for less stress and more security. What kind of government job specifically?
What kind of government job specifically?
Most of them. Certainly the ones that have unionized. If you know someone in the inside, they probably know if there’s a union.
You’ll see more unions in government work because, while private organizations breaking up unions is ethically questionable, governments breaking up unions is just openly totalitarian.
If I can’t negotiate with a private employer, I might be a wage slave, but I can ask the government for help.
If I can’t negotiate with my government job, it’s not actually a job, I’m just a slave.
I meant more specifically in OP’s case, but also which ones pay that much. When I looked locally (major city) all the G jobs were under 100k. Usually well under.
In the 2010s, the mindset at tech giants seemed to be that they had to hire the best developers and do everything they could to keep them.
Not really. The mindset was actually to hire skilled developers just to dry up the market, so that their competitors would not have skilled labour to develop their own competing products and services.
Then the economy started to take a turn for the worse, and these same companies noted that not only could they no longer afford to block their competitors from hiring people, but neither could their competitors. Twice the reasons to shed headcount.
It was not a coincidence that we saw all FANGs shed people at around the same time.
Man, not all of them are even trying to beat the average!
This is the first rule of sales. It is not important or necessary to be the best. It is only necessary to be slightly less shitty than your nearest competitor.
So, roughly 20% of developers have found the right mix of self-medication?
I’ve been programming for years, and I’m only happy when working on my own stuff. It’s like the difference between renting and owning
I’ve rented my whole adult life, and I don’t mind it. There are downsides, but owning has them too.
And I never got depressed because of renting.
I will rent debt free until I am dead
Spot-on analogy.
deleted by creator
So true.
And literally a line I use to recruit peers to try out learning to code.
“I’m afraid I’ll be unhappy.”
“You might be. Many of us are. But the extra money helps.”
The bloody managers are the biggest problem. Most don’t understand code, much less the process of making a software product. They force you into idiotic meetings where they want to change how things work because they “don’t have visibility into the process”, which just translates to “I don’t understand what you’re doing”.
Also, trying to force people who love machines (but people less so) into leading people is a recipe for unhappiness.
But at least the bozos at the top get to make the decisions and the cheddar for being ignorant and not listening.
The bloody managers are the biggest problem. Most don’t understand code, much less the process of making a software product.
So, I’ve had my eye on management and started doing some management training. The job of management really isn’t to do the work itself (or even to understand the work). That’s the job of specialists and technical leads. The job of management is to oversee the workforce (hiring, organizing teams, dictating process, allocating project time, planning mid- and long-term department goals, etc.), not to actually get your hands into the work itself.
It’s certainly helpful to understand coding broadly speaking. But I’m in an office where we’re supporting dozens of apps written in and interfaced with at least as many languages. Never mind all the schemas within those languages. There’s no way a manager could actually do my job without months (if not years) of experience in the project itself.
At the same time, the managers should understand the process of coding, particularly if they’re at the lower tier and overseeing an actual release cycle. What causes me to pull my hair out is managers who think hand-deploying .dlls and fixing user errors with SQL scripts is normal developer behavior and not desperate shit you do when your normal workflows have failed.
Being in a perpetual state of damage control and thinking that this is normal because you inherited from the last manager is the nightmare.
But at least the bozos at the top get to make the decisions and the cheddar for being ignorant and not listening.
Identifying and integrating new technologies is normal and good managerial behavior.
Getting fleeced by another round of over-hyped fly-by-night con artists time after time after time is not so much.
But AI seems to thread the needle. It’s sophisticated and helpful enough to seem useful on superficial analysis. You only really start realizing you’ve been hoodwinked after you try and integrate it.
Setting aside the absurd executive-level pay (every fucking corporate enterprise is just an MLM that’s managed to stay cash positive), it does feel like the problem with AI is that each business is forced to learn the lesson the hard way, because no business journal or news channel wants to admit that it’s all shit.
The thing that frustrates me about developers who feel powerless over technical debt is… who is actually stopping them from dealing with it? The way I see it, as a software engineer, your customer is sales/marketing/product/etc. They don’t care about the details or maintenance, they just want the thing. And that’s okay. But you have to include the cost of managing technical debt in the line items the customer wants. That is, estimate based on doing the right things, not taking shortcuts. Your customer isn’t reading your commits. If they were, they wouldn’t need you.
It would be bizarre if your quote for getting your house siding redone included line items for changing the oil on the work truck, organizing the shop, or training new crew members. But those costs of business are already factored into what you pay at the end of the day.
who is actually stopping them from dealing with it?
Management. Someone in management sets idiotic deadlines, then someone tells you “do X”, you estimate and come up with “it will take T amount of time” and production simply tells you “that’s too long, do it faster”
they don’t care about the details or maintenance
They don’t, they care about time. If there are 6 weeks to implement a feature that requires reworking half the product, they don’t care to know half the product needs to be reworked. They only care to hear you say that you’ll get it done in 6 weeks. And if you say that’s impossible, they tell you to do it anyway
you have to include the cost of managing technical debt
I do, and when I get asked why my time estimates are so long compared to those of other colleagues, I say I include the known costs required to develop the feature, as well as a buffer for known unknowns and unknown unknowns which, historically, has been necessary 100% of the time; whenever it wasn’t included, we ran into development difficulties, went over cost and over time, and ended up with delays and quality issues that caused internal unhappiness, sometimes mandatory overtime, and usually a crappy product that the customers are unhappy with. That’s me doing a good job, right? Except I got told to ignore all of that and only include the minimum time to get all of the dozens of tiny pieces working. We went over time, over cost, and each tiny piece “works” when taken in isolation but doesn’t really mix with everything else, because there was no integration time, and so each feature kinda just exists there on its own.
Then we do retrospectives in which we highlight all the process mistakes we ran into, only to do them all again next time. And I get blamed come performance review time because I was stressed and wasn’t at the top of my game in the last year, due to being chronically overburdened, overworked, and underpaid.
Yeah, management is totally backwards there; it’s like the building manager on a construction project going “all electrical needs to be done in X weeks”, when realistically declaring an arbitrary deadline gives them no direct control over whether it gets met. The unfortunate difference is that if you do a shitty job wiring a building, you’ll fail inspection and have to spend more time and money fixing it. Software can often hobble along; there’s no quality enforcement the business can’t legally ignore, so you’ll always have sad, defeated devs go “okay boss, we’ll skip the things we need to get this done faster for you (I hate this job and don’t care about the product’s long-term success)”. Having a steady supply of those people will slowly kill a software company.
In the past, I’ve dealt with estimate pushback not by explaining what necessary work can be removed (like tests, documentation, or refactoring), but by talking through ways to divide the project more effectively to get more people involved (up to a point, a la The Mythical Man-Month). That seems to go more productively. Then we look at nixing optional requirements. But I’ve also usually dealt with mostly competent engineering management.
Yeah, that’s what we did last time. I implemented a basic framework on top of a very widespread system in our codebase, which would allow a number of requested minor features to be implemented similarly, with the minimal amount of required boilerplate, leaving the bulk of the work to implementing the actual meat of the requests.
These requests were completely independent and so could be parallelized easily. The “framework” I implemented was also incredibly thin (basically just a helper function and a human instruction in the shape of “do this for this use case”) over a system that is preexisting knowledge for the team. My expectation was to have to bring someone up to speed on certain things and then let them loose on this collection of tasks, maybe having to answer some questions a couple of times a day.
Instead, since the assigned colleague is basically just a Copilot frontend, I had to spend 80% or more of my days explaining exactly what needed to be done (I would always start with the whys of things since the whats are derived from them, but this particular colleague seems uninterested in that).
So I was basically spending my time programming a set of features by proxy, while I was ostensibly working on a different set of features.
So yeah, splitting work only works if you also have people capable of doing it in the first place. Of course I couldn’t not help this colleague either; that’s a bad mark on the performance review, you know. Even when the colleague has no intention of learning or being productive in any way (I live in a country with strong employee protections, so almost nobody can be fired for anything concerning actual work performance, and this particular colleague doesn’t hide that they don’t care about actually doing a good job, except to managers, so they still get pay raises for “improving”).
Yeah, you can tell I’m unhappy
Yes, this. Refactor first to make the upcoming change easier and cleaner, not after. Don’t ask for permission, don’t even call it refactoring or cleanup. Just call it working on the feature, because that’s what it is. Don’t let non-engineers tell you how to engineer.
Yes, this! I rarely ask for permission on that sort of thing. I’ll just do it as part of my work and see if anyone calls me out on it.
I do this too, but I realize I’m privileged to be able to. In past jobs people actually would get pissed at me for doing it. I had a manager have a really shitty talk with me about it once. I’d guess a lot of people have bad experiences like that
If higher-ups complain about ill-timed code refactoring, it’s always a good idea to stop for a moment and start becoming less trigger-happy with refactors. It’s OK to take some time to determine what actual value refactors bring to the project in tangible terms - intuition is not enough. Convincing a critical manager is a good start, because their tolerance for programmer bullshit is low if they don’t actually write code.
Very often, and this is especially prevalent among junior programmers who care about what they do, the reasoning for refactoring turns out to be something along the lines of “I don’t like this” or “I read some cool blog article saying things should be done that way”, without any care about whether or not the change in question actually improves anything, or, if it does, whether the improvement is worth the degradation in terms of quality (new bugs) or maintainability (added genericity making the code more difficult to understand, cryptic features of the language being used that make it hard to understand what’s going on, I’m sure there are other examples…)
The problem is you often end up in cases where the developer cannot back up, with facts, their intuition that something is actually harmful. When it’s not just pure bikeshedding about code they don’t like and falsely claim to be a ticking time bomb, they fail to weigh the risks of leaving slightly off-putting code in the codebase against the risks associated with significant code changes in general, which, even with tests, will still inevitably break things.
Developers of all sorts tend to vastly overestimate how dangerous a piece of code may be.
To be clear, while I’ve seen it with other developers, I’m still guilty of this myself to this day. I’m not saying I’m any better than anybody.
It’s just that I’ve seen how disruptive refactoring can be, and, while it is often necessary, I thought it would be important to mention that I think it should be done with care.
If you can convince a manager with rational arguments in terms of product quality, it can be a good way to make the case for a refactor, because your manager probably won’t be impressed by arguments about unimportant nuances we developers obsess about.
The joy begins when you know you should refactor the whole project from the ground up…
I believe that at many companies, developers work on giant codebases with many hundreds of thousands or even millions of lines of code.
With such a large codebase you have no control over any system, because control is split between groups of devs.
If you want to refactor a single subsystem, it takes coordination of all the groups working on that part and will halt development, probably for months. But first you have to convince all the management people that the refactor is needed, which on its own could take an eternity.
So instead you patch it on your end and call it a day.
So instead you patch it on your end and call it a day.
Yep!
I’m looking forward to the horror stories that emerge once some percentage of those changes are made solely by unmanaged hallucination-prone AI.
I would feel bad for the developers who have to clean up the mess, but honestly, it’s their chance to make $$$$$$ off of that cleanup. If they manage not to, their union is completely incompetent.
I feel blessed that I like my current job. Good manager, interesting work, limited amounts of bureaucracy. Most of this is a lucky coincidence but there are some things we can do. I had to explain many times to people which tasks I’m good at and which ones they should ask other people to do. I regularly defend this position. I set aside the morning for creative work only, no meetings, no admin, just thinking and solving. In the afternoon I down tools and do something physical, outside in daylight. A regular sleep cycle is absolutely critical for the maintenance of health and mood. Fresh food and companionship. Regular meditation. Do the basics well. These are the things that have made me happy.
Sounds like you’re well paid and your time is valued, I imagine most devs would be happy with that!
Reinforcing my headcanon that everyone is.
Yay, horror story time!