Hi,

My question certainly stems from the imposter syndrome I’m living with right now, for no good reason, but when looking for solutions to embedded C problems, I come across a lot of posts from people who have a deep understanding of the language and of how an MCU works at the machine-code level.

When I read these posts, I do understand what the author is saying, but it really makes me feel like I should know more about what’s happening under the hood.

So my question is this: how do you rate yourself in your most-used language? Do you understand the subtleties and nuances of your language?

I know this doesn’t necessarily make me a bad firmware dev, but damn does it make me feel like one when I read these posts.

I get that this is a subjective question without a single good answer, but I’d be interested in hearing about different experiences in the hope of reducing my imposter syndrome.

Thanks

  • 𝕽𝖚𝖆𝖎𝖉𝖍𝖗𝖎𝖌𝖍 · 19 hours ago

    I should know more about what’s happening under the hood.

    You’ve just identified the most important skill of any software developer, IMO.

    The three most valuable topics I learned in college were OS design basics, assembly language, and algorithms. They’re universal, and once you have a grasp on those, a lot of programming-language specifics become fairly transparent.

    An area where those don’t help is paradigm specifics: there’s theory behind functional programming and OO programming which, if you don’t understand it, won’t impede you from writing in those paradigms, but will almost certainly result in really bad code. And, depending on your focus, it can be necessary to have domain knowledge: financial, networking, graphics.

    But for what you’re talking about, those three topics cover most of what you need to intuit how languages do what they do - especially C, because it’s only slightly higher level than assembly.

    Assembly informs CPU architecture and operations. If you understand that, you mostly understand how CPUs work, as much as you need to as a programmer.
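    For instance, here’s a trivial C function with the rough ARM Thumb instructions a compiler might emit shown as comments (the exact output is hypothetical and depends on compiler, target, and flags), just to show how thin the layer is:

    ```c
    /* C that maps almost one-to-one onto machine operations. */
    int add3(int a, int b, int c)
    {
        return a + b + c;
        /* Rough ARM Thumb equivalent (hypothetical compiler output):
         *   adds r0, r0, r1   ; a + b   (args arrive in r0-r2)
         *   adds r0, r0, r2   ; ... + c
         *   bx   lr           ; return value in r0
         */
    }
    ```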

    OS design informs how various hardware components interact, again, enough to understand what higher level languages are doing.
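    As a concrete illustration of that hardware interaction, here’s what a higher-level “print a character” ultimately boils down to on a bare-metal MCU. The register layout and addresses below are made up for the sketch; real ones come from the part’s reference manual:

    ```c
    #include <stdint.h>

    /* Hypothetical memory-mapped UART registers (addresses invented
     * for illustration; a real MCU's reference manual gives the actual
     * map and bit definitions). */
    #define UART_STATUS (*(volatile uint32_t *)0x40001000u)
    #define UART_DATA   (*(volatile uint32_t *)0x40001004u)
    #define TX_READY    (1u << 0)

    static void uart_putc(char c)
    {
        while (!(UART_STATUS & TX_READY)) {
            /* spin until the transmit buffer has room */
        }
        UART_DATA = (uint32_t)c;  /* writing the register sends the byte */
    }
    ```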

    Algorithms… well, you can derive algorithms from assembly, but a lot of smart people have already done a ton of work in the field, and it’s silly to try to redo that work. And, unless you’re very special, you probably won’t do as good a job as they’ve done.
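    In C terms, that usually means reaching for the standard library or a vetted implementation instead of hand-rolling; a minimal sketch:

    ```c
    #include <stdio.h>
    #include <stdlib.h>

    /* Comparator for qsort; the (x - y) subtraction trick can overflow,
     * so compare explicitly instead. */
    static int cmp_int(const void *a, const void *b)
    {
        int x = *(const int *)a, y = *(const int *)b;
        return (x > y) - (x < y);
    }

    int main(void)
    {
        int v[] = { 42, 7, 19, 3 };
        size_t n = sizeof v / sizeof v[0];

        qsort(v, n, sizeof v[0], cmp_int);  /* someone else's tuned sort */
        for (size_t i = 0; i < n; i++)
            printf("%d ", v[i]);
        putchar('\n');
        return 0;
    }
    ```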

    Once you have those, all languages are just syntactic sugar. Sure, the JVM has peculiarities in how its garbage collection works; you tend to learn that sort of thing from experience. But a hash table is a hash table in any language, and they all have to deal with the same fundamental issues of hash tables: hashing, conflict resolution, and space allocation. There are no shortcuts.
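    To make those three issues concrete, here’s a toy chained hash table in C. It’s a sketch (fixed bucket count, no resizing, no deletion, no error handling), not production code:

    ```c
    #include <stdlib.h>
    #include <string.h>

    #define NBUCKETS 64

    struct entry {
        char *key;
        int value;
        struct entry *next;   /* chain of keys that collided on this bucket */
    };

    static struct entry *buckets[NBUCKETS];

    /* hashing: djb2 over the key string */
    static size_t hash(const char *s)
    {
        size_t h = 5381;
        while (*s)
            h = h * 33 + (unsigned char)*s++;
        return h % NBUCKETS;
    }

    /* conflict resolution: walk the chain; space allocation: malloc a node */
    static void put(const char *key, int value)
    {
        size_t i = hash(key);
        for (struct entry *e = buckets[i]; e; e = e->next)
            if (strcmp(e->key, key) == 0) {
                e->value = value;     /* key already present: update in place */
                return;
            }
        struct entry *e = malloc(sizeof *e);
        e->key = malloc(strlen(key) + 1);
        strcpy(e->key, key);
        e->value = value;
        e->next = buckets[i];         /* push onto the bucket's chain */
        buckets[i] = e;
    }

    static int *get(const char *key)
    {
        for (struct entry *e = buckets[hash(key)]; e; e = e->next)
            if (strcmp(e->key, key) == 0)
                return &e->value;
        return NULL;                  /* not present */
    }
    ```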

    • Croquette@sh.itjust.works (OP) · 17 hours ago

      Thanks for the input; it gives me a lot to think about in terms of how to build the skills I need.

      I’d say I am decent with FreeRTOS which is pretty much just a scheduler with a few bells and whistles.
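      For example, most of what I write has roughly this shape. xTaskCreate/vTaskStartScheduler are the real FreeRTOS calls, but toggle_led() is a hypothetical board-specific helper and the stack/priority values are placeholders:

      ```c
      #include "FreeRTOS.h"
      #include "task.h"

      extern void toggle_led(void);  /* hypothetical board-specific helper */

      static void blink_task(void *params)
      {
          (void)params;
          for (;;) {
              toggle_led();
              vTaskDelay(pdMS_TO_TICKS(500));  /* yield back to the scheduler */
          }
      }

      int main(void)
      {
          xTaskCreate(blink_task, "blink", configMINIMAL_STACK_SIZE,
                      NULL, tskIDLE_PRIORITY + 1, NULL);
          vTaskStartScheduler();  /* does not return once tasks are running */
          for (;;) {}             /* only reached if the scheduler failed to start */
      }
      ```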

      I haven’t used assembly in a long while, so I know where to look to understand the instructions, but I can’t tell right off the bat what a chunk of assembly code does.

      Algorithms, I am terrible at, because I rarely use them. I haven’t worked on a big enough project where an algorithm was needed. I tend to work with finite state machines, which are close to algorithms, but not quite the same thing. And a big part of my job is interfacing peripheral chips for others to use.
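      Something like this switch-based pattern is my bread and butter; a minimal sketch, where the frame format ('$'-prefixed, newline-terminated) is invented for illustration:

      ```c
      /* Byte-at-a-time parser for a made-up frame format:
       * '$' starts a frame, '\n' ends it. */
      typedef enum { WAIT_START, IN_FRAME, FRAME_DONE } state_t;

      static state_t step(state_t s, char byte)
      {
          switch (s) {
          case WAIT_START:
              return (byte == '$') ? IN_FRAME : WAIT_START;  /* sync on start byte */
          case IN_FRAME:
              return (byte == '\n') ? FRAME_DONE : IN_FRAME; /* consume payload */
          case FRAME_DONE:
          default:
              return WAIT_START;                             /* ready for next frame */
          }
      }
      ```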