How frequently do you use profilers/debuggers at work?

I know profilers and debuggers are a boon for productivity, but anecdotally I've found they're seldom used. How often do you use debuggers/profilers in your work? What's preventing you? Conversely, what enables you to use them?

41 comments
  • I have a tendency to just use console logging, and only use debuggers when things are starting to get hairy.

  • I seldom use profilers because I seldom need to. It's only useful to run a profiler if your program has a well-defined performance issue (like "The request should have an average response time of X ms but has one of Y ms" or "90% of the requests should have a response after X ms but only Y% actually do").

    On the other hand, I use a debugger all the time. I rarely start any program I work on without a debugger attached. Even if I'm just running a smoke test, if it fails I want to be able to dig right into the issue without having to restart the program in debug mode. The only situation where I routinely run code without a debugger is the red-green-refactor cycle with running unit tests, because I'll need to re-run these multiple times with a debugger anyway if there are unexpected reds...

    What enables me? Well, there's this prominent bug-shaped icon in my IDE right beside the "play button", and there's DevTools in Chrome, which comes with a debugger for JS...

    Running your code without a debugger is only useful if you want to actually use it, or if you're so sure that there aren't any issues that you might as well skip running the code altogether...

  • At my last job, doing firmware for datacenter devices, almost never. JTAG debugging can be useful if you can figure out how to reproduce the problem on the bench, but (a) it's really only useful if the relevant question is "what is the state of the system" and (b) it often isn't possible outside of the lab. My experience with firmware is that most bugs end up being solved by poring over the code or datasheets/errata and having a good long think (which is exactly as effective as it sounds -- one of the reasons I left that job). The cases I've encountered where a debugger would be genuinely useful are almost always more practically served by printf debugging.

    Profilers aren't really a thing when you have kilobytes of RAM. It can be done but you're building all the infrastructure by hand (the same is true of debugger support for things like threads). Just like printf debugging, it's generally more practical to instrument the interesting bits manually.

  • I've used a debugger only a handful of times in the last decade or so. The projects I work on have complex stacks, are distributed, etc. The effort to get that to run in a debugger is simply not worth it; logging and testing will do 99.9% of the time. Profiling, on the other hand, now that's useful, especially on prod or under prod load.

  • Really often, ever since Turbo Debugger in the '90s; there's no other way to trace your code, step over/into, watch variables, etc. For compiled programs it's necessary. For JavaScript I use print lol

    • Don't forget being able to watch the stack in realtime, and run your code backwards to roll back its state!

  • I find debuggers are used a lot more on confusing legacy code.

    Lately, monitoring tools such as OpenTelemetry have replaced a lot of my use of profilers.

  • These days? Never, but I'm mostly writing Ansible and Terraform at work.

    When I was writing any code at previous jobs? Also never. It was one part being highly restricted in what we were allowed to use (and I didn't feel like trying to get gdb through the approval process; it was far easier to just use print statements inside of conditionals) and one part the languages all being scripting languages.

  • For microcontrollers, quite often. Mainly because visibility is quite poor, you're often trying to do stupid things, problems tend to be localized, and JTAG is easier than a firmware upload.

    For other applications, rarely. Debuggers help when you don't understand what's going on at a micro level, which is more common with less experience or when the code is more complex due to other constraints.

    Applications running in full fledged operating systems often have plenty of log output, and it's trivial to add more, formatted as you need. You can view a broad slice of the application with printouts, and iteratively tune those prints to what you need, vs a debugger which is better suited for observing a small slice of the application.

  • My primary languages are Java (for work), JavaScript (for work), and C/C++ (for hobbies). Earlier in my career, I used the debugger a lot to help figure out what was going on while my applications were running, but I really don't reach for it as a tool anymore. Now I'll typically gravitate towards either logging things (at a debug level that I can turn on and off at runtime) or writing tests to help me organize my thoughts and expectations.

    I don't remember when, if ever, I made a deliberate decision to switch my methodology, but I feel like doing things with logging or tests gives me two things. Six months later, I can look back and remind myself that I was having trouble in that area; it can remind me how I fixed it, too. Those things can also serve as a sanity check that if I'm changing things in that area, I don't end up breaking it again (at least not in the same way).
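The "debug level I can turn on and off at runtime" approach can be sketched with Python's standard logging module (the commenter works in Java, where frameworks like java.util.logging work the same way; the logger name and function here are made up):

```python
import logging

# Module-level logger whose level can be flipped at runtime,
# instead of sprinkling temporary print() calls through the code.
log = logging.getLogger("payments")
logging.basicConfig(format="%(levelname)s %(name)s: %(message)s")

def apply_discount(price, rate):
    discounted = price * (1 - rate)
    log.debug("price=%s rate=%s -> %s", price, rate, discounted)
    return discounted

apply_discount(100.0, 0.2)    # silent: the default level is WARNING
log.setLevel(logging.DEBUG)   # "turn on" debug output at runtime
apply_discount(100.0, 0.2)    # now the debug line is emitted
```

Because the debug statements stay in the code, they double as the six-months-later reminder of where the trouble was.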

    With that said, I will reach for the debugger as a prototyping tool. IntelliJ IDEA (and probably other Java debuggers) lets you execute statements on the fly. I'll run my app up to the point where I know what I want to do, but don't know exactly how to do it. Being able to pause there and try different statements to see what comes back has sped up my development quite a bit. Logging and testing don't really apply to that kind of exploration, and pausing to run arbitrary statements beats the other options in how quickly and minimally that exploration can be done.

  • This usually depends on which industry you work in and what language you're using :)

    I work in gamedev, C++, and I ALWAYS use a debugger. There's no running the game, or even the editor, without the debugger connected, whether you currently need it or not. You always launch the project through the debugger so that if anything comes up you can investigate immediately.

    Profiler is used any time there's a performance problem.

    • What enables me to use them is probably that this is very much true for the whole industry so software is built with that in mind.

      For example, we use special "print" statements for some of the errors that if a debugger is running, it will automatically stop the program so you can investigate. Without a debugger, it will just output the error in the log.

      There is no Docker; the app usually runs on your local hardware. Consoles are also built with debugger support that you connect to from your PC, so it's very easy to use. Even connecting to another PC on the local network, for example an artist's or tester's machine, is possible from your computer without a problem. We have all the tools prepared for that.
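The "stop in the debugger if one is attached, otherwise just log" pattern described above can be sketched in Python (the engine tooling itself is C++ and studio-specific; here `sys.gettrace()` serves as a rough "is something tracing us?" proxy, and the error message is made up):

```python
import logging
import os
import sys

# For this self-contained demo, make breakpoint() a no-op; in real use
# you would leave PYTHONBREAKPOINT unset so it drops into pdb (PEP 553).
os.environ["PYTHONBREAKPOINT"] = "0"

logging.basicConfig(format="%(levelname)s: %(message)s")

def soft_error(message):
    """Log the error; if a trace-based debugger (pdb, most IDE debuggers)
    is attached, also stop right here so state can be inspected."""
    logging.error(message)
    if sys.gettrace() is not None:  # crude "debugger attached?" check
        breakpoint()

soft_error("texture failed to load")  # logs; stops only under a debugger
```

Run normally it only logs; run under pdb or an IDE debugger it halts at the error, which is the behavior the comment describes for the engine's special print statements.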

  • I'm starting to get into the habit of reaching for debuggers more and more as opposed to just print()ing everything and hoping for the best.

    Profilers, on the other hand, I still have no idea how to apply properly (and, more importantly, how to read the results of), so that's something I'll need to learn.
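For anyone in the same spot, a self-contained way to try a profiler and practice reading its output is Python's built-in cProfile (the workload functions here are invented for the demo):

```python
import cProfile
import io
import pstats

def slow_part():
    return sum(i * i for i in range(200_000))

def fast_part():
    return sum(range(1_000))

def work():
    slow_part()
    fast_part()

# Profile work() and print the hottest functions by cumulative time.
profiler = cProfile.Profile()
profiler.enable()
work()
profiler.disable()

out = io.StringIO()
pstats.Stats(profiler, stream=out).sort_stats("cumulative").print_stats(5)
print(out.getvalue())
# Reading the table: "ncalls" is how often a function ran, "tottime" is
# time spent in the function itself, and "cumtime" includes everything it
# called. Look for the few rows that dominate cumtime -- here slow_part
# should sit near the top, fast_part near the bottom.
```

The same reading strategy (sort by cumulative or self time, chase the dominant rows) carries over to most profilers, whatever the language.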
