When is It Okay to Turn off Static Analysis Guidance

Editorial Note: I originally wrote this post for the SubMain blog.  You can check out the original here, at their site.  While you’re there, download CodeIt.Right and give it a try.

The balance among types of feedback drives some weird interpersonal dynamics.  For instance, consider the rather trite (if effective) management technique of the “compliment sandwich.”  Managers precede and follow a piece of negative feedback with compliments.  In that fashion, the compliments form the “bun.”

Different people and different groups have their preferences for how to handle this.  While some might bend over backward for diplomacy, others prefer environments where people hurl snipes at one another and simply consider it “passionate debate.”  I have no interest in arguing for any particular approach, only in pointing out the variety.  As it turns out, we humans find this subject thorny.

To some extent, this complicated situation extends beyond human boundaries and into automated systems.  While we might not take quite the same umbrage as we would with humans, we still get frustrated.  If you doubt this, I challenge you to tell me that you have never yelled at a compiler because you were sure your code had no errors.  I thought so.

So from this perspective, I can understand the frustration with static analysis feedback.  Often, when you decide to enable a new static analysis engine or linting tool on a codebase, the feedback overwhelms.  Seeing 28,326 issues in the code can demoralize anyone.  And so the temptation emerges to recoil from this feedback and turn off the tool.

But should you do this?  I would argue that usually, you should not.  But situations do exist when disabling a static analyzer makes sense.  Today, I’ll walk through some examples of times you might suppress such a warning.

False Positives

For the first example, I’ll present something of a no-brainer.  However, I will also present a caveat to balance things.

If your static analysis tool presents you with a false positive, then you should suppress that instance of the false positive.  (No sense throwing the baby out with the bathwater and suppressing the entire rule).  Assuming that you have a true false positive, the analysis warning simply constitutes noise and not signal.  Get rid of it.

That being said, take care with labeling warnings as false positives.  A false positive means that the tool has flagged a potential error and gotten it wrong.  It does not mean that you disagree with the warning or don’t care.  The tool’s wrongness is a good reason to suppress a warning; you not liking its prognosis falls short of that.
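To make that concrete, here is roughly what suppressing a single instance might look like in C#.  This is a minimal sketch, not your tool’s exact syntax: CA2000 stands in for whatever rule misfired in your codebase, and the ownership-transfer scenario is a classically false-positive-prone one.  Check your own analyzer’s documentation for its suppression mechanism.

    using System.IO;

    public class ReportWriterFactory
    {
        public StreamWriter Open(string path)
        {
            // Some analyzers flag this local as never disposed, but ownership
            // transfers to the caller, so the warning is a false positive here.
    #pragma warning disable CA2000 // Dispose objects before losing scope
            var writer = new StreamWriter(path);
    #pragma warning restore CA2000
            return writer;
        }
    }

Note the comment documenting the reasoning.  Future readers (and future you) will want the justification alongside the suppression.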

Non-Applicable Code

For the second category, I’ll use the term “non-applicable code.”  This describes code for which you have no interest in static analysis warnings.  While this may sound contradictory to the last point, it differs subtly.

You do not control all code in your codebase, and not all code demands the same level of scrutiny about the same concepts.  For example, do you have code in your codebase driven by a framework?  Many frameworks force some sort of inheritance scheme on you or the implementation of an interface.  If the name of a method on a third-party interface violates a naming convention, you need not be dinged by your tool for simply implementing it.
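For instance, suppose a plugin framework dictates a method name to you.  A member-level suppression acknowledges the rule while exempting the one thing you can’t change.  In this sketch, ILegacyPlugin is a hypothetical third-party interface (declared inline only to keep the example self-contained), and CA1707 serves as an example naming rule ID:

    using System.Diagnostics.CodeAnalysis;

    // Stand-in for a third-party interface whose naming we don't control.
    public interface ILegacyPlugin
    {
        void do_work();
    }

    public class ReportingPlugin : ILegacyPlugin
    {
        // The framework dictates the name, so exempt just this member.
        [SuppressMessage("Naming", "CA1707:IdentifiersShouldNotContainUnderscores",
            Justification = "Name is required by the ILegacyPlugin interface.")]
        public void do_work()
        {
            // ... plugin logic ...
        }
    }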

In general, you’ll find warnings that do not universally apply.  Test projects differ from your production code.  GUI projects differ from data access layer ones.  And NuGet packages or generated code remain entirely outside of your control.  Assuming the decision to use these things happened in the past, turning off the analysis warnings makes sense.
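When whole swaths of code qualify, suppressing instance by instance gets tedious.  One .NET-flavored option (supported by recent Roslyn-based analyzers; again, your tool may differ) is an assembly-level suppression scoped to a namespace.  Here, MyApp.Tests is a placeholder namespace and CA1707 once more stands in for your tool’s rule ID:

    // GlobalSuppressions.cs
    using System.Diagnostics.CodeAnalysis;

    // Test names like Should_Throw_When_Input_Is_Null read better with
    // underscores, so exempt the whole test namespace from the naming rule.
    [assembly: SuppressMessage("Naming", "CA1707:IdentifiersShouldNotContainUnderscores",
        Scope = "namespaceanddescendants", Target = "~N:MyApp.Tests",
        Justification = "Underscore-separated test names are the team convention.")]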

Cosmetic Code Counter to Your Team’s Standard

So far, I’ve talked about the tool making a mistake and the tool getting things right on the wrong code.  This third case presents a thematically similar consideration.  Instead of a mistake or misapplication, though, this involves a misfit.

Many tools flag purely cosmetic concerns.  They’ll complain about field variables not prepended with underscores or methods named with camel casing instead of Pascal casing.  Assuming those rules jibe with your team’s standards, you have no issues.  But if they don’t, you have two options: change the tool or change your standard.  Generally speaking, you probably want to err on the side of complying with broad standards.  But if your team is set on its standard, then turn off those warnings or configure the tool.
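A targeted disable might look like the sketch below, where IDE1006 (Roslyn’s naming-style diagnostic) is just an example rule ID.  Better still, most tools let you configure the naming convention itself once, team-wide, so the rule and your standard agree and no disable is needed at all.

    // Team standard: underscore-prefixed private fields.  If the tool's
    // default naming rule disagrees, a targeted disable works.
    #pragma warning disable IDE1006 // Naming Styles
    public class InvoiceService
    {
        private readonly decimal _taxRate;

        public InvoiceService(decimal taxRate) => _taxRate = taxRate;
    }
    #pragma warning restore IDE1006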

When You’re Buried in Warnings

Speaking of warnings, I’ll offer another point that relates to them, but with an entirely different theme.  When your team is buried in warnings, you need to take action.

Before I talk about turning off warnings, however, consider fixing them en masse.  It may seem daunting, but I suspect you might find yourself surprised at how quickly you can wrangle them down to a manageable number.

However, if this proves too difficult or time consuming, consider force ranking the warnings, and (temporarily) turning off all except the top, say, 200.  Make it part of your team’s work to eliminate those, and then enable the next 200.  Keep at it until you eliminate the warnings.  And remember, in this case, you’re disabling warnings only temporarily.  Don’t forget about them.
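Force-ranking presupposes knowing which warnings dominate.  As a rough sketch, a small C# utility can tally rule IDs from captured build output; the file name build.log and the warning format the regex matches are assumptions about your particular setup:

    using System;
    using System.IO;
    using System.Linq;
    using System.Text.RegularExpressions;

    // Tallies analyzer warnings by rule ID so you can force-rank them.
    // Assumes output was captured beforehand, e.g.: dotnet build > build.log
    class WarningTally
    {
        static void Main()
        {
            var ruleId = new Regex(@"warning\s+([A-Z]+\d+)");  // e.g. CA1707, IDE1006

            var counts = File.ReadLines("build.log")
                .Select(line => ruleId.Match(line))
                .Where(match => match.Success)
                .GroupBy(match => match.Groups[1].Value)
                .OrderByDescending(group => group.Count());

            foreach (var group in counts)
                Console.WriteLine($"{group.Key}: {group.Count()}");
        }
    }

With the counts in hand, you can decide which 200 rules stay enabled for the first pass and which get their temporary reprieve.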

When You Have an Intelligent Disagreement

Last up comes the most perilous reason for turning off static analysis warnings.  This one also happens to occur most frequently, in my experience.  People turn them off because they know better than the static analysis tool.

Let’s stop for a moment and contemplate this.  Teams of workaday developers out there tend to blithely conclude that they know their business.  In fact, they know their business better than people whose job it is to write static analysis tools that generate these warnings.  Really?  Do you like those odds?

Below the surface, disagreement with the tool often masks resentment at being called “wrong” or “non-compliant.”  Turning the warnings off thus becomes a matter of pride or mild laziness.  Don’t go this route.

If you want to ignore warnings because you believe them to be wrong, do research first.  Only allow yourself to turn off warnings when you have a reasoned, intelligent, research-supported argument as to why you should do so.

When in Doubt, Leave ’em On

In this post, I have gingerly walked through scenarios in which you may want to turn off static analysis warnings and guidance.  For me, this exercise produces some discomfort because I rarely find this advisable.  My default instinct is thus not to encourage such behavior.

That said, I cannot deny that you will encounter instances where this makes sense.  But whatever you do, avoid letting this become common or, worse, your default.  If you have the slightest bit of doubt, leave them on.  Put your trust in the vendors of these tools — they know their business.  And steering you in bad directions is bad for business.

2 Comments
Alain Van Hout
7 years ago

Interesting subject! Also, the second line of the second paragraph contains a typo. Turning off your static analysis rules should indeed be done only sparingly.

🙂

Erik Dietrich
7 years ago
Reply to Alain Van Hout

Got it, thanks. There was a while where the inline spell checker in Chrome wasn’t working in the WordPress editor. When you’re really used to that, its sudden absence makes you type all sorts of dumb things.