What Is a Best Practice in Software Development?
A while ago, I released a course on Pluralsight entitled, “Making the Business Case for Best Practices.” There was an element of tongue-in-cheek to the title, which might not necessarily have been the best idea in a medium where my profitability is tied to maximizing the attractiveness of the title. But, life is more fun if you’re smiling.
Anyway, the reason it was a bit tongue in cheek is that I find the term “best practice” to be spurious in many contexts. At best, it’s vague, subjective, and highly context dependent. The aim of the course was, essentially, to say, “hey, if you think that your team should be adopting practice X, you’d better figure out how to make a dollars and cents case for it to management — ‘best’ practices are the ones that are profitable.” So, I thought I’d offer up a transcript from the introductory module of the course, in which I go into more detail about this term. The first module, in fact, is called “What is a ‘Best Practice’ Anyway?”
Best Practice: The Ideal, Real and Cynical
The first definition that I’ll offer for “best practice” is what one might describe as the “official” version, but I’ll refer to it as the “ideal version.” Wikipedia defines it as “a method or technique that has consistently shown results superior to those achieved with other means, and that is used as a benchmark.” In other words, a “best practice” is a practice that has been somehow empirically proven to be the best. As an example, if there were three possible ways to prepare chicken (serve it raw, serve it rare, and serve it fully cooked), fully cooked would emerge as a best practice as measured by the number of incidents of death and illness. The reason that I call this definition “ideal” is that it implies that there is clearly a single best way to do something, and real life is rarely that neat. Take the chicken example. Cooked is better than undercooked, but there is no shortage of ways to fully cook a chicken: you can grill it, broil it, bake it, fry it, etc. Is one of these somehow empirically “best,” or does it become a matter of preference and opinion?
This leads me into what I’ll loosely refer to as the “real” definition of best practice, and by this, I mean the most common applied definition. Here it generally refers to a convention for the accepted way of doing something, often enshrined by some standards organization such as ISO or another similar agency. In this regard, perhaps a better name for it might be “standard practice” or “accepted practice.” An example of this might be surgeons washing their hands prior to surgery. It is such a widely accepted and standardized practice that it would be scandalous for a practitioner not to follow it. Washing hands versus not washing hands has been empirically demonstrated to be better in terms of patient outcomes. Is the washing of hands the absolute “best” thing in the universe that a surgeon conceivably could do prior to surgery? Well, this philosophical question is not especially important to what I’m calling the “real” definition of best practice. What matters is that hand-washing is beneficial and widely accepted.
And now, for the cynic’s definition. To many, “best practice” is basically a meaningless buzzword intended to make one’s actions seem more important, reasonable, or justified than they actually are. I’m sure I’m not alone in having heard people toss around the term “best practice” to justify their actions when it was clear that the practice had neither been empirically demonstrated to be the best possible practice nor been endorsed by convention or a standards organization. You’ll often hear developers talk about their approach to some problem as being a “best practice,” particularly if they’re in a position of seniority or control over the organization’s technical decisions. But is it truly a best practice? Or are they just using the term to stave off debate?
By considering these three definitions, we might define a continuum. The cynical definition coincides with the least credible use of the term best practice, while the ideal definition coincides with a bulletproof use of the term. Thus, someone defining a practice as a “best practice” meets the cynical definition if he offers no proof. He meets the real definition if he succeeds in demonstrating value and establishing a justifiable standard, and he meets the ideal definition if he somehow proves that this practice is, literally, the best possible practice.
Software Development “Best Practices”
Having taken care of some definitions around the term “best practice,” let’s talk about some examples of things that are commonly put forth as best practices at some point or another along the continuum that I mentioned in the last slide. I’m not talking about something specific to a group or an organization, such as your architect deciding that a group “best practice” is to use the validation library that she wrote. I’m talking instead about commonly known and defined industry practices. It isn’t that these practices are universally agreed upon or without their detractors, but rather that the preponderance of developers out there seems to see potential value in moving toward adoption, even if they believe, perhaps, that the value of adoption is outweighed by the cost and inconvenience of adoption. In other words, it is possible to believe that something is a best practice but also not worth doing.
Here are some examples in the software development world.
- Agile methodologies are huge these days. There is an entire industry now dedicated to helping companies adopt agile practices.
- Automated testing is something almost universally agreed upon as a laudable goal.
- Test driven development may fall more on the controversial side, but there is no shortage of developers out there that think of this as a best practice.
- Continuous integration, as opposed to developing in silo branches and then having extensive merging efforts, is also largely viewed as a benefit.
- There was the original, so-called “Gang of Four” list of object-oriented design patterns, but the list of patterns has grown beyond the ones suggested by this group of authors to include many more. Taken together, using these patterns strategically is considered to be a best practice.
- Code reviews are widely considered to be indispensable ways of promoting code quality.
- Pair programming takes code reviews to the next level and has development performed in a constant state of review.
- One-click, automated builds are considered an important goal for most organizations, in stark contrast to the practice of having someone labor for half a day to get the software into a deployable state.
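To make one of these items a bit more concrete, here is a minimal sketch of what “automated testing” looks like in practice. It's in Python, and the names (`parse_version`, the test function) are purely illustrative, not from the course: the point is a check that any developer, or a build server, can run identically with one command.

```python
# A hypothetical function under test, plus an automated check for it.
# All names here are illustrative, chosen for the example.

def parse_version(tag):
    """Parse a 'vMAJOR.MINOR.PATCH' tag into a tuple of ints."""
    if not tag.startswith("v"):
        raise ValueError("tag must start with 'v'")
    return tuple(int(part) for part in tag[1:].split("."))

def test_parse_version():
    # These assertions run the same way on a laptop or a CI server,
    # which is what makes the practice automatable in the first place.
    assert parse_version("v1.2.3") == (1, 2, 3)
    try:
        parse_version("1.2.3")
    except ValueError:
        pass  # expected: missing 'v' prefix is rejected
    else:
        raise AssertionError("expected ValueError for missing 'v' prefix")

if __name__ == "__main__":
    test_parse_version()
    print("all checks passed")
```

The particular function doesn't matter; what matters is that the verification is executable and repeatable, which is also what makes practices like continuous integration and one-click builds possible downstream.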
This list is certainly not exhaustive nor is it, perhaps, without controversy. It may well be that you don’t consider one or more of the items here even to be good ideas. However, the industry in general has moved in a direction where this would be a contrarian point of view. For better or for worse, the industry has collectively voted that these are worthwhile goals.
But there’s a fine line between collective interest in a good idea and a passing fad. There’s a fine line between conventional wisdom and common misconception. How is one to know if these things really are good ideas and, more importantly, how is one to know if these things are good for himself and his organization?
Well, that’s the theme of the course (and here’s a hint — follow the money).
The Main Point
I’m a software developer, and I’d imagine many of you reading are as well. Almost certainly you’re a developer, a former developer, or someone who works with developers in some capacity. So, we all know that there’s a natural tendency for software developers to want to experiment with new things and gain marketable experience during the course of the work day. And we all know that there’s a natural tension between developers wanting to adopt the newest development practices, languages, processes and frameworks because they’re interested in doing so and developers wanting to do what makes the most sense for their organization or project. That’s simply a universal line to walk in this industry.
My goal with the course, simply put, is to help you navigate the line between these two concerns. I aim to give you a way to look at the things being touted as industry best practices and to identify whether adopting them makes practical sense for your organization. And, further, I aim to give you a way to move things along the “best practices continuum” from “something that we want to do because people say it’s awesome” to “a convention that we can justify” to, ideally, and in some cases, “something we can do because it’s been empirically demonstrated to be better than the alternatives.”
It might be that with the heuristics and the framework that I’m going to show you, you come to the conclusion that a so-called industry best practice actually doesn’t make sense for your team or your organization. That’s fine; so be it. The important thing is that you’re basing your decisions on measured experience and data rather than on word of mouth and conjecture. The important thing is knowing whether so-called best practices are actual best practices.
Another Great Post, Erik…
I always ask in job interviews if they have CI, automated builds, code reviews, automated tests, unit tests and other “buzz” words (to me they are not buzz words).
Makes sense to me. You’re assessing shops for mutual fit, so whether those things are trendy or not won’t really matter when it comes to whether you’re happy doing the work. As an aside, wouldn’t it be nice if companies gave you a look at their code bases, ALM, and build setup? Seems like every org wants to ask you questions, watch you code, and check out your open source contributions, but not too many are transparent about what their stuff looks like. I get the IP angle, but it’s always struck me that it’d be nice to be able…
Kudos for your course, as well. A good indicator of its worth was that, near the end, I began anticipating your points, and even considering them quite commonsensical, despite the fact that I’d never really considered the topic. Our CIO is fond of reminding us all that “we’re not a tech firm,” so it’s a very good reminder to keep IT funding requests in the context of business value. I confess I’ve been guilty of neglecting that point more often than not. Ironically, I canceled my Pluralsight account a couple months ago — shortly after your course was released —…
I found that in software development contexts, it’s not necessarily intuitive to think about whether the benefits of a given tool or approach will offset the cost. I mean, in that role, we’re really paid to dream up creative ways to automate solutions to problems, with other considerations being more afterthoughts. Certainly not a bad state of affairs, but I think it takes practice to bring costs/benefits more to the fore. And I appreciate this feedback! It’s really nice to hear that the course was in any way helpful for achieving an outcome that benefited you. I was really hoping…
I have always found it interesting that a “best practice” by its very nature was at some point new, experimental, unproven, and possibly controversial. It only becomes a “best practice” when somebody steps outside the box of the current “best practice” to try to improve upon it. Without the efforts of those who were willing to do this, we would all still be travelling by horse and buggy (no pun intended). The technology sector changes at a fairly rapid pace in part because there are those willing to pay the price of bucking the current “best practice” to discover what…
That’s an interesting point. I never really thought about the paradoxical nature of this, but it seems that you’re right — “best practices” are necessarily the result of ignoring “best practices.” It does make sense, since adhering to so-called best practices is done with the goal of operational efficiency rather than innovation (typically, anyway).
Dan North delivered a brilliant talk on best practices at Oredev 2007 – apparently the video is no longer online but here is a slideshare – http://www.slideshare.net/dannorth/best-practices-5047876 You can’t really get what he said from the slides so to summarize… He basically called “best practices” the hobgoblin of small minds (to borrow from Emerson). He explained that experienced programmers reach productivity levels that are orders of magnitude greater than beginners and novices but that management doesn’t like a model where a few great programmers have so much “power”. Best practices is a way to ensure that a dozen novices can…
I like this explanation a lot, actually. Not specifically related to this post or to my course, I’ve found myself in consulting roles a lot, and, at the risk of being flippant, a lot of my advice boils down to, “hire smart people that you trust and then trust them.” Organizations are always searching for the silver process bullet that will make mediocre software developers produce awesome software (or passable software with better margins) and, if it exists, I sure haven’t seen it. I think a lot of this tilting at windmills comes from misguided comparison of software development to…
None of those things are examples of best practices in software engineering. Best practices are specific engineering techniques.
/Shrug/ So you say. I’ve half-dozed my way through plenty of meetings where people say as confidently as you that those things are, in fact, “best practices.” I’ll leave others to duke it out over the epistemology. My point here and in the course is that one might rationally adopt the perspective that the real “best practices” are the ones that maximize profits.
Off-topic. Tongue in cheek is an expression I don’t understand at all. I even tried putting my tongue in my cheek in search of understanding, obviously without success. Sorry for the off-topic comment, but, as English is not my first language, some cultural expressions or slang like this are not understandable to me.
I wasn’t aware of the origin of it myself, but went digging a little. It appears that it originates from the practice of biting one’s tongue (putting it against the cheek and biting down) to prevent laughter. Another take was that if you stick your tongue in your cheek, it forces the face to wink, which is also a sign of humor.
Weird little bit of historical trivia, I suppose.