Best practice and industry standards

A lot of techniques and ideas in software development are considered “best practice”. But do we follow them for the right reasons?

Last night I got caught up in a discussion with Jeffrey Way, when I responded to a tweet of his.

I responded fairly quickly with the following.

The ensuing discussion took place on Twitter, which by its nature meant writing short messages on rocks and throwing them at each other’s windows. I wanted to expand on the discussion in a less reductive format, because I think it’s interesting.

Way appears to be referencing the idea that dogmatic claims of best practice are an appeal to authority, and that if the decision can’t stand up on its own merits it should be challenged.

I don’t really disagree with him, either. My point was not intended as a counter to his statement, but rather as a corollary.

At the conclusion of this discussion (it was 1:30 am and I needed to go to sleep) I conceded that our experiences on this topic may well differ. Specifically, my own experience has far more often involved trying to encourage colleagues towards barely adequate practice than struggling against any oppressive dogma myself.

I should clarify that, because it makes me sound like an asshole. In a previous role, my direct superior (we’ll call him Steve) was a senior developer. What that meant in this case was that he’d been doing the same thing for 10 years and had made no effort to maintain his skills. He was writing PHP the way he had in the late 90s, with no learning over the intervening years, and he actively resisted any attempt to change that.

The codebase I worked on was a disaster* and many of the “best practices” I tried to impart were anything but controversial. In no particular order they included the following, and bear in mind this was only maybe four years ago:

  • Using Composer for dependencies
  • Autoloading classes instead of long chains of include statements (there’s a sketch of the difference below)
  • Dropping SHOUTY_UPPERCASE class names in favour of standard naming
  • Not using tables for layout
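
For anyone who hasn’t had to work on PHP of that vintage, the Composer and autoloading suggestions amounted to roughly the following. This is a minimal, invented sketch rather than code from that project; the file, package and class names are made up purely for illustration.

```php
<?php
// Before (roughly how the codebase worked): every dependency
// pulled in by hand, in the right order, in every entry point.
//
//   include 'lib/db_connection.php';
//   include 'lib/user_model.php';
//   include 'lib/email_helper.php';

// After (the suggestion): third-party packages are declared in
// composer.json rather than copied into the repository, and
// Composer's generated autoloader resolves classes on demand
// from the project's PSR-4 mappings.
require __DIR__ . '/vendor/autoload.php';

use App\Repository\UserRepository; // hypothetical project class

$users = new UserRepository();
```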

In every case (except tables for layout, interestingly) these suggestions were rejected. Not “expected to make their case”, but rejected outright and aggressively, by someone who felt personally slighted by the very idea that his code wasn’t perfect.

And his arguments were exactly the same as those brought up by Way and those agreeing with his position. Opinion. What would they know. Things change.

To be clear, the arguments were the same but the attitude vastly different. I have no doubt that rational discussion, and evidence backing an assertion, would absolutely convince Jeffrey Way et al of the validity of a “best practice” on its own merits. That didn’t happen in Steve’s case. He dug his heels in, white-anted arguments, misconstrued points and brought up ridiculous counters (shouty uppercase names are easier to read, include statements make it easier to tell what’s being used, and so on).

This is my experience with challenges to “best practice”. And though it’s the worst example it’s by no means isolated.

There are things that are often considered “best practice”. I include in that category concepts as diverse as agile, TDD, continuous integration and deployment, static analysis, test automation, enforced code styles, version control, dependency management tools, code review, frameworks, progressive web apps, and an endless list of others.

If you look at these as established “rules”, there are two kinds of people who break them. Those who have the skill and knowledge to know a better path are only one group. The other is those who think they know better, because their arrogance and ignorance force them to maintain the status quo.

Where Jeffrey Way apparently has plenty of experience with the former group, my own appears to have been almost entirely with the latter. So when we talk about “challenging best practice” he’s probably talking about people questioning whether Test Driven Development is always necessary, or whether a comprehensive unit test suite written afterwards is actually a better and more natural flow. And I’m talking about people who code live on the server and won’t use version control because it “takes too long”, or who insist on SHOUTY_UPPERCASE_CLASSES in PHP because that’s how it’s done in Pascal.**

Certain practices have become entrenched, and where they came from isn’t really the key issue.

I’m OK with that. Smart people talk, have ideas, and suggest those ideas, and other people say “hey, that’s a good idea”. It’s disingenuous to suggest we didn’t talk these practices over and settle on them. They weren’t forced on studios and dev teams; teams adopted them because they saw merit. It’s worth pointing out that challenges to that “best practice” come from the same place, such as DHH’s famous TDD is Dead articles. These people have influence, sure. That’s not a harmful thing. They both encourage and challenge practices.

There is a huge amount of information out there on how to make good software well. Sure, not all of it is always valid; some of it changes, adjusts, adapts. But it’s incredibly arrogant to just assume you always know better in all circumstances.

What matters is that when there is an established practice, whether locally in the organisation or in the wider world, the burden is on the person rejecting it to back up why that standard isn’t right or appropriate.

By all means you should be suspicious of someone who puts in obvious security flaws and says it’s best practice. But if you’re trying to convince me version control is a bad idea, or unit testing, agile or dependency management, you’d better have a damn good argument to counter “best practice”.

*One of his functions overflowed the integer on a cyclomatic complexity check. Some refactoring got it down to around 72,000. The cutoff mark is 9. He said the test was wrong and the code was “perfectly readable”. It was not.

**Both of these are actual examples, from different organisations.
