Agile: False Hope and Real Promise

I went shopping for jeans last weekend, which was, as always, somewhat of an aggravating experience. It’s not just jeans, but, really, any wearable thing: shoes, work shirts, belts, even socks. The source of my irritation is the fact that I can’t simply replace items I have that are worn out because the thing you bought last time is always decommissioned. Strangely, major things like cars have better staying power than clothes. There’s a very simple use case missing: As a shoe wearer, given that I love a pair of shoes and thus wear them out in 2 years, I want to go back to the same store and buy the exact same pair of friggin’ shoes so that my life is not interrupted. And the software of life fails to perform.

The reason for this is the nebulous concept of fashion. Naturally, clothing manufacturers and retailers don’t want to seem ‘stale,’ so they churn things at a rate that prevents me from being pragmatic with my shopping and turns my trips into maddening treasure hunts instead of simple errands. So I contemplate fashion a bit with a natural bias toward annoyance, and this has informed a pretty cynical take on the subject. I have an ongoing hypothesis that the thing that drives fashion isn’t some guy sitting in Paris, saying, “I’m going to make those idiots wear really tight jeans this year and then laugh at them.” Instead, it’s the shifting tectonic plates of poor people trying to look like rich people and rich people trying not to look like poor people.

By way of narrative, guy in Paris releases skinny jeans and charges $700 for them. They’re new and expensive, so rich people buy them so that they can be seen wearing new and expensive things — signaling their wealth and cosmopolitan tastes. Bulk clothing manufacturers start knocking off the Paris designer and selling the jeans to the masses for $39 a pair, and they sell out like crazy as everyone wants to ape the A-list celebrities and influencers wearing these things. Rich people get annoyed that they’re no longer distinguishable on sight, and this demand drives the Paris guy to dream up some new, weird thing that they can buy to look rich. Rinse, repeat.

This narrative isn’t limited to the arena of rich and poor people with clothes, however. It also extends to things like industry thought leadership and buzzwords. A crowd of cool kids starts doing something and succeeding/making money. They give it a name, and everyone rushes to mimic their success by adopting this thing. Imitators adopt it en masse, bringing mediocrity and misunderstanding to it, and the cool kids lament that everyone is getting it wrong and that it’s time for another industry shake-up. Cue the “Is {Insert Tech} Dead” link bait and the search for the next great thing.

“Agile” is probably the poster child for this sort of thing in the development world. Back around the turn of the millennium, various industry thinkers were experimenting with ways to move software development away from practices that mimicked construction and physical engineering, since the results produced by this traditional, “waterfall” approach were spotty at best. A group of representatives of some of these experiments came together to find common ground, and the result was called “The Agile Manifesto.” In the intervening 13 years, the industry has come to accept that the Agile approach is “the right thing” while agreeing that, by and large, no one does it right.

Off the top of my head, I’ve heard the following expressed recently (paraphrased from my recollection):

  • Agile is dead.
  • Scrum is a scam that makes grunts stand while they deliver their status reports.
  • If XP doesn’t work it’s because you’re doing it wrong.
  • Agile is stupid because you really can’t work sanely without comprehensive documentation and specs.
  • Agile is a marketing gimmick to create phony demand for meaningless certifications.

And yet, you’d be hard-pressed to find any non-agile shop that didn’t look down shamefacedly and mumble, “yeah, we do waterfall.” So, the reasonable conclusions from this are:

  • Waterfall and pseudo-waterfall approaches like RUP are really out of style.
  • Agile is mainstream, but the real fashionistas are so over it because everyone’s messing it up and ruining it.
  • Cool kids are looking for what’s next: a post-Agile world.

Agile, Distilled

I’m not certified in Scrum or XP or anything else. I have no project management credentials and no letters after my name. I certainly have experience in these methodologies and understand the ceremonies and the roles, but I’d hardly count myself an encyclopedia of how everything should operate. It has always bemused me that arguments emerge over the finer points of the exact approaches to these methodologies, and that people can actually be certified in exact adherence, when the focus of the Agile Manifesto seems to me best summarized by the idea of bringing pragmatism and empirical experimentation to software development. But, opportunities for snark notwithstanding, it’s easy to see why people become disillusioned. Anytime there is some sort of process and any failures are blamed on a lack of true, absolute adherence, you have a recipe for quackery.

But at its core, I think Agile methodologies can be distilled to a single compound principle: tighten the feedback loop and improve the feedback. Think about the different things that are done in various flavors of agile: automated testing/TDD, short sprints, pair programming, retrospectives, big, visible charts, etc. All of it is about delivering feedback faster, delivering feedback more obviously, and acting quickly on the feedback. This is really, really non-controversial. The sooner you get feedback on what you’re working on, the sooner you can improve. So, where does everything go off the rails?
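
To see the principle in the small, consider a minimal sketch in Python. The discount function and its tests here are invented purely for illustration, but they show the tightest feedback loop there is: an automated test answers “did I just break something?” in seconds rather than at the end of a QA cycle.

    # feedback_loop.py -- a hypothetical illustration of fast feedback.
    # Running this file answers "did my change break the discount
    # logic?" in seconds, instead of during a QA cycle weeks later.
    import unittest

    def apply_discount(price, percent):
        """Return price reduced by the given percentage."""
        if not 0 <= percent <= 100:
            raise ValueError("percent must be between 0 and 100")
        return round(price * (1 - percent / 100.0), 2)

    class ApplyDiscountTest(unittest.TestCase):
        def test_ten_percent_off(self):
            self.assertEqual(apply_discount(100.00, 10), 90.00)

        def test_rejects_nonsense_percentages(self):
            with self.assertRaises(ValueError):
                apply_discount(100.00, 150)

    if __name__ == "__main__":
        unittest.main()

The rest of the Agile toolbox scales that same loop up: a retrospective is a test of your process, and a short sprint is a test of your plan.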

Is it that things don’t go well because people aren’t following the processes right? Is it that the processes themselves are broken? Is it that the processes are good, but a bunch of hucksters are peddling fools’ gold? Is it that people are just too entrenched in their ways to be compatible with these principles? Personally, I don’t think it’s any of this. I think that the fundamental problem is one of expectations (though some of the hucksters do nothing to mitigate this). Specifically, I think the main source of woe is the assumption that “going Agile” will cure what ails you because the only thing standing between the broken, over-budget pile of crap you’re about to ship and software nirvana is having a different set of meetings and adopting a different desk arrangement. The Agile promise isn’t realized by tweaking process; it’s realized by getting a scheme in place for receiving meaningful feedback and using that feedback, with a lot of practice, to get better at making software.

10,000 Hours

Wait, what? Practice? That doesn’t have anything to do with Kanban boards, swim lanes, stand-ups, and all of the other Agile stuff that gets so much focus. In the words of the esteemed Allen Iverson, “What are we talking about? Practice? We’re talking about practice, man.” The wide world is talking about the magic dust you can sprinkle on a team to make it crank out better software, faster, and I’m in here talking about practice.

During my semi-weekly drive to Detroit, I’ve been listening to audio books, including, most recently, Malcolm Gladwell’s “Outliers.” Without offering up too much of a spoiler, one of the things that he talks about is that people like Bill Gates don’t reach the incredible levels of success that they do by being dominant in a vacuum; rather, they combine incredible fortune and opportunity with cleverness, ingenuity, and a whole lot of really hard work. Bill Gates was born at a time when his teenage years coincided with the arrival of the very first machines that allowed for interactive computing, rather than punch cards. He had the incredible good fortune of gaining access to such a computer in the late 1960s, as a teenager, when almost no adults or professionals on Earth had such access. Addicted to the fun of programming, he spent nights and weekends of his teenage years programming, at a time when few others in the world had that opportunity. By the time he was an adult, ready to start Microsoft, he was one of the few people on Earth with 10,000 hours of experience programming this way.

The hypothesis for which the book is best known is that 10,000 hours of deliberate practice is the general average amount of time put in before a person acquires mastery of a field. As I understand it, there has been some corroboration of this hypothesis as well as some counter-evidence, but the broad point remains that achieving mastery requires a whole lot of deliberate work and concerted, sustained effort to improve. You know how you get really good at delivering software? Do it for years and years, constantly learning from your mistakes. You know how you don’t get really good at delivering software? By having slightly different meetings, shipping crap more frequently, and hoping for different results.

The way I see it, big-A Agile, at its core, is not about fixing you or your software. It’s about making your practice more meaningful and efficient. To return to the basketball metaphor, Agile isn’t a weekend retreat that turns you into Michael Jordan. It’s a weekend retreat that shows you how to do a bunch of layup and dribbling drills that, after 10,000 hours of practice, will make you pretty good, as long as you keep challenging yourself. The great shame of Agile, or any other kind of movement, is that of false hope in easy fixes. Done well, it will expose your pain points more quickly, provide you with tools for removing obstacles, illuminate a path to improvement, and show you how to self-evaluate regularly. But it won’t make you awesome. Only practice and continuous learning will do that.

The Humanized Expert Beginner

I was going to write a post about my recent fascination with Selenium tonight, but with the current bandwidth I’m getting from my hotel wifi connection, I won’t have successfully installed Firefox until Thursday. Undaunted, I decided to go through my drafts folder and move a card out of that sizeable backlog. The one that I settled on came as a response to Damien’s comment in my first Expert Beginner post. Periodically, people happen across that post and continue to comment, and I continue to find the subject relatively interesting since this theme is one that doesn’t seem to be in danger of going away. The portion of his comment about which I thought I’d muse is as follows:

You’d have to be mind-numbingly ignorant to fall into Expert Beginnerism. All you have to do to illuminate your ignorance is read a blog or book by highly experienced software engineers. If you’re an Expert Beginner in software, you’re likely an Expert Beginner in every area of your life.

I’ve been thinking about this a little bit here and there over the last couple of months. The question essentially boils down to one of whether Expert Beginnerism is a specific, learned approach to one set of problems or whether it’s a general approach or even a personality trait (defect?). Is it possible to be woefully mistaken about your own competence in one area of life while otherwise being a relatively normal, well-adjusted and pleasant individual in other arenas? Operating hypothesis: yes, it is. Tentative evidence: karaoke. I’m serious.

Make your way down to a local place that does karaoke nights and suffer through the whole evening. I can almost promise you that you’ll find a nice, social, well-adjusted person who, to everyone’s listening displeasure, thinks they’re really good at singing. Most likely, they’ve never thought of making a go of it professionally, and so no one has ever told them the hard but kind truth about their abilities. People just wince inwardly and say, “oh, sure, that’s GREAT!” And so the would-be Aretha Franklin or Frank Sinatra, fed a steady diet of merciful white lies about something of no real consequence, comes to believe wrongly that she or he is a good singer in spite of easily available evidence to the contrary.

That’s all well and good, however, because no one is paying these room-clearers to sing or, worse still, to teach others to sing. Is it possible to be a professional Expert Beginner without being insufferable across the board? At one time, I created a taxonomy of Expert Beginners (at least those with whom I’ve interacted) and grouped them according to their tolerance for cognitive dissonance. A poor man’s interpretation of that might be how red they’d turn when asked, “Sir, have you no shame?”

In the parlance of that taxonomy, we have the Master Beginner, the Company Man, and the Xenophobe. To recap, Master Beginners are consummate blowhards who believe themselves incapable of being wrong (no shame at all). Company Men have a basic, queasily pragmatic operating philosophy of circular reasoning: “if I have it, then I must have earned it by merit” (they’d bristle some at suggestions to the contrary, turning slightly red). Xenophobes are those who have hissy fits when faced with the prospect that there might be some kind of gap in their knowledge (they turn beet red). I’d argue that the redder you turn, the less holistically defective you are when it comes to realistic self-assessment.

We can pretty much set aside Master Beginners; these are seriously demented individuals who will be equally insufferable in all theaters. Company Men are probably not far behind, since they let their narrative be written as they go, with each instance of not being taken down a peg bolstering their self-confidence, however unjustifiably. So, even if they start out reasonably, after a bunch of years of “moving up by merit,” they probably assume that they’re simply becoming awesome at everything. That leaves us with poor Xenophobe, who leads a high-stress life, constantly on the brink of being exposed.

And this, I would argue, is the counter-example. Xenophobe is what happens if, while our karaoke singer is ‘performing’ on stage, someone comes along and says, “hey, I’m a musical talent agent and, while I’ve never heard anyone sing before, I’m sure that you’re just what we need, so here’s a few million dollars and some fame!” The stakes are high, the pot is sweet, and the dream is tantalizing, so our singer inks the contract, silences the “this is too good to be true” voice and engages in an odd dance that can only be described as, “make it then fake it.”

If we go away from the singer metaphor and into the more realistic scenario of “amateur techie at a small but growing company,” you (might) start out with a decent, if naive, person who takes the road less advised when faced with “too good to be true.” And once that decision is made, it gets harder and harder with each passing day to go back, until you’re haggard somewhere, in an office, shrieking at junior developers that you don’t want to hear another word about this functional programming crap!

I’d imagine that this type of Expert Beginner, having arrived at the path described here, could actually be pretty reasonable in other areas of life. Sure, he must have some capacity for self-delusion, but can any of us honestly say that we don’t — that we haven’t indulged in basking in blatant flattery from time to time? I’m not saying that we could all make a lifetime of it, but what I am saying is that I think there’s a narrow cadre of expert beginners out there that might be decent people and even relatively modest in other areas of life.

A Study in Mistrust

I witnessed a rather surreal series of events some time ago. A software group lost its line manager to the proverbial greener pastures and was left temporarily adrift in a leadership vacuum. The organization didn’t immediately backfill the role, and the team lurched on for a little while, unsure of who would be transferred in. During the course of the search for a replacement, an interim manager was appointed, and the team promptly quit. Yes, you read that correctly. The team quit. I don’t mean they quit on him the way a sports team “quits on a coach” by tuning him out. I mean that the entire development team, within an incredibly short period of time, resigned from their positions. The world of software people is a strange one in which they can often have new jobs within days, or certainly weeks, of deciding to look. But, I digress.

I’ve been watching the British take on Sherlock Holmes of late, and that has inspired me to spend a little bit of time thinking of a name for this case. After discarding such inanity as “The Case of the Missing Team” and such drollery as “The Adventure of the Vanishing Devs,” I settled on “A Study in Mistrust.”

The game’s afoot! (But first some backstory)

To solve the case, and lacking the formidable powers of deduction that Sherlock Holmes brought to bear, I’m going to rely instead on history. Labor has a rather rich history of mistrust once societies grow beyond the level of “tribe” where things tend to operate democratically or communally. The reason for this is that the origin of the corporate labor model involves foremen whipping indentured servants, slaves, and laborers who were failing to construct pyramids and the like quickly enough. You probably wouldn’t trust people either if the proposition they offered you was, “do this or I’ll injure you.”

Of course, we’ve come a long way since antiquity (though not without regressions). Feudalism replaced “do this or I’ll injure you” with “do this or I’ll let others injure you” and “do this or you’ll starve.” It’s not exactly a meteoric rise up Maslow’s hierarchy, but one supposes that hunger and possibly-realized threats are a little better than immediate ones. Serfdom eventually gave way to the Industrial Revolution, which by and large replaced those threats with “do this or you’ll go hungry and possibly end up in debtor’s prison.” And so another relatively unremarkable step was realized. But then labor laws came about, removing Gilded Age practices like sharecropping and tenant farming from the equation. Actual gains were made and social safety nets created, resulting in a situation where not laboring in employ wasn’t necessarily as bad as it had been, historically. The unemployed still don’t have a particularly great life, but walking away from a job no longer constitutes a visceral, immediate threat to your well-being.

If you go all the way back to the labor of antiquity that was not cooperative (e.g. hunter/gatherer societies) and consider what caused the original power imbalance, you’ll find a lot of things like hereditary station in life. Your dad was a laborer and whip-wielder’s dad was a foreman, so whip-wielder gets to whip you while you labor. This doesn’t exactly engender a lot of mutual trust in the first place, and the nature of the situation makes things worse. Stacking pyramid stones sucks, and no one would want to do that, so you have to whip people to make it happen (or invent some religious story like telling laborers it’s worth it because even though their lives suck, they’ll totally get a better seat on the ferry to the afterlife for their troubles). The whip-cracker doesn’t trust the workers to work and, not surprisingly, the workers don’t trust the dude that’s whipping them.

That (lack of) trust story didn’t change a whole lot as the workforce gradually shifted from slave-oriented to feudal to early capitalist societies (not that this was a linear journey, as evidenced by the African slave trade). By the late 1800s, there wasn’t any (much) whipping going on, but your boss, who was also your landlord, might evict you if you didn’t work to his satisfaction or might fire you if you didn’t pay rent. Boss doesn’t trust the peons, and the peons don’t trust boss. This brings to mind the classic and iconic factory imagery of someone strolling around with a clipboard, making notes, and taking laborers to task for failing to sew buttons properly or assemble widgets quickly enough. The lazy workers don’t trust the overbearing foremen and vice versa, and all parties are aware that the work sucks.

Interestingly (and absurdly), the factory/foreman model (along with a masochistic, heaping helping of Protestant Work Ethic) is largely what informs the operation of modern corporations. Think of some kind of garden variety, generic office satirized by works like The Office and Office Space, and what do you picture? I bet you quickly form images of employees finding ways to linger at the water cooler while mid-level managers say, “so, what’s happening…. yeah, so I’m going to need you to go ahead and limit your breaks to 10 minutes every 3 hours and punch in and out of the break room each time you enter so that I can check… yeah.” No doubt they monitor employee Facebook usage to ensure that the lazy louts aren’t misappropriating company time.

Over the years, the outfits have changed and the cruelty (at least in the corporate world) has been mitigated and moved away from physical, but the basic lack of trust between the two parties has withstood the test of time. “Do this stuff that sucks or I’ll do bad things to you.” We accept this as a pretty standard way to operate and shrug it off with inane banalities like “welp, they wouldn’t call it work if you liked doing it,” and “I work to live; I don’t live to work.” In a world of mutual mistrust, the employed and the employer have a sort of festering but peaceful cold war of equilibrium.

This approach tends to work well enough for jobs done by rote and other types of labor, but, not surprisingly, it sucks pretty hard when it comes to knowledge work (loosely defined as people who “think for a living”). That is, yelling at doctors to “doctor faster and with fewer mistakes” doesn’t seem to lead to better health outcomes. The main barrier to accomplishing knowledge work isn’t the same as the main barrier to stacking pyramid stones. It’s hard to get people to stack pyramid stones because it’s boring, back-breaking labor that’s simultaneously not worth much from a market perspective (it requires no skill) and not worth doing from an individual perspective (who needs the hernias and slipped discs) — the ideal solution is to automate. But with knowledge work, the main barrier to accomplishing things tends to be some kind of creative block, which is ideally addressed by stimulating environments conducive to creative thought.

There’s a new norm emerging, and it’s driving things like the Agile movement, startup “perk” culture, casual/flexible office environments, remote work, etc. The new norm of which I’m speaking is the idea that you get the most productivity out of knowledge workers by making them comfortable, relaxed, and happy. And the element at the core of all of this is trust. Gone is the “do this stuff or I’ll do bad things to you” contract, and in its place we find a new employer-employee contract: “if you deliver value and get things done, we’ll make it worth your while and try not to bother you more than we have to.”

Whodunnit

And so, Watson, only one piece remains missing after all of this background. Why would a team of knowledge workers in this brave new paradigm just up and quit? (I’m resisting the urge to start this next sentence with “elementary.” You’re welcome.) They’d quit if they were forced back into the old style of contract when they’d been used to the new; going from being trusted to being presumed guilty of laziness until proven innocent would do the trick. And fast. And that’s exactly what happened.

I’ve written previously about how ridiculous it would seem to go from a well-implemented agile process to waterfall. This is a similar concept. Going from a situation with the trust contract to a situation with the threat contract would be worse because it doesn’t just seem inefficient and weird — it would feel dehumanizing. And, in an industry where the demand for knowledge workers is through the roof, a knowledge worker in this position would quit, and quit fast. And that is what the whole team did in “A Study in Mistrust.” They reacted predictably to a whip-cracking newbie manager who wanted to “set the tone” from the start.

It’s worth considering your own situation and, if you have people reporting to you or you’re a non-org-chart thought leader, it’s especially worth considering the situations of those you lead. The ideal situation is one in which the employee and employer benefit mutually from an arrangement of trust, but if your situation is less than ideal, take care with how you handle it. Bravado and shows of force are not going to lead you to new outcomes. Both players in the relationship and signatories to the implied contract have a vested interest in finding a way to create or repair the trust.

Have a Cigar

There are a few things that I think have done subtle but massive damage to the software development industry. The “software is like a building” metaphor comes to mind. Another is modeling the software workforce after the Industrial Age factory model, where the end goal seems to be turning knowledge work into 15-minute parcels that can be cranked out and billed in measured, brainless, assembly-line fashion. (In fact, I find the whole concept of software “engineering” to be deeply weird, though I must cop to having picked out the title “software engineer” for people in a group I was managing because I knew it would be among the most marketable for them in their future endeavors.) Those two subtleties have done massive damage to software quality and software development process quality, respectively, but today I’d like to talk about one that has damaged our careers and our autonomy, and that, frankly, I’m sick of.

The easiest way to give the phenomenon a title would be to call it “nerd stereotyping,” and the easiest way to get you to understand quickly what I mean is to ask you to consider the idea that, historically, it’s always been deemed necessary to have “tech people” and “business people,” plus “analysts” and “project managers” designated as “translators” who can interpret “tech speak” for “normal people.” It’s not a wholly different metaphor from the 1800s, with its horses and people, and carriage drivers who could talk to the people but also manipulate the dumb, one-dimensional beasts into using their one impressive attribute, their strength, to do something useful. Sometimes this manipulation meant the carrot and other times the stick. See what I did there? It’s metaphor synergy FTW!

If you’re wondering at this point why there are no cigars involved in the metaphor, don’t worry — I’ll get to that later.

The Big Bang Theory and Other Nerd Caricatures

Last week, I was on a long weekend fishing trip with my dad and my girlfriend, now fiancee (as of the second edit), and one night before bed, we popped the limited-access cable on and were vegetating, watching what the limited selection allowed. My dad settled on the sitcom “The Big Bang Theory.” I’ve never watched this show because there have historically been about seven sitcoms that I’ve ever found watchable, and basically none of those have aired since I became an adult. It’s just not really my sense of humor, to be honest. But I’ve always suspected this one sitcom in particular of a specific transgression — the one about which I’m talking here. I’d never before seen the show, though, so I didn’t know for sure. Well, I know now.

In the two episodes I saw, the humor could best be summarized as, “it’s funny because that guy is so smart in one way but so dumb in another! It’s ironic and hi-larious!” The Sheldon character, who seems to be your prototypical low EQ/high IQ dweeb, decided in one episode to make every decision in life based on a dice roll, like some kind of programmer version of Harvey Dent. In another episode, he was completely unable to grasp the mechanics of haggling over price, repeatedly blurting out that he really wanted the thing even as his slightly less nerdy friend tried to play it cool. I don’t know what Sheldon does for a living, but I’ll wager he’s a programmer or mathematician or actuary or something. My money isn’t on “business analyst” or “customer support specialist” or “account manager.” But, hey, I bet that’d make for a great spin-off! Nerdy guy forced to do non-nerdy job — it’s funny because you wouldn’t expect it!

My intention here isn’t to dump on this sitcom, per se, and my apologies if it’s a favorite of yours and the characters are endearing to you. I’m really picky and hard to please when it comes to on-screen comedy (for example, I’d summarize “Everybody Loves Raymond” as “it’s funny because he’s a narcissistic, incompetent mama’s boy and she’s an insufferable harpy — hi-larious!”). So, if you’d prefer another example of this that I’ve seen in the past, consider the main character on the show “Bones.” I tried watching that show for a season or two, but the main character was just absurd, notwithstanding the fact that the whole show was clearly set up to string you along, waiting for her to hook up with that FBI dude. Her whole vibe was, “I am highly intelligent and logical, but even after searching my vast repository of situational knowledge and anthropological nuance, I cannot seem to deduce why there is moisture in and around your tear ducts after hearing that the woman who gave birth to you expired. Everyone expires, so it’s hardly remarkable.” She has an IQ of about 245 (and is also apparently a beautiful ninja) but hasn’t yet grokked the syllogism of “people cry when they’re sad, and people are sad when their parents die.” This character and Sheldon and so many others are preposterous, one-dimensional caricatures of human beings, and when people in mathy-sciency fields ham it up along with them, I’m kind of reminded of a blurb from The Onion from a long time ago.

But it goes beyond just playing to the audience. As a collective, we engineers, programmers, scientists, etc., embrace and exaggerate this persona for internal cred. Because my field is programming, I’ll speak to the programmer archetype: the lone hero and iconoclast, a socially inept hacker. If Hollywood and reductionist popular culture are to be believed, it is the mediocre members of our field who are capable of social lives, normal interactions and acting like decent human beings. But the really good programmers are a mashup of Sheldon and Gregory House — lone, misanthropic, socially maladjusted weirdos whose borderline personalities and hissy fits simply have to be endured in order to bask in their prodigious, savant-like intellects and to extract social value out of them. Sheldon may be ridiculous, but he’s also probably the only one that can stop hackers or something, just as House’s felonious, unethical behavior and flaunted drug addiction are tolerated at his hospital because he’s good at his job.

Attribute Point Shaving

As humans, we like to believe in what some probably refer to as justice. I’m not really one to delve into religion on this blog, but the concept of “hell” is probably the single biggest illustrator of what I mean. It gives us the ability to answer our children’s question: “Mommy, why didn’t that evil man go to jail like in the movies?” We can simply say, “Oh, don’t worry, they’ll go to an awful place with fire and snakes and stuff after they die.” See, problem solved. Cosmic scales rebalanced. Hell is like a metaphysical answer to the real universe’s “dark energy” — it makes the balance sheet go to zero.

But we believe this sort of thing on a microscale as well, particularly when it comes to intelligence. “She’s not book smart, but she’s street smart.” “He may be book smart, but he has low EQ.” “She’s good at math, so don’t expect her to read any classic literature.” At some basic level, we tend to believe that those with one form of favorable trait have to pay the piper by sucking at something else, and those who lack a favorable trait must be good at something else. After all, if they had nothing but good traits, the only way to sort that out would be to send them directly to hell. And this RPG-like (or Madden Football-like, if you prefer), zero-sum system of points allocation for individual skills is how we perceive the world. Average people have a 5 out of 10 in all attributes. But since “math geniuses” have a 10 out of 10 in “good at math,” they must have a 0 out of 10 in “going out on dates.” The scales must balance.

This sword weirdly cuts the other way too. Maybe I’m only a 6 out of 10 at math and I really wish I were a 9 out of 10. I could try to get better, but that’s hard. What’s a lot easier to do is act like a 2 out of 10 in “going out on dates” instead of a 5 out of 10. People will then assume those 3 points I’m giving up must go toward math or some other dorky pursuit. If I want to hit a perfect 10 out of 10, I can watch Star Trek and begin most of my sentences with “so.” That’s gotta hurt me in some social category or another, and now I’m a math genius. Think this concept of personality point-shaving is BS? Ask yourself if you can remember anyone in junior high trying to get out of honors classes and into the mainstream so as not to seem geeky. Why do that? Shaving smart points for “street smart” points.

If you’re Hollywood, this is the best thing ever for portraying smart people. It’s hard to convey “extremely high IQ” in the medium of television to the masses. I mean, you can have other characters routinely talk about their intellect, but that’s a little trite. So what do you do? You can have them spout lots of trivia or show them beating grandmasters at chess or something… or you can shave points from everything else they do. You can make them woefully, comically inept at everything else, but most especially any form of social interaction. So you make them insufferable, low-EQ, dysfunctional d-bags in order to really drive home that they have high IQs.

In the lines of work that I mentioned earlier, there’s natural pressure to point-shave as a measure of status. I think that this hits a feedback loop and accelerates into weird monocultures, and that having low scores in things like “not getting food on yourself while you eat” and “not looking at your feet while you talk” actually starts to up your cred in this weird, insular world. Some of us maybe grew up liking Star Trek, while others who didn’t pretend that they did, since that shaves some points off of your social abilities. In turn, in the zero-sum game of personal attributes, it makes you a better STEM practitioner.

What’s the Harm?

So we might exaggerate our social awkwardness or affect some kind of speech impediment or write weird, cryptic code to augment the perception of our skills… so what? No big deal, right? And, yeah, maybe we go to work and delight in telling a bunch of suits that we don’t understand all of their BS talk about profits and other nonsense and to just leave us alone to write code. Awesome, right? In one fell swoop, we point shave for social grace in favor of intelligence and we also stick it to the man. Pretty sweet, right?

I guess, in the moment, maybe. But macroscopically, this is a disaster. And it’s a disaster that’s spawned an entire industry of people who collect larger salaries than a lot of middle managers and even some executives but have almost no real voice in any non-software-based organization. It’s a disaster that’s left us in charge of the software that operates stock exchanges, nuclear plants, and spaceships, but apparently not qualified enough to talk directly to users or manage our own schedules and budgets without detailed status reports. Instead of emerging as self-sufficient, highly paid, autonomous knowledge workers like doctors and lawyers, we’re lorded over by whip-cracking, Gantt-chart-waving middle managers as if we were assembling widgets on the factory floor and doing it too slowly. And we’ve done it almost entirely voluntarily.

So what am I advocating, exactly? Simply that you refuse to buy into the notion that you’re just a “code slinger” and that all that “business stuff” is someone else’s problem. It’s not. It’s your problem. And it’s really not that hard if you pay attention. I’m not suggesting that you trade in your IDE for Microsoft Project and Visio, but I am suggesting that you spend a bit of time learning enough about the way business is conducted to speak intelligently. Understand how to make a business case for things. Understand the lingo that analysts and project managers use well enough to filter out the signaling-oriented buzzwords and grasp the ideas they’re actually communicating. Understand enough to listen to, understand, and critique those ideas. In short, understand enough to do away with this layer of “translators” the world thinks we need, reclaim some autonomy, and go from “slinging code” to solving problems with technology and being rewarded with freedom and appropriate compensation for doing so.

I’ll close with one last thought, hopefully to drive my point home. How many times (this is kind of programmer-specific) have people approached you and said something like, “let’s make an app; you write the code and get it into the app store and I’ll do, like, the business stuff.” And how many times, when you hear this, is it proposed that you run the show? And how many times is it proposed that you’ll do it for pay or for 49% equity or something? They had the idea, they’ll do business things, and you’re the code-monkey, who, you know, just makes the entire product.

Consider this lyric from Pink Floyd:

Everybody else is just green, have you seen the chart?
It’s a helluva start, it could be made into a monster
If we all pull together as a team.

And did we tell you the name of the game, boy?
We call it Riding the Gravy Train.

It’s from a song called “Have a Cigar,” and it spoke to the corporate record industry, which essentially brokered its position to “team up” with musicians, control their careers, and passively profiteer (from the cynical songwriter’s perspective, anyway — I’m not interested in debating the role of the record industry in creating ’70s rock stars, since it’s pretty easy to argue that there wouldn’t have been a whole lot of money for anyone without the record labels). “If we all pull together as a team” is the height of irony in the song, the same way it is in the pitches you hear where the “idea guy” tells you that he’ll be the CEO, since it was his idea, and Bill will be in charge of marketing, and Sue will be the CFO, and you can handle the small detail of writing the entire application that you’re going into business to make.

Is this heads-down, workhorse role worth having the most geek cred? I don’t think so, personally. And if you also don’t, I’d encourage you to get a little outside of your comfort zone and start managing your career, your talent and your intellectual property like a business. If we all do that — if we all stop with the point shaving — I think we can change the nature of the tech game.

Opening Word is Software Development Fail

I’m going to perform a slight experiment on myself in this post. As I type this second sentence, I have only a general idea for a post in mind, and I’m going to go out on a limb and say that this will be a relatively short post. I won’t come back and edit this paragraph if it turns out not to be, however. I’ll just look stupid. I’m going meta here because my posting cadence has slipped and I’m going to aim to combat that by mixing in some relatively short posts that are quicker to write and read. I’ve tried this before, and it’s sort of like a yo-yo diet where I cut back for a bit and then just kind of bloat back out to where I was. Anywho…

Over the course of my career, a fairly common thing that I’ve done at a new company is to set up a wiki for collaboration. Almost invariably, this wiki replaces or, at least, aims to replace a series of Word documents. It’s as though there’s some kind of knowledge-collection progression that goes, “nothing, README, Word, wiki (or, perhaps, SharePoint),” and I make my home at an organization just long enough to say, “hey, there’s a better option than shared-drive/source-controlled Word documents.” Why is the next step better? Searchability, not needing that “version history” table at the top, sane linking, changing the emphasis to content over styling, etc.

But, I had a thought today that I’d been sort of missing the point all these years. It’s not that a wiki isn’t good as a collaboration tool, by any means, but that I’m often using it to mitigate the symptoms rather than treat the illness. If, as a developer, you find yourself opening Word to document a process, you’ve failed. If you optimize by documenting in a wiki, you’re just failing in a more searchable, sophisticated, and open-source way (unless you use SharePoint, and then not open-source and maybe not sophisticated… hiyo, just kidding, but kind of not kidding).

Is this a harsh thing to say? Certainly. Could you argue that I’m link baiting once you hear what comes next? Probably. But I think there’s an element of truth if you allow yourself to de-stigmatize the word “failure” and interpret it without a value judgment. For instance, today, I failed at being someone with 10,000 RSS subscribers to my blog. It’s true, but not damning. Developers documenting things with Word fall into this same category. You’re failing when you do this, and what you’re failing to do is automate.

I’ve blogged about a similar line of thought with code comments in the past. Comments in method bodies are basically developers saying, “I’m going to punt on making this code expressive and just kinda cheat by explaining this mess in English.” So it goes on a broader level with Word documents. Why do you write Word documents, after all? It’s to explain how to set up your development environment or how to perform a deploy. Maybe it’s to document what needs to happen whenever you add a new feature to the code base or in the event that you need to roll back to a previous version of the application. Whatever it is, you’re faced with some kind of procedure, and you declare, “there’s a precise sequence of instructions that needs to be executed, and, as a programmer, I’m going to write them in English and use the next poor sap that happens on them as the runtime interpreter.”
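
To make the “punting to English” point concrete, here’s a hypothetical before-and-after sketch in Python (the bonus-eligibility domain and all of the names are invented for illustration):

    # Before: the comment narrates what the code should say for itself.
    def process(employees):
        # filter out people hired less than 90 days ago, since they
        # aren't eligible for the bonus yet
        return [e for e in employees if e.days_employed >= 90]

    # After: the names carry the explanation, so the comment (and its
    # tendency to rot as the code changes) is no longer needed.
    BONUS_ELIGIBILITY_DAYS = 90

    def bonus_eligible(employees):
        return [e for e in employees
                if e.days_employed >= BONUS_ELIGIBILITY_DAYS]

Both versions do the same thing; the second one just doesn’t conscript the reader as an English-to-code interpreter.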

I mean, think about it. If your process is defined well enough to merit a series of numbered steps in a Word document, it’s probably defined well enough to automate. Now, it might be that it’d take you three months to automate and 30 minutes to make the Word document. It might be that there are steps you lack the authority or permission to automate (or even do). It might be that you’re making a user manual or API document. There are any number of practical reasons that you’re not some kind of failure as a person for cracking open Word and explaining how to do something with a computer. You’re not a failure, but you have failed. For whatever reason, you’ve failed to automate. So next time you find yourself reflexively starting Word to make some sort of “writeup” about a technical thing, pause and ask yourself, “would automation of this process be possible and worthwhile?” It might not be, but then again, you might be surprised to find that the answer is “yes.”
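
To sketch what that automation might look like: suppose the Word document’s numbered steps were “build the solution, run the tests, copy the binaries, restart the service.” Every command, path, and name below is hypothetical, but the translation from numbered English steps to a script is usually about this direct:

    #!/usr/bin/env python
    # deploy.py -- a hypothetical replacement for a "deployment
    # procedure" Word document. Each entry mirrors one numbered step.
    import subprocess
    import sys

    STEPS = [
        ("1. Build the solution",
         ["msbuild", "MyApp.sln", "/p:Configuration=Release"]),
        ("2. Run the test suite",
         ["vstest.console.exe", r"bin\Release\MyApp.Tests.dll"]),
        ("3. Copy binaries to the app server",
         ["xcopy", "/E", "/Y", r"bin\Release", r"\\appserver\MyApp"]),
        ("4. Restart the service",
         ["powershell", "Restart-Service", "-Name", "MyAppService"]),
    ]

    def main():
        for description, command in STEPS:
            print(description)
            try:
                subprocess.check_call(command)
            except subprocess.CalledProcessError as error:
                sys.exit("{0} failed (exit code {1})".format(
                    description, error.returncode))

    if __name__ == "__main__":
        main()

A script like this is also self-documenting in a way the Word document never was: it can’t quietly drift out of date, because a stale step fails visibly instead of misleading the next reader.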