DaedTech

Stories about Software

Beware of The Magnetars in Your Codebase

Lately, I’ve been watching a lot of “How the Universe Works” and other similar shows about astronomy. I’ve been watching them a lot, as in, I think I have some kind of problem. I want to watch them and find them fascinating and engaging and yet I also seem suddenly to be unable to fall asleep without them on.

Last night, I was indulging this strange problem when I saw what has to be the single most intense thing in the universe: a magnetar. Occasionally, when a massive star runs out of fuel in its core, it explodes as a supernova and spews matter and radiation everywhere, sending concussive shock waves hurtling out into the universe. In the aftermath, the part of the star that doesn’t escape collapses in on itself into an unimaginably dense thing called a “neutron star,” which is the size of Manhattan but weighs as much as the sun (for perspective, a sugar cube of neutron star would weigh as much as all of the people on earth combined).

One particularly exotic type of neutron star is called a magnetar. It’s a neutron star with a magnetic field of absolutely mind-boggling strength and a crust made out of solid iron (but up to 10 billion times stronger than steel, thanks to the near-black-hole-like gravity of the star crushing imperfections out of the crystals that form the crust). A magnetar is so intensely magnetized that if the moon were a magnetar (and forget the gravity for a moment) it would tear the watch off of your wrist and render your credit cards useless. This thing rotates many times per second, whipping its magnetic field into a frenzy and sloshing the ultra-dense neutron goo that makes up its core into a froth until the internal pressure causes something called a “starquake,” which, if it were measured on the Richter scale, would be a 32. When these starquakes happen, the result is that the magnetar spews a torrent of radiation so powerful that it has a profound effect on the earth’s magnetic field and atmosphere from halfway across the Milky Way.

So to recap, a magnetar is a tiny thing left over from a huge event that’s not really visible or particularly noticeable from a distance. At least, it isn’t noticeable until the unimaginable destructive force roiling in its bowels is randomly unleashed, and then it pretty much annihilates anything in its close vicinity and has a profound effect universally.

[Image: a magnetar, courtesy of Wikipedia]

I was idly thinking about this concept today while looking at some code, and I realized something. How many projects do you work on where there’s some kind of scramble to get some new feature in ahead of schedule, to absorb scope creep and last-minute changes, or to slam some kind of customization into production for a big client with a minimum of testing? Whether this goes well or poorly, the result is generally spectacular.

And when the dust settles and everyone has taken their two or three weeks off, come down from the ledge and breathed a sigh of relief, the remnant of the effort is often some quiet, dense, unapproachable and dangerous bit of code pulsing in the middle of your code base. You don’t get too near it for fear that it will tear the watch off of your wrist or result in a starquake — okay, more accurately, that it will introduce some nasty regression bug — and you just kind of leave it there to rotate and pulse ominously.

Much later, when you’ve pretty well forgotten it, it erupts and unleashes a torrent of devastation into your team’s life. One day you suddenly recall (one day too late) that if you don’t log into that one SQL server box and restart that scheduled task on any March 1st not in a leap year, all 173,224 users of the Initrode account are suddenly unable to log into anything in their ERP system, and they’re planning a shipment of medical supplies to hurricane victims and abused puppies. You’ve had all of the atoms in your organization pulverized out of existence by the flare of a magnetar in your code base.

How do you avoid this fate? I’ll give you a list of two:

  1. Do the right thing now.
  2. Push back against creating the situation in the first place.

The first one is the more politically tenable one in organizations. The business is going to do what the business is going to do, and that’s to allow sales to promise clients a cure for cancer by June 15th if they promise to pitch in personally for steak dinners for the dev team, on their honor. It can be hard to push back against it, so what you can do is ride the storm out and then make sure you carve out time to repair the damage when the dust settles. Don’t let that rogue task threaten your very existence once a year (but not in leap years). And don’t cop out by documenting it on a wiki somewhere. Do the right thing and write some code that automates whatever it is that should trigger it to happen. While you’re at it, automate some sort of reminder scheme for monitoring purposes and some fault tolerance, since this seems pretty important. You may have needed to hack something out to meet your deadline, but there’s nothing saying you have to live with that and let it spin and pulse its way to bursting anger.
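
To make that concrete, here's a minimal sketch of what "doing the right thing" might look like, assuming a Java shop and a couple of hypothetical hooks (restartInitrodeTask and notifyTeam stand in for whatever your real scheduler and alerting channel happen to be). The point isn't the specifics; it's that the annual ritual becomes code that runs on its own and complains loudly when it fails.

```java
import java.time.LocalDate;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class InitrodeTaskGuard {

    public static void main(String[] args) {
        ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
        // Check once a day instead of relying on someone remembering one day a year.
        scheduler.scheduleAtFixedRate(InitrodeTaskGuard::checkAndRepair, 0, 24, TimeUnit.HOURS);
    }

    static void checkAndRepair() {
        LocalDate today = LocalDate.now();
        boolean marchFirstInNonLeapYear =
                today.getMonthValue() == 3 && today.getDayOfMonth() == 1 && !today.isLeapYear();
        if (!marchFirstInNonLeapYear) {
            return;
        }
        try {
            restartInitrodeTask(); // the thing someone used to do by hand on the SQL server box
            notifyTeam("Initrode task restarted automatically for " + today);
        } catch (Exception e) {
            // Fault tolerance: if the automation itself fails, make noise rather than failing silently.
            notifyTeam("MANUAL ACTION NEEDED: automatic restart failed: " + e.getMessage());
        }
    }

    // Hypothetical hooks; wire these to your real job scheduler and alerting channel.
    static void restartInitrodeTask() { /* e.g., kick the scheduled task via its admin interface */ }

    static void notifyTeam(String message) { System.out.println(message); }
}
```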

The better solution, though, is to push back on the business and not let supernovae into your development process in the first place. This is hard, but it’s the right path. Instead of disarming volatile things that you’ve introduced in a pinch, avoid introducing them altogether. Believe it or not, this is a skill that actually takes practice because it involves navigating office-political terrain beyond simply explaining things to someone in rational fashion and prevailing upon their good judgment.

But really, I could boil these two points down to one single thing that logically implies both: care about the fate of the project and the codebase. If you invest yourself in it and truly care about it, you’ll find that you naturally avoid letting people introduce explosive forces in the first place. You certainly don’t allow alien, stealth deathbombs to fester in it, waiting to spew radiation at you. Metaphorical radiation, that is. Unless you code for a nuclear power company. Then, real radiation.

The 7 Habits of Highly Overrated People

I remember having a discussion with a more tenured coworker, with the subject being the impending departure of another coworker. I said, “man, it’s going to be rough when he leaves, considering how much he’s done for us over the last several years.” The person I was talking to replied in a way that perplexed me. He said, “when you think about it, though, he really hasn’t done anything.” Ridiculous. I immediately objected and started my defense:

Well, in the last release, he worked on… that is, I think he was part of the team that did… or maybe it was… well, whatever, bad example. I know in the release before that, he was instrumental in… uh… that thing with the speed improvement stuff, I think. Wait, no, that was Bill. He did the… maybe that was two releases ago, when he… Holy crap, you’re right. He doesn’t do anything!

How did this happen? Meaning, how did I get this so wrong? Am I just an idiot? It could be, except that fails as an explanation for this particular case because the next day I talked to someone who said, “boy, we’re sure going to miss him.” It seemed I was not alone in just assuming that this guy had been an instrumental cog in the work of the group when he had really, well, not been.

In the time that has passed since that incident, I’ve paid attention to people in groups and collaborating on projects. I’ve had occasion to do this as a team member and a team lead, as a boss and a line employee, as a consultant and as a team member collaborating with consultants, and just about everything else you can think of. And what I’ve observed is that this phenomenon is not a function of the people who have been fooled but the person doing the fooling. When you look at people who wind up being highly overrated, they share certain common habits.

If you too want to be highly overrated, read on. Being overrated can mean that you’re mediocre but people think that you’re great, or it can mean that you’re completely incompetent but nestle in somewhere and go unnoticed, doing, as Peter Gibbons in Office Space puts it, “just enough not to get fired.” The common facet is that there’s a sizable deficit between your actual value and your perceived value — you appear useful while actually being relatively useless. Here’s how.

[Image: Tom Sawyer]

1. “Overcommunicate”

I’m putting this term in quotes because it was common enough at one place I worked to earn a spot on a corporate BS Bingo card, but I’ve never heard it anywhere else. I don’t know exactly what people there meant by it, and for all I know, neither do they, so I’m going to reappropriate it here. If you want to seem productive without doing anything useful, then a great way to do so is to make lots of phone calls, send lots of emails, create lots of memos, etc.

A lot of people mistake activity for productivity, and you can capitalize on that. If you send one or two emails a day, summarizing what’s going on with a project in excruciating detail, people will start to think of you as that vaguely annoying person who has his fingers on the pulse all of the time. This is an even better strategy if you make the rounds, calling and talking to people to get status updates as to what they’re doing before sending an email.

Now, I know what you’re thinking — that might actually be productive. And, well, it might be, nominally so. But do you notice that you’ve got a very tangible plan of action here and there’s been no mention of what the project actually involves? A great way to appear useful without being useful is to engage heavily in an activity completely orthogonal to the actual goal.

2. Be Bossy and Critical

Being an “overcommunicator” is a good start, but you can really drive your phantom value home by ordering people around and being hypercritical. If your daily (or hourly) status report is well received, just go ahead and start dropping instructions in for the team members. “We’re getting a little off schedule with our reporting, so Jim, it’d be great if you could coordinate with Barbara on some checks for report generation.” Having your finger on the pulse is one thing, but creating the pulse is a lot better. Now, you might wind up getting called out on this if you’re in no position of actual authority, but I bet you’d be surprised how infrequently this happens. Most people are conflict avoiders and reconcilers and you can use that to your advantage.

But if you do get called out (or even if you don’t), just get hypercritical. “Oh my God, Jim and Barbara, what is up with the reports! Am I going to have to take this on myself?!” Don’t worry about doing the actual work yourself — that’s not part of the plan. You’re just making it clear that you’re displeased and using a bit of shaming to get other people to do things. This shuts up people inclined to call you out on bossiness because they’re going to become sidetracked by getting defensive and demonstrating that they are, in fact, perfectly capable of doing the reports.

3. Shamelessly Self Promote

If a deluge of communication and orders and criticisms isn’t enough to convince people how instrumental you are, it never hurts just to tell them straight out. This is sort of like “fake it till you make it” but without the intention of getting to the part where you “make it.” Whenever you send out one of your frequent email digests, walk around and tell people what hard work it is putting together the digests, saying things like, “I’d rather be home with my family than staying until 10 PM putting those emails together, but you know how it is — we’ve all got to sacrifice.” Don’t worry, the 10:00 part is just a helpful ‘embellishment’ — you don’t actually need to do things to take credit for them (more on that later).

Similarly, if you are ever subject to any criticisms, just launch a blitzkrieg of things that you’ve done at your opponent and suggest that everyone can agree how awesome those things are. List every digest email you’ve sent over the last month, and mention the time you sent each one. By the fifth or sixth email, your critic will just give up out of sheer exasperation and agree that your performance has been impeccable.

4. Distract with Arguments about Minutiae

If you’re having trouble making the mental leap to finding good things about your performance to mention, you can always completely derail the discussion. If someone mentions that you haven’t checked in code in the last month, just point out that in the source control system you’re using, technically, “check in” is not the preferred verbiage. Rather, you “promote code.” The distinction may not seem important, but the importance is subtle. It really goes to the deeper philosophy of programming or, as some might call it, “the art of software engineering.” Now, when you’ve been doing this as long as I have, you’ll understand that code promotions… ha! You no longer have any idea what we were talking about!

This technique is not only effective for deflecting criticism but also for putting the brakes on policy changes that you don’t like and your peers getting credit for their accomplishments. Sure, Susan might have gotten a big feature in ahead of schedule, but a lot of her code is using a set of classes that some have argued should be deprecated, which means that it might not be as future-proof as it could be. Oh, and you’ve run some time trials and feel like you could definitely shave a few nanoseconds off of the code that executes between the database read and the export to a flat file.

5. Time It So You Look Good (Or Everyone Else Looks Bad)

If you ever wind up in the unfortunate position of having to write some code, you can generally get out of it fairly easily. The most tried and true way is for the project to be delayed or abandoned, and you can do your part to make that happen while making it appear to be someone else’s fault. One great way to do that is to create a huge communication gap that appears to be everyone’s fault but yours.

Here’s what I mean. Let’s say that you’re working with Bill and Bill goes home every night at 6:00 PM. At 6:01, send Bill an email saying that you’re all set to get to work, but you just need the class stub from him to get started. Sucker. Now 15 hours are going to pass where he’s the bottleneck before he gets in at 9:00 the next morning and responds. If you’re lucky and you’ve buried him in digest emails, you might even get an extra hour or two.

If Bill wises up to your game and stays a few extra minutes, start sending those emails at like 10:00 PM from home. After all, what’s it to you? It takes just as little effort not to work at 6:00 as it does at 10:00. Now, you’ve given up a few hours of response time, but you’re still sitting pretty at 11 hours or so, and you can now show people that you work pretty much around the clock and that if you’re going to be saddled with an idiot like Bill that waits 12 hours to get you critical information, you pretty much have to work around the clock.

6. Plan Excuses Ahead of Time

This is best explained with an example. Many years ago, I worked as lead on a project with an offshore consultant who was the Rembrandt of pre-planned excuses. This person’s job title was some variant of “Software Engineer” but I’m not sure I ever witnessed software or engineering even attempted. One morning I came in and messaged him to see if he’d made progress overnight on a task I’d set him to work on. He responded by asking if I’d seen his email from last night. I hadn’t, so I checked. It said, “the clock is wrong, and I can’t proceed — please advise.”

After a bit of back and forth, I came to realize that he was referring to the clock in the taskbar on his desktop. I asked him how this could possibly be relevant and what he told me was that he wasn’t sure how the clock being off might affect the long-running upload that was part of the task, and that since he wasn’t familiar with Slackware Linux, he didn’t know how to adjust the clock. I kid you not. A “software engineer” couldn’t figure out how to change the time on his computer and thought that this time being wrong would adversely affect an upload that in no way depended on any kind of timestamp or notion of time. That was his story, and he was sticking with it.

And it is actually perfect. It’s exasperating but unassailable. After all, he was a “complete expert in Windows and several different distributions of Linux,” but Slackware was something he hadn’t been trained in, so how could he possibly be expected to complete this impossible task without me giving him instructions? And, going back to number five, where had I been all night, anyway? Sleeping? Pff.

7. Take Credit in Non-Disprovable Ways

The flip side of pre-creating explanations for non-productivity so that you can sit back in a metaphorical hammock and be protected from accusations of laziness is to take credit inappropriately, but in ways that aren’t technically wrong. A good example of this might be to make sure to check in a few lines of code to a project that appears as though it will be successful so that your name automatically winds up on the roster of people at the release lunch. Once you’re at that lunch, no one can take that credit away from you.

But that’s a little on the nose and not overly subtle. After all, anyone looking can see that you added three lines of white space, and objective metrics are not your friends. Do subjective things. Offer a bunch of unsolicited advice to people and then later point out that you offered “leadership and mentoring.” When asked later at a post mortem (or deposition) whether you were a leader on the project, people will remember those moments and say, grudgingly and with annoyance, “yeah, I guess you could say that.” And, that’s all you’re after. If you’re making sure to self-promote as described in section three, all you really need here is a few people that won’t outright say that you’re lying when asked about your claims.

Is This Really For You?

Let me tell you something. If you’re thinking of doing these things, don’t. If you’re currently doing them, stop. I’m not saying this because you’ll be insufferable (though you will be) and I want to defend humanity from this sort of thing. I’m offering this as advice. Seriously. These things are a whole lot more transparent than the people who do them think they are, and acting like this is a guaranteed way to have a moment in life where you wonder why you’ve bounced around so much, having so much trouble with the people you work with.

A study I once read on the nature of generosity said that appearing generous conferred an evolutionary advantage. Apparently, generous people were more likely to be the recipients of help during lean times. It also turned out that the best way to appear generous was actually to be generous, since false displays of generosity were usually discovered and resulted in ostracism and a substantially worse outcome than even simply being miserly. It’s the same thing in the workplace with effort and competence. If you don’t like your work or find it overwhelming, then consider doing something else or finding an environment that’s more your speed rather than being manipulative or playing games. You and everyone around you will be better off in the end.

Wasted Talent: The Tragedy of the Expert Beginner

Back in September, I announced the Expert Beginner e-book. In that same post, I promised to publish the conclusion to the series around year-end, so I’m now going to make good on that promise. If you like these posts, you should definitely give the e-book a look, though. It’s more than just the posts strung together — it shuffles the order, changes the content a touch, and smooths them into one continuous story.

But, without further ado, the conclusion to the series:

The real, deeper sadness of the Expert Beginner’s story lurks beneath the surface. The sinking of the Titanic is sharply sad because hubris and carelessness led to a loss of life, but the sinking is also sad in a deeper, more dull and aching way because human nature will cause that same sort of tragedy over and over again. The sharp sadness in the Expert Beginner saga is that careers stagnate, culminating in miserable life events like dead-end jobs or terminations. The dull ache is the endlessly mounting deficit between potential and reality, aggregated over organizations, communities and even nations. We live in a world of “ehhh, that’s probably good enough,” or, perhaps more precisely, “if it ain’t broke, don’t fix it.”

There is no shortage of literature on the subject of “work-life balance,” nor of people seeking to split the difference between the stereotypical, ruthless executive with no time for family and the “aim low,” committed family type that pushes a mop instead of following his dream, making it so that his children can follow theirs. The juxtaposition of these archetypes is the stuff that awful romantic dramas starring Katherine Heigl or Jennifer Lopez are made of. But that isn’t what I’m talking about here. One can intellectually stagnate just as easily working eighty-hour weeks or intellectually flourish working twenty-five-hour ones.

I’m talking about the very fabric of Expert Beginnerism as I defined it earlier: a voluntary cessation of meaningful improvement. Call it coasting or plateauing if you like, but it’s the idea that the Expert Beginner opts out of improvement and into permanent resting on one’s (often questionable) laurels. And it’s ubiquitous in our society, in large part because it’s encouraged in subtle ways. To understand what I mean, consider institutions like fraternities and sororities, institutions granting tenure, multi-level marketing outfits, and often corporate politics with a bias toward rewarding loyalty. Besides some form of “newbie hazing,” what do these institutions have in common? Well, the idea that you put in some furious and serious effort up front (pay your dues) to reap the benefits later.

This isn’t such a crazy notion. In fact, it looks a lot like investment and saving the best for last. “Work hard now and relax later” sounds an awful lot like “save a dollar today and have two tomorrow,” or “eat all of your carrots and you can enjoy dessert.” For fear of getting too philosophical and prying into religion, this gets to the heart of the notion of Heaven and the Protestant Work Ethic: work hard and sacrifice in the here and now, and reap the benefits in the afterlife. If we aren’t wired for suffering now to earn pleasure later, we certainly embrace and inculcate it as a practice, culturally. Who is more a symbol of decadence than the procrastinator–the grasshopper who enjoys the pleasures of the here and now without preparing for the coming winter? Even as I’m citing this example, you probably summon some involuntary loathing for the grasshopper for his lack of vision and sobriety about possible dangers lurking ahead.

A lot of corporate culture creates a manufactured, distorted version of this with the so-called “corporate ladder.” Line employees get in at 8:30, leave at 5:00, dress in business-casual garb, and usually work pretty hard or else. Managers stroll in at 8:45 and sometimes cut out a little early for this reason or that. They have lunches with the corporate credit card and generally dress smartly, but if they have to rush into the office, they might be in jeans on a Thursday and that’s okay. C-level executives come and go as they please, wear what they want, and have you wear what they want. They play lots of golf.

There’s typically not a lot of illusion that those in the positions of power work harder than line employees in the sense that they’re down operating drill presses, banging out code, doing data entry, crunching numbers, etc. Instead, these types are generally believed to be the ones responsible for making the horrible decisions that no one else would want to make and never being able to sleep because they are responsible for the business 24/7. In reality, they probably whack line employees without a whole lot of worry and don’t really answer that call as often as you think. Life gets sweeter as you make your way up, and not just because you make more money or get to boss people around. The C-level executives…they put in their time working sixty-hour weeks and doing grunt work specifically to get the sweet life. They earned it through hard work and sacrifice. This is the defining narrative of corporate culture.

But there’s a bit of a catch here. When we culturally like the sound of a narrative, we tend to manufacture it even when it might not be totally realistic. For example, do we promote a programmer who pours sixty hours per week into his job for five years to manager because he would be good at managing people or because we like the “work hard, get rewarded” story? Chicken or egg? Do we reward hard work now because it creates value, or do we manufacture value by rewarding it? I’d say, in a lot of cases, it’s fairly ingrained in our culture to be the latter.

In this day and age, it’s easy to claim that my take here is paranoid. After all, the days of fat pensions and massive union graft have fallen by the wayside, and we’re in some market, meritocratic renaissance, right? Well, no, I’d argue. It’s just that the game has gotten more distributed and more subtle. You’ll bounce around between organizations, creating the illusion of general market merit, but in reality, there is a form of subconscious collusion. The main determining factor in your next role is your last role. Your next salary will probably be five to ten percent more than your last one. You’re on the dues-paying train, even as you bounce around and receive nominally different corporate valuations. Barring aberration, you’re working your way, year in and year out, toward an easier job with nicer perks.

But what does all of this have to do with the Expert Beginner? After all, Expert Beginners aren’t CTOs or even line managers. They’re, in a sense, just longer-tenured grunts that get to decide what source control or programming language to use. Well, Expert Beginners have the same approach, but they aim lower in the org chart and have a higher capacity for self-delusion. In a real sense, management and executive types are making an investment of hard work for future Easy Street, whereas Expert Beginners are making a more depressing and less grounded investment in initial learning and growth for future stagnation. They have a spasm of marginal competence early in their careers and coast on the basis of this indefinitely, with the reward of not having to think or learn and having people defer to them exclusively because of corporate politics. As far as rewards go, this is pretty Hotel California. They’ve put in their time, paid their dues, and now they get to reap only the meager rewards of intellectual indolence and ego-fanning.

In terms of money and notoriety, there isn’t much to speak of either. The reward they receive isn’t a Nobel Prize or a world championship in something. It’s not even a luxury yacht or a star on the Walk of Fame. We have to keep getting more modest. It’s not a six bedroom house with a pool and a Lamborghini. It’s probably just a run-of-the-mill upper middle class life with one nice vacation per year and the prospect of retiring and taking that trip they’ve always wanted, a visit to Rome and Paris. They’ve sold their life’s work, their historical legacy, and their very existence for a Cadillac, a nice set of woods and irons, a tasteful ranch-style house somewhere warm, and a trans-Atlantic flight or two in retirement. And that–that willingness to have a low ceiling and that short-changing of one’s own potential–is the tragedy of the Expert Beginner.

Expert Beginners are not dumb people, particularly given that they tend to be knowledge workers. They are people who started out with a good bit of potential–sometimes a lot of it. They’re the bowlers who start at 100 and find themselves averaging 150 in a matter of weeks. The future looks pretty bright for them right up until they decide not to bother going any further. It’s as if Michael Jordan had decided that playing some pretty good basketball in high school was better than what most people did, or if Mozart had said, “I just wrote my first symphony, which is more symphonies than most people write, so I’ll call it a career.” Of course, most Expert Beginners don’t have such prodigious talent, but we’ll never hear about the accomplishment of the rare one that does. And we’ll never hear about the more modest potential accomplishments of the rest.

At the beginning of the saga of the Expert Beginner, I detailed how an Expert Beginner can sabotage a group and condemn it to a state of indefinite mediocrity. But writ large across a culture of “good enough,” the Tragedy of the Expert Beginner stifles accomplishments and produces dull tedium interrupted only by midlife crises. En masse in our society, they’ll instead be taking it easy and counting themselves lucky that their days of proving themselves are long past. And a shrinking tide lowers all boats.

Decision Points in Programming

I have a sort of personality quirk that causes me to constantly play what I describe to others as the “what-if game.” This is where I have some kind of oddball thought about altering something that we take for granted and imagining how it plays out. Lest you think that I’m engaging in fatuous self-aggrandizing, I’m not talking about some kind of fleeting stoner thought like “what if I had like eight million Doritos and also X-ray vision?!?” I mean that I actually really start to think strange things through in detail.

For example, not too long ago I was in an elevator and thought to myself, “would I ride this elevator if I knew that there was a 1 in 10,000 chance that the elevator would plummet to the bottom of the elevator shaft?” I thought that I would. I was going up a ways and the odds were in my favor. I then thought to myself that this was a rational choice but viscerally insane — why take the chance?
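
As an aside, the "rational but viscerally insane" tension has some simple arithmetic behind it: one ride at 1-in-10,000 odds is nearly harmless, but the risk compounds over repeated rides. Here's a quick back-of-the-envelope sketch in Java, where the ride counts are just assumptions for illustration:

```java
public class ElevatorOdds {
    public static void main(String[] args) {
        double plummetChance = 1.0 / 10_000; // per-ride odds from the thought experiment

        // Hypothetical ride counts: a single trip, a year of twice-daily commuting, a forty-year career.
        int[] rideCounts = { 1, 2 * 250, 2 * 250 * 40 };

        for (int rides : rideCounts) {
            double survival = Math.pow(1 - plummetChance, rides);
            System.out.printf("%,d rides: %.2f%% chance of never plummeting%n", rides, survival * 100);
        }
    }
}
```

One trip is a 99.99% sure thing, but a career of twice-daily commuting drops the odds of never plummeting to roughly 14 percent, which is part of why the rest of this thought experiment gets so dramatic.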

This led to the thought “what if elevators around the world just suddenly had those odds of that outcome for some reason, and it was intrinsic to the nature of elevators?” Meaning, nothing we could do would possibly fix it. Elevators are now synonymous with 1 in 10,000 plummets. How does the world react?

It’s a wild thing to think about, but the predictive possibilities and analysis are endless. First of all, we’ve got all of these tall buildings, so it’s not as though we’d just leave everything in them and become brownstone dwellers. Some brave souls would go up to get things from these buildings and keep playing the odds. The property value of high-rises would immediately plummet, and you’d probably invert the real-estate structure nearly overnight, with suburban/country home prices skyrocketing and swanky downtown high-rises becoming where extremely poor people and drug addicts lived (who else would routinely brave the odds?). I think the buildings would still stand because of the sheer amount of elevation required to knock them down and the fact that we actually develop quite a tolerance for risky things (like driving to work every day).

There’d also be odd anthropological effects. I’d imagine that a whole generation of teenage thrill seekers and death defiers would start doing elevator joy rides to prove their mettle. People would develop all kinds of cargo cult ways to stand or sit in the elevators with a mind toward simply surviving the plummets. In fact, perhaps humankind would just become really good at making the plummets survivable. Politically, I’d imagine that a huge wedge issue debate would emerge about freedom to ride elevators versus the sanctity of life or something. I could go on forever about this, but I’ll have mercy and stop now.

The point is that I take these mental trips several times per day, considering a whole variety of topics. Most of the thoughts that emerge are bizarre and beneficial only as exercises in creativity, like the elevator example, but some are genuine ideas for reboots in thinking about our craft. I find the exercise of indulging these mental divergences and quasi-daydreams to be a good way to get the subconscious brain working on perhaps more immediate problems.

So if you’re up for it, I invite you to have a go at this sort of thinking, but perhaps in a more structured sense. At times in the history of programming, decisions were made or ideas proposed that wound up having a profound effect on the industry. Imagine a world where these had gone differently:

  1. Tony Hoare introduced what he later called “the billion dollar mistake”: the null reference. But what if there were no null? (There’s a small sketch of a null-free style after this list.)
  2. A lot of what we do to this day as programmers has its roots in decisions made for the typewriter: for example, the QWERTY keyboard and using CR/LF for end of line. What if these conventions had been different when the computer started to take off?
  3. Edsger Dijkstra famously swung the tide against the use of GOTO as a programming language construct with a seminal paper. What if it had popularly stuck around to this day and GOTO statements were still something we thought about a lot?
  4. Of the three programming paradigms (structured, object-oriented, and functional), functional is the oldest, but it lay dormant for 40 years or so before gaining serious popularity today. What would the world look like if it had been the most popular from the get-go and stayed that way?
  5. C++ really took OOP mainstream, but it did it in a language that was effectively a superset of C, a non OOP language. This allowed for the continuation of a very procedural style of programming in an OOP language. What if that cut had just been made cleanly?
  6. What if the most popular object oriented languages didn’t have the concept of “static” and everything had to belong to an instance?
  7. What if JavaScript had been carefully planned in an enterprise-y way, instead of thrown together in 10 days?
  8. If disk space had always been as cheap as it is now and the need for stored information rather than calculation had been higher, would the RDBMS as we know it ever have become popular?
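
To pick on the first item a bit, most mainstream languages have since bolted on an opt-in answer to the null problem, which gives a taste of what a null-free world might have looked like from the start. Here's a small Java sketch contrasting the two styles; the User type and the findUser lookups are made up purely for illustration:

```java
import java.util.Map;
import java.util.Optional;

public class NoNullThoughtExperiment {

    static class User {
        final String name;
        User(String name) { this.name = name; }
    }

    static final Map<String, User> USERS = Map.of("alice", new User("Alice"));

    // The world we got: absence is an implicit null that the signature never mentions.
    static User findUserNullable(String id) {
        return USERS.get(id); // returns null for unknown ids, and callers must just know that
    }

    // A glimpse of the counterfactual: absence is explicit in the type and must be handled.
    static Optional<User> findUser(String id) {
        return Optional.ofNullable(USERS.get(id));
    }

    public static void main(String[] args) {
        try {
            // Compiles fine, blows up at runtime: the billion dollar mistake in miniature.
            System.out.println(findUserNullable("bob").name.length());
        } catch (NullPointerException e) {
            System.out.println("NPE for unknown user");
        }

        // The caller is forced to decide what "missing" means before touching the value.
        String greeting = findUser("bob").map(u -> "Hello, " + u.name).orElse("No such user");
        System.out.println(greeting);
    }
}
```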

Thinking through these things might just be a random exercise in imagination. But, who knows, it may give you an oblique solution to a problem you’ve been mulling over or a different philosophical approach to some aspect of programming. Things that we do, even highly conventional or traditional ones, are always fair game for reevaluation.

Years of Lessons Learned from Home Automation

I’ve had three variations of my home automation setup. The first incarnation was a series of Linux command line utilities and cron jobs. There was some vague intention of a GUI, but that never really materialized. The second was a very enterprise-y J2EE implementation that involved Spring, MongoDB, layered architecture, and general wrangling with Java. The current and most recent reboot involves nodes, in a nod to Service Oriented Architecture and the “Internet of Things.” As I mentioned in my last post, I’m turning a Raspberry Pi into a home automation controlling REST endpoint and then letting access and interaction happen in a more distributed, ad-hoc fashion.

The flow of these applications seems to reflect the trajectory of my career from entry level developer to architect — from novice and hobbyist to experienced professional and, well, still hobbyist. And I think that, to an extent, they also reflect the times and trends in technology. It’s interesting to reflect on it.

When I started out as a programmer in the working world, I was doing a lot in the Linux world with C and C++. In that world, there was no cred to writing any kind of GUI — it was all about being close to the metal, and making things work behind the scenes. GUIs were for the faint of heart. I wrote drivers and kernel space code and automated various interactions between hardware and software. This mentality was carried over into the world of hobby when I discovered home automation. X10 was the province of hobbyist electrical engineers who wrote code out of necessity, and I fell in nicely with this approach. It was all about banging away, hacking, and making things work. Architecture, planning, testing, deployment strategies, etc… who cares? Making it work was all that mattered. I was a beginner.

As my career wound on, I started doing more and different kinds of programming. I found my way into web development with Java, did things in the .NET space, worked with databases, and started to become interested in architecture, software processes and honing my craft. With my newfound knowledge of a breadth of technologies and better software development approaches, I decided on a home automation reboot. I chose Linux and Java to keep the budget as shoe-string as possible. For a server, I could use the machine I took with me to college — a 400 MHz P2 processor and 384 meg of RAM. The hardware, OS, and software were thus all free, and all I had to do was pop for the X10 modules at $10-$20 a piece. Not too shabby.

I was cost-conscious, and I had a technical vision for the architecture. I knew that if I created a web application on the server, what I did would be accessible from anywhere: Windows computers, Linux computers, even cell phones (which were a lot more limited as nodes in a network 5-6 years ago when I started laying this out). Java was a good choice because it gave me a framework to integrate all of the different functionality that I could imagine. And I imagined plenty of it.

There was no shortage of gold plating. Part of this was because I was interested in learning new technologies as long as I was doing hobby work and part of this was because I hadn’t yet learned the value of limiting myself to the minimum set of features needed to get going. I had advanced technically enough to see the value in architecture and having a plan for how I’d handle future added features, but I hadn’t advanced enough to keep the system flexible without putting more in up front than I needed. A web page with a link for turning a lamp on may not need data access, domain, service, and presentation layers. And, while I had grand plans to integrate things like home inventory management, recipe tracking, a family calendar and more, those never actually materialized due to how busy I tend to be. But I was practicing my craft and teaching myself these concepts by exploring them, so I don’t look back ruefully. Lesson learned.

Now, I’m rebooting. My old P2 machine is dying slowly but surely, and I recently purchased a lake house where I want to replicate my setup. I don’t have another ancient machine, and it’s time to get more repeatable anyway. A minimal REST endpoint on a Raspberry Pi is cheap and repeatable, and it lets me build the system in my house(s) more incrementally and flexibly. If I want to use WPF to build a desktop app for controlling the thing, then great. If I want to use PHP or Java on a server, then also great. ASP MVC, whatever. Anything that can speak REST will work, and everything speaks REST.
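
For a sense of how minimal that endpoint can be, here's a sketch using nothing but the JDK's built-in HTTP server: one route that flips a lamp on or off. The route, the port, and the switchLamp stub are my own assumptions (the stub is where the real X10 or GPIO call would go), but the shape is the point: any client that can issue an HTTP request can drive it.

```java
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

public class LampEndpoint {

    public static void main(String[] args) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);

        // e.g. PUT /lamps/livingroom/on  or  PUT /lamps/livingroom/off
        server.createContext("/lamps/", exchange -> {
            String[] parts = exchange.getRequestURI().getPath().split("/");
            String response;
            int status;
            if ("PUT".equals(exchange.getRequestMethod()) && parts.length == 4) {
                String lamp = parts[2];
                boolean on = "on".equals(parts[3]);
                switchLamp(lamp, on);
                response = lamp + " turned " + (on ? "on" : "off");
                status = 200;
            } else {
                response = "expected PUT /lamps/{name}/on|off";
                status = 400;
            }
            byte[] body = response.getBytes(StandardCharsets.UTF_8);
            exchange.sendResponseHeaders(status, body.length);
            try (OutputStream out = exchange.getResponseBody()) {
                out.write(body);
            }
        });

        server.start();
        System.out.println("Home automation endpoint listening on port 8080");
    }

    // Stub for whatever actually talks to the X10 modules or GPIO pins.
    static void switchLamp(String lamp, boolean on) {
        System.out.println((on ? "ON:  " : "OFF: ") + lamp);
    }
}
```

From there, something like curl -X PUT http://<pi-address>:8080/lamps/livingroom/on flips the light from any machine on the network, whether the caller is a WPF app, a PHP page, or a phone.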

Maybe in another three years, I’ll do the fourth reboot and think of how silly I was “back then” in 2013. But for now, I’ll take the lessons that I’ve learned in my reboots and reflect. I’ve learned that software is about solving problems for people, not just for the sake of solving the problem. A cron job that I can tweak turns my lights on and off, but it leaves the system weird and confusing for my non-technical girlfriend. I’ve learned that building more than what you need right now is a guarantee that you’ll have more complexity than you need and less benefit. I’ve learned that a system composed of isolated, modular components is better than a monolithic juggernaut that can handle everything. And, most importantly, I’ve learned that you’ve never really got it all figured out; whatever grand plan you have right now is going to need constant care and refinement.