At various times in my professional life I have been someone who builds software, someone who leads other people who build software, someone who sells and translates the power of custom software to clients, someone who supports the work of people who build software, and someone who tests software as a user. Each role has its joys and challenges, but I've realized that so far I find building software to be the most fun and rewarding of all of them.
This post is about some of the thinking behind how I build software and the processes I've found to work best over time:
- Great blueprints make great buildings
- We don't know what users want
- Do the hardest things first
- Security and performance are not afterthoughts
- Don't trust memory, document everything
- Let your peers save you from yourself
- Don't launch on Friday
- Maintenance is software development, too
At the end I've also included links to some books and other reading about software development that I've found useful.
Great blueprints make great buildings
There's such a thing as too much planning, but it's rarely something you find in the world of software development. Most people (myself included) who have an idea for a tool, app or other piece of software want to see their vision rendered into a working thing as soon as possible. The excitement about building often drowns out any concern for the details of what to build and how to build it.
But I know from experience that if the person who wants the software thing built (the client) is not the same as the person actually building it (the developer), there will almost always be a gap to close between their understandings of process, milestones, expectations and criteria for success. Even something as simple as "a website that takes a single text input and displays a message based on that input" can be reasonably interpreted and implemented a hundred different ways.
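To make that concrete, here's a purely illustrative sketch (in Python, with made-up message data) of two developers implementing that same one-line spec in ways that are both defensible and yet behave differently on the same input:

```python
# Hypothetical message table; the spec above says nothing about what
# the messages are or how inputs map to them.
MESSAGES = {"hello": "Hi there!", "bye": "See you later!"}

def respond_strict(text):
    # Reading A: exact, case-sensitive lookup; unrecognized input is an error.
    if text not in MESSAGES:
        raise ValueError("Unrecognized input")
    return MESSAGES[text]

def respond_lenient(text):
    # Reading B: trim whitespace and lowercase first; unrecognized
    # input gets a friendly default message instead of an error.
    return MESSAGES.get(text.strip().lower(), "Sorry, I don't understand.")
```

Given the input `"Hello "`, the first version raises an error while the second cheerfully replies. Neither developer is wrong; the blueprinting conversation is where the client decides which behavior they actually wanted.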
So there's just no substitute for a careful, intentional, detail-oriented blueprinting process. (What I call blueprinting, other people might call variations of discovery, technical requirements and specification development, wire-framing, creating a statement of work, and many other names.)
I prefer to have an ongoing and open-ended conversation with my client about what they want and why they want it. I prefer to have the decision points and outcomes of the conversation documented in a shared and always accessible way for everyone to see. The conversation almost always needs to involve other people who are good at graphic design, user interface and user experience development. And I always want the people who will eventually be the ones to say "approved for launch!" to be in the conversation from the very beginning; I have so many stories of projects where the person making the decisions up until launch time didn't actually have the ultimate authority to do so, and then much of the work had to be modified or scrapped altogether.
Ideally, when the blueprinting process is done there should be a clear and shared understanding of what the end result is going to do, how it's going to look, and what resources (time, people, money, etc.) are going to be required to get there. There should be little or no room for interpretation. That doesn't mean the building process can have no room for changes or even spontaneity, but it does mean that everyone involved and affected understands the difference between a "change" and a failure to plan or to adhere to the plan.
Blueprinting can be one of the hardest parts of software development. It's often the slowest, the most tedious, and the most fraught with opportunities for conflict. I suspect many clients and developers would prefer to skip it altogether if they could - and some do. But in my experience, it's one of the most important factors in building good software on a reasonable timeline at a reasonable cost.
We don't know what users want
As someone who has spent a lot of time using software over the years, I might think I know all about what constitutes an intuitive user interface or a well-organized menu. But I've come to learn that I should not assume that I know anything about what users want until I actually learn that directly from users themselves. The same is true of my clients; despite their best intentions, they may not know their users very well, and I shouldn't take for granted that they do.
This becomes painfully obvious when I compare my profile as an educated, white, English-speaking, currently abled male with a background in technology to the profile of all the other software users out there. If I create software for people just like me, I will almost certainly exclude many other people who could have found it useful. This was highlighted quite poignantly for me when Kat Holmes (an advisor to Automattic) corrected my framing of disability as "a personal health condition," describing it instead as cases where someone's environment is not matched to that person's unique situation. Oh, how many times have I been a part of creating that mismatched environment!?
Understanding what users want and need doesn't mean working from anecdotal information about what one or two of them say about your software as you test it (though this could still be helpful). When you start thinking about inclusive design as essential to software development, it becomes a whole intertwined layer of the blueprinting, planning, development and testing process. It means focus groups, testing booths, surveys, diversity in planning and development teams, filtering out cognitive biases, new kinds of web standards, massive data analysis operations, and much more. It's important, fascinating stuff - my colleague John Maeda writes more about inclusive design thinking in many places, including here and here.
Do the hardest things first
Software development work offers so many opportunities for positive feedback - I built it, I ran it, it worked! - that it can be easy to get in the habit of pursuing quick wins and low-hanging fruit to keep that good feeling alive. Tasks that require a more methodical approach, or that depend on complex interactions with other people, systems or APIs don't offer the same rush of accomplishment. And so it's tempting to do the easy things first and put off the hard things until "later."
The problem I've found is that later is always later and never now. And when the time finally comes to build the hard thing, I've frequently had the experience that there was something I wish I'd done a long time ago - get a question answered by the client, run a test to confirm a planned approach, seek the input of a coworker, and so on. Sometimes building the big, hard things can actually change how the easy, smaller things work and it would have been better overall if I'd started there.
One of the things I dislike most in software development is having to tell a client that something is not going to launch on time for no other reason than that I was bad at planning. Time estimation in software is so hard, and so frequently done poorly or not done at all. Most developers (myself included) are inherently bad at predicting how long something will take them to do, and even when the first three phases of a project have all taken three times as long as originally estimated, there's still a good chance they'll make a bad guess on phase four. 🙂 Putting off the hard things until the end makes that dynamic even worse: that's when bad estimates bite even harder than they would otherwise.
Yes, it's always fun to mix some quick wins in along the way. But in general, I do the hardest things first.
Security and performance are not afterthoughts
I have built so much insecure, poorly performing software in my life that it's not even funny. I cringe when headlines scold software companies for some major security breach - for the people affected, yes, but also for the software developers whose oversights made the breach possible. There but for the grace of your favorite deity go I. The same is true when you hear of a website or point of sale crashing at some critical moment. It's quite easy to build insecure, slow software, and so many of us do.
I know now that thinking about security and performance has to be a part of the software development process all along. If I write one function with the thought that "I'll come back later to secure this," then I've already lost. If I don't have tools like test suites, load testers and code sniffers involved throughout, I know that there will be a reckoning later on, sometimes a very expensive or embarrassing one. And I also know that most software developers (myself included) are woefully optimistic and/or unaware about all the ways that software can be manipulated into doing things it wasn't intended to do. I have a few co-workers at Automattic who can look at lines of code that I see as bulletproof and rattle off three different ways it would fall at the hands of skilled attackers. I secretly worship them.
And so I try (and fail, I'm sure) to bring security and performance best practices into my software building work all the time. I think about validating, sanitizing and escaping more than ever before. I use a code sniffer with a fairly strict ruleset integrated into my code editor so that I am confronted with errors and warnings as early as possible. I read articles about the details of recent public software exploits so I can try to understand how they worked. I take courses and workshops. And I assume that any data I work with online and in the world of software will eventually be compromised, thinking about how to minimize the negative effects of that eventuality where possible.
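As a concrete (and much simplified) illustration of that validate/sanitize/escape habit, here's a sketch in Python of handling an untrusted display name before rendering it into HTML. The function names, length limit, and rules are my own inventions for this example, not from any particular framework:

```python
import html
import re

MAX_NAME_LENGTH = 50

def validate_name(raw):
    # Validate: reject input that can't possibly be a display name.
    if not raw or len(raw) > MAX_NAME_LENGTH:
        raise ValueError("Display name must be 1-50 characters")
    return raw

def sanitize_name(raw):
    # Sanitize: strip control characters and collapse runs of whitespace.
    cleaned = re.sub(r"[\x00-\x1f\x7f]", "", raw)
    return " ".join(cleaned.split())

def render_greeting(raw):
    # Escape: encode for the output context (here, HTML) at the last moment.
    name = sanitize_name(validate_name(raw))
    return "<p>Hello, {}!</p>".format(html.escape(name))
```

The key habit is that each step happens every time, in order, rather than being something to "come back later" for: a name like `<script>alert(1)</script>` survives validation but reaches the page as harmless escaped text instead of executable script.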
Don't trust memory, document everything
For every software project (or any technical project I take on, really) I start a text document where I drop any and all notes, links, code snippets and other material related to my work. For each step along the way I type a summary of what I did and how I did it; the only exception might be if that information is available in a commit log for a given code repository. If I explored a certain approach but ultimately decided against using it, I have notes about what I tried, why it didn't work, and why I left it behind. If I encounter a problem or a bug, I link out to all of the Google results and Stack Overflow articles that helped me understand how to get around it.
Most usefully, I have a section where I have the current thing I'm working on, along with the next two or three things I want to tackle. Even if all of those are reflected in a GitHub issues list or elsewhere, I have a local copy that I can focus on. Even if I'm only taking a couple hours for a break, I don't step away from my computer until I've written down what's next when I come back.
These practices free up my brain to think about the more interesting parts of problem solving without worrying about retracing my steps through past work. They keep me focused and make sure I don't forget the tiny to-do items I want to circle back to later, especially when those items aren't quite appropriate for a shared issue tracker ("Should I have duplicate detection on that migration script? Research needed.")
Best of all, I'm creating some useful history that I can refer back to later when I'm creating client-facing documentation, debriefing the project or planning for a future one. I've saved myself so much time by avoiding solving the same kinds of problems twice and just searching through my notes instead.
Let your peers save you from yourself
Despite all of the tool talk above, there's really no substitute for having another human being look at my code with me. Not only will they see things I don't (security issues, unnecessarily repeated functionality, logic errors, etc.); the very process of talking about software together will often uncover useful tips and tricks that can be incorporated into my own workflow. (It's one of the things I miss most about being in the same physical space with my coworkers - the opportunities to soak in new and better ways of doing things just by casual observation in the course of the day.)
It's tempting for any software developer (or creative person?) to resist having their work commented upon or criticized. Code review is fertile ground for ego problems, power struggles or just plain territorialism. But I've found that when I let go of any pride or sense of ownership and really listen to what my peers have to say about my code, the long-term value of what I learn far outweighs the short-term rush from "going it alone."
It's also tempting for people who manage software development processes to avoid building in time for peer code review. I've been that person, trying to be accountable to a client's desire for a fast timeline and just wishing that the thing we were building could be done faster. But I think it's a mistake to cut corners here, and in my own experience, it's an area where many teams could be investing more time and still not doing enough.
Don't launch on Friday
(For the purposes of this discussion, "Friday" should be generalized as "the end of someone's regular work week and the beginning of their non-work leisure time." I realize some people don't work in a traditional Monday-Friday schedule.)
Sometimes things get finished up on a Friday afternoon. And it's so tempting to push them out the door, confident that they will work well enough to make it until Monday. I have done this so many times. Sometimes I would get lucky, but sometimes I would really regret it. And now, it's just easier to know that Fridays are off limits for launching things.
Maybe there's something in our brains that makes us overly optimistic as the end of the week comes. The part of me that works well under pressure certainly enjoys that bit of crunch and excitement knowing that the clock is ticking. But if the code is truly finished, it will still be finished on Monday morning when you have the whole week ahead of you to troubleshoot bugs, respond to questions about what's changed, and communicate with coworkers and stakeholders during your normal working hours.
By being patient in this way, you're avoiding that scenario where you launch, the whole thing crashes down, and then you have to spend part of your weekend fixing it. Or the nightmare where you launch and things don't crash right away, but you've introduced a bug that was silently deleting critical data...and by the time you discover and fix it it's been out there for several days or more.
(There are times where peak usage times or client demands will necessitate launching on a Friday. If this happens, I find it's best to rearrange the team's schedule so that "Friday night" is essentially "Wednesday morning" and everyone will be around to help with any issues that come up. Once things are quiet they can take a break and go from there.)
Maintenance is software development, too
Releasing software into the world should be one of the most exciting times in the software development lifecycle. It's going to get used by real people in real-world ways, and you're going to get feedback unlike anything you were able to generate during internal testing. Unfortunately, so much software is released and then left unmaintained or even abandoned.
Sometimes this can be okay. Most software is purpose-built, and the problem it was built to solve will probably change soon enough to need a new software thing built for a slightly different set of requirements.
But I've found I don't enjoy working on software as much when its post-launch future is unknown or ignored. Certainly for the software I build for fun, I get really clear early on about whether it's something I'll be committing my time to maintaining over the years, or if it's something I'll label as unmaintained from the start. But paying clients and constantly evolving teams of developers are different - maybe the funds won't be there, or the user demand, or the political will, or something else. Still, I think it's important to approach software maintenance with the same care and consideration as other parts of a project.
Thanks for reading. Are there concepts or practices you have found especially useful (or especially NOT useful) in the world of software development?