The "I Don't Know" Problem: Twelve questions for your engineering leadership
Almost 25 years ago, Joel Spolsky asked developers to grade their workplace's effectiveness with twelve straightforward, yes-or-no questions. Today, we'd like to ask you twelve more questions, and we want you to think really hard about the answers.
Mar 8, 2024 • 8 Minute Read
This is a sales pitch.
We make a product called "Flow"; at a glance it's a developer-analytics tool, but it's a lot more than that. It does some interesting things that I think are important. We'll get to that in a minute.
Almost 25 years ago, Joel Spolsky asked developers to grade their workplace's effectiveness as a software shop with twelve straightforward, yes-or-no questions:
Do you use source control?
Can you make a build in one step?
Do you make daily builds?
Do you have a bug database?
Do you fix bugs before writing new code?
Do you have an up-to-date schedule?
Do you have a spec?
Do programmers have quiet working conditions?
Do you use the best tools money can buy?
Do you have testers?
Do new candidates write code during their interview?
Do you do hallway usability testing?
As he described it, "A score of 12 is perfect, 11 is tolerable, but 10 or lower and you’ve got serious problems. The truth is that most software organizations are running with a score of 2 or 3, and they need serious help, because companies like Microsoft run at 12 full-time."
Today, in a world where version-control-as-a-service is one click away, containerization is everywhere and hey buddy your first continuous deployment is free, it's strange to imagine working in a software shop where the answer to some of these might be "no". A few things have changed - specs and schedules up-front have grown at odds with the agility and… scrumminess, if that's even a word [1], that most shops aspire to today - but with nearly a quarter-century of hindsight, an eternity in industry time, his thesis has almost entirely proven out.
Today - apart from Point 8 - Spolsky's twelve questions have become common wisdom, solved problems or both. Some grew into mature tools or processes, some were standardized or saasified [2]; either way, they've become the background noise of modern development. This is no criticism - it's a big improvement over what we all had back in the day - and if you're early in your career, be honest: can you imagine trying to get anything done in a world where these are open questions? Building software without version control? Trying to collaborate and ship when getting something to build at all required a contraption, when you couldn't run a test suite, integrate changes or deploy in a single motion?
Where the answer to "where do I file a bug" might be "I don't know"?
Madness. Absolute lunacy. You might as well be trying to chisel your software out of a rock with a slightly nicer rock while muttering about how well your gazelle femur collection will scale. It's amazing that anything worked at all, and the cold, hard truth is that most of what came out of the shops operating in that zero-to-four range didn't.
Point 8, though. Why does Point 8 jump out of that list? Why is that still a question?
I think the most important questions in software aren't about source control or schedules and never have been. The real Experience of being a Developer, the stuff that matters most, has almost nothing to do with your tools, and everything to do with your management and leadership. And the job of management is never getting easier, because people are a lot more complicated and a lot more important than code, and this industry's leadership has put more than four hundred thousand of those people in the streets since 2022, and I'll bet you a thousand dollars that damn near every last one of those people knew where to file a bug.
That's what's different about Point 8; it's the only thing on Spolsky's list that is unambiguously the responsibility of leadership.
If you're a manager here in 2024 wondering why your people - you know, the people worried they might not have a job next week if they can't convince a VP that a 90-minute commute to an open-plan office where the only doors that close are in the bathroom isn't exactly a boon to productivity - might not be laser-focused on "developer velocity" or "CI/CD throughput", I have some questions for you:
Do you know how long bugs take to fix, on average? (There's a sketch of how you'd check, after this list.)
Can you see the state of your team's backlog at any time?
Do you have usage or throughput metrics from your dev environment?
Do you have a person or team responsible for improving your tooling and workflow?
Do you use metrics to inform training, recognition and promotion decisions?
Can you measure the effectiveness of cross-team collaborations?
Can you measure processes before and after you change them?
If an issue goes untriaged or a pull request goes unreviewed too long, is anyone automatically notified? (There's a sketch of that after this list, too.)
Do you look for empirical research when you're making decisions about process or culture change?
Can you use historical data to predict future shipping schedules?
Do you know what parts of your codebase are the most expensive?
Do you have a way to show your org's leadership that your team is investing most of their time in their highest priorities?
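To be fair, none of these require buying anything; most of them yield to an afternoon and a script. Here's a minimal sketch of the first question - average time-to-fix - run against a CSV export of your issue tracker. The column names ("type", "opened_at", "closed_at") and the file name are assumptions, not any real tracker's schema; rename them to match whatever yours exports.

```python
# A sketch, not Flow: answering "how long do bugs take to fix, on
# average?" from an issue-tracker CSV export. Column names and the
# file path are placeholders - adjust them to your tracker's export.
import csv
from datetime import datetime
from statistics import mean, median

def fix_times_in_days(path: str) -> list[float]:
    """Days from open to close for every closed bug in the export."""
    times = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if row["type"] == "bug" and row["closed_at"]:
                opened = datetime.fromisoformat(row["opened_at"])
                closed = datetime.fromisoformat(row["closed_at"])
                times.append((closed - opened).total_seconds() / 86400)
    return times

times = fix_times_in_days("issues.csv")  # placeholder path
print(f"bugs fixed: {len(times)}")
print(f"mean time to fix:   {mean(times):.1f} days")
print(f"median time to fix: {median(times):.1f} days")
```

Printing the median beside the mean is deliberate, by the way; one six-month horror story will drag a mean a long way, and the gap between the two numbers is itself worth knowing.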
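The same spirit applies to the unreviewed-pull-request question. Here's a sketch of the automatic nudge using GitHub's public REST API: flag any open pull request that has gone some number of days without a single review. The repository name and the three-day threshold are placeholders, and the print at the end stands in for wherever your team actually gets notified.

```python
# A sketch: flag open pull requests that have gone unreviewed past a
# threshold, via GitHub's REST API. REPO and THRESHOLD are placeholders.
from datetime import datetime, timedelta, timezone

import requests

API = "https://api.github.com"
REPO = "your-org/your-repo"    # placeholder
THRESHOLD = timedelta(days=3)  # placeholder; pick your own tolerance

def get_json(path: str, **params):
    resp = requests.get(f"{API}{path}", params=params, timeout=10)
    resp.raise_for_status()
    return resp.json()

now = datetime.now(timezone.utc)
for pr in get_json(f"/repos/{REPO}/pulls", state="open"):
    opened = datetime.fromisoformat(pr["created_at"].replace("Z", "+00:00"))
    if now - opened < THRESHOLD:
        continue
    # No reviews at all after the threshold? Somebody should hear about it.
    if not get_json(f"/repos/{REPO}/pulls/{pr['number']}/reviews"):
        # Wire this up to whatever channel your team actually reads.
        print(f"PR #{pr['number']} \"{pr['title']}\" has waited "
              f"{(now - opened).days} days without a review")
```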
I'd love to be able to riff on - rip off, if I'm honest - Spolsky's entire post ("Have you ever heard of DORA? It's a fairly esoteric system for measuring how good a devops team is. No, wait! Don't follow that link!") but this is where that all goes off the rails, because we're talking about management capabilities, not engineering practice.
I'd love to be able to say that "most software organizations are running with a score of 2 or 3, and they need serious help", but the cold, hard truth is that in most software shops, most of the answers to most of these questions aren't even yes or no.
They're "I don't know."
It doesn't need to be this way. That's the sales pitch.
And these are just questions about capability; more than a quarter of developers don't know whether their managers and leaders use metrics to understand their engineering work at all.
I should be specific: when I say "more than a quarter of developers" I mean "26%, and that number was not obtained easily". We surveyed more than twelve hundred developers and almost five hundred managers from every corner of this industry to bring you that number. This isn't "N = A Couple Of My Office Buddies But Also Some Dudes From That One Hacker News Thread" territory; that "26%" is very, very real.
Some other things we learned from those 1282 developers - who on average had 14.4 years of experience, so about eighteen thousand years of working experience between them - are that:
87% of them believe that software metrics would help them do better work as individuals and collaborate more effectively as teams, but
Only 27% of them believe their company is using the right metrics for their work, and
Again, 26% don't know if their organization is using metrics, or for what, at all.
Let me sweeten the pot for you: 88% of the 465 managers we surveyed believe that a critical part of their jobs is to make their developers' technical work and efforts visible to the company at large, but only 24% of developers believe their managers have that visibility at all.
These are, I say again, hard-won numbers built on thousands of years of combined developer experience, and they're saying that three quarters of developers do not believe their managers have the tools they need to do their jobs.
This is madness. This is absolute lunacy. You might as well be trying to chisel your roadmaps and milestones out of a rock with a slightly nicer rock while muttering about how velocitivity accelerifies deliveration [3].
It does not need to be this way. We don't need to live like this. That's the sales pitch.
We make a product called "Flow", a developer-analytics tool that does some interesting things that I think are important. Twelve of them are that you can answer all those questions with an evidence-backed "yes", but that's really the least of it.
Because Flow starts with empirically and morally defensible metrics - measuring what really matters - the most interesting thing about it isn't seeing the data day-to-day. It's the cultural change that becomes possible when you have real data day-to-day.
One of our partners has described our onboarding process - the "data walk" where we explore their development histories through Flow for the first time - like turning the lights on after a party. It can be a jarring, startling experience. "It was fun, we were having a good time… what the hell happened here?" At times it feels like an intervention, like therapy; we treat that process with as much care and kindness as we can muster, because often these companies and their leaders are seeing themselves as the data shows they really are for the first time, and most of the time that moment of clarity is a lot.
But once you can see how your company really works - aggregating information across teams and departments to catch the early signs of change and the larger trends over time - a lot changes. When individual devs have real visibility into how their team could work together better, they can make a real case for change. Managers and directors who can anchor their decisions in data can make informed investments in training, organizational change and process improvement, and show the organization how those investments played out or paid off. It's a way to not guess but know what your teams need to become greater than the sum of their parts.
That's what we're really selling, of course, and that's where Flow shines. Empowered developers, data-driven decision-making, the entire experience of effective, evidence-based engineering leadership.
You can have that. That's the sales pitch.
If that sounds good to you, you should get in touch with us. We're here to help.
[1] - It is not a word.
[2] - This isn't a word either.
[3] - No.