Have you ever heard of a software development team that didn't have to deal with technical debt?
Me neither. In fact:
Engineers spend ~33% of their time dealing with technical debt which crushes team morale and costs companies ~$85Bn/year.
Let that sink in. That's $85 billion.
A third of all engineering time is spent dealing with technical debt. It fills software engineers with dread. It truly is the worst part of their job—and to add insult to injury, the rest of the business often fails to empathise with them. It's no one's fault: technical debt is simply widely misunderstood. It slowly ossifies software companies until they can no longer grow without spending months or years rewriting their application... which is a lifetime in today's market.
Tech debt is just a thing that happens, right? We start writing code, tech debt accumulates, and bam, there's suddenly too much of it. Our codebase becomes mired in debt, and we reach technical bankruptcy. So we fight it. We embrace the grind and repay some of that debt. Maybe we'll make it, maybe we won't. That's just the nature of software businesses. It's how we roll.
But does it have to be like this? Are there some fundamental laws of software development that make it so? And if there are, can we bend these laws to our advantage? Technical debt is a fact of life, but technical bankruptcy doesn't have to be.
Let's look to a different science for some clues, because the laws of thermodynamics can give us key insights into why technical debt is inevitable.
The first law of thermodynamics, also known as the law of conservation of energy, states that:
The total energy of an isolated system remains constant; it is said to be conserved over time.
In other words, energy cannot be created or destroyed, though it can be changed from one form to another.
To keep things simple, let's think of our codebase as an isolated system. Ignore third-party dependencies and anything else that lives outside of our codebase. Under these conditions—a stable number of engineers, shipping features at a high, yet steady pace—the amount of entropy (a measure of the disorder and randomness in a system) in our codebase remains constant.
Our engineers can't become superhuman overnight, so we can't increase their output if they're already at maximum capacity. But we can hire more engineers, and this increases the energy in our system. It's why Brooks's law is a thing: 'Adding manpower to a late software project makes it later', because it increases the energy—or chaos—in a system where chaos is already high.
High-growth software companies constantly battle with this force. They raise a round of funding, double the size of their engineering team as quickly as the labour market will allow, and then have to deal with the massive increase of 'energy' in their codebase. It's often overwhelming and it can lead to a sharp increase in technical debt if they fail to implement countermeasures.
But wait. Why should it lead to an increase in technical debt?
The second law of thermodynamics holds the answer:

A closed system's disorder cannot be reduced; it can only remain unchanged or increase.
Essentially, isolated systems naturally degenerate into a more disordered state. This 'disorder'—which we called 'energy' or 'chaos' above—is called entropy. Ivar Jacobson and friends have researched the phenomenon of entropy in codebases and coined the term software entropy. As a codebase is modified, its entropy increases. This growing chaos is the cause of technical debt.
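To make 'disorder' a little more concrete, one crude way to quantify it is the Shannon entropy of where changes land in a codebase: if every commit touches a different corner of the system, disorder is high; if changes concentrate in a few well-contained modules, it's low. Here's a minimal sketch—the file names and change logs are invented purely for illustration, not taken from any real project:

```python
import math
from collections import Counter

def change_entropy(files_touched):
    """Shannon entropy (in bits) of how changes spread across files.

    Low when edits concentrate in a few files; high when every
    change lands somewhere different.
    """
    counts = Counter(files_touched)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Hypothetical change logs: one focused team, one scattered one.
focused = ["billing.py"] * 8 + ["auth.py"] * 2
scattered = ["billing.py", "auth.py", "ui.py", "db.py", "api.py",
             "jobs.py", "mail.py", "cache.py", "cli.py", "docs.py"]

print(change_entropy(focused))    # low: changes are concentrated
print(change_entropy(scattered))  # high: changes are spread everywhere
```

This is only a toy proxy, of course—software entropy in Jacobson's sense covers far more than change distribution—but it captures the intuition that a system whose modifications are scattered everywhere is harder to reason about than one whose changes stay contained.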
Back at our high-growth software company, their customer base keeps expanding, and the market is heating up. Their expanding engineering team ships day and night to keep up with the growth. If nothing is done, the codebase will become ever more complex. The pressure is on.
I look at technical debt as entropy in the codebase. I don't think it ever ends, it's a constant struggle.
Ron Paridges, VP of Engineering at Carta
Every sprint, the team faces a choice: do something to tackle the growing complexity, or just ship new features?
The third and final law of thermodynamics makes it clear why this trade-off is nowhere close to obvious.
The entropy of a system approaches a constant value as the temperature approaches absolute zero.
This complex-sounding statement is, in fact, quite simple in essence, and while the laws of thermodynamics have far-reaching—and sometimes mind-boggling—implications, their principles are easily grasped. For example, when water takes the form of vapour, its molecules are "free" to behave in a totally chaotic way. Entropy is all over the place. However, when water freezes, its molecules are "trapped" and stay in place (more or less). Entropy decreases and approaches a constant value.
So, what can we do to control software entropy?
Well, we could stop shipping code. That's why some teams like to have a 'code freeze' to properly QA their system before it's released to customers. If the codebase isn't being modified, its entropy stops increasing: no more chaos, no more unintended consequences, and no new tech debt.
It's nonsensical to try and instigate a permanent code freeze—we've got to ship. So our only other option is to refactor.
The process of code refactoring can result in stepwise reductions in software entropy.
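To see what a 'stepwise reduction' looks like in practice, here's a toy refactor in Python. The first version copy-pastes the same discount rule into every branch—a small pocket of disorder, because the next pricing change has to be made in three places. The refactored version extracts the rule and turns the tiers into data, so behaviour is identical but the duplicated knowledge is gone. The pricing rules here are invented for illustration:

```python
# Before: the bulk-discount rule is duplicated in every branch,
# so each new customer tier copies yet more logic.
def price_before(tier, amount):
    if tier == "basic":
        return amount - amount * 0.05 if amount > 100 else amount
    elif tier == "pro":
        return (amount - amount * 0.05 if amount > 100 else amount) * 0.9
    elif tier == "enterprise":
        return (amount - amount * 0.05 if amount > 100 else amount) * 0.8
    return amount

# After: the shared rule lives in one place and tiers become data.
BULK_THRESHOLD = 100
BULK_DISCOUNT = 0.05
TIER_MULTIPLIER = {"basic": 1.0, "pro": 0.9, "enterprise": 0.8}

def price_after(tier, amount):
    # Apply the bulk discount once, then the tier multiplier.
    if amount > BULK_THRESHOLD:
        amount -= amount * BULK_DISCOUNT
    return amount * TIER_MULTIPLIER.get(tier, 1.0)
```

Nothing about the program's behaviour changed—only its structure did. That's the essence of each refactoring step: the observable output stays constant while the disorder inside the system goes down a notch.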
So far so good. In fact, this all seems a bit like stating the obvious; anyone who knows anything about building software knows this stuff, at least intuitively.
So why are we still being caught off-guard by technical debt?
It's because, even though we know we should refactor to reduce complexity, a myriad of other pressures prevents us from allocating the time and resources to do it properly and often enough. This means that—in our codebase, and all other codebases out there—software entropy is constantly increasing.
We all know Moore's law, but consider Bill Gates's variant on Wirth's law:
The speed of software halves every 18 months.
I'm convinced that the growing entropy in our codebases is the main driver behind this law.
The growth rate of software entropy is directly correlated to the growth rates of things like technology, software markets and companies, and coding literacy—all of which are growing pretty damn fast.
Imagine the pressure this expectation puts on software product development teams. These are daunting global trends, and they can feel crushing to a relatively small group of people. It's almost like asking them to fight gravity and take flight by flapping their arms.
Next Wednesday, we'll look at the micro-trends constantly pushing us towards technical bankruptcy, and how high-growth software organisations can combat them. Stay tuned! 🙌🏻