I recently read a nice article about the economics of software correctness.
It was a nice reminder that we often forget about economics when dealing with the minutiae of our industry.
However, let me try to paint a somewhat broader picture, dealing not only with software correctness but with software quality in general.
It's no secret that basically all the code we have is a mess. And that's putting it mildly. A lot of people would choose more expressive language to describe the status quo.
So what's the reason for that?
As the article linked above argues, there's a cost to finding and fixing a bug, and that cost is often prohibitive. And why is it so expensive? Because code quality is poor. The code we deal with is a patchwork of buggy technologies hastily assembled with duct tape. So, one would argue, improving the overall quality of the code would lead to more maintainable systems and drive the cost of bug fixing down to sustainable levels.
And here's where it gets interesting.
Imagine you are an IT manager and you have a limited budget for doing engineering work. How would you spend it?
On one hand, you can invest in code quality. You ask the engineers to do code clean-ups, refactoring and so on. What you get is a more maintainable codebase, which means you can fire, say, 20% of the department. Great! You have just reduced the cost by 20%.
On the other hand, you can ask them to implement a new feature. The code quality stays poor, but the new feature attracts more users. Great! You have just increased the userbase, and thus the revenue, by 20%.
What you'll really do depends on the ratio of these two numbers. In reality they won't be equal, and they won't be 20% either.
And when you look at the software landscape today, the number of users scales exponentially with respect to the number of developers. We are in a phase of explosive growth. If you try to save costs, you can fire half of the IT department, but that's peanuts compared to what you can get by focusing on delivering features: you can grow the revenue by whole orders of magnitude.
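To make the trade-off concrete, here is a minimal back-of-the-envelope sketch in Python. All the figures (department cost, revenue, growth factors) are invented purely for illustration; only the structure of the comparison comes from the argument above.

    # Back-of-the-envelope comparison of the two options (numbers invented
    # for illustration; only the structure of the argument is from the post).
    engineering_cost = 1_000_000   # yearly cost of the IT department
    revenue = 2_000_000            # yearly revenue of the product

    # Option A: invest in quality, then run with 20% fewer engineers.
    quality_gain = 0.20 * engineering_cost

    # Option B: ship features instead. In a saturated market that might buy
    # ~20% more users; in a phase of explosive growth, orders of magnitude more.
    feature_gain_saturated = 0.20 * revenue
    feature_gain_explosive = 10 * revenue   # "whole orders of magnitude"

    print(f"quality:              +{quality_gain:,.0f}")
    print(f"features (saturated): +{feature_gain_saturated:,.0f}")
    print(f"features (explosive): +{feature_gain_explosive:,.0f}")

With these made-up numbers the feature option wins easily; the point is only that the decision hinges on the ratio of the two gains.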
In the end, there's little question about what to invest in.
The above is bad news for software quality, but turning a blind eye to the problem won't solve it. This way we can at least speak about it and reason about it.
For example: Do we have good models for what's happening? Well, yes. It turns out that ecologists have studied these kinds of dynamics for decades.
To draw a parallel to software development, there are opportunistic species that take advantage of new niches (say, a new clearing in the forest) and try to fill them as fast as possible, even at the cost of being fragile and prone to physical damage, disease and so on. If the niche is stable, the opportunistic species are eventually overtaken by so-called climax species, which invest heavily in "quality". They may not grow as fast, but they are going to survive the next storm or the next extra-chilly winter.
In central Europe you can compare the tree of heaven (Ailanthus altissima), an invasive species that grows at astounding speed, with the common oak (Quercus robur), which dominates old forests. The difference in "quality" is quite large. While the wood of Ailanthus breaks easily, sawing through an oak plank requires a non-trivial amount of physical strength. Unsurprisingly, Ailanthus' lifespan is approximately 60-80 years, while an oak can survive for 500 or even 800 years.
So, can we expect the software niches to be filled eventually and opportunistic software to give way to quality climax software?
I don't see it happening any time soon. There's still too much empty space to fill. It's still the Wild West out here, with unlimited horizons.
However, we shouldn't despair. Software isn't going to grow forever. Consider steam engines. In their heyday one would have said there was no end to their expansion. Yet today steam engines are a pretty small and stable niche seeing little growth. When we eventually get there, we'll finally have time to write some beautiful, well-designed and bug-free code.
Martin Sústrik, October 14th, 2015
My analysis is different. Markets for simple CRUD apps are already nearly full, but the management of most companies is not interested in non-trivial software. Management doesn't want to fill the markets but to hire as many low-quality people as possible. Fortunately, low-quality developers can't quickly add features to non-trivial programs like AI software. Thus, uncultured middle managers and executives will likely avoid AI software that requires a few high-quality researchers and/or developers. Google is a good example in the AI space.
You can add features quickly by hiring 5-15 high-quality devs instead of 50-200 low-quality developers. So why does management want to hire as many people as possible, when it could hire a few high-quality developers at a lower cost to the business? Middle managers and executives prefer to have more people under them because it's hard to prove you managed a few high-quality developers but easy to prove you managed 50-200 people. It's difficult to get good titles by managing a few people, but it's relatively straightforward to get a shiny title and promotions by managing hundreds of people. With a better title, you can negotiate better positions at other companies. With more people, it's easier to decouple yourself from responsibility for work outcomes. Plus, if you hire high-quality developers, management would look incompetent by comparison and wouldn't be able to negotiate better titles due to the perception of low performance relative to the developers.
We can start replacing low-quality applications with high-quality ones by making middle management actually work.
One possible solution is to create an organization that pools resources, funds companies, and supervises them. michaelochurch.wordpress.com/2015/10/01/silicon-valley-can-be-beaten suggests a new form of such an organization, called a "category". It's a long article. The category is a new concept, so it may not work in practice.
If quality is sufficiently bad, you will not be able to add features at a pace that works for the market. Improving quality to satisfy engineering sensibilities, especially if there is no external impact, is often a waste of time. You need "good enough", and you should invest where it matters. What I found works well is to improve the code in the area you need to work in anyway, say, for a feature. You have already paid for the learning curve. Adding tests pays off right away, and then you have lower risk for refactoring. It's depressing to have a never-ending backlog of defects, and while I am more on the "better" than "worse" side because of that, I implemented aging of the backlog at one point. It was truly astonishing how many defects no one really cared about.
Not sure how to tell whether a particular market segment is full. Price of the end product? Cost of the talent? Number of competing products to choose from?
I think it's way too simplistic to categorize engineers into low and high quality. Engineers operate in an environment that drives behavior (rewards, schedule, resources, etc.). If all your stakeholders care about is features, engineers understand that and optimize accordingly. There was an article floating around a while ago about how the JPL team writes software for the shuttle. They optimize for quality and have very low defect rates as a result.
I too have seen the phenomenon you describe: as a manager and executive, when interviewing you are evaluated on the size of your budget (cost) instead of the result (revenue) or cost effectiveness. It's really weird to me. Some of it is perceived risk mitigation. Not sure if there is really anything you can do on the applicant side of things, but it sounds like it would be loads of fun competing with companies like that.
/Allan
We don't yet know the right balance between research and development, and the management of the vast majority of companies doesn't allow research at all. If management doesn't want to allow research, how would it measure the external impact of research? So it would definitely be beneficial to allow more research in companies and measure the impact at this point.
Technology is too important for us to get it wrong. If we get it wrong in this century, we might head toward permanent dystopia. In my opinion, mainstream business culture is bad for profit, products, and the potential for economic growth. Mainstream business culture has had the worst kind of success because talented and intelligent people haven't learned to organize and pool enough resources from outside companies to control management. And people haven't yet seen better alternatives to mainstream business culture, so they tend to think it is necessary or even beneficial.
If organizations like the "category" protect "open allocation" from the outside across hundreds or thousands of companies by pooling a lot of resources, funding companies, and supervising them, then we'll probably see more economic growth and better products.
If you are curious about the concept of a category, refer to michaelochurch.wordpress.com/2015/10/01/silicon-valley-can-be-beaten
Also relevant here: Bad code isn’t Technical Debt, it’s an Unhedged Call Option
I dunno.
If investing a bit in quality lets you fire 20% of your developers, it also must let you add features 20% faster with the same developers.
If you have a mature product, that means it'll be able to grow faster (or at all). If you have an immature product, it means you're trading an earlier doubling now for a shorter doubling time in future iterations.
Both should make sense in lots and lots of situations. Yet, we don't see companies thinking like that in practice.
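A rough way to see the doubling-time point, as a hedged sketch: assume, purely for illustration, that shipping features 20% faster translates into a 20% higher monthly growth rate. Under compounding growth the doubling time is ln(2)/ln(1+r), so the boost shortens every future doubling.

    import math

    def doubling_time(r):
        """Periods needed to double under compound growth at rate r per period."""
        return math.log(2) / math.log(1 + r)

    r_baseline = 0.10           # assumed 10% user growth per month
    r_boosted = 0.10 * 1.20     # assume 20% faster features -> 20% higher growth rate

    print(f"baseline doubling time: {doubling_time(r_baseline):.1f} months")  # ~7.3
    print(f"boosted doubling time:  {doubling_time(r_boosted):.1f} months")   # ~6.1

The numbers and the mapping from feature velocity to growth rate are assumptions; the sketch only illustrates the shape of the trade-off described in the comment above.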