Since the early days of the Internet, the role of software has become increasingly important, to the point where our world now heavily relies on it. Software is integral to operations of every scale, from launching satellites into orbit to regulating vast supply chain networks. We depend on it in our daily activities, whether driving cars, flying in airplanes, or paying for groceries.
As the number of software solutions continues to grow, it’s reasonable to anticipate a corresponding rise in incidents caused by software bugs. And indeed, if you work at a tech company or depend on software for your daily job, you probably notice performance and usability issues on a regular basis.
Why does this happen, though? The complexity of modern software plays a significant role. Specialized roles such as user experience experts and quality assurance engineers were created to minimize the number of bugs in software, yet the industry still confronts significant challenges. This article aims to explore the underlying reasons for this ongoing quality crisis, shedding light on why, despite advancements, software quality remains a critical issue.
If you wish to skip reading and watch a video instead, here’s my YouTube video, which goes through the points I make in this article.
What do I mean by quality?
There are many definitions of quality when it comes to software. Personally, I would define it along the following dimensions: functionality, reliability, usability, and security.
In terms of functionality, does the software provide all the features users need to achieve their goals, or do they have to rely on other pieces of software as well? A really bad example of failing the functional aspect in consumer software is Facebook: you need one app to use the news feed and another to use the chat function, even though the web app offers both at the same URL. A different example, still in consumer software, is the video game industry. Back in my college years, when I was playing video games, I never heard of a game requiring 20 patches to be playable. Now, game companies launch their product and it still feels like a beta version; they usually release two patches in the first two days after launch just to make it playable. The industry seems completely broken, and prices are on the rise as well.
For the reliability aspect, the software must be dependable, with minimal downtime or errors. This includes the ability to recover from failures and to maintain data integrity. There are many reports showing that cloud services and other critical software are not reliable enough. For example, in 2023, Google support forums were flooded with reports from users complaining that their data had disappeared from Google Drive. I personally considered storing data in the cloud safer than keeping it on a portable drive. It is definitely not!
From the usability perspective, the software must be user-friendly, not overly complicated, and performant. I live in a place where the financial sector dominates a large portion of the economy. Banks and investment companies have tremendous power and close to unlimited resources here. With all that, their mobile banking platforms are a joke: laggy, complex, unintuitive UIs that frustrate me every time I have to use them. Even their feedback form doesn’t work, throwing a 500 whenever I try to submit it.
Lastly, one of the most important aspects: security. Software nowadays has many dependencies, and we all know how hard it can be to manage them and keep them up to date. Even with automated tools like Renovate or Dependabot, library maintainers often break backwards compatibility in new major versions. This makes developers’ jobs much harder, and they are often forced to handle these dependency upgrades manually. But before they manage to do so, attackers may already be exploiting the vulnerabilities in the old dependencies to steal user data. Such breaches are not just inconveniences for companies that may face legal action from angry users; they are a danger for end users as well. Thousands of gigabytes of user data, such as emails, passwords, and even phone numbers, are dumped every day on dark web forums. This is the main reason we notice a spike in spam calls all over the world.
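To be clear, wiring up the automation is the easy part. Here’s a minimal sketch of what a Dependabot configuration might look like, assuming an npm project (the ecosystem, directory, and schedule are placeholders you would adapt to your own repository); the hard part is dealing with the breaking major versions the bot surfaces.

```yaml
# .github/dependabot.yml — a minimal sketch, assuming an npm project
version: 2
updates:
  - package-ecosystem: "npm"   # which package manager to monitor
    directory: "/"             # where the manifest (package.json) lives
    schedule:
      interval: "weekly"       # how often to open update pull requests
```

A config like this will happily open pull requests for new major versions too, and those are exactly the ones that tend to sit unmerged because they require manual code changes.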
What’s broken?
Now that I have defined what quality is, I am going to give my personal opinion on what’s broken with software nowadays. I must warn you that you will see many direct references to people or groups of people in this section. Needless to say, I do not intend to blame any person, group, or profession for the mess that we’re in. I just want to provide my view, shaped by more than 10 years of experience working in this industry. It’s a personal opinion based on anecdotes, so don’t get defensive or take it personally in any way.
There’s a competency crisis affecting the world right now. People are just too distracted by the immense volume of entertainment we are subjected to every day. Attention spans are shrinking to the point where we cannot stay concentrated for more than a few minutes.
In my experience, this problem is especially prevalent among the youngest people. I have had the pleasure of working with brilliant people from the Gen Z bracket, but most of the ones I worked with can’t get anything done. It’s not a lack of experience; it’s that they constantly default to their phones, watching endless videos or even playing games when they are supposed to be delivering. It sometimes feels like you’re working with an addict. And it’s not just me: there are scientific studies showing the level of social media addiction in this age group.
People cannot focus hard and long enough to handle complex tasks properly anymore, and this translates to more bugs, more usability problems and less dedication to their jobs and their mission as software developers.
One could counter this point of view by saying that effective software teams nowadays need to follow the latest best practices to maintain quality. TDD, event-driven architecture, hermetic testing, pushing straight to master, and other esoteric concepts will help your team deliver continuously and keep quality at its peak. Let’s take a step back and ask where these suggestions are coming from.
One of the thought leaders of software engineering is Martin Fowler. Many engineers blindly follow everything he proposes. I’ve seen this trend in many companies, especially in cult-like organizations that brag about their unmatched engineering culture. Many such organizations will build microservices even when their domain complexity is low and the software is not projected to scale much. They do it because their God told them to. But how did he, and other such leaders, become so influential? They must have built world-class software themselves, right?
I searched everywhere for what Martin Fowler has built. There is no open-source project, nor any published closed-source software, marketed as being built by him. So I ask again: how did he become so famous? Simply by feeding mediocre engineers what they crave: controversy.
Building software nowadays is not about engineering anymore; it’s about choosing the right design patterns for the job. This is a result of the prevalence of high-level programming languages, where thinking about complex things like memory allocation is no longer necessary. It democratizes access to a software engineering career for almost anyone; it’s no longer the domain of the truly passionate. The main effect is that the bar drops so far that you end up with people on your team who don’t even write code every day, yet are highly appreciated by non-technical managers for their involvement in endless debates about whether or not to use TDD. The value these people bring is close to zero, and it shows in the software they deliver.
With the emergence of these languages, and the abundance of frameworks that companies release, the level of abstraction keeps rising. This gradually removes the need for engineers to develop a deep understanding of the technologies they’re using; they simply don’t need it anymore. Don’t get me wrong, this has its advantages: being more productive and facing a gentler learning curve are great achievements. However, growing dependence on these tools carries the risk that engineers never develop a deep understanding of the underlying principles, which is crucial for debugging, optimization, and understanding the limitations of the abstractions.
All in all, I believe software is broken for three main reasons: lack of focus, fake engineers, and a lack of deep understanding of what lies beneath the abstractions. I am really curious how AI is going to exacerbate the quality crisis we see in software today. My bet is that in a few years the situation will get worse, and good engineers will be forced out of the industry by their mediocre counterparts, who may be preferred for being more vocal and for preaching whatever the industry’s fake gurus come up with.