Software is Virtual; the Virtual is Disruptive; Software Disrupts the Development of Software
That capitalism tends towards the virtual, and that the virtual is disruptive, not least to capitalism itself, is not a new idea. It’s already there in the work that, ironically, most defines capitalism: Marx’s Kapital. If that seems an odd statement, consider that Marx himself disowned what had already come to be called “Marxism”. What we call “Capitalism”, although the word had been around, didn’t become a determinate term until the publication and dissemination of Kapital. Prior to that, the more common terms were ‘free market economics’, ‘bourgeois economics’, and so on. Since it can’t be considered a “Marxist” work, what is Kapital other than a very thorough, and very prescient, analysis of capitalism?
In any event, software is the most purely virtual of products. While it can never be said to be completely actual, at the same time it’s definitively real. In a sense, software is the closest thing to magic: you write text and it has real effects. Writing has always had effects, but those effects have always come about in the way the effect of Kapital came about: via publication, understanding, and dissemination (and understanding always includes misunderstandings). The text in software, though, is interpreted or compiled by other software; no human being needs to interpret and understand it for it to have real effects. We tend to have a Disney-style understanding of magic these days, but magic was never necessarily considered good. It was feared as much as respected, and for good reason: it was disruptive.
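To make that point concrete, here is a minimal sketch in Python, chosen here purely as an example language; nothing in the rest of the argument depends on it. The string is inert text until another piece of software, the interpreter, compiles and executes it, with no human reading required.

```python
# A minimal sketch of "text with real effects", assuming Python as the
# example language. The string is just text; the interpreter, another piece
# of software, turns it into something that runs.
source = "print(2 + 2)"                       # plain text, written by a person
bytecode = compile(source, "<text>", "exec")  # other software translates it
exec(bytecode)                                # and executes it: prints 4
```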
That software has disrupted capitalism is not really in doubt for anyone with their eyes and ears open. Most new ‘disruptive’ companies are able to be so via software. Being disruptive has itself disrupted even the base notion of corporate capitalism, to the point where new companies are now given awards for being the ‘most disruptive’. That this doesn’t seem odd, when being disruptive in the heyday of corporate capitalism, the 1950s and early 1960s, would have been considered close to treason, is both an effect of the continually increasing disruptiveness of software as virtual, and a clear indication of how far that disruptiveness has already progressed.
The two key disruptions software has been in the process of effecting are the decapitalization of industry and the desubstantialization of currency. Those aren’t my focus in this article, partly because I’ve written about both elsewhere, but more cogently because the disruptive nature of software doesn’t only affect other industries; it has one of its most powerful effects on the software industry itself. This shouldn’t be a surprise: like currency, and as something virtual, software is one of the easiest things to disrupt, particularly by other software. The newness of the industry, combined with its focus on newness, leaves this disruptive tendency nearly unrestricted. A short look at the history of software development and the companies that have cropped up, become huge, and disappeared, never mind their products, should be sufficient evidence that the software industry experiences as much disruption as it causes in other industries, if not far more.
There are exceptions. Those are primarily areas where things must work, and disrupting that is not an option. Even there the effects of disruption can be seen, but they must be minimized as far as possible. They are niche areas, though: medical equipment, military equipment, aviation, and so on. The mainstream of software development doesn’t deal with life-or-death situations, other than the life and death of companies and, occasionally, ideas.
One oddity about being ‘someone who designs and writes software’ is that none of the names for that role is fully accepted. There are developers, coders, software engineers, and so on, which in a general way all mean the same thing, though they carry significantly different implications. In a common phrase, coders ‘hack on’ problems. While the notion of an aviation engineer ‘hacking on’ an engine design ought to make anyone who needs to take a flight nervous, in the software industry an “engineer”, if differentiated at all from a “coder”, is often seen as the lesser of the two: less creative, less inventive, less dedicated, and so on. Although the specific sense depends on perspective, engineers are seen, in one sense or another, as lesser.
This oddity, like many others, is part of the disruptive nature of software reflexively disrupting its own production and development.
The industry disrupts itself in a myriad of ways and justifies it in a myriad of ways. The most common justifications, though, circle around the notion of popularity. This is problematic because it applies judgments appropriate to end products to core technologies instead, when in every other industry it’s precisely the use of stable, reliable, specific, and therefore not especially popular core technologies that makes it possible to produce new and potentially popular products, flexibly and with reasonable efficiency. By “circle”, I mean specifically that while various “reasons” are offered for why different things become or don’t become popular, remain or don’t remain popular, and so on, those reasons are all themselves determined by popularity itself. But the notion of something becoming popular due to popularity is absurd, just as is the notion of something losing popularity due to popularity.
The former is generally ascribed to luck or to the emergence of a ‘killer’ feature, which amount to the same thing, since ‘killer’ features appear in terrible technologies as often as they do in good ones. The latter, though, has a more determinate initial reason: things lose popularity because they’re no longer ‘new’. By not being ‘new’, or ‘fresh’, they immediately become ‘outdated’, ‘old-fashioned’, and so on. When this means of judging technologies is applied to core technologies rather than user products, the constant disruption results in a situation where stable, gradually improving core technologies, the kind that can be built on so that the user product can be quickly changed and adjusted to consumer whims without overly significant additional cost, essentially never emerge.
The effect, aside from the tendency of all but the biggest software vendors to come and go, is that software development remains the only industry where labour-intensive methodologies haven’t become less so over time. In other industries, the “bottom line” dictates that they must, since if they don’t become less labour-intensive at your company, they will at a competitor, and that competitor will put you out of business. In software, the companies that “go” rarely do so because they have the worst methods, or even the worst products. Even in non-commercial or open source software, what “goes”, in the sense of no longer attracting further development, without which it becomes unusable due to changes in the underlying technologies, also has little or nothing to do with quality. In some cases, the squeaky wheel effect even favors worse technologies. The main reason a given technology, assuming it had been sufficiently used, fails to attract further development or even maintenance is simply that it’s no longer ‘new’. This has the rather perverse effect that by the time software is mature enough to become a stable base, the industry has lost interest and consigned it to the dustbin of proprietary abandonware or unvisited open source repositories.
If ‘newer’ technologies were in fact better, this wouldn’t be as much of a problem. If they were merely as good, it would be frustrating, but still not the problem it in fact is. It is as much of a problem as it is because, in general, rather than being better, the newer technologies are more often worse.
At that statement, probably over half of the people who had read this far stopped, feeling it to be ludicrous. After all, even Windows is better than it was 20, or even 10, years ago. I can agree that it is, however, and still maintain that newer core technologies are, overall, worse.
Firstly, Windows isn’t a core development technology, though it does contain core development technologies. Rather, it’s a user technology that contains development technologies simply because they must be there at deployment time. The confusion stems from the fact that software can only be developed using software, and thus developers are seen as users by the developers who write developer tools. Core technologies, which include some of the things in Windows and other operating environments but aren’t exhausted by them, generally must be hosted on an existing software environment (and hopefully developers at least test on the target deployment environment). This confusion isn’t limited to operating environments, with the result, as noted above, that developer tools and core technologies are judged by inappropriate criteria and, correlatively and perhaps more detrimentally, are designed using user-technology criteria, since they are going to be judged by user-technology criteria. The confusion doesn’t only affect things that are used on both sides of the (somewhat porous) line, but extends to things that are definitively never going to be of interest to an end user.
Secondly, is Windows better due to better tools and methodologies? Not to any great degree: the tools and methodologies used to build Windows 10 are not significantly different from those used to build Windows 1. Windows 10 is better, in large part, because it can be, even if built using essentially the same methods and tools, and it can be because of the radical increase in hardware capability over the period between the two. This increase is the most extreme example of the effects of using a stable, gradually improving core set of technologies, which, due to that gradual, non-disruptive improvement, provide the capability for rapid and in some cases exponential improvement in the final product. Beyond that, radical, breaking changes in the product are prevented by the need for backwards compatibility, which keeps that rapid increase from triggering a wholesale change in operating environments. Current Intel CPUs are largely compatible (not completely, but sufficiently from iteration to iteration that you need to go back a fair number of iterations before you find breaking changes) with the instruction set of the 8086 CPU introduced in the late 1970s. On the slightly higher-end systems, which have radically lower sales in any case, Sun/Oracle’s SPARC/UltraSPARC architecture is binary compatible back to 1992, while IBM’s POWER architecture is similarly backwards compatible.
That the pace of hardware development is the main reason the software industry can maintain any pretense of improvement is demonstrated by comparing Office 2000, on the then-current Windows and an average machine of its time, with Office 2007, on the then-current Windows and an average machine of its time. Despite having virtually no new features, Office 2007 on 2007 hardware (which was orders of magnitude faster than 2000-era hardware) was slower than Office 2000 on the hardware of its time. If that were an isolated situation, it would be one thing, but it’s far from unusual.
Even the modest improvements in the tools used to build Windows have been made possible largely by faster hardware, while the increase in resource usage by those tools has vastly outpaced hardware development.
Another result of the rapid development of hardware, and of the resulting improvement, at least in some cases, of the base operating environment, is that the two come to be viewed as the base of core technologies, rather than as the prerequisite host environment for both developer and end-user tooling. Everything else, development tools included, is viewed as a user product.
This has several effects on the nature of the tools themselves, something I’ll go into in more detail in a follow-on article. In this article I’m going to look only at the key disruptive effect on developers, and therefore on the industry.
Developers, as much as or more than end users, like new things (most of us got into the field because we like shiny new toys, after all) and define themselves by that liking. Engineers in other fields like new things too, but since an engineer is, in general, defined by the ability to make things work, if new tools don’t work better, no matter how shiny they may be, they’re played with a bit and then put on the shelf to gather dust.
That implies, though, that the current tools work well, that a stable base of tools was ever achieved. Since that’s not really the case, at least in the mainstream of software development, ‘new’ tools, even if worse, are viewed as at least potentially better, since the old ones never became as good as they were expected to be. And since we’re also not known for exceptional attention spans, other than in spurts when necessary, by the time it becomes obvious that the ‘new’ tools have also failed to live up to whatever promise was seen in them, whether it was there or not, most have forgotten that the previous set had realistically gotten further before it was ditched for this one.
One useful technology, though hardly as new as its purveyors like to portray, is the virtual environment host. Many of them can quite readily run older environments, which can in turn host those older tools, and it’s instructive to occasionally run them up (most are available on various abandonware sites, so it costs little if anything beyond a few minutes of time) and compare them to what’s currently in use. If more people did that, the pretense that tools and methodologies have gotten better would be far less easy to maintain.
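As one hedged illustration of what ‘running them up’ can look like, here is a minimal sketch assuming QEMU as the virtual environment host and an already-obtained disk image of an older development environment. The image filename and the settings are hypothetical placeholders, not a recommendation of any particular setup.

```python
# A minimal sketch, assuming QEMU is installed and that you already have a
# legally obtained disk image of an older environment. The filename and the
# memory size below are hypothetical placeholders.
import subprocess

legacy_image = "legacy-dev-environment.img"  # hypothetical image holding era-appropriate tools

subprocess.run([
    "qemu-system-i386",    # 32-bit PC emulation, enough for 1990s/2000s tooling
    "-m", "256",           # a modest amount of RAM, typical of machines of that era
    "-hda", legacy_image,  # attach the preserved environment as the first hard disk
    "-boot", "c",          # boot from that hard disk
], check=True)
```

A few minutes spent in the resulting environment, building and running something with the tools it hosts, is usually enough to make the comparison with current tooling concrete.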
Were it just a matter of the effect on developers themselves, it would already be a significantly negative thing, because there is a scarcity of good developers, largely caused by frustration and the resulting short average career span. Most get out of development after gaining about the same experience it takes to first become a qualified electrician, never mind an experienced electrical engineer. Even for those of us who have stayed in development because we get a kick out of making things that work, a kick nothing else would really satisfy, the minute-to-minute annoyances caused by tools that either don’t work reliably, or work so slowly that you can go for a long lunch when you must restart them and still get back before they’re ready for use, eventually combine to overcome the desire to make things work. And since developers largely don’t make the decisions about the tools they have to use, most can’t simply drop things that don’t work and use something else, even if something else exists that would work in the overall environment.
Unfortunately, it’s worse than just being an annoyance for developers, because many of the systems we depend on were written with, and can only be built with, tools that are no longer available on current environments, which means they not only can’t easily be improved, but can’t even be maintained sufficiently to continue to run on current machines and environments. One system I know of in that situation happens to be crucial to aviation companies and to the companies that produce parts for them. Knowing the problem, and that no solution to it currently exists, my lack of desire to fly (originally because airports are the most boring places in the world) has gone from passive distaste to active avoidance. That similar situations exist in plenty of other industries is a serious problem.
There are alternatives, but the entrenchment of attitudes that actively work against improvement means that they are, and will remain, niche, even if only because they are niche, at least in North America. In other places that can’t afford the same inefficiencies, those alternatives are already vastly more popular than they are here. Switching to them here would be a massive disruption, though, not due to the need to learn new tools and methodologies, since doing that all the time is part of the problem, but because, to be better, they require a different mindset than most developers have learned.
More fundamentally, they require that one be an engineer, and the leveling effects of poor development tools have resulted in a situation where only a minority of current developers have the capability of learning them.