A counterweight to the traditional friction of deriving commercial success and revenue from open source software is the growing prominence and influence of large end users. Massive webscale players, large financial institutions, healthcare companies and many other verticals now participate and are represented in modern open source software projects and communities. While these organizations have everything to gain from software innovation and from driving the capabilities and features they need, they are also interested in keeping the code neutral and open, without the implications or entanglements of traditional, commercial software (lock-in, lack of influence on the roadmap, audits).
The 451 Take
A network effect occurs when the value an individual derives from a product or service depends on how many other people use it. A positive effect is one where the individual derives more value as more people use it; a negative effect is one where the individual derives less. Most open source software projects exhibit both types of network effect, and these effects provide the entire basis for their success.
First, there is the network effect of interoperability. As more people use open source software, there is a greater range of providers to work with and applications to interoperate with. This increase in choice makes open source more attractive. A classic example of this effect is the invention of the telephone. When there were only two phones in the world, buying a phone had limited appeal because an individual had only one choice of whom to call. As more people owned telephones, however, the range of people to call grew, and the phone became more appealing to each individual. Network effects should be key to any provider's decision to interoperate.

Second, there is the network effect of product development. As more features are added to open source software and it becomes more robust and enterprise-ready, it becomes more attractive, and a greater pool of people want to contribute to the project and improve it even further. More contributors, in turn, increase the value of the project to an individual user.
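The telephone example can be made concrete with simple arithmetic. A minimal sketch (our illustration, not from the report), assuming the value of the network scales with the number of possible one-to-one connections among its participants:

```python
# Illustrative sketch of the positive network effect in the telephone example.
# With n participants, the number of distinct one-to-one connections is
# n * (n - 1) / 2, so each new participant adds more potential connections
# than the last -- the value of joining grows faster than the network itself.

def possible_connections(n: int) -> int:
    """Number of distinct pairs among n participants."""
    return n * (n - 1) // 2

if __name__ == "__main__":
    for n in (2, 10, 100, 1000):
        print(f"{n:>5} participants -> {possible_connections(n):>7} possible connections")
```

With 2 phones there is 1 possible call; with 100 phones there are 4,950. The same super-linear growth applies to interoperability: each new adopter of an open source platform increases the pool of providers and applications every other adopter can work with.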
These are positive network effects and virtuous circles. The potential to interoperate with more people and develop a better platform could theoretically go on forever. The aim of the process is common ownership of a project leading to a good outcome for everybody – a state called 'the cornucopia of the commons.' Large open source software projects and communities such as Linux, Cloud Foundry, OpenStack and Kubernetes were developed with the intention of creating a 'cornucopia of the commons' scenario, where common ownership leads to mutual benefits.
Unfortunately, the temptation for individuals to exploit the mutual resource for their own (and rational) benefit is a very real threat to the success of these and other open source projects. This temptation can lead to the economic scenarios known as the 'tragedies of the commons and anti-commons,' and their effects have been seen in areas as diverse as fishing, vandalism, transport and DVD releases.
The key to preventing these tragedies is mutual coercion, mutually agreed upon. The time to debate, clarify and define the project is before contributors are overcome by the desire to profit. Open source software projects and foundations must identify core capabilities and define what qualifies the software for certification. This debate may take some time, and not all parties will be completely satisfied. But defining the state of play before this temptation takes control is the best way of avoiding economic tragedy.
The cornucopia is in direct contrast to the better-known 'tragedy of the commons,' known to economists since at least the 1880s. In British villages, commons are areas of open space belonging to everybody. The tragedy arises when everyone takes more than their share of the resource. As soon as one person starts using the common, everyone else wants his or her fair share. Everyone seeks to use as much of the resource as possible – after all, if one family has a bigger picnic blanket, why shouldn't someone else? The result is that the common's resources are overused and rapidly depleted because everyone seeks to maximize their own share.
Profiting from open source requires balance
Any profit-making organization, including the majority of those involved in open source projects, is self-interested and will rationally seek to maximize its profit. This isn't a bad thing – companies exist to make money, and this characteristic is absolutely critical to world economies and (ultimately) our way of life.
How does a company make a profit using open source software? It typically will use the open source software as a core product, and then will bundle other premium services on top of it. Consumers like the open source core because it enables them to interoperate with other companies using the same core product; they like the ability to use premium services because they enable easier implementation or management of the open source platform through additional tools or value-added services. Providers like this approach because it gives the impression of openness and interoperability desired by their target market, but they can still profit through upselling.
In a 'cornucopia of the commons' scenario, everyone owns the resource and everyone benefits. This is at odds with profit-making companies' raison d'être. In open source projects, the common resources are the pool of skilled developers and the software itself. Everybody who uses this common resource wants a bit of the action, and this is where the risk of the tragedy of the commons presents itself.
Nevertheless, the success of open source projects highlights that open source software can evolve and thrive through collaboration even as backing vendors generate billions of dollars in revenue.
There is still the challenge that each contributor to the project naturally wants to fulfill its self-interest and create opportunities for profit. As such, contributors are likely to try to angle the project to their own requirements. As soon as one contributor does this, the others feel they are disadvantaged and are not gaining as much value from the common resource. As a result, other contributors want to prioritize their requirements. The result is that the commons is depleted; development is directed toward each individual rather than the common good, and the project barely moves forward because everyone disagrees. The resource is being overused.
Strong management can stop this overuse. But because contributors have not been able to derive value from a platform built just for them, they must look for other ways to gain value, perhaps through the addition of intellectual property. And this leads us into the tragedy of the anti-commons. We have seen cases where a fork of an open source software project, or even just the threat of a fork, can act as a disincentive to steering or influencing the project for a particular group's or provider's benefit, but this presents other challenges to moving the code forward.
Threat of underusing the resource through the tragedy of the anti-commons
To add some differentiation to core open source projects, companies create add-ons, extensions and even distributions. These contain the core code plus several integration tools, installers, enterprise extensions and all the software components needed to make it easier to consume. There may also be levels of additional security, scalability or performance, and they may be built for different hardware and enterprise applications. Each extension and distribution is essentially intellectual property. The companies take open source software and then make it profitable by adding provider-specific features and assurances.
And this is how the tragedy of the anti-commons arises. The good outcome of the project was a common resource, owned by everyone, with which all could interoperate. Now some elements of the resource are common, but not all of them – some elements are proprietary. This makes it harder for software components and distributions to work together. And without this interoperability, the common resource is less useful for everyone. Furthermore, it makes development more difficult because proprietary technology cannot be used as the basis for building further open source capability – a so-called 'patent thicket.' Bundling becomes difficult – some tools are seen as differentiators, not for general release, and are consequently not allowed in other companies' distributions.
Each provider is thinking of its self-interest rather than the good of the project and will seek to develop its product to be differentiated. But as this differentiation takes place, the good outcome everyone hoped for begins to collapse. This is the tragedy of the anti-commons. Providers would rather invest time in their differentiated offerings than in the common good.
Once one extension or distribution has been developed, it is almost impossible for competitors not to do the same. Competitors then experience a network effect, too. As more of them build their own extensions and distributions, the remainder increasingly feel they are missing an opportunity and losing market share. But conversely, as each breaks off to make its own extensions and distributions, the main benefit of openness and interoperability is increasingly lost. And it becomes harder to build better software, since components required for core functionality may not be open source, so incremental improvements to the project may be blocked.
Open source projects and foundations set up systems to make roadmaps and projects more democratic. Leadership is structured so that if a particular person or company tries to hijack the agenda, they will not be in that position for long. Boards and foundations typically follow the 'rising tide lifts all boats' theory. Granted, there are examples of companies that add some 'secret sauce' to open source projects for competitive advantage, but they do so at the risk of not being interoperable with others in the ecosystem and of facing the wrath of the community and social media. With sometimes dozens of vendors and hundreds or thousands of developers, keeping all participants on the straight and narrow while keeping them satisfied is a challenge indeed. Incentivizing collaboration is the key, but participants also need opportunities to differentiate and profit for the project to be successful. If the project is too rigid, participants will drift and make their own decisions.
To prevent both the tragedy of the commons and the tragedy of the anti-commons, the concept of mutual coercion, mutually agreed upon, is vital. The right time to debate, and mutually agree upon, the scope of the project is early on. The framework for participants to use, integrate and build upon the open source code must be defined before individual participants are pulled away from the common good by the desire to profit. Once participants are too far down the road of diverting resources to their own needs, or have started protecting their contributions, it is very hard to go back. The debate continues to rage, and that is not a bad thing as long as progress is made.
All open source projects are at risk of tragedy. The fact that there are so many open source successes, such as Linux, Cloud Foundry and OpenStack, shows that tragedies can be avoided and a cornucopia can be realized. Most open source projects have support, provider and enterprise attention. As open source software's popularity and capability increase, will contributors be tempted to pull away and differentiate themselves to the detriment of the project? The growing prominence and influence of large end users in open source software communities is helping to answer this question with a 'no.'
As Research Director, Owen Rogers leads the firm's Digital Economics Unit, which serves to help customers understand the economics behind digital and cloud technologies so they can make informed choices when costing and pricing their own products and services, as well as those from their vendors, suppliers
Jay Lyman is a Principal Analyst in the Development, DevOps & IT Ops Channel. He covers infrastructure software, primarily private cloud platforms, cloud management and enterprise use cases that center on orchestration, the confluence of software development and IT operations known as DevOps, Docker