Today, the Linux operating system turns 25 – in the United States, that means it’s old enough to vote, drink and rent a hotel room in a beach town. The Linux kernel itself is quite different from what it was 25 years ago – it has had to scale to encompass new technologies while retaining the level of standardization that makes it the benchmark for innovation in the modern datacenter. But the kernel is expected to change – technology is not static. What I think is the more important evolution over the past quarter century is the change to the Linux project itself: as the kernel has grown, so has the project, scaling in size and complexity to outpace the R&D departments of any single technology giant.
So what are these changes?
- The upstream is now valuable (to the enterprise)
In the earliest days of Linux, the upstream – the community project itself – provided the key bits of code to make Linux work. It was common to have five different versions of the same driver (e.g., disk, network, graphics): one upstream, and numerous other “forks” where companies added their “value add.” This was largely because, in the early days, companies didn’t yet see the long-term value of a single authoritative source, and many were unfamiliar with community interaction. The result was a poor user experience, with users left to hunt down the “right” version.
The problem was the branching this caused – as drivers and technologies like real-time support spread out, there was no longer a single source of truth. Maintainers would change jobs or roles, these branches would die out, and customers were left stranded when it came to updates. To combat this, companies no longer try to maintain their own forks; they now understand the importance of the upstream and actively contribute to the kernel project rather than shortcutting with their own versions.
- Decentralization to succeed
As with many open source projects, the initial governance structure of the Linux project required that all code be reviewed by Linus Torvalds. He was, of course, the progenitor of the operating system, and the project’s name is itself a modified version of his own. This worked while the project was relatively small, but as the demand for Linux grew and the kernel became more and more complex, this structure simply wasn’t sustainable.
Now, the project has various subsystems, each with its own maintainer overseeing a specialized niche of the kernel, like networking or file systems. Within each of these subsystems are additional submaintainers, who oversee even more specialized components. This was the logical way for the project to scale, and it works: the sheer number of technologies the kernel must address today requires this kind of hierarchy, one that allows for distributed control and specialization.
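As an aside for the curious, this hierarchy is visible in the kernel tree itself, in the MAINTAINERS file, which maps each subsystem to its maintainers and the paths they own. Below is a rough Python sketch of reading that mapping – it assumes a local kernel checkout at ./linux and deliberately handles only the “M:” and “F:” fields, so treat it as illustrative rather than a complete parser.

```python
# Illustrative sketch: list kernel subsystems and their maintainers by reading
# the MAINTAINERS file from a local Linux source checkout (assumed at ./linux).
# Only the "M:" (maintainer) and "F:" (file pattern) fields are handled here.

from pathlib import Path

def parse_maintainers(path):
    """Yield (subsystem, maintainers, file_patterns) from a MAINTAINERS file."""
    subsystem, maintainers, patterns = None, [], []
    for line in Path(path).read_text(errors="replace").splitlines():
        if len(line) > 1 and line[1] == ":":        # field line, e.g. "M:", "L:", "F:"
            key, _, value = line.partition(":")
            if key == "M":
                maintainers.append(value.strip())
            elif key == "F":
                patterns.append(value.strip())
        elif line and not line[0].isspace():        # non-indented line starts a new entry
            if subsystem and maintainers:
                yield subsystem, maintainers, patterns
            subsystem, maintainers, patterns = line.strip(), [], []
    if subsystem and maintainers:
        yield subsystem, maintainers, patterns

if __name__ == "__main__":
    # Print each subsystem with its maintainers and how many path patterns it owns.
    for name, people, patterns in parse_maintainers("linux/MAINTAINERS"):
        print(f"{name}: {', '.join(people)} ({len(patterns)} path patterns)")
```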
- A global project to meet global needs
Initially, the majority of contributions to Linux originated from Europe and North America. However, contributions rapidly spread to other countries – primarily because, from the start, the community was an online forum. Yet some portions of the globe remained underrepresented. Several factors contributed to this, including corporate acceptance of the collaborative model as well as cultural barriers, such as the overly direct and often confrontational nature of the discourse.
Now, however, Linux is truly a global community, thanks to hard work on the project’s part to fully engage with business cultures where open collaboration and communities are not the norm. There is still work to do in terms of diversity and inclusiveness, but Linux is a genuinely global project, as individuals worldwide have learned how to engage with and mentor their peers.
- Collaboration remains key
Linux was not the first open source operating system, but it has been the most successful. When the question “what would the world look like if Linux never existed?” is asked, the answer is typically something along the lines of “well, [open source OS] would have filled in.” Technically, perhaps that is true. But at the community level, other operating system projects simply could not match Linux’s openness and inclusiveness.
From the get-go, Linux was an open community, not a walled garden controlled by a clique of experts. It was combative and chaotic at times, but those were growing pains, ultimately helping the community flourish. Whether or not these other operating systems were technically “better” than Linux is a moot point; Linux had the better community approach, and it won by fostering both corporate and individual collaboration. It could be called “an operating system by the people, for the people”.
- From here…where?
Just as it did during its formative years, Linux needs to keep evolving, both as a community and as a technology. On the community level, we need to continue to self-police, ensuring active discourse without becoming overly combative, so that all individuals feel welcome to contribute. New blood is what keeps open source projects active, and these fresh ideas and faces can ultimately help us anticipate and address the IT problems of the future, potentially before they even exist.
At a technology level, Linux has long been focused on infrastructure, with sysadmins carefully hand-tuning and configuring systems for very specific purposes. With the growth of DevOps, this emphasis on infrastructure needs to give way to more automation and scripting, reducing the need for hand-tuning and providing more prescriptive default settings for the developer who just wants to “develop,” not set up a very specific server use case. It’s really about continuing to make Linux easier to use and more applicable to a much broader set of scenarios, on both the developer and operations sides, out of the box.
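To make that concrete, here is a minimal sketch of the “automate instead of hand-tune” idea: a script that applies a small set of prescriptive kernel defaults by writing to /proc/sys, rather than having an admin edit each box by hand. The specific settings and values below are placeholders, not recommendations, and actually applying them requires root.

```python
# Minimal sketch of automating kernel tuning: apply prescriptive sysctl
# defaults by writing to /proc/sys instead of hand-editing each host.
# The settings below are arbitrary examples; writing them requires root.

from pathlib import Path

DESIRED_DEFAULTS = {
    "vm.swappiness": "10",
    "net.core.somaxconn": "1024",
}

def apply_sysctl(key, value, dry_run=True):
    """Set a sysctl via /proc/sys, e.g. vm.swappiness -> /proc/sys/vm/swappiness."""
    path = Path("/proc/sys") / key.replace(".", "/")
    current = path.read_text().strip()
    if current == value:
        print(f"{key} already set to {value}")
        return
    print(f"{key}: {current} -> {value}")
    if not dry_run:
        path.write_text(value)

if __name__ == "__main__":
    for key, value in DESIRED_DEFAULTS.items():
        apply_sysctl(key, value, dry_run=True)  # flip dry_run to False to enforce
```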
Collaboration is Linux’s lifeblood, but rather than internal collaboration, at which the project excels, the next 25 years of the community will be defined by how Linux collaborates outside of the project’s boundaries. Linux needs to continue bridging divides with other communities, like OpenStack or the various platform-as-a-service projects, to ensure the continued survival not just of Linux, but of pure open source as we know it. In fact, consider the highly disruptive innovation underway right now in cloud, DevOps and data analytics. Major advancements in these spaces are built on a foundation of Linux, and each of these areas of innovation has very tight dependencies on Linux’s evolution – for example, on virtualization, security features, resource isolation, and network and I/O acceleration.
Some may ask, “Is Linux done? Does it really matter?” The answer is that Linux matters more than ever. It’s not just a base foundation; it’s now a foundational component of a much broader collection of open source capabilities that are literally changing the world. That is why nobody needs to look back nostalgically, thinking they missed the “glory days” of Linux – it has never been more influential or more core to broader technology innovation than it is right now. There’s a ton of incredibly challenging work ahead that, more than ever, will require the power of community to succeed.
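For readers who want a feel for what “resource isolation” looks like at the Linux level, here is a tiny sketch using the cgroup v2 interface. It assumes a cgroup v2 hierarchy mounted at /sys/fs/cgroup, root privileges, and the memory controller enabled for child groups; the group name and limit are arbitrary examples, not a production recipe.

```python
# Illustrative sketch of cgroup v2 resource isolation: create a cgroup,
# cap its memory, and move a process into it. Assumes cgroup v2 is mounted
# at /sys/fs/cgroup, the memory controller is delegated, and we run as root.

import os
from pathlib import Path

CGROUP_ROOT = Path("/sys/fs/cgroup")

def limit_memory(group_name, limit_bytes, pid):
    """Create a cgroup, set a hard memory ceiling, and enroll a process."""
    group = CGROUP_ROOT / group_name
    group.mkdir(exist_ok=True)
    (group / "memory.max").write_text(str(limit_bytes))  # hard memory limit
    (group / "cgroup.procs").write_text(str(pid))         # move the process in

if __name__ == "__main__":
    # Constrain the current process to 256 MiB as a demonstration.
    limit_memory("demo", 256 * 1024 * 1024, os.getpid())
```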
It’s been a great 25 years, Linux – the next 25 are poised to be even more impactful through our collective efforts.
Source: https://www.redhat.com/en/about/blog/community-collaboration-scale-quarter-century-evolution-linux-project?sc_cid=701600000011gf0AAA