What Can Blockchain Projects Learn from Open Source?

I've been involved with open source for over a decade now. I've been part of small projects with innovative ideas that grew into large projects with solid communities. I've also witnessed how dysfunctional communities can suck the energy out of projects for years. All of that happened thanks to open source development and collaboration.
More recently, I've been active in the blockchain space as well: reading, writing, and contributing to projects. I have come to the conclusion that blockchain projects are startups with open development and open business models. And to be successful, first and foremost, blockchain startups have to learn how to build communities the open source way.

Open source code

One of the fundamental premises of blockchain is decentralization and giving control and data back to the user. Such decentralization cannot be achieved without transparency and openness. If the source code is closed, it is no different from the centralized closed systems of today. Without making the code open, there is no way to read and confirm that a system is doing what it promises to do. There are projects that try to avoid this, but even they recognize that the code has to be open at least to a certain level. For example, Hedera Hashgraph (which is technically not a blockchain project, but a similar class of software) has said the code cannot be freely distributed (forked), but it will be open for review. That supports our premise: blockchain projects are, first and foremost, open source projects. Whether this can be classified as open source according to the Open Source Initiative's definition is not in the scope of this article. The point is, if the source code is not readable and verifiable, there is no point in having something run on a trustless blockchain platform.

Open runtime

In addition to the source being open, what differentiates blockchain from non-blockchain open source projects is the fact that for the former, the runtime is open as well. An open source project can be developed in the open, but then run and consumed as an open core, as a service, or as part of a closed system. Public blockchains (I'm not looking into private ones here) are permissionless: anyone can join or leave a network, and anyone can run a node or two. They represent a trustless and borderless runtime with open governance.

Open data

Another distinct aspect of blockchain is that blockchain projects, in addition to open source code and an open runtime, also have open data. Anyone can fork the code (the client application), fork the data (the blockchain history), and start a new network. That ultimately makes blockchain projects the most open software systems ever to exist. Open code, open data, an open runtime, and an open business model ensure openness in multiple dimensions.

Open business model

Blockchain startups are a unique mix of open source development and open value capture models, blended into one at the source code level. While a non-blockchain open source project is typically used for creating value through collaborative development and open adoption, capturing value happens through a separate business model. That business model can be thought out in advance or defined later: SaaS, open core, subscription, etc. With blockchain projects, the business model is described in a white paper, and the token model that captures value is implemented in the source code in advance. All of that makes blockchain projects a unique blend of value creation and instant value capture and distribution.

Why be so open?

Most blockchain projects aim to become some kind of platform or hub with open standards and protocols that will attract developers and subsequently be consumed by users. The primary way these platforms and protocols attract developers is not through technical superiority over non-blockchain technology, but through the unique decentralization characteristics achieved by openness in multiple dimensions. These platforms have to be open in order to become more attractive than the existing closed systems, which already have all the developers and users on them. Being open is a prerequisite not only for transparency, but also for distribution and adoption. That is especially valid for projects that aim to be consumed as a platform or protocol by developers rather than by end users. Open source is the primary way for developers to explore, learn, and start using a project.

Isn't "open" a weakness?

There was a time when being open source was considered a dangerous act, as a competitor could copy and steal the code or the ideas. Recent times have proved that being open source is the primary path to developer adoption, especially for developer-centric platforms, tools, and libraries. But as we saw above, blockchain also means an open runtime and open data, which means anybody can fork the code and the data and start a parallel network. That makes a project vulnerable to even more kinds of splits, forks, and value grabs. And we have seen this happen many times with forks of the most popular blockchain networks, such as Bitcoin and Ethereum. Yet these projects perform better than projects that look for ways to prevent forking but lack the ability to attract followers. That is because being open is actually a sign of strength. If a network is open and has survived forks and attacks, its community only becomes stronger.
We can observe the act of being open not only in projects, but also in people and organizations. Today, people and organizations rush to share and show off their knowledge through open source code, conference talks, blogging, tweeting, etc. Innovation is happening so fast in certain areas that by the time somebody can understand and copy an idea, its inventor will have created the next one. And being a copycat in a winner-takes-all market has a negative network effect on community growth. In the journey to conquer the closed and centralized systems, being open is the primary weapon.

Hype is different from a community

I've seen many times how successful Initial Coin Offering (ICO) investors measure the hype around a project before an early investment. Typically, such a measure works only when the early investment is accompanied by an early exit. In practical terms, that means identifying the most hyped ICO and selling all tokens as soon as they hit an exchange. Measuring hype is done with simple statistics: Twitter followers, Facebook followers, Reddit subscribers, Telegram users, etc. These metrics have little value for measuring community strength, for the following reasons:
  • Metrics are artificially inflated with fake accounts, paid followers, subscribers, etc;
  • The ICOs themselves run airdrop campaigns and distribute tokens for following, subscribing, joining, etc;
  • These are the wrong metrics for measuring a developer-centric community;
What I mean by the latter is that an open source project that is going to be used by developers (as a platform, protocol, or whatever else) should measure developer activity rather than airdrop-hunter activity. None of the actions mentioned above builds stickiness in a project's community. In fact, all of these activities purposefully skew the community metrics using temporary incentives.

Community over market cap

The Apache Software Foundation (ASF) is one of the biggest and oldest software foundations, home to hundreds of popular open source projects. And there, we (I'm a member, committer, and PMC member there) hold a very fundamental belief: "Community over Code". As a software foundation, we are all about code and wouldn't have a reason to exist without it, but this slogan codifies how we do things and how we go about decision making. The ASF is first a home for communities rather than a repository for code. The code is the by-product of a good and healthy community, so we first try to grow healthy communities united around projects.

If we look, for example, at how an ASF project measures its quarterly activity and progress, it is by the number of mailing list subscribers, emails sent, issues opened/closed, pull requests created/merged, software releases made, and committers and PMC members voted in. The last one is a very important long-term indicator of the health of a project, measuring the ultimate level of commitment of community members to its success. These are all metrics of activities performed by technical people rather than temporarily incentivized airdrop hunters. These activities are harder to fake, as they require doing something for the project (usually consuming brain power and time) rather than clicking a like/follow button, which is easier to outsource.

A blockchain project has a more complex ecosystem than an open source project alone. There are developers, but also miners (or their equivalent for running the network), investors, and eventually users. Measuring only the developer activity won't be indicative enough of the full ecosystem, but focusing on the right metrics would be a good start.
In a similar spirit to the ASF's "Community over Code", I think cryptocurrencies would benefit from "Community over Market Cap". A healthy community is a far more important long-term measure than a temporarily large market cap. The price of a token/coin and its market cap can be artificially manipulated or temporarily affected by a bear market. A strong and healthy community can hodl and survive the ups and downs. An unhealthy community, without any stickiness to the project, would fall apart anyway.

Building communities the blockchain way

Are there good examples of building stickiness and community around new blockchain projects? I have seen a few projects that recognized the importance of the community from the very beginning and approached their token sales in genuinely unique ways. These projects aimed to familiarize prospective early investors with the project's goals, white paper, and mission, not only to ask for their money. There are definitely more examples, but the projects with unique token sale processes I have seen are the following.
  • The DFINITY project had a registration process that cost close to $10. They then gave that money back in the form of swag: a free t-shirt. But it was a good method to weed out the people who were there only for the noise and not willing to commit even 10 bucks.
  • The QuarkChain ICO process had a quiz with 25 fairly difficult questions. In order to join the token sale, one had to have been part of their Telegram channel from the early days, have a good score on the quiz, and pass a lottery. While the lottery and Telegram channel components were already present in other ICOs at the time, the quiz actually forced candidates to find the answers in a short time and learn about the project (that led to a black market for quiz answers, but it was a nice attempt at the least).
  • One of the best executions of community building during an ICO phase has been that of Mainframe, which ran three crowdgift campaigns:
  • Proof of Being - where tokens were literally, physically dropped from the air in certain locations around the world. To get tokens, one had to get to the meetup, meet the team, and grab some tokens.
  • Proof of Freedom - where participants had to answer why Mainframe's mission mattered to them and submit the answer in any form: tweet, blog post, audio, video, drawing, etc. I also took part in it by writing a blog post.
  • Proof of Heart - where participants were asked to donate Ether, which then went to a few non-profit organizations.
We can see how Mainframe used three different methods (each with its pros and cons) to build stickiness, awareness, and community around its project, and even managed to raise money for non-profit organizations.
Blockchain projects are especially sensitive to Metcalfe's law: their value is proportional to the square of the size of their community. A token not used by anybody is worth nothing. A platform without developers is a zombie platform. Building a community around a crypto project is as important as building the platform itself, if not more so. While the crypto world knows how to raise money, the open source world knows how to build communities. Each can learn something from the other.

Follow me on Twitter for other posts in this space. A shorter version of this post was originally published on under CC BY-SA 4.0. If you prefer, read the same post on Medium.

The New Kingdom Builders

Today’s developers aren’t just kingmakers; thanks to blockchain, they’re building their own kingdoms.

The New Kingmakers

"The New Kingmakers" by Stephen O'Grady is a great book explaining why developers are the most important asset a business has. In it, Stephen explains how developers are shaping products in new ways, and how organizations that understand and embrace the value of this shift will be the most successful in the years to come. It shows how IT decision makers aren't making the decisions any longer; the developers are. They have the power to make or break businesses, whether through their experience, their talent, or their passion. The book also quotes legendary CEOs (the Kings, if we keep the book's analogy) quantifying developers:
  • Steve Jobs believed that an elite talent was 25 times more valuable to Apple than an average alternative.
  • Facebook's Mark Zuckerberg said that someone who is exceptional in their role is not just a little better than someone who is pretty good, they are 100 times better.
  • For Bill Gates, the number was 10,000 times.
In summary, every business is a software business, and developers are its most important constituency. Developers can make a company great or break it apart. They can help a company conquer the world and make new kings (hence, The New Kingmakers).

The New Kingdom Builders

The era of the kingmaker developers has not ended. The urge to hire the best developers continues. The quest for engaging with developers and winning developer mind-share and acceptance continues. The open cloud, open standards, free developer tools, open patents, and open source are just the latest tools in this endeavor. The fact that open source has become ubiquitous is an indication of how important it is to be accepted by developers.

While I agree with what is written above, I think we have come to a new era where developers can conquer the world by creating their own kingdoms. Today there is a technology that allows great developers to challenge established kingdoms, build new kingdoms, and become kings themselves. That technology is the blockchain.

Open source is a collaborative software development and distribution model that allows people with common interests to produce something that no individual can create on their own. It allows the best ideas to spread openly and be implemented collectively. It allows great developers to express their creativity and mastery of a subject. But open source doesn't allow capturing value. Open source produces value; then a business model built separately on top of it captures that value.

While open source is a value creation model, blockchain is a value distribution and capture model. Open source is a development-time characteristic, whereas blockchain is a runtime characteristic of software. Developers who can combine both (not only create but also capture value with code) can create new kingdoms out of thin air. That is possible because value capturing in blockchain-based projects is embedded in their core; it is part of the code, rather than a distinct model built on top of the code that allows somebody else to monetize it. Open source unites mainly techies in creating something new, but the blockchain model can unite investors through ICOs to support the best ideas; it can unite miners (not investors, not users, but a new class of actors in this ecosystem) to run the network nodes; and it can attract the final consumers who care about decentralization and transparency. Blockchain closes the end-to-end cycle of building and then running software in the open, i.e. from the idea stage to mass consumption.

To be precise, open source is not a prerequisite for capturing value, but it helps with value creation. Also, there are other technologies similar to blockchain, such as tangle and hashgraph, which follow a similar distribution and value capture model. The common characteristic among all of them is that value capturing and distribution is embedded in the technology itself rather than being a separate model.

Here are a few of the pioneer kingdom creators of our time:
  • Satoshi Nakamoto wrote the Bitcoin white paper. Ten years later, his/her/their vision is worth over $100 billion. More importantly, that vision gave the spark for many more to follow.
  • Vitalik Buterin created the Ethereum project which is worth over $1 billion now.
  • Daniel Larimer created Steem project worth over $250 million and then created EOS that is close to $5 billion.
  • Charles Hoskinson created Cardano worth over $2 billion.
  • Jed McCaleb created Ripple and then Stellar projects, worth billions...
Clearly, the list above contains only the geniuses and visionaries of this new world. These are the Linus Torvalds of blockchain, and there are only a handful of them. But there are many who follow them. There are many who use the platforms created by these geniuses, come up with new business models, and experiment with creating their own kingdoms. Some will fail, and some will succeed. But clearly, there is a new path for developers to conquer the world, and this time for real.

Follow me on Twitter for other posts in this space. This post was originally published on under CC BY-SA 4.0. If you prefer, read the same post on Medium.

Cloud Native Container Design Principles

Creating a containerized application that behaves like a good cloud native citizen and can be automated effectively by a platform such as Kubernetes requires some discipline. See below for what that involves. (Alternatively, read the same post on Medium.)

Software design principles

Principles exist in many areas of life, and they generally represent a fundamental truth or belief from which others are derived. In software, principles are rather abstract guidelines, which are supposed to be followed while designing software. There are fundamental principles for writing quality software such as KISS (Keep it simple, stupid), DRY (Don’t repeat yourself), YAGNI (You aren’t gonna need it), SoC (Separation of concerns), etc. Even if these principles do not specify concrete rules, they represent a language and common wisdom that many developers understand and refer to regularly.

There are also the SOLID principles introduced by Robert C. Martin, which represent guidelines for writing better object-oriented software. They form a framework of complementary principles that are generic and open to interpretation but still give enough direction for creating better object-oriented designs. The SOLID principles use object-oriented primitives and concepts such as classes, interfaces, and inheritance for reasoning about object-oriented designs. In a similar way, there are also principles for designing cloud native applications, in which the main primitive is the container image rather than a class. Following these principles will ensure that the resulting containers behave like good cloud native citizens, allowing them to be scheduled, scaled, and monitored in an automated fashion.

SOLID principles for cloud native applications

Cloud native applications anticipate failure; they run and scale reliably even when their infrastructure experiences outages. To offer such capabilities, cloud native platforms like Kubernetes impose a set of contracts on applications. These contracts ensure that the applications they run conform to certain constraints, which allows the platform to automate application management. Nowadays, it is possible to put almost any application in a container and run it. But creating a containerized application that can be automated and orchestrated effectively by a cloud native platform such as Kubernetes requires additional effort. The principles for creating containerized applications listed here use the container as the basic primitive and container orchestration platforms as the target container runtime environment.

Principles of container-based application design
Below is a short summary of what each principle dictates. For more details, download the freely available white paper from here (no signup required).

Build time:

  • Single Concern: Each container addresses a single concern and does it well.
  • Self-Containment: A container relies only on the presence of the Linux kernel. Additional libraries are added when the container is built.
  • Image Immutability: Containerized applications are meant to be immutable, and once built are not expected to change between different environments.

Runtime:

  • High Observability: Every container must implement all necessary APIs to help the platform observe and manage the application in the best way possible.
  • Lifecycle Conformance: A container must have a way to read events coming from the platform and conform by reacting to those events.
  • Process Disposability: Containerized applications must be as ephemeral as possible and ready to be replaced by another container instance at any point in time.
  • Runtime Confinement: Every container must declare its resource requirements and restrict resource use to the requirements indicated.
The build time principles ensure that containers have the right granularity, consistency, and structure in place. The runtime principles dictate what functionalities must be implemented in order for containerized applications to function as good cloud native citizens. By adhering to these principles, we are more likely to create containerized applications that are better suited for automation on cloud native platforms such as Kubernetes. Check out the white paper for more details.
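As a rough sketch of how two of the runtime principles surface in practice, the Pod manifest below declares health probes (High Observability) and resource requests and limits (Runtime Confinement). The image name, endpoint paths, and resource values are illustrative assumptions, not taken from the white paper:

```yaml
apiVersion: v1
kind: Pod
metadata:
  name: random-generator                  # hypothetical name
spec:
  containers:
  - name: app
    image: example/random-generator:1.0   # hypothetical image
    ports:
    - containerPort: 8080
    # High Observability: expose endpoints the platform can probe
    livenessProbe:
      httpGet:
        path: /healthz
        port: 8080
    readinessProbe:
      httpGet:
        path: /readyz
        port: 8080
    # Runtime Confinement: declare and cap resource consumption
    resources:
      requests:
        cpu: 100m
        memory: 128Mi
      limits:
        cpu: 200m
        memory: 256Mi
```

A container that additionally handles SIGTERM for graceful shutdown (Lifecycle Conformance) and keeps no irreplaceable local state (Process Disposability) can then be freely rescheduled by the platform.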

The Cathedral and the Bazaar: Moving from Barter to a Currency System

This post was originally published as "How blockchain can complement open source" on under CC BY-SA 4.0. If you prefer, you can also read the same post on Medium.

Open Won Over Closed

The Cathedral and the Bazaar is the classic open source story, written 20 years ago by Eric Steven Raymond. In it, Eric describes a revolutionary new software development model in which complex software projects are built without (or with very little) central management. This new model is open source. Eric's story compares two models:
  • The classic model (represented by the cathedral) where software is crafted by a small group of individuals in a closed and controlled environment through slow and stable releases.
  • And the new model (represented by the bazaar) where software is crafted in an open environment where individuals can participate freely, but still produce a stable and coherent system.
Some of the reasons open source has been so successful can be traced back to the founding principles Eric describes. Releasing early, releasing often, and accepting that many heads are inevitably better than one allow open source projects to tap into the world's pool of talent (and not many companies can match that using the closed source model).

Two decades after Eric's reflective analysis of the hacker community, we see open source becoming dominant. It is no longer a model only for scratching a developer's personal itch; instead, it is the place where innovation happens. It is the model that even the world's largest software companies are transitioning to in order to continue dominating.

A Barter System

If we look closely at how the open source model works in practice, we realize that it is a closed system, exclusive to open source developers and techies. The only way to influence the direction of a project is by joining the open source community, understanding the written and unwritten rules, learning how to contribute and follow the coding standards, etc., and doing it yourself. This is how the bazaar works, and it is where the barter system analogy comes from. A barter system is a method of exchanging services and goods for other services and goods in return. In the bazaar, where the software is built, that means in order to take something, you also have to be a producer yourself and give something back in return: you exchange your time and knowledge for getting something done. A bazaar is a place where open source developers interact with other open source developers and produce open source software, the open source way.

The barter system is a great step forward, an evolution from the state of self-sufficiency where everybody has to be a jack of all trades. The bazaar (the open source model), using the barter system, allows people with common interests and different skills to gather, collaborate, and create something that no individual can create on their own. The barter system is simple and lacks the complex problems of modern monetary systems, but it also has some limitations, to name a few:
  • Lack of divisibility - in the absence of a common medium of exchange, a large indivisible commodity/value cannot be exchanged for a smaller commodity/value. For example, even if you want to make a small change in an open source project, you may still have to go through a high entry barrier.
  • Storing value - if a project is important to your company, you may want to have a large investment/commitment in it. But since it is a barter system among open source developers, the only way to have a strong say is to employ many open source committers, and that is not always possible.
  • Transferring value - if you have invested in a project (trained employees, hired open source developers) and want to move focus to another project, it is not possible to transfer expertise, reputation, and influence quickly.
  • Temporal decoupling - the barter system does not provide a good mechanism for deferred or advance commitments. In the open source world, that means a user cannot express their commitment or interest in a project in a measurable way in advance, or continuously for future periods.
Below, we will see what the back door to the bazaar is and how to address these limitations.

A Currency System

People hang around the bazaar for different reasons: some are there to learn, some are there to scratch a developer's personal itch, and some work for large software firms. And since the only way to have a say in the bazaar is by becoming part of the open source community and joining the barter system, many large software companies, in order to gain credibility in the open source world, pay these developers a salary and employ them. The latter represents the use of a currency system to influence the bazaar. Open source is no longer only for scratching the personal developer itch. It also accounts for a significant part of the overall software production worldwide, and there are many who want to have an influence.

Open source sets the guiding principles through which developers interact and build a coherent system in a distributed way. It dictates how a project is governed, how software is built, and how the output is distributed to users. It is an open consensus model for decentralized entities building quality software together. But the open source model does not cover how open source is subsidized. Whether it is sponsored directly or indirectly, through intrinsic or extrinsic motivators, is irrelevant to the bazaar.
Centralized and decentralized ecosystems supporting open source

Currently, there is no equivalent of the decentralized open source development model for subsidization purposes. Most open source subsidization is centralized and monopolized, typically by one company which dominates a project by employing the majority of that project's open source developers. And to be honest, this is currently the best-case scenario, as it guarantees that the developers will be employed and the project will continue flourishing: while the company works on its paid software or services (whether that is SaaS or something else), it subsidizes open source development indirectly.

There are also exceptions to the project monopoly scenario: for example, some of the Cloud Native Computing Foundation projects are developed by a large number of competing companies. Also, the Apache Software Foundation aims for its projects not to be dominated by a single vendor by encouraging diverse contributors, but most of the popular projects, in reality, are still single-vendor projects.

What we are still missing is an open and decentralized model that works like the bazaar, without central coordination and ownership, where consumers (open source users) and producers (open source developers) interact with each other, driven by market forces and open source values. In order to complement open source, such a model also has to be open and decentralized, and this is why blockchain technology would fit best here. It would create a complementary ecosystem that flows subsidies from users to developers without a centralized and monopolizing entity (such as an open source company). There are already successful open source cryptocurrency projects, such as Decred, Dash, Monero, and Zcash, that use a similar decentralized funding model, where a portion of the mining or donation subsidy goes toward their own development.

Most of the existing platforms (blockchain or non-blockchain) that aim to subsidize open source development target primarily bug bounties and small, piecemeal tasks. A few focus on funding new open source projects. But not many aim to provide mechanisms for sustaining the continued development of open source projects; basically, a system that would emulate the behavior of an open source service provider company, or of an open core or open source based SaaS product company: ensuring developers get continued and predictable incentives, and guiding the project's development based on the priorities of the incentivizers, i.e. the users. Such a model would address the limitations of the barter system listed above:
  • Divisibility - if you want something small fixed, you can pay a small amount without paying the full premium of becoming an open source developer for the project.
  • Storing value - you can invest a large amount in a project to ensure its continued development and make sure your voice is heard.
  • Transferring value - at any point, you can stop investing in a project and move your funds into other projects.
  • Temporal decoupling - regular recurring payments and subscriptions become possible.
There would also be other benefits arising purely from the fact that such a blockchain-based system is transparent and decentralized: quantifying a project's value and usefulness based on its users' commitment, decentralized roadmap governance, decentralized decision making, etc. While there will still be users who prefer more centrally managed software, there will be others who prefer the more transparent and decentralized way of influencing projects. There is enough room for all parties.


On the one hand, we see large companies hiring open source developers and acquiring open source startups and even foundational platforms (such as Microsoft buying GitHub). Many, if not most, long-running successful open source projects are centralized around a single vendor. The significance of open source, and its centralization, is a fact.

On the other hand, the challenges around sustaining open source software are becoming more apparent, and many are investigating this space and its foundational issues more deeply. There are a few projects with high visibility and a large number of contributors, but there are also many other important projects without enough contributors and maintainers.

There are many efforts trying to address the challenges of open source through blockchain. These projects aim to improve transparency and decentralization, subsidize development, and establish a direct link between open source users and developers. This space is still very young but progressing fast, and with time, the bazaar is going to have a cryptocurrency system.

Given enough time, and adequate technology, decentralization is happening at many levels:
  • The Internet is a decentralised medium that unlocked the world's potential for sharing and acquiring knowledge.
  • Open source is a decentralized collaboration model that unlocked the world's potential for innovation.
  • And similarly, blockchain can complement open source and become the decentralized open source subsidization model.
Follow me on Twitter for other posts in this space.

The Rise of Non-Microservices Architectures

(This post was originally published on Red Hat Developers, the community to learn, code, and share faster. To read the original post, click here.)

This is a short summary of my recent experiences with customers implementing architectures similar to microservices but with different characteristics in the post-microservices world.

The microservices architectural style has been around for close to five years now, and much has been said and written about it. Today, I see teams deciding not to follow certain principles of the "pure" microservices architecture strictly and breaking some of the "rules". Teams are now better informed about the pros and cons of microservices: they take context-driven decisions, respect team experience and organizational boundaries, and accept the fact that not every company is Netflix. Below are some examples I have seen in my recent microservices gigs.

No premium in advance

Teams (composed of devs, ops, testers, business analysts, architects, etc.) are becoming more and more aware of the premium they have to pay for the privilege of going with a pure microservices based architecture. A typical Java based microservice running on Kubernetes (the most popular microservices platform) will require: a git repository, a Maven module, a collection of tests (unit, integration, acceptance), APIs, Maven artifacts, container images, configurations, secure configurations, build pipelines, design, documentation, etc. At runtime, it will require CPU, memory, disk, networking, metrics aggregation, log aggregation, a database, endpoints, a service mesh sidecar proxy, etc. Also a collection of Kubernetes objects: container, volume, configmap, secret, pod, service, replica set, deployment, etc. Navigating and managing tens or hundreds of these artifacts puts a serious burden on everybody in a team. No surprise that ThoughtWorks recently announced they do not intend to put the microservices architecture into the Adopt ring of their Technology Radar in the foreseeable future.

Raison d'ĂȘtre

Considering there is a price per service (without even counting all the hidden premiums), rather than following the original "start small, start with a few hundred lines of code" advice, systems now start as one service, in a mono repository, as long as they belong to one team. Then, for every service, there must be a clearly identified reason, with benefits justifying its existence as an independent microservice. That is a mandatory existence check before carving out a service as standalone. Talking about reasons, below are a few valid ones.

Breaking the bounded context

While the most talked-about method of decomposition into microservices is decomposition by bounded context, in practice there are many more reasons for creating microservices: decomposition by maturity, decomposition by data-access pattern (read vs. write), decomposition by data source (rather than partitioning a data source per microservice, create a microservice per data source), aggregation for a derived functionality (create an orchestrating service for a few other services), aggregation for client convenience (such as the backend-for-frontend pattern), aggregation to aid system performance, etc.

Shared data sources

One of the fundamental principles of microservices is that every service has a separate data store. While in theory this principle makes perfect sense, in practice, for brownfield projects, it is the hardest part about microservices, and not always worth the effort. That is especially true for integration projects, where the data source is typically owned by a different team or company and cannot be partitioned to start with. It is still possible to benefit from having independent services share the same data store, by acknowledging the future constraints caused by the data source level coupling.

Less inflated expectations

The good news is that teams are now making more informed decisions rather than blindly trusting microservices conference slides. In terms of Gartner's Hype Cycle, that means, after a couple of years of "Inflated Expectations", the microservices architecture is heading (down) towards the "Trough of Disillusionment" stage, where expectations are more aligned with the real benefits.
Hype cycle
From here on, the future is full of enlightenment and productivity. Unless another cycle (such as serverless) starts before we reap the benefits of this one.

Mutated microservices

Microservices favor event-driven interactions and choreography over orchestration to decrease service coupling. But at the same time, we have seen projects like Cadence by Uber and Conductor by Netflix, created specifically to orchestrate distributed long-running services as an alternative to the choreography approach.

Bernd Ruecker has done a very good review of using events, orchestration and workflow engines in distributed systems, analysing their real versus perceived benefits.

In a different post, titled Microservices in a Post-Kubernetes Era, I also described the changes in the microservices architectural style driven purely by Kubernetes and the cloud native primitives.

There are others who have written about non-microservices architectures, such as self-contained systems, miniservices, and goodbye-microservices, that are better alternatives to pure microservices in certain contexts. These are all good reasons for breaking away from pure microservices principles whenever the context requires it.


The best architecture is the context-driven architecture, where you take a well-understood architecture and adapt it to your needs. You question every principle and every rule, and you are not afraid of breaking away from some of the prescriptive elements, as long as you understand and accept the consequences. A good analogy is "The map is not the territory". If the architecture is the map, the context is the territory. Let me know which microservices rules you are breaking and whether it works even better for you. Be brave.

From Agile to Serverless

If you prefer, read the same post on Medium

Looking back. And forth.

The microservices architecture was born as a technological answer to the iterative Agile development methodology. In the early days of microservices, many companies were doing a form of Agile (XP, Scrum, Kanban, or a broken mixture of these) during development, but the existing software architectures didn't allow an incremental way of designing and deploying. As a result, features were developed in fortnightly iterations but deployed only every six to twelve months. Microservices came as a panacea at the right time, promising to address all these challenges. Architects and developers strangled the monoliths into tens of services, which enabled them to touch and change different parts of the system without breaking the rest (in theory).

Microservices on their own shone a light on the existing challenges of distributed systems and created new ones as well. Creating tens of new services didn't mean they were ready to deploy into production and use. The process of releasing them and handing them over to Ops teams had to be improved. While some fell for the extreme of "You Build It, You Run It", others joined the DevOps movement. DevOps meant better CI/CD pipelines, better interaction between Devs and Ops, and everything that takes. But a practice without the enabling tools wasn't a leap, and burning a large VM per service didn't last very long. That led to containers in the Docker format, which took over the IT industry overnight. Containers came as a technical solution to the pain of the microservices architecture and the DevOps practice. With containers, applications could be packaged and run in a format that Devs and Ops would both understand and use. Even in the very early days, it was clear that managing tens or hundreds of containers would require automation, and Kubernetes came from the heavens and swept away all the competition in one swing.

Now, we do Agile development and use microservices architectures. We have one-click build and deployment pipelines with a unified application format. We practice DevOps, manage hundreds of microservices at scale, and do blue-green and canary releases with Kubernetes. This was supposed to be the time of long-lasting technological peace, where we could focus on the business problems and solve them. But it was not.

From Agile to Serverless and Beyond

With Kubernetes, Devs can develop and Ops can deploy in a self-service manner. But there are still only one or two company-wide Kubernetes clusters, and they occasionally run out of resources. The cluster is now the center of the universe. And it is even more critical than the monolith was before, as it runs the build server, the git server, the Maven repository, the website, MongoDB, and a few other things. So a promise of a serverless platform came from the clouds. It came even before containers reached production. With serverless, every service would have its own cluster, a cluster where resources are limited only by the limit of your credit card. It is a platform to which you couple tightly, and on which you bet the existence of your company, for the price of going faster. So it started all over again.

How Blockchain will Influence Open Source

This post was originally published under CC BY-SA 4.0. If you prefer, read the same post on Medium.

Interactions between users and developers enabled by blockchain technology can create self-sustaining, decentralized open source.

What Satoshi Nakamoto started as Bitcoin a decade ago has found a lot of followers and turned into a movement for decentralisation. For some, blockchain technology is a religion that will have the same impact on humanity as the Internet did. For others, it is another hype, and a technology suitable only for Ponzi schemes. While blockchain is still evolving and trying to find its place, one thing is for sure: it is a disruptive technology that will fundamentally transform certain industries. And my bet is, open source will be one of them.

The Open Source Model

Open source is a collaborative software development and distribution model that allows people with common interests to gather and produce something that no individual can create on their own. It allows the creation of value that is bigger than the sum of its parts. It is enabled by distributed collaboration tools (IRC, email, git, wikis, issue trackers, etc.), distributed and protected by the open source licensing model, and often governed by software foundations (such as the Apache Software Foundation (ASF), the Cloud Native Computing Foundation (CNCF), etc.).

One interesting aspect of the open source model is the lack of financial incentives at its core. There are some who believe open source work should remain detached from money and be a free and voluntary activity driven only by intrinsic motivators (such as "common purpose" and "for the greater good"). And there are others who believe open source work should be rewarded directly or indirectly through extrinsic motivators (such as financial incentives). While the idea of open source projects prospering only through voluntary contributions is a very romantic one, in reality the majority of open source contributions are made through paid development. Yes, we have a lot of voluntary contributions, but those come on a temporary basis from contributors who come and go, or for exceptionally popular projects while they are at their peak. Creating and sustaining open source projects that are useful for enterprises requires developing, documenting, testing and bug fixing for prolonged periods, even when the software is no longer shiny and exciting. It is a boring activity that can be best motivated through financial incentives.

Commercial Open Source

Software foundations such as the ASF accept and survive on donations and other income streams such as sponsorships, conference fees, etc. But those funds are primarily used to run the foundations: to ensure there is legal protection for the projects, and that there are enough servers to run builds, issue trackers, mailing lists, etc.
In a similar manner, the CNCF has member fees (and other similar income streams) used to run the foundation and provide resources for the projects. Nowadays, most software is not built on laptops; it is run and tested on hundreds of machines in the cloud, and that requires money. Creating marketing campaigns, brand designs, distributing stickers, etc. all take money, and some foundations can assist with that as well. At their core, foundations implement the right processes for interacting with users and developers, and control mechanisms for distributing the available financial resources to the open source projects for the common good.

If users of open source projects can donate money and the foundations can distribute it in a fair way, what is missing then? What is missing is a direct, transparent, trusted, decentralized, automated, bidirectional link for the transfer of value between open source producers and open source consumers. Currently, the link is either:

  • Unidirectional: a developer (I'm saying a developer, but think of it as any role involved in the production, maintenance and distribution of software) can use their brain juice and devote time to make a contribution and share that value with all open source users. But there is no reverse link.
  • Or indirect: if there is a bug that affects a specific user/company, the options are:
    • To have in-house developers fix the bug and submit a pull request. That is ideal, but it is not always possible to hire in-house developers knowledgeable about the hundreds of open source projects used daily.
    • To hire a freelancer specialising in that specific open source project and pay for the services. Ideally, the freelancer is also a committer on the open source project and can change the project code directly and quickly. Otherwise, the fix might never make it into the project.
    • Or to approach a company providing services around the open source project. Such companies typically employ open source committers to influence and gain credibility in the community, and offer products, expertise, professional services, etc.
And the last option has been a successful model for sustaining many open source projects. Whether that is through services (training, consulting, workshops), support, packaging, open core, or SaaS, there are companies employing hundreds of staff to work full time on open source. There is a long list of companies that have managed to build a successful open source business model over the years, and the list is growing steadily.

The companies backing open source projects have a very important role to play in the ecosystem. They are the catalyst between the open source projects and the users. The ones that add real value do not only package software nicely; they do much more. They can identify user needs and technology trends, and create a full stack, even an ecosystem, of open source projects to address those needs. They can take a boring project and support it for years. If there is a missing piece in the stack, they can start an open source project from scratch and build a community around it. They can acquire a closed source software company and open source its projects. Here I got a little bit carried away, but yes, I'm talking about my employer, Red Hat, and what we do among other things.

To summarise: with the commercial open source model, projects are officially or unofficially managed and controlled by a very few individuals or companies that monetise them and also give back to the ecosystem by ensuring the project is successful. It is a win-win-win for open source developers, managing companies and end users. The alternative is inactive projects and expensive closed source software.

Self-sustaining, Decentralized Open Source

In order for a project to become part of a reputable foundation, it has to conform to certain criteria. For example, at the ASF and CNCF there are incubation and graduation processes, respectively, where, apart from all the technical and formal requirements, a project must have a healthy number of active committers and users. And that is the essence of forming a sustainable open source project. Having source code on GitHub is not the same as having an active open source project. The latter requires committers who write the code, and users who use the code, with both groups continuously reinforcing each other by exchanging value and forming an ecosystem where everybody benefits. Some project ecosystems might be tiny and short-lived, and some may consist of multiple projects and competing service providers, with very complex interactions lasting for many years. But as long as there is an exchange of value and everybody benefits from it, the project is developed, maintained and sustained.

If you look at the ASF Attic, you will find projects that have reached their end of life. Usually, that is the natural end of a project when it is no longer technologically fit for purpose. Similarly, in the ASF Incubator you can find tens of projects that never graduated but were retired instead. Typically, these are projects that were not able to build a large enough community, either because they are very specialized or because better alternatives are available. But there are also cases where projects with high potential and superior technology cannot sustain themselves because they cannot form or maintain a functioning ecosystem for the exchange of value. The open source model and the foundations do not provide a framework or mechanisms to help developers get paid for their work, or to get users' requests heard. There isn't a common value commitment framework for either party. As a result, some projects can only sustain themselves in the context of commercial open source, where a company acts as an intermediary and value adder between developers and users. That adds another constraint: the necessity of a service provider company for sustaining some open source projects. Ideally, users should be able to express their interest in a project, and developers should be able to show their commitment to it, in a transparent and measurable way, forming a community with a common interest and intent for the exchange of value.

Now, let's imagine there is a model, with mechanisms and tools, that enables direct interaction between open source users and developers. Not only code contributions through pull requests, questions over the mailing lists, GitHub stars and stickers on laptops, but also other ways that allow users to influence a project's destiny through richer, self-controlled and transparent means. That could include incentives for actions such as the following:

  • Funding open source projects directly (rather than through software foundations)
  • Influencing the direction of projects through voting (by token holders)
  • Feature requests driven by user needs
  • On-time pull request merges
  • Bounties for bug hunts
  • Incentives for better test coverage
  • Rewards for up-to-date documentation
  • Long-term support guarantees
  • Timely security fixes
  • Expert assistance, support and services
  • Budget for evangelism and promotion of the projects
  • Budget for regular boring activities
  • Fast email and chat assistance
  • Full visibility into the overall project funding, etc.
If you haven't guessed already, I'm talking about using blockchain, and smart contracts that will allow such interactions between users and developers. Smart contracts that will put power in the hands of token holders to influence projects.
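As a rough illustration of the token-holder governance idea, here is a plain Java stand-in for what would in practice be a smart contract (all names are made up for this sketch): votes are weighted by token balance, so a user's financial commitment translates directly into influence over the project.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of token-weighted project governance.
// A real implementation would be a smart contract on-chain.
class TokenGovernance {
    private final Map<String, Long> tokenBalance = new HashMap<>();
    private final Map<String, Long> proposalWeight = new HashMap<>();

    // Users acquire tokens by funding the project.
    void mint(String holder, long tokens) {
        tokenBalance.merge(holder, tokens, Long::sum);
    }

    // A vote counts proportionally to the tokens the voter holds.
    void vote(String holder, String proposal) {
        long weight = tokenBalance.getOrDefault(holder, 0L);
        proposalWeight.merge(proposal, weight, Long::sum);
    }

    long weightOf(String proposal) {
        return proposalWeight.getOrDefault(proposal, 0L);
    }
}
```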

The usage of blockchain in the open source ecosystem.
The existing channels in the open source ecosystem provide ways for users to influence projects through financial commitments to the service providers, or through other limited means via the foundations. But the addition of blockchain based technology to the open source ecosystem could open new channels for interaction between users and developers. I'm not saying this will replace the commercial open source model. Most companies working with open source do much more and cannot be replaced by smart contracts. But smart contracts can spark a new way of bootstrapping new open source projects. A new way to give a second life to commodity projects that are a burden to maintain. A new way to motivate developers to apply boring pull requests, write documentation, get tests to pass, etc. A direct value exchange channel between users and open source developers. It can add new channels through which open source projects can grow and be self-sustaining in the long term, even when company backing is not feasible. A new complementary model for self-sustaining open source projects. A win-win.

Tokenizing Open Source

There are already a number of initiatives aiming to tokenize open source. Some are focused only on the open source model, and some are generic but apply to open source development as well. Here is the list I have built so far:
  • Gitcoin - grow open source, one of the most promising ones in this area.
  • Oscoin - cryptocurrency for open source
  • Open Collective - platform for supporting open source projects.
  • Fund Yourself
  • Kauri - support for open source project documentation.
  • Liberapay - recurrent donations platform.
  • FundRequest - a decentralized marketplace for open source collaboration.
  • CanYa - which recently acquired Bountysource, the world's largest open source P2P bounty platform.
  • OpenGift - a new model for open source monetization.
  • Hacken - a white hat token for hackers.
  • Coinlancer - decentralized job market.
  • CodeFund - open source ad platform.
  • IssueHunt - funding platform for open source maintainers and contributors.
  • District0x 1Hive - crowdfunding and curation platform.
  • District0x Fixit - github bug bounties.
The list above is varied and growing rapidly. Some of these projects will disappear, some will pivot, but a few will emerge as the SourceForge, the ASF, the GitHub of the future. Not necessarily replacing those platforms, but complementing them with token models and creating a richer open source ecosystem. Then every project can pick its distribution model (license), governing model (foundation), and incentive model (token). In all cases, this will pump fresh blood into the open source world.

The Future is Open and Decentralised

  • Software is eating the world.
  • Every company is a software company.
  • Open source is where innovation happens.
Given this, it is clear that open source is too big to fail: too important to be controlled by a few or left to its own destiny. Open source is a shared-resource system that has value to all, and more importantly, it must be managed as such. It is only a matter of time until every company on earth will want a stake and a say in the open source world. Unfortunately, we don't have the tools and the habits to do it yet. Such tools would allow anybody to show their appreciation or ignorance of software projects. They would create a direct and faster feedback loop between producers and consumers, between developers and users. They would foster innovation: innovation driven by user needs and expressed through token metrics.

Enterprise Integration for Ethereum

If you prefer, read the same post on Medium.
The most popular open source Java integration library, Apache Camel, now supports Ethereum's JSON-RPC API.

The Ethereum Ecosystem

Ethereum is an open source, public blockchain platform for running smart contracts. It provides a decentralized, Turing-complete virtual machine that can execute scripts, and a cryptocurrency used to compensate participating mining nodes for computations performed, or to mitigate spam. Today, Ethereum is one of the most established and mature blockchain platforms, with interest from small and large companies, nonprofit organizations and governments. There is a lot that can be said about the Ethereum ecosystem and the pace at which it moves. But the facts talk for themselves: Ethereum has the momentum and all the indications of a technology with potential:
  • Ethereum has an order of magnitude more active developers than any other blockchain platform, and as dictated by Metcalfe's law, this gap widens day by day. The Ethereum coding school CryptoZombies has over 200K users, and the Truffle development framework has over half a million downloads.
  • The cloud platforms Amazon Web Services and Microsoft Azure offer services for one-click Ethereum infrastructure deployment and management.
  • The Ethereum technology has the interest of enterprise software companies. Customized Ethereum-based applications are being developed and experimented with by financial institutions such as JPMorgan Chase, Deloitte, R3, Innovate UK, Barclays, UBS, Credit Suisse and many others. One of the best known in this area is the permissioned version of the Ethereum blockchain developed by JPMorgan Chase, called Quorum.
  • In 2017, the Enterprise Ethereum Alliance (EEA) was set up by various blockchain start-ups, Fortune 500 companies, research groups and others, with the aim of helping the adoption of Ethereum based technology. It provides standards and resources for businesses to learn about Ethereum and leverage this groundbreaking technology to address specific industry use cases.
Ethereum has passed the moment when it was a hipster technology or a scientific experiment; now it is a fundamental open source decentralization technology that enterprise companies are looking into. Talking about open source and the enterprise, I thought I'd make my own tiny contribution to the Ethereum ecosystem and help with its adoption. Let's see what it is.

Open Source Enterprise Integration

Ethereum is distributed and decentralized, but it is mostly a closed system with its embedded ledger, currency, and executing nodes. In order to be useful for the enterprise, Ethereum has to be well integrated with existing legacy and new systems. Luckily, Ethereum offers a robust and lightweight JSON-RPC API with good support for the JavaScript language. But in enterprise companies, JavaScript is not the primary choice for integration; it is rather Java, followed by .NET. Java is not necessarily lightweight or fast evolving, but it has a huge developer community and a mature library ecosystem, making it the top choice for the majority of enterprise companies. The main factor contributing to the productivity of the Java language is the reuse of existing libraries and avoiding reinventing the wheel. One of the most popular libraries enabling such reuse for integration is Apache Camel. Luckily, Camel happens to be my passion and a project I have been contributing to for many years, so connecting the two was the most natural thing for me to do.
Apache Camel building blocks
Building blocks of Apache Camel
For those who are coming from a blockchain background and are not familiar with Camel, here is a very brief intro. Apache Camel is a lightweight open source integration library that is composed conceptually of three parts:
  • Implementations of the widely used Enterprise Integration Patterns (EIPs). (Notice these are not Ethereum Improvement Proposals, which share the same acronym.) EIPs provide a common notation, language and definition of the concepts in the enterprise integration space (think of publish-subscribe, dead letter channel, content-based router, filter, splitter, aggregator, throttler, retry, circuit breaker, etc.). Some of these patterns have been around for over a decade, and some are new, but they are well known by anyone doing messaging and distributed system integration for a living.
  • The second major part of Apache Camel is the huge connector library. Basically, as long as there is a Java library for a protocol, system endpoint, or SaaS API, most likely there is a Camel connector for it (think of HTTP, JMS, SOAP, REST, AWS SQS, Dropbox, Twitter, and now Ethereum). Connectors abstract away the complexity of configuring the different libraries and provide a unified, URI based approach for connecting to all kinds of systems.
  • And the last piece of Apache Camel is the Domain Specific Language (DSL) that wires together connectors and EIPs in a higher level, integration focused language. The DSL, combined with connectors and patterns, makes developers highly productive in connecting systems and creates solutions that are industry standard and easier to maintain for long periods. These are all characteristics that are important for enterprise companies looking to create modern solutions based on mature technology.
Companies are more integrated than ever, and the systems within companies are more integrated than ever. And if you are a Java shop, most likely there is already some Apache Camel based integration in use somewhere in the organization. Now you can use Camel and all the capabilities it provides to talk to Ethereum, too.
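For readers new to EIPs, the filter and content-based router patterns mentioned above can be reduced to a few lines of plain Java. This is only a conceptual stand-in, not Camel's actual DSL (which expresses the same ideas declaratively and adds endpoints, error handling, and so on); the endpoint names here are illustrative.

```java
import java.util.List;
import java.util.function.Predicate;
import java.util.stream.Collectors;

// Two EIPs stripped to their essence: a message filter that drops
// non-matching messages, and a content-based router that chooses a
// destination based on the message payload.
class EipSketch {
    static List<String> filter(List<String> messages, Predicate<String> keep) {
        return messages.stream().filter(keep).collect(Collectors.toList());
    }

    static String route(String message) {
        // Route order messages to the orders queue, everything else elsewhere.
        return message.startsWith("order:") ? "jms:orders" : "jms:other";
    }
}
```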

Apache Camel Connector for Ethereum

The natural intersection of the two technologies is a Camel connector for Ethereum. Such a connector allows integrating Ethereum with any other system, interaction style, and protocol. For that purpose, I evaluated the existing Java libraries for Ethereum and came to the conclusion that web3j is the right fit for this use case. Web3j is an actively developed, feature-rich Java library for interacting with Ethereum compatible nodes over JSON-RPC. Camel-web3j (the technical name of the Camel Ethereum connector) is a thin wrapper that gives an easy way to use the capabilities offered by web3j from the Apache Camel DSL. Currently, the connector offers the following features:
The full power of this integration comes not from the connector features alone, but when the connector is used together with the other connectors, patterns, and all the other Camel capabilities to provide a complete integration framework around Ethereum.
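To give a feel for what the connector wraps, here is what an eth_blockNumber call looks like at the raw JSON-RPC level, as a plain Java sketch (web3j and camel-web3j handle the HTTP transport and the typed responses for you; this only shows the wire format):

```java
// Sketch of the JSON-RPC layer underneath web3j: requests are small JSON
// payloads, and Ethereum encodes quantities as 0x-prefixed hex strings.
class JsonRpcSketch {
    static String blockNumberRequest(int id) {
        return String.format(
            "{\"jsonrpc\":\"2.0\",\"method\":\"eth_blockNumber\",\"params\":[],\"id\":%d}",
            id);
    }

    // Decode a quantity such as "0x4b7" into a long.
    static long parseQuantity(String hexQuantity) {
        return Long.parseLong(hexQuantity.substring(2), 16);
    }
}
```

A node replying with a result of `"0x4b7"` is reporting block number 1207.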

Ethereum compatible JSON-RPC APIs
Next, I'm going to focus on adding support for Parity's Personal and Geth's Personal client APIs, Ethereum wallet support, and more. The aim is to keep the component up-to-date with the web3j capabilities that are useful in system-to-system integration scenarios with Apache Camel. The connector is part of Apache Camel 2.22 and ready for early adopters to give it a try and provide feedback. To get started, have a look at the unit tests to discover how each operation is configured, and the integration tests to see how to connect to Ganache or the Ethereum mainnet. Enjoy.

Use Cases for Apache Camel

Below is the Enterprise Ethereum Architecture Stack (EEAS), which represents a conceptual framework of the common layers and components of an Enterprise Ethereum (EE) application, according to the client specification v1.0.

Enterprise Ethereum Architecture Stack
If you wonder where exactly Camel fits here: Camel-web3j is part of the tooling layer, as an integration library with a focus on system-to-system integration. It uses the public Ethereum JSON-RPC API, which any Enterprise Ethereum compatible implementation must support and keep backward compatible.
Camel would primarily be used to interact with services that are external to Ethereum but trusted by the smart contracts (so-called oracles). In a similar manner, Camel can be used to interact with enterprise management systems to send alerts and metrics, report faults, change configurations, etc.

    The main use cases I can think of for this connector are:
    • Listen for new blocks and events happening in the Ethereum network; filter, transform, enrich, and publish them into other systems. For example, listen for new blocks, retrieve their transactions, filter out uninteresting ones, enrich others, and process them. That can be done using the Ethereum node filter capabilities, or purely with Camel, using polling consumers to query a node periodically and idempotent filters to prevent processing previously processed blocks, etc.
    • The other use case is to listen for events and commands coming from an enterprise system (maybe a step in a business process) and then tell the Ethereum network about them. For example, a KYC is approved or a payment is received in one system, which causes Camel to talk to a second system, retrieve the user's ERC20 address, and perform an Ethereum transaction.
    Real-world uses of Camel would involve a more complex mixture of the above scenarios, ensuring high availability, resilience, replay, auditing, etc., which is what Camel is really good at.
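    The first use case — a polling consumer combined with an idempotent filter — can be sketched in plain Java without Camel. This is only an illustration of the pattern: the block numbers are stand-ins for real node queries, and a real route would fetch and enrich each block.

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class BlockPollerSketch {

    // Idempotent filter: remembers block numbers that were already handled,
    // mirroring what Camel's idempotent consumer does for a polling route.
    private final Set<Long> processed = new HashSet<>();

    // Handle every block between the last seen block and the node's current
    // head, skipping duplicates; returns the blocks handled in this poll.
    List<Long> poll(long currentHead, long lastSeen) {
        List<Long> handled = new ArrayList<>();
        for (long n = lastSeen + 1; n <= currentHead; n++) {
            if (processed.add(n)) {   // add() returns false for duplicates
                handled.add(n);       // here a real route would retrieve the
            }                         // block, filter, enrich, and publish it
        }
        return handled;
    }
}
```

    In Camel the same behavior comes for free from a timer or scheduled consumer plus the idempotentConsumer EIP, which is one reason the framework is a good fit here.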

    Ethereum Oracle Implemented in Apache Camel

    "Talk is cheap. Show me the code." - Linus Torvalds

    On many occasions, smart contracts need information from the real world to operate. An oracle is, simply put, a smart contract that is able to interact with the outside world. To demonstrate the usage of Camel-web3j, I created a Camel route that represents an oracle. The route listens for CallbackGetBTCCap events on a specific topic, and when such an event is received, the Camel route generates a random value and passes it to the same contract by calling the setBTCCap method. That is basically a "Hello world!" the Ethereum way.
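    The oracle loop described above — receive an event, produce a value, write it back to the contract — can be sketched as follows. The contract interface and the value range are stand-ins for the real web3j-generated contract wrapper and real market data; this is a minimal illustration of the flow, not the actual route.

```java
import java.util.Random;

public class OracleSketch {

    // Stand-in for the web3j-generated contract wrapper; in the real route,
    // setBTCCap would issue an Ethereum transaction against the contract.
    interface MarketCapContract {
        void setBTCCap(long cap);
    }

    private final MarketCapContract contract;
    private final Random random = new Random();

    OracleSketch(MarketCapContract contract) {
        this.contract = contract;
    }

    // Invoked when a CallbackGetBTCCap event is observed; mirrors the demo
    // route, which answers the request with a random value.
    long onCallbackGetBTCCap() {
        long cap = 100_000 + random.nextInt(900_000);  // illustrative range
        contract.setBTCCap(cap);
        return cap;
    }
}
```

    In the actual demo, the event subscription and the setBTCCap call are both endpoints in one Camel route, so the whole oracle is a few lines of route DSL.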

    To trigger the event, you can call the updateBTCCap method on the smart contract using the following unit test:
    mvn test -Dtest=CamelOracleRouteTest#updateBTCCap
    To check the current price in the contract, you can call the getBTCCap method on the smart contract using the following unit test:
    mvn test -Dtest=CamelOracleRouteTest#getBTCCap
    Check out the full instructions, the smart contract, and the Camel routes on GitHub and try it for yourself. If you use the component and have questions or feedback, or if you like this and are interested in implementing a Camel connector for other blockchain projects, reach out. Take care. ETH tips jar: 0x6fc1bF6A69B92C444772aCE4CB040705Afd255bD

    Cryptocurrencies with Bus Factor of One

    The Bus Factor

    As defined by Wikipedia, the bus factor is a measurement of the risk resulting from capabilities not being shared among members of an endeavour, "in case they get hit by a bus". When looking at cryptocurrencies and trying to predict which ones will grow in value and survive the test of time, one risk factor to keep in mind is the bus factor. Let's look at some of the most popular cryptocurrencies and their bus factors.

    Bitcoin Bus Factor

    Let's imagine that the person on the Bus Factor Wikipedia page is the real Satoshi Nakamoto and he is about to be hit by a bus. If such a terrible event happened, since nobody knows who Satoshi is, and more importantly, since he is no longer actively involved with the Bitcoin project, it would not affect Bitcoin in the slightest.

    Bus Factor by Wikipedia

    Unless Satoshi has shared the private keys for his 980,000 Bitcoins with his grandchildren, the coins would remain locked forever, but Bitcoin would still thrive until the end of time. Since we cannot name one person as the face of Bitcoin, its bus factor is determined by the core team members and the supporting community as a whole. The risk is distributed and on a much smaller scale.

    Ethereum Bus Factor

    Now let's imagine the Ethereum founder Vitalik is declared dead. Actually, that happened in the past, when Ethereum instantly lost $4 billion in market value. That forced Vitalik to use proof of liveness to calm the markets down.
    Vitalik's self proof of liveness
    Vitalik being the creator and still very actively involved in defining the project vision keeps Ethereum's bus factor at a solid 1, meaning it takes only one bus accident to significantly impact the project.

    That is not to suggest that Bitcoin is infinitely decentralised and Ethereum is centralised. Actually, Cornell professor Emin Gün Sirer has shown that Ethereum is more distributed and decentralised than Bitcoin from a node-distribution point of view: Ethereum nodes are better distributed and spread out compared to Bitcoin nodes, the majority of which are managed by a few large miners.
    But from a vision, leadership, and influence point of view, by hiding their real identity, Satoshi potentially removed the most centralised point in a decentralised system. Ethereum is the second most popular currency, with a huge community already that can support it for years ahead, but Vitalik still remains the most centralised point in the Ethereum system.

    Bus Factor in Action

    The bus factor is not dictated only by life-and-death situations. A similar risk hovered over Ethereum when Vitalik tweeted his views about child porn. The bus factor can express itself in many different, unpredictable situations and ways. A recent example that comes to mind is when the founder of ZClassic (ZCL), Rhett Creighton, decided to fork and abandon the project to start the Bitcoin Private project. The price chart below is self-explanatory about the result of such an action.
    ZClassic drops 97%

    An example on the opposite side is the creator of Litecoin, Charlie Lee, who took a different route and publicly announced that he had sold all of his Litecoin to remain impartial to the project. He is basically an evangelist for Litecoin now, trying to help the project survive without him.

    Personality Cult Coins

    There are other examples where a coin has a market cap close to the budget of a small country and is primarily driven by a project founder or advisor. The common pattern to be aware of is a one-to-one association between a project and a single person, which is an indication of a bus factor of one.
    A recent tweet by Kevin Pham pointed out coins that are driven by a single person. Expanding that list, I came to the following one:
    Notice that the people listed here have a much bigger influence on their projects than a developer would have. Most of them are the visionary, the face, the flesh and blood of their coins. That is a much bigger risk than the classic bus factor, which typically measures a project's dependency on a software developer or sysadmin role.

    Surviving Bus Accidents

    I've been working with open source for over a decade now, and there are good lessons to learn from the Apache Software Foundation (ASF) and other open source foundations about creating long-lived project ecosystems. The main criterion at the ASF for a project to graduate from incubation to a top-level project is that it has built a self-sustaining community: a diverse community (from multiple organizations) of committers, contributors, and users. There is a similar criterion at the Cloud Native Computing Foundation (CNCF), where the focus is on supporting organizations rather than individuals as at the ASF. Ideally, you want senior developers and visionaries from multiple organizations, together with independent contributors.

    In the crypto world, one of the primary criteria for measuring a community is the number of Telegram/Twitter/Reddit followers. While that is an indication of user interest, it can easily be manipulated, and it is in fact a common and almost mandatory practice nowadays for projects to pump these statistics through bounty programs and airdrops. On the other hand, growing the developer community and the project visionaries is much harder and takes longer, and it is a more accurate indicator of long-term success. Make your picks wisely.

    Upvote this article on Steemit or ETH Tips Jar: 0x8b8945972392ed8E6983B6EE66e9777eE5Bd81aD

    About Me