
Camel Rebirth with Subsecond Experiences

This post was originally published here. Check out my new Kubernetes Patterns book and follow me @bibryam for future blog posts.

A look at the past decade

The integration space is in constant change. Many open source projects and closed source technologies have not passed the test of time and have disappeared from middleware stacks for good. After a decade, Apache Camel is still here and becoming even stronger for the next decade of integration.

Gang of Four for integration

Apache Camel started life as an implementation of the Enterprise Integration Patterns (EIP) book. Today, these patterns are the equivalent of the object-oriented Gang of Four design patterns, but for the messaging and integration domain. They are agnostic of programming language, platform, and architecture, and they provide a universal language, notation, and description of the forces around fundamental messaging primitives.

But the Camel community did not stop with these patterns; it kept evolving and adding newer patterns from the SOA, Microservices, Cloud Native, and Serverless paradigms. As a result, Camel turned into a generic, pattern-based integration framework suitable for multiple architectures.

Universal library consumption model

While the patterns gave the initial spark to Camel, its endpoints quickly became popular and turned into a universal protocol for using Java-based integration libraries as connectors. Today, there are hundreds of Java libraries that can be used as Camel connectors through the Camel endpoint notation. It takes a while to realize that Camel can also be used without the EIPs and the routing engine: it can act as a connector framework where all libraries are consumed as universal URIs, without needing to understand the library-specific factories and configurations that vary widely across Java libraries.
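As a minimal sketch of that consumption model (the endpoint parameters here are hypothetical, but the URI notation is the point), the same syntax drives an FTP client library on one side and a Kafka client library on the other, with no library-specific factories in sight:

    import org.apache.camel.builder.RouteBuilder;

    public class OrdersRoute extends RouteBuilder {
        @Override
        public void configure() {
            // Two very different Java libraries, consumed through one URI notation
            from("ftp://orders@myserver/incoming?password=secret&delete=true")
                .to("kafka:orders?brokers=localhost:9092");
        }
    }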

The right level of abstraction

When you talk to developers who have not used Camel in anger before, they will tell you that it is possible to do integration without Camel. And they are right about the 80% of easy use cases, but not about the remaining 20% that can turn a project into a multi-year frustrating experience. What they do not realize yet is that without Camel there are multiple manual ways of doing the same thing, none of them validated by the experience of hundreds of open source developers. And if you fast-forward a few years, with tens of different systems to integrate, tens of developers coming and going, and hundreds of microservices, an integration project can quickly turn into a bespoke home-grown framework that nobody wants to work on. Doing integration is easy, but doing good integration that will evolve and grow for many years, by many teams, is the hardest part. Camel addresses this challenge with universal patterns and connectors, combined with integration-focused DSLs, all of which have passed the test of time. Next time you think you don't need Camel, you are either thinking of short-term gains, or you do not yet realize how complex integration can become.
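To give a taste of those DSLs, here is a minimal, hypothetical route using the Content-Based Router pattern from the EIP catalog (queue and header names are made up for illustration):

    // Inside a RouteBuilder.configure(): a Content-Based Router in Camel's Java DSL.
    // Each message is routed based on a header set by the sender.
    from("jms:queue:orders")
        .choice()
            .when(simple("${header.type} == 'book'"))
                .to("direct:bookOrders")
            .otherwise()
                .to("direct:otherOrders");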

Embracing change

It takes only a couple of painful experiences in large integration projects to start appreciating Camel. But Camel is not great only because it was started by and built on the work of great minds; it is great because it evolves thanks to the world's knowledge, shared through the open source model and its network effects. Camel started as the routing layer in ESBs during the SOA period, with a lot of focus on XML, WS, JBI, OSGi, etc., but it quickly adapted to REST, Swagger, circuit breakers, Sagas, and Spring Boot in the Microservices era. And the world has not stopped there: with Docker and Kubernetes, and now the Serverless architecture, Camel keeps embracing change. That is because Camel is written for integrating changing environments, and Camel itself grows and shines on change. Camel is a change-enabling library for integration.

Behind-the-scenes engine

One of Camel's secret sauces is that it is a non-intrusive, unopinionated, small (5MB and getting smaller) integration library with no affinity for where and how you consume it. Notice that this is the opposite of an ESB, which Camel is commonly confused with because of its extensive capabilities. Over the years, Camel has been used as the internal engine powering projects such as:
  • Apache ServiceMix ESB
  • Apache ActiveMQ
  • Talend ESB
  • JBoss Switchyard
  • JBoss Fuse Service Works
  • Red Hat Fuse
  • Fuse Online/Syndesis 
  • And many other frameworks mentioned here.
You can use Camel standalone, embed it in Apache Tomcat, or run it with Spring Boot starters, JBoss WildFly, Apache Karaf, Vert.x, Quarkus, you name it. Camel doesn't care, and it will bring superpowers to your project every time.
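As a sketch of the standalone case (route and timer are illustrative), bootstrapping Camel in a plain JVM process takes only a few lines:

    import org.apache.camel.builder.RouteBuilder;
    import org.apache.camel.main.Main;

    public class Standalone {
        public static void main(String[] args) throws Exception {
            Main main = new Main();                  // plain JVM, no container required
            main.addRouteBuilder(new RouteBuilder() {
                @Override
                public void configure() {
                    from("timer:tick?period=1000").log("Camel is alive");
                }
            });
            main.run(args);                          // blocks until shutdown
        }
    }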

Looking to the future

Nobody can tell what the ideal integration stack will look like in a decade, and neither can I. But I will tell you about two novelties coming into Apache Camel now (and to Red Hat Fuse later), and why they will have a noticeable positive effect for developers and the business. I call these changes the subsecond deployment and subsecond startup of Camel applications.

Subsecond deployments to Kubernetes

There was a time when cloud-native meant different things to different people. Today, after a few years of natural selection and consolidation in the industry, cloud-native means applications created specifically for Kubernetes and the ecosystem of CNCF projects around it. Even with this definition, there are many shades of cloud-native, from running a monolithic non-scalable application in a container, to triggering a function that fully embraces cloud-native development and management practices. The Camel community has realized that Kubernetes is the next-generation application runtime, and it is steadily working on making Camel a Kubernetes-native integration engine. The same way Camel is a first-class citizen in OSGi containers, Java EE application servers, and other fat-jar runtimes, Camel is becoming a first-class citizen on Kubernetes, integrating deeply with and benefiting from the resiliency and scalability offered by the platform.

Here are a few of the many enhancement efforts going on in this direction:
  • Deeper Kubernetes integration - a Kubernetes API connector, a full health-check API implementation for Camel subsystems, service discovery through the new ServiceCall EIP, and configuration management using ConfigMaps. Then there is a set of application patterns with special handling on Kubernetes, such as clustered singleton routes, scalable XA transactions (because sometimes, you have to), a Saga pattern implementation, etc.
  • Cloud-native integrations - support for other cloud-native projects, such as exposing Camel metrics to Prometheus, tracing Camel routes through Jaeger, JSON-formatted logging for log aggregation, etc.
  • Immutable runtimes - whether you use the minimalist immutable Karaf packaging or Spring Boot, Camel is a first-class citizen ready to be put in a container image. There are also Spring Boot starter implementations for all Camel connectors, and integration with routes, properties, converters, and whatnot.
  • Camel 3 - it is real and actively progressing. A big theme of Camel 3 is to make it more modular, smaller, faster to start, reactive, non-blocking, and triple awesome. This is the groundwork needed to restructure Camel for the future cloud workloads.
  • Knative integration - Knative is an effort started by Google to bring some order and standardization to the serverless world dominated by AWS Lambda. Camel is among the projects that have integrated with Knative primitives from the early days, and it enhances the Knative ecosystem with hundreds of connectors acting as generic event sources.
  • And here is a real game-changer initiative: Camel-K (a.k.a. deep Kubernetes integration for Camel). We have seen that Camel is typically embedded into the latest modern runtime, where it acts as the developer-friendly integration engine behind the scenes. Just as Camel used to benefit from Java EE services for hot deployment, configuration management, transaction management, etc., today Camel-K allows the Camel runtime to benefit from Kubernetes features for high availability, resiliency, self-healing, auto-scaling, and distributed application management in general. Camel-K achieves this through a CLI and an Operator, where the latter understands Camel applications, their build-time dependencies, and runtime needs, and makes intelligent choices from the underlying Kubernetes platform and its additional capabilities (from Knative, Istio, OpenShift, and others in the future). It can automate everything on the cluster, such as picking the best-suited container image and runtime management model, and updating them when needed. The CLI automates the tasks on the developer machine, such as observing code changes, streaming them to the Kubernetes cluster, and printing the logs from the running Pods.
Camel route auto-deployment to Kubernetes with Camel-K
The Camel-K operator understands two domains: Kubernetes and Camel. By combining knowledge of both areas, it can automate tasks that usually require a human operator.
The really powerful part is that, with Camel-K, a Camel route can go from source code to running on Kubernetes in less than a second!
Time to deploy and run a Camel integration (in seconds)
Forget about making a coffee, or even having a sip, while building and deploying a Camel route with Camel-K. As soon as you make changes to your source code and open a browser, the Camel route will be running in Kubernetes. This has a noticeable impact on the way developers write Camel code, compile, drink coffee, deploy, and test. Apart from changing development practices and habits, this toolset will significantly shorten development cycles, which business stakeholders will notice too. For a live demonstration, check out the following video from the Fuse engineers working on the Camel-K project.
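To make this concrete, here is a minimal, hypothetical Camel-K integration file (the file name and route are examples); with the kamel CLI, you would run something like kamel run Hello.java, and the operator takes care of building and deploying it to the cluster:

    // Hello.java - a self-contained Camel-K integration file
    import org.apache.camel.builder.RouteBuilder;

    public class Hello extends RouteBuilder {
        @Override
        public void configure() {
            from("timer:tick?period=3000")
                .setBody().constant("Hello from Camel-K on Kubernetes")
                .to("log:info");
        }
    }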

Subsecond startups of Camel applications

A typical enterprise integration landscape is composed of stateless services, stateful services, clustered applications, batch jobs, file transfers, messaging, real-time integrations, and maybe even blockchain-based business processes. To that mix, today, we also have to add serverless workloads, which are best suited for event-driven use cases. Historically, the heavy and slow Java runtime had significant drawbacks compared to Go, JavaScript, and other light runtimes in the serverless space. That is one of the main motivations for Oracle to create GraalVM/Substrate VM. Substrate VM is a framework that enables ahead-of-time (AOT) compilation of Java applications into native executables that are light and fast. A recent effort by Red Hat then led to the creation of the Quarkus project, which further improves the resource consumption, startup, and response times of Java applications, mind-blowingly so (a term not used lightly here).

Supersonic Subatomic Java with Quarkus
As you can see from the metrics above, Quarkus combined with Substrate VM is not a gradual evolution. It is a mutation, a revolutionary jump that suddenly changes the perspective on Java's footprint and speed in cloud-native workloads. It makes Java friendly for the serverless architecture. Considering the huge Java ecosystem of developers and libraries, it even turns Java into the best-suited language for serverless applications, and Camel combined with Quarkus into the best-placed integration library in this space.

Summary

With the explosion of the Microservices architecture, the number of services has increased tenfold, which gave birth to Kubernetes-enabled architectures. These architectures are highly dynamic in nature, and they are most powerful with light and fast runtimes that enable instant scale-up and higher deployment density.

Camel is the framework to fill the space between disparate systems and services. It offers data consistency guarantees, reliable communication, failover, failure detection and recovery, and so on, in a way that makes developers productive. Now, imagine the same powerful Apache Camel-based integration in 2020: it deploys to Kubernetes in 20ms, starts up in 20ms, requires 20MB of memory, and consumes 20MB on disk, regardless of whether it runs as a stateless application in a container or as a function on Knative. That is 100x faster deployments to Kubernetes, 100x faster startup time, and 10x less resource consumption, allowing real-time scale-up, scale-down, and scale-to-zero. That is a change that developers will notice during development, users will notice when using the system, and the business will notice in the infrastructure cost and overall delivery velocity. That is the real cloud-native era we have been waiting for.

Getting started with blockchain for Java developers

Follow me on twitter for other posts in this space. This post was originally published on Opensource.com under CC BY-SA 4.0. If you prefer, read the same post on Hacker Noon.

Top technology prognosticators have listed blockchain among the top 10 emerging technologies with the potential to revolutionize our world in the next decade, which makes it well worth investing your time now to learn. If you are a developer with a Java background who wants to get up to speed on blockchain technology, this article will give you the basic information you need to get started.
Blockchain is a huge space, and at first it can be overwhelming to navigate. Blockchain is different from other software technologies, as it has a parallel non-technical universe with a focus on speculation, scams, price volatility, trading, ICOs, cryptocurrencies, Bitcoin maximalism, game theory, human greed, etc. Here we will ignore that side of blockchain completely and look at the technical aspects only.

The theoretical minimum for blockchain

Regardless of the programming language and implementation details, there is a theoretical minimum about blockchain that you should be familiar with. Without this understanding, it is impossible to grasp the foundations and build on them. Based on my experience, the minimum two technologies that must be understood are Bitcoin and Ethereum. Both projects introduced something new in this space, and both currently have the highest market caps and the largest developer communities. Most other blockchain projects, whether they are public or private, permissionless or permissioned, are forks of Bitcoin or Ethereum, or build on them and improve on their shortcomings by making certain trade-offs. Understanding these two projects is like taking networking, database theory, messaging, data structures, and two programming language classes at university. Understanding how these two blockchain technologies work will open your mind to the blockchain universe.
Tech books to start with blockchain
The two books I recommend for this purpose happen to be from the same author, Andreas M. Antonopoulos:
  • Mastering Bitcoin is the most in-depth, technical, but still understandable and easy-to-read book I could find about Bitcoin. The tens of other books I checked on this topic were mostly philosophical and non-technical.
  • On the Ethereum side, there are many more technical books, but I liked the level of detail in Mastering Ethereum most.
  • Building Ethereum Dapps is another book I found very thorough; it covers Ethereum development very well.

Most popular Java based blockchain projects

If you are coming from a technical background, it makes sense to build on that knowledge and see what blockchain brings to the table. In the end, blockchain is not a fully new technology, but a new combination of existing technologies with human behavior, fueled by network effects.

It is worth stating that popular technologies such as Java, .NET, and relational databases are not common in the blockchain space. This space is primarily dominated by C, Go, and Rust on the server side, and JavaScript on the client side. But if you know Java, there are a few projects and components written in Java that can be used as a leveraged entry point into the blockchain space.
Assuming you have read the two books above and want to get your hands dirty, here are a few open source blockchain projects written in Java:
Popular Java-based blockchain projects
  • Corda - this is probably the most natural starting point for a Java developer. Corda is a JVM-based project that builds on top of popular, widely used Java projects such as Apache Artemis, Hibernate, Apache Shiro, Jackson, and relational databases. It is inspired by Bitcoin but has elements of business processes, messaging, and other familiar concepts. Check out my first impressions of it as a Java developer here.
  • Pantheon - a full implementation of an Ethereum node in Java. It was specifically created to attract developers from the Java ecosystem into the blockchain world. Here is an intro and a getting-started video by its creators.
  • BitcoinJ - is the most popular Java implementation of the Bitcoin protocol. If you prefer to start with Bitcoin directly, this is the Java project to explore.
  • Web3J - while Corda and Pantheon are examples of full blockchain nodes implemented in Java, Web3J is a client library written in Java. It is a very well documented and active project that makes talking to Ethereum-compatible nodes straightforward (see the sketch after this list). I created an Apache Camel connector for it and wrote about it here.
  • Hyperledger Fabric Java SDK - one of the most popular enterprise blockchain projects is Hyperledger Fabric and it has a full-featured Java SDK to play with.
  • FundRequest - I also want to point you to full end-user applications written in Java. While the projects above are examples of clients or nodes, FundRequest is an open source funding platform implemented on top of the Ethereum network and fully written in Java. It gives a good idea of how to implement a complete blockchain project interacting with the Ethereum network.
  • Eventum - this is a Java project that can help you monitor the Ethereum network and store events on Kafka. It addresses a few of the common challenges of integrating with blockchain networks, which are decentralized.
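To get a feel for how approachable this Java tooling is, here is a minimal, hypothetical Web3J snippet that connects to an Ethereum-compatible node and reads the latest block number (the node URL is an assumption; any JSON-RPC endpoint will do):

    import org.web3j.protocol.Web3j;
    import org.web3j.protocol.http.HttpService;

    public class LatestBlock {
        public static void main(String[] args) throws Exception {
            // Connect to an Ethereum-compatible JSON-RPC endpoint (example URL)
            Web3j web3 = Web3j.build(new HttpService("http://localhost:8545"));
            // Fetch and print the latest block number
            System.out.println(web3.ethBlockNumber().send().getBlockNumber());
        }
    }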
If you are still not sure where to start, I suggest you read Mastering Bitcoin; that will give you a solid foundation. If you like touching technology before reading, go to GitHub and play with one of the projects listed above. The rest will follow. The future is open and decentralized.

The next integration evolution - blockchain

Below is the conclusion from an article I wrote for TechCrunch. Check out the full article here.

Enterprise integration has multiple nuances. Integration challenges within an organization, where all systems are controlled by one entity and participants have some degree of trust in each other, are mostly addressed by modern ESBs, BPMs, and Microservices architectures. But when it comes to multi-party B2B integration, there are additional challenges. These systems are controlled by multiple organizations, have no visibility into each other's business processes, and do not trust each other. In these scenarios, we see organizations experimenting with a new breed of blockchain-based technology that relies not only on sharing protocols and contracts but on sharing end-to-end business processes and state.
Integration evolution stages

And this trend is aligned with the general direction in which integration has been evolving over the years: from sharing the very minimum of protocols, to sharing and exposing more and more in the form of contracts, APIs, and now business processes. This shared integration infrastructure enables new transparent integration models, where previously private business processes are jointly owned, agreed, built, maintained, and standardized using the open source collaboration model. This can motivate organizations to share business processes and form networks to benefit further from joint innovation, standardization, and deeper integration in general.

Open Source and Deforestation

Follow me on twitter for other posts in this space. If you prefer, read the same post on Medium.

Open source is like a forest

A forest is a complex ecosystem of plants, animals, microorganisms, and non-living material, all delicately balanced by nature. It requires the right geography, the right soil, the right amount of rain and sun, and decades to build a forest.

So is open source. An open source project is a delicate ecosystem of contributors, reviewers, users, and supporting organizations, all balanced by a feeling of community. It requires the right ideas at the right time, the right group of developers, the right technology, an enormous amount of dedication and passion, and years to build a project.

Forests are home to many species; they are a source of oxygen, clean water, and air; they prevent floods and block winds; and they are a source of wood when used in a sustainable manner. Forests offer endless benefits to many when consumed responsibly and without being destroyed completely.

So is open source. Open source is the place where newbies learn to collaborate, communicate, and code. It is the place where the experienced innovate, standardize, and distribute cutting-edge software. It is the place where remote developers scratch their itch, get paid, and the result benefits everybody. The open source model provides the foundations of the digital infrastructure of modern human life.

Deforestation

In the late 1960s, the deforestation of the Amazon (not the company, but the rainforest) started at an enormous rate. Trees were cleared, lands were transformed. The delicately balanced rainforests were destroyed irrecoverably, leading to varying degrees of soil loss, erosion, landslides, climate change, and even changed weather patterns. And some companies captured enormous value from destroying the commons of nature by cutting the trees for wood and fuel in an unsustainable manner.

Tragedy of the commons image by Wikipedia

Today, open source is the latest battleground that could have consequences similar to deforestation. It is a battleground of business models, a battleground of small against large, of consultancy and tool producers against large cloud service providers.

On one hand, there are the small companies employing open source developers to build and maintain open source projects, becoming the de facto consultancy and tooling providers around that technology. The open source model helps attract customers for these small businesses, and they contribute their work back in return. It is a win-win: free riders, paying customers, and maintainers all benefit, and the open source projects keep gaining wider adoption.

But enterprise customers don’t want consultancy or tooling; they want to use technology in a fast, scalable, reliable, and secure way, and focus their resources on the business domain instead. And this is what cloud services offer. If that premise is true, any widespread open source technology will eventually be wrapped as a cloud service, and companies will prefer to use it there rather than run it themselves in-house with the help of consultancy or proprietary automation tools.

That leaves the small companies that benefit from and contribute to open source out of business. As Thomas Dinsmore said on Twitter: “It’s impossible to argue that software should be open AND the originators have the sole right to monetize. If you believe in the latter, the solution is commercial software. With a license key.” Any company benefiting from the open source model is also accepting the risk of eventually being eaten by a large cloud service provider. These are the rules of the open source game, and they are fair.

That makes the future of these open source projects unclear. When a technology is offered as a cloud service, the smaller companies start protecting their investments by introducing additional licenses and moving away from being truly open source. We have seen that with Redis, MongoDB, Kafka, and others will follow eventually. That also means the small companies will have less incentive to develop the open source project beyond the open-core elements, and instead will focus more effort on their non-open source, value-adding competitive edges.

As for the cloud providers, they are not obliged to sustain open source by license, nor forced to by their business model. If a project stagnates and loses contributors and users, a cloud provider can jump to the next popular project and offer that as a service quicker than anybody. That is likely to change the dynamics of contribution, from small and mid-range companies to individuals (as early indications around Redis suggest) or to cloud providers newly active in the open source arena. This is a new reality, and we are yet to see how open source adapts to it.

A sustainable future

The mightiest corporations of our times capture enormous value by using the commons of the open source community, wrapping the projects into services and offering them for the cost of hardware usage. While benefiting from open source unconditionally and without any expectations is perfectly fine according to the license agreements and the rules of capitalism, benefiting without contributing back a fair share, nor helping with sustainability, has the effect of deforestation on the ecosystem. In the absence of a sustainable model, the delicate balance of contributors and users can easily be broken, leading to confusion, fear, and license changes, leading to multi-license projects, leading to discouraged contributors, leading to cautious users, changing the open source model as we all know it, irreversibly.

The good news is that the software industry is increasingly getting educated on how open source works and what it takes to produce and sustain it. From recent reactions on social media, we can see that there is no legal, but there is a moral, expectation for the companies benefiting most from open source to play their part in sustaining it, by being exemplary with their actions and not only exploiting the commons that the other contributors are building and the whole society is relying upon.

Today, every household has something made of wood. We can keep it that way by sustaining our forests. Every business depends on something made of open source. We can keep it that way by sustaining our open source ecosystem.

A Java developer’s first impressions from Corda

Follow me on twitter for other posts in this space. If you prefer, read the same post on Medium.

Recently I had a chance to play a little bit with Corda, the open source, permissioned, JVM-based blockchain platform. I was surprised to discover how it blends blockchain ideas with commodity middleware technology and creates a new breed of decentralized enterprise integration. Below are my first impressions of it, along with an Apache Camel connector contribution.

What is Corda? 

Corda is a decentralized database and business process platform designed and built from the ground up for the implementation of legal agreements among identifiable parties. It is a DLT implementation heavily influenced by Bitcoin's UTXO model and driven by the "enterprisy" requirements of the financial industry. Corda is written in Kotlin, runs on the JVM, and uses many of the proven middleware technologies. As such, compared to other blockchain platforms, Corda offers a low entry barrier for Java developers experienced with integration, messaging, and business process management.

Design Principles

  • Permissioned (instead of a permissionless network such as Bitcoin, Ethereum, etc.) - this is no surprise, as enterprise blockchain use cases are primarily focused on automating the business integration challenges among identifiable parties.
  • Point-to-point (instead of global transaction broadcasts as in Bitcoin and others) - this enables data to be shared only among the nodes that need to know it, which also leads to improved privacy and scalability.
  • UTXO model similar to Bitcoin (instead of the account model of Ethereum) - which is the part that makes Corda a DLT/Blockchain rather than a distributed business process management platform.
  • Re-use (instead of building everything from scratch) - this is the favorite part of Corda for me. Reuse of the Java ecosystem, reuse of relational databases, reuse of the messaging systems, etc.
The combination of these design principles makes Corda a unique DLT platform among its competitors. It has elements of Bitcoin's UTXO model, Ethereum's smart contract capabilities, and Fabric's private channels, and most importantly, it reuses and builds on top of existing battle-tested middleware technologies whenever possible.

Main Concepts

  • A permissioned network made up of point-to-point communicating nodes.
  • A ledger where each node maintains its unique database, rather than a single store.
  • Notary nodes that prevent double spends and validate transactions.
  • Oracle services that only sign transactions if the included facts are true (slightly different to typical oracles).
  • State objects - immutable objects that represent on-ledger facts. The state is modified through transactions and stored on owning nodes only.
  • Contracts are deterministic JVM based functions that validate the transactions.
  • Transactions are candidate updates to the ledger and must be contractually valid and signed to be committed.
  • Flows encapsulate business processes and abstract all the networking, I/O, storage, and concurrency. All smart contract activity occurs within the scope of flows, which can be started through RPC calls or other flow calls. Flows do not run within a sandbox, as is the case for contracts, but execute as regular Java code.
A Corda Flow that interacts with Node A, Node B, and the Notary Pool

In Ethereum, the concept of a smart contract encapsulates both the business logic and the state in one. In Corda, state and contract objects are separate concepts: the state is persisted, and contracts are deterministic functions (meaning that the transaction validations performed by the contracts on different nodes should produce the same result).
In addition, Corda introduces the concept of a Flow (a kind of distributed orchestration engine), which in the Ethereum world would be similar to contracts calling each other (a kind of choreography). But Corda flows are not deployed to all nodes and are not part of the shared state; rather, they are standard JVM code specific to individual nodes.
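To give a feel for the programming model, below is a minimal, hypothetical flow sketched against the Corda Java API. The flow and message are illustrative; a real flow would typically build, sign, and finalize a transaction, and a matching responder flow must be registered on the counterparty node:

    import co.paralleluniverse.fibers.Suspendable;
    import net.corda.core.flows.*;
    import net.corda.core.identity.Party;

    @InitiatingFlow
    @StartableByRPC
    public class PingFlow extends FlowLogic<String> {
        private final Party counterparty;

        public PingFlow(Party counterparty) {
            this.counterparty = counterparty;
        }

        @Suspendable
        @Override
        public String call() throws FlowException {
            // Open a session to the counterparty and exchange a message;
            // Corda handles the networking, checkpointing, and resumption.
            FlowSession session = initiateFlow(counterparty);
            return session.sendAndReceive(String.class, "ping").unwrap(data -> data);
        }
    }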

Technology Stack

Driven by the "re-use" principle, Corda is reusing existing storage, messaging and Java solutions. While blockchain platforms such as Quorum take a permissionless PoW framework such as Ethereum, and make it "enterprisy" by replacing the consensus mechanism, removing gas payments, introducing private transactions, etc., Corda takes the opposite approach. Corda, takes existing middleware technologies and applies the Bitcoin concept of UTXO and creates a new class of software that can be described as "a distributed business process and state management system". Corda achieves that through the use of commodity technologies such as relational databases for storage, and messaging for state replication and distributed business process coordination.
Main components of Corda
High level technology stack:
  • It builds with Gradle, requires Oracle JDK 8, and runs on Docker (and Linux in production).
  • A Corda node is a flat-classpath JVM application (no Spring Boot, application server, or OSGi container required).
  • Storage: relational databases - H2, PostgreSQL, SQL Server, OracleDB.
  • Object-relational mapping: JPA - Hibernate.
  • Messaging: AMQP-based - Apache ActiveMQ Artemis.
  • Metrics: Jolokia.
  • Other: Quasar, Kryo, Shiro, Jackson, etc.

Apache Camel Integration with Corda

Driven by my background in enterprise integration and my interest in blockchain, I recently created and wrote about an Apache Camel connector for Ethereum and Quorum. In the same spirit of exploring enterprise blockchains, I created an Apache Camel connector for Corda. The connector uses the Corda-RPC library and provides Camel producer/consumer endpoints to interact with a Corda node. The component offers a consumer for signing up for and receiving events from a Corda node, and a producer for sending commands to a node.

Apache Camel connector for Corda
Here is the full set of supported operations:

Consumer: vaultTrack, vaultTrackBy, vaultTrackByCriteria, vaultTrackByWithPagingSpec, vaultTrackByWithSorting, stateMachinesFeed, networkMapFeed, stateMachineRecordedTransactionMappingFeed, startTrackedFlowDynamic.

Producer: currentNodeTime, getProtocolVersion, networkMapSnapshot, stateMachinesSnapshot, stateMachineRecordedTransactionMappingSnapshot, registeredFlows, clearNetworkMapCache, isFlowsDrainingModeEnabled, setFlowsDrainingModeEnabled, notaryIdentities, nodeInfo, addVaultTransactionNote, getVaultTransactionNotes, uploadAttachment, attachmentExists, openAttachment, queryAttachments, nodeInfoFromParty, notaryPartyFromX500Name, partiesFromName, partyFromKey, wellKnownPartyFromX500Name, wellKnownPartyFromAnonymous, startFlowDynamic, vaultQuery, vaultQueryBy, vaultQueryByCriteria, vaultQueryByWithPagingSpec, vaultQueryByWithSorting.
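As a sketch of how this can look in a route (the endpoint URI and options here are assumptions; check the connector documentation for the exact format), streaming vault updates from a Corda node into a log could be as simple as:

    // A hypothetical route: consume vault updates from a Corda node RPC endpoint
    from("corda://localhost:10006?username=user1&password=test&operation=VAULT_TRACK")
        .to("log:corda-vault-updates");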

To find more about Camel, and how it can complement Corda solutions, read the Camel Ethereum connector article linked above.

Conclusion

Public permissionless blockchains are facing serious technical challenges in the form of scaling, governance, and energy waste, and non-technical challenges around speculation, regulation, and general usefulness and applicability. They have the noble idea of decentralizing everything but are yet to prove that the technology and the economic models are capable of delivering on that vision.

On the other hand, private permissioned blockchains such as Corda, Fabric, Quorum are immune to these technical challenges as they target use cases with a smaller number of identifiable parties in regulated markets. Their goal is to improve and automate the existing business models of the enterprise rather than trying to discover brand new economic models. In a sense, permissioned blockchains represent the next generation cross-organization business process and data integration systems.

In this space, Corda is not revolutionary, but rather an evolutionary platform built on top of well-established storage and middleware technology ecosystems. The blockchain technology still has to prove itself, and building on top of proven technology is the first step.

My next stop on exploring enterprise blockchains will be Hyperledger Fabric. Take care.

Tipping Points in Open Source

Follow me on twitter for other posts in this space. A shorter version of this post was originally published on Opensource.com under CC BY-SA 4.0. If you prefer, read the same post on Medium.


Over the last two decades, open source has been expanding into all aspects of technology: from software to hardware, from small disruptive startup technologies to large boring enterprise software, from open standards to open patents. In this short post, I will try to call out three tendencies that I think are reaching a tipping point in open source.

Open for Non-Coders

For good or bad, as the name open "source" suggests, this model has been primarily focused on the source code. Regardless of the intent or the beliefs, if we look at open source communities, they are primarily composed of developers working on the source code. If we look deeper, the tools used in open source projects, such as source control systems, issue trackers, mailing list names, chat channel names, etc., all assume that the developers are the center of the universe. And that has caused a big loss. It is a big loss not having creative people, designers, document writers, event organizers, community managers, lawyers, accountants, and many others as part of the open source communities. We don't have such individuals actively participating in open source communities because we don't have the processes and tools for such inclusion yet. We need and want non-code contributions, but we don't have the means to measure their value, nor ways to reward the effort in return: reward by recognition from peers, from the community, from employers, or anybody in general. As a result, it has been a lose-lose for decades. And we can see the implications in many open source projects with ugly-looking websites, amateur logos, badly written and formatted documentation, disorganized events, etc. All of that leads to the so-called "open source companies" filling the gap, when it is inherently an open source problem.

The good news is that we are getting various signals indicating it is reaching the tipping point and a change is on the way:
  • Linus Torvalds apologised for his "bad behavior". While this is not an action specifically focused on non-coders, it is a very symbolic act in making open source a non-hostile place for less technical contributors.
  • CNCF introduced the non-code contributions guide. In addition to showing how many ways there are for contributing to open source projects as a non-coder, this also sets the baseline for such contributions that other open source projects and foundations will end up following to keep up.
  • At more or less the same time, the ASF has been working in the same direction. There have been long discussions, and we will have some concrete output very soon (that is, ASF soon).
And there is a little-known secret. One thing that non-coders (and those new to open source) do not realize yet is that the easiest way to be recognized and become part of an established open source project is by performing non-coding activities. Nowadays, with complex software stacks and tough competition, there is a pretty high bar for entering a project as a committer. Performing non-coding activities is less popular, and it opens a fast backdoor into open source communities.

Macro Acquisitions

Open source may have started in the hacker community as a way of scratching developers' personal itch, but today, it is the model where innovation happens. It is the model that even the world's largest software companies are transitioning to in order to continue dominating. And why is the enterprise so interested in open source? The free "Open Source in the Enterprise" e-book I came across recently lists a few good reasons:
  • Multiplying the company’s investment through contributions.
  • Benefiting from the most recent advances and avoiding reinventing the wheel in-house.
  • Spreading knowledge of the software and its broader adoption.
  • Increasing the developer base and hiring pool.
  • Upgrading internal developer skills by learning from top coders in the field.
  • Building reputation - developers want to work for organizations they can boast about.
  • Recruiting and retaining - developers want to work on exciting projects that affect large groups of people.
  • Faster startup of new companies and projects through open source networking effect, etc.
Many of these and other benefits of open source are recognized by large organizations, which leads to even more open source adoption through company acquisitions. Building an open source company takes many years of effort in the open. Hiring developers who are good and also willing to work in the open, and building a community and a business model around a project, is a delicate effort. Companies that can manage that are particularly attractive for investment and acquisition, as they serve as a catalyst in turning the acquirer into an open source company at scale.
Open source related large acquisitions of 2018
Above is a list of the biggest open source software companies acquired in the current year. The list of open source companies and subsequent acquisitions is getting bigger every other day, and this trend is only getting stronger.

Micro OSS Fundings

In addition to the macro acquisitions of open source companies, there is also an increase in decentralized micro funding of self-sustaining open source projects.
On one end of the spectrum, there are open source projects maintained primarily by intrinsically motivated developers. On the other end, large open source companies hire developers to work on open source projects driven by company road maps and strategies. That leaves in the middle a large portion of open source projects that are not exciting enough for the accidental contributors, nor on the radar of the enterprise open source companies. In recent years, there has been an increase in platforms for funding and sustaining these open source projects through bug bounties, micro payments, recurring donations, funding, subscriptions, etc. These open source funding platforms (which I have listed at http://oss.fund) allow individuals and open source users to take responsibility for open source sustainability into their own hands and pay the open source maintainers directly. It is the same open source model, but applied to value transfer rather than code contributions.

Open source contributor funnels

The diagram above shows the three funnels for open source contributors:
  • Hobbyists contributing to open source projects because of intrinsic motivation rather than monetary value.
  • Regular, planned, and centralized subsidization by companies with an open source business model (open core, SaaS, support, services, etc.), monetizing the open source projects directly.
  • Irregular, micro, decentralized subsidization by independent open source users through OSS funding platforms.
While the hobbyists and hackers started the open source movement, it quickly turned into an enterprise monetization model. Now, having something for the remaining open source projects is welcome.

What Can Blockchain Projects Learn from Open Source?

I've been involved with open source for over a decade now. I've been part of small projects with innovative ideas which grew into large projects with solid communities. I've also witnessed how dysfunctional communities can suck the energy out of projects for years. All of that is thanks to open source development and collaboration.
In recent times, I've been active in the blockchain space as well: reading, writing, and contributing to projects. And I came to the conclusion that blockchain projects are startups with open development and open business models. To be successful, first and foremost, blockchain startups have to learn how to build communities the open source way.

Open source code

One of the fundamental premises of blockchain is decentralization and giving control and data back to the user. Such decentralization cannot be achieved without transparency and openness. If the source code is closed, that is no different from the centralized closed systems of today. Without making the code open, there is no way to read and confirm that a system is doing what it promises to do. There are projects that try to avoid it, but even they recognize that the code has to be open to a certain level at a minimum. For example, Hedera Hashgraph (which is technically not a blockchain project, but a similar class of software) has said the code cannot be freely distributed (forked), but it will be open for review. That proves our premise: blockchain projects, first and foremost, are open source projects. Whether this can be classified as open source according to The Open Source Initiative is not in the scope of this article. The point is, if the source code is not readable and verifiable, there is no point in having something run on a non-trusted blockchain platform.

Open runtime

In addition to the source being open, what differentiates blockchain from non-blockchain open source projects is the fact that for the former the runtime is open as well. An open source project can be developed in the open, but then run and consumed as an open core, as a service, or as part of a closed system. Public blockchains (not looking into private ones here) are permissionless: anyone can join and leave a network, and anyone can run a node or two. They represent a trustless and borderless runtime with open governance.

Open data

Another distinct aspect of blockchain is that blockchain projects, in addition to open source code and an open runtime, also have open data. Anyone can fork the code (the client application), fork the data (the blockchain history), and start a new network. That ultimately makes blockchain projects the most open software systems that have ever existed. Open code, open data, an open runtime, and an open business model ensure openness in multiple dimensions.

Open business model

Blockchain startups are a very unique mix of open source development and open value-capture models, all blended into one at the source code level. While a non-blockchain open source project is typically used for creating value through collaborative development and open adoption, capturing value happens through a separate business model. The business model can be thought out in advance or defined later, such as SaaS, open core, subscription, etc. With blockchain projects, the business model is described in a white paper, and the token model capturing value is implemented in the source code in advance. All of that makes blockchain projects a unique blend of value creation and instant value capture and distribution.

Why be so open?

Most blockchain projects are aiming to become some kind of platform or hub with open standards and protocols that will attract and be adopted by developers and subsequently consumed by users. The primary way these platforms and protocols attract developers is not through technical superiority over non-blockchain technology, but through the unique decentralization characteristics achieved through openness in multiple dimensions. These platforms have to be open in order to become more attractive than the existing closed systems which already have all the developers and users on them. Being open is a prerequisite not only for transparency, but also for distribution and adoption. That is especially valid for projects which are aimed to be consumed as a platform or protocol by developers rather than by end users. Open source is the primary way for developers to explore, learn, and start using a project.

Isn't "open" a weakness?

There was a time when being open source was considered a dangerous act, as a competitor could copy and steal the code or the ideas. Recent times have proved that being open source is the primary path to developer adoption, especially for developer-centric platforms, tools, and libraries. But as we have seen above, blockchain is also open runtime and open data. Which means anybody can fork the code and the data and start a parallel network. That makes a project vulnerable to even more kinds of splits/forks and value grabs. And we have seen this happen many times with forks of the most popular blockchain networks, such as Bitcoin and Ethereum. Yet, these projects are performing better than projects that look for ways to prevent forking but also lack the ability to attract followers. That is because being open is actually a sign of strength. If a network is so open and has survived forks and attacks, it makes its community only stronger.
We can observe the act of being open not only in projects, but also in people and organizations. Today, people and organizations rush to share and show off their knowledge through open source code, conference talks, blogging, tweeting, etc. Innovation is happening so fast in certain areas that by the time somebody can understand and copy an idea, the inventor of the idea will have created the next one. And being a copycat in a winner-takes-all market has a negative network effect on community growth. In the journey to conquer the closed and centralized systems, being open is the primary weapon.

Hype is different than a community

I've seen many times how successful Initial Coin Offering (ICO) investors measure the hype around a project for an early investment. Typically, such a measure works only when the early investment is accompanied by an early exit. In practical terms, that means identifying the most hyped ICO and selling all tokens as soon as they hit an exchange. Measuring such hype is done through simple statistics around Twitter followers, Facebook followers, Reddit subscribers, Telegram users, etc. These metrics have little value for measuring community strength, for the following reasons:
  • Metrics are artificially inflated with fake accounts, paid followers, subscribers, etc;
  • The ICOs themselves run airdrop campaigns and distribute tokens for following, subscribing, joining, etc;
  • These are the wrong metrics for measuring a developer-centric community;
What I mean by the latter is that an open source project that is going to be used by developers (as a platform, protocol, whatever) should measure developer activity rather than airdrop hunter activity. None of the actions mentioned above build stickiness in a project community. In fact, all of these activities purposefully skew the community metrics using temporary incentives.

Community over market cap

The Apache Software Foundation (ASF) is one of the biggest and oldest software foundations, home to hundreds of popular open source projects. And there, we (I'm a member, committer, and PMC member there) have a very fundamental belief that says: "Community over Code". As a software foundation, we are all about code and wouldn't have a reason to exist without it, but this slogan actually codifies how we do things and how we go about decision making. The ASF is first a home for communities rather than a repository for code. The code is the by-product of a good and healthy community, and we first try to grow healthy communities united around projects.

If we look, for example, at how an ASF project measures its quarterly activity and progress, it is by the number of mailing list subscribers, emails sent, issues opened/closed, pull requests created/merged, software releases done, and committers and PMC members voted in. The last one is a very important long-term indicator of the health of a project, measuring the ultimate level of commitment of community members to the success of the project. If you look at these metrics, they are all about activities performed by technical people rather than temporarily incentivised airdrop hunters. These activities are harder to fake, as they require doing something for the project (usually consuming brain power and time) rather than clicking a like/follow button, which is easier to outsource.

A blockchain project has a more complex ecosystem than an open source project alone. There are developers, but also miners (or their equivalent for running the network), investors, and eventually users. Measuring only the developer activity won't be indicative enough of the full ecosystem, but focusing on the right metrics would be a good start.
In a similar spirit to the ASF's "Community over Code", I think cryptocurrencies would benefit from "Community over Market Cap". A healthy community is a far more important long-term measure than a temporarily large market cap. The price of a token/coin and its market cap can be artificially manipulated or temporarily affected by a bear market. A strong and healthy community can hodl and survive the ups and downs. An unhealthy community, without any stickiness to the project, would fall apart anyway.

Building communities the blockchain way

Are there good examples of building stickiness and community around new blockchain projects? I have seen a few projects that recognized the importance of the community from the very beginning and approached their token sales in a completely unique way. These projects aimed to familiarize prospective early investors with the project goals, white paper, and mission, and not only ask for money. There are definitely more examples, but the projects with unique token sale processes I have seen are the following.
  • The DFINITY project had a registration process that cost close to $10. Then they gave that money back in the form of swag and a free t-shirt. But it was a good method to get rid of the people who are there only for the noise and not even willing to commit 10 bucks.
  • The QuarkChain ICO process had a quiz with 25 not-so-simple questions. In order to join the token sale, one had to be part of their Telegram channel from the early days + have a good score on the quiz + pass the lottery. While the lottery and Telegram channel components were already present in other ICOs at the time, the quiz actually forced candidates to find the answers in a short time and learn about the project (that led to a black market of quiz answers, but it was a nice attempt at least).
  • One of the best executions of community building during the ICO phase has been Mainframe's. Mainframe ran three crowdgift campaigns:
  • Proof of Being - where tokens were literally, physically dropped from the air at certain locations around the world. To get tokens, one had to go to the meetup, meet the team, and grab some tokens.
  • Proof of Freedom - where participants had to answer why Mainframe's mission mattered to them and submit the answers in any form: tweet, blog post, audio, video, drawing, etc. I also took part in it by writing a blog post.
  • Proof of Heart - where participants were asked to donate Ether, which then went to a few non-profit organizations.
We can see how Mainframe used three different methods (each with its pros and cons) to build stickiness, awareness, and community around its project, and even managed to raise money for non-profit organizations.
Blockchain projects are especially sensitive to Metcalfe's law, and their value is directly proportional to the size of their communities. A token not used by anybody is worth nothing. A platform without developers is a zombie platform. Building a community around a crypto project is as important as building the platform itself, if not more. While the crypto world knows how to raise money, the open source world knows how to build communities. They can learn something from each other.

Follow me on twitter for other posts in this space. A shorter version of this post was originally published on Opensource.com under CC BY-SA 4.0. If you prefer, read the same post on Medium.

The New Kingdom Builders

Today’s developers aren’t just kingmakers; thanks to blockchain, they’re building their own kingdoms.

The New Kingmakers

"The New Kingmakers" by Stephen O'Grady is a great book explaining why the developers are the most important assets a business has. In it, Stephen explains how developers are shaping products in new ways and organizations that understand and embrace the value of this shift will be the most successful in the years to come. It shows how IT decision makers aren’t making the decisions any longer, but the developers are. They have the power to make or break businesses, whether by their experience, their talent, or their passion. The book also has quotes from the legendary CEOs (the Kings - if we keep the book analogy) quantifying developers:
  • Steve Jobs believed that an elite talent was 25 times more valuable to Apple than an average alternative.
  • Facebook's Mark Zuckerberg said that someone who is exceptional in their role is not just a little better than someone who is pretty good, but 100 times better.
  • For Bill Gates, the number was 10,000 times.
In summary, every business is a software business, and developers are the most important constituency in it. Developers can make a company great or break it apart. Developers can help a company conquer the world and make new kings (hence, The New Kingmakers).

The New Kingdom Builders

The era of the kingmaker developers has not ended. The urge to hire the best developers continues. The quest for engaging with developers and winning developer mind-share and acceptance continues. The open cloud, open standards, free developer tools, open patents, and open source are just the latest tools in this endeavor. The fact that open source has become ubiquitous is an indication of how important it is to be accepted by developers.

While I agree with what is written above, I think we have come to a new era where developers can conquer the world by creating their own kingdoms. Today there is technology that allows great developers to challenge established kingdoms, to build new kingdoms, and to become kings themselves. That technology is the blockchain.

Open source is a collaborative software development and distribution model that allows people with common interests to produce something that no individual can create on their own. It allows the best ideas to spread openly and be implemented collectively. It allows great developers to express their creativity and mastery of a subject. But open source doesn't allow capturing value. Open source produces value, and then a business model built separately on top of open source captures the value.

While open source is a value creation model, blockchain is a value distribution and capturing model. Open source is a development-time characteristic of software, whereas blockchain is a runtime characteristic. Developers who can combine both, and not only create but also capture value with code, can create new kingdoms out of thin air. That is possible because in blockchain-based projects the value capturing is embedded in the core, as part of the code, rather than being a distinct model built on top of the code that allows somebody else to monetize it. Open source unites mainly the techies in creating something new, but the blockchain model can also unite investors through ICOs to support the best ideas; it can unite miners (not investors, not users, but a new class of actors in this ecosystem) to run the network nodes; and it can attract the final consumers who care about decentralization and transparency. Blockchain brings closure to the end-to-end cycle of building and then running software in the open, from the idea stage to mass consumption.

To be precise, open source is not a prerequisite for capturing value, but it helps with value creation. There are also other technologies similar to blockchain, such as tangle and hashgraph, which follow a similar distribution and value-capture model. The common characteristic among all of them is that value capturing and distribution are embedded in the technology itself rather than being a separate model.

Here are a few of the pioneer kingdom creators of our time:
  • Satoshi Nakamoto wrote the Bitcoin whitepaper. After 10 years, his/her/their vision is worth over $100 billion. More importantly, that vision gave the spark for many more to follow.
  • Vitalik Buterin created the Ethereum project which is worth over $1 billion now.
  • Daniel Larimer created the Steem project, worth over $250 million, and then EOS, which is worth close to $5 billion.
  • Charles Hoskinson created Cardano worth over $2 billion.
  • Jed McCaleb created the Ripple and later the Stellar projects, worth billions...
Clearly, the list above contains only the geniuses and visionaries of this new world, the Linus Torvaldses of blockchain, and they are only a handful. But there are many who follow them: many who use the platforms created by these geniuses, come up with new business models, and experiment with creating their own kingdoms. Some will fail and some will succeed. But clearly, there is a new path for developers to conquer the world, and this time for real.

Follow me on twitter for other posts in this space. This post was originally published on Opensource.com under CC BY-SA 4.0. If you prefer, read the same post on Medium.

Cloud Native Container Design Principles

Creating a containerized application that behaves like a good cloud native citizen and can be automated effectively by a platform such as Kubernetes requires some discipline. The principles below describe what that discipline is. (Alternatively, read the same post on Medium.)

Software design principles

Principles exist in many areas of life, and they generally represent a fundamental truth or belief from which others are derived. In software, principles are rather abstract guidelines, which are supposed to be followed while designing software. There are fundamental principles for writing quality software such as KISS (Keep it simple, stupid), DRY (Don’t repeat yourself), YAGNI (You aren’t gonna need it), SoC (Separation of concerns), etc. Even if these principles do not specify concrete rules, they represent a language and common wisdom that many developers understand and refer to regularly.

There are also the SOLID principles introduced by Robert C. Martin, which represent guidelines for writing better object-oriented software. They form a framework of complementary principles that are generic and open for interpretation but still give enough direction for creating better object-oriented designs. The SOLID principles use object-oriented primitives and concepts such as classes, interfaces, and inheritance for reasoning about object-oriented designs. In a similar way, there are also principles for designing cloud native applications, in which the main primitive is the container image rather than a class. Following these principles will ensure that the resulting containers behave like good cloud native citizens, allowing them to be scheduled, scaled, and monitored in an automated fashion.

SOLID principles for cloud native applications

Cloud native applications anticipate failure; they run and scale reliably even when their infrastructure experiences outages. To offer such capabilities, cloud native platforms like Kubernetes impose a set of contracts on applications. These contracts ensure that the applications they run conform to certain constraints and allow the platform to automate application management. Nowadays, it is possible to put almost any application in a container and run it. But creating a containerized application that can be automated and orchestrated effectively by a cloud native platform such as Kubernetes requires additional effort. The principles for creating containerized applications listed here use the container as the basic primitive and container orchestration platforms as the target container runtime environment.

Principles of container-based application design
Below is a short summary of what each principle dictates. To read in more detail, download the freely available white paper from here (no signup required).

Build time:

  • Single Concern: Each container addresses a single concern and does it well.
  • Self-Containment: A container relies only on the presence of the Linux kernel. Additional libraries are added when the container is built.
  • Image Immutability: Containerized applications are meant to be immutable, and once built are not expected to change between different environments (see the sketch after this list for how environment-specific configuration stays outside the image).
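
To make Image Immutability more concrete, here is a minimal sketch of externalized configuration, the usual way to keep an image unchanged between environments: the environment-specific value is injected at runtime instead of being baked into the image. It uses only the JDK; the SERVICE_URL variable name is an illustrative assumption, not something prescribed by the principles.

```java
// A minimal sketch of externalized configuration: the container image stays
// immutable, and only the injected environment variables differ per environment.
public class ExternalizedConfig {
    public static void main(String[] args) {
        // SERVICE_URL is a hypothetical variable name; a platform such as
        // Kubernetes would inject it through the pod spec's env section.
        String serviceUrl = System.getenv()
                .getOrDefault("SERVICE_URL", "http://localhost:8080");
        System.out.println("Using backing service at: " + serviceUrl);
    }
}
```

The same image can then be promoted from dev to staging to production unchanged, with only the injected values varying.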

Runtime:

  • High Observability: Every container must implement all necessary APIs to help the platform observe and manage the application in the best way possible.
  • Lifecycle Conformance: A container must have a way to read events coming from the platform and conform by reacting to those events.
  • Process Disposability: Containerized applications must be as ephemeral as possible and ready to be replaced by another container instance at any point in time.
  • Runtime Confinement: Every container must declare its resource requirements and restrict resource use to the requirements indicated.
The build time principles ensure that containers have the right granularity, consistency, and structure in place. The runtime principles dictate what functionality must be implemented in order for containerized applications to behave as good cloud native citizens. By adhering to these principles, we are more likely to create containerized applications that are better suited for automation on cloud native platforms such as Kubernetes. Check out the white paper for more details, and see the sketch below for how two of the runtime principles can look in code.
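
To make the runtime principles more tangible, below is a minimal sketch of a process that satisfies two of them using only the JDK's built-in HTTP server: it exposes a health endpoint the platform can probe (High Observability) and reacts cleanly to the platform's stop signal (Lifecycle Conformance). The class name, port, and endpoint path are illustrative assumptions rather than anything mandated by the principles.

```java
import java.net.InetSocketAddress;
import com.sun.net.httpserver.HttpServer;

public class GoodCloudNativeCitizen {
    public static void main(String[] args) throws Exception {
        // High Observability: a health endpoint the platform can poll,
        // for example as a Kubernetes liveness/readiness probe target.
        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
        server.createContext("/health", exchange -> {
            byte[] body = "OK".getBytes();
            exchange.sendResponseHeaders(200, body.length);
            exchange.getResponseBody().write(body);
            exchange.close();
        });
        server.start();

        // Lifecycle Conformance: when the platform sends SIGTERM (for example
        // during pod termination), the JVM runs shutdown hooks, giving the
        // application a chance to finish in-flight work and exit cleanly.
        Runtime.getRuntime().addShutdownHook(new Thread(() -> {
            server.stop(5); // wait up to ~5 seconds for in-flight exchanges
            System.out.println("Stop signal received, shut down cleanly");
        }));
    }
}
```

Process Disposability and Runtime Confinement are then expressed on the platform side, for example by running several replaceable replicas and declaring resource requests and limits in the deployment descriptor.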

The Cathedral and the Bazaar: Moving from Barter to a Currency System

This post was originally published as "How blockchain can complement open source" on Opensource.com under CC BY-SA 4.0. If you prefer, you can also read the same post on Medium.

Open Won Over Closed

The Cathedral and The Bazaar is the classic open source story, written 20 years ago by Eric Steven Raymond. In it, Eric describes a new, revolutionary software development model where complex software projects are built without (or with very little) central management. This new model is open source. Eric's story compares two models:
  • The classic model (represented by the cathedral) where software is crafted by a small group of individuals in a closed and controlled environment through slow and stable releases.
  • And the new model (represented by the bazaar) where software is crafted in an open environment where individuals can participate freely, but still produce a stable and coherent system.
Some of the reasons open source has been so successful can be traced back to the founding principles Eric describes. Releasing early and releasing often, and accepting the fact that many heads are inevitably better than one, allow open source projects to tap into the world's pool of talent (and not many companies can match that using the closed source model).

Two decades after Eric's reflective analysis of the hacker community, we see open source becoming dominant. It is no longer only a model for scratching a developer’s personal itch, but the place where innovation happens. It is the model that even the world's largest software companies are transitioning to in order to continue dominating.

A Barter System

If we look closely at how the open source model works in practice, we realize that it is a closed system, exclusive to open source developers and techies. The only way to influence the direction of a project is by joining the open source community, understanding the written and unwritten rules, learning how to contribute and the coding standards, and doing it yourself. This is how the bazaar works, and it is where the barter system analogy comes from. A barter system is a method of exchanging services and goods for other services and goods in return. In the bazaar, where the software is built, that means that in order to take something you also have to be a producer yourself and give something back in return, by exchanging your time and knowledge for getting something done. A bazaar is a place where open source developers interact with other open source developers and produce open source software, the open source way.

The barter system is a great step forward, an evolution from the state of self-sufficiency where everybody has to be a jack of all trades. The bazaar (the open source model), using the barter system, allows people with common interests and different skills to gather, collaborate, and create something that no individual can create on their own. The barter system is simple and lacks the complex problems of modern monetary systems, but it also has some limitations, to name a few:
  • Lack of divisibility - in the absence of a common medium of exchange, a large indivisible commodity/value cannot be exchanged for a smaller commodity/value. For example, even if you want to do a small change in an open source project, you may still have to go through a high entry barrier sometimes.
  • Storing value - if a project is important to your company, you may want to have a large investment/commitment in it. But since it is a barter system among open source developers, the only way to have a strong say is by employing many open source committers and that is not always possible.
  • Transferring value - if you have invested in a project (trained employees, hired open source developers) and want to move focus to another project, it is not possible to quickly transfer expertise, reputation, and influence.
  • Temporal decoupling - the barter system does not provide a good mechanism for deferred or advance commitments. In the open source world, that means a user cannot express their commitment or interest in a project in a measurable way in advance, or continuously for future periods.
We will see below what the back door to the bazaar is and how these limitations can be addressed.

A Currency System

People hang around the bazaar for different reasons: some are there to learn, some are there to scratch a developer’s personal itch, and some work for large software firms. And since the only way to have a say in the bazaar is by becoming part of the open source community and joining the barter system, many large software companies, in order to gain credibility in the open source world, employ these developers and pay them in money. The latter represents the use of a currency system to influence the bazaar. Open source is no longer just for scratching the personal developer itch; it also accounts for a significant part of the overall software production worldwide, and there are many who want to have an influence.

Open source sets the guiding principles through which developers interact and build a coherent system in a distributed way. It dictates how a project is governed, how the software is built, and how the output is distributed to users. It is an open consensus model for decentralized entities building quality software together. But the open source model does not cover how open source is subsidized. Whether it is sponsored, directly or indirectly, through intrinsic or extrinsic motivators, is irrelevant to the bazaar.
Centralized and decentralized ecosystems supporting open source

Currently, there is no equivalent of the decentralized open source development model for subsidization purposes. The majority of open source subsidization is centralized, typically monopolized by one company that dominates a project by employing the majority of that project's open source developers. And to be honest, this is currently the best-case scenario, as it guarantees that the developers will be paid and the project will continue flourishing: while the company works on its paid software or services (whether that is SaaS or something else), it indirectly subsidizes the open source development.

There are also exceptions to the project monopoly scenario: for example, some Cloud Native Computing Foundation projects are developed by a large number of competing companies. The Apache Software Foundation likewise aims for its projects not to be dominated by a single vendor by encouraging diverse contributors, but in reality most of the popular projects are still single-vendor projects...

What we are still missing is an open and decentralized model that works like the bazaar, without central coordination and ownership, where consumers (open source users) and producers (open source developers) interact with each other, driven by market forces and open source value. In order to complement open source, such a model also has to be open and decentralized, and this is why blockchain technology would fit here best. It would create a complementary ecosystem that flows subsidies from users to developers, but without a centralized and monopolized entity (such as an open source company). There are already successful open source cryptocurrency projects, such as Decred, Dash, Monero, and Zcash, that use a similar decentralized funding model, where a portion of the mining or donation subsidy is used for their own development.

Most of the existing platforms (blockchain and non-blockchain) that aim to subsidize open source development primarily target bug bounties and small, piecemeal tasks. A few also focus on funding new open source projects. But not many aim to provide mechanisms for sustaining the continued development of open source projects; basically, a system that would emulate the behavior of an open source service provider company, or an open core or open source based SaaS product company: ensuring developers get continued and predictable incentives, and guiding project development based on the priorities of the incentivizers, i.e., the users. Such a model would address the limitations of the barter system listed above:
  • Divisibility - if you want something small fixed, you can pay a small amount without paying the full premium of becoming an open source developer for a project.
  • Storing value - you can invest a large amount in a project to ensure its continued development and make sure your voice is heard.
  • Transferring value - at any point, you can stop investing in a project and move your funds to other projects.
  • Temporal decoupling - regular recurring payments and subscriptions become possible.
There would also be other benefits arising purely from the fact that such a blockchain-based system is transparent and decentralized: quantifying a project's value and usefulness based on its users' commitment, decentralized roadmap governance, decentralized decision making, etc. While there will still be users who prefer the more centrally managed software, there will be others who prefer the more transparent and decentralized way of influencing projects. There is enough room for all parties.

Conclusion

On the one hand, we see large companies hiring open source developers and acquiring open source startups and even foundational platforms (such as Microsoft buying GitHub). Many, if not most, long-running successful open source projects are centralized around a single vendor. The significance of open source, and its centralization, is a fact.

On the other hand, the challenges around sustaining open source software are becoming more apparent, and many are investigating this space and its foundational issues more deeply. There are a few projects with high visibility and a large number of contributors, but there are also many other, still important, projects without enough contributors and maintainers.

There are many efforts trying to address the challenges of open source through blockchain. These projects should improve transparency, decentralization, and subsidization, and establish a direct link between open source users and developers. This space is still very young but progressing fast, and with time, the bazaar is going to have a cryptocurrency system.

Given enough time and adequate technology, decentralization is happening at many levels:
  • The Internet is a decentralized medium that unlocked the world's potential for sharing and acquiring knowledge.
  • Open source is a decentralized collaboration model that unlocked the world's potential for innovation.
  • And similarly, blockchain can complement open source and become the decentralized open source subsidization model.
Follow me on twitter for other posts in this space.

About Me