
Specifically, the report indicated that investments in open source and cloud-native companies grew to $53.4 billion so far in 2021, up from $12.4 billion in 2016. The scale of capital absorbed by tech companies in general is staggering, and this data point underscores just how much of that money is flowing into open source-based companies in particular.


HashiCorp, the cloud infrastructure unicorn, presents an interesting mix of open source and proprietary code, with recurring revenue and a nascent hosted product.

It has built a suite of tools that help other companies provision and manage cloud infrastructure (Terraform), keep applications and their data secure (Boundary, Vault), handle service networking on a granular level (Consul), and deal with orchestration and cross-platform deployment (Nomad, Waypoint).
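For a sense of how these tools are consumed in practice, here is a minimal sketch that writes and reads a secret in Vault using the community-maintained hvac Python client; the server address, token and secret path are placeholder assumptions for a local development setup, not anything from the filing.

```python
# Minimal sketch: storing and reading a secret in HashiCorp Vault via the
# community hvac client. URL, token and path are placeholders for a dev setup.
import hvac

client = hvac.Client(url="http://127.0.0.1:8200", token="dev-only-token")
assert client.is_authenticated()

# Write credentials to the KV v2 secrets engine (mounted at "secret/" by default).
client.secrets.kv.v2.create_or_update_secret(
    path="myapp/db",
    secret={"username": "app", "password": "s3cr3t"},
)

# Read them back; the payload sits under data -> data in the response.
result = client.secrets.kv.v2.read_secret_version(path="myapp/db")
print(result["data"]["data"]["username"])
```

The same basic pattern repeats across the portfolio: an open source server component paired with thin client libraries, CLIs and plugins that teams wire into their own stacks.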

Per the company’s S-1 filing, it has “deliberately built [its] products using an open-core software development model.” In practical terms, that means that all HashiCorp software products are “developed as open source projects, with large communities of users, contributors, and partners collaborating on their development.”

So how does HashiCorp make money from open source software? The company sells proprietary code that it has built atop its open source products, while also offering support and a hosted version of those products.


By now, it should be clear that this money can, and should, be put to better use: enabling enterprise-wide adoption through scalable development of generalized components, as opposed to tailored point solutions limited to a single use case.

As models and ML code become more and more democratized, the unfortunate consequence is that machine learning becomes, first and foremost, an infrastructure problem: the models themselves are no longer the differentiator; the systems that run them in production are.

This calls for a paradigm shift: from a model-centric approach to a production-centric one.

Instead of throwing money at the modeling problem, start investing in infrastructure and orchestration. This requires a shift in mindset more than technological prowess.

Enterprises should take a step back, look at the big picture of their AI journey, and start thinking about a systematic way to operate many AI models within a single, robust framework.
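As a deliberately simplified illustration of what such a generalized component might look like, the sketch below registers heterogeneous models behind a single prediction interface; every class, method and field name is hypothetical rather than a reference to any particular platform.

```python
# Simplified sketch of a generalized component: one registry and one
# prediction interface shared by many models, instead of a bespoke
# integration per use case. All names here are hypothetical.
from typing import Any, Callable, Dict


class ModelRegistry:
    """Maps model names to callables that take features and return predictions."""

    def __init__(self) -> None:
        self._models: Dict[str, Callable[[Dict[str, Any]], Any]] = {}

    def register(self, name: str, predict_fn: Callable[[Dict[str, Any]], Any]) -> None:
        self._models[name] = predict_fn

    def predict(self, name: str, features: Dict[str, Any]) -> Any:
        if name not in self._models:
            raise KeyError(f"No model registered under '{name}'")
        # In a real system, shared concerns live here: logging, input
        # validation, monitoring, versioning and fallbacks.
        return self._models[name](features)


# Two very different "models" plugged into the same framework.
registry = ModelRegistry()
registry.register("churn", lambda f: 0.8 if f["support_tickets"] > 3 else 0.1)
registry.register("fraud", lambda f: f["amount"] > 10_000)

print(registry.predict("churn", {"support_tickets": 5}))  # 0.8
print(registry.predict("fraud", {"amount": 250}))          # False
```

The point is not the toy models but the shared surface: validation, monitoring and versioning are built once and reused, instead of being rebuilt for every single-use-case point solution.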