At the DockerCon 2019 conference, Rafay Systems promised to make available implementations of several Kubernetes controllers, operators and custom resource definitions (CRDs) as open source code.
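For readers unfamiliar with CRDs: a custom resource definition extends the Kubernetes API with a new resource type that a controller or operator can then reconcile. The sketch below shows a minimal CRD manifest expressed as a Python dict; the group and kind names (`example.rafay.dev`, `ClusterPolicy`) are hypothetical illustrations, not Rafay's actual resources.

```python
# A minimal, hypothetical CustomResourceDefinition manifest, shown as the
# Python dict a Kubernetes client would submit. All names here are
# illustrative only -- they are not Rafay's actual CRDs.
crd = {
    "apiVersion": "apiextensions.k8s.io/v1beta1",  # the API version current in 2019
    "kind": "CustomResourceDefinition",
    "metadata": {"name": "clusterpolicies.example.rafay.dev"},
    "spec": {
        "group": "example.rafay.dev",
        "versions": [{"name": "v1alpha1", "served": True, "storage": True}],
        "scope": "Namespaced",
        "names": {
            "plural": "clusterpolicies",
            "singular": "clusterpolicy",
            "kind": "ClusterPolicy",
        },
    },
}
```

Once a manifest like this is applied (for example, via `kubectl apply`), the API server accepts objects of the new kind, and an operator watches and reconciles them.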

Rafay Systems CEO Haseeb Budhani says the company developed these tools and utilities as part of a software-as-a-service (SaaS) application that abstracts away the complexity of deploying and managing Kubernetes clusters running on-premises or in any public cloud.

Via a SaaS application accessed through a graphical user interface, IT administrators can operationalize Kubernetes without having to manage Kubernetes primitives on an ongoing basis, says Budhani. The Rafay platform automates the distribution, operations, cross-region scaling and lifecycle management of containerized microservices on an end-to-end basis; all the IT operations team needs to do is express its intent in terms of where it wants a Kubernetes cluster to be deployed, he says. The tools and utilities provided by Rafay Systems make it easier to connect a vanilla Kubernetes cluster to the Rafay service.

One of the issues slowing down adoption of Kubernetes is that not only can it be challenging to set up, but ongoing management of the platform also requires a level of skill most IT teams don’t have. IT vendors have compounded that problem by making separate tools available to configure and manage Kubernetes clusters: each time an IT team wants to spin up a new cluster, it has to fire up a separate tool. Rafay is making the case for a Kubernetes management service designed from the ground up to both deploy and manage Kubernetes within the same workflow.

Rafay leverages Kubernetes to orchestrate containers. Each Rafay instance is self-hosted and relies on a central scheduler to make global decisions about where to run each container. Developers can deploy containerized applications without having to manage or operate multiple container clusters across geographies, notes Budhani.
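The article does not describe Rafay's scheduling algorithm, but the idea of a central scheduler making global placement decisions can be sketched as a toy model: for each container, pick a region that satisfies its latency requirement and has enough free capacity.

```python
# Toy model of a central scheduler's global placement decision.
# This is an illustrative sketch only, not Rafay's actual algorithm;
# the region data and field names are invented for the example.

def place(container, regions):
    """Return the name of the best region for a container, or None."""
    candidates = [
        r for r in regions
        if r["latency_ms"] <= container["max_latency_ms"]
        and r["free_cpu"] >= container["cpu"]
    ]
    if not candidates:
        return None
    # Prefer the region with the most headroom left after placement.
    best = max(candidates, key=lambda r: r["free_cpu"] - container["cpu"])
    return best["name"]

regions = [
    {"name": "us-east", "latency_ms": 20, "free_cpu": 8},
    {"name": "eu-west", "latency_ms": 90, "free_cpu": 16},
]
container = {"cpu": 4, "max_latency_ms": 50}
print(place(container, regions))  # -> us-east
```

The point of the sketch is the division of labor: developers state requirements, and a single global control plane decides placement, so no one has to operate per-region clusters by hand.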

The Rafay Systems platform includes built-in pipelines to distribute container images, application configurations, cryptographic artifacts and other required data sets. It also has built-in pipelines to collect application logs and metrics, along with system and application health data. Finally, the Rafay platform exposes application programming interfaces (APIs) that make it easier to distribute data in a consistent fashion. Developers can choose from several synchronization models depending on the application’s needs.
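The article does not name the synchronization models, so the following is a hypothetical sketch of what choosing a model per artifact type might look like: small, security-critical artifacts are pushed eagerly to every location, while large container images are pulled on first use. The model names and the selection rule are assumptions for illustration.

```python
# Hypothetical sketch of picking a synchronization model per artifact.
# The models ("eager" push vs. "lazy" pull-and-cache) and the rule below
# are illustrative assumptions, not Rafay's documented behavior.

SYNC_MODELS = {
    "eager": "push the artifact to every edge location immediately on publish",
    "lazy": "fetch the artifact on first request at each edge, then cache it",
}

def choose_model(artifact):
    # Configs and crypto material are small and must be in place before
    # the application starts, so push them eagerly; large container
    # images are pulled on demand to avoid wasted transfer.
    if artifact["kind"] in ("config", "crypto"):
        return "eager"
    return "lazy"

print(choose_model({"kind": "crypto", "size_mb": 0.01}))  # -> eager
print(choose_model({"kind": "image", "size_mb": 450}))    # -> lazy
```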

Budhani says Rafay Systems is applying to Kubernetes many of the concepts that providers of content delivery networks employ to update thousands of distributed nodes.

Rafay Systems is trying to give back to the Kubernetes community by offering some of the tools it has developed as open source code. But the company is betting most organizations will be a lot more interested in accessing the capabilities those tools enable via a SaaS application.

It’s too early to say what impact tools that substantially reduce operational overhead will have on spurring increased Kubernetes adoption. Many large enterprise IT organizations have already deployed Kubernetes in a production environment. However, as a percentage of the overall IT environment, the number of workloads running on Kubernetes remains small. By making it easier to operationalize Kubernetes, services such as Rafay’s are likely to increase the rate at which containerized applications are deployed on it as resistance from IT operations teams starts to decline.
