How the Social Giant ShareChat Plans to Save Millions While Reducing Engineering Effort With CAST AI

“Any modern company which is running on Kubernetes should try CAST AI, no matter the scale of their infrastructure. The platform combines many functionalities for which one would need separate products.”

Company size

2500+ employees

Industry

Social media 

Headquarters

Bengaluru, India

Cloud services used

Google Kubernetes Engine (GKE)

A social media platform built for scale and speed 

ShareChat, India’s largest homegrown social media company, runs over 90% of its infrastructure on Kubernetes and is India’s largest Google Cloud Platform customer. The company automatically scales to serve almost 7 billion web requests daily, delivering a great user experience.

Kubernetes is a key component, critical to all our operations. With the volume of users and usage, we need to scale infrastructure fast. Our application and service deployments should be quick, and Kubernetes helps us achieve that. Scalability and extensibility are the key reasons why we chose Kubernetes.

Jenson C S, Senior Engineering Manager at ShareChat

Counting more than 400 million monthly active users across its brands (ShareChat, Moj, and Moj Lite+), ShareChat has reached a valuation of $5 billion.

Slashing several million dollars off the annual cloud bill

ShareChat was looking for a third-party solution that would help optimize the company’s massive Kubernetes deployment, streamline autoscaling capabilities, and improve resource usage. That is why the company implemented CAST AI.

ShareChat uses a feature leading to instant cloud cost savings: rebalancing, where CAST AI replaces suboptimal nodes with new ones and moves the workloads automatically to help clusters quickly reach an optimal state.

We use rebalancing extensively. That way our infrastructure stays optimized, thanks to the right configurations and choice of the right machine types, saving us considerable resources. We are already witnessing significant savings and expect to achieve more than a million dollars in annual savings.

Jenson C S, Senior Engineering Manager at ShareChat
Get results like ShareChat – book a demo with CAST AI now

Bringing flexibility to Kubernetes node pools

One of the main pain points ShareChat’s DevOps team experienced was optimizing node pools. Kubernetes provides functionality like autoscaling, but dynamic, flexible node pool provisioning is missing. With applications changing dynamically and developer requests varying widely, a predefined node pool size can lead to availability issues if not properly configured.

“We don’t want to stick with one machine type, size, or family, we need different types to run and scale our platform. For example, we wanted to be able to choose the percentage of nodes that would run on-demand and spot instances for different types of workloads. CAST AI offers us this flexibility,” said Jenson.
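For context, here is what steering a workload onto Spot VMs looks like on plain GKE, using the standard node label and a toleration. This is a generic Kubernetes sketch of the manual, per-workload approach that CAST AI automates across node pools; it is not CAST AI’s configuration, and the deployment name and image are illustrative:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: feed-ranker        # illustrative workload name
spec:
  replicas: 3
  selector:
    matchLabels:
      app: feed-ranker
  template:
    metadata:
      labels:
        app: feed-ranker
    spec:
      # Pin pods to GKE Spot VMs via the standard node label.
      nodeSelector:
        cloud.google.com/gke-spot: "true"
      # Tolerate the taint teams commonly add to Spot node pools
      # to keep regular workloads off them (GKE does not add it
      # automatically).
      tolerations:
      - key: cloud.google.com/gke-spot
        operator: Equal
        value: "true"
        effect: NoSchedule
      containers:
      - name: app
        image: gcr.io/example/feed-ranker:latest  # illustrative image
```

Done by hand, this selector-and-taint bookkeeping has to be repeated for every workload and node pool; splitting a workload across a chosen percentage of Spot and on-demand capacity requires still more manual setup, which is the gap Jenson describes.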

Reducing the engineering workload

The ShareChat team no longer needs to carry out quota management internally since CAST AI has the flexibility to choose the next available quota. Before implementing CAST AI, ShareChat found this part of its cloud management hectic and time-consuming. “There’s no babysitting with the quotas anymore. This task is completely gone from our team schedule,” added Jenson C S. 

Another benefit of implementing CAST AI was the increase in the speed of internal support. “We have an internal support tool. When I compare it before and after CAST AI, at least 10 tickets per week for node pool management and customization are gone. The main advantage is now developers no longer need to wait for infrastructure provisioning – we don’t want to be a blocker for shipping features, so that’s great,” said Jenson C S. 

For the infrastructure and DevOps teams, Kubernetes management effort such as manual rightsizing, node pool creation, and upgrades is drastically reduced, so the DevOps team can focus on building products that improve developer productivity.

Jenson C S, Senior Engineering Manager at ShareChat

All the cost-saving features under one roof

Over the course of implementing and using the platform, ShareChat has benefited from CAST AI’s continual expansion in feature scope at no additional cost.

Any modern company which is running on Kubernetes should try CAST AI, no matter the scale of their infrastructure. The platform combines many functionalities for which one would need separate products.

Jenson C S, Senior Engineering Manager at ShareChat

Outstanding support and partnership with a long-term outlook

While developing its relationship with CAST AI, ShareChat has become the platform’s design partner for several features, leading to many improvements and the delivery of a stable solution.

“The improvements to CAST AI over the last couple of months have been amazing. The support team at CAST AI is really helpful and friendly. They’re always available and ready to improve their product,” said Jenson.

Overall, it was a wonderful experience with CAST AI. Personally, I’d recommend CAST AI to anyone – it’s one of the best vendors we’ve interacted with in terms of relationship and support. The platform was able to show its value in under one month.

Jenson C S, Senior Engineering Manager at ShareChat
CAST AI features used
  • Spot instance automation
  • Real-time autoscaling
  • Instant rebalancing
  • Full cost visibility

You’re on your way to simplifying Kubernetes

  • No more complexity of Kubernetes management
  • 50%+ lower cloud costs without repetitive tasks
  • Predictable cloud bills and performance at all times


Book a demo

Find out how much you can save on your cloud bill.
