
Robusta KRR: Kubernetes Resource Recommender – Features, How It Works, and Installation Guide

Robusta KRR is a local CLI tool that gathers pod metrics from Prometheus and recommends CPU and memory requests and limits. It supports custom strategies, installs via Homebrew or from source, and can help Kubernetes clusters cut cloud costs by up to 69%.

DevOps Cloud Academy

Robusta KRR (Kubernetes Resource Recommender) is a command‑line tool that runs on a local machine, collects pod usage data from Prometheus, and suggests CPU and memory requests and limits to dramatically lower cost and improve performance.

Features

No sidecar: KRR runs as a local CLI and does not require pods in the cluster.

Prometheus integration: built‑in Prometheus queries collect resource usage; custom queries are coming soon.

Extensible policies: you can easily create and use your own strategies to compute recommendations.

Future support: upcoming versions will handle custom resources (e.g., GPU) and custom metrics.

According to a recent Sysdig study (https://sysdig.com/blog/millions-wasted-kubernetes/), Kubernetes clusters have on average 69% unused CPU and 18% unused memory. Using KRR to right-size containers can therefore save roughly 69% of cloud costs.
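To make that savings claim concrete, here is a back-of-the-envelope calculation; the core count and price are entirely hypothetical:

```python
# Hypothetical illustration of how unused CPU translates into savings.
requested_cores = 100       # what workloads currently request
unused_fraction = 0.69      # Sysdig's average unused-CPU figure
cost_per_core_month = 30.0  # made-up cloud price, USD per core per month

used_cores = requested_cores * (1 - unused_fraction)  # ~31 cores actually used
current_bill = requested_cores * cost_per_core_month  # ~$3000/month
right_sized_bill = used_cores * cost_per_core_month   # ~$930/month
savings = current_bill - right_sized_bill             # ~$2070/month, i.e. 69%
```

In practice you would keep some headroom above observed usage (as KRR's algorithm does), so real savings land somewhat below the raw unused fraction.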

Starting with version v0.10.15, Robusta SaaS users can view all recommendations (including historical ones) and filter or sort them by cluster, namespace, or name.

How It Works

Metric Collection

Robusta KRR uses the following Prometheus queries to gather usage data:

CPU usage:

sum(irate(container_cpu_usage_seconds_total{namespace="{object.namespace}", pod="{pod}", container="{object.container}"}[{step}]))

Memory usage:

sum(container_memory_working_set_bytes{job="kubelet", metrics_path="/metrics/cadvisor", image!="", namespace="{object.namespace}", pod="{pod}", container="{object.container}"})
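The placeholders in braces are filled in per workload at query time: {object.namespace} and {object.container} come from the Kubernetes object being scanned, {pod} is expanded for each of its pods, and {step} is the query resolution. Filled in for a hypothetical workload (namespace default, pod web-7d9f8b6c4-abcde, container app, 5m step), the CPU query would read:

```
sum(irate(container_cpu_usage_seconds_total{namespace="default", pod="web-7d9f8b6c4-abcde", container="app"}[5m]))
```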

Algorithm

By default KRR applies a simple policy:

CPU: set the request to the 99th percentile with no limit, allowing occasional bursts.

Memory: use the maximum observed value over the past week plus a 15% buffer.
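The default policy above can be sketched in a few lines of Python. This is an illustrative re-implementation, not KRR's actual code: the nearest-rank percentile, the helper names, and the buffer parameter are all assumptions made for the sketch.

```python
import math

def percentile(samples: list[float], q: float) -> float:
    """Nearest-rank percentile of a non-empty sample list (0 < q <= 100)."""
    ordered = sorted(samples)
    rank = max(1, math.ceil(q / 100 * len(ordered)))
    return ordered[rank - 1]

def simple_recommendation(cpu_samples: list[float],
                          mem_samples: list[float],
                          mem_buffer: float = 0.15) -> dict:
    """CPU request = 99th percentile of usage, no limit (bursts allowed);
    memory request and limit = observed max plus a configurable buffer."""
    mem_value = max(mem_samples) * (1 + mem_buffer)
    return {
        "cpu": {"request": percentile(cpu_samples, 99), "limit": None},
        "memory": {"request": mem_value, "limit": mem_value},
    }

# Example with made-up usage samples (CPU in cores, memory in bytes):
rec = simple_recommendation([0.10, 0.15, 0.20, 0.90], [100e6, 120e6, 110e6])
```

Setting no CPU limit avoids throttling during bursts, while pinning the memory limit to the request prevents a single container from growing unbounded.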

Installation & Usage

macOS/Linux users can install KRR with Homebrew in one step:

brew tap robusta-dev/homebrew-krr
brew install krr

Verify the installation:

krr --help  # first run may take a while

For manual installation, ensure Python 3.9+ is present, then clone the repository:

git clone https://github.com/robusta-dev/krr
cd krr

Install dependencies:

pip install -r requirements.txt

Run the tool:

python krr.py --help

All examples use the krr command; if you installed from source, replace it with python krr.py.

Basic usage examples:

krr simple                                        # scan all namespaces in the current cluster
krr simple -n default -n ingress-nginx            # restrict the scan to specific namespaces
krr simple -c my-cluster-1 -c my-cluster-2        # scan specific clusters
krr simple --logtostderr -f json > result.json    # write a JSON report (logs go to stderr)
krr simple --logtostderr -f yaml > result.yaml    # write a YAML report
krr simple -v                                     # verbose output

If Prometheus is not auto‑discovered, pass its URL explicitly with -p:

krr simple -p http://127.0.0.1:9090

You can also forward a Prometheus pod locally:

kubectl port-forward pod/kube-prometheus-st-prometheus-0 9090

Creating a custom strategy is straightforward; the example below registers a strategy that uses user‑provided parameters for CPU and memory:

# This is an example on how to create your own custom strategy

import pydantic as pd
import robusta_krr
from robusta_krr.api.models import HistoryData, K8sObjectData, ResourceRecommendation, ResourceType, RunResult
from robusta_krr.api.strategies import BaseStrategy, StrategySettings

class CustomStrategySettings(StrategySettings):
    param_1: float = pd.Field(99, gt=0, description="First example parameter")
    param_2: float = pd.Field(105_000, gt=0, description="Second example parameter")

class CustomStrategy(BaseStrategy[CustomStrategySettings]):
    """
    A custom strategy that uses the provided parameters for CPU and memory.
    Made only in order to demonstrate how to create a custom strategy.
    """

    def run(self, history_data: HistoryData, object_data: K8sObjectData) -> RunResult:
        return {
            ResourceType.CPU: ResourceRecommendation(request=self.settings.param_1, limit=None),
            ResourceType.Memory: ResourceRecommendation(request=self.settings.param_2, limit=self.settings.param_2),
        }

# Running this file will register the strategy and make it available to the CLI
# Run it as `python ./custom_strategy.py my_strategy`
if __name__ == "__main__":
    robusta_krr.run()

GitHub repository: https://github.com/robusta-dev/krr
Written by

DevOps Cloud Academy

Exploring industry DevOps practices and technical expertise.
