A/B Testing with Argo Rollouts Experiments for Progressive Delivery
This article explains how to perform data‑driven A/B testing in progressive delivery using Argo Rollouts Experiments on Kubernetes, covering the concepts, required resources, YAML manifests, command‑line steps, and the benefits of combining A/B tests with canary deployments.
Traditional software releases give engineers little control after deployment, but progressive delivery and A/B testing let teams evaluate multiple versions and roll back quickly. This guide introduces A/B testing, its role in progressive delivery, and demonstrates how to use Argo Rollouts Experiments to run data‑driven tests.
A/B Testing 101
A/B testing compares two application versions by serving each to a different user group and collecting metrics to decide which performs better. It enables data‑driven decisions rather than opinion‑based changes, allowing continuous improvement of even small features such as button colors, where individual experiments have reported conversion‑rate lifts of as much as 21%.
Data‑Driven Progressive Delivery
In progressive delivery, two deployments can be tested simultaneously. Teams define metrics (e.g., latency, click‑through, bounce rate) or external indicators to choose the version to promote. Using A/B testing makes the rollout process more resilient and faster.
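As a sketch of what a metric definition for such a decision can look like, the AnalysisTemplate below gates promotion on p95 latency. It is illustrative only: the Prometheus address, metric name, and 500 ms threshold are assumptions, not part of this guide's demo.

```yaml
# Hypothetical AnalysisTemplate: fail the analysis if p95 latency
# for the canary exceeds 500 ms. The Prometheus address and the
# histogram metric name are assumptions for illustration.
apiVersion: argoproj.io/v1alpha1
kind: AnalysisTemplate
metadata:
  name: p95-latency
spec:
  metrics:
  - name: p95-latency
    interval: 30s              # re-evaluate every 30 seconds
    failureLimit: 3            # three failed measurements abort the rollout
    successCondition: result[0] <= 0.5
    provider:
      prometheus:
        address: http://prometheus.monitoring:9090
        query: |
          histogram_quantile(0.95,
            sum(rate(http_request_duration_seconds_bucket{app="rollout-experiment"}[5m])) by (le))
```

With the Prometheus provider, `result` is the vector returned by the query, so `result[0]` compares the first sample against the threshold.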
Using Argo Rollouts Experiments for A/B Testing
Argo Rollouts is a Kubernetes controller offering advanced deployment strategies like Canary, Blue‑Green, and Experiments. An Experiment CRD runs analysis on ReplicaSets and decides rollout progression based on the results.
Experiment: Runs on ReplicaSets for analysis; can be time‑bounded or run indefinitely.
AnalysisTemplate: Defines how analysis is performed (frequency, duration, success criteria).
AnalysisRun: Executes the template and reports success or failure, influencing the rollout.
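An Experiment can also be created directly, outside of a Rollout. A minimal standalone sketch follows; the names and image are placeholders for illustration, not part of the demo used later in this guide.

```yaml
# Hypothetical standalone Experiment: runs one ReplicaSet for 5 minutes
# and attaches an analysis. Names and image are illustrative.
apiVersion: argoproj.io/v1alpha1
kind: Experiment
metadata:
  name: example-experiment
spec:
  duration: 5m                  # omit to run until explicitly terminated
  templates:
  - name: baseline
    replicas: 1
    selector:
      matchLabels:
        app: example-experiment
    template:
      metadata:
        labels:
          app: example-experiment
      spec:
        containers:
        - name: app
          image: nginx:1.25     # placeholder image
  analyses:
  - name: example-analysis
    templateName: webmetric     # an AnalysisTemplate with this name must exist
```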
Setting Up Argo Rollouts Controller
Create a namespace and install the controller:
kubectl create namespace argo-rollouts
kubectl apply -n argo-rollouts -f https://github.com/argoproj/argo-rollouts/releases/latest/download/install.yaml
Install the kubectl argo rollouts plugin to interact with the controller:
curl -LO https://github.com/argoproj/argo-rollouts/releases/latest/download/kubectl-argo-rollouts-linux-amd64
chmod +x ./kubectl-argo-rollouts-linux-amd64
sudo mv ./kubectl-argo-rollouts-linux-amd64 /usr/local/bin/kubectl-argo-rollouts
kubectl argo rollouts version
Argo Rollouts Canary Deployment with Experiment
Define a rollout.yaml that includes a Canary strategy and an Experiment step, plus a Service and Ingress:
apiVersion: argoproj.io/v1alpha1
kind: Rollout
metadata:
  name: rollout-experiment
spec:
  replicas: 2
  strategy:
    canary:
      steps:
      - setWeight: 20
      - pause: {duration: 10}
      - experiment:
          duration: 5m
          templates:
          - name: canary
            specRef: canary
          analyses:
          - name: canary-experiment
            templateName: webmetric
      - setWeight: 40
      - pause: {duration: 10}
      - setWeight: 60
      - pause: {duration: 10}
      - setWeight: 80
      - pause: {duration: 10}
  revisionHistoryLimit: 2
  selector:
    matchLabels:
      app: rollout-experiment
  template:
    metadata:
      labels:
        app: rollout-experiment
    spec:
      containers:
      - name: rollouts-demo
        image: docker.io/atulinfracloud/weathersample:v1
        imagePullPolicy: Always
        ports:
        - containerPort: 5000
---
apiVersion: v1
kind: Service
metadata:
  name: rollout-weather-svc
spec:
  selector:
    app: rollout-experiment
  ports:
  - protocol: "TCP"
    port: 80
    targetPort: 5000
  type: NodePort
---
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  name: rollout-ingress
  annotations:
    kubernetes.io/ingress.class: nginx
spec:
  rules:
  - http:
      paths:
      - path: /
        pathType: Prefix
        backend:
          service:
            name: rollout-weather-svc
            port:
              number: 80
The Experiment runs for 5 minutes, creates a canary ReplicaSet, and references an AnalysisTemplate named webmetric:
apiVersion: argoproj.io/v1alpha1
kind: AnalysisTemplate
metadata:
  name: webmetric
spec:
  metrics:
  - name: webmetric
    successCondition: result.completed == true
    provider:
      web:
        url: "https://jsonplaceholder.typicode.com/todos/4"
        timeoutSeconds: 10
The template checks the completed field of the JSON response; only when it is true does the AnalysisRun succeed, allowing the rollout to continue.
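The web provider can also poll repeatedly instead of taking a single measurement. A variation of the same template is sketched below; the interval, count, failure limit, and jsonPath values are illustrative assumptions, not part of the original demo.

```yaml
# Hypothetical polling variant of the webmetric template.
apiVersion: argoproj.io/v1alpha1
kind: AnalysisTemplate
metadata:
  name: webmetric-polling
spec:
  metrics:
  - name: webmetric-polling
    interval: 30s                 # take a measurement every 30 seconds
    count: 10                     # ten measurements over a 5-minute experiment
    failureLimit: 2               # two failed measurements fail the analysis
    successCondition: result == true
    provider:
      web:
        url: "https://jsonplaceholder.typicode.com/todos/4"
        jsonPath: "{$.completed}" # extract only the completed field
        timeoutSeconds: 10
```

Extracting the field with jsonPath lets the successCondition compare `result` directly instead of navigating the full JSON body.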
Deploying the Resources
Apply the AnalysisTemplate first, then the rollout:
kubectl apply -f analysis.yaml
kubectl apply -f rollout.yaml
Verify the rollout status:
kubectl argo rollouts get rollout rollout-experiment
Updating the Application
Trigger a new image version with:
kubectl argo rollouts set image rollout-experiment rollouts-demo=docker.io/atulinfracloud/weathersample:v2
This starts the Canary rollout, launches the Experiment, runs the AnalysisRun for the configured duration, and proceeds only if the analysis succeeds.
Benefits of A/B Testing in Progressive Delivery
Run Experiments for a fixed period or indefinitely before promoting.
Automatically advance deployments based on defined metrics and success criteria.
Faster disaster recovery: a failed Experiment halts the rollout.
Conclusion
By combining A/B testing with Argo Rollouts Experiments and Canary deployments, teams can execute custom, data‑driven tests, automatically decide which version to promote, and achieve more resilient and faster releases.
DevOps Cloud Academy
Exploring industry DevOps practices and technical expertise.