A/B Testing with Argo Rollouts Experiments for Progressive Delivery
This article explains how to perform data‑driven A/B testing in progressive delivery using Argo Rollouts Experiments, covering the concepts of progressive delivery, A/B testing fundamentals, the Argo Rollouts architecture, required Kubernetes resources, and step‑by‑step commands and YAML manifests for a weather‑app example.
Traditionally, when an application runs into problems after an upgrade, engineering teams are left with little recourse because they have minimal control over the application once it is released. With the emergence of progressive delivery strategies, teams gain finer control over their releases and can quickly roll back to a previous version if issues arise.
Most deployment strategies (such as Canary and Blue-Green) let us roll back application versions while maintaining different versions side by side. However, before releasing to a larger audience, we need a way to determine which version performs better.
A/B Testing 101
A/B testing, also known as split testing, is the process of comparing and evaluating two different versions of an application to determine which performs better. Different versions are shown to different user groups, and metrics are collected for statistical analysis to help decide which version wins.
One of the main reasons A/B testing is popular is that it allows teams to change their product based on data rather than opinion. While teams may have a vision, user feedback shapes the product's future, enabling continuous improvement.
This method is widely used to measure landing page and application performance. Using A/B testing, teams can make design decisions and understand how user-experience changes affect performance. Even a change as simple as a button's color has been shown to increase conversion rates by 21%.
Data‑Driven Progressive Delivery with A/B Testing
In progressive delivery, you run two deployments side by side and can test both to determine which version is better. Before deciding which one to keep, you can run long-running experiments against the different application versions.
The tests and metrics you collect depend entirely on your use case. You can run tests against the application version itself, or use external metrics to decide the final version. Teams typically use latency, click-through rate, and bounce rate to choose the version to promote to production.
Using A/B testing in progressive delivery makes the process more resilient and faster, and helps you determine the best user experience based on data.
Using Argo Rollouts Experiments for A/B Testing
Argo Rollouts is a Kubernetes controller and a set of CRDs that provide advanced deployment strategies such as Canary, Blue-Green, and Experiments. Like a Deployment, it creates ReplicaSets and manages their creation and deletion based on the spec.template defined in the Rollout object.
An Argo Rollouts Experiment is a Kubernetes CRD that allows us to perform A/B testing and Kayenta-style analysis. Experiment-based analysis involves three components:
Experiment: A time-bounded run of one or more ReplicaSets created for analysis. It can run for a predefined duration or indefinitely until stopped; if no duration is set, it runs until all analyses marked requiredForCompletion finish. It references an AnalysisTemplate to create baseline and canary ReplicaSets and compare their metrics.
AnalysisTemplate: Defines how analysis is performed, including frequency, duration, and the success/failure criteria used to determine whether the experiment succeeds.
AnalysisRun: An instantiation of an AnalysisTemplate, analogous to a Job. AnalysisRuns are classified as Successful, Failed, or Inconclusive, and the result decides whether the Rollout update proceeds.
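As an illustration of how these pieces fit together, a standalone Experiment (one created directly, outside of any Rollout) might be sketched as follows. The metadata names, labels, and image tag below are hypothetical placeholders, and the webmetric AnalysisTemplate is assumed to exist in the same namespace:

```yaml
apiVersion: argoproj.io/v1alpha1
kind: Experiment
metadata:
  name: example-experiment
spec:
  # run for at most 5 minutes, then evaluate the analyses
  duration: 5m
  templates:
  - name: candidate
    selector:
      matchLabels:
        app: example-app
    template:
      metadata:
        labels:
          app: example-app
      spec:
        containers:
        - name: example-app
          image: example/app:v2
  analyses:
  # reference an AnalysisTemplate; requiredForCompletion makes the
  # Experiment wait for this analysis to finish before concluding
  - name: success-check
    templateName: webmetric
    requiredForCompletion: true
```

When the Experiment is embedded in a Rollout's canary steps, as in the example later in this article, the controller generates an equivalent object automatically.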
Implementing Argo Rollouts Experiment
First, you need a Kubernetes cluster (e.g., Minikube with the ingress addon enabled). Clone the example repository:
git clone https://github.com/infracloudio/ArgoRollouts-ABTesting-WeatherExample.git
Install the Argo Rollouts controller:
kubectl create namespace argo-rollouts
kubectl apply -n argo-rollouts -f https://github.com/argoproj/argo-rollouts/releases/latest/download/install.yaml
Verify that all components are running:
kubectl get all -n argo-rollouts
Install the Argo Rollouts kubectl plugin to interact with the controller:
curl -LO https://github.com/argoproj/argo-rollouts/releases/latest/download/kubectl-argo-rollouts-linux-amd64
chmod +x ./kubectl-argo-rollouts-linux-amd64
sudo mv ./kubectl-argo-rollouts-linux-amd64 /usr/local/bin/kubectl-argo-rollouts
kubectl argo rollouts version
Argo Rollouts Canary Deployment
We will create a rollout.yaml that defines a Canary strategy with an embedded Experiment, and an AnalysisTemplate that calls a simple JSON API.
apiVersion: argoproj.io/v1alpha1
kind: Rollout
metadata:
  name: rollout-experiment
spec:
  replicas: 2
  strategy:
    canary:
      steps:
      - setWeight: 20
      - pause: {duration: 10}
      - experiment:
          duration: 5m
          templates:
          - name: canary
            specRef: canary
          analyses:
          - name: canary-experiment
            templateName: webmetric
      - setWeight: 40
      - pause: {duration: 10}
      - setWeight: 60
      - pause: {duration: 10}
      - setWeight: 80
      - pause: {duration: 10}
  revisionHistoryLimit: 2
  selector:
    matchLabels:
      app: rollout-experiment
  template:
    metadata:
      labels:
        app: rollout-experiment
    spec:
      containers:
      - name: rollouts-demo
        image: docker.io/atulinfracloud/weathersample:v1
        imagePullPolicy: Always
        ports:
        - containerPort: 5000
---
apiVersion: v1
kind: Service
metadata:
  name: rollout-weather-svc
spec:
  selector:
    app: rollout-experiment
  ports:
  - protocol: "TCP"
    port: 80
    targetPort: 5000
  type: NodePort
---
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  name: rollout-ingress
  annotations:
    kubernetes.io/ingress.class: nginx
spec:
  rules:
  - http:
      paths:
      - path: /
        pathType: Prefix
        backend:
          service:
            name: rollout-weather-svc
            port:
              number: 80
The AnalysisTemplate named webmetric queries https://jsonplaceholder.typicode.com/todos/4 and succeeds when the JSON field completed == true:
apiVersion: argoproj.io/v1alpha1
kind: AnalysisTemplate
metadata:
  name: webmetric
spec:
  metrics:
  - name: webmetric
    successCondition: result.completed == true
    provider:
      web:
        url: "https://jsonplaceholder.typicode.com/todos/4"
        timeoutSeconds: 10
Apply the resources:
kubectl apply -f analysis.yaml
kubectl apply -f rollout.yaml
Check the rollout status:
kubectl argo rollouts get rollout rollout-experiment
To update the image and trigger the Canary rollout with the Experiment:
kubectl argo rollouts set image rollout-experiment rollouts-demo=docker.io/atulinfracloud/weathersample:v2
During the update, the following occurs:
The Canary deployment starts with 20% of the traffic weight.
The Experiment launches for its 5-minute duration and creates an AnalysisRun from the webmetric AnalysisTemplate.
The AnalysisRun executes until completion.
If the AnalysisRun succeeds, the Experiment succeeds after 5 minutes and the rollout proceeds through the remaining steps to version v2.
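While the Experiment and the subsequent pause steps are in progress, you can also intervene manually with the plugin. These commands act on the rollout-experiment Rollout created above and require access to the cluster:

```shell
# skip the current pause and move the rollout to the next step
kubectl argo rollouts promote rollout-experiment

# abort the update and scale the stable ReplicaSet back up
kubectl argo rollouts abort rollout-experiment

# after an abort, retry the update from the beginning
kubectl argo rollouts retry rollout rollout-experiment
```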
You can view the Argo Rollouts dashboard with:
kubectl argo rollouts dashboard
Access the dashboard at http://localhost:3100 while the Experiment is running to see its real-time status.
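The webmetric template used in this walkthrough polls a static JSON endpoint, which is convenient for a demo. In a real deployment you would more likely query a metrics backend. As a sketch, assuming a Prometheus instance is reachable in the cluster at the address shown and that an http_requests_total metric is exported for the service (both assumptions, not part of the example repository), an AnalysisTemplate using the Prometheus provider could look like this:

```yaml
apiVersion: argoproj.io/v1alpha1
kind: AnalysisTemplate
metadata:
  name: success-rate
spec:
  metrics:
  - name: success-rate
    # re-evaluate the query every minute while the analysis runs
    interval: 1m
    # the Prometheus provider returns a vector of samples; succeed when
    # at least 95% of requests over the last 5 minutes were non-5xx
    successCondition: result[0] >= 0.95
    provider:
      prometheus:
        address: http://prometheus.monitoring.svc.cluster.local:9090
        query: |
          sum(rate(http_requests_total{service="rollout-weather-svc",code!~"5.."}[5m]))
          /
          sum(rate(http_requests_total{service="rollout-weather-svc"}[5m]))
```

To use it, reference success-rate instead of webmetric in the experiment step's analyses list.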
A/B Testing Benefits in Progressive Delivery
Run Experiments for a defined duration or indefinitely before promotion.
Automatically advance deployments based on defined metrics and success criteria.
Faster recovery from failures: if an Experiment fails, the rollout stops instead of continuing to promote a bad version.
Conclusion
Progressive delivery and A/B testing enable teams to execute custom experiments and deploy applications faster. This article demonstrated how to use Argo Rollouts Experiments to perform A/B testing via a Canary deployment, covering AnalysisTemplates, AnalysisRuns, and how their outcomes influence the rollout process.
DevOps Cloud Academy
Exploring industry DevOps practices and technical expertise.