Why Some Teams Thrive with AI While Others Stall: Insights from a 5,000‑Engineer Study

A comprehensive report based on nearly 5,000 technology professionals reveals how AI tools act as amplifiers of existing team practices, outlines seven distinct team profiles, and provides data‑driven strategies for leaders to turn AI adoption into measurable performance gains.

When AI tools like ChatGPT and GitHub Copilot become standard for developers, a key question emerges: why do some teams achieve a qualitative leap with AI while others see little impact?

The Google Cloud and DORA team’s extensive report, based on nearly 5,000 technology professionals, reveals the real state of AI in software development—not a simple tool‑usage guide but a mirror exposing the truth of AI investment.

1. Data Foundations: Authority and Credibility

The report draws on an industry‑leading data set: in‑depth surveys of roughly 5,000 technology professionals and more than 100 hours of qualitative analysis, collected between June 13 and July 21, 2025. It is backed by platinum sponsors Swarmia and Thoughtworks, gold sponsors Buildkite and CodeRabbit, and core partners GitHub and GitLab.

2. Core Insight: AI Is an Amplifier, Not a Savior

In software development, AI acts as an “amplifier”: it magnifies whatever conditions a team already has. Chaotic processes and heavy technical debt get worse, while efficient workflows and strong platforms turn AI into a super‑accelerator.

The return on AI investment depends more on underlying organizational systems—platform quality, workflow clarity, and team collaboration—than on the tools themselves.

Without these foundations, AI yields only marginal productivity gains that are easily offset by downstream chaos, explaining why costly AI tools sometimes fail to deliver.

3. Key Finding: Seven Team Profiles Reveal the AI Success Code

AI Adoption Landscape: High Enthusiasm, Divergent Outcomes

[Figure: AI adoption and usage status]

Adoption rates are high, but results vary sharply, confirming the “amplifier” theory.

7 Team Profiles: Which One Is Yours?

The report clusters teams using eight key factors (team performance, product performance, delivery throughput, delivery instability, individual effectiveness, valuable work, friction, burnout) and identifies seven distinct types.
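To make the clustering idea concrete, here is a minimal sketch (not the report's actual method) of assigning a team to the nearest profile centroid across the eight factors. The factor names and profile centroids are illustrative placeholders, not values from the report:

```python
import math

# The eight clustering factors named in the report (illustrative keys).
FACTORS = ["team_performance", "product_performance", "throughput",
           "instability", "effectiveness", "valuable_work",
           "friction", "burnout"]

def nearest_profile(team: dict, profiles: dict) -> str:
    """Assign a team (factor -> normalized score) to the profile
    whose centroid is closest by Euclidean distance."""
    def dist(a: dict, b: dict) -> float:
        return math.sqrt(sum((a[f] - b[f]) ** 2 for f in FACTORS))
    return min(profiles, key=lambda name: dist(team, profiles[name]))
```

In practice the report derives its seven profiles from survey data; this sketch only illustrates the mechanics of matching a team against such profiles once they exist.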

[Figure: seven team profiles clustering]

These profiles serve as precise diagnostic tools for tech leaders; each type has unique traits, strengths, and challenges, guiding tailored AI strategies.

From “high‑performance but burnt‑out” to “stable but constrained by legacy systems,” each requires a different AI adoption approach, explaining why one‑size‑fits‑all rollouts often fail.

Platform Engineering: The Essential Foundation for AI Success

Platform engineering adoption reaches 94%, becoming a prerequisite for AI payoff. Organizations that treat the platform as an internal product that improves developer experience see higher AI returns.

When developers have a unified, efficient, easy‑to‑use internal platform, AI tools integrate smoothly and deliver maximum value; otherwise, duplicated effort erodes AI benefits.

Value Stream Management: AI’s “Multiplier”

Value Stream Management (VSM) acts as AI’s multiplier by visualizing, analyzing, and improving workflows, ensuring that AI‑driven productivity gains translate into measurable team and product performance improvements.

VSM’s core value lies in preventing downstream chaos, making visible the entire value stream, identifying bottlenecks, and safeguarding AI‑generated efficiency.
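As a toy illustration of the bottleneck-finding step (my own sketch, not a method from the report), one can compare the average time work items spend in each stage of the stream and flag the slowest stage. Stage names and hours below are hypothetical:

```python
from collections import defaultdict

def find_bottleneck(items: list) -> tuple:
    """items: each work item maps stage name -> hours spent in that stage.
    Returns the stage with the highest average dwell time and that average."""
    totals = defaultdict(float)
    counts = defaultdict(int)
    for item in items:
        for stage, hours in item.items():
            totals[stage] += hours
            counts[stage] += 1
    averages = {s: totals[s] / counts[s] for s in totals}
    worst = max(averages, key=averages.get)
    return worst, averages[worst]
```

Real VSM tooling works from event timestamps rather than hand-entered hours, but the principle is the same: make dwell time per stage visible, then attack the largest one.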

4. Performance Data: Debunking the Speed‑vs‑Quality Trade‑off

DORA Metrics: Scientifically Measuring Team Effectiveness

The DORA framework evaluates teams on two dimensions:

Throughput

Lead time for changes: time from code commit to production

Deployment frequency

Mean time to restore: recovery time after a failed deployment

Stability

Change failure rate: proportion of deployments that cause a failure in production and require remediation

Rework rate: proportion of unplanned fault‑fix deployments
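The throughput metrics above can be computed directly from deployment records. The sketch below is an illustrative implementation under assumed field names (`committed_at`, `deployed_at`, `failed`, `restored_at`), not the instrument DORA itself uses:

```python
from dataclasses import dataclass
from datetime import datetime
from statistics import median
from typing import Optional

@dataclass
class Deployment:
    committed_at: datetime              # when the change was committed
    deployed_at: datetime               # when it reached production
    failed: bool = False                # did it require remediation?
    restored_at: Optional[datetime] = None  # when service was restored

def dora_metrics(deploys: list, window_days: int = 30) -> dict:
    """Summarize four DORA metrics over a window of deployments."""
    lead_hours = [(d.deployed_at - d.committed_at).total_seconds() / 3600
                  for d in deploys]
    failures = [d for d in deploys if d.failed]
    restore_hours = [(d.restored_at - d.deployed_at).total_seconds() / 3600
                     for d in failures if d.restored_at]
    return {
        "deployment_frequency_per_day": len(deploys) / window_days,
        "median_lead_time_hours": median(lead_hours),
        "change_failure_rate": len(failures) / len(deploys),
        "median_time_to_restore_hours": median(restore_hours) if restore_hours else 0.0,
    }
```

Medians are used rather than means because lead and restore times are heavily skewed in most delivery data.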

Key Finding: High‑Speed, High‑Quality Delivery Is Achievable

The report shows that clusters 6 (pragmatic high‑efficiency executors) and 7 (harmonious high‑efficiency teams) together make up nearly 40% of the sample, delivering both high throughput (speed) and low instability (quality).

This proves that fast, high‑quality delivery is not a myth but a realistic outcome when the right combination of team management and technical practices is applied.

5. Practical Guide: AI Adoption Strategies for Technical Leaders

Three Core Recommendations: System First, Balanced Trust, Precise Diagnosis

System First: Invest in Foundations, Not Just Tools

AI value is determined by technology and cultural environment, not the tool itself. Leaders should prioritize:

Internal platform building: unified, efficient development environment

Data ecosystem maturity: ensure AI tools have sufficient data support

Core engineering practices: CI/CD, code review, test automation

Balanced Trust: Trust but Verify

Even with high AI adoption, a “trust but verify” mindset is essential. Training should focus on:

Guiding correct AI tool usage

Evaluating AI output quality

Validating feasibility of AI recommendations

The emphasis should be on using AI well, not merely on using it more.
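One way to operationalize “trust but verify” is a merge gate that accepts an AI‑assisted change only when every verification check passes. This is a hypothetical sketch with made‑up gate names, not a prescription from the report:

```python
def verify_ai_change(checks: dict) -> bool:
    """'Trust but verify': an AI-assisted change merges only when every
    required gate has explicitly passed. Missing gates count as failures."""
    required = ("tests_pass", "review_approved", "static_analysis_clean")
    return all(checks.get(gate, False) for gate in required)
```

The key design choice is that absent evidence counts as failure: an AI‑generated change that was never reviewed is treated the same as one that failed review.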

Precise Diagnosis: Replace One‑Size‑Fits‑All with Team Profiles

Use the seven team profiles to pinpoint issues:

High‑performance but burnt‑out: address workload balance

Stable but limited by legacy systems: manage technical debt

Each type needs a customized AI adoption plan

From Data to Action: Leveraging the Report

The report provides a scientific reference for AI strategy. Leaders can:

Form hypotheses based on research data

Run small‑scale experiments to validate

Measure results and iterate

Identify high‑performance factors that fit their context

Continuous improvement hinges on longitudinal comparison: tracking your own team's metrics over time, focusing on learning and adaptation rather than chasing industry top‑line numbers.
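The measure-and-iterate step can be made concrete with a small before/after comparison, signed so that a positive number always means improvement. This is an illustrative sketch with hypothetical metric names:

```python
def improvement(before: dict, after: dict, higher_is_better: set) -> dict:
    """Percent change per metric between two measurement periods,
    signed so that positive always means the metric improved."""
    result = {}
    for metric in before:
        change = (after[metric] - before[metric]) / before[metric] * 100
        # For metrics where lower is better (e.g. failure rate), flip the sign.
        result[metric] = change if metric in higher_is_better else -change
    return result
```

Comparing a team against its own baseline this way supports the report's emphasis on longitudinal learning over cross-team league tables.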

Tags: software development, AI adoption, DORA metrics, team performance, value stream management
Written by Continuous Delivery 2.0

Tech and case studies on organizational management, team management, and engineering efficiency