Making AI Write Production-Ready Code via Context Engineering (COLA 5.0 & Spring Boot 3.x)
This article presents a systematic context-engineering method that feeds project-level, file-level, and task-level knowledge to Claude CLI or Cursor, turning AI from an inconsistent code generator into a reliable collaborator. It quantifies the resulting token savings and consistency gains in a COLA 5.0 + Spring Boot 3.x project.
Background
Since early 2025 the team experimented with various code-generation AIs (Copilot, Cursor, Tongyi Lingma, ChatGPT). Individual developers saw speed gains, but once the generated code was merged, the project suffered from inconsistent injection styles, DTO naming, utility-class choices, and layer violations, leading to heavy review and testing costs.
Root Cause: Missing Context
AI lacks knowledge of project conventions, architecture constraints, and naming rules. Each conversation starts from a clean slate, so the AI behaves like a new intern who must be corrected repeatedly.
Solution – Context Engineering
The team introduced a three‑layer context model that automatically supplies precise project knowledge before code generation.
L0 – Project‑Level Context ("New‑Hire Document")
Stored in CLAUDE.md at the repository root. It contains an overview of the tech stack, architecture, module list, and an index of coding‑style rules. The document was trimmed from 858 lines to 422 lines by moving cross‑layer conventions to L1 and keeping only a high‑level index.
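The article does not reproduce the slimmed document, but its described structure (tech-stack overview, architecture, module list, and an index pointing to the detailed rules) might look roughly like this; module and file names here are illustrative, not from the source:

```markdown
# CLAUDE.md (repository root, L0 project-level context)

## Tech Stack
- Java 17, Spring Boot 3.x, COLA 5.0

## Architecture
- COLA layering: adapter → app → domain → infrastructure
- Dependencies point inward only; no layer-skipping calls

## Modules
- order-adapter / order-app / order-domain / order-infrastructure

## Coding-Style Rule Index (details live in L1 rule files)
- Architecture constraints, API response wrapper, Java style → always-apply rules
- Layer-specific conventions → glob-matched rules / per-layer CLAUDE.md
```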
L1 – File‑Level Context (Precise Injection)
Implemented via two mechanisms:
Cursor : .cursor/rules/*.mdc files. Three alwaysApply rules (architecture, API response, Java coding style) are always injected. Seven glob rules match file paths (e.g., **/adapter/web/**, **/domain/**) to inject layer‑specific conventions.
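A glob-matched Cursor rule file might be sketched as follows; the frontmatter fields (`description`, `globs`, `alwaysApply`) follow Cursor's `.mdc` rule format, while the rule body and path are illustrative assumptions:

```markdown
---
description: Conventions for the web adapter layer
globs: **/adapter/web/**
alwaysApply: false
---

- Controllers return the project's unified response wrapper, never raw domain objects.
- Use constructor injection; no field-level @Autowired.
- Request/response DTOs follow COLA naming suffixes (Cmd / Qry / CO).
```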
Claude CLI : Sub‑directory CLAUDE.md files (one per layer) plus a central MEMORY.md that aggregates the same 10 rule files into nine chapters. The CLI loads the root CLAUDE.md and the nearest sub‑directory file automatically.
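Under this setup, the repository layout for the Claude CLI side could look like the sketch below (module names assumed for illustration):

```text
repo-root/
├── CLAUDE.md              # L0: project overview + rule index (422 lines)
├── MEMORY.md              # aggregates the 10 rule files into nine chapters
├── order-adapter/
│   └── CLAUDE.md          # L1: adapter-layer conventions
├── order-domain/
│   └── CLAUDE.md          # L1: domain-layer conventions
└── order-infrastructure/
    └── CLAUDE.md          # L1: infrastructure conventions
```

The CLI picks up the root file plus the CLAUDE.md nearest to the file being edited, so each conversation carries only the relevant layer's rules.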
Rule‑writing principles: base rules on existing code, provide concrete templates rather than abstract principles, and keep each rule file between 30‑80 lines.
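The "concrete templates rather than abstract principles" guideline can be illustrated with a hypothetical dependency-injection rule (class names are examples, not from the source):

```markdown
## Dependency Injection

Too abstract: "Prefer constructor injection."

Concrete template the AI can copy:

    @RestController
    @RequiredArgsConstructor
    public class OrderQueryController {
        private final OrderQueryService orderQueryService; // final + Lombok-generated ctor
    }

Never: field-level @Autowired, setter injection, or static utility lookups.
```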
L2 – Task‑Level Context (Active Input)
Developers supply a structured prompt for each conversation, including the scenario type (A/B/C), a reference implementation file, and a link to the requirement document. This tells the AI which scenario template to follow and which "gold-standard" module to imitate.
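A task-level prompt following this structure might look like the sketch below; the scenario labels and file path are invented for illustration:

```markdown
## Task Context (pasted at the start of each conversation)

- Scenario type: B (new query endpoint)
- Reference implementation: order-adapter/.../UserQueryController.java
- Requirement doc: <link to the PRD>
- Constraint: imitate the reference module; ask before deviating from it.
```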
Configuration Mapping
A table maps each layer to the corresponding files for Cursor and Claude CLI, showing how always‑apply rules, glob matches, and sub‑directory CLAUDE.md files align.
Actual Effects
After applying the three‑layer model, code consistency improved across several dimensions (dependency injection, DTO naming, layer boundaries, response wrappers, exception handling, thread‑pool usage). The development flow changed from “AI free‑form → heavy manual correction” to “AI follows template → light micro‑adjustments”.
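The consistency dimensions listed above can be made concrete with a small, self-contained Java sketch of what template-conforming output looks like. All names are hypothetical, and the `SingleResponse` record here is a minimal stand-in for COLA's real response wrapper, defined inline so the example runs on its own:

```java
import java.util.Objects;

// Minimal stand-in for COLA's unified response wrapper (SingleResponse<T>).
record SingleResponse<T>(boolean success, T data) {
    static <T> SingleResponse<T> of(T data) {
        return new SingleResponse<>(true, data);
    }
}

// DTO naming convention: query objects carry the "Qry" suffix.
record OrderPageQry(int pageIndex, int pageSize) {}

// App-layer port the adapter depends on (layer boundary respected).
interface OrderQueryService {
    String page(OrderPageQry qry);
}

// Adapter-layer class: constructor injection only, thin delegation, wrapped response.
class OrderQueryController {
    private final OrderQueryService orderQueryService; // final field, ctor-injected

    OrderQueryController(OrderQueryService orderQueryService) {
        this.orderQueryService = Objects.requireNonNull(orderQueryService);
    }

    SingleResponse<String> page(OrderPageQry qry) {
        // No business logic here: delegate to the app layer, wrap the result.
        return SingleResponse.of(orderQueryService.page(qry));
    }
}

public class ConventionDemo {
    public static void main(String[] args) {
        OrderQueryController controller =
                new OrderQueryController(qry -> "page " + qry.pageIndex());
        SingleResponse<String> resp = controller.page(new OrderPageQry(1, 20));
        System.out.println(resp.success() + " " + resp.data());
    }
}
```

When every generated controller follows this shape, reviews shrink to the "light micro-adjustments" the article describes.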
Token Consumption Analysis
Before slimming, loading the full 858-line CLAUDE.md cost ~12,000 tokens per round. After slimming to a 422-line index plus 10 rule files, the cost dropped to ~8,500 tokens, a saving of ~3,500 tokens per round. More importantly, cutting correction rounds from 3-5 down to 0-1 saved a further 20-50 K tokens per task, yielding an overall token reduction of 50-60 % for a typical feature.
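A back-of-envelope check of these figures can be sketched in a few lines; the ~10 K tokens per correction round is an assumption inferred from the article's 20-50 K range over 2-5 avoided rounds, not a number the source states:

```java
public class TokenMath {
    public static void main(String[] args) {
        int before = 12_000;                     // full 858-line CLAUDE.md, per round
        int after = 8_500;                       // 422-line index + on-demand rule files
        int perRoundSaving = before - after;
        System.out.println(perRoundSaving);      // direct saving per round

        // ASSUMPTION: each avoided correction round costs ~10 K tokens, which is
        // consistent with "2-5 avoided rounds = 20-50 K tokens saved" above.
        int costPerRound = 10_000;
        int minAvoided = 3 - 1;                  // rounds drop from 3 to 1
        int maxAvoided = 5 - 0;                  // rounds drop from 5 to 0
        System.out.println(minAvoided * costPerRound + " to " + maxAvoided * costPerRound);
    }
}
```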
Conclusion
The core insight is that AI coding productivity hinges on context engineering rather than prompt engineering. By establishing L0 project knowledge, L1 precise file‑level rules, and L2 task‑specific prompts, teams can turn AI from an inconsistent intern into a knowledgeable worker, especially in large, multi‑developer projects.
This article has been distilled and summarized from source material, then republished for learning and reference. If you believe it infringes your rights, please contact us and we will review it promptly.
Yunqi AI+
Focuses on AI-powered enterprise digitalization, sharing product and technology practices. Covers AI use cases, technical architecture, product design examples, and industry trends. Aimed at developers, product managers, and digital transformation professionals, providing practical solutions and insights. Uses technology to drive digitization and AI to enable business innovation.