Turn XMind Test Cases into Live Pytest Scripts with Allure and CI/CD

This guide shows how to design API test cases in XMind, automatically generate pytest scripts with Allure reporting, export YAML/Excel files, and integrate the whole workflow into GitHub Actions for seamless CI/CD execution.

Test Development Learning Exchange

Why Use XMind for Test Cases?

XMind provides a visual, hierarchical structure (module → interface → case → parameters) that is easy for non‑technical team members to collaborate on, requires no database, and can be parsed programmatically when a node format is agreed upon.

Standard XMind Test Case Template

Use the following tree structure; the fourth level consists of key: value pairs:

└── User Management (module)
    └── Login API
        ├── Successful login
        │   ├── method: POST
        │   ├── url: /api/v1/login
        │   ├── headers: {"Content-Type": "application/json"}
        │   ├── body: {"username": "admin", "password": "123456"}
        │   ├── assert_status: 200
        │   ├── assert_json: {"code": 0}
        │   └── description: Correct username and password; login succeeds
        └── Empty username
            ├── method: POST
            ├── url: /api/v1/login
            ├── body: {"username": "", "password": "123456"}
            ├── assert_status: 400
            └── description: Username is empty; an error is returned

Supported fields include method, url, headers, params, body, assert_status, assert_json, description, data_driven, and test_data, among others.
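Because XMind node values arrive as plain strings, the parser has to coerce them into typed Python values before the generator can use them. A minimal sketch of that normalization step; the field names follow the template above, while the raw_case dict and the normalize helper are illustrative, not part of the actual generator:

```python
import json

# Hypothetical flattened form of one XMind case after parsing.
# All values are strings, exactly as they appear in the mind-map nodes.
raw_case = {
    "module": "User Management",
    "api": "Login API",
    "name": "Successful login",
    "method": "POST",
    "url": "/api/v1/login",
    "headers": '{"Content-Type": "application/json"}',
    "body": '{"username": "admin", "password": "123456"}',
    "assert_status": "200",
    "assert_json": '{"code": 0}',
    "description": "Correct username and password; login succeeds",
}

# Fields whose values are JSON documents rather than plain strings.
JSON_FIELDS = {"headers", "params", "body", "assert_json", "test_data"}

def normalize(case: dict) -> dict:
    """Convert string node values into typed Python values."""
    out = {}
    for key, value in case.items():
        if key in JSON_FIELDS:
            out[key] = json.loads(value)
        elif key == "assert_status":
            out[key] = int(value)
        else:
            out[key] = value
    return out

case = normalize(raw_case)
print(case["assert_status"], case["assert_json"]["code"])  # → 200 0
```

Doing the coercion once, at parse time, keeps the downstream exporters and the pytest generator free of string-handling special cases.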

Step 1: Install Dependencies

pip install xmindparser requests pytest allure-pytest assertpy pandas openpyxl pyyaml

Step 2: Main Generation Script (generate_api_tests.py)

The script performs three tasks: parse XMind, export YAML/Excel, and generate pytest test files.

import os
from xmindparser import xmind_to_dict
from utils.export_yaml import export_to_yaml
from utils.export_excel import export_to_excel

# parse_xmind and generate_pytest_file are helper functions defined
# elsewhere in this script (omitted here for brevity).

BASE_URL = os.getenv("API_BASE_URL", "https://api.example.com")

def parse_and_generate():
    # 1. Parse XMind
    cases = parse_xmind("test_cases.xmind")
    # 2. Export auxiliary files
    export_to_yaml(cases, "test_cases.yaml")
    export_to_excel(cases, "test_cases.xlsx")
    # 3. Generate pytest script
    generate_pytest_file(cases, "generated_tests/test_api_auto.py")

if __name__ == "__main__":
    parse_and_generate()
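The core of parse_xmind is a walk over the dict tree that xmindparser's xmind_to_dict produces, where each node has a 'title' and an optional 'topics' list of children. A sketch of that walk, assuming the module → API → case → field hierarchy from the template (the flatten_tree name and the hand-built sample tree are illustrative):

```python
def flatten_tree(root: dict) -> list[dict]:
    """Flatten a module -> API -> case -> field topic tree into case dicts.

    `root` is the top topic dict in the structure returned by
    xmindparser.xmind_to_dict(): each node carries a 'title' string
    and, if it has children, a 'topics' list.
    """
    cases = []
    for module in root.get("topics", []):
        for api in module.get("topics", []):
            for case in api.get("topics", []):
                fields = {
                    "module": module["title"],
                    "api": api["title"],
                    "name": case["title"],
                }
                # Fourth-level nodes hold "key: value" pairs.
                for leaf in case.get("topics", []):
                    key, _, value = leaf["title"].partition(":")
                    fields[key.strip()] = value.strip()
                cases.append(fields)
    return cases

# Shape check with a hand-built tree mirroring the template:
root = {"title": "root", "topics": [
    {"title": "User Management", "topics": [
        {"title": "Login API", "topics": [
            {"title": "Successful login", "topics": [
                {"title": "method: POST"},
                {"title": "url: /api/v1/login"},
            ]},
        ]},
    ]},
]}
print(flatten_tree(root))
```

In the real script, parse_xmind would call xmind_to_dict(path), take the first sheet's "topic" as root, and feed it to a walk like this.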

Step 3: Example of Generated Pytest Test

import allure
import requests

@allure.feature("User Management")
@allure.story("Login API")
@allure.title("Correct username and password; login succeeds")
def test_001_successful_login():
    resp = requests.post(
        "https://api.example.com/api/v1/login",
        headers={"Content-Type": "application/json"},
        json={"username": "admin", "password": "123456"}
    )
    assert resp.status_code == 200
    assert resp.json()["code"] == 0

The generated tests carry Allure annotations, so each run produces a structured, browsable report.

Advanced Feature: Data‑Driven & Mock Support

Example XMind snippet for batch login testing:

├── data_driven: true
├── test_data: [{"user":"admin","pwd":"123"},{"user":"guest","pwd":"456"}]
├── method: POST
├── url: /api/v1/login
└── assert_status: 200

The script will split this into multiple test cases and can be extended to use @pytest.mark.parametrize.
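A minimal sketch of how the generator could perform that split; the expand_data_driven helper, the generated case names, and the mapping of user/pwd records onto the request body are illustrative assumptions, not the actual implementation:

```python
import json

def expand_data_driven(case: dict) -> list[dict]:
    """Split one data-driven case into concrete single-run cases."""
    if case.get("data_driven") != "true":
        return [case]
    records = json.loads(case["test_data"])
    expanded = []
    for i, record in enumerate(records, start=1):
        # Copy the shared fields, dropping the data-driven markers.
        concrete = {k: v for k, v in case.items()
                    if k not in ("data_driven", "test_data")}
        # Hypothetical mapping of each record onto the request body.
        concrete["body"] = json.dumps(
            {"username": record["user"], "password": record["pwd"]})
        concrete["name"] = f"batch_login_{i}"
        expanded.append(concrete)
    return expanded

case = {
    "data_driven": "true",
    "test_data": '[{"user":"admin","pwd":"123"},{"user":"guest","pwd":"456"}]',
    "method": "POST",
    "url": "/api/v1/login",
    "assert_status": "200",
}
print([c["name"] for c in expand_data_driven(case)])
# → ['batch_login_1', 'batch_login_2']
```

Emitting the records as a list also makes the @pytest.mark.parametrize extension straightforward: pass the decoded test_data list as the parameter set instead of generating one function per record.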

Step 4: CI/CD Integration (GitHub Actions)

# .github/workflows/api-test.yml
name: API Auto Test
on: [push, pull_request]
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Setup Python
        uses: actions/setup-python@v5
        with: { python-version: '3.10' }
      - run: pip install -r requirements.txt
      - run: python generate_api_tests.py
      - run: |
          cd generated_tests
          pytest test_api_auto.py --alluredir=./report
      - uses: actions/upload-artifact@v4
        with:
          name: allure-report
          path: generated_tests/report/

Every push that changes the XMind file triggers the test run; with branch protection requiring this workflow, failures block merges.

Project Structure Recommendation

api-auto-test/
├── test_cases.xmind        # single source of truth
├── test_cases.yaml         # auto-generated, Git-friendly
├── test_cases.xlsx         # auto-generated for review
├── generate_api_tests.py   # main generator
├── utils/                  # export helpers
├── generated_tests/        # generated pytest scripts
└── .github/workflows/      # CI configuration

Why Adopt This Solution?

Automation value lies not in the amount of code written but in how efficiently test cases flow from design to execution. Using XMind as a unified entry point bridges the “last mile” between specification and testing, turning static test cases into a powerful quality‑guarding tool.
