How Test Engineers Without Coding Skills Can Effectively Discuss Design with Developers
This article outlines a step‑by‑step approach that helps test engineers who don't code prepare for design discussions with developers, communicate effectively, and keep the focus on business logic, user experience, and testability—avoiding misunderstandings and design gaps.
1. Prepare Three Key Items: Bring Specific Questions, Don’t Just Listen
Before meeting developers, test engineers should:
1) Fully understand business requirements and the problems the design aims to solve.
Identify user scenarios (e.g., a user needs to correct an address before payment) and business rules (e.g., address changes must re‑validate logistics zones). Record these points and ask whether the design covers them.
2) List test scenarios to anticipate design gaps.
For a feature such as “member points redemption,” enumerate normal, abnormal, and edge cases—e.g., sufficient points, insufficient points, network interruption, and simultaneous purchases of limited‑stock items. Use these scenarios to probe the design.
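Writing the scenarios down as data before the meeting makes the gaps concrete. A minimal sketch, assuming a hypothetical `redeem` function and illustrative point values—the scenario table is the part worth bringing to the discussion, not the toy logic:

```python
def redeem(points_balance: int, cost: int, stock: int) -> str:
    """Toy redemption rule: needs enough points and available stock."""
    if stock <= 0:
        return "out_of_stock"
    if points_balance < cost:
        return "insufficient_points"
    return "success"

# Normal, abnormal, and edge cases enumerated before the design review.
SCENARIOS = [
    # (name,                    balance, cost, stock, expected)
    ("sufficient points",         1000,  500,    10, "success"),
    ("insufficient points",        100,  500,    10, "insufficient_points"),
    ("exactly enough points",      500,  500,    10, "success"),
    ("last unit in stock",        1000,  500,     1, "success"),
    ("stock already exhausted",   1000,  500,     0, "out_of_stock"),
]

for name, balance, cost, stock, expected in SCENARIOS:
    assert redeem(balance, cost, stock) == expected, name
print("all scenarios covered")
```

Each row becomes a direct question for the developer: "what does the design return in this case?"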
3) Grasp basic technical concepts to follow the discussion.
Know the business impact of terms like “Redis cache for user points” (fast reads, but possible sync delay) and ask relevant questions (e.g., could a delayed points display lead to duplicate redemption?).
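The duplicate-redemption question above can be made concrete with a toy illustration—not real Redis code; two dicts stand in for the cache and the database of record. It shows why a redemption must be validated against the source of truth rather than the possibly stale cache:

```python
db = {"user_1": 300}      # authoritative points balance
cache = {"user_1": 800}   # stale value not yet synced after a recent deduction

def redeem_unsafe(user: str, cost: int) -> bool:
    """Buggy: trusts the cached balance."""
    return cache[user] >= cost

def redeem_safe(user: str, cost: int) -> bool:
    """Re-checks the database before committing the deduction."""
    if db[user] < cost:
        return False
    db[user] -= cost
    return True

print(redeem_unsafe("user_1", 500))  # True  -- the stale cache would allow it
print(redeem_safe("user_1", 500))    # False -- the source of truth refuses
```

A tester doesn't need to write this code—only to ask which of the two paths the design actually takes.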
2. Follow a Three‑Step Communication Logic
Step 1 – Confirm Design Covers All Business Scenarios
Ask concrete questions about each scenario. Example: for a login redesign, verify whether both password and SMS‑code login remain available and whether rate‑limiting (e.g., max three codes per minute) is considered.
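The rate-limiting question can be grounded in a minimal sliding-window sketch of the “max three codes per minute” rule; the window size, limit, and phone number are illustrative assumptions, and timestamps are passed in explicitly so the behavior is easy to verify:

```python
from collections import defaultdict, deque

WINDOW_SECONDS = 60
MAX_REQUESTS = 3
_history = defaultdict(deque)  # phone number -> timestamps of recent sends

def may_send_code(phone: str, now: float) -> bool:
    sent = _history[phone]
    # Drop sends that have fallen out of the 60-second window.
    while sent and now - sent[0] >= WINDOW_SECONDS:
        sent.popleft()
    if len(sent) >= MAX_REQUESTS:
        return False
    sent.append(now)
    return True

# Four requests within one minute: the fourth is rejected;
# a request after the window reopens is allowed again.
print([may_send_code("13800000000", t) for t in (0, 5, 10, 15, 70)])
```

The useful design questions follow directly: is the window per phone number or per device, and what message does the fourth request see?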
Step 2 – Clarify How Exceptions Are Handled
Probe error handling, such as what message appears when a transfer target account does not exist, or how the system behaves after a network outage during a transfer (e.g., status query feature).
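The transfer questions above amount to asking what status model the design has. A sketch of the kind of state machine a tester might probe for—the states and messages are illustrative assumptions, not the system's actual copy:

```python
from enum import Enum

class TransferStatus(Enum):
    SUCCESS = "Transfer complete"
    ACCOUNT_NOT_FOUND = "The target account does not exist; please check the account number"
    PENDING = "Transfer in progress; please query the status again later"

def transfer(target_exists: bool, network_ok: bool) -> TransferStatus:
    if not target_exists:
        return TransferStatus.ACCOUNT_NOT_FOUND
    if not network_ok:
        # Connection dropped mid-transfer: don't claim success or failure.
        return TransferStatus.PENDING
    return TransferStatus.SUCCESS

print(transfer(target_exists=True, network_ok=False).value)
```

If the design has no PENDING state or no status-query endpoint, that gap surfaces as soon as the tester asks what the user sees after an outage.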
Step 3 – Anticipate Testing and Post‑Release Risks
Discuss testability and operational concerns, like how a recommendation engine behaves for new users without browsing history, or whether a test‑only log entry can help verify recommendation logic.
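The new-user question is really a cold-start question: what does the engine fall back to when there is no history? A sketch under that assumption—the item names and fallback list are invented for illustration:

```python
POPULAR_DEFAULTS = ["item_a", "item_b", "item_c"]

def recommend(history: list) -> list:
    if not history:
        # Cold start: fall back to popular items instead of an empty page.
        return POPULAR_DEFAULTS
    # Trivial stand-in for the real recommendation logic.
    return ["similar_to_" + h for h in history[:3]]

print(recommend([]))         # the fallback list for a brand-new user
print(recommend(["shoes"]))  # history-driven recommendations
```

Asking whether the fallback branch exists, and whether a test-only log line records which branch fired, is exactly the testability conversation the text describes.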
3. Focus on Three Core Concerns
1) User Experience – Be the User’s Advocate
Question whether multi‑step flows (e.g., registration) are overly complex and suggest consolidations that improve conversion.
2) Business Rule Consistency – Avoid Contradictions
Ensure discount rules are applied consistently across modules (e.g., whether a birthday‑month discount stacks with a full‑price reduction).
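Stacking questions become unambiguous once the rule is written as an explicit order of operations. A toy sketch assuming one hypothetical policy—birthday discount applied after the full-price reduction, with invented thresholds; whichever policy the team actually chooses, every module must encode the same one:

```python
def final_price(price: float, birthday_month: bool) -> float:
    # Rule 1: spend 200, save 30 (applied first, on the original price).
    if price >= 200:
        price -= 30
    # Rule 2: birthday-month 10% off, applied to the already reduced amount.
    if birthday_month:
        price *= 0.9
    return round(price, 2)

print(final_price(250, birthday_month=True))   # both rules stack: 198.0
print(final_price(250, birthday_month=False))  # only the full-price rule: 220.0
```

If the cart module discounts first and the checkout module deducts first, the two paths produce different totals—precisely the contradiction a tester should surface in the design review.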
3) Testability – Make Verification Feasible
Ask how encryption can be verified in test environments (e.g., disabling encryption for packet capture) while keeping production secure.
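One common answer to this question is an environment-gated toggle. A sketch assuming hypothetical `APP_ENV` and `DISABLE_ENCRYPTION` variable names (not from the original text): production always encrypts, while a test environment can explicitly opt out for packet capture:

```python
import os

def should_encrypt() -> bool:
    env = os.environ.get("APP_ENV", "production")
    if env == "production":
        return True  # encryption can never be switched off in production
    # In test environments an explicit opt-out enables packet capture.
    return os.environ.get("DISABLE_ENCRYPTION") != "1"

os.environ["APP_ENV"] = "test"
os.environ["DISABLE_ENCRYPTION"] = "1"
print(should_encrypt())  # False: test environment with encryption disabled

os.environ["APP_ENV"] = "production"
print(should_encrypt())  # True: production ignores the opt-out flag
```

The design question for the developer is whether such a switch exists, and how the code guarantees it is inert in production.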
4. Three Pitfalls to Avoid
1) Don’t Ask “Outsider” Technical Questions
Replace language‑specific queries (Java vs. Python) with business‑oriented ones (e.g., response time under weak network conditions).
2) Avoid Over‑Criticizing; Offer Suggestions Instead
Frame concerns as proposals (e.g., “Could we add an offline verification method for elderly users?”) rather than outright dismissals.
3) Don’t Fear Asking When You Don’t Understand
Admit gaps in business knowledge and request clarification, which leads to more accurate testing and fewer downstream issues.
This article has been distilled and summarized from source material, then republished for learning and reference. If you believe it infringes your rights, please contact us and we will review it promptly.
