
Boosting CAD & Ad Design Algorithms with a Goldenset Review Platform

This article describes a custom algorithm review platform built around goldenset test cases. By quantifying and visualizing the output of a CAD recognition tool and an advertising design tool, the platform enables rapid regression testing, objective metric tracking, and efficient manual review, improving development speed and bug detection rates.

Qunhe Technology Quality Tech

Background

The CoolSpace business line includes two products: a CAD recognition tool that generates design data from user‑uploaded DWG files, and an advertising design tool that converts files such as CDR, AI, PSD, SVG, and JPG into CoolHome models. Both rely on proprietary algorithms to parse the uploaded files.

Challenges

Manual testing of a single file takes about five minutes and cannot cover the large number of cases.

During daily iterations it is hard to guarantee that algorithm changes always improve results.

Lack of objective metrics makes detailed manual evaluation difficult.

Solution – Algorithm Effect Review Platform

The platform introduces two main capabilities:

Abstract objective metrics from algorithm output and compare each regression run with a baseline.

Visualize algorithm results so reviewers can grasp the effect at a glance.
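The first capability can be sketched as a simple diff between two metric snapshots. The `MetricDiff` name and the map shapes here are illustrative assumptions, not the platform's actual API:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Illustrative sketch: compare one regression run's metrics against a baseline.
// Class and method names are hypothetical, not the platform's actual API.
public class MetricDiff {

    /** Returns metric names whose values changed, mapped to "baseline -> current". */
    public static Map<String, String> diff(Map<String, Integer> baseline,
                                           Map<String, Integer> current) {
        Map<String, String> changed = new LinkedHashMap<>();
        for (Map.Entry<String, Integer> e : baseline.entrySet()) {
            Integer now = current.get(e.getKey());
            if (!e.getValue().equals(now)) {
                changed.put(e.getKey(), e.getValue() + " -> " + now);
            }
        }
        // Metrics that appear only in the current run also need review.
        for (String key : current.keySet()) {
            if (!baseline.containsKey(key)) {
                changed.put(key, "null -> " + current.get(key));
            }
        }
        return changed;
    }
}
```

An empty diff means the change did not move any tracked metric; any non-empty entry is flagged for manual inspection.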

Goldenset

A “goldenset” is a test case that previously produced perfect results. Each iteration adds new successful cases to the goldenset, forming a regression suite. The CAD tool has accumulated over 800 goldenset cases; the advertising tool has over 100.

With a rich goldenset the platform can quickly assess the impact of code changes and serve as both a regression and functional testing suite.

CAD goldenset illustration

Objective Metrics Definition

CAD Recognition Tool

Count of generated components (walls, columns, doors, windows, etc.).

Detection of target components – e.g., whether a specific wall (identified by its midline, thickness, and type) is present.
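Target-component detection of this kind might look like the following sketch, where the `Wall` record mirrors the wall JSON shown later in this section; the positional tolerance is an assumption for the example:

```java
// Illustrative sketch of target-component detection: check whether a wall with a
// given midline, thickness, and type appears in the algorithm output.
// The Wall record mirrors the wall JSON fields; the tolerance is an assumption.
public class WallDetector {

    public record Wall(double x0, double y0, double x1, double y1,
                       double thickness, int wallType) {}

    private static final double EPS = 1.0; // positional tolerance in mm (assumed)

    static boolean close(double a, double b) { return Math.abs(a - b) < EPS; }

    /** True if `output` contains a wall matching `target` within tolerance. */
    public static boolean contains(java.util.List<Wall> output, Wall target) {
        for (Wall w : output) {
            boolean sameLine =
                (close(w.x0(), target.x0()) && close(w.y0(), target.y0())
                 && close(w.x1(), target.x1()) && close(w.y1(), target.y1()))
                // the midline has no direction, so accept the reversed segment too
                || (close(w.x0(), target.x1()) && close(w.y0(), target.y1())
                    && close(w.x1(), target.x0()) && close(w.y1(), target.y0()));
            if (sameLine && close(w.thickness(), target.thickness())
                    && w.wallType() == target.wallType()) {
                return true;
            }
        }
        return false;
    }
}
```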

Advertising Design Tool

JSON representation of the SvgElement structure extracted from the SVG file.

Subsequent conversion of the SVG into geometric data (EdgesData) for rendering.

<code>{
    "type": "wall",
    "location": {
        "tp": "LS2",
        "p0": {"x": 1994.3481940833808, "y": -5686.922285582281},
        "p1": {"x": 1994.3481940833808, "y": 4863.077714417719}
    },
    "thickness": 261.15322966017993,
    "preWall": false,
    "id": 1,
    "height": {"s": 0.0, "e": 2900.0},
    "wallType": 0,
    "autoJoin": false,
    "bearing": false
}
</code>
<code>&lt;?xml version="1.0" encoding="UTF-8"?&gt;
&lt;!DOCTYPE svg PUBLIC "-//W3C//DTD SVG 1.1//EN" "http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd"&gt;
&lt;svg xmlns="http://www.w3.org/2000/svg" xml:space="preserve" width="1554.29mm" height="532.904mm" version="1.1" shape-rendering="geometricPrecision" text-rendering="geometricPrecision" image-rendering="optimizeQuality" fill-rule="evenodd" clip-rule="evenodd" viewBox="0 0 67.8159 23.2515"&gt;
    ...
&lt;/svg&gt;
</code>
<code>public class SvgElement {
    private String tagName; // element tag name
    private Map<String, String> attributes; // attribute string
    private List<SvgElement> children; // child elements list
}
</code>
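A minimal sketch of extracting such a tree with the JDK's built-in DOM parser might look like this; the `SvgParser` class is hypothetical, since the article does not show the platform's actual parsing code:

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Element;
import org.w3c.dom.Node;
import org.w3c.dom.NodeList;

// Illustrative sketch: build an SvgElement tree from SVG markup using the JDK's
// DOM parser. Class names are assumptions, not the platform's actual API.
public class SvgParser {

    public static class SvgElement {
        public String tagName;
        public Map<String, String> attributes = new LinkedHashMap<>();
        public List<SvgElement> children = new ArrayList<>();
    }

    public static SvgElement parse(String svg) throws Exception {
        DocumentBuilderFactory f = DocumentBuilderFactory.newInstance();
        // Skip fetching the external SVG DTD referenced by exported files.
        f.setFeature("http://apache.org/xml/features/nonvalidating/load-external-dtd", false);
        Element root = f.newDocumentBuilder()
                .parse(new ByteArrayInputStream(svg.getBytes(StandardCharsets.UTF_8)))
                .getDocumentElement();
        return convert(root);
    }

    private static SvgElement convert(Element el) {
        SvgElement out = new SvgElement();
        out.tagName = el.getTagName();
        for (int i = 0; i < el.getAttributes().getLength(); i++) {
            Node a = el.getAttributes().item(i);
            out.attributes.put(a.getNodeName(), a.getNodeValue());
        }
        NodeList kids = el.getChildNodes();
        for (int i = 0; i < kids.getLength(); i++) {
            if (kids.item(i) instanceof Element child) out.children.add(convert(child));
        }
        return out;
    }
}
```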

Derived Objective Indicators

Face count

Group count

SVG node count

Image material count

Color material count

Total edge count of planar surfaces

PowerClip count

Image material type count

Color material type count

Vector count
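Several of these indicators reduce to simple recursive walks over the extracted element tree. A sketch for the SVG node count and group count (the class and method names are assumptions):

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch of two derived indicators, computed by recursively
// walking the extracted element tree. Names are assumptions for the example.
public class SvgNodeCount {

    public static class SvgElement {
        public String tagName;
        public List<SvgElement> children = new ArrayList<>();
        public SvgElement(String tagName) { this.tagName = tagName; }
    }

    /** SVG node count: every element in the tree, including the root. */
    public static int count(SvgElement root) {
        int n = 1;
        for (SvgElement child : root.children) n += count(child);
        return n;
    }

    /** Group count: only <g> group elements. */
    public static int groupCount(SvgElement root) {
        int n = root.tagName.equals("g") ? 1 : 0;
        for (SvgElement child : root.children) n += groupCount(child);
        return n;
    }
}
```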

Result Visualization

Instead of manually opening the tool and uploading files (which takes 2‑3 minutes per case), the platform renders the geometry directly on a canvas, using different colors to distinguish component types, allowing instant visual assessment.
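A minimal sketch of such a renderer, emitting an SVG string with one stroke color per component type (the `Segment` shape and the color mapping are assumptions for the example):

```java
import java.util.List;
import java.util.Map;

// Illustrative sketch of the visualization step: render recognized components
// as colored line segments in an SVG string, one color per component type.
public class ComponentRenderer {

    public record Segment(String type, double x0, double y0, double x1, double y1) {}

    // Hypothetical color mapping; the platform's actual palette is not documented.
    private static final Map<String, String> COLORS = Map.of(
            "wall", "black", "door", "orange", "window", "blue", "column", "red");

    public static String render(List<Segment> segments, int width, int height) {
        StringBuilder svg = new StringBuilder(
                "<svg xmlns=\"http://www.w3.org/2000/svg\" width=\"" + width
                + "\" height=\"" + height + "\">");
        for (Segment s : segments) {
            String color = COLORS.getOrDefault(s.type(), "gray");
            svg.append("<line x1=\"").append(s.x0()).append("\" y1=\"").append(s.y0())
               .append("\" x2=\"").append(s.x1()).append("\" y2=\"").append(s.y1())
               .append("\" stroke=\"").append(color).append("\" stroke-width=\"2\"/>");
        }
        return svg.append("</svg>").toString();
    }
}
```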

Visualization example

Regression Efficiency and Outcomes

For CAD recognition the platform maintains 823 goldenset cases covering more than 300 different drawings. A full regression of all cases completes in about five minutes, enabling developers to self‑test while committing code.
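Running several hundred cases in a few minutes suggests evaluating them concurrently. A generic sketch using a fixed thread pool (the thread count and the type parameters are assumptions; the article does not describe the platform's execution model):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.function.Function;

// Illustrative sketch: evaluate goldenset cases in parallel so a full
// regression finishes quickly. Names and thread count are assumptions.
public class RegressionRunner {

    public static <C, R> List<R> runAll(List<C> cases, Function<C, R> evaluate)
            throws InterruptedException, ExecutionException {
        ExecutorService pool = Executors.newFixedThreadPool(8); // assumed pool size
        try {
            List<Future<R>> futures = new ArrayList<>();
            for (C c : cases) {
                futures.add(pool.submit((Callable<R>) () -> evaluate.apply(c)));
            }
            List<R> results = new ArrayList<>();
            for (Future<R> f : futures) results.add(f.get()); // preserves case order
            return results;
        } finally {
            pool.shutdown();
        }
    }
}
```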

Since deployment, the CAD regression platform has been executed 1,074 times, 358 of them for developer self-testing. The automated bug detection rate for the agile team reached 21%, and after the advertising regression platform was added it rose to 57.9%, with 11 bugs found in 34 self-test runs.

Conclusion

Key takeaways are to abstract objective metrics that highlight differences during regression and to provide quick visual feedback, which together accelerate algorithm iteration and reduce manual testing effort.
