
How DEA Transforms Industrial Technology Strength Evaluation: A Step‑by‑Step Guide

This article explains how Data Envelopment Analysis (DEA) is applied to assess the technological strength of industrial sectors, detailing indicator design principles, calculation steps, result interpretation, and actionable insights for improving productivity and competitiveness across Guangzhou’s industries.


Case Study

This example demonstrates the full DEA process for evaluating industrial technology strength, treating each industry as a Decision‑Making Unit (DMU) with specific inputs and outputs. Selecting a suitable input‑output system is essential for meaningful, comparable DEA results.

Principles for Designing the Indicator System

Comprehensiveness – cover all basic aspects of technological capability, including R&D capacity, quality management, and economic benefits.

Independence – each indicator should have a clear, relatively independent meaning.

Guidance – align with high‑tech industrialization policies and support R&D activities.

Stability – use three‑year averages for long‑cycle indicators and smooth abnormal spikes.

Operability – derive data from existing statistics, avoiding unrealistic metrics.

Systematicity – ensure dynamic links among indicators so that they form a complete system.

Comparability – prefer relative indicators to enable intra‑ and inter‑industry comparisons.

Following these principles, 12 input and output indicators were grouped into seven categories to build Guangzhou’s industrial technology DEA index system.

Meaning and Formulas of Each Indicator

(1) Proportion of R&D personnel among total staff – reflects human resource investment in R&D.

(2) Ratio of scientists and engineers among technical staff – indicates quality of technical personnel.

(3) Share of micro‑electronics‑controlled equipment in total machinery value – measures equipment modernization.

(4) Ratio of fixed assets of research institutions to total fixed assets – reflects material foundation for technology activities.

(5) Technology introduction investment as a share of total expenditure – shows the degree of high‑tech adoption.

(6) R&D expenditure to product sales revenue ratio – gauges emphasis on technological progress.

(7) Number of patents granted per 100 technical staff – reflects innovation output.

(8) Number of technology awards per 100 technical staff – reflects achievement level.

(9) New‑product sales revenue as a share of total product sales – indicates economic benefit of new products.

(10) New‑product export value as a share of product sales – reflects international competitiveness.

(11) Added value per 10,000 yuan of technology introduction investment – measures economic return of technology imports.
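As an illustration, the first indicator admits a straightforward formula (one plausible reading of the description above; the exact form used in the original study is not reproduced here):

```latex
\text{R\&D personnel proportion} \;=\; \frac{\text{number of R\&D personnel}}{\text{total staff}} \times 100\%
```

The remaining indicators follow the same pattern: a ratio of the quantity named in the numerator of each description to the stated base, usually expressed as a percentage.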

Input and output selection should reflect the evaluation purpose and content; the chosen indicators capture development capability, quality, management level, and the outputs of human, material, and financial inputs.

To ensure comparability across industries, each of the 12 indicators is expressed as a growth rate relative to the 1988 baseline, using 1999 values as the reporting period.
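The growth-rate normalization described above can be sketched in a few lines (a minimal illustration; the function name and sample figures are hypothetical, not the Guangzhou data):

```python
import numpy as np

def growth_rate(report_period, base_period):
    """Express each indicator as its growth rate relative to a base period:
    (reporting value - base value) / base value."""
    report_period = np.asarray(report_period, dtype=float)
    base_period = np.asarray(base_period, dtype=float)
    return (report_period - base_period) / base_period

# Illustrative values only: hypothetical 1999 reporting values vs. a 1988 baseline
base = np.array([10.0, 0.25, 40.0])
report = np.array([12.0, 0.30, 50.0])
rates = growth_rate(report, base)  # each element is (1999 - 1988) / 1988
```

Expressing every indicator on this common relative scale is what makes the 14 industries directly comparable inside one DEA model.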

Using Guangzhou’s industrial statistics, after removing industries with many missing values and merging similar categories, 14 industries remain. DEA calculations based on the grouped indicator system produce comprehensive efficiency scores, average scores, and rankings (shown in the following tables).
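The efficiency scores themselves come from solving one small linear program per DMU. Below is a minimal sketch of the input-oriented CCR envelopment model using SciPy (assumed available); the function name is illustrative and the test data are synthetic, not the Guangzhou figures:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y):
    """Input-oriented CCR efficiency for each DMU.

    X: (m, n) input matrix, Y: (s, n) output matrix; columns are DMUs.
    For DMU k, solve: min theta  s.t.  X @ lam <= theta * X[:, k],
    Y @ lam >= Y[:, k], lam >= 0. A score of 1.0 means DEA-efficient.
    """
    m, n = X.shape
    s = Y.shape[0]
    scores = []
    for k in range(n):
        c = np.zeros(n + 1)
        c[0] = 1.0                       # minimise theta; vars = [theta, lam_1..lam_n]
        A_ub = np.zeros((m + s, n + 1))
        b_ub = np.zeros(m + s)
        A_ub[:m, 0] = -X[:, k]           # sum_j lam_j x_ij - theta x_ik <= 0
        A_ub[:m, 1:] = X
        A_ub[m:, 1:] = -Y                # -sum_j lam_j y_rj <= -y_rk
        b_ub[m:] = -Y[:, k]
        res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                      bounds=[(0, None)] * (n + 1), method="highs")
        scores.append(res.fun)
    return np.array(scores)
```

With one input and one output, a DMU producing the same output from twice the input scores 0.5, matching the intuition that it could proportionally shrink its input by half and still match the frontier.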

Result Analysis

Overall Conclusion

Among the 14 industries, electronic communications and pharmaceuticals achieve DEA effectiveness in product development, technology, and management, indicating the highest development potential. Metals and building‑material sectors perform the worst. The top nine industries are Guangzhou’s industrial pillars, with a noticeable shift toward capital‑ and technology‑intensive sectors, supporting the city’s rise to the second‑largest industrial output among Chinese megacities.

DEA Findings for Product Development, Technology, and Management

Product‑development scores (Scenario 1) show good input‑output performance for most pillar industries except electrical machinery. Technology scores (Scenario 2) reveal that traditional sectors such as electrical machinery, electronics, petrochemicals, and food and beverage are mature but technologically lagging, highlighting the need for new technologies and processes. Management scores (Scenario 3) suggest that transportation equipment manufacturers should improve management to stay competitive after WTO accession.

DEA Findings for Human, Material, and Financial Inputs

Scenario 4 indicates a relatively low effectiveness coefficient for the electronic communications sector, reflecting intense competition and insufficient talent quality. Scenarios 5 and 6 show that traditional industries (pharmaceuticals, petrochemicals, machinery, metal smelting, food) need high‑tech equipment to boost output efficiency, while food, beverage, and electrical machinery sectors show low attention to high‑tech adoption.

Step‑by‑Step Summary

The general DEA workflow includes: defining the evaluation purpose, selecting and constructing input/output indicator systems, collecting and cleaning data, choosing an appropriate DEA model, performing calculations, analyzing results, and offering decision‑making recommendations.

DEA’s core function is relative efficiency evaluation among comparable DMUs. Clarifying the evaluation purpose determines the choice of DMUs, indicators, and model.

DMU selection requires homogeneity in external environment, inputs, outputs, and objectives. Time‑based segmentation can also generate comparable DMUs.

Building the input/output system is foundational; inputs often represent cost‑type variables, while outputs represent benefit‑type variables. The system must fully reflect the evaluation goal, and each indicator should have clear economic meaning.

Indicators must be capable of achieving the evaluation purpose.

The system should comprehensively capture the goal, often requiring multiple inputs and outputs.

Consider the relationships between input and output vectors; pilot DEA analyses can help prune low‑impact indicators.
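One simple pilot check is to examine pairwise correlations among candidate indicators: a near-perfectly correlated pair adds little discriminating information, and one member is a candidate for pruning. A hedged sketch (the function name and threshold are illustrative, not from the original study):

```python
import numpy as np

def flag_redundant(indicators, threshold=0.95):
    """Flag indicator pairs whose absolute correlation exceeds threshold.

    indicators: (p, n) array, rows = indicators, columns = DMUs.
    Returns a list of (i, j) index pairs that are pruning candidates.
    """
    corr = np.corrcoef(indicators)
    pairs = []
    p = corr.shape[0]
    for i in range(p):
        for j in range(i + 1, p):
            if abs(corr[i, j]) > threshold:
                pairs.append((i, j))
    return pairs
```

Dropping one indicator of a flagged pair keeps the indicator system lean without materially changing the efficient frontier.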

Accurate data collection and organization are critical, as data quality directly influences DEA results. Both dimensional and dimensionless data can be used without prior normalization.

Choosing a DEA model depends on the analysis needs. Non‑radial models are convenient for detecting weakly efficient DMUs, and applying several model types in parallel can provide complementary information.

After finalizing the indicator system, select a suitable DEA model, compute relative efficiencies, analyze the causes of inefficiency, propose improvement measures, and compile a decision‑support report.

Reference: Du Dong, Pang Dahua, and Wu Yan (eds.), Modern Comprehensive Evaluation Methods and Selected Cases, 3rd ed., Tsinghua University Press, 2015.

Tags: operations research, Technology Assessment, DEA, Data Envelopment Analysis, Industrial Evaluation
Written by Model Perspective

Insights, knowledge, and enjoyment from a mathematical modeling researcher and educator. Hosted by Haihua Wang, a modeling instructor and author of "Clever Use of Chat for Mathematical Modeling", "Modeling: The Mathematics of Thinking", "Mathematical Modeling Practice: A Hands‑On Guide to Competitions", and co‑author of "Mathematical Modeling: Teaching Design and Cases".
