Guidelines for the National Data Standard System: Building Trusted Data Infrastructure and Secure Data Flow Standards
The newly released National Data Standard System Construction Guide outlines a comprehensive framework for data supply, circulation, utilization, and security, emphasizing trusted infrastructure, controlled anonymization, cross‑domain usage control, and unified security assessment to enable safe, efficient data exchange across entities and industries.
Recently, six ministries including the National Development and Reform Commission and the National Data Administration jointly issued the "National Data Standard System Construction Guide" (the "Guide"). The Guide follows the principle that data should be "supplied readily, flow smoothly, be used well, and be kept safe", and builds a national data standard system covering infrastructure, resources, technology, circulation, integrated applications, and security. It guides the formulation and implementation of critical data standards and supports rapid market response to new technologies and business models.
Unlike the traditional internally focused data governance model, the data-factor market emphasizes external circulation across entities, industries, and scenarios. This unlocks deeper data value, but it also raises governance challenges and circulation risks, including unclear ownership, compliance gaps, inadequate security protection, and difficulty tracing responsibility, compounded by the limited data protection capabilities of many enterprises.
In this context, establishing trustworthy data infrastructure is crucial. It encourages cross-entity data providers to participate and accelerates secure external circulation. Through pilot projects for high-value, high-sensitivity data, it also helps formulate industry entry standards and guidance, building economies of scale that reduce the marginal cost of supplying, circulating, using, and protecting data at scale.
Standards serve as tools to consolidate industry consensus, clarify responsibilities, set industry baselines, and define governance requirements. The Guide highlights the foundational role of data infrastructure in data circulation, offering guidance on storage‑computing facilities, network transmission standards, and technical, procedural, and control requirements for flow‑utilization facilities to ensure compliant, efficient, and orderly data flow.
Based on this foundation, three key directions are identified for immediate attention: controlled anonymization, cross‑domain data usage control, and a universal security grading standard for trusted data circulation.
1. Implementable Controlled Anonymization Standards: a "Stabilizer" for Personal Information Protection and Compliant Data Use

Existing laws such as the Cybersecurity Law and the Personal Information Protection Law introduce anonymization clauses but lack clear legal definitions and implementation standards. Controlled anonymization can protect personal privacy while still supplying the data needed for tasks such as AI model training, because re-identification risk can be kept negligible within a controlled environment.
Controlled anonymization focuses on limiting data to a specific environment, assessing re‑identification risk only for information that could enter that environment, and ensuring that no individual can be identified or reconstructed, while the environment remains secure against unauthorized access and theft.
Therefore, establishing a set of metrics and evaluation criteria for anonymization in controlled settings is essential; a unified process and clear definitions of sufficient de‑identification can prevent sensitive information leakage while preserving data value for research and analysis.
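The Guide does not prescribe specific anonymization metrics, but one widely used re-identification measure that such a standard could draw on is k-anonymity: a release is considered safer when every combination of quasi-identifier values is shared by at least k records. The sketch below (function names and the threshold are illustrative assumptions, not from the Guide) shows how a gate of this kind could be applied before data enters a controlled environment.

```python
from collections import Counter

def k_anonymity(records, quasi_identifiers):
    """Smallest equivalence-class size over the quasi-identifier columns.

    A release is k-anonymous if every combination of quasi-identifier
    values is shared by at least k records.
    """
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return min(groups.values())

def approve_release(records, quasi_identifiers, k_threshold=5):
    """Hypothetical gate applied before data enters a controlled environment."""
    return k_anonymity(records, quasi_identifiers) >= k_threshold

records = [
    {"age_band": "30-39", "city": "Hangzhou", "diagnosis": "A"},
    {"age_band": "30-39", "city": "Hangzhou", "diagnosis": "B"},
    {"age_band": "40-49", "city": "Shanghai", "diagnosis": "A"},
]
print(k_anonymity(records, ["age_band", "city"]))                      # → 1
print(approve_release(records, ["age_band", "city"], k_threshold=2))   # → False
```

A real standard would combine several such metrics with requirements on the environment itself (access control, audit, anti-theft measures), since the "controlled" part of controlled anonymization is what keeps residual re-identification risk acceptable.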
2. Standards for Cross-Domain Data Usage Rights Control: the "Keel" Ensuring Trusted Data Flow

Merging data from different entities and industries creates richer data portraits, but it also generates anxiety about whether data will be handled properly rather than illegally intercepted or altered. Without standards for the relevant technical requirements, it is hard to assure how data is protected, which algorithms may run on it, what usage limits apply, and whether those limits are actually enforced.
To maintain control over data during circulation, standards should define domain boundaries, obligations of each participant, technical requirements for infrastructure supporting cross‑domain control, and lifecycle processes for pre‑, during‑, and post‑use handling, providing systematic security guarantees.
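The pre-, during-, and post-use lifecycle described above can be sketched as a small policy-enforcement loop. All names here (`UsagePolicy`, `ControlledSession`, the operation names) are hypothetical illustrations of the idea, not anything defined in the Guide: pre-use checks validate the request against the agreed contract, during-use checks enforce quotas, and post-use logging supports responsibility tracing.

```python
import time
from dataclasses import dataclass, field

@dataclass
class UsagePolicy:
    """Illustrative cross-domain usage contract (all fields are assumptions)."""
    allowed_operations: set
    max_invocations: int
    expires_at: float  # epoch seconds

@dataclass
class ControlledSession:
    policy: UsagePolicy
    invocations: int = 0
    audit_log: list = field(default_factory=list)

    def request(self, operation):
        # Pre-use: check the request against the agreed policy.
        if time.time() > self.policy.expires_at:
            decision = "deny: policy expired"
        elif operation not in self.policy.allowed_operations:
            decision = "deny: operation not permitted"
        elif self.invocations >= self.policy.max_invocations:
            decision = "deny: quota exhausted"
        else:
            decision = "allow"
            self.invocations += 1  # During-use: enforce the usage quota.
        # Post-use: record every decision for responsibility tracing.
        self.audit_log.append((operation, decision))
        return decision

policy = UsagePolicy({"aggregate", "train_model"}, max_invocations=2,
                     expires_at=time.time() + 3600)
session = ControlledSession(policy)
print(session.request("aggregate"))   # → allow
print(session.request("export_raw"))  # → deny: operation not permitted
```

In a real deployment these checks would be backed by infrastructure-level mechanisms (trusted execution, cryptographic enforcement) rather than application code, which is exactly why the Guide calls for technical requirements on the facilities supporting cross-domain control.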
3. General Security Assessment Standards for Trusted Data Flow: the "Metric" for Large-Scale Secure Circulation

Trusted data flow technologies, including privacy-preserving computation, usage control, and blockchain, enable efficient, low-cost, intelligent data processing. However, their security mechanisms differ widely and lack a unified evaluation framework, making it difficult for users to compare solutions across technologies.
Developing a comprehensive security grading standard will allow selection of appropriate technologies based on required security levels, balancing safety, performance, and cost.
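As a toy illustration of how a unified grading standard would be used, the sketch below maps each technology to an assumed security level and relative cost, then picks the cheapest option that meets the required level. The technology names, levels, and figures are invented for illustration; a real standard would define the levels and how each technology is assessed against them.

```python
# Hypothetical grading table: technology → assumed security level and
# relative cost. Figures are illustrative only, not from the Guide.
TECHNOLOGIES = {
    "plaintext_sandbox":         {"security_level": 1, "cost": 1},
    "trusted_execution_env":     {"security_level": 2, "cost": 2},
    "secure_multiparty_compute": {"security_level": 3, "cost": 6},
    "fully_homomorphic_enc":     {"security_level": 4, "cost": 9},
}

def select_technology(required_level):
    """Pick the cheapest technology meeting the required security level."""
    candidates = [(spec["cost"], name)
                  for name, spec in TECHNOLOGIES.items()
                  if spec["security_level"] >= required_level]
    if not candidates:
        raise ValueError("no technology meets the required level")
    return min(candidates)[1]

print(select_technology(2))  # → trusted_execution_env
print(select_technology(3))  # → secure_multiparty_compute
```

The value of a shared grading scale is precisely this comparability: once every solution is assessed on the same levels, the choice reduces to an explicit trade-off among security, performance, and cost.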
In summary, the national data standard system aims to build a trusted, cost‑effective ecosystem that lowers barriers for data supply, flow, use, and safety. By implementing standards for controlled anonymization, cross‑domain usage control, and universal security assessment, the system supports seamless, secure data exchange across multiple entities, scenarios, and industries, empowering data as a strong driver for the digital economy.
AntTech
Technology is the core driving force of Ant's future.