How High‑Order Programs Are Making Large AI Models Trustworthy for Professional Use
At the 2025 World AI Conference, Ant Group unveiled an open‑source High‑Order Program framework that blends intelligent models with engineering safeguards to overcome reliability challenges and enable large‑scale, trustworthy AI applications across finance, security, and healthcare.
At the 2025 World Artificial Intelligence Conference in Shanghai, Ant Group’s Ant Misu announced the open‑source High‑Order Program (HOP) framework for trustworthy large‑model applications, aiming to bridge intelligence and engineering for professional AI use.
Professor Chen Chun of Zhejiang University noted that reliability remains the biggest hurdle for deploying large models in specialized domains: simply eliminating “hallucinations” would reduce models to mechanical retrieval tools, so engineering safeguards are needed instead.
Ant Misu Vice President Wei Tao argued that combining intelligent systems with robust engineering processes can ensure reliable professional applications, comparing the approach to centuries‑old engineering advances that enable complex missions like moon landings.
The HOP framework consists of programmatic business logic, scenario knowledge graphs, and a controlled toolchain, embedding verification mechanisms throughout the workflow to validate critical results and maintain precision even when hallucinations occur.
By translating standard operating procedures into executable code, HOP has already been applied in financial risk control, network intrusion detection, and medical billing, delivering higher reliability and faster response compared with traditional manual pipelines.
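The pattern described above — encoding a standard operating procedure as code and gating every model output behind a deterministic check — can be sketched roughly as follows. This is a minimal illustration, not Ant Group's actual framework; all names (`model_propose`, `verify_risk_score`, `run_step`) and the risk-scoring logic are invented for the example.

```python
# Hypothetical sketch of a HOP-style verified workflow step.
# A model proposes a result; a programmatic safeguard validates it
# before the workflow accepts it. Names and logic are illustrative only.

from dataclasses import dataclass


@dataclass
class StepResult:
    value: float
    verified: bool


def model_propose(transaction_amount: float) -> float:
    # Stand-in for a large-model call that may hallucinate;
    # here it simply maps a transaction amount to a risk score.
    return min(transaction_amount / 10_000.0, 1.0)


def verify_risk_score(score: float) -> bool:
    # Deterministic check encoded from the operating procedure:
    # a risk score must be a valid probability.
    return 0.0 <= score <= 1.0


def run_step(transaction_amount: float) -> StepResult:
    score = model_propose(transaction_amount)
    ok = verify_risk_score(score)
    if not ok:
        # Fall back to a conservative default rather than
        # propagating an unverified model output downstream.
        score = 1.0
    return StepResult(value=score, verified=ok)


result = run_step(2_500.0)
```

The point of the sketch is the separation of roles: the model supplies the judgment, while plain code enforces the invariants, so a hallucinated output is caught at the checkpoint instead of flowing into the final result.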
Wei Tao emphasized that HOP complements large models, turning them from assistants into scalable professional productivity tools, and believes that solving reliability will unlock new killer applications for AI.