
DataFun Summit 2022: AI Foundations, Large‑Scale Model Training, and AI Infrastructure

The DataFun Summit 2022 brings together leading AI researchers and industry experts to discuss deep‑learning frameworks, ultra‑large model training, AI chips, compilers, MLOps, and end‑to‑end AI infrastructure, offering live streaming of six thematic forums and dozens of technical talks.


On November 19, 2022, the DataFunSummit 2022: AI Foundations Software Architecture Summit will be held online, featuring six thematic forums covering deep‑learning frameworks, ultra‑large model training, AI chips and compilers, next‑generation AI infrastructure and applications, MLOps, and edge inference.

The summit features forum chairs and speakers from leading companies such as Alibaba Cloud, Baidu, Microsoft, NVIDIA, and Amazon, as well as academic institutions, who will share insights on topics including TorchDynamo, MegCC, AI compiler optimization, heterogeneous compilation for AI chips, large‑scale model training with Megatron, and real‑time feature‑engineering platforms.

In addition to technical sessions, the event offers product introductions, speaker biographies, and detailed session outlines, giving attendees practical knowledge of AI model optimization, deployment pipelines, and emerging AI hardware and software ecosystems.

All sessions will be live‑streamed, and participants can join the community by scanning the QR code to receive updates and engage with the AI community.

Tags: machine learning, AI, deep learning, MLOps, large models, AI infrastructure, compilers
Written by

DataFunSummit

Official account of the DataFun community, dedicated to sharing big data and AI industry summit news and speaker talks, with regular downloadable resource packs.
