DataFunSummit
Nov 14, 2021 · Artificial Intelligence
Overview of Pre‑training Models and the UER‑py Framework for Natural Language Processing
This article explains why pre‑training matters in natural language processing, reviews classic pre‑training models such as Skip‑Thoughts, BERT, GPT‑2, and T5, presents the modular UER‑py framework and its Chinese-language resources, compares it with Hugging Face Transformers, and outlines practical steps for deploying pre‑trained models in industry.
Machine Learning · NLP · UER-py