AI2ML AI to Machine Learning
Oct 24, 2025 · Artificial Intelligence

Beyond RAG: Three Emerging Knowledge‑Engineering Strategies (ICL, Online Learning, SLM)

The article outlines three post‑RAG knowledge‑engineering approaches: In‑Context Learning with dynamic few‑shot selection; Online Learning, spanning Meta‑Learning and Lifelong Learning for rapid adaptation to new tasks; and the Small Language Model path, which combines fine‑tuned task‑specific experts with LLM‑SLM collaboration for efficient, privacy‑preserving inference.

In-Context Learning · Knowledge Engineering · LLM
0 likes · 4 min read
AI2ML AI to Machine Learning
Oct 13, 2025 · Artificial Intelligence

How Large‑and‑Small Language Model Collaboration Is Shaping the Future

The article argues that pairing large, high‑capacity models with lightweight, fine‑tuned small models can cut costs, lower latency, and enable specialized vertical tasks, shifting development away from chasing ever‑bigger models and toward optimal system architectures. It outlines key techniques such as state‑space models, knowledge distillation, and staged fine‑tuning.

AI architecture · Large Language Model · efficiency
0 likes · 3 min read
21CTO
Apr 24, 2024 · Artificial Intelligence

Microsoft’s Phi‑3 Mini: The Smallest LLM That Beats GPT‑3.5 on iPhone

Microsoft unveiled the open‑source Phi‑3 series, a lightweight family of large language models that outperform larger rivals, run offline on smartphones, and cost a fraction of comparable AI models, opening new possibilities for edge and mobile AI applications.

LLM · Phi-3 · offline-inference
0 likes · 8 min read