Code DAO
Author


We deliver AI algorithm tutorials and the latest news, curated by a team of researchers from Peking University, Shanghai Jiao Tong University, Central South University, and leading AI companies such as Huawei, Kuaishou, and SenseTime. Join us in the AI alchemy—making life better!

100 Articles · 0 Likes · 0 Views · 0 Comments

Latest from Code DAO

Code DAO
Dec 17, 2021 · Artificial Intelligence

Applying UNETR Transformer for 3D Medical Image Segmentation

This article walks through using the UNETR transformer architecture to segment 3D brain MRI scans from the BRATS dataset, detailing environment setup, data preprocessing with MONAI, model construction, training with DiceCE loss, validation metrics, and visualizing the best‑performing model outputs.

3D segmentation · BRATS · MONAI
0 likes · 16 min read
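The article trains UNETR with MONAI's DiceCE loss; as a hedged illustration of the Dice component of that objective (the function names below are illustrative, not the article's code), a plain-NumPy sketch:

```python
import numpy as np

def dice_coefficient(pred, target, eps=1e-6):
    # Soft Dice over a flattened volume: 2|P ∩ T| / (|P| + |T|),
    # with eps guarding against an empty denominator.
    pred = pred.astype(float).ravel()
    target = target.astype(float).ravel()
    intersection = (pred * target).sum()
    return (2.0 * intersection + eps) / (pred.sum() + target.sum() + eps)

def dice_loss(pred, target):
    # The Dice half of a DiceCE objective; cross-entropy would be added on top.
    return 1.0 - dice_coefficient(pred, target)
```

A perfectly matching prediction gives a coefficient of 1 and a loss near 0, which is why Dice is also reported as the validation metric.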
Code DAO
Dec 16, 2021 · Fundamentals

How Poisson Hidden Markov Models Enable Count‑Based Time‑Series Regression

This article explains how mixing a Poisson process with a discrete k‑state hidden Markov model creates a Poisson HMM that captures autocorrelation in integer‑valued time‑series, detailing the model formulation, prediction via expectation over states, and parameter estimation using MLE or EM.

EM · MLE · Markov model
0 likes · 11 min read
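The likelihood that MLE or EM maximizes for a Poisson HMM comes from the forward recursion, and the one-step forecast is an expectation of the Poisson rate over predicted states. A minimal NumPy sketch of both (function names are illustrative):

```python
import numpy as np
from math import exp, factorial, log

def poisson_pmf(k, lam):
    # P(Y = k) for a Poisson(lam) emission.
    return lam ** k * exp(-lam) / factorial(k)

def forward_loglik(counts, pi, A, lambdas):
    # Scaled forward recursion: alpha_t(j) tracks P(y_1..y_t, state_t = j),
    # renormalized each step for numerical stability.
    pi, A, lambdas = map(np.asarray, (pi, A, lambdas))
    alpha = pi * np.array([poisson_pmf(counts[0], l) for l in lambdas])
    loglik = np.log(alpha.sum())
    alpha = alpha / alpha.sum()
    for y in counts[1:]:
        emit = np.array([poisson_pmf(y, l) for l in lambdas])
        alpha = (alpha @ A) * emit
        loglik += np.log(alpha.sum())
        alpha = alpha / alpha.sum()
    return loglik

def predict_next(alpha, A, lambdas):
    # One-step forecast: E[y_{t+1}] = sum_j P(state_{t+1} = j) * lambda_j.
    return float((np.asarray(alpha) @ np.asarray(A)) @ np.asarray(lambdas))
```

With a single state this collapses to an ordinary Poisson likelihood, which is a handy sanity check before fitting multiple states.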
Code DAO
Dec 15, 2021 · Artificial Intelligence

Should You Monitor Your Machine Learning Models? An Introduction with Evidently AI

The article explains why monitoring production ML models is essential to detect data and target drift, describes the open‑source Evidently AI library and its statistical tests, and demonstrates its use on a weather‑forecast example and a plant‑seedling image classification case, including dashboards, code snippets, and visual analysis of drift impact.

Data Drift · Evidently AI · Model Monitoring
0 likes · 14 min read
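Drift detection for a numeric feature ultimately rests on two-sample statistical tests over the reference and current distributions. As a hedged sketch of the underlying idea (not Evidently's actual API), the Kolmogorov–Smirnov statistic in plain NumPy:

```python
import numpy as np

def ks_statistic(reference, current):
    # Largest gap between the two empirical CDFs; values near 1 mean the
    # current data barely overlaps the reference, i.e. strong drift.
    reference, current = np.sort(reference), np.sort(current)
    grid = np.concatenate([reference, current])
    cdf_ref = np.searchsorted(reference, grid, side="right") / len(reference)
    cdf_cur = np.searchsorted(current, grid, side="right") / len(current)
    return float(np.abs(cdf_ref - cdf_cur).max())
```

Libraries like Evidently wrap tests of this kind per feature and aggregate the results into the drift dashboards the article shows.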
Code DAO
Dec 14, 2021 · Artificial Intelligence

Building a Chess AI from Scratch: Combining AlphaZero and Transformers (Part 2)

This article walks through constructing a learnable chess AI by integrating AlphaZero‑style Monte Carlo Tree Search with a decoder‑only Transformer, detailing the game tree logic, model architecture, input and output encodings, self‑play training loop, and code implementation in PyTorch.

AlphaZero · MonteCarloTreeSearch · PyTorch
0 likes · 23 min read
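The heart of AlphaZero-style MCTS is the PUCT rule that balances the network's value estimates against its move priors during tree descent. A hedged pure-Python sketch of that selection step (the dict layout and constant are assumptions, not the article's code):

```python
import math

def puct_score(q, prior, parent_visits, child_visits, c_puct=1.5):
    # AlphaZero-style selection: value estimate plus a prior-weighted
    # exploration bonus that shrinks as the child accumulates visits.
    return q + c_puct * prior * math.sqrt(parent_visits) / (1 + child_visits)

def select_child(children):
    # children: per-move stats {"q": mean value, "prior": policy prob,
    # "visits": count}. Descend to the move with the highest PUCT score.
    parent_visits = sum(c["visits"] for c in children)
    return max(children,
               key=lambda c: puct_score(c["q"], c["prior"],
                                        parent_visits, c["visits"]))
```

Note how an unvisited move with a strong prior can outrank a frequently visited one, which is what lets the Transformer's policy head steer the search.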
Code DAO
Dec 14, 2021 · Artificial Intelligence

Semantic Search on Wikipedia with Weaviate, GraphQL, Sentence‑BERT, and BERT Q&A

This article walks through building a large‑scale semantic search system on the English Wikipedia using the Weaviate vector database, GraphQL queries, and pre‑trained Sentence‑BERT and BERT Q&A models, covering dataset preparation, schema design, import pipelines, query examples, and production deployment strategies.

GraphQL · Semantic Search · Sentence-BERT
0 likes · 8 min read
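At query time, the core operation Weaviate performs at scale is nearest-neighbour ranking of Sentence-BERT embeddings by cosine similarity. A minimal NumPy sketch of that ranking step (toy vectors, not the article's pipeline):

```python
import numpy as np

def top_k_by_cosine(query_vec, doc_vecs, k=3):
    # Normalize both sides, then a dot product equals cosine similarity;
    # argsort on the negated scores gives indices of the best matches first.
    q = query_vec / np.linalg.norm(query_vec)
    d = doc_vecs / np.linalg.norm(doc_vecs, axis=1, keepdims=True)
    scores = d @ q
    return np.argsort(-scores)[:k]
```

A vector database replaces this brute-force scan with an approximate index so the same query stays fast over millions of Wikipedia paragraphs.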
Code DAO
Dec 13, 2021 · Artificial Intelligence

A Comprehensive Guide to Ensemble Learning: Bagging, Boosting, and Stacking

This article explains the core concepts of ensemble learning, covering the bias‑variance trade‑off, the mechanics of bagging with bootstrap and random forests, the sequential strategies of boosting (AdaBoost and gradient boosting), and the heterogeneous stacking framework with meta‑models and multi‑layer extensions.

bagging · boosting · ensemble learning
0 likes · 20 min read
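Bagging reduces variance through two mechanical steps: resample the training set with replacement, then aggregate the base learners' outputs. A hedged stdlib-only sketch of those two steps (function names are illustrative):

```python
import random
from collections import Counter

def bootstrap_sample(data, rng):
    # Draw len(data) points with replacement: the "bootstrap" in bagging.
    # Each base learner trains on its own such sample.
    return [rng.choice(data) for _ in data]

def majority_vote(predictions):
    # Bagging's aggregation step for classification; regression
    # would average instead.
    return Counter(predictions).most_common(1)[0][0]
```

Random forests add one more twist on top of this: each split also considers only a random subset of features, further decorrelating the trees.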
Code DAO
Dec 12, 2021 · Artificial Intelligence

How to Boost Text Analysis Accuracy on a 2‑Billion‑Word Corpus

This article explains practical techniques for improving NLP model accuracy on massive corpora, covering challenges of multi‑field text, word‑embedding choices, a fasttext‑based regression demo with book‑review data, feature engineering tricks, and a comparison with tf‑idf + LASSO.

NLP · Python · fasttext
0 likes · 13 min read
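The tf-idf + LASSO baseline the article compares against starts from a simple weighting: term frequency scaled down by how many documents contain the term. A toy NumPy sketch of that weighting over whitespace tokens (real pipelines would use a proper vectorizer; this just shows the arithmetic):

```python
import numpy as np

def tfidf_matrix(docs):
    # Build vocab, count term frequencies per document, then weight each
    # column by idf = log(N / df): terms in every document score zero.
    vocab = sorted({w for d in docs for w in d.split()})
    index = {w: i for i, w in enumerate(vocab)}
    tf = np.zeros((len(docs), len(vocab)))
    for row, doc in enumerate(docs):
        for w in doc.split():
            tf[row, index[w]] += 1.0
    df = (tf > 0).sum(axis=0)
    idf = np.log(len(docs) / df)
    return tf * idf, vocab
```

On a 2-billion-word corpus the resulting matrix is huge and sparse, which is part of why dense fasttext embeddings become attractive.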
Code DAO
Dec 12, 2021 · Artificial Intelligence

Lightning Flash 0.3 Introduces New Tasks, Visualization Tools, Data Pipelines, and Registry API

Lightning Flash 0.3 expands the PyTorch Lightning ecosystem with eight new computer‑vision and NLP tasks, modular API design, integrated model hubs, visualization callbacks, customizable data‑source hooks, and a central registry for model backbones, all illustrated with concrete code examples.

Computer Vision · Deep Learning · Lightning Flash
0 likes · 7 min read
Code DAO
Dec 11, 2021 · Artificial Intelligence

Nimble: A Lightweight Parallel GPU Scheduler Boosting Deep Learning Performance

The article analyzes how Nimble reduces GPU scheduling overhead and enables parallel execution through ahead‑of‑time scheduling and automatic multi‑stream assignment, achieving up to 22.3× inference speedup over PyTorch and significantly improving GPU utilization for deep learning workloads.

Deep Learning · GPU scheduling · ahead-of-time
0 likes · 9 min read
Code DAO
Dec 11, 2021 · Artificial Intelligence

How to Optimize Machine Learning Hyperparameters with GridSearchCV

This article explains how GridSearchCV automates hyperparameter tuning for machine‑learning models, demonstrates its use with a RandomForest classifier on the breast‑cancer dataset (including code, cross‑validation, and best‑parameter results), and discusses its advantages and scalability limits.

GridSearchCV · RandomForest · cross-validation
0 likes · 6 min read
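What GridSearchCV automates is an exhaustive sweep: score every combination in the parameter grid and keep the best. A stdlib-only sketch of that loop (the scoring callback stands in for the cross-validation sklearn would run; names are illustrative):

```python
from itertools import product

def grid_search(param_grid, score_fn):
    # Enumerate the Cartesian product of all parameter values, evaluate
    # each combination with score_fn, and return the best one found.
    keys = list(param_grid)
    best_params, best_score = None, float("-inf")
    for values in product(*(param_grid[k] for k in keys)):
        params = dict(zip(keys, values))
        score = score_fn(**params)
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score
```

The scalability limit discussed in the article is visible here: the loop body runs once per combination, so the cost multiplies with every added parameter and value.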