Tag: regularization


Rare Earth Juejin Tech Community
May 5, 2024 · Artificial Intelligence

Comprehensive Guide to Neural Network Algorithms: Definitions, Structure, Implementation, and Training

This article provides an in‑depth tutorial on neural network algorithms, covering their biological inspiration, significance, advantages and drawbacks, detailed architecture, data preparation, one‑hot encoding, weight initialization, forward and backward propagation, cost functions, regularization, gradient checking, and complete Python code examples.

AI · Python · backpropagation
37 min read
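
The article's gradient-checking section can be previewed with a minimal NumPy sketch (the function names and tolerance here are illustrative, not the article's own code): compare an analytic gradient against a central-difference estimate before trusting a backpropagation implementation.

```python
import numpy as np

def numerical_gradient(f, theta, eps=1e-5):
    """Central-difference estimate of the gradient, one parameter at a time."""
    grad = np.zeros_like(theta)
    for i in range(theta.size):
        orig = theta[i]
        theta[i] = orig + eps
        f_plus = f(theta)
        theta[i] = orig - eps
        f_minus = f(theta)
        theta[i] = orig  # restore the parameter
        grad[i] = (f_plus - f_minus) / (2.0 * eps)
    return grad

# Check an analytic gradient: for J(theta) = sum(theta^2), dJ/dtheta = 2*theta
theta = np.array([1.0, -2.0, 3.0])
analytic = 2.0 * theta
numeric = numerical_gradient(lambda t: np.sum(t ** 2), theta)
rel_error = np.linalg.norm(analytic - numeric) / np.linalg.norm(analytic + numeric)
```

A relative error around 1e-7 or smaller is the usual sign that backpropagation is computing gradients correctly.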
Rare Earth Juejin Tech Community
Apr 7, 2024 · Artificial Intelligence

Logistic Regression: Definition, Purpose, Structure, Implementation, and Regularization

This article explains logistic regression as a classification algorithm, covering its definition, purpose, mathematical structure, data preparation, core functions such as sigmoid, cost, gradient descent, prediction, model evaluation, decision boundary visualization, feature mapping, and regularization techniques, all illustrated with Python code examples.

Python · classification · gradient descent
33 min read
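
The core functions the article covers (sigmoid, regularized cost, gradient descent) fit in a short sketch; this is an illustrative NumPy version on a hypothetical toy dataset, not the article's code.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost(theta, X, y, lam=0.0):
    """Cross-entropy cost with L2 regularization; the bias theta[0] is not penalized."""
    m = len(y)
    h = sigmoid(X @ theta)
    ce = -np.mean(y * np.log(h) + (1.0 - y) * np.log(1.0 - h))
    return ce + lam / (2.0 * m) * np.sum(theta[1:] ** 2)

def gradient(theta, X, y, lam=0.0):
    m = len(y)
    g = X.T @ (sigmoid(X @ theta) - y) / m
    g[1:] += lam / m * theta[1:]
    return g

# Toy 1-D problem: label is 1 when the feature exceeds ~2.5
X = np.column_stack([np.ones(6), np.arange(6.0)])
y = np.array([0.0, 0.0, 0.0, 1.0, 1.0, 1.0])
theta = np.zeros(2)
c_start = cost(theta, X, y, lam=0.1)  # ln 2 at theta = 0
for _ in range(500):
    theta -= 0.1 * gradient(theta, X, y, lam=0.1)
c_end = cost(theta, X, y, lam=0.1)
```

After a few hundred gradient steps the regularized cost drops well below its starting value of ln 2 and the learned slope is positive, matching the data.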
Rare Earth Juejin Tech Community
Apr 5, 2024 · Artificial Intelligence

Linear Regression Algorithm: Definition, Structure, Implementation, Cost Function, Gradient Descent, and Regularization

This article provides a comprehensive overview of linear regression, covering its definition, purpose, algorithmic steps, data preparation, feature scaling, parameter initialization, cost function computation, gradient descent optimization, visualization, normal equation solution, and regularization, accompanied by detailed Python code examples.

NumPy · Python · cost function
19 min read
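
The gradient-descent and normal-equation solvers the article compares can be sketched in a few lines of NumPy (an illustrative example, not the article's listing): on noise-free data both should recover the same parameters.

```python
import numpy as np

def gradient_descent(X, y, alpha=0.1, iters=1000):
    """Batch gradient descent on the mean-squared-error cost."""
    m = len(y)
    theta = np.zeros(X.shape[1])
    for _ in range(iters):
        theta -= alpha * X.T @ (X @ theta - y) / m
    return theta

# y = 1 + 2x with no noise, so both solvers should recover [1, 2]
X = np.column_stack([np.ones(5), np.arange(5.0)])
y = 1.0 + 2.0 * np.arange(5.0)
theta_gd = gradient_descent(X, y)
theta_ne = np.linalg.solve(X.T @ X, X.T @ y)  # normal equation: (X'X) theta = X'y
```

The normal equation is exact in one step but scales poorly with feature count; gradient descent needs a tuned learning rate but handles large designs.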
DataFunSummit
Feb 14, 2023 · Artificial Intelligence

Deep Learning Hyperparameter Tuning and Training Tips: Insights from Zhihu Experts

This article compiles practical deep learning training and hyperparameter tuning advice from Zhihu contributors, covering model debugging, learning‑rate strategies, optimizer choices, data preprocessing, regularization techniques, initialization methods, common pitfalls, recommended research papers, and ensemble approaches.

deep learning · gradient clipping · hyperparameter tuning
13 min read
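
One of the tagged techniques, gradient clipping by global norm, is simple enough to sketch here (a generic NumPy illustration, not taken from the compiled answers):

```python
import numpy as np

def clip_by_global_norm(grads, max_norm):
    """Rescale a list of gradient arrays so their combined L2 norm is at most max_norm."""
    total = np.sqrt(sum(np.sum(g ** 2) for g in grads))
    scale = min(1.0, max_norm / (total + 1e-12))  # small eps avoids division by zero
    return [g * scale for g in grads], total

# A gradient of norm 5 gets rescaled to norm 1; smaller gradients pass through unchanged.
clipped, norm = clip_by_global_norm([np.array([3.0, 4.0])], max_norm=1.0)
```

Clipping the global norm (rather than each element) preserves the gradient's direction, which is why it is the usual remedy for exploding gradients in recurrent networks.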
Model Perspective
Jan 15, 2023 · Artificial Intelligence

Mastering Model Evaluation: Key Metrics, Validation Techniques, and Diagnostics

This guide explains essential evaluation metrics for classification and regression models—including confusion matrix, ROC/AUC, R², and main performance indicators—covers model selection strategies such as train‑validation‑test splits, k‑fold cross‑validation, and regularization techniques, and discusses bias‑variance trade‑offs and diagnostic tools.

cross-validation · evaluation metrics · machine learning
6 min read
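
The k-fold cross-validation scheme the guide describes amounts to partitioning the data into k folds and rotating which fold is held out; a minimal NumPy sketch (function name and seed are illustrative):

```python
import numpy as np

def kfold_indices(n, k, seed=0):
    """Yield (train_idx, val_idx) index pairs for k-fold cross-validation."""
    rng = np.random.default_rng(seed)
    folds = np.array_split(rng.permutation(n), k)
    for i in range(k):
        train = np.concatenate([f for j, f in enumerate(folds) if j != i])
        yield train, folds[i]

splits = list(kfold_indices(10, 5))  # 5 folds over 10 samples
```

Every sample appears in exactly one validation fold, so averaging the k validation scores uses all the data while keeping each evaluation out-of-sample.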
Model Perspective
Jul 22, 2022 · Artificial Intelligence

Understanding Ridge Regression: Definitions, Properties, and Parameter Selection

This article explains ridge regression by defining the estimator, outlining its key properties, discussing methods for choosing the ridge parameter, and demonstrating its application to economic data with Python code and visualizations.

Python · linear models · parameter selection
6 min read
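
The ridge estimator the article defines, beta = (X'X + lam I)^(-1) X'y, is a one-liner in NumPy; this sketch on synthetic data (not the article's economic dataset) shows the characteristic shrinkage as lam grows.

```python
import numpy as np

def ridge_estimator(X, y, lam):
    """Ridge estimate: solve (X'X + lam*I) beta = X'y."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
beta_true = np.array([2.0, -1.0, 0.5])
y = X @ beta_true + rng.normal(scale=0.1, size=50)

b_ols = ridge_estimator(X, y, 0.0)     # lam = 0 reduces to ordinary least squares
b_ridge = ridge_estimator(X, y, 10.0)  # larger lam shrinks the coefficient vector
```

The norm of the ridge estimate decreases monotonically in lam, trading a little bias for lower variance, which is exactly the property that makes it useful under multicollinearity.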
Model Perspective
Jul 21, 2022 · Artificial Intelligence

Tackling Multicollinearity: Ridge and LASSO Regression Explained with Python

This article explains how multicollinearity undermines ordinary least squares estimates, introduces ridge and LASSO regularization as remedies, and demonstrates their application on a 1966 French economic dataset using Python’s statsmodels, complete with code and interpretation of results.

LASSO · Python · econometrics
7 min read
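
The key difference between the two remedies can be shown without the article's statsmodels workflow: LASSO's L1 penalty acts through soft-thresholding, which zeroes out small coefficients, while ridge only rescales them. An illustrative sketch:

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of the L1 penalty: shrink toward zero, set small values exactly to 0."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

# With an orthonormal design, the LASSO solution is the soft-thresholded OLS estimate.
beta_ols = np.array([2.5, -0.3, 0.05])
beta_lasso = soft_threshold(beta_ols, 0.5)  # small coefficients are dropped entirely
```

This exact-zero behavior is why LASSO performs variable selection under multicollinearity, whereas ridge keeps every predictor with a shrunken weight.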
DataFunTalk
Dec 4, 2021 · Artificial Intelligence

Practical Deep Learning Training Tricks: Cyclic LR, Flooding, Warmup, RAdam, Adversarial Training, Focal Loss, Dropout, Normalization and More

This article compiles essential deep learning training techniques—including cyclic learning rates, flooding, warmup, RAdam optimizer, adversarial training, focal loss, dropout, batch/group/weight normalization, label smoothing, Wasserstein GAN, skip connections, and weight initialization—providing concise explanations and code snippets for each method.

Optimization · deep learning · neural networks
11 min read
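
Of the tricks listed, label smoothing has perhaps the shortest implementation; as a generic NumPy illustration (not the article's snippet), hard one-hot targets are mixed with a uniform distribution:

```python
import numpy as np

def smooth_labels(y_onehot, eps=0.1):
    """Label smoothing: blend one-hot targets with a uniform distribution over k classes."""
    k = y_onehot.shape[-1]
    return y_onehot * (1.0 - eps) + eps / k

# Two one-hot rows over 3 classes become soft targets that still sum to 1
y_smooth = smooth_labels(np.eye(3)[[0, 2]], eps=0.1)
```

The softened targets discourage the network from driving logits to extremes, a mild regularizer that often improves calibration.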
DataFunTalk
Aug 10, 2021 · Artificial Intelligence

Practical Deep Learning Tricks: Cyclic LR, Flooding, Warmup, RAdam, Adversarial Training, Focal Loss, Dropout, Normalization, ReLU, Group Normalization, Label Smoothing, Wasserstein GAN, Skip Connections, Weight Initialization

This article presents a concise collection of practical deep‑learning techniques—including cyclic learning‑rate, flooding, warmup, RAdam, adversarial training, focal loss, dropout, various normalization methods, ReLU, group normalization, label smoothing, Wasserstein GAN, skip connections, and weight initialization—along with code snippets and references for implementation.

GAN · adversarial training · deep learning
8 min read
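
The warmup technique covered here is often paired with cosine decay; a minimal schedule function (an illustrative sketch, with hypothetical step counts, rather than the article's code):

```python
import math

def warmup_cosine_lr(step, total_steps, warmup_steps, base_lr):
    """Linear warmup to base_lr, then cosine decay toward zero."""
    if step < warmup_steps:
        return base_lr * (step + 1) / warmup_steps
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))

# Schedule over 100 steps: ramp up during the first 10, then decay smoothly
lrs = [warmup_cosine_lr(s, total_steps=100, warmup_steps=10, base_lr=0.1) for s in range(100)]
```

Warmup keeps early updates small while adaptive-optimizer statistics stabilize; the cosine tail then anneals the rate without a hand-tuned step schedule.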
DataFunTalk
Apr 5, 2021 · Artificial Intelligence

Summary of Methods and Findings from the NLP Chinese Pre‑training Model Generalization Challenge

The article reviews the Chinese NLP pre‑training model generalization competition, detailing data preprocessing, augmentation, external data usage, model scaling and architecture tweaks, loss functions, learning‑rate and adversarial training strategies, regularization techniques, post‑processing optimizations, and ineffective methods, highlighting their impact on performance metrics.

NLP · Pretraining · data augmentation
15 min read
Architects' Tech Alliance
Sep 3, 2020 · Artificial Intelligence

Deep Learning Specialization Infographic Overview

This article presents a comprehensive English summary of the deep learning specialization infographics originally shared by Andrew Ng, covering fundamentals, logistic regression, shallow and deep neural networks, regularization, optimization, hyperparameters, convolutional and recurrent networks, and practical advice for model building and evaluation.

CNN · Optimization · RNN
21 min read
Beike Product & Technology
Mar 21, 2019 · Artificial Intelligence

Optimization Foundations and Applications in Machine Learning and Computer Vision

This article introduces how machine learning problems are formulated as optimization tasks, explains the construction of objective functions with examples such as linear regression, robust fitting, regularization, and demonstrates various applications ranging from K‑means clustering to image inpainting and 3D reconstruction.

Optimization · computer vision · linear regression
9 min read
Qunar Tech Salon
Oct 10, 2018 · Artificial Intelligence

Introduction to Lasso Regression with scikit-learn

This article provides a comprehensive guide to Lasso regression, covering its theoretical background, scikit-learn API parameters, step‑by‑step Python implementation, cross‑validation for hyperparameter tuning, visualization of predictions, and a discussion of its advantages over ridge regression.

Data Visualization · Python · cross-validation
6 min read
Qunar Tech Salon
Oct 9, 2018 · Artificial Intelligence

Ridge Regression with scikit-learn: Theory, Implementation, and Example

This article introduces Ridge regression, explains its theory and regularization role, discusses overfitting and bias‑variance trade‑offs, presents scikit‑learn parameters, and provides a complete Python example from data loading to model training, evaluation, and optimal alpha selection.

Python · machine learning · regression
7 min read
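
The alpha-selection workflow the article walks through can be sketched with scikit-learn's `Ridge` on synthetic data (the data, candidate grid, and split here are illustrative, not the article's example):

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

# Synthetic regression problem with a known coefficient vector
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = X @ np.array([1.5, -2.0, 0.0, 0.5, 3.0]) + rng.normal(scale=0.5, size=200)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, random_state=0)

# Pick the alpha with the best held-out R^2
best_alpha, best_score = None, -np.inf
for alpha in [0.01, 0.1, 1.0, 10.0, 100.0]:
    score = Ridge(alpha=alpha).fit(X_tr, y_tr).score(X_val, y_val)  # score() returns R^2
    if score > best_score:
        best_alpha, best_score = alpha, score
print(best_alpha, round(best_score, 3))
```

In practice `RidgeCV` automates this grid search with cross-validation rather than a single held-out split.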