Tag: seq2seq

Tencent Technical Engineering
Apr 16, 2025 · Artificial Intelligence

Understanding Transformer Architecture for Chinese‑English Translation: A Practical Guide

This practical guide walks through the full Transformer architecture for Chinese‑to‑English translation, detailing encoder‑decoder structure, tokenization and embeddings, batch handling with padding and masks, positional encodings, parallel teacher‑forcing, self‑ and multi‑head attention, and the complete forward and back‑propagation training steps.

Embedding · Positional Encoding · PyTorch
26 min read
Baidu Geek Talk
Oct 26, 2022 · Artificial Intelligence

Exploring Automatic Advertising Copy Generation: Techniques, Practices, and Future Directions

The article surveys automatic advertising copy generation, detailing why optimization is needed, the fundamentals of neural text generation with Seq2Seq and attention, extractive versus abstractive approaches, modern embeddings and MASS pre‑training, practical data and evaluation methods, and future enhancements such as multi‑stage attention, knowledge integration, and large pre‑trained models.

AI · MASS · NLG
21 min read
Youku Technology
Feb 28, 2022 · Artificial Intelligence

Seq2Path: Generating Sentiment Tuples as Paths of a Tree

Seq2Path treats each sentiment tuple as an independent path in a tree, training with an averaged per‑path loss and decoding via constrained beam search with a discriminative token; it achieves state‑of‑the‑art results on five aspect‑based sentiment analysis datasets and has been deployed in Alibaba Entertainment's AI Brain.

ACL · Information Extraction · Natural Language Processing
3 min read
58 Tech
Oct 12, 2021 · Artificial Intelligence

Seq2Seq Approaches for Phone Number Extraction from Two‑Speaker Voice Dialogues

This article presents a practical study of extracting phone numbers from two‑speaker voice dialogues using Seq2Seq models, including LSTM, GRU with attention and feature fusion, and Transformer variants. It details the data characteristics, model architectures, training strategies, and experimental results, with a comparative analysis showing that the GRU‑with‑attention approach achieves the best performance.

Attention · GRU · LSTM
13 min read
JD Tech
Jun 17, 2021 · Artificial Intelligence

MTrajRec: Map-Constrained Trajectory Recovery via Seq2Seq Multi‑Task Learning

The paper introduces MTrajRec, a Seq2Seq multi‑task learning framework that simultaneously restores low‑sampling‑rate GPS trajectories to a high sampling rate and aligns them to the road network, achieving more accurate and efficient trajectory recovery for downstream applications such as navigation and travel‑time estimation.

KDD 2021 · deep learning · map matching
9 min read
58 Tech
Mar 5, 2021 · Artificial Intelligence

Intelligent Job Title Generation with Pipeline and Seq2Seq Approaches

This article presents a comprehensive study on generating recruitment job titles by combining a rule‑based pipeline with advanced seq2seq models—including BiLSTM‑Attention, Pointer‑Generator, and a Field‑Gate Dual‑Attention architecture—demonstrating significant performance gains on real‑world hiring data.

NLP · deep learning · job title
14 min read
New Oriental Technology
Feb 1, 2021 · Artificial Intelligence

Neural Machine Translation: Seq2Seq, Beam Search, BLEU, Attention Mechanisms, and GNMT Improvements

This article explains key concepts of neural machine translation, covering Seq2Seq encoder‑decoder models, beam search strategies, BLEU evaluation, various attention mechanisms, and the enhancements introduced in Google's Neural Machine Translation system to improve speed, OOV handling, and translation quality.

Attention · BLEU · GNMT
11 min read
DataFunSummit
Dec 18, 2020 · Artificial Intelligence

Complex Semantic Representation in Voice Assistants: NLP Layers, DIS Limitations, and the CMRL Schema

This article explains how voice assistants rely on a three‑layer NLP pipeline (lexical, syntactic, and semantic analysis), discusses the shortcomings of the traditional DIS (Domain‑Intent‑Slot) structure for complex commands, and introduces the hierarchical CMRL schema along with two neural models (copy‑write seq2seq and seq2tree) for converting natural language into structured logical expressions.

CMRL · NLP · semantic parsing
14 min read
New Oriental Technology
Nov 23, 2020 · Artificial Intelligence

A Seq2Seq Deep Learning Approach for Recognizing Mathematical Formulas in Images

This article presents a deep‑learning Seq2Seq model that converts images of mathematical formulas—including matrices, equations, fractions, and radicals—into LaTeX sequences with over 95% accuracy, detailing data preparation, LaTeX normalization, model architecture, training, inference, and post‑processing techniques.

Formula Recognition · LaTeX · OCR
9 min read
Sohu Tech Products
Nov 18, 2020 · Artificial Intelligence

Understanding Sequence‑to‑Sequence (seq2seq) Models and Attention Mechanisms

This article explains the fundamentals of seq2seq neural machine translation models, covering encoder‑decoder architecture, word embeddings, context vectors, RNN processing, and the attention mechanism introduced by Bahdanau and Luong, with visual illustrations and reference links for deeper study.

Attention · RNN · deep learning
11 min read
DataFunTalk
Aug 3, 2020 · Artificial Intelligence

Advances in Sequence‑to‑Sequence Text Generation: Attention, Pointer, Copy, and Transformer Models

This article reviews the evolution of encoder‑decoder based text generation, covering classic seq2seq with attention, pointer networks, copy mechanisms, knowledge‑enhanced models, convolutional approaches, and the latest Transformer‑based pre‑training such as MASS, highlighting their architectures, key innovations, and practical considerations.

NLP · Transformer · copy mechanism
17 min read
58 Tech
Mar 13, 2019 · Artificial Intelligence

Design and Implementation of the 58.com Intelligent Article Writing Robot

The article describes the design, workflow, and two‑stage model improvements of 58.com's intelligent writing robot, which combines template matching, seq2seq with attention and beam search, and slot‑replacement techniques to automatically generate titles and body content for real‑estate and used‑car promotions, achieving high publishing volume and readership.

AI writing · BLEU · BeamSearch
9 min read
Qunar Tech Salon
Mar 1, 2018 · Artificial Intelligence

Open-Domain Chatbot Implementation: Retrieval and Generative Approaches

This article explains how open‑domain chatbots are implemented for customer service, comparing retrieval‑based and generative seq2seq approaches, describing hybrid methods that first attempt constrained retrieval before falling back to generation, and discussing training data, keyword extraction, and performance observations.

AI · Chatbot · Customer Service
6 min read