JavaEdge
Jul 22, 2024 · Artificial Intelligence

What Is a Transformer and Why It’s Transforming AI?

This article explains the fundamentals of transformer models: why they outperform earlier neural networks, how core components such as self‑attention and positional encoding work, practical use cases spanning language and biology, and how transformers differ from RNNs, CNNs, and other architectures.

AI · Self-attention · Sequence-to-Sequence
0 likes · 20 min read
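
As a companion to the abstract above, here is a minimal NumPy sketch of scaled dot‑product self‑attention, the core component the article describes; the projection matrices, dimensions, and random inputs are purely illustrative and not taken from the article.

```python
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """x: (seq_len, d_model); w_q, w_k, w_v: (d_model, d_k) projection matrices."""
    q, k, v = x @ w_q, x @ w_k, x @ w_v                # project tokens to queries, keys, values
    scores = q @ k.T / np.sqrt(k.shape[-1])            # pairwise similarities, scaled by sqrt(d_k)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)     # softmax: each row sums to 1
    return weights @ v                                 # every token becomes a mix of all value vectors

rng = np.random.default_rng(0)
seq_len, d_model, d_k = 5, 16, 8
x = rng.normal(size=(seq_len, d_model))                # five toy token embeddings
w_q, w_k, w_v = (rng.normal(size=(d_model, d_k)) for _ in range(3))
print(self_attention(x, w_q, w_k, w_v).shape)          # (5, 8): one context-mixed vector per token
```

Because each output row mixes all value vectors at once, every token can attend to every other token in a single step, which is the key contrast with the step-by-step recurrence of an RNN.
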
Alibaba Cloud Developer
Jun 27, 2019 · Artificial Intelligence

Generating Personalized E‑commerce Review Replies with Product Information

This paper presents a sequence‑to‑sequence model that fuses product‑detail tables with customer comments, using gated multimodal attention, copy mechanisms, and reinforcement learning to automatically produce high‑quality, context‑aware replies for e‑commerce platforms, and validates the approach with extensive experiments on a large Taobao dataset.

Sequence-to-Sequence · copy mechanism · e‑commerce
0 likes · 21 min read
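
The copy mechanism named in the abstract comes down to how the final word distribution is formed at each decoding step. The sketch below shows that mixing step in NumPy, in the pointer‑generator style; the toy numbers stand in for values the real model would compute from its decoder state and are not from the paper.

```python
import numpy as np

def copy_mix(p_vocab, attention, source_ids, p_gen):
    """p_vocab: (V,) generation distribution over the vocabulary;
    attention: (src_len,) attention weights over source tokens (comment + product table);
    source_ids: (src_len,) vocabulary ids of those source tokens;
    p_gen: scalar in [0, 1], probability of generating rather than copying."""
    final = p_gen * p_vocab                                    # mass from generating new words
    np.add.at(final, source_ids, (1.0 - p_gen) * attention)    # plus mass copied from the source
    return final

vocab_size = 10
p_vocab = np.full(vocab_size, 1.0 / vocab_size)    # toy: uniform generation distribution
attention = np.array([0.7, 0.2, 0.1])              # decoder mostly attends to the first source token
source_ids = np.array([3, 5, 3])                   # vocabulary ids of the three source tokens
dist = copy_mix(p_vocab, attention, source_ids, p_gen=0.6)
print(dist.round(3), dist.sum())                   # still sums to 1.0
```

Copying lets the reply reuse rare but important tokens (product names, attribute values from the detail table) that a pure generation model would likely miss.
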
Alibaba Cloud Developer
Aug 29, 2018 · Artificial Intelligence

How Alibaba’s Tmall Genie Achieves Human‑AI Co‑Creation in Poetry

Alibaba’s AI Lab explains how its Tmall Genie poetry system combines deep‑learning models, beam‑search recommendations, and a unified evaluation network to let users collaboratively generate acrostic love poems, showcasing the potential of human‑AI co‑creation in natural language generation.

AI poetry · Alibaba AI Lab · Sequence-to-Sequence
0 likes · 7 min read
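
Beam search, which the abstract credits for the recommendation step, keeps the top‑k partial candidates at every step instead of committing to a single greedy choice. The toy Python sketch below illustrates the idea with an invented next‑token model; it is not the Tmall Genie model itself.

```python
import numpy as np

def beam_search(next_log_probs, beam_width, max_len):
    """next_log_probs(prefix) -> {token: log_prob} for the next token given a prefix."""
    beams = [((), 0.0)]                                        # (sequence, cumulative log-prob)
    for _ in range(max_len):
        candidates = [
            (seq + (tok,), score + lp)
            for seq, score in beams
            for tok, lp in next_log_probs(seq).items()
        ]
        beams = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_width]
    return beams

def toy_model(prefix):
    # Invented stand-in for the real line generator: slightly prefers repeating the last word.
    vocab = ["spring", "moon", "river"]
    probs = {w: 1.0 for w in vocab}
    if prefix:
        probs[prefix[-1]] += 0.5
    total = sum(probs.values())
    return {w: np.log(p / total) for w, p in probs.items()}

for seq, score in beam_search(toy_model, beam_width=2, max_len=3):
    print(" ".join(seq), round(score, 3))
```

Keeping several candidate lines alive is what allows the system to surface a few diverse poem continuations for the user to pick from, rather than a single greedy completion.
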
Alibaba Cloud Developer
Aug 17, 2018 · Artificial Intelligence

Can Multi‑Task Learning Shorten E‑Commerce Titles Without Losing Sales?

This paper proposes a multi‑task learning approach that compresses overly long e‑commerce product titles into concise short titles using a Pointer Network, while simultaneously generating user search queries with an attention‑based encoder‑decoder, achieving higher readability, informativeness, and conversion rates than traditional methods.

Attention Mechanism · Multi-Task Learning · Sequence-to-Sequence
0 likes · 11 min read
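
A Pointer Network, as used here for title compression, scores the words of the source title and selects which ones to keep rather than generating from a fixed vocabulary. The sketch below shows one pointing step with randomly initialised weights, purely to illustrate the shape of the computation; none of the parameters come from the paper.

```python
import numpy as np

def pointer_step(decoder_state, encoder_states, w1, w2, v):
    """One pointing step: score each source position with v^T tanh(W1*enc_i + W2*dec),
    then softmax over positions. The output is a distribution over source words,
    which is what lets the model copy title words verbatim into the short title."""
    scores = np.tanh(encoder_states @ w1 + decoder_state @ w2) @ v     # (src_len,)
    weights = np.exp(scores - scores.max())
    return weights / weights.sum()

rng = np.random.default_rng(0)
src_len, d = 6, 8                                  # e.g. a six-word product title
encoder_states = rng.normal(size=(src_len, d))     # toy encoder outputs, one per source word
decoder_state = rng.normal(size=(d,))              # toy decoder state at the current step
w1, w2, v = rng.normal(size=(d, d)), rng.normal(size=(d, d)), rng.normal(size=(d,))
weights = pointer_step(decoder_state, encoder_states, w1, w2, v)
print(weights.argmax())                            # index of the source word to copy next
```

In the multi‑task setup described above, this compression decoder shares its encoder with an attention‑based query‑generation decoder, so both tasks push the encoder toward representations that highlight the most search‑relevant title words.
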