21CTO
Jan 13, 2023 · Artificial Intelligence

How Google’s Muse Is Redefining Text‑to‑Image Generation with Parallel Decoding

Google’s new Muse model, a Transformer‑based text‑to‑image system, claims to generate 256×256 images in 0.5 seconds on a TPUv4—far faster than Imagen—while delivering strong photorealism and fine‑grained language understanding through parallel decoding and conditioning on a frozen large language model.

AI research · Google Muse · LLM conditioning
4 min read