How the Mixture-of-Queries Transformer Tackles Camouflaged Instance Segmentation
This IJCAI 2025 paper introduces the Mixture‑of‑Queries Transformer (MoQT), a model that combines frequency‑domain feature enhancement with collaborative query decoding to achieve state‑of‑the‑art camouflaged instance segmentation across multiple datasets.
The International Joint Conference on Artificial Intelligence (IJCAI) 2025 will be held in Montreal, Canada; the conference received 5,404 submissions and accepted 19.3% of them. This article highlights one accepted paper, "Mixture‑of‑Queries Transformer (MoQT)", which tackles camouflaged instance segmentation.
MoQT introduces two key components: a frequency‑enhancement feature extractor that captures camouflage cues in the frequency domain, strengthening object contours while suppressing misleading color information; and a mixture‑of‑queries decoder that employs multiple freshly initialized query experts per layer, which cooperate to uncover camouflage features and produce refined instance masks.
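To make the frequency‑enhancement idea concrete, here is a minimal PyTorch sketch. The paper's exact extractor design is not reproduced here; the layer name, the learnable spectral gain, and the residual fusion below are illustrative assumptions showing the general pattern of reweighting a feature map's spectrum so that contour‑carrying frequencies can be amplified and misleading smooth color regions damped.

```python
# Minimal sketch of frequency-domain feature enhancement (assumed design,
# not the paper's exact module). A learnable gain reweights each spectral
# location of a backbone feature map; the result is fused back residually.
import torch
import torch.nn as nn

class FrequencyEnhancer(nn.Module):
    def __init__(self, channels: int, h: int, w: int):
        super().__init__()
        # One learnable gain per spectral bin (rfft2 keeps w // 2 + 1 columns).
        self.spectral_gain = nn.Parameter(torch.ones(channels, h, w // 2 + 1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (B, C, H, W) spatial features from the backbone.
        freq = torch.fft.rfft2(x, norm="ortho")        # complex spectrum
        freq = freq * self.spectral_gain               # reweight frequencies
        enhanced = torch.fft.irfft2(freq, s=x.shape[-2:], norm="ortho")
        return x + enhanced                            # residual dual-domain fusion

# Usage: enhance a 256-channel 64x64 feature map.
feats = torch.randn(2, 256, 64, 64)
enhancer = FrequencyEnhancer(channels=256, h=64, w=64)
out = enhancer(feats)  # (2, 256, 64, 64)
```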
By coupling spatial and frequency cues, MoQT outperforms 19 state‑of‑the‑art methods on the COD10K and NC4K datasets.
Key Highlights
Frequency Magic: A novel dual‑domain feature fusion framework that breaks the “invisibility cloak” of camouflaged objects.
Collaborative Decoding: Dynamic cooperation among multiple query experts yields precise pixel‑level segmentation and boundary refinement (a code sketch of this idea follows this list).
General Potential: The technique can be extended to deep‑fake detection, medical image analysis, and other camouflage‑related scenarios.
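For the collaborative‑decoding highlight above, here is a hypothetical sketch of one mixture‑of‑queries decoder layer, assuming a Mask2Former‑style query pipeline. The expert count, the shared cross‑attention, and the softmax mixing weights are assumptions for illustration, not the authors' exact architecture.

```python
# Hypothetical mixture-of-queries decoder layer (assumed design). Each
# expert contributes its own freshly initialized query set; the experts'
# attended outputs are merged with learned cooperation weights.
import torch
import torch.nn as nn

class MixtureOfQueriesLayer(nn.Module):
    def __init__(self, dim: int, num_experts: int = 4,
                 num_queries: int = 100, heads: int = 8):
        super().__init__()
        # One freshly initialized query set per expert, as the paper describes.
        self.expert_queries = nn.Parameter(torch.randn(num_experts, num_queries, dim))
        self.cross_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.mix_logits = nn.Parameter(torch.zeros(num_experts))  # cooperation weights

    def forward(self, prev_queries: torch.Tensor, feats: torch.Tensor) -> torch.Tensor:
        # prev_queries: (B, Q, D) from the previous layer; feats: (B, HW, D).
        B = prev_queries.size(0)
        outputs = []
        for q in self.expert_queries:                    # (Q, D) per expert
            q = prev_queries + q.unsqueeze(0).expand(B, -1, -1)
            attended, _ = self.cross_attn(q, feats, feats)
            outputs.append(attended)
        stacked = torch.stack(outputs)                   # (E, B, Q, D)
        weights = torch.softmax(self.mix_logits, dim=0)  # (E,)
        return (weights.view(-1, 1, 1, 1) * stacked).sum(0)

# Usage: refine 100 queries against flattened 32x32 image features.
layer = MixtureOfQueriesLayer(dim=256)
queries = torch.randn(2, 100, 256)
features = torch.randn(2, 32 * 32, 256)
refined = layer(queries, features)  # (2, 100, 256)
```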
The paper’s first author, Feng Weiwei, a senior algorithm engineer at Ant Group’s security AI team, will present the design ideas and validation process in a live session.
This article has been distilled and summarized from source material and republished for learning and reference. If you believe it infringes your rights, please contact us and we will review it promptly.
