SnowflakeNet: Point Cloud Completion by Snowflake Point Deconvolution with Skip-Transformer
SnowflakeNet introduces a novel Snowflake Point Deconvolution architecture combined with a Skip-Transformer to progressively split seed points, enabling high‑quality point‑cloud completion that preserves fine‑grained geometric details such as smooth surfaces, sharp edges, and corners across dense and sparse datasets.
Point cloud shape completion is an active research topic that aims to predict a high-quality complete shape from a partial input. Existing methods often struggle to recover local geometric details because point clouds are discrete and unstructured. This paper proposes SnowflakeNet, a network that better captures and restores local geometry using multi-layer Snowflake Point Deconvolution (SPD) and a Skip-Transformer.
Innovations
We introduce SnowflakeNet, converting complete point‑cloud generation into an explicit, locally structured splitting process.
Snowflake Point Deconvolution (SPD) progressively increases the point count by splitting each parent point into multiple child points, mimicking snowflake growth in 3D space.
A Skip‑Transformer is embedded in each SPD layer to capture local shape features and unify splitting patterns between adjacent SPDs, ensuring consistent and collaborative point splitting.
Related Work
Previous deep‑learning‑based point‑cloud completion methods fall into three categories: folding‑based, coarse‑to‑fine, and deformation‑based. Folding‑based methods (e.g., FoldingNet, SA‑Net) lack explicit constraints on intermediate features, limiting detail recovery. Coarse‑to‑fine approaches generate a sparse cloud first and then refine it, but they often ignore structured predictions for fine details. SnowflakeNet builds on these ideas while explicitly modeling local structure during point generation.
Method Description
Network Overview: The architecture consists of a feature-extraction module, a seed-point generation module, and a point-generation module composed of three SPD layers.
Snowflake Point Deconvolution (SPD): Each SPD layer receives the point set from the previous layer, splits each point, and uses a Skip-Transformer to integrate local shape context with the previous layer's splitting pattern. The split operation produces child-point features and offset vectors via an MLP; the offsets displace the duplicated points and are passed on to guide the next SPD layer.
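The duplicate-then-offset idea behind one SPD step can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the tiny MLP, the random weights, and the one-hot "slot" feature that lets sibling children diverge are all stand-ins for the learned components described above.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(x, w1, b1, w2, b2):
    # Tiny two-layer MLP with ReLU, applied per child-point feature.
    return np.maximum(x @ w1 + b1, 0.0) @ w2 + b2

def spd_split(points, feats, up_factor, weights):
    """One SPD step: each parent point spawns `up_factor` children.

    points: (N, 3) parent coordinates
    feats:  (N, C) per-point features (stand-ins for the
            Skip-Transformer output in the real network)
    Returns (N * up_factor, 3) child coordinates plus the offset
    vectors, which the next SPD layer would consume.
    """
    n, _ = feats.shape
    # Duplicate each parent `up_factor` times (children are consecutive rows).
    parents = np.repeat(points, up_factor, axis=0)        # (N*k, 3)
    child_feats = np.repeat(feats, up_factor, axis=0)     # (N*k, C)
    # One-hot slot id so siblings of the same parent can move differently.
    slot = np.tile(np.eye(up_factor), (n, 1))             # (N*k, k)
    h = np.concatenate([child_feats, slot], axis=1)
    offsets = mlp(h, *weights)                            # (N*k, 3)
    return parents + offsets, offsets

# Toy run: 4 seed points, 2 children each -> 8 points.
k, c = 2, 8
weights = (rng.normal(size=(c + k, 16)) * 0.1, np.zeros(16),
           rng.normal(size=(16, 3)) * 0.1, np.zeros(3))
pts = rng.normal(size=(4, 3))
fts = rng.normal(size=(4, c))
children, offs = spd_split(pts, fts, k, weights)
print(children.shape)  # (8, 3)
```

Stacking three such layers with increasing up-factors reproduces the coarse-to-fine growth the overview describes.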
Skip‑Transformer (ST): ST takes the current SPD's point features and the offset features from the previous SPD, applying attention to fuse them into a refined shape‑context feature for the current layer.
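The fusion step can be sketched as plain single-head attention: current-layer features act as queries against the previous layer's offset features. This is an assumption-laden simplification; the actual Skip-Transformer uses learned projections and operates on local neighbourhoods, which are omitted here.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def skip_transformer(cur_feats, prev_offset_feats):
    """Fuse current SPD features (queries) with the previous SPD's
    offset features (keys/values) via scaled dot-product attention.
    Projections are identity here for brevity; the paper learns them.
    cur_feats: (N, C), prev_offset_feats: (M, C) -> (N, C)."""
    scale = np.sqrt(cur_feats.shape[1])
    scores = cur_feats @ prev_offset_feats.T / scale   # (N, M)
    attn = softmax(scores, axis=-1)
    fused = attn @ prev_offset_feats                   # (N, C)
    return cur_feats + fused                           # residual shape-context feature

rng = np.random.default_rng(0)
cur = rng.normal(size=(4, 8))
prev = rng.normal(size=(6, 8))
out = skip_transformer(cur, prev)
print(out.shape)  # (4, 8)
```

The residual connection means that when the previous layer carries no signal, the current features pass through unchanged, which is one way to read the "unify splitting patterns between adjacent SPDs" claim.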
Experimental Results
Quantitative comparisons on the PCN (dense, 16384 points) and Completion3D (sparse, 2048 points) datasets using Chamfer Distance show that SnowflakeNet achieves lower average CD and better per‑category performance than prior methods, demonstrating superior completion quality and generalization.
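For reference, the Chamfer Distance used in these benchmarks can be computed as below. This sketch uses the squared-L2 (CD-L2) variant with a brute-force pairwise distance matrix; benchmark code typically uses accelerated nearest-neighbour search, and some evaluations use the L1 variant instead.

```python
import numpy as np

def chamfer_distance(p, q):
    """Symmetric Chamfer Distance (squared-L2 variant) between two
    point sets p: (N, 3) and q: (M, 3): average squared
    nearest-neighbour distance in both directions."""
    d2 = ((p[:, None, :] - q[None, :, :]) ** 2).sum(-1)  # (N, M) squared dists
    return d2.min(axis=1).mean() + d2.min(axis=0).mean()

# Identical clouds give zero distance; any displacement increases it.
pts = np.random.default_rng(1).normal(size=(16, 3))
print(chamfer_distance(pts, pts))        # 0.0
print(chamfer_distance(pts, pts + 1.0) > 0)  # True
```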
Visual comparisons on both datasets reveal that SnowflakeNet not only reconstructs overall shapes more accurately but also preserves fine local structures such as sofa cushions, chair backs, aircraft wings, and engine parts.
Visualization of the SPD process illustrates consistent splitting paths for both flat surfaces and complex corners, confirming the method’s ability to generate detailed point clouds.
Real‑world tests on ScanNet scenes (chairs) using a model pretrained on Completion3D show that SnowflakeNet can handle sparser, noisier inputs while still producing clean, complete shapes.
References
[1] Yaoqing Yang et al., FoldingNet, CVPR 2018.
[2] Xin Wen et al., Skip‑Attention Network with Hierarchical Folding (SA‑Net), CVPR 2020.
[3] Wentao Yuan et al., PCN, 3DV 2018.
[4] Xiaogang Wang et al., Cascaded Refinement Network, CVPR 2020.
[5] Wenxiao Zhang et al., Detail Preserved Completion, ECCV 2020.
[6] Zitian Huang et al., PF‑Net, CVPR 2020.
[7] Xin Wen et al., PMP‑Net, CVPR 2021.
Kuaishou Tech
Official Kuaishou tech account, providing real-time updates on the latest Kuaishou technology practices.