
PointMAC: Meta-learned Adaptation for Robust Test-time Point Cloud Completion (NeurIPS 2025)

Linlian Jiang, Rui Ma, Li Gu, Ziqiang Wang, Xinxin Zuo*, Yang Wang*

Concordia University, Jilin University

arXiv Page

TODO List

  • We have released the code (Dec. 2025)

Overview

Test-time adaptation (TTA) with meta-auxiliary learning provides a general framework: the auxiliary task is not restricted to enumerating all possible failure modes. Instead, it can be tailored to different objectives as needed, fundamentally overcoming the limitations of static, pre-defined strategies. This enables dynamic, per-sample adaptation to previously unseen patterns, allowing our model to robustly recover fine details where traditional static training or enumerative augmentation would fail.
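To make the idea concrete, here is a minimal, framework-agnostic sketch of per-sample test-time adaptation with an auxiliary objective. All names (`aux_loss`, `adapt_then_predict`, the toy quadratic loss) are illustrative placeholders, not PointMAC's actual API or losses:

```python
import numpy as np

def aux_loss(theta, x):
    # Illustrative self-supervised auxiliary objective for one sample x
    # (a stand-in for, e.g., reconstructing the partial input).
    return np.sum((theta - x) ** 2)

def adapt_then_predict(theta, x, lr=0.1, steps=5):
    """Per-sample TTA: take a few gradient steps on the auxiliary loss
    for THIS sample, then use the adapted weights for the primary task."""
    theta = theta.copy()
    for _ in range(steps):
        grad = 2.0 * (theta - x)   # d(aux_loss)/d(theta) for the toy loss
        theta -= lr * grad         # inner-loop update
    return theta                   # adapted parameters used for completion

base = np.zeros(3)                  # meta-learned initialization
sample = np.array([1.0, -2.0, 0.5]) # an unseen test sample
adapted = adapt_then_predict(base, sample)
```

The key property is that adaptation happens independently for every test sample, starting from a shared initialization that meta-training has made easy to adapt.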

Datasets

We use the PCN, ShapeNet-34/21, MVP, and KITTI datasets in our experiments.

Preparation

Tip: If you already have a virtual environment configured for SnowflakeNet (or PoinTr), you can reuse it instead of creating a new one.

1. Clone pointmac.

   ```shell
   git clone --recursive https://github.com/linlianjiang/pointmac.git
   cd pointmac
   ```

2. Create the environment. Here we show an example using conda:

   ```shell
   conda create -n pointm python=3.8
   conda activate pointm
   pip install -r requirements.txt
   ```

Training

(1) Pre-train the model to obtain the initial weights.

(2) Meta-train:

```shell
bash run.sh
```
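Conceptually, the meta-training stage can be sketched as a first-order MAML-style loop: adapt on the auxiliary loss per sample, then update the shared initialization so that the primary loss *after* adaptation decreases. The sketch below is illustrative only; the toy losses and function names are assumptions, and the actual procedure is driven by `run.sh` and the training configs:

```python
import numpy as np

def primary_loss(theta, target):
    # Stand-in for the primary (completion) objective.
    return np.sum((theta - target) ** 2)

def adapt(theta, x, inner_lr=0.1, inner_steps=3):
    # Inner loop: a few gradient steps on a toy auxiliary loss
    # ||theta - x||^2 for one sample x.
    adapted = theta.copy()
    for _ in range(inner_steps):
        adapted -= inner_lr * 2.0 * (adapted - x)
    return adapted

def meta_train_step(theta, tasks, outer_lr=0.05):
    # Outer loop (first-order approximation): move the initialization
    # so the primary loss measured AFTER per-sample adaptation goes down.
    meta_grad = np.zeros_like(theta)
    for x, target in tasks:
        adapted = adapt(theta, x)
        meta_grad += 2.0 * (adapted - target)  # first-order outer gradient
    return theta - outer_lr * meta_grad / len(tasks)

# Toy "tasks": (partial input, ground-truth target) pairs.
tasks = [(np.array([1.0, 0.0]), np.array([1.5, 0.2])),
         (np.array([-1.0, 2.0]), np.array([-0.8, 1.9]))]
theta = np.zeros(2)
for _ in range(20):
    theta = meta_train_step(theta, tasks)
```

The outer update optimizes for post-adaptation performance rather than raw performance, which is what makes the learned initialization adaptation-friendly at test time.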

Evaluation

To evaluate a pre-trained model, set the model_path in the configuration file before running:

```shell
python test.py --configs <config>
```

Citing

If you use our PointMAC code in your research, please consider citing:

```bibtex
@article{jiang2025pointmac,
  title={PointMAC: Meta-Learned Adaptation for Robust Test-Time Point Cloud Completion},
  author={Jiang, Linlian and Ma, Rui and Gu, Li and Wang, Ziqiang and Zuo, Xinxin and Wang, Yang},
  journal={arXiv preprint arXiv:2510.10365},
  year={2025}
}
```
