Table of Contents

  1. Contribution
  2. Network Architecture
  3. Dataset & Annotations
  4. Qualitative Analysis
  5. Quantitative Analysis
  6. Video
  7. Citation
  8. Acknowledgement

BRep Boundary and Junction Detection for CAD Reverse Engineering

Sk Aziz Ali1 · Mohammad Sadil Khan1 · Didier Stricker1
1German Research Center for Artificial Intelligence (DFKI), Augmented Vision Group

arXiv Paper · Dataset · Code



Contribution

Our proposed BRepDetNet detects BRep boundaries and junctions by minimizing a focal loss together with a non-maximal suppression (NMS) loss at training time. Our main contributions are:

  1. BRep annotations on detailed industrial-level 3D scans from the CC3D and ABC datasets.
  2. A neural network that learns to detect BRep boundaries and junctions from a 3D scan.
  3. Our detection network suppresses false positives and false negatives directly during training, achieving notable improvements in boundary and junction detection recall over methods that apply NMS only as a post-processing step.
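The focal loss referenced above addresses the heavy class imbalance of this task: boundary and junction points are a small fraction of a scan, so easy background points would otherwise dominate the gradient. Below is a minimal NumPy sketch of binary focal loss over per-point labels; the `alpha` and `gamma` values are the common defaults from the focal-loss literature, not necessarily the ones used by BRepDetNet.

```python
import numpy as np

def focal_loss(p, y, alpha=0.25, gamma=2.0, eps=1e-7):
    """Binary focal loss for per-point boundary/junction labels.

    p : predicted probabilities in (0, 1), shape (N,)
    y : ground-truth labels in {0, 1}, shape (N,)
    """
    p = np.clip(p, eps, 1.0 - eps)
    pt = np.where(y == 1, p, 1.0 - p)          # probability of the true class
    at = np.where(y == 1, alpha, 1.0 - alpha)  # class-balancing weight
    # (1 - pt)^gamma down-weights well-classified (easy) points
    return float(np.mean(-at * (1.0 - pt) ** gamma * np.log(pt)))
```

Because `(1 - pt)^gamma` shrinks toward zero for confident predictions, the loss concentrates on hard, misclassified points near boundaries and junctions.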

Network Architecture

Our neural network comprises separate Boundary Detection and Junction Detection heads. Each head uses a DGCNN point-feature encoder that maps \(D: S \to \phi \in \mathbb{R}^{N \times 128}\), producing a 128-dimensional deep feature vector per point. The two DGCNN encoders output the boundary and junction embeddings \(\phi_B\) and \(\phi_J\), respectively. A fully-connected layer then resizes each embedding, so that fc(\(\phi_B\)) and fc(\(\phi_J\)) have final dimensions in \(\mathbb{R}^{N \times 1}\).
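The tensor shapes above can be traced with a small NumPy sketch. The random matrix below is only a stand-in for the DGCNN embedding (the actual encoder builds dynamic k-NN graphs and applies edge convolutions); the sketch shows just the final fully-connected resize from \(\mathbb{R}^{N \times 128}\) to \(\mathbb{R}^{N \times 1}\).

```python
import numpy as np

rng = np.random.default_rng(0)

def fc_head(phi, w, b):
    """Fully-connected layer resizing per-point features to one score each."""
    return phi @ w + b  # (N, 128) @ (128, 1) + (1,) -> (N, 1)

N = 2048                             # number of points in the scan S
phi_B = rng.normal(size=(N, 128))    # stand-in for the DGCNN boundary embedding
w = rng.normal(size=(128, 1))        # fc weights (illustrative, untrained)
b = np.zeros(1)

logits_B = fc_head(phi_B, w, b)      # per-point boundary scores, shape (N, 1)
```

The junction head follows the same shape pipeline with its own encoder \(\phi_J\) and fc weights.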

Dataset and Annotations

We have meticulously annotated around 50K CC3D scans and 45K high-resolution meshes with the topological relations (e.g., next, mate, previous) between geometric primitives (boundaries, junctions, loops, faces) in their BRep data structures. We invite the community to use these annotations. You can access the dataset via this link: Dataset ( !Coming Soon ).
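To make the annotation contents concrete, here is a hypothetical per-scan record. The field names and layout are purely illustrative and do not describe the released file format; they only mirror the information listed above (per-point boundary/junction labels plus topological relations between primitives).

```python
# Hypothetical annotation record; field names are illustrative only,
# not the schema of the released dataset.
annotation = {
    "scan_id": "00000042",
    "boundary_point_ids": [3, 17, 42],   # scan points lying on BRep boundaries
    "junction_point_ids": [17],          # points where boundary curves meet
    "topology": [
        # (edge_a, relation, edge_b) with relations such as next / previous / mate
        (0, "next", 1),
        (1, "previous", 0),
    ],
}

# Junctions are intersections of boundary curves, so every junction point
# should also be labeled as a boundary point.
valid = set(annotation["junction_point_ids"]) <= set(annotation["boundary_point_ids"])
```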

Annotations legend

Qualitative Analysis

Visual results for boundary and junction prediction using ComplexGen, PieNet, and BRepDetNet. As the renderings show, our model produces noticeably cleaner detections than the competing methods. (Red points mark boundaries; green points mark junctions.)

Quantitative Analysis

Recall and Precision of BRepDetNet

Recall and precision stats

Quantitative results for the boundary and junction prediction tasks on the ABC and CC3D datasets.
Note: all models are trained on the ABC dataset.


Ablation Study on Cross-Dataset Training and Evaluation

Ablation study table

Quantitative results of an ablation study for BRepDetNet with and without the NMS loss. Recall and precision for boundary detection on the ABC and CC3D datasets are reported. We also showcase the cross-dataset generalization ability of our model.

Video

Citation

If you refer to the results or code of this work, please cite the following:

@inproceedings{Khan_2024_CVPR,
  title     = {CAD-SIGNet: CAD Language Inference from Point Clouds using Layer-wise Sketch Instance Guided Attention},
  author    = {Khan, Mohammad Sadil and Dupont, Elona and Ali, Sk Aziz and Cherenkova, Kseniya and Kacem, Anis and Aouada, Djamila},
  booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
  year      = {2024}
}

@inproceedings{ali2024brepdetnet,
  author    = {Ali, Sk Aziz and Khan, Mohammad Sadil and Stricker, Didier},
  booktitle = {2024 IEEE 3rd International Conference on Computing and Machine Intelligence (ICMI)},
  title     = {BRep Boundary and Junction Detection for CAD Reverse Engineering},
  year      = {2024},
  keywords  = {Training;Solid modeling;Three-dimensional displays;Reverse engineering;Neural networks;Machining;Mechanical systems;BRep;Boundary Detection;Junction Detection;Scan-to-CAD;Reverse Engineering;NMS},
  doi       = {10.1109/ICMI60790.2024.10585950}
}

Acknowledgement

This work was partially funded by the EU Horizon Europe Framework Program under grant agreement 101058236 (HumanTech). The authors thank Prof. Djamila Aouada and Dr. Anis Kacem (SnT, University of Luxembourg) for their valuable input on the Scan-to-BRep paradigm.

Disclaimer: This website was developed by Pritham Kumar Jena and Bhavika Baburaj, students at BITS Pilani, Hyderabad Campus.