2024-09-18 Real-world SceneSense applications and model updates on arXiv: https://arxiv.org/abs/2409.10681
2024-06-30 SceneSense accepted to IROS 2024!
Method
Our occupancy in-painting method keeps observed space intact while integrating SceneSense predictions into the map. Drawing inspiration from image-inpainting techniques such as guided image synthesis with diffusion models, our approach continuously incorporates known occupancy information during inference. To perform occupancy in-painting, we select a portion of the occupancy map for diffusion and generate masks for occupied and unoccupied voxels. At each denoising step, these masks constrain the diffusion process to modify only unobserved voxels, while observed voxels are re-noised to the current timestep and written back into the sample. This iterative process, depicted below, improves the accuracy of scene predictions while preventing the model from altering observed geometry.
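The per-step masking described above can be sketched as follows. This is a minimal NumPy sketch, not the released SceneSense implementation: the function name, the 3-D grid and mask conventions, and the `alpha_bar_t` parameter (the cumulative noise-schedule product at timestep t) are illustrative assumptions.

```python
import numpy as np

def inpaint_step(x_t, known_occ, known_mask, alpha_bar_t, rng):
    """Merge one denoising step's sample with the observed occupancy.

    x_t         : current diffusion sample over the voxel grid
    known_occ   : observed occupancy values (measured geometry)
    known_mask  : 1 where a voxel was observed, 0 where it is unknown
    alpha_bar_t : cumulative noise-schedule product at timestep t
    """
    # Re-noise the observed occupancy to the current timestep so it is
    # statistically consistent with the rest of the partially denoised sample.
    eps = rng.standard_normal(known_occ.shape)
    noised_known = (np.sqrt(alpha_bar_t) * known_occ
                    + np.sqrt(1.0 - alpha_bar_t) * eps)
    # Keep observed voxels fixed (up to the current noise level) and let the
    # model's prediction fill only the unobserved region.
    return known_mask * noised_known + (1.0 - known_mask) * x_t
```

In a full sampler this merge would run after every reverse-diffusion step, so the generated geometry is repeatedly re-anchored to the measured voxels rather than constrained only once at the end.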
Presentation Video
Citation
@misc{reed2024scenesense,
  title={SceneSense: Diffusion Models for 3D Occupancy Synthesis from Partial Observation},
  author={Alec Reed and Brendan Crowe and Doncey Albin and Lorin Achey and Bradley Hayes and Christoffer Heckman},
  year={2024},
  eprint={2403.11985},
  archivePrefix={arXiv},
  primaryClass={cs.RO}
}