Description
The current system visualizes the incoming sensor data as a UsdGeom.Points primitive. This is a purely visual representation and does not interact with the physics engine or other objects in the Isaac Sim world.
The goal of this issue is to convert this transient point cloud data into a solid, physical UsdGeom.Mesh in real time, effectively turning the sensor scan into a tangible 3D environment.
Proposed Solution
- Integrate a Surface Reconstruction Library: Introduce a library capable of generating a mesh from a point cloud. Options include:
  - Open3D: A modern library with several reconstruction algorithms (e.g., Alpha Shapes, Ball Pivoting, Poisson).
  - PCL (Point Cloud Library): A classic, feature-rich library with robust meshing tools.
  - A custom, faster algorithm, such as a voxel-grid-based approach, for real-time performance.
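To make the third option concrete, here is a minimal sketch of a voxel-grid "mesher" in pure NumPy: each occupied voxel is emitted as an axis-aligned cube in the USD mesh layout (points, faceVertexCounts, faceVertexIndices). The function name, voxel size, and data layout are illustrative assumptions, not part of any library API.

```python
# Minimal voxel-grid "meshing" sketch: occupied voxels become axis-aligned
# cubes encoded as USD-style mesh arrays (points, faceVertexCounts,
# faceVertexIndices). Names and defaults are illustrative, not Isaac Sim API.
import numpy as np

# Offsets of a unit cube's 8 corners (index = binary xyz).
_CORNERS = np.array(
    [(x, y, z) for x in (0, 1) for y in (0, 1) for z in (0, 1)], dtype=np.float64
)
# Six quad faces of the cube, as indices into _CORNERS
# (consistent outward winding is not verified here).
_FACES = np.array([
    [0, 1, 3, 2], [4, 6, 7, 5],  # x- and x+ faces
    [0, 4, 5, 1], [2, 3, 7, 6],  # y- and y+ faces
    [0, 2, 6, 4], [1, 5, 7, 3],  # z- and z+ faces
])

def voxel_mesh(points: np.ndarray, voxel_size: float = 0.1):
    """Turn an (N, 3) point cloud into cube geometry, one cube per occupied voxel."""
    occupied = np.unique(np.floor(points / voxel_size).astype(np.int64), axis=0)
    verts, counts, indices = [], [], []
    for i, v in enumerate(occupied):
        base = 8 * i
        verts.append((v + _CORNERS) * voxel_size)
        counts.extend([4] * 6)               # six quads per cube
        indices.extend((base + _FACES).ravel())
    points_out = np.concatenate(verts) if verts else np.zeros((0, 3))
    return points_out, np.array(counts), np.array(indices)
```

This trades surface quality for speed: it is O(N) in the number of points and needs no normal estimation, which is why the issue lists it as the fast real-time fallback to Open3D/PCL reconstruction.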
- Create a "Meshing" Node/Module: For performance, this logic could be added to the C++ node. It would take the transformed XYZ points and feed them into the reconstruction algorithm.
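The shape of that module might look like the following Python stand-in (the issue proposes C++ for the real node); the class name and the pluggable `reconstruct` callback are hypothetical, but they show the intended data flow: transformed XYZ batches in, mesh arrays out.

```python
# Illustrative "meshing module" interface (Python stand-in for the proposed
# C++ node). The reconstruction algorithm is injected, so Open3D, PCL, or a
# custom voxel mesher can be swapped in without changing the node itself.
import numpy as np

class MeshingModule:
    def __init__(self, reconstruct):
        # reconstruct: (N, 3) array -> (points, face_vertex_counts, face_vertex_indices)
        self._reconstruct = reconstruct

    def on_points(self, xyz):
        """Called for each incoming batch of already-transformed sensor points."""
        xyz = np.asarray(xyz, dtype=np.float64).reshape(-1, 3)
        return self._reconstruct(xyz)
```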
- Update Isaac Sim Controller: The Python script will need to be updated to receive mesh data (vertices and faces) instead of just points. It will then create or update a UsdGeom.Mesh primitive on the stage.
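A hedged sketch of that controller-side update, using the standard USD Python bindings (`pxr`) that ship with Isaac Sim; the helper name is hypothetical, and the import is deferred into the function so the module can load outside a USD environment.

```python
# Hypothetical controller-side helper: create or update a UsdGeom.Mesh prim
# from received vertices/faces. Requires the USD Python bindings (pxr), which
# are available inside Isaac Sim; the import is kept inside the function so
# the surrounding module can be imported without them.
def update_mesh_prim(stage, path, points, face_vertex_counts, face_vertex_indices):
    from pxr import Gf, UsdGeom, Vt  # available in Isaac Sim's Python

    mesh = UsdGeom.Mesh.Define(stage, path)  # reuses the prim if it already exists
    mesh.GetPointsAttr().Set(Vt.Vec3fArray([Gf.Vec3f(*map(float, p)) for p in points]))
    mesh.GetFaceVertexCountsAttr().Set(Vt.IntArray([int(c) for c in face_vertex_counts]))
    mesh.GetFaceVertexIndicesAttr().Set(Vt.IntArray([int(i) for i in face_vertex_indices]))
    return mesh
```

Because `UsdGeom.Mesh.Define` is idempotent, the same call path serves both the "create" and the per-frame "update" case.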
Acceptance Criteria
- The system generates a solid 3D mesh from the incoming point cloud data.
- The generated mesh can have a physics collider applied to it.
- Other physics-enabled objects in Isaac Sim can successfully collide with and react to the reconstructed environment (e.g., a ball can be dropped onto the scanned floor and roll across it).
- The process is efficient enough to update the environment in near real time.
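For the collider criterion, one plausible approach is applying the UsdPhysics collision schemas to the reconstructed mesh prim. This is a sketch under the assumption that the stage uses the standard `UsdPhysics` schemas (as Isaac Sim does); the function name is hypothetical and the import is deferred as above.

```python
# Hypothetical sketch: make the reconstructed mesh prim collidable via the
# standard UsdPhysics schemas. Assumes an Isaac Sim / USD Physics stage.
def make_collidable(mesh_prim):
    from pxr import UsdPhysics  # available in Isaac Sim's Python

    prim = mesh_prim.GetPrim()
    collision = UsdPhysics.CollisionAPI.Apply(prim)
    # Triangle-mesh colliders are typically static; request the exact mesh
    # rather than a convex approximation so objects rest on the scanned surface.
    mesh_collision = UsdPhysics.MeshCollisionAPI.Apply(prim)
    mesh_collision.GetApproximationAttr().Set("none")
    return collision
```

Note that exact ("none") triangle-mesh collision is generally only valid for static geometry; a dynamic reconstructed environment may need convex decomposition or per-update collider refresh, which is part of what the real-time criterion above would have to verify.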