From 0ccd58d1a7465e33615d4936ecf1bd6893ce96f9 Mon Sep 17 00:00:00 2001 From: yuecideng Date: Mon, 5 Jan 2026 10:36:44 +0800 Subject: [PATCH 1/5] wip --- docs/source/overview/sim/index.rst | 5 +- docs/source/overview/sim/sim_manager.md | 198 ++++++++++++++++++++++++ embodichain/lab/sim/sim_manager.py | 2 +- 3 files changed, 203 insertions(+), 2 deletions(-) create mode 100644 docs/source/overview/sim/sim_manager.md diff --git a/docs/source/overview/sim/index.rst b/docs/source/overview/sim/index.rst index fd6e56fd..2ee32098 100644 --- a/docs/source/overview/sim/index.rst +++ b/docs/source/overview/sim/index.rst @@ -14,10 +14,13 @@ Overview of the Simulation Framework: - Kinematics Solver - Motion Generation +Table of Contents +================= .. toctree:: :maxdepth: 1 :glob: - + + sim_manager.md solvers/index planners/index diff --git a/docs/source/overview/sim/sim_manager.md b/docs/source/overview/sim/sim_manager.md new file mode 100644 index 00000000..3e5dc7f1 --- /dev/null +++ b/docs/source/overview/sim/sim_manager.md @@ -0,0 +1,198 @@ +# Simulation Manager + +The `SimulationManager` is the central class in EmbodiChain's simulation framework for managing the simulation lifecycle. It handles: +- **Asset Management**: Loading and managing robots, rigid objects, soft objects, articulations, sensors, and lights. +- **Simulation Loop**: Controlling the physics stepping and rendering updates. +- **Rendering**: Managing the simulation window, camera rendering, material settings and ray-tracing configuration. +- **Interaction**: Providing gizmo controls for interactive manipulation of objects. + +## Configuration + +The simulation is configured using the `SimulationManagerCfg` class. + +```python +from embodichain.lab.sim import SimulationManagerCfg + +sim_config = SimulationManagerCfg( + width=1920, # Window width + height=1080, # Window height + num_envs=10, # Number of parallel environments + physics_dt=0.01, # Physics time step + sim_device="cpu", # Simulation device ("cpu" or "cuda:0", etc.) + arena_space=5.0 # Spacing between environments +) +``` + +### Configuration Parameters + +| Parameter | Type | Default | Description | +| :--- | :--- | :--- | :--- | +| `width` | `int` | `1920` | The width of the simulation window. | +| `height` | `int` | `1080` | The height of the simulation window. | +| `headless` | `bool` | `False` | Whether to run the simulation in headless mode (no Window). | +| `enable_rt` | `bool` | `False` | Whether to enable ray tracing rendering. | +| `enable_denoiser` | `bool` | `True` | Whether to enable denoising for ray tracing rendering. | +| `spp` | `int` | `64` | Samples per pixel for ray tracing rendering. Only valid when ray tracing is enabled and denoiser is False. | +| `gpu_id` | `int` | `0` | The gpu index that the simulation engine will be used. Affects gpu physics device. | +| `thread_mode` | `ThreadMode` | `RENDER_SHARE_ENGINE` | The threading mode for the simulation engine. | +| `cpu_num` | `int` | `1` | The number of CPU threads to use for the simulation engine. | +| `num_envs` | `int` | `1` | The number of parallel environments (arenas) to simulate. | +| `arena_space` | `float` | `5.0` | The distance between each arena when building multiple arenas. | +| `physics_dt` | `float` | `0.01` | The time step for the physics simulation. | +| `sim_device` | `str` \| `torch.device` | `"cpu"` | The device for the physics simulation. | +| `physics_config` | `PhysicsCfg` | `PhysicsCfg()` | The physics configuration parameters. 
| +| `gpu_memory_config` | `GPUMemoryCfg` | `GPUMemoryCfg()` | The GPU memory configuration parameters. | + +## Initialization + +Initialize the manager with the configuration object: + +```python +from embodichain.lab.sim import SimulationManager, SimulationManagerCfg + +# User can customize the config as needed. +sim_config = SimulationManagerCfg() +sim = SimulationManager(sim_config) +``` + +## Asset Management + +The manager provides methods to add and retrieve various simulation assets. + +### Robots + +Add robots using `RobotCfg`. + +```python +from embodichain.lab.sim.cfg import RobotCfg + +robot_cfg = RobotCfg(uid="my_robot", ...) +robot = sim.add_robot(robot_cfg) + +# Retrieve existing robot +robot = sim.get_robot("my_robot") +``` + +### Rigid Objects + +Add rigid bodies (e.g., cubes, meshes) using `RigidObjectCfg`. + +```python +from embodichain.lab.sim.cfg import RigidObjectCfg + +obj_cfg = RigidObjectCfg(uid="cube", ...) +obj = sim.add_rigid_object(obj_cfg) +``` + +### Sensors + +Add sensors (e.g., Cameras) using `SensorCfg`. + +```python +from embodichain.lab.sim.sensors import CameraCfg + +camera_cfg = CameraCfg(uid="cam1", ...) +camera = sim.add_sensor(camera_cfg) +``` + +### Lights + +Add lights to the scene using `LightCfg`. + +```python +from embodichain.lab.sim.cfg import LightCfg + +light_cfg = LightCfg(uid="sun", light_type="point", ...) +light = sim.add_light(light_cfg) +``` + +## Simulation Loop + +The simulation loop typically involves stepping the physics and rendering the scene. + +```python +while True: + # Step physics and render + sim.update() + + # Or step manually + # sim.step_physics() + # sim.render() +``` + +### Methods + +- **`update(physics_dt=None, step=1)`**: Steps the physics simulation and updates the rendering. +- **`enable_physics(enable: bool)`**: Enable or disable physics simulation. +- **`set_manual_update(enable: bool)`**: Set manual update mode for physics. + +## Rendering + +- **`render_camera_group()`**: Renders all cameras in the scene. +- **`open_window()`**: Opens the visualization window. +- **`close_window()`**: Closes the visualization window. + +## Gizmos + +Gizmos allow interactive control of objects in the simulation window. + +```python +# Enable gizmo for a robot +sim.enable_gizmo(uid="my_robot", control_part="arm") + +# Toggle visibility +sim.toggle_gizmo_visibility(uid="my_robot", control_part="arm") + +# Disable gizmo +sim.disable_gizmo(uid="my_robot", control_part="arm") +``` + +## Example Usage + +Below is a complete example of setting up a simulation with a robot and a sensor. + +```python +import argparse +from embodichain.lab.sim import SimulationManager, SimulationManagerCfg +from embodichain.lab.sim.sensors import CameraCfg +from embodichain.lab.sim.cfg import RobotCfg, RigidObjectCfg +from embodichain.lab.sim.shapes import CubeCfg + +# 1. Configure Simulation +config = SimulationManagerCfg( + headless=False, + sim_device="cuda", + enable_rt=True, + physics_dt=0.01 +) +sim = SimulationManager(config) + +# 2. Add a Robot +# (Assuming robot_cfg is defined) +# robot = sim.add_robot(robot_cfg) + +# 3. Add a Rigid Object +cube_cfg = RigidObjectCfg( + uid="cube", + shape=CubeCfg(size=[0.05, 0.05, 0.05]), + init_pos=[1.0, 0.0, 0.5] +) +sim.add_rigid_object(cube_cfg) + +# 4. Add a Sensor +camera_cfg = CameraCfg( + uid="camera", + width=640, + height=480, + # ... other params +) +camera = sim.add_sensor(camera_cfg) + +# 5. 
Run Simulation Loop +while True: + sim.update() + + # Access sensor data + # data = camera.get_data() +``` +``` \ No newline at end of file diff --git a/embodichain/lab/sim/sim_manager.py b/embodichain/lab/sim/sim_manager.py index 83c231ce..ddb2dabb 100644 --- a/embodichain/lab/sim/sim_manager.py +++ b/embodichain/lab/sim/sim_manager.py @@ -135,7 +135,7 @@ class SimulationManagerCfg: """The time step for the physics simulation.""" sim_device: Union[str, torch.device] = "cpu" - """The device for the simulation engine. Can be 'cpu', 'cuda', or a torch.device object.""" + """The device for the physics simulation. Can be 'cpu', 'cuda', or a torch.device object.""" physics_config: PhysicsCfg = field(default_factory=PhysicsCfg) """The physics configuration parameters.""" From 7918b7cb6faaeda470a7cda532e9509987d13ea9 Mon Sep 17 00:00:00 2001 From: yuecideng Date: Tue, 6 Jan 2026 00:29:14 +0800 Subject: [PATCH 2/5] wip --- docs/source/overview/sim/index.rst | 3 +- docs/source/overview/sim/sim_assets.md | 157 ++++++++++++++++++++++++ docs/source/overview/sim/sim_manager.md | 147 +++++----------------- docs/source/resources/roadmap.md | 6 +- embodichain/lab/sim/material.py | 28 +++-- examples/sim/demo/scoop_ice.py | 4 +- 6 files changed, 211 insertions(+), 134 deletions(-) create mode 100644 docs/source/overview/sim/sim_assets.md diff --git a/docs/source/overview/sim/index.rst b/docs/source/overview/sim/index.rst index 2ee32098..48e1a3ca 100644 --- a/docs/source/overview/sim/index.rst +++ b/docs/source/overview/sim/index.rst @@ -8,8 +8,7 @@ Overview of the Simulation Framework: - Components - Simulation Manager - - Simulation Object - - Material + - Simulation Assets - Virtual Sensor - Kinematics Solver - Motion Generation diff --git a/docs/source/overview/sim/sim_assets.md b/docs/source/overview/sim/sim_assets.md new file mode 100644 index 00000000..a00f9022 --- /dev/null +++ b/docs/source/overview/sim/sim_assets.md @@ -0,0 +1,157 @@ +# Simulation Assets + +Simulation assets in EmbodiChain are configured using Python dataclasses. This approach provides a structured and type-safe way to define properties for physics, materials, objects and sensors in the simulation environment. + +## Physics Configuration + +The `PhysicsCfg` class controls the global physics simulation parameters. + +| Parameter | Type | Default | Description | +| :--- | :--- | :--- | :--- | +| `gravity` | `np.ndarray` | `[0, 0, -9.81]` | Gravity vector for the simulation environment. | +| `bounce_threshold` | `float` | `2.0` | The speed threshold below which collisions will not produce bounce effects. | +| `enable_pcm` | `bool` | `True` | Enable persistent contact manifold (PCM) for improved collision handling. | +| `enable_tgs` | `bool` | `True` | Enable temporal gauss-seidel (TGS) solver for better stability. | +| `enable_ccd` | `bool` | `False` | Enable continuous collision detection (CCD) for fast-moving objects. | +| `enable_enhanced_determinism` | `bool` | `False` | Enable enhanced determinism for consistent simulation results. | +| `enable_friction_every_iteration` | `bool` | `True` | Enable friction calculations at every solver iteration. | +| `length_tolerance` | `float` | `0.05` | The length tolerance for the simulation. Larger values increase speed. | +| `speed_tolerance` | `float` | `0.25` | The speed tolerance for the simulation. Larger values increase speed. 
| + +## Materials + +### Visual Materials + +The `VisualMaterialCfg` class defines the visual appearance of objects using Physically Based Rendering (PBR) properties. + +| Parameter | Type | Default | Description | +| :--- | :--- | :--- | :--- | +| `uid` | `str` | `"default_mat"` | Unique identifier for the material. | +| `base_color` | `list` | `[0.5, 0.5, 0.5, 1.0]` | Base color/diffuse color (RGBA). | +| `metallic` | `float` | `0.0` | Metallic factor (0.0 = dielectric, 1.0 = metallic). | +| `roughness` | `float` | `0.5` | Surface roughness (0.0 = smooth, 1.0 = rough). | +| `emissive` | `list` | `[0.0, 0.0, 0.0]` | Emissive color (RGB). | +| `emissive_intensity` | `float` | `1.0` | Emissive intensity multiplier. | +| `base_color_texture` | `str` | `None` | Path to base color texture map. | +| `metallic_texture` | `str` | `None` | Path to metallic map. | +| `roughness_texture` | `str` | `None` | Path to roughness map. | +| `normal_texture` | `str` | `None` | Path to normal map. | +| `ao_texture` | `str` | `None` | Path to ambient occlusion map. | +| `ior` | `float` | `1.5` | Index of refraction for ray tracing materials. | +| `material_type` | `str` | `"BRDF"` | material type. | + +## Objects + +All objects inherit from `ObjectBaseCfg`, which provides common properties. + +**Base Properties (`ObjectBaseCfg`)** + +| Parameter | Type | Default | Description | +| :--- | :--- | :--- | :--- | +| `uid` | `str` | `None` | Unique identifier. | +| `init_pos` | `tuple` | `(0.0, 0.0, 0.0)` | Position of the root in simulation world frame. | +| `init_rot` | `tuple` | `(0.0, 0.0, 0.0)` | Euler angles (in degrees) of the root. | +| `init_local_pose` | `np.ndarray` | `None` | 4x4 transformation matrix (overrides `init_pos` and `init_rot`). | + +## Rigid Object + +Configured via `RigidObjectCfg`. + +| Parameter | Type | Default | Description | +| :--- | :--- | :--- | :--- | +| `shape` | `ShapeCfg` | `ShapeCfg()` | Shape configuration (e.g., Mesh, Box). | +| `attrs` | `RigidBodyAttributesCfg` | `RigidBodyAttributesCfg()` | Physical attributes. | +| `body_type` | `Literal` | `"dynamic"` | "dynamic", "kinematic", or "static". | +| `max_convex_hull_num` | `int` | `1` | Max convex hulls for decomposition (CoACD). | +| `body_scale` | `tuple` | `(1.0, 1.0, 1.0)` | Scale of the rigid body. | + +### Rigid Body Attributes + +The `RigidBodyAttributesCfg` class defines physical properties for rigid bodies. + +| Parameter | Type | Default | Description | +| :--- | :--- | :--- | :--- | +| `mass` | `float` | `1.0` | Mass in kg. Set to 0 to use density. | +| `density` | `float` | `1000.0` | Density in kg/m^3. | +| `angular_damping` | `float` | `0.7` | Angular damping coefficient. | +| `linear_damping` | `float` | `0.7` | Linear damping coefficient. | +| `max_depenetration_velocity` | `float` | `10.0` | Maximum depenetration velocity. | +| `sleep_threshold` | `float` | `0.001` | Threshold below which the body can go to sleep. | +| `enable_ccd` | `bool` | `False` | Enable continuous collision detection. | +| `contact_offset` | `float` | `0.002` | Contact offset for collision detection. | +| `rest_offset` | `float` | `0.001` | Rest offset for collision detection. | +| `enable_collision` | `bool` | `True` | Enable collision for the rigid body. | +| `restitution` | `float` | `0.0` | Restitution (bounciness) coefficient. | +| `dynamic_friction` | `float` | `0.5` | Dynamic friction coefficient. | +| `static_friction` | `float` | `0.5` | Static friction coefficient. | + +## Soft Object + +Configured via `SoftObjectCfg`. 
+ +| Parameter | Type | Default | Description | +| :--- | :--- | :--- | :--- | +| `voxel_attr` | `SoftbodyVoxelAttributesCfg` | `...` | Voxelization attributes. | +| `physical_attr` | `SoftbodyPhysicalAttributesCfg` | `...` | Physical attributes. | +| `shape` | `MeshCfg` | `MeshCfg()` | Mesh configuration. | + +### Soft Body Attributes + +Soft bodies require both voxelization and physical attributes. + +**Voxel Attributes (`SoftbodyVoxelAttributesCfg`)** + +| Parameter | Type | Default | Description | +| :--- | :--- | :--- | :--- | +| `triangle_remesh_resolution` | `int` | `8` | Resolution to remesh the softbody mesh before building physx collision mesh. | +| `triangle_simplify_target` | `int` | `0` | Simplify mesh faces to target value. | +| `simulation_mesh_resolution` | `int` | `8` | Resolution to build simulation voxelize textra mesh. | +| `simulation_mesh_output_obj` | `bool` | `False` | Whether to output the simulation mesh as an obj file for debugging. | + +**Physical Attributes (`SoftbodyPhysicalAttributesCfg`)** + +| Parameter | Type | Default | Description | +| :--- | :--- | :--- | :--- | +| `youngs` | `float` | `1e6` | Young's modulus (higher = stiffer). | +| `poissons` | `float` | `0.45` | Poisson's ratio (higher = closer to incompressible). | +| `dynamic_friction` | `float` | `0.0` | Dynamic friction coefficient. | +| `elasticity_damping` | `float` | `0.0` | Elasticity damping factor. | +| `material_model` | `SoftBodyMaterialModel` | `CO_ROTATIONAL` | Material constitutive model. | +| `enable_kinematic` | `bool` | `False` | If True, (partially) kinematic behavior is enabled. | +| `enable_ccd` | `bool` | `False` | Enable continuous collision detection. | +| `enable_self_collision` | `bool` | `False` | Enable self-collision handling. | +| `mass` | `float` | `-1.0` | Total mass. If negative, density is used. | +| `density` | `float` | `1000.0` | Material density in kg/m^3. | + + +### Articulations & Robots + +Configured via `ArticulationCfg` and `RobotCfg` (which inherits from `ArticulationCfg`). + +These configurations are typically loaded from URDF or MJCF files. + +### Lights + +Configured via `LightCfg`. + +| Parameter | Type | Default | Description | +| :--- | :--- | :--- | :--- | +| `light_type` | `Literal` | `"point"` | Type of light (currently only "point"). | +| `color` | `tuple` | `(1.0, 1.0, 1.0)` | RGB color. | +| `intensity` | `float` | `50.0` | Intensity in watts/m^2. | +| `radius` | `float` | `1e2` | Falloff radius. | + +### Markers + +Configured via `MarkerCfg` for debugging and visualization. + +| Parameter | Type | Default | Description | +| :--- | :--- | :--- | :--- | +| `marker_type` | `Literal` | `"axis"` | "axis", "line", or "point". | +| `axis_size` | `float` | `0.002` | Thickness of axis lines. | +| `axis_len` | `float` | `0.005` | Length of axis arms. | +| `line_color` | `list` | `[1, 1, 0, 1.0]` | RGBA color for lines. | + +### Rigid Object Groups + +`RigidObjectGroupCfg` allows initializing multiple rigid objects, potentially from a folder. diff --git a/docs/source/overview/sim/sim_manager.md b/docs/source/overview/sim/sim_manager.md index 3e5dc7f1..5f77b48c 100644 --- a/docs/source/overview/sim/sim_manager.md +++ b/docs/source/overview/sim/sim_manager.md @@ -55,144 +55,53 @@ sim_config = SimulationManagerCfg() sim = SimulationManager(sim_config) ``` -## Asset Management +## Assets Management -The manager provides methods to add and retrieve various simulation assets. 
+The manager provides methods to add, retrieve and remove various simulation assets including: +- Rigid Objects +- Soft Objects +- Articulations +- Robots +- Sensors +- Lights +- Materials -### Robots +For more details on simulation assets, please refer to their respective documentation pages. -Add robots using `RobotCfg`. - -```python -from embodichain.lab.sim.cfg import RobotCfg - -robot_cfg = RobotCfg(uid="my_robot", ...) -robot = sim.add_robot(robot_cfg) - -# Retrieve existing robot -robot = sim.get_robot("my_robot") -``` - -### Rigid Objects - -Add rigid bodies (e.g., cubes, meshes) using `RigidObjectCfg`. - -```python -from embodichain.lab.sim.cfg import RigidObjectCfg +## Simulation Loop -obj_cfg = RigidObjectCfg(uid="cube", ...) -obj = sim.add_rigid_object(obj_cfg) -``` +### Manual Update mode -### Sensors +In this mode, the physics simulation should be explicitly stepped by calling `update()` method, which provides precise control over the simulation timing. -Add sensors (e.g., Cameras) using `SensorCfg`. +The use case for manual update mode includes: +- Data generation with openai gym environments, in which the observation and action must be synchronized with the physics simulation. +- Applications that require precise dynamic control over the simulation timing. ```python -from embodichain.lab.sim.sensors import CameraCfg +while True: + # Step physics simulation. + sim.update(step=1) -camera_cfg = CameraCfg(uid="cam1", ...) -camera = sim.add_sensor(camera_cfg) + # Perform other tasks such as get data from the scene or apply sensor update. ``` -### Lights - -Add lights to the scene using `LightCfg`. - -```python -from embodichain.lab.sim.cfg import LightCfg +> The default mode is manual update mode. To switch to automatic update mode, call `set_manual_update(False)`. -light_cfg = LightCfg(uid="sun", light_type="point", ...) -light = sim.add_light(light_cfg) -``` +### Automatic Update mode -## Simulation Loop +In this mode, the physics simulation stepping is automatically handling by the physics thread running in dexsim engine, which makes it easier to use for visualization and interactive applications. -The simulation loop typically involves stepping the physics and rendering the scene. +> When in automatic update mode, user are recommanded to use CPU `sim_device` for simulation. -```python -while True: - # Step physics and render - sim.update() - - # Or step manually - # sim.step_physics() - # sim.render() -``` ### Methods -- **`update(physics_dt=None, step=1)`**: Steps the physics simulation and updates the rendering. +- **`update(physics_dt=None, step=1)`**: Steps the physics simulation with optional custom time step and number of steps. If `physics_dt` is None, uses the configured physics time step. - **`enable_physics(enable: bool)`**: Enable or disable physics simulation. - **`set_manual_update(enable: bool)`**: Set manual update mode for physics. -## Rendering +### Related Tutorials -- **`render_camera_group()`**: Renders all cameras in the scene. -- **`open_window()`**: Opens the visualization window. -- **`close_window()`**: Closes the visualization window. - -## Gizmos - -Gizmos allow interactive control of objects in the simulation window. 
- -```python -# Enable gizmo for a robot -sim.enable_gizmo(uid="my_robot", control_part="arm") - -# Toggle visibility -sim.toggle_gizmo_visibility(uid="my_robot", control_part="arm") - -# Disable gizmo -sim.disable_gizmo(uid="my_robot", control_part="arm") -``` - -## Example Usage - -Below is a complete example of setting up a simulation with a robot and a sensor. - -```python -import argparse -from embodichain.lab.sim import SimulationManager, SimulationManagerCfg -from embodichain.lab.sim.sensors import CameraCfg -from embodichain.lab.sim.cfg import RobotCfg, RigidObjectCfg -from embodichain.lab.sim.shapes import CubeCfg - -# 1. Configure Simulation -config = SimulationManagerCfg( - headless=False, - sim_device="cuda", - enable_rt=True, - physics_dt=0.01 -) -sim = SimulationManager(config) - -# 2. Add a Robot -# (Assuming robot_cfg is defined) -# robot = sim.add_robot(robot_cfg) - -# 3. Add a Rigid Object -cube_cfg = RigidObjectCfg( - uid="cube", - shape=CubeCfg(size=[0.05, 0.05, 0.05]), - init_pos=[1.0, 0.0, 0.5] -) -sim.add_rigid_object(cube_cfg) - -# 4. Add a Sensor -camera_cfg = CameraCfg( - uid="camera", - width=640, - height=480, - # ... other params -) -camera = sim.add_sensor(camera_cfg) - -# 5. Run Simulation Loop -while True: - sim.update() - - # Access sensor data - # data = camera.get_data() -``` -``` \ No newline at end of file +- [Basic scene creation](https://dexforce.github.io/EmbodiChain/tutorial/create_scene.html) +- [Interactive simulation with Gizmo](https://dexforce.github.io/EmbodiChain/tutorial/gizmo.html) \ No newline at end of file diff --git a/docs/source/resources/roadmap.md b/docs/source/resources/roadmap.md index 4999e8bf..ba160ea2 100644 --- a/docs/source/resources/roadmap.md +++ b/docs/source/resources/roadmap.md @@ -1,6 +1,6 @@ # Roadmap -Currently, EmbodiChain is under active development. Our plan for the feature roadmap is as follows: +Currently, EmbodiChain is under active development. Our roadmap is as follows: - Simulation: - Rendering: @@ -15,12 +15,10 @@ Currently, EmbodiChain is under active development. Our plan for the feature roa - We are also exploring how to integrate [newton physics](https://github.com/newton-physics/newton) into EmbodiChain as an alternative physics backend. - Sensors: - Add contact and force sensors with examples. - - Kinematics Solvers: - - Improve the existing IK solver performance and stability (especially SRSSolver and OPWSolver). - Motion Generation: - Add more advanced motion generation methods and examples. - Useful Tools: - - We are working on USD support for EmbodiChain to enable better scene creation and asset management. + - We are working on USD support for EmbodiChain to enable better asset management and interoperability. - We will release a simple Real2Sim pipeline, which enables automatic task generation from real-world data. - Robots Integration: - Add support for more robot models (eg: LeRobot, Unitree H1/G1, etc). diff --git a/embodichain/lab/sim/material.py b/embodichain/lab/sim/material.py index c66c2c38..9f2f50ca 100644 --- a/embodichain/lab/sim/material.py +++ b/embodichain/lab/sim/material.py @@ -67,10 +67,10 @@ class VisualMaterialCfg: # Ray tracing specific properties ior: float = 1.5 - """Index of refraction for ray tracing materials""" + """Index of refraction for PBR materials, only used in ray tracing.""" - rt_material_type: str = "BRDF_GGX_SMITH" - """Ray tracing material type. 
Options: 'BRDF_GGX_SMITH', 'BTDF_GGX_SMITH', 'BSDF_GGX_SMITH'""" + material_type: str = "BRDF" + """Ray tracing material type. Options: 'BRDF', 'BTDF', 'BSDF'""" # Currently disabled properties # subsurface: float = 0.0 # Subsurface scattering factor @@ -95,12 +95,24 @@ class VisualMaterial: """ RT_MATERIAL_TYPES = [ - "BRDF_GGX_SMITH", - "BTDF_GGX_SMITH", - "BSDF_GGX_SMITH", + "BRDF", + "BTDF", + "BSDF", ] + MAT_TYPE_MAPPING: Dict[str, str] = { + "BRDF": "BRDF_GGX_SMITH", + "BTDF": "BTDF_GGX_SMITH", + "BSDF": "BSDF_GGX_SMITH", + } + def __init__(self, cfg: VisualMaterialCfg, mat: Material): + if cfg.material_type not in self.RT_MATERIAL_TYPES: + logger.log_error( + f"Invalid material_type '{cfg.material_type}'. " + f"Supported types: {self.RT_MATERIAL_TYPES}" + ) + self.uid = cfg.uid self.cfg = copy.deepcopy(cfg) self._mat = mat @@ -132,7 +144,9 @@ def set_default_properties( if self.is_rt_enabled: mat_inst.set_ior(cfg.ior) - mat_inst.mat.update_pbr_material_type(cfg.rt_material_type) + mat_inst.mat.update_pbr_material_type( + self.MAT_TYPE_MAPPING[cfg.material_type] + ) def create_instance(self, uid: str) -> VisualMaterialInst: """Create a new material instance from this material template. diff --git a/examples/sim/demo/scoop_ice.py b/examples/sim/demo/scoop_ice.py index 1baee81c..f4f75119 100644 --- a/examples/sim/demo/scoop_ice.py +++ b/examples/sim/demo/scoop_ice.py @@ -303,13 +303,13 @@ def create_ice_cubes(sim: SimulationManager): # Set visual material for ice cubes. # The material below only works for ray tracing backend. - # Set ior to 1.31 and material type to "BSDF_GGX_SMITH" for better ice appearance. + # Set ior to 1.31 and material type to "BSDF" for better ice appearance. ice_mat = sim.create_visual_material( cfg=VisualMaterialCfg( base_color=[1.0, 1.0, 1.0, 1.0], ior=1.31, roughness=0.05, - rt_material_type="BSDF_GGX_SMITH", + material_type="BSDF", ) ) ice_cubes.set_visual_material(mat=ice_mat) From 0079f5994811eb9b60bc1f6e30461d97842bc631 Mon Sep 17 00:00:00 2001 From: yuecideng Date: Sun, 11 Jan 2026 22:23:12 +0800 Subject: [PATCH 3/5] wip --- docs/source/overview/sim/sim_assets.md | 67 ++++++++++++++++--------- docs/source/overview/sim/sim_manager.md | 15 ++++++ embodichain/lab/sim/material.py | 8 +++ 3 files changed, 67 insertions(+), 23 deletions(-) diff --git a/docs/source/overview/sim/sim_assets.md b/docs/source/overview/sim/sim_assets.md index a00f9022..dfa07edf 100644 --- a/docs/source/overview/sim/sim_assets.md +++ b/docs/source/overview/sim/sim_assets.md @@ -1,26 +1,10 @@ # Simulation Assets -Simulation assets in EmbodiChain are configured using Python dataclasses. This approach provides a structured and type-safe way to define properties for physics, materials, objects and sensors in the simulation environment. +Simulation assets in EmbodiChain are configured using Python dataclasses. This approach provides a structured and type-safe way to define properties for physics, materials, objects and sensors in the simulation environment. -## Physics Configuration +## Visual Materials -The `PhysicsCfg` class controls the global physics simulation parameters. - -| Parameter | Type | Default | Description | -| :--- | :--- | :--- | :--- | -| `gravity` | `np.ndarray` | `[0, 0, -9.81]` | Gravity vector for the simulation environment. | -| `bounce_threshold` | `float` | `2.0` | The speed threshold below which collisions will not produce bounce effects. | -| `enable_pcm` | `bool` | `True` | Enable persistent contact manifold (PCM) for improved collision handling. 
| -| `enable_tgs` | `bool` | `True` | Enable temporal gauss-seidel (TGS) solver for better stability. | -| `enable_ccd` | `bool` | `False` | Enable continuous collision detection (CCD) for fast-moving objects. | -| `enable_enhanced_determinism` | `bool` | `False` | Enable enhanced determinism for consistent simulation results. | -| `enable_friction_every_iteration` | `bool` | `True` | Enable friction calculations at every solver iteration. | -| `length_tolerance` | `float` | `0.05` | The length tolerance for the simulation. Larger values increase speed. | -| `speed_tolerance` | `float` | `0.25` | The speed tolerance for the simulation. Larger values increase speed. | - -## Materials - -### Visual Materials +### Configuration The `VisualMaterialCfg` class defines the visual appearance of objects using Physically Based Rendering (PBR) properties. @@ -38,13 +22,43 @@ The `VisualMaterialCfg` class defines the visual appearance of objects using Phy | `normal_texture` | `str` | `None` | Path to normal map. | | `ao_texture` | `str` | `None` | Path to ambient occlusion map. | | `ior` | `float` | `1.5` | Index of refraction for ray tracing materials. | -| `material_type` | `str` | `"BRDF"` | material type. | + +### Visual Material and Visual Material Instance + +A visual material is defined using the `VisualMaterialCfg` class. It is actually a material template that can be used to create multiple instances with different parameters. + +A visual material instance is created from a visual material using the method `create_instance()`. User can set different properties for each instance. For details API usage, please refer to the [VisualMaterialInst](https://dexforce.github.io/EmbodiChain/api_reference/embodichain/embodichain.lab.sim.html#embodichain.lab.sim.material.VisualMaterialInst) documentation. + +For batch simualtion scenarios, when user set a material to a object (eg, a rigid object), the material instance will be created for each simulation instance automatically. + +### Code + +```python +# Create a visual material with base color white and low roughness. +mat: VisualMaterial = sim.create_visual_material( + cfg=VisualMaterialCfg( + base_color=[1.0, 1.0, 1.0, 1.0], + roughness=0.05, + ) +) + +# Set the material to a rigid object. +object: RigidObject +object.set_visual_material(mat) + +# Get all material instances created for this object in the simulation. If `num_envs` is N, there will be N instances. +mat_inst: List[VisualMaterialInst] = object.get_visual_material_inst() + +# We can then modify the properties of each material instance separately. +mat_inst[0].set_base_color([1.0, 0.0, 0.0, 1.0]) +``` + ## Objects All objects inherit from `ObjectBaseCfg`, which provides common properties. -**Base Properties (`ObjectBaseCfg`)** +**Base Properties** | Parameter | Type | Default | Description | | :--- | :--- | :--- | :--- | @@ -85,6 +99,13 @@ The `RigidBodyAttributesCfg` class defines physical properties for rigid bodies. | `dynamic_friction` | `float` | `0.5` | Dynamic friction coefficient. | | `static_friction` | `float` | `0.5` | Static friction coefficient. | +For Rigid Object tutorial, please refer to the [Create Scene](https://dexforce.github.io/EmbodiChain/tutorial/create_scene.html). + +## Rigid Object Groups + +`RigidObjectGroupCfg` allows initializing multiple rigid objects, potentially from a folder. + + ## Soft Object Configured via `SoftObjectCfg`. @@ -123,6 +144,8 @@ Soft bodies require both voxelization and physical attributes. | `mass` | `float` | `-1.0` | Total mass. 
If negative, density is used. | | `density` | `float` | `1000.0` | Material density in kg/m^3. | +For Soft Object tutorial, please refer to the [Soft Body Simulation](https://dexforce.github.io/EmbodiChain/tutorial/create_softbody.html). + ### Articulations & Robots @@ -152,6 +175,4 @@ Configured via `MarkerCfg` for debugging and visualization. | `axis_len` | `float` | `0.005` | Length of axis arms. | | `line_color` | `list` | `[1, 1, 0, 1.0]` | RGBA color for lines. | -### Rigid Object Groups -`RigidObjectGroupCfg` allows initializing multiple rigid objects, potentially from a folder. diff --git a/docs/source/overview/sim/sim_manager.md b/docs/source/overview/sim/sim_manager.md index 5f77b48c..823786b8 100644 --- a/docs/source/overview/sim/sim_manager.md +++ b/docs/source/overview/sim/sim_manager.md @@ -43,6 +43,21 @@ sim_config = SimulationManagerCfg( | `physics_config` | `PhysicsCfg` | `PhysicsCfg()` | The physics configuration parameters. | | `gpu_memory_config` | `GPUMemoryCfg` | `GPUMemoryCfg()` | The GPU memory configuration parameters. | +### Physics Configuration + +The `PhysicsCfg` class controls the global physics simulation parameters. + +| Parameter | Type | Default | Description | +| :--- | :--- | :--- | :--- | +| `gravity` | `np.ndarray` | `[0, 0, -9.81]` | Gravity vector for the simulation environment. | +| `bounce_threshold` | `float` | `2.0` | The speed threshold below which collisions will not produce bounce effects. | +| `enable_ccd` | `bool` | `False` | Enable continuous collision detection (CCD) for fast-moving objects. | +| `length_tolerance` | `float` | `0.05` | The length tolerance for the simulation. Larger values increase speed. | +| `speed_tolerance` | `float` | `0.25` | The speed tolerance for the simulation. Larger values increase speed. | + +For more parameters and details, refer to the [PhysicsCfg](https://dexforce.github.io/EmbodiChain/api_reference/embodichain/embodichain.lab.sim.html#embodichain.lab.sim.cfg.PhysicsCfg) documentation. + + ## Initialization Initialize the manager with the configuration object: diff --git a/embodichain/lab/sim/material.py b/embodichain/lab/sim/material.py index 9f2f50ca..ac3d99ab 100644 --- a/embodichain/lab/sim/material.py +++ b/embodichain/lab/sim/material.py @@ -116,6 +116,7 @@ def __init__(self, cfg: VisualMaterialCfg, mat: Material): self.uid = cfg.uid self.cfg = copy.deepcopy(cfg) self._mat = mat + self._mat_inst_list: list[str] = [] self._default_mat_inst = self.create_instance(self.uid) @@ -127,6 +128,10 @@ def is_rt_enabled(self) -> bool: def mat(self) -> Material: return self._mat + @property + def inst(self) -> VisualMaterialInst: + return self._default_mat_inst + def set_default_properties( self, mat_inst: VisualMaterialInst, cfg: VisualMaterialCfg ) -> None: @@ -164,6 +169,7 @@ def create_instance(self, uid: str) -> VisualMaterialInst: # TODO: Support change default properties for material. # This will improve the instance creation efficiency. self.set_default_properties(inst, self.cfg) + self._mat_inst_list.append(uid) return inst def get_default_instance(self) -> VisualMaterialInst: @@ -183,6 +189,8 @@ def get_instance(self, uid: str) -> VisualMaterialInst: Returns: VisualMaterialInst: The material instance. 
""" + if uid not in self._mat_inst_list: + logger.log_error(f"Material instance with uid '{uid}' does not exist.") return VisualMaterialInst(uid, self._mat) From d56a03344d8ede5c5d50a1e79b34dbca42a495c5 Mon Sep 17 00:00:00 2001 From: guilong li Date: Fri, 16 Jan 2026 10:10:57 +0800 Subject: [PATCH 4/5] Add documentation for Robot, Articulation, and Sensor (#66) --- docs/source/overview/sim/sim_articulation.md | 112 +++++++++++++++++++ docs/source/overview/sim/sim_robot.md | 107 ++++++++++++++++++ docs/source/overview/sim/sim_sensor.md | 75 +++++++++++++ docs/source/overview/sim/sim_soft_object.md | 79 +++++++++++++ 4 files changed, 373 insertions(+) create mode 100644 docs/source/overview/sim/sim_articulation.md create mode 100644 docs/source/overview/sim/sim_robot.md create mode 100644 docs/source/overview/sim/sim_sensor.md create mode 100644 docs/source/overview/sim/sim_soft_object.md diff --git a/docs/source/overview/sim/sim_articulation.md b/docs/source/overview/sim/sim_articulation.md new file mode 100644 index 00000000..2e77459b --- /dev/null +++ b/docs/source/overview/sim/sim_articulation.md @@ -0,0 +1,112 @@ +# Articulation + +The `Articulation` class represents the fundamental physics entity for articulated objects (e.g., robots, grippers, cabinets, doors) in EmbodiChain. + +## Configuration + +Articulations are configured using the `ArticulationCfg` dataclass. + +| Parameter | Type | Default | Description | +| :--- | :--- | :--- | :--- | +| `fpath` | `str` | `None` | Path to the asset file (URDF/MJCF). | +| `init_pos` | `tuple` | `(0,0,0)` | Initial root position `(x, y, z)`. | +| `init_rot` | `tuple` | `(0,0,0)` | Initial root rotation `(r, p, y)` in degrees. | +| `fix_base` | `bool` | `True` | Whether to fix the base of the articulation. | +| `drive_props` | `JointDrivePropertiesCfg` | `...` | Default drive properties. | + +### Drive Configuration + +The `drive_props` parameter controls the joint physics behavior. It is defined using the `JointDrivePropertiesCfg` class. + +| Parameter | Type | Default | Description | +| :--- | :--- | :--- | :--- | +| `stiffness` | `float` / `Dict` | `1.0e4` | Stiffness (P-gain) of the joint drive. Unit: $N/m$ or $Nm/rad$. | +| `damping` | `float` / `Dict` | `1.0e3` | Damping (D-gain) of the joint drive. Unit: $Ns/m$ or $Nms/rad$. | +| `max_effort` | `float` / `Dict` | `1.0e10` | Maximum effort (force/torque) the joint can exert. | +| `max_velocity` | `float` / `Dict` | `1.0e10` | Maximum velocity allowed for the joint ($m/s$ or $rad/s$). | +| `friction` | `float` / `Dict` | `0.0` | Joint friction coefficient. | +| `drive_type` | `str` | `"force"` | Drive mode: `"force"` or `"acceleration"`. | + +### Setup & Initialization + +```python +import torch +from embodichain.lab.sim import SimulationManager, SimulationManagerCfg +from embodichain.lab.sim.objects import Articulation, ArticulationCfg + +# 1. Initialize Simulation +device = "cuda" if torch.cuda.is_available() else "cpu" +sim_cfg = SimulationManagerCfg(sim_device=device) +sim = SimulationManager(sim_config=sim_cfg) + +# 2. Configure Articulation +art_cfg = ArticulationCfg( + fpath="assets/robots/franka/franka.urdf", + init_pos=(0, 0, 0.5), + fix_base=True +) + +# 3. Spawn Articulation +# Note: The method is 'add_articulation' +articulation: Articulation = sim.add_articulation(cfg=art_cfg) + +# 4. Initialize Physics +sim.reset_objects_state() +``` +## Articulation Class +State Data (Observation) +State data is accessed via getter methods that return batched tensors. 
+ +| Property | Shape | Description | +| :--- | :--- | :--- | +| `get_local_pose` | `(N, 7)` | Root link pose `[x, y, z, qw, qx, qy, qz]`. | +| `get_qpos` | `(N, dof)` | Joint positions. | +| `get_qvel` | `(N, dof)` | Joint velocities. | + + + +```python +# Example: Accessing state +# Note: Use methods (with brackets) instead of properties +print(f"Current Joint Positions: {articulation.get_qpos()}") +print(f"Root Pose: {articulation.get_local_pose()}") +``` +### Control & Dynamics +You can control the articulation by setting joint targets. + +### Joint Control +```python +# Set joint position targets (PD Control) +# Get current qpos to create a target tensor of correct shape +current_qpos = articulation.get_qpos() +target_qpos = torch.zeros_like(current_qpos) + +# Set target position +# target=True: Sets the drive target. The physics engine applies forces to reach this position. +# target=False: Instantly resets/teleports joints to this position (ignoring physics). +articulation.set_qpos(target_qpos, target=True) + +# Important: Step simulation to apply control +sim.update() +``` +### Drive Configuration +Dynamically adjust drive properties. + +```python +# Set stiffness for all joints +articulation.set_drive( + stiffness=torch.tensor([100.0], device=device), + damping=torch.tensor([10.0], device=device) +) +``` +### Kinematics +Supports differentiable Forward Kinematics (FK) and Jacobian computation. +```python +# Compute Forward Kinematics +# Note: Ensure 'build_pk_chain=True' in cfg +if getattr(art_cfg, 'build_pk_chain', False): + ee_pose = articulation.compute_fk( + qpos=articulation.get_qpos(), # Use method call + end_link_name="ee_link" # Replace with actual link name + ) +``` diff --git a/docs/source/overview/sim/sim_robot.md b/docs/source/overview/sim/sim_robot.md new file mode 100644 index 00000000..61134850 --- /dev/null +++ b/docs/source/overview/sim/sim_robot.md @@ -0,0 +1,107 @@ +# Robot + +The `Robot` class extends `Articulation` to support advanced robotics features such as kinematic solvers (IK/FK), motion planners, and part-based control (e.g., controlling "arm" and "gripper" separately). + +## Configuration + +Robots are configured using `RobotCfg`. + +| Parameter | Type | Default | Description | +| :--- | :--- | :--- | :--- | +| `control_parts` | `Dict[str, List[str]]` | `None` | Defines groups of joints (e.g., `{"arm": ["joint1", ...], "hand": ["finger1", ...]}`). | +| `solver_cfg` | `SolverCfg` | `None` | Configuration for kinematic solvers (IK/FK). | +| `urdf_cfg` | `URDFCfg` | `None` | Advanced configuration for assembling a robot from multiple URDF components. | + +### Setup & Initialization + +A `Robot` must be spawned within a `SimulationManager`. + +```python +import torch +from embodichain.lab.sim import SimulationManager, SimulationManagerCfg +from embodichain.lab.sim.objects import Robot, RobotCfg +from embodichain.lab.sim.solvers import SolverCfg + +# 1. Initialize Simulation Environment +# Note: Use 'sim_device' to specify device (e.g., "cuda:0" or "cpu") +device = "cuda" if torch.cuda.is_available() else "cpu" +sim_cfg = SimulationManagerCfg(sim_device=device, physics_dt=0.01) +sim = SimulationManager(sim_config=sim_cfg) + +# 2. Configure Robot +robot_cfg = RobotCfg( + fpath="assets/robots/franka/franka.urdf", + control_parts={ + "arm": ["panda_joint[1-7]"], + "gripper": ["panda_finger_joint[1-2]"] + }, + solver_cfg=SolverCfg() +) + +# 3. 
Spawn Robot +# Note: The method is 'add_robot', and it takes 'cfg' as argument +robot: Robot = sim.add_robot(cfg=robot_cfg) + +# 4. Reset Simulation +# This performs a global reset of the simulation state +sim.reset_objects_state() +``` + +## Robot Class + +### Control Parts + +A unique feature of the `Robot` class is **Control Parts**. Instead of controlling the entire DoF vector at once, you can target specific body parts by name. + +```python +# Get joint IDs for a specific part +arm_ids = robot.get_joint_ids(name="arm") + +# Control only the arm +# Note: Ensure 'sim.update()' is called in your loop to apply these actions +target_qpos = torch.zeros((robot.num_instances, len(arm_ids)), device=device) +robot.set_qpos(target_qpos, name="arm", target=True) +``` + +### Kinematics (Solvers) +The robot class integrates solvers to perform differentiable Forward Kinematics (FK) and Inverse Kinematics (IK). +#### Forward Kinematics (FK) +Compute the pose of a link (e.g., end-effector) given joint positions. +```python +# Compute FK for a specific part (uses the part's configured solver) +current_qpos = robot.get_qpos() +ee_pose = robot.compute_fk(qpos=current_qpos, name="arm") +print(f"EE Pose: {ee_pose}") +``` +#### Inverse Kinematics (IK) +Compute the required joint positions to reach a target pose. +```python +# Compute IK +# pose: Target pose (N, 7) or (N, 4, 4) +target_pose = ee_pose.clone() # Example target +target_pose[:, 2] += 0.1 # Move up 10cm + +success, solved_qpos = robot.compute_ik( + pose=target_pose, + name="arm", + joint_seed=current_qpos +) +``` +### Proprioception +Get standard proprioceptive observation data for learning agents. +```python +# Returns a dict containing 'qpos', 'qvel', and 'qf' +obs = robot.get_proprioception() +``` +### Advanced API +The Robot class overrides standard Articulation methods to support the name argument for part-specific operations. +| Method | Description | +| :--- | :--- | +| `set_qpos(..., name="part")` | Set joint positions for a specific part. | +| `set_qvel(..., name="part")` | Set joint velocities for a specific part. | +| `set_qf(..., name="part")` | Set joint efforts for a specific part. | +| `get_qpos(name="part")` | Get joint positions of a specific part. | +| `get_qvel(name="part")` | Get joint velocities of a specific part. | + + + diff --git a/docs/source/overview/sim/sim_sensor.md b/docs/source/overview/sim/sim_sensor.md new file mode 100644 index 00000000..ed240f5b --- /dev/null +++ b/docs/source/overview/sim/sim_sensor.md @@ -0,0 +1,75 @@ +# Sensors + +The Simulation framework provides sensor interfaces for agents to perceive the environment. Currently, the primary supported sensor type is the **Camera**. + +## Camera + +### Configuration + +The `CameraCfg` class defines the configuration for camera sensors. It inherits from `SensorCfg` and controls resolution, clipping planes, intrinsics, and active data modalities. + +| Parameter | Type | Default | Description | +| :--- | :--- | :--- | :--- | +| `width` | `int` | `640` | Width of the captured image. | +| `height` | `int` | `480` | Height of the captured image. | +| `intrinsics` | `tuple` | `(600, 600, 320.0, 240.0)` | Camera intrinsics `(fx, fy, cx, cy)`. | +| `extrinsics` | `ExtrinsicsCfg` | `ExtrinsicsCfg()` | Pose configuration (see below). | +| `near` | `float` | `0.005` | Near clipping plane distance. | +| `far` | `float` | `100.0` | Far clipping plane distance. | +| `enable_color` | `bool` | `True` | Enable RGBA image capture. 
| +| `enable_depth` | `bool` | `False` | Enable depth map capture. | +| `enable_mask` | `bool` | `False` | Enable segmentation mask capture. | +| `enable_normal` | `bool` | `False` | Enable surface normal capture. | +| `enable_position` | `bool` | `False` | Enable 3D position map capture. | + +### Camera Extrinsics + +The `ExtrinsicsCfg` class defines the position and orientation of the camera. + +| Parameter | Type | Default | Description | +| :--- | :--- | :--- | :--- | +| `parent` | `str` | `None` | Name of the link to attach to (e.g., `"ee_link"`). If `None`, camera is fixed in world. | +| `pos` | `list` | `[0.0, 0.0, 0.0]` | Position offset `[x, y, z]`. | +| `quat` | `list` | `[1.0, 0.0, 0.0, 0.0]` | Orientation quaternion `[w, x, y, z]`. | +| `eye` | `tuple` | `None` | (Optional) Camera eye position for look-at mode. | +| `target` | `tuple` | `None` | (Optional) Target position for look-at mode. | +| `up` | `tuple` | `None` | (Optional) Up vector for look-at mode. | + +### Usage + +You can create a camera sensor using `sim.add_sensor()` with a `CameraCfg` object. + +#### Code Example + +```python +from embodichain.lab.sim.sensors import Camera, CameraCfg + +# 1. Define Configuration +camera_cfg = CameraCfg( + width=640, + height=480, + intrinsics=(600, 600, 320.0, 240.0), + extrinsics=CameraCfg.ExtrinsicsCfg( + parent="ee_link", # Attach to robot end-effector + pos=[0.09, 0.05, 0.04], # Relative position + quat=[0, 1, 0, 0], # Relative rotation [w, x, y, z] + ), + enable_color=True, + enable_depth=True, +) + +# 2. Add Sensor to Simulation +camera: Camera = sim.add_sensor(sensor_cfg=camera_cfg) +``` +### Observation Data +Retrieve sensor data using camera.get_data(). The data is returned as a dictionary of tensors on the specified device. + +| Key | Data Type | Shape | Description | +| :--- | :--- | :--- | :--- | +| `color` | `torch.uint8` | `(B, H, W, 4)` | RGBA image data. | +| `depth` | `torch.float32` | `(B, H, W)` | Depth map in meters. | +| `mask` | `torch.int32` | `(B, H, W)` | Segmentation mask / Instance IDs. | +| `normal` | `torch.float32` | `(B, H, W, 3)` | Surface normal vectors. | +| `position` | `torch.float32` | `(B, H, W, 3)` | 3D Position map (OpenGL coords). | + +*Note: `B` represents the number of environments (batch size).* \ No newline at end of file diff --git a/docs/source/overview/sim/sim_soft_object.md b/docs/source/overview/sim/sim_soft_object.md new file mode 100644 index 00000000..4e4130e8 --- /dev/null +++ b/docs/source/overview/sim/sim_soft_object.md @@ -0,0 +1,79 @@ +# Soft Object + +The `SoftObject` class represents deformable entities (e.g., cloth, sponges, soft robotics) in EmbodiChain. Unlike rigid bodies, soft objects are defined by vertices and meshes rather than a single rigid pose. + +## Configuration + +Soft objects are configured using the `SoftObjectCfg` dataclass. + +| Parameter | Type | Default | Description | +| :--- | :--- | :--- | :--- | +| `fpath` | `str` | `None` | Path to the soft body asset file (e.g., `.msh`, `.vtk`). | +| `init_pos` | `tuple` | `(0,0,0)` | Initial position `(x, y, z)`. | +| `init_rot` | `tuple` | `(0,0,0)` | Initial rotation `(r, p, y)` in degrees. | + +### Setup & Initialization + +```python +import torch +from embodichain.lab.sim import SimulationManager, SimulationManagerCfg +from embodichain.lab.sim.objects import SoftObject, SoftObjectCfg + +# 1. 
Initialize Simulation +device = "cuda" if torch.cuda.is_available() else "cpu" +sim_cfg = SimulationManagerCfg(sim_device=device) +sim = SimulationManager(sim_config=sim_cfg) + +# 2. Configure Soft Object +soft_cfg = SoftObjectCfg( + fpath="assets/objects/sponge.msh", # Example asset path + init_pos=(0, 0, 0.5), + init_rot=(0, 0, 0) +) + +# 3. Spawn Soft Object +# Note: Assuming the method in SimulationManager is 'add_soft_object' +soft_object: SoftObject = sim.add_soft_object(cfg=soft_cfg) + +# 4. Initialize Physics +sim.reset_objects_state() +``` +### Soft Object Class +#### Vertex Data (Observation) +For soft objects, the state is represented by the positions and velocities of its vertices, rather than a single root pose. + +| Method | Return Shape | Description | +| :--- | :--- | :--- | +| `get_current_collision_vertices()` | `(N, V_col, 3)` | Current positions of collision mesh vertices. | +| `get_current_sim_vertices()` | `(N, V_sim, 3)` | Current positions of simulation mesh vertices (nodes). | +| `get_current_sim_vertex_velocities()` | `(N, V_sim, 3)` | Current velocities of simulation vertices. | +| `get_rest_collision_vertices()` | `(N, V_col, 3)` | Rest (initial) positions of collision vertices. | +| `get_rest_sim_vertices()` | `(N, V_sim, 3)` | Rest (initial) positions of simulation vertices. | + +> Note: N is the number of environments/instances, V_col is the number of collision vertices, and V_sim is the number of simulation vertices. + +```python +# Example: Accessing vertex data +sim_verts = soft_object.get_current_sim_vertices() +print(f"Simulation Vertices Shape: {sim_verts.shape}") + +velocities = soft_object.get_current_sim_vertex_velocities() +print(f"Vertex Velocities: {velocities}") +``` +#### Pose Management +You can set the global pose of a soft object (which transforms all its vertices), but getting a single "pose" from a deformed object is not supported. + +| Method | Description | +| :--- | :--- | +| `set_local_pose(pose)` | Sets the pose of the object by transforming all vertices. | +| `get_local_pose()` | **Not Supported**. Raises `NotImplementedError` because a deformed object does not have a single rigid pose. 
| + + +```python +# Reset or Move the Soft Object +target_pose = torch.tensor([[0, 0, 1.0, 1, 0, 0, 0]], device=device) # (x, y, z, qw, qx, qy, qz) +soft_object.set_local_pose(target_pose) + +# Important: Step simulation to apply changes +sim.update() +``` \ No newline at end of file From a0f2e7929f2c655a37bbd339170dff917c9b2f28 Mon Sep 17 00:00:00 2001 From: yuecideng Date: Fri, 16 Jan 2026 12:17:43 +0800 Subject: [PATCH 5/5] wip --- docs/source/overview/gym/env.md | 52 +++++------ docs/source/overview/gym/event_functors.md | 2 +- .../overview/gym/observation_functors.md | 7 +- docs/source/overview/sim/index.rst | 5 +- docs/source/overview/sim/sim_articulation.md | 8 +- docs/source/overview/sim/sim_assets.md | 86 ++++--------------- docs/source/overview/sim/sim_manager.md | 24 +++--- docs/source/overview/sim/sim_robot.md | 13 +-- docs/source/overview/sim/sim_sensor.md | 53 +++++++++++- docs/source/overview/sim/sim_soft_object.md | 44 ++++++++-- 10 files changed, 169 insertions(+), 125 deletions(-) diff --git a/docs/source/overview/gym/env.md b/docs/source/overview/gym/env.md index e540a442..a15d26f9 100644 --- a/docs/source/overview/gym/env.md +++ b/docs/source/overview/gym/env.md @@ -3,11 +3,11 @@ ```{currentmodule} embodichain.lab.gym ``` -The {class}`envs.EmbodiedEnv` is the core environment class in EmbodiChain designed for complex Embodied AI tasks. It adopts a **configuration-driven** architecture, allowing users to define robots, sensors, objects, lighting, and automated behaviors (events) purely through configuration classes, minimizing the need for boilerplate code. +The {class}`~envs.EmbodiedEnv` is the core environment class in EmbodiChain designed for complex Embodied AI tasks. It adopts a **configuration-driven** architecture, allowing users to define robots, sensors, objects, lighting, and automated behaviors (events) purely through configuration classes, minimizing the need for boilerplate code. ## Core Architecture -Unlike the standard {class}`envs.BaseEnv`, the {class}`envs.EmbodiedEnv` integrates several manager systems to handle the complexity of simulation: +Unlike the standard {class}`~envs.BaseEnv`, the {class}`~envs.EmbodiedEnv` integrates several manager systems to handle the complexity of simulation: * **Scene Management**: Automatically loads and manages robots, sensors, and scene objects defined in the configuration. * **Event Manager**: Handles automated behaviors such as domain randomization, scene setup, and dynamic asset swapping. @@ -16,18 +16,18 @@ Unlike the standard {class}`envs.BaseEnv`, the {class}`envs.EmbodiedEnv` integra ## Configuration System -The environment is defined by inheriting from {class}`envs.EmbodiedEnvCfg`. This configuration class serves as the single source of truth for the scene description. +The environment is defined by inheriting from {class}`~envs.EmbodiedEnvCfg`. This configuration class serves as the single source of truth for the scene description. -{class}`envs.EmbodiedEnvCfg` inherits from {class}`envs.EnvCfg` (the base environment configuration class, sometimes referred to as `BaseEnvCfg`), which provides fundamental environment parameters. The following sections describe both the base class parameters and the additional parameters specific to {class}`envs.EmbodiedEnvCfg`. +{class}`~envs.EmbodiedEnvCfg` inherits from {class}`~envs.EnvCfg` (the base environment configuration class, sometimes referred to as `BaseEnvCfg`), which provides fundamental environment parameters. 
The following sections describe both the base class parameters and the additional parameters specific to {class}`~envs.EmbodiedEnvCfg`. ### BaseEnvCfg Parameters -Since {class}`envs.EmbodiedEnvCfg` inherits from {class}`envs.EnvCfg`, it includes the following base parameters: +Since {class}`~envs.EmbodiedEnvCfg` inherits from {class}`~envs.EnvCfg`, it includes the following base parameters: * **num_envs** (int): The number of sub environments (arenas) to be simulated in parallel. Defaults to ``1``. -* **sim_cfg** ({class}`embodichain.lab.sim.cfg.SimulationManagerCfg`): +* **sim_cfg** ({class}`~embodichain.lab.sim.SimulationManagerCfg`): Simulation configuration for the environment, including physics settings, device selection, and rendering options. Defaults to a basic configuration with headless mode enabled. * **seed** (int | None): @@ -41,40 +41,40 @@ Since {class}`envs.EmbodiedEnvCfg` inherits from {class}`envs.EnvCfg`, it includ ### EmbodiedEnvCfg Parameters -The {class}`envs.EmbodiedEnvCfg` class exposes the following additional parameters: +The {class}`~envs.EmbodiedEnvCfg` class exposes the following additional parameters: -* **robot** ({class}`embodichain.lab.sim.cfg.RobotCfg`): +* **robot** ({class}`~embodichain.lab.sim.cfg.RobotCfg`): Defines the agent in the scene. Supports loading robots from URDF/MJCF with specified initial state and control mode. This is a required field. -* **sensor** (List[{class}`embodichain.lab.sim.cfg.SensorCfg`]): +* **sensor** (List[{class}`~embodichain.lab.sim.sensor.SensorCfg`]): A list of sensors attached to the scene or robot. Common sensors include {class}`~embodichain.lab.sim.sensors.StereoCamera` for RGB-D and segmentation data. Defaults to an empty list. -* **light** ({class}`envs.EmbodiedEnvCfg.EnvLightCfg`): +* **light** ({class}`~envs.EmbodiedEnvCfg.EnvLightCfg`): Configures the lighting environment. The {class}`EnvLightCfg` class contains: * ``direct``: List of direct light sources (Point, Spot, Directional) affecting local illumination. Defaults to an empty list. * ``indirect``: Global illumination settings (Ambient, IBL) - *planned for future release*. -* **rigid_object** (List[{class}`embodichain.lab.sim.cfg.RigidObjectCfg`]): +* **rigid_object** (List[{class}`~embodichain.lab.sim.cfg.RigidObjectCfg`]): List of dynamic or kinematic simple bodies. Defaults to an empty list. -* **rigid_object_group** (List[{class}`embodichain.lab.sim.cfg.RigidObjectGroupCfg`]): +* **rigid_object_group** (List[{class}`~embodichain.lab.sim.cfg.RigidObjectGroupCfg`]): Collections of rigid objects that can be managed together. Efficient for many similar objects. Defaults to an empty list. -* **articulation** (List[{class}`embodichain.lab.sim.cfg.ArticulationCfg`]): +* **articulation** (List[{class}`~embodichain.lab.sim.cfg.ArticulationCfg`]): List of complex mechanisms with joints (doors, drawers). Defaults to an empty list. -* **background** (List[{class}`embodichain.lab.sim.cfg.RigidObjectCfg`]): +* **background** (List[{class}`~embodichain.lab.sim.cfg.RigidObjectCfg`]): Static or kinematic objects serving as obstacles or landmarks in the scene. Defaults to an empty list. * **events** (Union[object, None]): - Event settings for domain randomization and automated behaviors. Defaults to None, in which case no events are applied through the event manager. Please refer to the {class}`embodichain.lab.gym.managers.EventManager` class for more details. + Event settings for domain randomization and automated behaviors. 
Defaults to None, in which case no events are applied through the event manager. Please refer to the {class}`~envs.managers.EventManager` class for more details. * **observations** (Union[object, None]): - Custom observation specifications. Defaults to None, in which case no additional observations are applied through the observation manager. Please refer to the {class}`embodichain.lab.gym.managers.ObservationManager` class for more details. + Custom observation specifications. Defaults to None, in which case no additional observations are applied through the observation manager. Please refer to the {class}`~envs.managers.ObservationManager` class for more details. * **dataset** (Union[object, None]): - Dataset collection settings. Defaults to None, in which case no dataset collection is performed. Please refer to the {class}`embodichain.lab.gym.managers.DatasetManager` class for more details. + Dataset collection settings. Defaults to None, in which case no dataset collection is performed. Please refer to the {class}`~envs.managers.DatasetManager` class for more details. * **extensions** (Union[Dict[str, Any], None]): Task-specific extension parameters that are automatically bound to the environment instance. This allows passing custom parameters (e.g., ``episode_length``, ``obs_mode``, ``action_scale``) without modifying the base configuration class. These parameters are accessible as instance attributes after environment initialization. For example, if ``extensions = {"episode_length": 500}``, you can access it via ``self.episode_length``. Defaults to None. @@ -114,26 +114,26 @@ class MyTaskEnvCfg(EmbodiedEnvCfg): ## Manager Systems -The manager systems in {class}`envs.EmbodiedEnv` provide modular, configuration-driven functionality for handling complex simulation behaviors. Each manager uses a **functor-based** architecture, allowing you to compose behaviors through configuration without modifying environment code. Functors are reusable functions or classes (inheriting from {class}`envs.managers.Functor`) that operate on the environment state, configured through {class}`envs.managers.cfg.FunctorCfg`. +The manager systems in {class}`~envs.EmbodiedEnv` provide modular, configuration-driven functionality for handling complex simulation behaviors. Each manager uses a **functor-based** architecture, allowing you to compose behaviors through configuration without modifying environment code. Functors are reusable functions or classes (inheriting from {class}`~envs.managers.Functor`) that operate on the environment state, configured through {class}`~envs.managers.cfg.FunctorCfg`. ### Event Manager The Event Manager automates changes in the environment through event functors. Events can be triggered at different stages: * **startup**: Executed once when the environment initializes. Useful for setting up initial scene properties that don't change during episodes. -* **reset**: Executed every time ``env.reset()`` is called. Applied to specific environments that need resetting (via ``env_ids`` parameter). This is the most common mode for domain randomization. +* **reset**: Executed every time {meth}`~envs.Env.reset()` is called. Applied to specific environments that need resetting (via ``env_ids`` parameter). This is the most common mode for domain randomization. * **interval**: Executed periodically every N steps (specified by ``interval_step``, defaults to 10). Can be configured per-environment (``is_global=False``) or globally synchronized (``is_global=True``). 
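
+
+For illustration, a minimal sketch of how a reset-time event could be declared. The functor ``randomize_cube_pose`` and the exact ``EventCfg`` keyword names (``func``, ``mode``) are hypothetical placeholders used only to show the shape of the configuration; see {doc}`event_functors` and the {class}`~envs.managers.cfg.EventCfg` API for the actual functors and fields.
+
+```python
+from embodichain.lab.gym.envs.managers.cfg import EventCfg  # assumed import path
+
+def randomize_cube_pose(env, env_ids):
+    """Hypothetical event functor: re-place a cube for the environments being reset."""
+    ...
+
+class MyTaskEventsCfg:
+    # Executed on every reset of the affected environments (mode="reset").
+    reset_cube = EventCfg(func=randomize_cube_pose, mode="reset")
+```
+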
-Event functors are configured using {class}`envs.managers.cfg.EventCfg`. For a complete list of available event functors, please refer to {doc}`event_functors`. +Event functors are configured using {class}`~envs.managers.cfg.EventCfg`. For a complete list of available event functors, please refer to {doc}`event_functors`. ### Observation Manager -While {class}`envs.EmbodiedEnv` provides default observations organized into two groups: +While {class}`~envs.EmbodiedEnv` provides default observations organized into two groups: * **robot**: Contains ``qpos`` (joint positions), ``qvel`` (joint velocities), and ``qf`` (joint forces). * **sensor**: Contains raw sensor outputs (images, depth, segmentation masks, etc.). -The Observation Manager allows you to extend the observation space with task-specific information. Observations are configured using {class}`envs.managers.cfg.ObservationCfg` with two operation modes: +The Observation Manager allows you to extend the observation space with task-specific information. Observations are configured using {class}`~envs.managers.cfg.ObservationCfg` with two operation modes: * **modify**: Update existing observations in-place. The observation must already exist in the observation dictionary. Useful for normalization, transformation, or filtering existing data. Example: Normalize joint positions to [0, 1] range based on joint limits. * **add**: Compute and add new observations to the observation space. The observation name can use hierarchical keys separated by ``/`` (e.g., ``"object/fork/pose"``). @@ -144,7 +144,7 @@ For a complete list of available observation functors, please refer to {doc}`obs For Imitation Learning (IL) tasks, the Dataset Manager automates data collection through dataset functors. It currently supports: -* **LeRobot Format** (via {class}`envs.managers.datasets.LeRobotRecorder`): +* **LeRobot Format** (via {class}`~envs.managers.datasets.LeRobotRecorder`): Standard format for LeRobot training pipelines. Includes support for task instructions, robot metadata, success flags, and optional video recording. ```{note} @@ -163,11 +163,11 @@ The manager operates in a single mode ``"save"`` which handles both recording an * ``use_videos``: Whether to save video recordings of episodes. * ``export_success_only``: Filter to save only successful episodes (based on ``info["success"]``). -The dataset manager is called automatically during ``env.step()``, ensuring all observation-action pairs are recorded without additional user code. +The dataset manager is called automatically during {meth}`~envs.Env.step()`, ensuring all observation-action pairs are recorded without additional user code. ## Creating a Custom Task -To create a new task, inherit from {class}`envs.EmbodiedEnv` and implement the task-specific logic. +To create a new task, inherit from {class}`~envs.EmbodiedEnv` and implement the task-specific logic. ```python from embodichain.lab.gym.envs import EmbodiedEnv, EmbodiedEnvCfg @@ -203,7 +203,7 @@ class MyTaskEnv(EmbodiedEnv): ``` ```{note} -The ``create_demo_action_list`` method is specifically designed for expert demonstration data generation in Imitation Learning scenarios. For Reinforcement Learning tasks, you should override the ``get_reward`` method instead. +The {meth}`~envs.EmbodiedEnv.create_demo_action_list` method is specifically designed for expert demonstration data generation in Imitation Learning scenarios. For Reinforcement Learning tasks, you should override the {meth}`~envs.EmbodiedEnv.get_reward` method instead. 
``` For a complete example of a modular environment setup, please refer to the {ref}`tutorial_modular_env` tutorial. diff --git a/docs/source/overview/gym/event_functors.md b/docs/source/overview/gym/event_functors.md index e1f5a63d..efd8eb3d 100644 --- a/docs/source/overview/gym/event_functors.md +++ b/docs/source/overview/gym/event_functors.md @@ -3,7 +3,7 @@ ```{currentmodule} embodichain.lab.gym.envs.managers ``` -This page lists all available event functors that can be used with the Event Manager. Event functors are configured using {class}`envs.managers.cfg.EventCfg` and can be triggered at different stages: ``startup``, ``reset``, or ``interval``. +This page lists all available event functors that can be used with the Event Manager. Event functors are configured using {class}`~envs.managers.cfg.EventCfg` and can be triggered at different stages: ``startup``, ``reset``, or ``interval``. ## Physics Randomization diff --git a/docs/source/overview/gym/observation_functors.md b/docs/source/overview/gym/observation_functors.md index a3d6c265..f55252f8 100644 --- a/docs/source/overview/gym/observation_functors.md +++ b/docs/source/overview/gym/observation_functors.md @@ -3,7 +3,7 @@ ```{currentmodule} embodichain.lab.gym.envs.managers ``` -This page lists all available observation functors that can be used with the Observation Manager. Observation functors are configured using {class}`envs.managers.cfg.ObservationCfg` and can operate in two modes: ``modify`` (update existing observations) or ``add`` (add new observations). +This page lists all available observation functors that can be used with the Observation Manager. Observation functors are configured using {class}`~cfg.ObservationCfg` and can operate in two modes: ``modify`` (update existing observations) or ``add`` (add new observations). ## Pose Computations @@ -57,8 +57,11 @@ This page lists all available observation functors that can be used with the Obs - Normalize joint positions or velocities to [0, 1] range based on joint limits. Supports both ``qpos_limits`` and ``qvel_limits``. Operates in ``modify`` mode. ``` +```{currentmodule} embodichain.lab.sim.objects +``` + ```{note} -To get robot end-effector poses, you can use the robot's ``compute_fk()`` method directly in your observation functors or task code. +To get robot end-effector poses, you can use the robot's {meth}`~Robot.compute_fk()` method directly in your observation functors or task code. ``` ## Usage Example diff --git a/docs/source/overview/sim/index.rst b/docs/source/overview/sim/index.rst index 48e1a3ca..56f98ef2 100644 --- a/docs/source/overview/sim/index.rst +++ b/docs/source/overview/sim/index.rst @@ -13,13 +13,12 @@ Overview of the Simulation Framework: - Kinematics Solver - Motion Generation -Table of Contents -================= - .. toctree:: :maxdepth: 1 :glob: sim_manager.md + sim_assets.md + sim_sensor.md solvers/index planners/index diff --git a/docs/source/overview/sim/sim_articulation.md b/docs/source/overview/sim/sim_articulation.md index 2e77459b..1cde346f 100644 --- a/docs/source/overview/sim/sim_articulation.md +++ b/docs/source/overview/sim/sim_articulation.md @@ -1,11 +1,13 @@ # Articulation -The `Articulation` class represents the fundamental physics entity for articulated objects (e.g., robots, grippers, cabinets, doors) in EmbodiChain. 

+```{currentmodule} embodichain.lab.sim
+```
 
-## Configuration
+The {class}`~objects.Articulation` class represents the fundamental physics entity for articulated objects (e.g., robots, grippers, cabinets, doors) in EmbodiChain.
 
-Articulations are configured using the `ArticulationCfg` dataclass.
+## Configuration
 
+Articulations are configured using the {class}`~cfg.ArticulationCfg` dataclass.
 | Parameter | Type | Default | Description |
 | :--- | :--- | :--- | :--- |
 | `fpath` | `str` | `None` | Path to the asset file (URDF/MJCF). |
diff --git a/docs/source/overview/sim/sim_assets.md b/docs/source/overview/sim/sim_assets.md
index dfa07edf..61463b21 100644
--- a/docs/source/overview/sim/sim_assets.md
+++ b/docs/source/overview/sim/sim_assets.md
@@ -1,12 +1,15 @@
 # Simulation Assets
 
-Simulation assets in EmbodiChain are configured using Python dataclasses. This approach provides a structured and type-safe way to define properties for physics, materials, objects and sensors in the simulation environment.
+```{currentmodule} embodichain.lab.sim
+```
+
+Simulation assets in EmbodiChain are configured using Python dataclasses. This approach provides a structured and type-safe way to define properties for physics, materials and objects in the simulation environment.
 
 ## Visual Materials
 
 ### Configuration
 
-The `VisualMaterialCfg` class defines the visual appearance of objects using Physically Based Rendering (PBR) properties.
+The {class}`~material.VisualMaterialCfg` class defines the visual appearance of objects using Physically Based Rendering (PBR) properties.
 
 | Parameter | Type | Default | Description |
 | :--- | :--- | :--- | :--- |
@@ -25,11 +28,11 @@ The `VisualMaterialCfg` class defines the visual appearance of objects using Phy
 
 ### Visual Material and Visual Material Instance
 
-A visual material is defined using the `VisualMaterialCfg` class. It is actually a material template that can be used to create multiple instances with different parameters.
+A visual material is defined using the {class}`~material.VisualMaterialCfg` class. It acts as a material template that can be used to create multiple instances with different parameters.
 
-A visual material instance is created from a visual material using the method `create_instance()`. User can set different properties for each instance. For details API usage, please refer to the [VisualMaterialInst](https://dexforce.github.io/EmbodiChain/api_reference/embodichain/embodichain.lab.sim.html#embodichain.lab.sim.material.VisualMaterialInst) documentation.
+A visual material instance is created from a visual material using the method {meth}`~material.VisualMaterial.create_instance()`. Users can set different properties for each instance. For detailed API usage, please refer to the [VisualMaterialInst](https://dexforce.github.io/EmbodiChain/api_reference/embodichain/embodichain.lab.sim.html#embodichain.lab.sim.material.VisualMaterialInst) documentation.
 
-For batch simualtion scenarios, when user set a material to a object (eg, a rigid object), the material instance will be created for each simulation instance automatically.
+For batch simulation scenarios, when a user assigns a material to an object (e.g., a rigid object with `num_envs` instances), a material instance will be created for each simulation instance automatically.
 
 ### Code
 
@@ -56,7 +59,7 @@ mat_inst[0].set_base_color([1.0, 0.0, 0.0, 1.0])
 
 ## Objects
 
-All objects inherit from `ObjectBaseCfg`, which provides common properties. 
+All objects inherit from {class}`~cfg.ObjectBaseCfg`, which provides common properties. **Base Properties** @@ -69,7 +72,7 @@ All objects inherit from `ObjectBaseCfg`, which provides common properties. ## Rigid Object -Configured via `RigidObjectCfg`. +Configured via {class}`~cfg.RigidObjectCfg`. | Parameter | Type | Default | Description | | :--- | :--- | :--- | :--- | @@ -81,7 +84,7 @@ Configured via `RigidObjectCfg`. ### Rigid Body Attributes -The `RigidBodyAttributesCfg` class defines physical properties for rigid bodies. +The {class}`~cfg.RigidBodyAttributesCfg` class defines physical properties for rigid bodies. | Parameter | Type | Default | Description | | :--- | :--- | :--- | :--- | @@ -99,59 +102,12 @@ The `RigidBodyAttributesCfg` class defines physical properties for rigid bodies. | `dynamic_friction` | `float` | `0.5` | Dynamic friction coefficient. | | `static_friction` | `float` | `0.5` | Static friction coefficient. | -For Rigid Object tutorial, please refer to the [Create Scene](https://dexforce.github.io/EmbodiChain/tutorial/create_scene.html). +For Rigid Object tutorial, please refer to the [Create Scene](https://dexforce.github.io/EmbodiChain/tutorial/create_scene.html) tutorial. ## Rigid Object Groups -`RigidObjectGroupCfg` allows initializing multiple rigid objects, potentially from a folder. - - -## Soft Object - -Configured via `SoftObjectCfg`. - -| Parameter | Type | Default | Description | -| :--- | :--- | :--- | :--- | -| `voxel_attr` | `SoftbodyVoxelAttributesCfg` | `...` | Voxelization attributes. | -| `physical_attr` | `SoftbodyPhysicalAttributesCfg` | `...` | Physical attributes. | -| `shape` | `MeshCfg` | `MeshCfg()` | Mesh configuration. | - -### Soft Body Attributes - -Soft bodies require both voxelization and physical attributes. - -**Voxel Attributes (`SoftbodyVoxelAttributesCfg`)** - -| Parameter | Type | Default | Description | -| :--- | :--- | :--- | :--- | -| `triangle_remesh_resolution` | `int` | `8` | Resolution to remesh the softbody mesh before building physx collision mesh. | -| `triangle_simplify_target` | `int` | `0` | Simplify mesh faces to target value. | -| `simulation_mesh_resolution` | `int` | `8` | Resolution to build simulation voxelize textra mesh. | -| `simulation_mesh_output_obj` | `bool` | `False` | Whether to output the simulation mesh as an obj file for debugging. | +{class}`~cfg.RigidObjectGroupCfg` allows initializing multiple rigid objects, potentially from a folder. -**Physical Attributes (`SoftbodyPhysicalAttributesCfg`)** - -| Parameter | Type | Default | Description | -| :--- | :--- | :--- | :--- | -| `youngs` | `float` | `1e6` | Young's modulus (higher = stiffer). | -| `poissons` | `float` | `0.45` | Poisson's ratio (higher = closer to incompressible). | -| `dynamic_friction` | `float` | `0.0` | Dynamic friction coefficient. | -| `elasticity_damping` | `float` | `0.0` | Elasticity damping factor. | -| `material_model` | `SoftBodyMaterialModel` | `CO_ROTATIONAL` | Material constitutive model. | -| `enable_kinematic` | `bool` | `False` | If True, (partially) kinematic behavior is enabled. | -| `enable_ccd` | `bool` | `False` | Enable continuous collision detection. | -| `enable_self_collision` | `bool` | `False` | Enable self-collision handling. | -| `mass` | `float` | `-1.0` | Total mass. If negative, density is used. | -| `density` | `float` | `1000.0` | Material density in kg/m^3. 
| - -For Soft Object tutorial, please refer to the [Soft Body Simulation](https://dexforce.github.io/EmbodiChain/tutorial/create_softbody.html). - - -### Articulations & Robots - -Configured via `ArticulationCfg` and `RobotCfg` (which inherits from `ArticulationCfg`). - -These configurations are typically loaded from URDF or MJCF files. ### Lights @@ -164,15 +120,11 @@ Configured via `LightCfg`. | `intensity` | `float` | `50.0` | Intensity in watts/m^2. | | `radius` | `float` | `1e2` | Falloff radius. | -### Markers - -Configured via `MarkerCfg` for debugging and visualization. - -| Parameter | Type | Default | Description | -| :--- | :--- | :--- | :--- | -| `marker_type` | `Literal` | `"axis"` | "axis", "line", or "point". | -| `axis_size` | `float` | `0.002` | Thickness of axis lines. | -| `axis_len` | `float` | `0.005` | Length of axis arms. | -| `line_color` | `list` | `[1, 1, 0, 1.0]` | RGBA color for lines. | +```{toctree} +:maxdepth: 1 +sim_soft_object.md +sim_articulation.md +sim_robot.md +``` \ No newline at end of file diff --git a/docs/source/overview/sim/sim_manager.md b/docs/source/overview/sim/sim_manager.md index 823786b8..d2f19be2 100644 --- a/docs/source/overview/sim/sim_manager.md +++ b/docs/source/overview/sim/sim_manager.md @@ -1,14 +1,17 @@ # Simulation Manager -The `SimulationManager` is the central class in EmbodiChain's simulation framework for managing the simulation lifecycle. It handles: -- **Asset Management**: Loading and managing robots, rigid objects, soft objects, articulations, sensors, and lights. +```{currentmodule} embodichain.lab.sim +``` + +The {class}`SimulationManager` is the central class in EmbodiChain's simulation framework for managing the simulation lifecycle. It handles: +- **Asset Management**: Loading and managing robots, rigid objects, soft objects, articulations, and lights. - **Simulation Loop**: Controlling the physics stepping and rendering updates. - **Rendering**: Managing the simulation window, camera rendering, material settings and ray-tracing configuration. - **Interaction**: Providing gizmo controls for interactive manipulation of objects. ## Configuration -The simulation is configured using the `SimulationManagerCfg` class. +The simulation is configured using the {class}`SimulationManagerCfg` class. ```python from embodichain.lab.sim import SimulationManagerCfg @@ -45,7 +48,7 @@ sim_config = SimulationManagerCfg( ### Physics Configuration -The `PhysicsCfg` class controls the global physics simulation parameters. +The {class}`~cfg.PhysicsCfg` class controls the global physics simulation parameters. | Parameter | Type | Default | Description | | :--- | :--- | :--- | :--- | @@ -77,7 +80,6 @@ The manager provides methods to add, retrieve and remove various simulation asse - Soft Objects - Articulations - Robots -- Sensors - Lights - Materials @@ -87,7 +89,7 @@ For more details on simulation assets, please refer to their respective document ### Manual Update mode -In this mode, the physics simulation should be explicitly stepped by calling `update()` method, which provides precise control over the simulation timing. +In this mode, the physics simulation should be explicitly stepped by calling {meth}`~SimulationManager.update()` method, which provides precise control over the simulation timing. The use case for manual update mode includes: - Data generation with openai gym environments, in which the observation and action must be synchronized with the physics simulation. 
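
+
+A minimal sketch of this workflow (scene setup omitted), using only the methods documented on this page:
+
+```python
+from embodichain.lab.sim import SimulationManager, SimulationManagerCfg
+
+sim = SimulationManager(SimulationManagerCfg(headless=True))
+sim.set_manual_update(True)  # physics advances only when update() is called
+
+for _ in range(100):
+    # ... apply robot commands / set object states here ...
+    sim.update()  # step physics once using the configured physics_dt
+    # ... read observations here, now synchronized with the physics state ...
+```
+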
@@ -110,11 +112,13 @@ In this mode, the physics simulation stepping is automatically handling by the p > When in automatic update mode, user are recommanded to use CPU `sim_device` for simulation. -### Methods +### Mainly used methods + +- **`SimulationManager.update(physics_dt=None, step=1)`**: Steps the physics simulation with optional custom time step and number of steps. If `physics_dt` is None, uses the configured physics time step. +- **`SimulationManager.enable_physics(enable: bool)`**: Enable or disable physics simulation. +- **`SimulationManager.set_manual_update(enable: bool)`**: Set manual update mode for physics. -- **`update(physics_dt=None, step=1)`**: Steps the physics simulation with optional custom time step and number of steps. If `physics_dt` is None, uses the configured physics time step. -- **`enable_physics(enable: bool)`**: Enable or disable physics simulation. -- **`set_manual_update(enable: bool)`**: Set manual update mode for physics. +For more methods and details, refer to the [SimulationManager](https://dexforce.github.io/EmbodiChain/api_reference/embodichain/embodichain.lab.sim.html#embodichain.lab.sim.SimulationManager) documentation. ### Related Tutorials diff --git a/docs/source/overview/sim/sim_robot.md b/docs/source/overview/sim/sim_robot.md index 61134850..e0ab5992 100644 --- a/docs/source/overview/sim/sim_robot.md +++ b/docs/source/overview/sim/sim_robot.md @@ -1,11 +1,13 @@ # Robot -The `Robot` class extends `Articulation` to support advanced robotics features such as kinematic solvers (IK/FK), motion planners, and part-based control (e.g., controlling "arm" and "gripper" separately). +```{currentmodule} embodichain.lab.sim +``` -## Configuration +The {class}`~objects.Robot` class extends {class}`~objects.Articulation` to support advanced robotics features such as kinematic solvers (IK/FK), motion planners, and part-based control (e.g., controlling "arm" and "gripper" separately). -Robots are configured using `RobotCfg`. +## Configuration +Robots are configured using {class}`~cfg.RobotCfg`. | Parameter | Type | Default | Description | | :--- | :--- | :--- | :--- | | `control_parts` | `Dict[str, List[str]]` | `None` | Defines groups of joints (e.g., `{"arm": ["joint1", ...], "hand": ["finger1", ...]}`). | @@ -94,7 +96,7 @@ Get standard proprioceptive observation data for learning agents. obs = robot.get_proprioception() ``` ### Advanced API -The Robot class overrides standard Articulation methods to support the name argument for part-specific operations. +The Robot class overrides standard Articulation methods to support the name argument for part-specific operations. | Method | Description | | :--- | :--- | | `set_qpos(..., name="part")` | Set joint positions for a specific part. | @@ -103,5 +105,4 @@ The Robot class overrides standard Articulation methods to support the name argu | `get_qpos(name="part")` | Get joint positions of a specific part. | | `get_qvel(name="part")` | Get joint velocities of a specific part. | - - +For more API details, refer to the {class}`~objects.Robot` documentation. diff --git a/docs/source/overview/sim/sim_sensor.md b/docs/source/overview/sim/sim_sensor.md index ed240f5b..22f435c5 100644 --- a/docs/source/overview/sim/sim_sensor.md +++ b/docs/source/overview/sim/sim_sensor.md @@ -1,12 +1,15 @@ # Sensors +```{currentmodule} embodichain.lab.sim.sensors +``` + The Simulation framework provides sensor interfaces for agents to perceive the environment. Currently, the primary supported sensor type is the **Camera**. 

 
 ## Camera
 
 ### Configuration
 
-The `CameraCfg` class defines the configuration for camera sensors. It inherits from `SensorCfg` and controls resolution, clipping planes, intrinsics, and active data modalities.
+The {class}`CameraCfg` class defines the configuration for camera sensors. It inherits from {class}`~SensorCfg` and controls resolution, clipping planes, intrinsics, and active data modalities.
 
 | Parameter | Type | Default | Description |
 | :--- | :--- | :--- | :--- |
@@ -72,4 +75,51 @@ Retrieve sensor data using camera.get_data(). The data is returned as a dictiona
 | `normal` | `torch.float32` | `(B, H, W, 3)` | Surface normal vectors. |
 | `position` | `torch.float32` | `(B, H, W, 3)` | 3D Position map (OpenGL coords). |
 
-*Note: `B` represents the number of environments (batch size).*
\ No newline at end of file
+*Note: `B` represents the number of environments (batch size).*
+
+## Stereo Camera
+
+### Configuration
+
+The {class}`StereoCameraCfg` class defines the configuration for stereo camera sensors. It inherits from {class}`CameraCfg` and includes additional settings for the right camera and stereo-specific features like disparity computation.
+
+In addition to the standard {class}`CameraCfg` parameters, it supports the following:
+
+| Parameter | Type | Default | Description |
+| :--- | :--- | :--- | :--- |
+| `intrinsics_right` | `tuple` | `(600, 600, 320.0, 240.0)` | The intrinsics for the right camera `(fx, fy, cx, cy)`. |
+| `left_to_right_pos` | `tuple` | `(0.05, 0.0, 0.0)` | Position offset `[x, y, z]` from the left camera to the right camera. |
+| `left_to_right_rot` | `tuple` | `(0.0, 0.0, 0.0)` | Rotation offset `[x, y, z]` (Euler angles in degrees) from the left camera to the right camera. |
+| `enable_disparity` | `bool` | `False` | Enable disparity map computation. *Note: Requires `enable_depth` to be `True`.* |
+
+### Usage
+
+You can create a stereo camera sensor using `sim.add_sensor()` with a `StereoCameraCfg` object.
+
+#### Code Example
+
+```python
+from embodichain.lab.sim.sensors import StereoCamera, StereoCameraCfg
+
+# 1. Define Configuration
+stereo_cfg = StereoCameraCfg(
+    width=640,
+    height=480,
+    # Intrinsics for Left (inherited) and Right cameras
+    intrinsics=(600, 600, 320.0, 240.0),
+    intrinsics_right=(600, 600, 320.0, 240.0),
+    # Baseline configuration (e.g., 5cm baseline)
+    left_to_right_pos=(0.05, 0.0, 0.0),
+    extrinsics=StereoCameraCfg.ExtrinsicsCfg(
+        parent="head_link",
+        pos=[0.1, 0.0, 0.0],
+    ),
+    # Data modalities
+    enable_color=True,
+    enable_depth=True,
+    enable_disparity=True,
+)
+
+# 2. Add Sensor to Simulation
+stereo_camera: StereoCamera = sim.add_sensor(sensor_cfg=stereo_cfg)
+```
\ No newline at end of file
diff --git a/docs/source/overview/sim/sim_soft_object.md b/docs/source/overview/sim/sim_soft_object.md
index 4e4130e8..27f9e8ba 100644
--- a/docs/source/overview/sim/sim_soft_object.md
+++ b/docs/source/overview/sim/sim_soft_object.md
@@ -1,16 +1,50 @@
 # Soft Object
 
-The `SoftObject` class represents deformable entities (e.g., cloth, sponges, soft robotics) in EmbodiChain. Unlike rigid bodies, soft objects are defined by vertices and meshes rather than a single rigid pose.
+```{currentmodule} embodichain.lab.sim
+```
+
+The {class}`~objects.SoftObject` class represents deformable entities (e.g., cloth, sponges, soft robotics) in EmbodiChain. Unlike rigid bodies, soft objects are defined by vertices and meshes rather than a single rigid pose.
 
 ## Configuration
 
-Soft objects are configured using the `SoftObjectCfg` dataclass. 

+Configured via {class}`~cfg.SoftObjectCfg`.
 
 | Parameter | Type | Default | Description |
 | :--- | :--- | :--- | :--- |
-| `fpath` | `str` | `None` | Path to the soft body asset file (e.g., `.msh`, `.vtk`). |
-| `init_pos` | `tuple` | `(0,0,0)` | Initial position `(x, y, z)`. |
-| `init_rot` | `tuple` | `(0,0,0)` | Initial rotation `(r, p, y)` in degrees. |
+| `voxel_attr` | `SoftbodyVoxelAttributesCfg` | `...` | Voxelization attributes. |
+| `physical_attr` | `SoftbodyPhysicalAttributesCfg` | `...` | Physical attributes. |
+| `shape` | `MeshCfg` | `MeshCfg()` | Mesh configuration. |
+
+### Soft Body Attributes
+
+Soft bodies require both voxelization and physical attributes.
+
+**Voxel Attributes ({class}`~cfg.SoftbodyVoxelAttributesCfg`)**
+
+| Parameter | Type | Default | Description |
+| :--- | :--- | :--- | :--- |
+| `triangle_remesh_resolution` | `int` | `8` | Resolution used to remesh the softbody mesh before building the PhysX collision mesh. |
+| `triangle_simplify_target` | `int` | `0` | Simplify the mesh faces to the target value. |
+| `simulation_mesh_resolution` | `int` | `8` | Resolution used to build the voxelized tetrahedral simulation mesh. |
+| `simulation_mesh_output_obj` | `bool` | `False` | Whether to output the simulation mesh as an OBJ file for debugging. |
+
+**Physical Attributes ({class}`~cfg.SoftbodyPhysicalAttributesCfg`)**
+
+| Parameter | Type | Default | Description |
+| :--- | :--- | :--- | :--- |
+| `youngs` | `float` | `1e6` | Young's modulus (higher = stiffer). |
+| `poissons` | `float` | `0.45` | Poisson's ratio (higher = closer to incompressible). |
+| `dynamic_friction` | `float` | `0.0` | Dynamic friction coefficient. |
+| `elasticity_damping` | `float` | `0.0` | Elasticity damping factor. |
+| `material_model` | `SoftBodyMaterialModel` | `CO_ROTATIONAL` | Material constitutive model. |
+| `enable_kinematic` | `bool` | `False` | If True, (partially) kinematic behavior is enabled. |
+| `enable_ccd` | `bool` | `False` | Enable continuous collision detection. |
+| `enable_self_collision` | `bool` | `False` | Enable self-collision handling. |
+| `mass` | `float` | `-1.0` | Total mass. If negative, density is used. |
+| `density` | `float` | `1000.0` | Material density in kg/m^3. |
+
+For a step-by-step example, please refer to the [Soft Body Simulation](https://dexforce.github.io/EmbodiChain/tutorial/create_softbody.html) tutorial.
+
 
 ### Setup & Initialization
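
+
+The snippet below is a rough sketch of how these configuration pieces fit together. The import locations, the `uid` field, and the `add_soft_object` call are assumptions for illustration only; check the API reference for the actual paths and signatures:
+
+```python
+from embodichain.lab.sim.cfg import (  # assumed import path
+    SoftObjectCfg,
+    SoftbodyPhysicalAttributesCfg,
+    SoftbodyVoxelAttributesCfg,
+)
+
+soft_cfg = SoftObjectCfg(
+    uid="sponge",
+    physical_attr=SoftbodyPhysicalAttributesCfg(youngs=5e5, poissons=0.45, density=800.0),
+    voxel_attr=SoftbodyVoxelAttributesCfg(simulation_mesh_resolution=8),
+)
+soft_object = sim.add_soft_object(soft_cfg)  # assumed SimulationManager method
+```
+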