
Genesis: A Physics Simulator with (Planned) Native Support for Generative AI


Introduction

The recently announced physical-AI engine "Genesis" has been generating a lot of buzz. In this article, I'll cover how to set it up and its basic usage.

https://x.com/zhou_xian_/status/1869511650782658846

What is Genesis

Genesis is a physics platform designed for general-purpose robotic and physical AI applications, developed by researchers from NVIDIA, Carnegie Mellon University, and others.
https://genesis-embodied-ai.github.io/

It offers capabilities not found in other physics engines, such as native support for generative AI features like data generation from natural language.

  • 🐍 100% Python
  • 👶 Easy installation & simple API design
  • 🚀 Speedup through parallelized simulation
  • 💥 Unified framework for diverse physical phenomena
  • 📸 Photorealistic ray-tracing rendering
  • 📐 Differentiable simulator
  • ☝🏻 Physically accurate and differentiable tactile sensor
  • 🌌 Generative data creation via natural language

Environment Setup

Here is the environment I used for testing:

  • Apple M1 (macOS Sequoia 15.2)
  • Python 3.11
  • Miniconda installed
    • Note: there was an issue where the viewer wouldn't start due to an OpenGL error when running Python via uv (Issue #11)

Installation

Record of my failed attempt with uv

I followed the Getting Started guide in the official documentation.

uv add genesis-world

Install PyTorch if it's missing.

uv add torch

To verify the installation, I'll try running the sample code from 👋🏻 Hello, Genesis.

hello.py
import genesis as gs
gs.init(backend=gs.cpu)

scene = gs.Scene(show_viewer=True)
plane = scene.add_entity(gs.morphs.Plane())
franka = scene.add_entity(
    gs.morphs.MJCF(file='xml/franka_emika_panda/panda.xml'),
)

scene.build()

for i in range(1000):
    scene.step()

uv run python hello.py

Execution was completed, but the viewer did not appear, accompanied by the following warning:

[Genesis] [17:00:00] [WARNING] Non-linux system detected. In order to use the interactive viewer, you need to manually run simulation in a separate thread and then start viewer. See `examples/render_on_macos.py`.

It seems that to use the interactive viewer on non-Linux machines, you need to manually run the simulation in a separate thread and then start the viewer.

As stated in the warning, I tried running examples/render_on_macos.py instead.

uv run python render_on_macos.py --vis

I hit the bug reported in Issue #11, and since it looked like it would take time to resolve, I decided to rebuild the environment with Miniconda.

The steps in the official Getting Started guide are simple, but in my environment installation sometimes failed depending on the order of operations, so caution is needed (see also Issue #61).

If PyTorch is not installed, install it first.

conda install pytorch::pytorch torchvision torchaudio -c pytorch

Install genesis-world.

pip install genesis-world

To verify the installation, let's try running the sample code from 👋🏻 Hello, Genesis.

hello.py
import genesis as gs
gs.init(backend=gs.cpu)

scene = gs.Scene(show_viewer=True)
plane = scene.add_entity(gs.morphs.Plane())
franka = scene.add_entity(
    gs.morphs.MJCF(file='xml/franka_emika_panda/panda.xml'),
)

scene.build()

for i in range(1000):
    scene.step()

python hello.py

As before, the run completed but the viewer did not appear, with the same non-Linux warning telling me to run the simulation in a separate thread and start the viewer manually.

As suggested in the warning, let's try running examples/render_on_macos.py instead.

python render_on_macos.py --vis

The viewer launched successfully, and I was able to confirm that the arm fell freely to the floor.

Basic Usage

Initialization

In the first step, you need to import and initialize Genesis.

import genesis as gs
gs.init(backend=gs.cpu, precision="32")

  • Backend device: Genesis is cross-platform. Here gs.cpu is specified, but you can switch to other backends such as gs.cuda.
  • Precision: the default is f32, but you can switch to f64 by passing "64" when higher precision is required.

Creating a Scene

All objects, robots, cameras, etc., are placed within a scene.

scene = gs.Scene(
    sim_options=gs.options.SimOptions(
        dt=0.01,
        gravity=(0, 0, -10.0),
    ),
    show_viewer=True,
    viewer_options=gs.options.ViewerOptions(
        camera_pos=(3.5, 0.0, 2.5),
        camera_lookat=(0.0, 0.0, 0.5),
        camera_fov=40,
    ),
)

In this example, we set the simulation dt to 0.01s per step, set the gravity, and configure the initial camera pose for the viewer.

Adding Objects

In Genesis, all objects and robots are represented as Entities. Since it is designed with an object-oriented approach, you can interact with these Entities directly through methods instead of using handles or assigned global IDs.

plane = scene.add_entity(gs.morphs.Plane())
franka = scene.add_entity(
    gs.morphs.MJCF(file='xml/franka_emika_panda/panda.xml'),
)

The first argument of add_entity is a Morph type, which includes primitive types such as:

  • gs.morphs.Box: Box
  • gs.morphs.Sphere: Sphere
  • gs.morphs.Cylinder: Cylinder
  • gs.morphs.Plane: Plane

You can also load 3D models created with other tools. The currently supported formats are:

  • gs.morphs.MJCF: MuJoCo XML file
  • gs.morphs.URDF: URDF (Unified Robotics Description Format) file
  • gs.morphs.Mesh: Mesh assets (*.obj, *.ply, *.stl, *.glb, *.gltf)
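Which loader class applies is determined by the asset's file type. As a rough illustration of that mapping (the helper below is my own hypothetical sketch, not part of the Genesis API), one could dispatch on the file extension like this:

```python
from pathlib import Path

# Hypothetical helper: map a file extension to the matching Genesis morph
# class name. This mirrors the supported formats listed above; it is NOT
# an actual Genesis API.
MORPH_FOR_EXT = {
    ".xml": "gs.morphs.MJCF",    # MuJoCo model
    ".urdf": "gs.morphs.URDF",   # robot description
    ".obj": "gs.morphs.Mesh",
    ".ply": "gs.morphs.Mesh",
    ".stl": "gs.morphs.Mesh",
    ".glb": "gs.morphs.Mesh",
    ".gltf": "gs.morphs.Mesh",
}

def morph_for(path: str) -> str:
    """Return the morph class name for a given asset path."""
    ext = Path(path).suffix.lower()
    try:
        return MORPH_FOR_EXT[ext]
    except KeyError:
        raise ValueError(f"unsupported asset format: {ext}")

print(morph_for("xml/franka_emika_panda/panda.xml"))  # → gs.morphs.MJCF
```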

Running the Simulation

Build the assets added so far and run the simulation.

scene.build()
for i in range(1000):
    scene.step()

Note that you must call scene.build() before stepping. Genesis JIT-compiles its simulation kernels on the fly for each run, and building the scene is the explicit step that triggers this compilation.

Other Features

Parallel Simulation

The biggest advantage of accelerating simulations using GPUs is that scene-level parallelism allows robots to be trained in thousands of environments simultaneously.

When building a scene, you can set the number of environments required for the simulator simply by passing the n_envs parameter.

import torch

B = 20
scene.build(n_envs=B, env_spacing=(1.0, 1.0))

# Also specify the batch size when inputting via the controller
franka.control_dofs_position(torch.zeros(B, 9, device=gs.device))

Note that you also need to specify the batch size for the controller input.

Conveniently, even if you forget to specify the batch dimension, Genesis adds it automatically and continues execution, emitting a warning:

[Genesis] [17:51:31] [WARNING] Input tensor is converted to torch.Size([20, 9]) for an additional batch dimension
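What that automatic expansion amounts to can be sketched in plain PyTorch (this is an illustration of the broadcasting, not Genesis's actual internal code):

```python
import torch

B = 20  # number of parallel environments
cmd = torch.zeros(9)  # a single-environment command, shape (9,)

# Add a batch dimension and repeat across environments, mirroring
# the torch.Size([20, 9]) mentioned in the warning above.
batched = cmd.unsqueeze(0).expand(B, -1)
print(batched.shape)  # → torch.Size([20, 9])
```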

Inverse Kinematics and Motion Planning

Inverse kinematics calculates the joint angles and movements required to move the robot's end effector to a target position, while motion planning plans the path for the robot to reach that target, taking obstacles and the environment into account.

Note that you need to install the OMPL (Open Motion Planning Library) module beforehand to perform motion planning.

import numpy as np

# Get the end-effector link
end_effector = franka.get_link('hand')

# Calculate the position and posture just before grasping using inverse kinematics
qpos = franka.inverse_kinematics(
    link = end_effector,
    pos  = np.array([0.65, 0.0, 0.25]),
    quat = np.array([0, 1, 0, 0]),
)
# Position to open the gripper
qpos[-2:] = 0.04
path = franka.plan_path(
    qpos_goal     = qpos,
    num_waypoints = 200, # 2s duration
)
# Execute the planned path
for waypoint in path:
    franka.control_dofs_position(waypoint)
    scene.step()

# Adjustments to reach the final position, as there may be a gap 
# between the target and current position in controller-based control.
for i in range(100):
    scene.step()
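The `# 2s duration` comment on num_waypoints follows from the scene's step size: one scene.step() is executed per waypoint, so the trajectory duration is simply num_waypoints × dt (with dt = 0.01 s, as set in SimOptions earlier):

```python
# Trajectory duration = number of waypoints x simulation step size.
dt = 0.01            # seconds per scene.step(), as set in SimOptions
num_waypoints = 200  # one step per waypoint in the execution loop

duration = num_waypoints * dt
print(f"{duration:.2f} s")  # → 2.00 s
```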

Simulation of Non-Rigid Bodies

In addition to the rigid body simulations we have done so far, Genesis also supports physics solvers for fluid dynamics and more.


Image: from the Genesis homepage

Future Development Roadmap

The official documentation also includes a roadmap.

Of these, the "Comprehensive Generative Framework" is particularly exciting: it is a feature that converts prompts written by users in natural language into data of various modalities, such as motion.

Ongoing and Upcoming Features

  • Differentiable, physics-based tactile sensor module
  • Differentiable rigid body simulation
  • Tiled rendering
  • Fast JIT kernel compilation
  • Comprehensive generative framework
    • Character motion
    • Camera motion
    • Interactive scenes
    • Facial animation
    • Locomotion policies
    • Manipulation policies
  • Unbounded MPM (Material Point Method) simulation for large-scale environments

Requested but Not Currently Under Development

  • Viewer and headless rendering on Windows
  • Interactive GUI system
  • Support for more MPM-based material models
  • Support for more sensor types

Summary

Amid the growing excitement around software development using generative AI, its application to robotics with AI as the "intelligence" is rapidly expanding.

Conventional physics simulators often suffer from awkward design and hard-to-navigate documentation, which gave me the impression of a steep learning curve for newcomers, myself included. As stated in its long-term mission, however, Genesis lowers those hurdles and aims to make physics simulation accessible to non-experts and individuals.

I intend to continue exploring the potential of this project and delve deeper into its practical applications while continuing to use it myself.

See you again soon!

Side Note

  • Isn't the camera-position control I used to record the videos for this article a bit too difficult?
  • I thought about participating in an Advent Calendar, but there weren't any suitable category slots left at the last minute, so I'm posting this independently.
