# 🌟 OmniPerception

**Omnidirectional Collision Avoidance for Legged Locomotion in Dynamic Environments**

*CoRL 2025*

Accelerated sensor simulation with GPU-optimized lidar sensors for large-scale robotics environments.
## 📖 Table of Contents
- 🌟 Project Overview
- ⚡ Performance
- 🔧 Supported Hardware
- 🎨 Visualization
- 🚀 Getting Started
- 📚 Integration Guide
- 🤝 Contributing
## 📋 Project Status
| Component | Status | Description |
|---|---|---|
| 🎯 LidarSensor | 🚧 Partially Released | High-performance, GPU-accelerated lidar simulation |
| 🤖 Training Code | 🚧 In Progress | Reinforcement learning integration |
| 🚀 Deploy Code | 📅 Planned | Production deployment utilities |
## ⚡ Performance Highlights
- 🔥 Ultra-Fast Rendering: 250ms per step for 4,096 environments with 20,000 rays each (RTX 4090)
- 🎯 Multi-Sensor Support: 11+ lidar sensor types including Livox and Velodyne series
- 🌐 Multi-Platform: Supports IsaacGym, Genesis, IsaacLab, Mujoco, and Isaac Sim
- ⚡ GPU Acceleration: CUDA-optimized ray tracing with Warp backend
## ✨ Key Features
| 🎯 Precision | ⚡ Performance | 🔧 Flexibility | 🌐 Integration |
|---|---|---|---|
| Realistic sensor modeling | GPU-accelerated computation | 11+ sensor types | Multi-platform simulation |
| Self-occlusion support | 250ms/4K environments | Custom configurations | Easy API integration |
| Noise simulation | CUDA optimization | Pattern-based scanning | Multi-robot support |
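The pattern-based scanning listed above can be illustrated with a small NumPy sketch that builds unit ray directions for a Velodyne-style 16-channel spinning lidar (the function and parameter names here are illustrative, not part of the package API):

```python
import numpy as np

def spinning_lidar_directions(n_channels=16, n_azimuth=1800,
                              vert_fov_deg=(-15.0, 15.0)):
    """Unit ray directions for one revolution of a spinning lidar."""
    azim = np.deg2rad(np.linspace(0.0, 360.0, n_azimuth, endpoint=False))
    elev = np.deg2rad(np.linspace(vert_fov_deg[0], vert_fov_deg[1], n_channels))
    az, el = np.meshgrid(azim, elev, indexing="ij")
    # Spherical-to-Cartesian: x forward, y left, z up.
    dirs = np.stack([np.cos(el) * np.cos(az),
                     np.cos(el) * np.sin(az),
                     np.sin(el)], axis=-1)
    return dirs.reshape(-1, 3)  # one ray per (azimuth, channel) pair

rays = spinning_lidar_directions()
print(rays.shape)  # (28800, 3)
```

Solid-state Livox sensors use non-repetitive rosette patterns instead of this regular grid, which is why the package ships recorded pattern files rather than generating directions analytically.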
## 🔧 Supported Hardware

### Lidar Sensors

- Livox Series (e.g., Mid-360)
- Traditional Spinning (e.g., Velodyne series)

### Simulation Platforms
- IsaacGym - NVIDIA's physics simulation platform
- Genesis - High-performance physics engine
- IsaacLab - Next-generation robotics simulation platform
- Mujoco - Advanced physics simulation
- Isaac Sim - Omniverse-based robotics simulation
## 🎨 Visualization Examples

### Real-time Lidar Simulation

*Livox Mid-360 on a Unitree G1 robot*

### Environment Scanning with Obstacle Detection

*Features: self-occlusion modeling, real-time point cloud generation, multi-environment support*
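The real-time point cloud generation shown above boils down to scaling unit ray directions by measured ranges and discarding rays with no return. A minimal sketch (function and variable names are illustrative):

```python
import numpy as np

def ranges_to_points(ranges, directions, max_range=30.0):
    """Convert per-ray range readings into a 3D point cloud.

    ranges:     (N,) measured distances; >= max_range means "no return".
    directions: (N, 3) unit ray directions in the sensor frame.
    """
    hit = ranges < max_range              # drop rays that hit nothing
    return ranges[hit, None] * directions[hit]

# Tiny example: three axis-aligned rays, one of which misses everything.
dirs = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]])
ranges = np.array([2.0, 30.0, 5.0])
points = ranges_to_points(ranges, dirs)
print(points)  # [[2. 0. 0.], [0. 0. 5.]]
```

Transforming the result by the sensor's world pose yields the world-frame cloud rendered in the visualizations.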
## 💡 Development Status

> 📢 **Important Note:** This project is under active development. While the LidarSensor module is fully functional and optimized, we're continuously improving documentation and code structure. For any issues or questions, please open an issue.
## 🚀 Getting Started

### 📦 Installation

#### Prerequisites

- Python 3.8+
- CUDA 11.0+ (for GPU acceleration)
- One of: IsaacGym, Genesis, Mujoco, or Isaac Sim
#### Quick Install

```bash
# Install core dependencies
pip install warp-lang[extras] taichi

# Install LidarSensor
cd LidarSensor
pip install -e .
```

#### Optional Dependencies

```bash
# For ROS integration (optional)
source /opt/ros/humble/setup.bash

# For advanced visualization (optional)
pip install matplotlib open3d
```

### 🎯 Quick Usage
#### 1. Basic Example: IsaacGym

```bash
# Generate the self-occlusion mesh (first run only)
cd LidarSensor/resources/robots/g1_29/
python process_body_mesh.py

# Run the example with the Unitree G1
cd LidarSensor/example/isaacgym
python unitree_g1.py
```

#### 2. ROS Integration Example

```bash
# Start ROS visualization
source /opt/ros/humble/setup.bash
/usr/bin/python3 LidarSensor/LidarSensor/sensor_pattern/sensor_lidar/lidar_vis_ros2.py
```

#### 3. Custom Configuration

```python
from LidarSensor.sensor_config.lidar_sensor_config import LidarConfig, LidarType

# Create a custom sensor configuration
config = LidarConfig(
    sensor_type=LidarType.MID360,
    max_range=30.0,
    enable_sensor_noise=False,
)
```

## 🚀 Platform-Specific Integration Guides
Choose your simulation platform for detailed installation and usage instructions:
### 📦 IsaacLab (recommended for large-scale RL training)

Native LiDAR integration with 7+ Livox sensor types and optimized performance.

Key Features:
- ✅ Native `LidarSensor` class integration
- ✅ Realistic Livox scan patterns (`.npy` files)
- ✅ Optimized for 1,000+ environments
- ✅ Easy dataclass configuration
### 🎮 IsaacGym (for NVIDIA GPU-accelerated physics)

Direct GPU ray-casting with Warp integration.
Key Features:
- ✅ GPU-accelerated ray tracing
- ✅ Multiple sensor configurations
- ✅ Real-time visualization
- ✅ Flexible terrain integration
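The GPU-accelerated ray tracing above launches one thread per ray against the scene geometry. As a CPU stand-in for those Warp kernels, here is a batched slab test for ray/axis-aligned-box intersection in NumPy (a sketch of the idea, not the package's actual kernel):

```python
import numpy as np

def ray_aabb_distance(origins, dirs, box_min, box_max, max_range=30.0):
    """Batched slab test: distance from each ray to an axis-aligned box.

    Returns max_range for rays that miss. A GPU kernel would run the
    same per-ray arithmetic with one thread per ray.
    """
    inv = 1.0 / np.where(dirs == 0.0, 1e-12, dirs)  # avoid divide-by-zero
    t1 = (box_min - origins) * inv
    t2 = (box_max - origins) * inv
    t_near = np.max(np.minimum(t1, t2), axis=-1)    # last entering plane
    t_far = np.min(np.maximum(t1, t2), axis=-1)     # first exiting plane
    hit = (t_far >= np.maximum(t_near, 0.0)) & (t_near <= max_range)
    return np.where(hit, np.maximum(t_near, 0.0), max_range)

origins = np.zeros((2, 3))
dirs = np.array([[1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
d = ray_aabb_distance(origins, dirs,
                      np.array([2.0, -1.0, -1.0]), np.array([4.0, 1.0, 1.0]))
print(d)  # [ 2. 30.]: the +x ray hits the box at x=2, the +z ray misses
```

In practice the simulators trace against triangle meshes rather than boxes, but the hit/miss and nearest-distance logic is the same.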
### 🌟 Genesis (for high-performance physics simulation)

A modern physics engine with optimized LiDAR support.
Key Features:
- ✅ High-performance physics
- ✅ Multiple robot platforms
- ✅ Realistic sensor modeling
- ✅ Cross-platform support
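Realistic sensor modeling, as listed for each platform above, combines range clipping with optional measurement noise. A minimal sketch mirroring the `max_range` and `enable_sensor_noise` fields of `LidarConfig` (the Gaussian noise model and all names here are assumptions for illustration; the package's actual model may differ):

```python
import numpy as np

def apply_sensor_model(true_ranges, max_range=30.0, noise_std=0.02,
                       enable_sensor_noise=True, seed=0):
    """Clip true distances to the sensor's max range and optionally
    perturb them with Gaussian range noise."""
    r = np.asarray(true_ranges, dtype=float)
    if enable_sensor_noise:
        r = r + np.random.default_rng(seed).normal(0.0, noise_std, r.shape)
    return np.clip(r, 0.0, max_range)

clean = apply_sensor_model([1.0, 35.0], enable_sensor_noise=False)
print(clean)  # [ 1. 30.]: the 35 m reading saturates at max_range
```

Disabling noise (as in the Quick Usage configuration) keeps training deterministic; enabling it narrows the sim-to-real gap for deployment.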
## 🤝 Contributing & Support
- 🐛 Issues: Report bugs or request features via GitHub Issues
- 📖 Documentation: Platform-specific guides in the `/example/` directories
- 🔬 Research: Cite our work if you use OmniPerception in research
- 💬 Discussions: Join our community for tips and collaboration
## Cite

```bibtex
@article{wang2025omni,
  title={Omni-Perception: Omnidirectional Collision Avoidance for Legged Locomotion in Dynamic Environments},
  author={Wang, Zifan and Ma, Teli and Jia, Yufei and Yang, Xun and Zhou, Jiaming and Ouyang, Wenlong and Zhang, Qiang and Liang, Junwei},
  journal={arXiv preprint arXiv:2505.19214},
  year={2025}
}
```