Campus Autonomy - Autonomous Indoor-Outdoor Delivery Vehicle
Development of an autonomous delivery vehicle capable of navigating both indoor and outdoor environments within a campus setting
Project Overview
The “Campus Autonomy” project focuses on developing an autonomous delivery vehicle capable of navigating both indoor and outdoor environments within a campus setting. Built around an AgileX HUNTER SE, an Ackermann-steered vehicle equipped with a LiDAR and a panoramic camera, the project addresses the core challenges of autonomous localization, path planning, and navigation.


Key Technologies
Hardware Platform
- Vehicle Platform: AgileX HUNTER SE Ackermann-steered mobile base
- LiDAR: Hesai PandarQT64 for high-precision environment perception
- Vision System: Insta360 Air panoramic camera for 360° visual input
- Navigation Sensors: Wheel odometry and IMU for precise motion tracking
Software Stack
- Framework: ROS2 with Navigation2 package
- SLAM: Cartographer for simultaneous localization and mapping (brought up as sketched below)
- Path Planning: Global and local planners for route generation
- Obstacle Avoidance: Real-time dynamic obstacle detection and avoidance
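As a concrete illustration of how these pieces fit together, here is a minimal ROS2 (Python) launch sketch for bringing up Cartographer SLAM on this stack; the configuration directory and Lua basename are placeholders, not values taken from the project.

```python
# Hypothetical launch sketch: bringing up Cartographer SLAM alongside Nav2.
# The config path and 'hunter_2d.lua' basename are illustrative placeholders.
from launch import LaunchDescription
from launch_ros.actions import Node


def generate_launch_description():
    return LaunchDescription([
        # Cartographer consumes LiDAR scans, odometry, and IMU data to build
        # the map and publish the map->odom transform.
        Node(
            package='cartographer_ros',
            executable='cartographer_node',
            arguments=['-configuration_directory', '/path/to/config',
                       '-configuration_basename', 'hunter_2d.lua'],
        ),
        # Publishes the occupancy grid consumed by the Nav2 global planner.
        Node(
            package='cartographer_ros',
            executable='cartographer_occupancy_grid_node',
            arguments=['-resolution', '0.05'],
        ),
    ])
```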
System Architecture
The system integrates the hardware and software components described above: LiDAR point clouds, panoramic imagery, wheel odometry, and IMU data feed the Cartographer SLAM and localization modules, whose pose estimates in turn drive the Nav2 global and local planners and the vehicle controller.

Key Features
1. Dual Environment Navigation
- Indoor Navigation: Precise localization in structured environments
- Outdoor Navigation: GPS-aided navigation with obstacle avoidance
- Seamless Transition: Automatic switching between indoor and outdoor modes (a minimal switching sketch follows below)
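A minimal sketch of how such mode switching could work, assuming a GPS fix topic `/gps/fix` and a mode topic `/nav_mode` (both hypothetical names, not from the report): when the receiver reports a valid satellite fix the vehicle is treated as outdoors, otherwise it falls back to LiDAR-based indoor localization.

```python
# Illustrative indoor/outdoor mode switcher (assumed logic, not the
# project's actual code): GPS fix quality decides the navigation mode.
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import NavSatFix, NavSatStatus
from std_msgs.msg import String


class ModeSwitcher(Node):
    def __init__(self):
        super().__init__('mode_switcher')
        # '/gps/fix' and '/nav_mode' are hypothetical topic names.
        self.create_subscription(NavSatFix, '/gps/fix', self.on_fix, 10)
        self.mode_pub = self.create_publisher(String, '/nav_mode', 10)
        self.mode = 'indoor'

    def on_fix(self, msg: NavSatFix):
        # A valid satellite fix suggests the vehicle is outdoors; otherwise
        # trust LiDAR-based indoor localization.
        new_mode = ('outdoor'
                    if msg.status.status >= NavSatStatus.STATUS_FIX
                    else 'indoor')
        if new_mode != self.mode:
            self.mode = new_mode
            self.mode_pub.publish(String(data=self.mode))
            self.get_logger().info(f'Switched to {self.mode} navigation mode')


def main():
    rclpy.init()
    rclpy.spin(ModeSwitcher())


if __name__ == '__main__':
    main()
```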
2. Real-time Obstacle Avoidance
- Dynamic obstacle detection using LiDAR point clouds
- Adaptive path re-planning for moving obstacles
- Safety-first approach with emergency stop capabilities (sketched below)
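The emergency-stop behavior can be sketched as follows, assuming the LiDAR driver publishes `sensor_msgs/PointCloud2` on `/lidar/points` and the base listens on `/cmd_vel` (hypothetical names; the Hesai driver's actual topic may differ): any return inside a fixed stop radius triggers a zero-velocity command.

```python
# Minimal emergency-stop sketch, not the project's exact implementation.
import math
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import PointCloud2
from sensor_msgs_py import point_cloud2
from geometry_msgs.msg import Twist

STOP_RADIUS = 0.5  # metres; illustrative threshold


class EmergencyStop(Node):
    def __init__(self):
        super().__init__('emergency_stop')
        # Topic names are hypothetical placeholders.
        self.create_subscription(PointCloud2, '/lidar/points', self.on_cloud, 10)
        self.cmd_pub = self.create_publisher(Twist, '/cmd_vel', 10)

    def on_cloud(self, cloud: PointCloud2):
        # Scan every valid return; stop as soon as one is too close.
        for x, y, z in point_cloud2.read_points(
                cloud, field_names=('x', 'y', 'z'), skip_nans=True):
            if math.hypot(x, y) < STOP_RADIUS:
                self.cmd_pub.publish(Twist())  # all-zero twist = full stop
                self.get_logger().warn('Obstacle inside stop radius, halting')
                return


def main():
    rclpy.init()
    rclpy.spin(EmergencyStop())


if __name__ == '__main__':
    main()
```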
3. Modular Design
- Scalable architecture for future sensor additions
- Plugin-based navigation components
- Easy configuration and parameter tuning (see the configuration sketch below)
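Nav2's pluginlib-based design is what makes the planned osmAG planner swap (see Future Work) a configuration change rather than a rewrite. Below is a sketch of how the planner server can be parameterized in a Python launch file; the parameter values are illustrative, not the project's tuned settings.

```python
# Sketch of Nav2's plugin-based planner configuration (values illustrative).
from launch import LaunchDescription
from launch_ros.actions import Node


def generate_launch_description():
    planner_params = {
        'planner_plugins': ['GridBased'],
        'GridBased': {
            # Default Nav2 global planner, loaded via pluginlib; swapping in
            # a custom planner (e.g. an osmAG planner) only changes this string.
            'plugin': 'nav2_navfn_planner/NavfnPlanner',
            'tolerance': 0.5,
        },
    }
    return LaunchDescription([
        Node(package='nav2_planner', executable='planner_server',
             parameters=[planner_params]),
    ])
```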
Technology Stack
- Programming Languages: C++17, Python 3.8+
- Robotics Framework: ROS2 Humble
- Navigation: Navigation2 (Nav2) stack
- SLAM Algorithm: Google Cartographer
- Point Cloud Processing: PCL (Point Cloud Library)
- Computer Vision: OpenCV, ROS2 Image Pipeline
- Hardware Interface: ROS2 device drivers
- Simulation: Gazebo Classic
Experimental Results
Navigation Performance
- Localization Accuracy: ±0.2 m in indoor environments
- Path Planning Efficiency: 95% success rate in reaching destinations
- Obstacle Avoidance: 100% collision-free navigation in test scenarios
System Metrics
- Real-time Performance: 20 Hz sensor processing
- Battery Life: 4+ hours of continuous operation
- Payload Capacity: up to 10 kg
Implementation Highlights
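As one highlight of how the stack is exercised end to end, a delivery run can be commanded through Nav2's Python API. The sketch below uses nav2_simple_commander with made-up goal coordinates; it is illustrative, not the project's actual delivery client.

```python
# Hypothetical end-to-end sketch: sending one delivery goal through Nav2.
import rclpy
from geometry_msgs.msg import PoseStamped
from nav2_simple_commander.robot_navigator import BasicNavigator, TaskResult


def main():
    rclpy.init()
    navigator = BasicNavigator()
    navigator.waitUntilNav2Active()  # blocks until the Nav2 stack is up

    goal = PoseStamped()
    goal.header.frame_id = 'map'
    goal.header.stamp = navigator.get_clock().now().to_msg()
    goal.pose.position.x = 12.0   # illustrative delivery point
    goal.pose.position.y = -3.5
    goal.pose.orientation.w = 1.0

    navigator.goToPose(goal)
    while not navigator.isTaskComplete():
        feedback = navigator.getFeedback()  # could log distance remaining here

    if navigator.getResult() == TaskResult.SUCCEEDED:
        print('Delivery goal reached')


if __name__ == '__main__':
    main()
```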


Future Work
Future development will focus on:
- osmAG Map Integration: Introducing the osmAG map format into the Navigation2 stack
- Custom Global Planner: Replacing the default global planner with an osmAG Planner plugin
- Advanced Localization: Replacing AMCL with an osmAG Localizer for improved accuracy
- Multi-robot Coordination: Enabling fleet management capabilities
- Weather Adaptation: Robust operation in various weather conditions
Project Team
- Lead Developer: Jiajie Zhang (zhangjj2023@shanghaitech.edu.cn)
- Co-developer: Yongqi Zhang (zhangyq12023@shanghaitech.edu.cn)
- Advisor: Professor Sören Schwertfeger
Related Resources
- Project Report: Campus Autonomy Final Report
- Demo Video: System Demonstration
- Code Repository: GitHub Repository
This project demonstrates the practical application of autonomous navigation technologies in real-world campus environments, contributing to the advancement of service robotics and autonomous delivery systems.