Autonomy Foundations equips learners with a practical understanding of how artificial intelligence shapes robotic perception and autonomy. Using the NVIDIA Jetson Nano and JetBot platform, this course offers a hands-on approach to core, real-world robotic capabilities, including Networking, Collision Avoidance, Path Following, AprilTag Navigation, and SLAM. Autonomy Foundations provides a comprehensive foundation for operating and maintaining intelligent robotic systems.
Autonomous systems combine hardware and software that enable machines to operate independently. In this unit, participants configure their JetBot, including software, network requirements, assembly, and initial operation.
Autonomous systems like the JetBot can be configured to navigate using pre-programmed routines, operator teleoperation, or a blend of both. This unit guides participants through motion control, precise navigation techniques, and teleoperation.
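The blend of pre-programmed motion and teleoperation both reduce to the same low-level step: turning a drive command into left and right wheel speeds. A minimal differential-drive mixing sketch, where the `wheel_base` value and the normalized speed range are illustrative assumptions rather than measured JetBot parameters:

```python
# Differential-drive mixing: convert a (linear, angular) velocity command
# into left/right wheel speeds, as a teleoperation layer might do before
# handing values to the motor driver. Illustrative sketch only; wheel_base
# is a hypothetical parameter, not a measured JetBot value.

def mix(linear, angular, wheel_base=0.12):
    """Return (left, right) wheel speeds for a differential-drive robot."""
    left = linear - angular * wheel_base / 2
    right = linear + angular * wheel_base / 2
    # Clamp to a normalized [-1, 1] range, as motor drivers commonly expect.
    clamp = lambda s: max(-1.0, min(1.0, s))
    return clamp(left), clamp(right)

# Drive straight: both wheels equal.
print(mix(0.5, 0.0))   # (0.5, 0.5)
# Turn in place: wheels spin in opposite directions.
print(mix(0.0, 2.0))
```

The same mixing logic serves both modes: a pre-programmed routine supplies the command pairs from a script, while teleoperation supplies them from a gamepad or keyboard.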
Sensors enable robots to perceive their environment and make autonomous decisions. In this unit, participants perform sensor integration, utilizing GPIO (General-Purpose Input/Output), digital inputs and outputs, an IMU (Inertial Measurement Unit), motor encoders, an IR (infrared) camera, and LiDAR.
Introduction: Sensabot
GPIO on the Jetson Nano
About Breadboards
Integrating an LED Output
Robot Configuration: Bumper Sensor Assembly
Integrating a Bumper Sensor Input
Mini-Challenge: Morse Code
Robot Configuration: Data Collection
Navigational Elements vs. Payloads in Robotics
IR Camera Overview and Testing
Expansion Board Sensors Overview and Testing
LiDAR Overview and Testing
Sensor Investigation Part 1: Illuminated Environment
Mini-Challenge: Illuminated Environment Map
Sensor Investigation Part 2: Dark Environment
Mini-Challenge: Dark Environment Map
Unit Challenge: Sensabot Inspection
Unit Quiz: Basic IO and Sensors
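The Morse code mini-challenge combines the unit's LED output work with timing logic. A sketch of the kind of encoder involved, which translates text into timed on/off pulses that a GPIO output pin could replay on the LED; the abbreviated Morse table is standard, but the unit durations are illustrative assumptions, not the course's values:

```python
# Translate text into timed (state, duration) pulses suitable for
# driving an LED on a GPIO output pin. Abbreviated Morse table;
# the 0.2 s time unit is an assumption for illustration.

MORSE = {"S": "...", "O": "---", "E": "."}

def pulses(message, unit=0.2):
    """Return (state, seconds) pairs: True = LED on, False = LED off."""
    out = []
    for letter in message.upper():
        for symbol in MORSE[letter]:
            # A dot lasts one unit, a dash three units.
            out.append((True, unit if symbol == "." else 3 * unit))
            out.append((False, unit))      # gap between symbols
        out[-1] = (False, 3 * unit)        # longer gap between letters
    return out

print(pulses("SOS")[:2])   # [(True, 0.2), (False, 0.2)]
```

On the robot, each pair would be replayed by setting the GPIO pin high or low and sleeping for the given duration.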
To navigate unknown environments, autonomous systems are often first trained on data from known environments. This unit emphasizes the importance of data collection and labeling for applications like Collision Avoidance and Path Following. Participants apply supervised learning techniques, using Classification to detect obstacles and Regression to predict paths.
Introduction: Cargo Unmanned Ground Vehicle
Robot and Environment Configuration
Introduction to Collision Avoidance
Collision Avoidance: Data Collection
Collision Avoidance: Model Training
Collision Avoidance: Model Optimization
Mini-Challenge: Figurine Safety
Mini-Challenge: Collision Avoidance in Low Light
Introduction to Path Following
Robot and Environment Configuration
Path Following: Data Collection
Path Following: Model Training
Path Following: Model Optimization
Unit Challenge: UGV Challenge
Unit Quiz: Collision Avoidance and Path Following
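The unit's two tasks differ only in what the model predicts: Classification outputs a discrete label ("blocked" vs. "free"), while Regression outputs a continuous value such as a steering target. This 1-nearest-neighbor sketch illustrates the distinction; the feature vectors, labels, and steering angles are invented for the example and are not course data:

```python
# 1-nearest-neighbor prediction: return the target of the closest
# training example. The same lookup serves classification (discrete
# labels) and regression (continuous targets); only the targets differ.

def nearest(sample, dataset):
    """dataset is a list of (features, target) pairs."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(dataset, key=lambda ex: dist(ex[0], sample))[1]

# Classification: targets are class labels, as in Collision Avoidance.
blocked_free = [((0.9, 0.8), "blocked"), ((0.1, 0.2), "free")]
print(nearest((0.85, 0.7), blocked_free))   # blocked

# Regression: targets are steering angles (radians), as in Path Following.
steering = [((0.9, 0.1), -0.4), ((0.1, 0.9), 0.4)]
print(nearest((0.2, 0.8), steering))        # 0.4
```

The course trains neural networks rather than nearest-neighbor models, but the labeling workflow is the same: each collected image is paired with either a class label or a continuous target.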
AprilTags are a special type of marker that allows a robot to determine its precise position (localization) and orientation (pose estimation) for accurate navigation. In this unit, participants calibrate the camera to improve AprilTag detection accuracy and leverage ROS (Robot Operating System) to perform waypoint navigation with the AprilTag markers.
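Once a tag is detected, its pose relative to the camera can be turned into a waypoint command. A minimal sketch of that step: rotate toward the tag, then drive its distance. The camera-frame convention (x to the right, z forward) and the function names are assumptions for illustration, not the course's ROS message layout:

```python
# Convert a tag's camera-relative position into a simple waypoint
# command. Assumes x is lateral offset and z is forward distance,
# a common camera-frame convention; names are hypothetical.

import math

def waypoint_command(tag_x, tag_z):
    """Return (heading_radians, distance_m) to face and reach the tag."""
    heading = math.atan2(tag_x, tag_z)   # angle off the camera's forward axis
    distance = math.hypot(tag_x, tag_z)
    return heading, distance

h, d = waypoint_command(0.3, 0.4)       # tag 0.3 m right, 0.4 m ahead
print(round(d, 2))                      # 0.5
```

In the ROS workflow, the detector publishes this pose on a topic and a navigation node consumes it, but the underlying geometry is the same.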
Simultaneous Localization and Mapping (SLAM) is a technique used in robotics to build a map of an environment while simultaneously keeping track of the robot's location within it. In this unit, participants configure ROS to communicate over a network, allowing the JetBot to transmit LiDAR data used to generate a high-fidelity map in an Ubuntu Virtual Machine.
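Getting ROS to communicate across the network comes down to pointing both machines at the same ROS master and telling each its own reachable address. A sketch of the environment variables involved; the IP addresses are placeholders, not values from the course:

```shell
# On the Ubuntu VM (runs roscore and the mapping node).
# 192.168.1.50 is a placeholder for the VM's address.
export ROS_MASTER_URI=http://192.168.1.50:11311
export ROS_IP=192.168.1.50

# On the JetBot (publishes LiDAR scan data).
# Same master URI; ROS_IP is the JetBot's own placeholder address.
export ROS_MASTER_URI=http://192.168.1.50:11311
export ROS_IP=192.168.1.60
```

With both sides configured, the JetBot's LiDAR topic becomes visible to the VM, where the mapping node subscribes to it and builds the map.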