Autonomy Foundations with NVIDIA Jetson Nano

Autonomy Foundations equips learners with a practical understanding of how artificial intelligence shapes robotic perception and autonomy. Using the NVIDIA Jetson Nano and JetBot platform, this course offers a hands-on approach to core, real-world robotic capabilities, including Networking, Collision Avoidance, Path Following, AprilTag Navigation, and SLAM. Together, these units provide a comprehensive foundation for operating and maintaining intelligent robotic systems.

Getting Started with JetBot (RxS)

Autonomous systems are composed of hardware and software that enable machines to operate independently. In this unit, participants configure their JetBot, covering software setup, network requirements, assembly, and initial operation.
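
As a concrete example, a short post-setup check like the sketch below can confirm that the camera starts and returns frames once the robot is assembled and on the network. It assumes the standard jetbot Python package that ships with the JetBot image.

```python
# Post-setup sanity check, assuming the standard `jetbot` package on the JetBot image.
from jetbot import Camera

camera = Camera.instance(width=224, height=224)  # start the CSI camera pipeline
frame = camera.value                             # latest frame as a NumPy array
print('Camera OK, frame shape:', frame.shape)    # expect (224, 224, 3)
camera.stop()                                    # release the camera when finished
```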

Basic Navigation with JetBot (RxS)

Autonomous systems like the JetBot can be configured to navigate using pre-programmed routines, operator teleoperation, or a blend of both. This unit guides participants through motion control, precise navigation techniques, and teleoperation.
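
The sketch below shows what a pre-programmed motion routine can look like in code, using the open-source jetbot Python package included with the JetBot image; the speeds and timings are illustrative values, not course-prescribed parameters.

```python
# A minimal motion-control sketch using the `jetbot` package; speeds and sleep
# times are illustrative, not values specified by the course.
import time
from jetbot import Robot

robot = Robot()

robot.forward(0.3)          # drive both motors forward at 30% speed
time.sleep(1.0)
robot.left(0.3)             # spin in place to the left
time.sleep(0.5)
robot.set_motors(0.3, 0.2)  # gentle right arc: left motor faster than right
time.sleep(1.0)
robot.stop()                # always stop the motors when done
```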

Collision Avoidance and Path Following with JetBot (RxS)

To prepare to navigate unknown environments, autonomous systems are often trained with data from known environments. This unit emphasizes the importance of data collection and labeling for applications like Collision Avoidance and Path Following. Participants apply supervised learning techniques, using classification to detect obstacles and regression to predict the path ahead.
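
The following sketch illustrates the inference side of the collision-avoidance task as a binary "blocked vs. free" classifier. The two-class ResNet-18 head, label order, and the collision_model.pth file name are assumptions modeled on the widely used JetBot collision-avoidance example, not course-provided files.

```python
# Hedged sketch of "blocked vs. free" classification on a single camera frame.
# The model file, class labels, and architecture are illustrative assumptions.
import torch
import torchvision
import torchvision.transforms as transforms
from PIL import Image

# Two-output ResNet-18: class 0 = "blocked", class 1 = "free" (assumed labels)
model = torchvision.models.resnet18()
model.fc = torch.nn.Linear(model.fc.in_features, 2)
model.load_state_dict(torch.load('collision_model.pth', map_location='cpu'))
model.eval()

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

frame = Image.open('camera_frame.jpg').convert('RGB')  # one camera image
with torch.no_grad():
    logits = model(preprocess(frame).unsqueeze(0))     # add batch dimension
probabilities = torch.softmax(logits, dim=1)[0]

if probabilities[0] > 0.5:
    print('Obstacle ahead: stop or turn')
else:
    print('Path is free: keep driving')
```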

AprilTag Navigation with JetBot (RxS)

AprilTags are fiducial markers that allow a robot to determine its precise position (localization) and orientation (pose estimation) for accurate navigation. In this unit, participants calibrate the camera to improve AprilTag detection accuracy and use ROS (Robot Operating System) to perform waypoint navigation with the AprilTag markers.
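
As an illustration of the detection step, the sketch below uses the apriltag Python package and OpenCV on a single camera frame; the library choice and the tag36h11 tag family are assumptions made for this example, since the course itself drives detection and waypoint navigation through ROS.

```python
# Small AprilTag detection sketch; library choice and tag family are assumptions.
import cv2
import apriltag

image = cv2.imread('camera_frame.jpg')            # one frame from the calibrated camera
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)    # the detector expects grayscale input

detector = apriltag.Detector(apriltag.DetectorOptions(families='tag36h11'))
detections = detector.detect(gray)

for det in detections:
    # tag_id identifies the waypoint; center/corners give the tag's image location,
    # which, combined with the calibrated camera intrinsics, feeds pose estimation.
    print(f'Tag {det.tag_id} at image coordinates {det.center}')
```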

SLAM with JetBot (RxS)

Simultaneous Localization and Mapping (SLAM) is a technique used in robotics to build a map of an environment while simultaneously keeping track of the robot's location within it. In this unit, participants configure ROS to communicate over a network, allowing the JetBot to transmit LiDAR data used to generate a high-fidelity map in an Ubuntu Virtual Machine.
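
A minimal example of the networking side is sketched below: a rospy node on the Ubuntu Virtual Machine subscribing to the LiDAR scan topic published by the JetBot. The /scan topic name and the environment-variable setup shown in the comments are common ROS conventions assumed here for illustration, not course-specified values; a SLAM node such as gmapping or hector_slam would consume the same topic to build the map.

```python
# Hedged sketch: receive the JetBot's LiDAR scans over the ROS network on the VM.
# Assumes both machines point at the same ROS master, e.g.:
#   export ROS_MASTER_URI=http://<jetbot_ip>:11311
#   export ROS_IP=<this_machine_ip>
import rospy
from sensor_msgs.msg import LaserScan

def on_scan(scan):
    # Each LaserScan message carries one sweep of range readings from the LiDAR.
    valid = [r for r in scan.ranges if scan.range_min < r < scan.range_max]
    if valid:
        rospy.loginfo('Received %d ranges, nearest obstacle %.2f m',
                      len(scan.ranges), min(valid))

rospy.init_node('lidar_listener')
rospy.Subscriber('/scan', LaserScan, on_scan)
rospy.spin()  # keep the node alive while scans arrive
```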