Participants program a drone to autonomously navigate a series of static obstacles using simple command sequences. They plan flight paths, input sequential commands with a block-based IDE, and execute basic autonomous patterns (launch, wait, recover). This exercise emphasizes the importance of proportionality calculations and addresses the challenges of open-loop control.
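A minimal Python analogue of the block-based command sequence described above. The `drone` object and its method names are hypothetical placeholders rather than any vendor's API, and the duration formula is included only to illustrate the proportionality idea (time = distance / speed) behind open-loop control.

```python
import time

CRUISE_SPEED_CM_S = 50.0   # assumed forward speed used for open-loop timing

def fly_forward(drone, distance_cm):
    """Open-loop move: command a speed for a proportional amount of time."""
    duration_s = distance_cm / CRUISE_SPEED_CM_S   # proportionality calculation
    drone.set_forward_speed(CRUISE_SPEED_CM_S)
    time.sleep(duration_s)                         # no sensor feedback, so errors accumulate
    drone.set_forward_speed(0)

def run_mission(drone):
    drone.launch()                 # launch
    fly_forward(drone, 100)        # fly 100 cm toward the first obstacle gap
    time.sleep(2)                  # wait
    drone.recover()                # recover / land
```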
AprilTags are visual fiducial markers that allow a robot to determine its precise position (localization) and orientation (pose estimation) for accurate navigation. In this unit, participants calibrate the camera to improve AprilTag detection accuracy and leverage ROS (Robot Operating System) to perform waypoint navigation with the AprilTag markers.
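A plausible sketch of the calibration step using OpenCV's chessboard routines; the image folder, board dimensions, and variable names are illustrative assumptions rather than details from the unit. The resulting camera matrix and distortion coefficients are what an AprilTag pose estimator consumes.

```python
import glob
import cv2
import numpy as np

board = (9, 6)  # inner corners of the calibration chessboard (assumed size)
objp = np.zeros((board[0] * board[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:board[0], 0:board[1]].T.reshape(-1, 2)

obj_points, img_points, image_size = [], [], None
for path in glob.glob("calib_images/*.jpg"):          # assumed image folder
    gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
    found, corners = cv2.findChessboardCorners(gray, board)
    if found:
        obj_points.append(objp)
        img_points.append(corners)
        image_size = gray.shape[::-1]

# Intrinsics (camera_matrix) and distortion coefficients feed AprilTag pose estimation.
ret, camera_matrix, dist_coeffs, _, _ = cv2.calibrateCamera(
    obj_points, img_points, image_size, None, None)
```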
Simultaneous Localization and Mapping (SLAM) is a technique used in robotics to build a map of an environment while simultaneously keeping track of the robot's location within it. In this unit, participants configure ROS to communicate over a network, allowing the JetBot to transmit LiDAR data used to generate a high-fidelity map in an Ubuntu Virtual Machine.
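A minimal ROS 1 (rospy) sketch of the networked setup described above: the JetBot publishes LiDAR scans, and a node on the Ubuntu virtual machine subscribes to them. The node name and the `/scan` topic are common conventions assumed here; before running, both machines would point at the same master, e.g. `export ROS_MASTER_URI=http://<jetbot-ip>:11311`.

```python
import rospy
from sensor_msgs.msg import LaserScan

def on_scan(msg):
    # Each LaserScan carries ranges (meters) plus angle_min/angle_increment,
    # which a SLAM package turns into map updates.
    rospy.loginfo("received %d range readings", len(msg.ranges))

rospy.init_node("lidar_listener")
rospy.Subscriber("/scan", LaserScan, on_scan)
rospy.spin()
```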
Participants challenge each other with piloting or maintenance tasks of their own design. Themes for each day include flying challenges, teamwork challenges, head-to-head challenges, and troubleshooting and repair challenges.
Skills Developed: Reinforcement of prior unit skills, including piloting, maintenance, teamwork, and troubleshooting.
Reinforcement Learning is a type of machine learning in which a robot learns to make decisions through trial and error. In Autonomous Racing, the robot applies this approach as it drives on the track. In this unit, participants collect data from the track, train and visualize the base model, and then provide feedback to the robot as it learns to race.
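A compact illustration of the trial-and-error update at the heart of reinforcement learning, using tabular Q-learning. This is a teaching sketch, not the racing unit's actual training pipeline; the states, actions, and reward values are made up.

```python
import random

states = ["straight", "gentle_curve", "hairpin"]
actions = ["steer_left", "steer_straight", "steer_right"]
Q = {(s, a): 0.0 for s in states for a in actions}
alpha, gamma, epsilon = 0.1, 0.9, 0.2   # learning rate, discount, exploration rate

def choose(state):
    if random.random() < epsilon:                       # explore a random action
        return random.choice(actions)
    return max(actions, key=lambda a: Q[(state, a)])    # exploit the best-known action

def update(state, action, reward, next_state):
    best_next = max(Q[(next_state, a)] for a in actions)
    Q[(state, action)] += alpha * (reward + gamma * best_next - Q[(state, action)])

# One trial-and-error step: pick an action, receive a reward signal, update the table.
a = choose("gentle_curve")
update("gentle_curve", a, reward=1.0 if a == "steer_left" else -1.0, next_state="straight")
```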
Participants work together to lift a payload using only their own drones. This exercise emphasizes mission planning, flight plan development, team coordination, and adapting to unexpected conditions. Students will calculate components of lifting force and understand how to sum and cancel force vectors from multiple sources.
Skills Developed: Mission planning, flight plan development, team planning, coordinated multi-robot operation, considering risk, adapting to unexpected conditions, calculating relevant components of oblique forces.
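A worked example of the force-component calculation from the payload-lift exercise: each drone pulling on an angled tether contributes only the vertical component of its tension to lifting, while symmetric horizontal components cancel. The numbers below are illustrative, not from the curriculum.

```python
import math

payload_weight_n = 4.0        # weight to lift, in newtons
tether_angle_deg = 30.0       # tether angle measured from vertical
num_drones = 2

theta = math.radians(tether_angle_deg)
# Each drone's tension T must satisfy: num_drones * T * cos(theta) >= weight
tension_per_drone = payload_weight_n / (num_drones * math.cos(theta))
horizontal_pull = tension_per_drone * math.sin(theta)   # cancelled by the mirror drone

print(f"tension per drone: {tension_per_drone:.2f} N")       # ~2.31 N
print(f"sideways pull each (cancels by symmetry): {horizontal_pull:.2f} N")  # ~1.15 N
```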
In this unit, you will learn how to use the sensors on the SPIKE Prime virtual robot. You will program your robot to perform simple sensing behaviors and respond to objects in its environment.
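A small sensing behavior of the kind described: drive forward until an object is close, then stop. This sketch assumes the LEGO SPIKE Python API (`spike` module); the port letters are placeholders, and the virtual robot's ports and sensors may differ.

```python
from spike import DistanceSensor, MotorPair
from spike.control import wait_for_seconds

wheels = DistanceSensor  # placeholder comment removed below; see assignments
wheels = MotorPair('A', 'B')       # drive motors (assumed ports)
eyes = DistanceSensor('C')         # distance sensor (assumed port)

wheels.start(speed=40)             # begin driving forward
while True:
    distance = eyes.get_distance_cm()
    if distance is not None and distance < 10:
        wheels.stop()              # object detected within 10 cm
        break
    wait_for_seconds(0.1)
```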
Python-based Coding Practice with Virtual SPIKE Prime provides students with a fully programmable robot for developing their programming and sensor-integration skills in Python. The simulated environment is designed to resemble a robotics competition table, helping robotics teams prepare for competitions.
Python-based Coding with Virtual SPIKE Prime
lego spike prime, virtual, elementary school, middle school
Block-based Coding Practice with Virtual SPIKE Prime provides students with a fully programmable robot for developing their programming and sensor-integration skills using Blocks. The simulated environment is designed to resemble a robotics competition table, helping robotics teams prepare for competitions.
lego spike prime, virtual, elementary school, middle school
Program the Strawberry Sorter to make decisions so that it can... sort strawberries! You will use if-then-else blocks to program simple decisions and looped decisions. The robot design for this unit is the Strawberry Sorter.
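A Python analogue of the block logic in this unit: an if-then-else decision inside a loop. `read_color()` and the chute functions are hypothetical stand-ins for the Strawberry Sorter's blocks, shown only to illustrate the control flow.

```python
def sort_strawberries(read_color, send_to_ripe_bin, send_to_unripe_bin, count=10):
    for _ in range(count):              # looped decision: check each strawberry
        color = read_color()
        if color == "red":              # if-then-else: route by ripeness
            send_to_ripe_bin()
        else:
            send_to_unripe_bin()
```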
In this unit, you will learn the basics of electrical soldering and get comfortable with the tools. You will learn about soldering tools (soldering iron, solder, flux, etc.), soldering safety procedures, what makes a good solder joint (tinning, heating, applying solder), and how to check continuity with a multimeter.
In this project, students are asked to take inventory of the parts that are available in a kit. Students must keep track of progress using a checklist, and perform a series of tests with the system to ensure proper functionality.
In this unit, students will assemble their robot step-by-step by building and connecting three core subsystems: Mobility, Power, and Control. Once assembled, they’ll run test programs to verify that each part is working correctly—laying the foundation for future programming and navigation challenges.
This unit introduces students to sensors: A Touch Sensor (in the form of whiskers), and a Light Sensor (Phototransistor). Students will learn how to integrate the sensors with ShieldBot, and then program the robot to sense its environment. Students will learn programming concepts along the way like variables, if/else structures, incrementing, and comparators.
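The ShieldBot itself is normally programmed in Arduino C; the Python sketch below only mirrors the concepts the unit lists (a variable, an if/else structure, incrementing, and a comparator), with hypothetical whisker and drive helpers standing in for the real sensor and motor code.

```python
def navigate(read_whiskers, turn_away, drive_forward, max_bumps=5):
    bump_count = 0                              # variable tracking collisions
    while bump_count < max_bumps:               # comparator decides when to stop
        left_hit, right_hit = read_whiskers()   # touch sensor (whiskers)
        if left_hit or right_hit:               # if/else structure
            bump_count += 1                     # incrementing the counter
            turn_away(left_hit)                 # back off away from the contact side
        else:
            drive_forward()
```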
In this unit, participants explore how robots can detect and make decisions based on colored objects using the HuskyLens AI camera. Participants train the camera to recognize objects of specific colors and then modify code so the robot responds with different, intelligent actions.
In this unit, participants learn to configure the HuskyLens AI camera to detect and track AprilTag markers. They explore how machine vision enables robots to identify specific tags and trigger programmed behaviors, supporting tasks like localization, positioning, and robot navigation.
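A sketch of the detect-and-react pattern shared by these two HuskyLens units. `get_detection()` is a hypothetical stand-in for reading the camera over its serial/I2C interface, and the color labels, tag IDs, and robot actions are illustrative only, not the HuskyLens library API.

```python
def react(get_detection, robot):
    detection = get_detection()             # e.g. ("color", "red") or ("apriltag", 3)
    kind, value = detection if detection else (None, None)
    if kind == "color" and value == "red":
        robot.stop()                        # a trained color triggers one behavior
    elif kind == "apriltag" and value == 3:
        robot.turn_left()                   # a specific tag ID triggers another
    else:
        robot.drive_forward()               # default behavior when nothing matches
```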