I'm a senior Mechanical Engineering student at Tufts University interested in robotics, controls, and design.
Location:
Medford, MA | New Windsor, NY
Contact Information:
leiahannes@gmail.com | leia.hannes@tufts.edu
Project Highlights
Pancake Making Robot
CAD | Linkages | Python
Camera Line Follower
Python | Image Processing
Product Teardown
SolidWorks | Analysis | Teamwork
Project Gallery
Robotic Car
CAD | Electronics | Python
IR Line Follower
Python | Sensors
Mechanical Gripper Robot
CAD | Electronics | Python
Ball Sorter
Python | Sensors | Design
Object Recognition Navigation
ROS | Image Recognition | Machine Learning
PCB Design
Electronics | CAD
Laser-cut Dragon
Electronics | Onshape | Laser Cutting
CAD Highlights
CAD
Product Teardown: SKIL Cordless Screwdriver
Tufts University, Spring 2024: Engineering Design
Project Highlights:
Used SolidWorks to create a 3D model of a SKIL cordless screwdriver
Conducted a task analysis and made a decomposition table for the full screwdriver
Worked in a team to create a full CAD model and assembly
Goal & Overview:
This was a team project in which we recreated an electric screwdriver in CAD using SolidWorks. We conducted a task analysis on how the screwdriver was used, and then deconstructed it (pictured below). We then modeled each part in CAD and combined the parts into an assembly (exploded view above). I worked with the team on the task analysis and decomposition table, and individually created the CAD for the casing and the assembly.
Presentation:
Laser-cut Dragon
Tufts University, Fall 2022: Introduction to Engineering
Project Highlights:
Used Onshape to design parts to be laser cut to make a dragon with moving legs
Programmed a Maker Pi in MicroPython to move the legs depending on whether a light sensor on top of the dragon was exposed
For the midterm for my introductory engineering class, we were instructed to make an animal that could move. I decided to make a dragon that could move its legs as though it was walking. I designed all of the parts in Onshape for laser cutting and used screws as the axles for the gears. The Maker Pi is connected to a DC motor in the center of the dragon, which spins the central gears. The code for this project starts and stops the motor depending on whether a rider is sitting on top of the dragon and covering the light sensor.
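Something like this minimal MicroPython sketch captures that start/stop logic; the pin numbers and threshold are placeholders, not the actual wiring:

```python
from machine import ADC, Pin, PWM
import time

light = ADC(26)          # photoresistor on ADC0 (GP26) -- placeholder pin
motor = PWM(Pin(8))      # one input of the DC motor driver -- placeholder pin
motor.freq(1000)

DARK_THRESHOLD = 20000   # tune for the room's ambient light

while True:
    # A rider covering the sensor drops the reading below the threshold.
    if light.read_u16() < DARK_THRESHOLD:
        motor.duty_u16(40000)   # spin the central gears so the legs walk
    else:
        motor.duty_u16(0)       # sensor exposed: stop the motor
    time.sleep_ms(50)
```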
Printed Circuit Boards
Tufts University, Fall 2024: Electronics and Controls I
Project Highlights:
Designed and ordered a printed circuit board (PCB) to take in 12V and offer a 3.3V, 5V, and 12V output stabilized with capacitors (top left)
Designed and ordered an H-bridge motor driver PCB with thickened copper traces to better handle temperature variation (top right)
For this project, I used KiCad to design PCBs which I then ordered and soldered components onto. The schematics for the voltage regulators (left) and the H-Bridge (right) are pictured below. To the right is the breadboard prototype for the H-Bridge, which has input pins that are connected to a KB2040 to control the direction of the motor.
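The direction logic on the KB2040 side is simple; a minimal CircuitPython sketch, assuming the H-bridge's two inputs are wired to D2 and D3 (placeholder pins), looks like this:

```python
import time
import board
import digitalio

# The two H-bridge inputs, on placeholder pins of the KB2040.
in_a = digitalio.DigitalInOut(board.D2)
in_b = digitalio.DigitalInOut(board.D3)
in_a.direction = digitalio.Direction.OUTPUT
in_b.direction = digitalio.Direction.OUTPUT

def forward():
    in_a.value, in_b.value = True, False   # current through the motor one way

def reverse():
    in_a.value, in_b.value = False, True   # current the other way

def stop():
    in_a.value = in_b.value = False        # both inputs low: motor coasts

# Demo: swap the motor's direction every two seconds.
while True:
    forward()
    time.sleep(2)
    reverse()
    time.sleep(2)
```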
Soldered PCBs:
Simple Electronic Game
Tufts University, Fall 2024: Electronics and Controls I
Robotic Car
Tufts University, Fall 2024: Electronics and Controls I
Project Highlights:
In the first stage of a two-stage project, worked with a team of three to design a robotic car, controlled by GET requests sent to a Raspberry Pi, that could drive up a ramp.
In the second stage of the project, redesigned the car to drive up the ramp autonomously and work with another team's car to push a large cylinder up the ramp.
For this project, we were required to create a robot that could drive up a ramp. We then modified it to communicate with a separate robot on another ramp so the two could work together to push a large cardboard pole up the two ramps. The original robot (pictured above) has a relatively simple design. The two front wheels (3D printed in TPU) are connected to two DC motors, which are controlled by a Raspberry Pi sending signals through a motor driver. The back wheels (PLA) are set on a dowel and can spin freely.
The robot was initially controlled through a simple website with some JavaScript code I wrote to send GET requests to the Pi, which was running Flask. The robot was driven with the WASD keys on a keyboard for forward, left, backward, and right. The video below shows our first trip up the ramp. We ended up driving up the ramp in reverse, since it was easier for the car to move that way.
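A stripped-down sketch of the Flask side of that control scheme is below; the route name and drive helpers are illustrative placeholders, not our exact code:

```python
from flask import Flask

app = Flask(__name__)

def forward(): print("motors: forward")   # placeholders for the GPIO calls
def reverse(): print("motors: reverse")
def left():    print("motors: left")
def right():   print("motors: right")
def stop():    print("motors: stop")

@app.route("/drive/<key>")
def drive(key):
    # The page's JavaScript sends one GET request per WASD keypress.
    actions = {"w": forward, "a": left, "s": reverse, "d": right}
    actions.get(key, stop)()
    return "ok"

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```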
The second stage of the project required some major adjustments to our original design. We were required to communicate with a second robot on a separate ramp while pushing a tube. First, we needed to be able to go up the ramp autonomously. We achieved this by moving our green wheels much farther apart so they could act as a barrier on the side of the ramp and adding a small caster wheel under the front section (front view pictured left).
The second challenge we faced was creating a sensor so we would know whether the two robots were aligned. We made a plow-like attachment for the front of the robot that consisted of two boards wrapped in aluminum foil. The front board was connected to 3.3V, and the back board had six distinct strips wired to the Raspberry Pi. The video to the right shows our first prototype connected to a series of LEDs.
Finally, we updated our code to change our own speed, listen for requests from our partner robot to change speeds, and send requests to our partner based on the feedback we received from our sensors. The final video below shows our robot (near ramp) working with another robot.
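A hedged sketch of how the sensor strips could drive that coordination; the pins, routes, and partner address are all placeholders, not our actual setup:

```python
import time
import requests
from gpiozero import DigitalInputDevice

PARTNER = "http://partner-pi.local:5000"   # placeholder address
# Six foil strips on the back board, wired to placeholder BCM pins.
strips = [DigitalInputDevice(pin) for pin in (5, 6, 13, 19, 26, 21)]

def contact_position():
    """Index of the strip the tube is pressing against, or None."""
    for i, strip in enumerate(strips):
        if strip.value:
            return i
    return None

while True:
    pos = contact_position()
    if pos is not None and pos <= 1:
        requests.get(PARTNER + "/faster")  # tube skewed: partner is lagging
    elif pos is not None and pos >= 4:
        requests.get(PARTNER + "/slower")  # tube skewed the other way
    time.sleep(0.2)
```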
Other CAD Highlights
Tufts University & Personal Projects, 2020 to Present
Project Highlights:
This is a selection of parts and assemblies I've designed using various CAD software packages. All of the pictures below are of things that I designed fully from scratch, mostly to be 3D printed or laser cut.
Pirate Ship, designed in Onshape for lasercutting. Utilized living hinges for the curve of the ship. (Tufts, Fall 2022)
Hair dryer, designed in SolidWorks (Tufts, Spring 2024)
Pulley, designed in Onshape (Tufts, Fall 2022)
Mandalorian helmets, designed in TinkerCAD (Personal Project, 2021)
Captain America's Shield, designed in TinkerCAD (Personal Project, 2022)
Toy Car, designed for the laser cutter (Tufts, Fall 2022)
Tardis, designed in Onshape (Personal Project, 2023)
Merry-go-round, designed in Onshape (Personal Project, 2024)
Mechanical Gripper Robot
Tufts University, Spring 2025: Intro Robotics
Project Highlights:
Designed and built a robot with a mechanical gripper system to pick up and move a ball.
Used a stepper motor to actuate a gear and linkage system to firmly grip a ball.
The goal for this project was to build a gripper mechanism using gears, linkages, and a stepper motor to grab a ball and transport it 2 feet without dropping it. As an additional challenge, we also tried to automate the movement after picking up the ball. The entire assembly was created in Onshape (pictured left) and then laser-cut out of wood. The two videos below show the ball being lifted and moved across the floor.
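A minimal sketch of the kind of stepper control involved, assuming a Raspberry Pi with a four-wire driver on placeholder pins (our actual motor, driver, and step counts differed):

```python
import time
import RPi.GPIO as GPIO

PINS = (17, 18, 27, 22)                               # placeholder BCM pins
SEQ = [(1,0,0,0), (0,1,0,0), (0,0,1,0), (0,0,0,1)]    # wave-drive sequence

GPIO.setmode(GPIO.BCM)
for pin in PINS:
    GPIO.setup(pin, GPIO.OUT)

def step(n, delay=0.002):
    """Step the motor; positive n closes the gripper, negative opens it."""
    seq = SEQ if n > 0 else SEQ[::-1]
    for _ in range(abs(n)):
        for state in seq:
            for pin, value in zip(PINS, state):
                GPIO.output(pin, value)
            time.sleep(delay)

step(150)      # close the jaws on the ball
time.sleep(2)  # carry it
step(-150)     # release
GPIO.cleanup()
```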
Overall, the project went fairly smoothly. We were able to start early and get the gripper mechanism working quickly, so we had time to make a wheeled platform driven by a DC motor to automate transporting the ball.
There were some challenges, however, with both the gripper design and the car. For the gripper, the original design for the base was too large, so the arms of the gripper couldn't fully close. We were able to design and fabricate a new base quickly since we were using a laser cutter, so this was not a major issue.
The other difficulty we faced was the weight distribution in the car. The back right wheel is driven by a DC motor, and all of the other wheels can rotate freely. We were hoping that the weight of the stepper motor in front would balance out the DC motor in the back, but our car still ended up back-heavy and drove along a curved path instead of straight forward. If we had more time to work on this project, I would want to redesign the chassis to distribute the weight more evenly so the car could drive straight.
Ball Color Sorter
Tufts University, Spring 2025: Intro Robotics
Project Highlights:
Built a ball sorter that can identify the color of a ball and move it to a corresponding box.
Used a Raspberry Pi and Python code with a color sensor to identify ball colors, and actuated a stepper motor and a servo motor to move them to the correct basket.
The goal for this project was to create a color-sensing robot that sorted through a random handful of red, blue, green, and yellow balls. We found a video on YouTube that served as our main inspiration. Most of the final components were laser-cut, with the exception of the base and back support structure. I designed the stand for the color sensor, the funnel for the balls, and the extension for the motor, all of which were specifically designed for the laser cutter to save time on fabrication.
I also wrote the code for this project (me35_ballsorter.py) to detect the colors, move the balls, and drive the stepper motor to sort the balls by color. Each color is calibrated by storing its RGB reading from the sensor. When a new ball is scanned, its reading is compared against the stored readings for the known colors, and it is identified as the color of its closest match.
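A condensed sketch of that matching step; the calibration values here are made-up examples, not our actual sensor readings:

```python
import math

# Calibrated (r, g, b) readings for each known color -- example values.
known_colors = {
    "red":    (180, 40, 35),
    "blue":   (35, 60, 170),
    "green":  (40, 150, 60),
    "yellow": (190, 170, 50),
}

def classify(reading):
    """Return the known color closest (Euclidean distance) to a reading."""
    return min(
        known_colors,
        key=lambda name: math.dist(reading, known_colors[name]),
    )

print(classify((170, 55, 40)))  # -> "red"
```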
Overall, this project went well. The most difficult part ended up being getting everything aligned correctly and moving smoothly. We had difficulties getting the stepper motor to consistently place the balls on top of the sensor, which would in turn cause the sensor to read the wrong color. If we had more time I would want to work on making everything more secure so our motor movements could be more precise.
I did learn a lot from this project about working with sensors and calibrating them to be accurate in different environments. Our sensor was mostly accurate, but had some difficulties in bright lighting. To improve this, we could build a box around the sensing area to better control the environment, and edit the code to average more sensor readings.
IR Line Follower
Tufts University, Spring 2025: Intro Robotics
Project Highlights:
Built a robot that can follow a dark line autonomously using an array of IR sensors.
Design Process:
The goal for this project was to create a line-following robot using IR sensors. The chassis for the robot was designed in Onshape, and the code was written in Python on a Raspberry Pi. We used an array of three IR sensors to determine whether we were on the line or needed to turn. If only the middle sensor detected the line, we assumed we were centered on it. If one of the outer sensors picked up the line, we turned toward that side to re-center, as in the sketch below.
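A sketch of that decision logic, assuming gpiozero inputs where a reading of 1 means the sensor sees the dark line; the pins and the steer() helper are placeholders:

```python
import time
from gpiozero import DigitalInputDevice

# Left, middle, and right IR sensors on placeholder BCM pins.
left_ir, mid_ir, right_ir = (DigitalInputDevice(p) for p in (17, 27, 22))

def steer(direction):
    print("steer:", direction)   # placeholder for the actual motor commands

while True:
    l, m, r = left_ir.value, mid_ir.value, right_ir.value
    if m and not l and not r:
        steer("straight")        # centered on the line
    elif l:
        steer("left")            # line drifting left: turn toward it
    elif r:
        steer("right")           # line drifting right
    else:
        steer("straight")        # line lost: hold course briefly
    time.sleep(0.02)
```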
Final Product:
Overall, this project went well. Our robot was able to consistently follow the line (left). If we had more time to work on it, I would want to move the sensors closer to the axis of rotation of the car so that they stayed centered on the line while we turned, and to increase the number of sensors so we could implement a more complex control system.
Camera Line Follower
Tufts University, Spring 2025: Intro Robotics
Project Highlights:
Built a robot that can follow a colored line autonomously using a Raspberry Pi camera.
Used image processing techniques to filter out all but one colored line to follow.
Design Process:
The goal for this project was to use a Raspberry Pi camera to identify and follow a colored line from the mat pictured left. The design for the robot was done in Onshape (bottom left). The pieces pictured were laser-cut with tight enough tolerances that the joints stayed together without any adhesives. The bottom and top pieces were connected by four bolts, each with three locknuts keeping everything in place. The large gap in the center is the window that the camera sees through.
Coding:
All of the code for this project was written in Python on a Raspberry Pi. The basic logic is as follows: the camera takes a picture, and a mask is applied to filter out colors outside a specified HSV range. The centroid of the masked region is calculated, and its x-position is used to determine how to steer. We also implemented some basic PID control to better hold our position over the line. The video to the right shows what the camera sees as the car drives.
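A compressed sketch of one iteration of that loop with OpenCV; the HSV bounds, gain, and file name are example values, not our tuned ones:

```python
import cv2
import numpy as np

def steering_correction(frame, lower_hsv, upper_hsv, kp=0.005):
    """Return a steering correction from the line's x-offset in the frame."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, lower_hsv, upper_hsv)   # keep only the line color
    moments = cv2.moments(mask)
    if moments["m00"] == 0:
        return 0.0                                  # line not in view
    cx = moments["m10"] / moments["m00"]            # centroid x-position
    error = cx - frame.shape[1] / 2                 # offset from image center
    return kp * error                               # proportional term of the PID

# Example: one frame, following a blue line.
frame = cv2.imread("frame.jpg")                     # placeholder snapshot
blue_lo = np.array([100, 120, 70])
blue_hi = np.array([130, 255, 255])
print(steering_correction(frame, blue_lo, blue_hi))
```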
Final Product & Future Steps:
Overall this project went well. We were able to follow most of the lines consistently and quickly. We did have trouble getting the yellow and orange lines to work due to the sharp corners, so in a future iteration it might be good to try to get the camera more centered on the axis of rotation so it can detect the line while navigating sharp curves. It could also be interesting to try looking ahead more to identify that a turn is coming so we can slow down going into it.
Object Recognition Navigation
Tufts University, Spring 2025: Intro Robotics
Project Highlights:
Used ROS actions with the Create3 robot to navigate obstacles.
Trained a Teachable Machine model to recognize pictures of different pokeballs in order to make decisions about how to turn.
Design & Coding Process:
The goal of this project was to train a Teachable Machine model on pictures of different pokeballs, which the robot then used to decide which direction to turn.
The Teachable Machine model was trained on around 200 pictures of each pokeball from different angles, sitting on the floor, taken with a PiCamera attached to a Raspberry Pi. The model was able to accurately identify all seven pokeballs, so we moved on to the logic for the rest of the code. We used a subscriber to access the IR sensors on the Roomba and took the values from the front sensor. When the robot detects something in front of it, it stops and takes a picture with the PiCamera. The picture is fed into the model, which outputs the pokeball it thinks the picture shows. It was generally very accurate, reporting around 95% to 100% confidence that it had identified the pokeball correctly. After the pokeball was identified, the robot used a RotateAngle action to turn 90 degrees to the left or right depending on which pokeball it saw.
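A hedged sketch of the identification step, assuming the model was exported in Teachable Machine's default Keras format (keras_model.h5 plus labels.txt); the file names, class names, and turn mapping are placeholders:

```python
import numpy as np
from PIL import Image
from tensorflow.keras.models import load_model

model = load_model("keras_model.h5")                 # Teachable Machine export
labels = [line.strip() for line in open("labels.txt")]

def identify(image_path):
    """Return (label, confidence) for a PiCamera snapshot."""
    img = Image.open(image_path).convert("RGB").resize((224, 224))
    x = np.asarray(img, dtype=np.float32) / 127.5 - 1.0   # scale to [-1, 1]
    preds = model.predict(x[np.newaxis])[0]
    best = int(np.argmax(preds))
    return labels[best], float(preds[best])

label, confidence = identify("snapshot.jpg")          # placeholder file name
TURN_LEFT = {"pokeball_a", "pokeball_b"}              # placeholder class names
turn = "left" if label in TURN_LEFT else "right"
```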
Final Video & Future Steps:
The video to the right shows the Roomba driving up to a pokeball, identifying which one it is, and then turning 90 degrees to go to the next one. Once it gets to the end and recognizes Nidoran, it spins around as a celebration to show that it has reached the end.
Overall, this project went fairly smoothly. If we had more time to work on it, it could be interesting to try to train the machine to work with different backgrounds since currently it is only trained with the balls on the floor.
Pancake Making Robot
Tufts University, Spring 2025: Intro Robotics
Project Highlights:
Worked with a class of 30 people to design and build a robot to autonomously cook and present a pancake.
Built and wrote code for a linkage mechanism to use a fork to remove the pancake from the griddle.
Goal & Overview:
The goal for this project was to create a fully autonomous pancake-making robot that could take orders, cook two pancakes, put toppings on them, and deliver them to a customer. As a class, we divided into small groups to focus on the different sections. I worked with the cooking group and specifically focused on the mechanism to remove the pancake from the griddle (pictured left).
Design Process:
After some testing, we found that the easiest way to remove a pancake from the griddles we had was by tilting the griddles upwards and then using a fork to flip the pancake out. While the rest of my team worked on the mechanisms to open the griddle and raise and lower the table, I worked on the fork mechanism.
My idea for the mechanism was inspired by a Klann linkage (diagram pictured bottom right). The linkage is designed for walking robots, but I thought the shape of the movement at the bottom (green lines) could work well to remove the pancake. The initial idea was to run the mechanism backwards and use the upwards and forwards motion to flip the pancake out of the griddle.
I designed the mechanism in Onshape (top right) and then laser-cut the pieces, using screws with locknuts as the axles for each joint. There are also a few gears on the back to slow down the motor and increase the torque applied by the fork.
The three videos above from left to right show the initial linkage mechanism, an updated mechanism with limit switches added, and the first test with the full linkage system. I had to redo the lengths of the links in the initial system a couple times to get the fork to move the way I wanted it to. After that was done, I added the limit switches so that the fork could be consistently stopped in the correct location. The third video shows the system working relatively well, although it does bend the pancake quite a bit.
The next step was to attach the fork mechanism to the rest of the cooking mechanism and get everything working together, which ended up being one of the most challenging parts of the process. We had trouble getting things to line up correctly, so the original movement was not consistently removing the pancake from the griddle. After more testing, we realized that we could switch the direction of the fork's motion and use the straight-across motion to remove the pancake instead of trying to flip it. This did end up requiring us to grease the griddle with a lot of cooking spray to get it to work, however.
Coding:
I also worked on the code for the fork and full table mechanisms, which operated fairly similarly. Since everything had to be autonomous, we used an API through Airtable to communicate the status of each station between groups. I worked out the logic for the two cooking stations, which ended up being as follows: the batter is dispensed when the Roomba arrives at the first station; both griddles close, cook, then open; the first table raises, the fork extends and retracts to remove the pancake, then the table lowers and the Roomba moves on. Once the Roomba reaches the second station, the table raises, the fork moves, and the table lowers. I also developed the full protocol for communication between all of the teams.
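A hedged sketch of what that Airtable handshake can look like, assuming one record per station with a Status field; the base ID, key, record IDs, and status strings are all placeholders:

```python
import time
import requests

URL = "https://api.airtable.com/v0/BASE_ID/Stations"   # placeholder base/table
HEADERS = {"Authorization": "Bearer API_KEY"}           # placeholder key

def get_status(record_id):
    response = requests.get(f"{URL}/{record_id}", headers=HEADERS)
    return response.json()["fields"]["Status"]

def set_status(record_id, status):
    requests.patch(f"{URL}/{record_id}", headers=HEADERS,
                   json={"fields": {"Status": status}})

# Station 1's side of the sequence: wait for the Roomba, cook, signal done.
while get_status("recROOMBA") != "at_station_1":        # placeholder record ID
    time.sleep(1)
set_status("recGRIDDLE1", "cooking")
# ... close the griddles, cook, raise the table, run the fork ...
set_status("recGRIDDLE1", "done")
```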
Final Video & Overall Challenges:
The video to the right shows a full runthrough with the entire class's parts of the robot all working together. Overall, we were able to make a couple of pancakes, and with a little more time I think we would have been able to get everything working consistently and fairly smoothly.
The biggest challenge in this project was getting everything working together at the end. Even within the smaller group I worked with, it was difficult to get everything lined up correctly. I've learned from this project that on the next big, complex project like this, I want to set specific design goals much earlier in the process to make integration easier. If we had decided early on exactly what height the table would raise to and exactly where the fork needed to land, it would have been easier to design parts that fit together before all of the parts were completed.