R2-D2


MechE Team Leads: Ian McKenzie, Rebecca Cooper
ECE Team Leads: Rohan Agarwal, Shanee Lu
CS Team Lead: Rong Tan

The original R2D2 Project focused on creating a semi-autonomous lab assistant that could navigate and map out its surrounding environment. Since last year, the team has expanded R2's ability to interact with its surroundings, enabling the droid to complete tasks such as getting food from a fridge, opening a door, recognizing and greeting individual people, and even firing a nerf dart at a target. To generate excitement and interest in robotics and engineering, the team aims to publicize the R2 project and its design process through a Kickstarter campaign. This will hopefully work toward the long-term goal of having our R2 robot appear in a Star Wars movie.

Head

The team is working on integrating the nerf blaster, a periscope, a lens-aperture winking mechanism, and the head structure. We are also designing the new head and additional internal and external features on the R2 robot.

Precision Arm

We are designing a high-precision robotic arm for use on R2. The arm has five degrees of freedom and will allow R2 to perform various tasks, including waving hello, opening a door, and picking up a pen.
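As a rough illustration of how joint angles map to hand position, here is a forward-kinematics sketch for a simplified planar two-link arm. The real arm has five degrees of freedom, and the link lengths below are made-up values, not the actual arm's dimensions:

```python
import math

# Link lengths in centimeters (illustrative values only)
L1, L2 = 20.0, 15.0

def forward_kinematics(theta1, theta2):
    """Planar 2-link forward kinematics: joint angles (radians) -> end-effector (x, y)."""
    x = L1 * math.cos(theta1) + L2 * math.cos(theta1 + theta2)
    y = L1 * math.sin(theta1) + L2 * math.sin(theta1 + theta2)
    return x, y

# With both joints at zero the arm points straight along the x-axis:
x, y = forward_kinematics(0.0, 0.0)   # -> (35.0, 0.0)
```

The full 5-DOF arm chains five such transforms in 3D, but the idea is the same: each joint angle rotates everything downstream of it.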

Nerf Gun

Our goal is to create an automatic nerf gun, placed inside R2-D2's body, with two degrees of freedom. It is controlled by an Arduino and integrated with facial recognition.
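A hedged sketch of the aiming math: assuming a pinhole camera model with a made-up focal length, the pixel position of a face found by the facial recognition system can be converted into pan/tilt corrections for the two axes (the Arduino would then drive the servos to these angles; the camera parameters here are assumptions, not the real hardware's values):

```python
import math

# Assumed camera parameters (illustrative, not the actual hardware)
IMG_W, IMG_H = 640, 480
FOCAL_PX = 500.0          # focal length in pixels

def pixel_to_pan_tilt(target_x, target_y):
    """Convert a detected face's pixel position into pan/tilt corrections in degrees."""
    pan = math.degrees(math.atan((target_x - IMG_W / 2) / FOCAL_PX))
    # Image y grows downward, so a target below center means tilting down (negative)
    tilt = math.degrees(math.atan((IMG_H / 2 - target_y) / FOCAL_PX))
    return pan, tilt

# A face detected dead-center needs no correction:
pan, tilt = pixel_to_pan_tilt(320, 240)   # -> (0.0, 0.0)
```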

Locomotion

This subteam's work includes a head-nodding mechanism and features such as the nerf blaster attachment in R2's drawer.

Path Planning

The team is working on enabling R2 to traverse different terrains and avoid obstacles through the development of path-planning algorithms and the use of sensors.
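As one minimal example of such an algorithm, a breadth-first search over an occupancy grid finds a shortest obstacle-free path. The grid below is a toy map, not R2's actual sensor data:

```python
from collections import deque

def bfs_path(grid, start, goal):
    """Shortest 4-connected path on an occupancy grid (0 = free, 1 = obstacle)."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}          # also serves as the visited set
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            # Walk predecessor links back to the start and reverse
            path = []
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 and (nr, nc) not in prev:
                prev[(nr, nc)] = cell
                queue.append((nr, nc))
    return None  # goal unreachable

# Toy map: a wall of obstacles forces a detour around the bottom row
grid = [[0, 1, 0],
        [0, 1, 0],
        [0, 0, 0]]
path = bfs_path(grid, (0, 0), (0, 2))
```

A real planner would weight cells by terrain cost (e.g. A* with a heuristic), but the structure is the same: expand reachable cells until the goal is found, then backtrack.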

Facial Recognition

The goal is to recognize the faces of Cornell Cup members and automatically check them in for attendance. We are currently working on faster transmission between the programs, a better facial recognition algorithm, and testing a Google Docs auto-fill API for check-in.

Strong Arm

We are designing a durable robotic arm for R2, intended for operations that require more strength. This arm will be used for higher-force tasks, such as holding doors open and picking up heavy objects.

Object Detection

The goal of the object detection project is to classify and locate an object from an image capture in order to guide R2 around the lab. Our eventual goal is to create a pipeline that can train a model to recognize objects in the lab given a set of training data.
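As a toy sketch of such a train-then-classify pipeline, here is a nearest-centroid classifier on average color. This is a stand-in for illustration, not the team's actual model, and the labels and images below are made up:

```python
def mean_color(image):
    """Feature: average (R, G, B) over all pixels; image is a list of rows of RGB tuples."""
    pixels = [px for row in image for px in row]
    n = len(pixels)
    return tuple(sum(px[i] for px in pixels) / n for i in range(3))

def train(samples):
    """samples: list of (image, label). Returns a per-label mean-color centroid model."""
    feats = {}
    for image, label in samples:
        feats.setdefault(label, []).append(mean_color(image))
    return {label: tuple(sum(f[i] for f in fs) / len(fs) for i in range(3))
            for label, fs in feats.items()}

def classify(model, image):
    """Predict the label whose centroid is nearest (squared distance) to the image's feature."""
    feat = mean_color(image)
    return min(model, key=lambda lb: sum((feat[i] - model[lb][i]) ** 2 for i in range(3)))

# Hypothetical training data: 2x2 "images" of a mostly-red vs a mostly-blue object
red_img  = [[(250, 10, 10), (240, 20, 15)], [(245, 5, 5), (255, 0, 0)]]
blue_img = [[(10, 10, 250), (0, 5, 255)], [(20, 15, 240), (5, 0, 245)]]
model = train([(red_img, "stop_sign"), (blue_img, "recycling_bin")])
print(classify(model, [[(200, 30, 30)] * 2] * 2))   # -> stop_sign
```

A real pipeline would swap the color feature for a learned one (e.g. a convolutional network) and add localization, but the train/predict split is the same shape.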

Speech & Sentiment Analysis

The purpose of this project is to enable R2 to react to sentiment in the user's speech. Based on that sentiment, R2 should react either negatively, by playing a sad noise on its speaker, or positively, by playing a happy sound.
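A minimal sketch of the reaction logic, using a tiny hand-made keyword lexicon in place of a real trained sentiment model; the sound file names are hypothetical:

```python
# Tiny lexicon-based sentiment sketch (a real system would use a trained model)
POSITIVE = {"great", "good", "happy", "love", "awesome", "thanks"}
NEGATIVE = {"bad", "sad", "hate", "terrible", "angry", "broken"}

def react(utterance):
    """Score an utterance and pick which sound R2 should play (None = stay quiet)."""
    words = utterance.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "happy_sound.wav"   # hypothetical sound file names
    if score < 0:
        return "sad_sound.wav"
    return None  # neutral: no reaction

print(react("I love this awesome robot"))   # -> happy_sound.wav
```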

Early concept art of R2

R2 at National Maker Faire

Proposal for R2

Minibot


MechE Team Lead: Natalia Zeller MacLean
ECE Team Lead: Rishi Singhal
CS Team Lead: Sophie Zheng

The MiniBot project aims to create a cost-effective and intuitive learning platform for undergraduate and high school students to learn the basics of robotics. The MiniBot is modular and easy to assemble, so students can create anything from line followers and sumo bots to race cars with the system. The base will be compatible with both Vex and Lego pieces and will include custom electronics and modular assembly pieces. Additionally, there will be a simple user interface with a coding platform where students can quickly upload commands and code to the robot. Students will be able to handle every aspect of the robot's design, including electronics, assembly, and coding.

AppDev

A mobile Android app that controls the BuddyBot. Functionality includes video streaming from the Minibot camera, overhead coordinate detection, and simple object detection using AprilTags.

Buddybot

BuddyBot is a mini-project that increases Minibot's usability and acts as a miniature companion robot to our R2 robot. BuddyBot is not meant to be licensed for educational purposes; rather, it tests the limits of the Minibot technology by incorporating more advanced, novel features and technologies that novices may not yet be able to use freely. We hope BuddyBot will inspire young inventors to participate in robotics.

Vision

Overhead vision uses six cameras that are calibrated to detect the Minibot's location on the map and send this information to the mobile app.
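One common way to implement this kind of per-camera calibration is a planar homography that maps image pixels to floor coordinates. The sketch below applies an assumed 3x3 homography; the matrix values are illustrative, not from the actual system:

```python
# Each overhead camera would be calibrated once to a 3x3 homography H mapping
# image pixels to floor-map coordinates (values below are made up for illustration).
H = [[0.01, 0.0, -3.2],
     [0.0, 0.01, -2.4],
     [0.0, 0.0, 1.0]]

def pixel_to_map(h, px, py):
    """Apply homography h to pixel (px, py), returning map-frame (x, y) in meters."""
    x = h[0][0] * px + h[0][1] * py + h[0][2]
    y = h[1][0] * px + h[1][1] * py + h[1][2]
    w = h[2][0] * px + h[2][1] * py + h[2][2]
    return x / w, y / w   # divide out the projective scale

# A Minibot detected at the image center maps to the origin of this toy floor frame:
print(pixel_to_map(H, 320, 240))
```

Fusing the six cameras then reduces to transforming each camera's detection into the shared map frame and merging (e.g. averaging) the overlapping estimates.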

Laser Tag

Minibot features a laser tag game, with an automated laser turret mounted onto the bot.

LCD Touchscreen

Our LCD screen on Minibot currently displays a smiley face. We aim to add additional features, such as a weather display, a simple game, a music player, and an alarm clock.

General Improvements

The subteam is working on housing for motors and wires, a soccer-ball-kicking mechanism, an M&M candy dispenser, and a claw to grab objects.

Prototype of Minibot

Minibot disassembled

Schematic drawings of Minibot

Past Projects

Over the years, Cornell Cup Robotics has successfully created numerous projects. These range from a humanoid robot that can play Rock Band with 98% accuracy, to an autonomous omnidirectional rover named DuneBot, to functional droids inspired by C3PO.

Pogo

Interactive Wall

MagGame

Modbot V2

Pod-Racing

C3PO

Rock Band

Dunebot

Modbot V1

I2C2