CS 426 Senior Project - Spring 2024, UNR, CSE Department

RoomSense: Autonomous Robotic Assistant

Team Number: 37

Instructors: Sara Davis, Devrin Lee, David Feil-Seifer

External Advisor: Christos Papachristos

Short Project Description: Our project is a robotic assistant designed to autonomously navigate and identify objects within any room. The robot scans its environment, maps the layout, and detects items; once this process is complete, it reports the identified objects to the user. It is designed to be easy for anyone to use, including people with limited mobility or vision, and it provides a simple interface so that users of all levels of digital literacy can operate it.

Meet the Team

Gabriella Charalampidi

Emily McCue

James Kolby

Object Recognition

Autonomous Navigation

More details: Our project sets out to create an intuitive robotic assistant capable of autonomous navigation, object identification, and accessible reporting of what it finds across diverse environments. Built with ROS Melodic on the Hiwonder Jetauto Pro robot, our primary goal is a user-friendly system tailored to the specific needs of individuals with mobility or visual challenges. The core objectives are seamless scanning, mapping, and precise object identification, all converging to deliver a product that supports an individual's independence and promotes inclusivity. Our user base spans individuals with varying levels of digital literacy, ensuring that the technology we develop transcends barriers and is accessible to a broad and diverse audience.

The software is written in Python and C++ on top of the ROS Melodic framework. To enhance our object identification capabilities, we pair YOLO with the COCO dataset. The hardware backbone of the project is the Hiwonder Jetauto Pro, a robot equipped with LiDAR and a 3D camera, providing a robust set of tools for environment perception and object detection. Practically, our robotic assistant integrates a network of sensors for environment perception, precise actuators for controlled movement, and onboard processing to handle computationally intensive tasks.

Our project also extends beyond core functionality to dependability properties such as reliability, security, and safety. Given the potentially sensitive environments the robotic assistant may navigate and map, we prioritize a thorough testing regimen, security measures, and safety protocols. These measures collectively contribute to a reliable and trustworthy system, instilling confidence in users and stakeholders alike.

In essence, our project is not just about crafting a new piece of technology; it is about delivering a valuable solution that caters to user needs while aligning with broader societal goals of inclusivity and accessibility. By combining this technology with a user-centric approach, we strive toward a future where autonomous robotic assistants enhance daily life and break down barriers for individuals facing unique challenges.
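To make the object-identification pipeline concrete, the sketch below shows a minimal ROS Melodic node in Python that reads frames from the robot's camera and runs a YOLO model trained on the COCO classes through OpenCV's DNN module. The camera topic name, the weight and configuration file paths, and the confidence threshold are illustrative assumptions, not the exact values used on the Jetauto Pro.

    #!/usr/bin/env python
    # Minimal sketch of an object-identification node. Assumes the 3D camera
    # publishes color frames on /camera/rgb/image_raw and that YOLO files
    # trained on COCO (yolov3.cfg, yolov3.weights, coco.names) are available
    # locally; these names and the 0.5 threshold are placeholders.
    import cv2
    import numpy as np
    import rospy
    from cv_bridge import CvBridge
    from sensor_msgs.msg import Image

    COCO_CLASSES = open("coco.names").read().strip().split("\n")  # 80 COCO labels

    class ObjectIdentifier(object):
        def __init__(self):
            self.bridge = CvBridge()
            self.net = cv2.dnn.readNetFromDarknet("yolov3.cfg", "yolov3.weights")
            self.output_layers = self.net.getUnconnectedOutLayersNames()
            rospy.Subscriber("/camera/rgb/image_raw", Image, self.on_image, queue_size=1)

        def on_image(self, msg):
            frame = self.bridge.imgmsg_to_cv2(msg, desired_encoding="bgr8")
            blob = cv2.dnn.blobFromImage(frame, 1 / 255.0, (416, 416), swapRB=True, crop=False)
            self.net.setInput(blob)
            outputs = self.net.forward(self.output_layers)
            # Report any COCO class detected above the confidence threshold.
            for output in outputs:
                for detection in output:
                    scores = detection[5:]
                    class_id = int(np.argmax(scores))
                    if scores[class_id] > 0.5:
                        rospy.loginfo("Detected %s (%.2f)", COCO_CLASSES[class_id], scores[class_id])

    if __name__ == "__main__":
        rospy.init_node("object_identifier")
        ObjectIdentifier()
        rospy.spin()

Similarly, the scanning routine can hand waypoints to the ROS navigation stack once a map exists. The sketch below sends a single goal to move_base through actionlib, assuming move_base is already running against a map built with the robot's LiDAR; the target frame and coordinates are placeholders rather than values from our actual scanning routine.

    #!/usr/bin/env python
    # Minimal sketch of sending one navigation goal to move_base. Assumes a map
    # frame named "map" and an arbitrary waypoint chosen for illustration.
    import actionlib
    import rospy
    from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

    if __name__ == "__main__":
        rospy.init_node("send_scan_goal")
        client = actionlib.SimpleActionClient("move_base", MoveBaseAction)
        client.wait_for_server()

        goal = MoveBaseGoal()
        goal.target_pose.header.frame_id = "map"
        goal.target_pose.header.stamp = rospy.Time.now()
        goal.target_pose.pose.position.x = 1.0   # placeholder waypoint in the map frame
        goal.target_pose.pose.orientation.w = 1.0

        client.send_goal(goal)
        client.wait_for_result()
        rospy.loginfo("Navigation result state: %s", client.get_state())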

Project Related Resources