
Leon Kipkoech


MIT Department: Media Arts and Sciences
Faculty Mentor: Prof. Danielle Wood
Research Supervisors: Alissa Chavalithumrong, Scott Dorrington
Undergraduate Institution: Florida National University
Hometown: Nairobi, Kenya
Website: LinkedIn, Intern’s Website

Biography

Leon Kipkoech is an Information Technology senior at Florida National University specializing in Software Engineering. He is passionate about creating tangible and interactive technologies that solve real-world problems and push the boundaries of human interactions with computer systems. Leon has developed ubiquitous and accessible systems for Microsoft Unlocked, Google Design, and various other companies, impacting millions of users.
His research includes working with Dr. Danielle Wood at the MIT Media Lab on machine-learning models for gesture recognition to improve human-robot interaction systems on the International Space Station, and collaborating with Dr. Joseph LaViola at UCF to enhance VR situational awareness using a multidirectional sensor network.
Leon’s innovative tangible interaction projects have earned him top honors at prestigious institutions, including an invitation from the French government as part of the first-ever African delegation at an international XR conference. A cross-country athlete, Leon is also committed to fostering innovation by leading various engineering and technology organizations in South Florida and Africa. Leon aims to advance human-computer interactions and set development standards for tangible systems such as wearables and computer interaction in space.

Abstract

Space For All: Advancing Inclusivity Through Gesture-Based Communication on the International Space Station

Leon Kipkoech1, Alissa Chavalithumrong2, Scott Dorrington3, and Danielle Wood2,3
1Department of Computer Science, Florida National University
2Department of Aeronautics and Astronautics, Massachusetts Institute of Technology
3Program in Media Arts and Sciences, Massachusetts Institute of Technology


Non-verbal communication on the International Space Station (ISS) needs to be improved. In this study, we focus on improving human-robot interaction (HRI) while enhancing inclusivity for individuals with hearing and speaking disabilities. We propose a comprehensive gesture-based communication system tailored to the ISS environment and assess its potential impact on astronaut training and on achieving the United Nations Sustainable Development Goals (SDGs), particularly SDG 10 on reducing inequalities. Our study identified common hand gestures suitable for intra-vehicular activities (IVA), how training would be incorporated for preflight astronauts, and how the model would be implemented on ISS robots, particularly the Astrobee; training limitations arising from space policy constraints were also noted. Our methodology included a literature review, expert consultation, gesture development and testing, and the design of a machine-learning-based training program. We created a dataset of at least 100 samples across two gesture classes and evaluated several machine-learning models for gesture recognition. Our results include a validated set of space-specific hand gestures, a proposed training program for astronauts and robots, and an assessment of the potential impact on achieving the SDGs. This research aims to enhance operational efficiency, safety, and inclusivity on the ISS, potentially setting new standards for communication in space exploration.
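The evaluation described above (a ~100-sample, two-class gesture dataset fed to candidate machine-learning classifiers) can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes each gesture sample is a vector of 21 hand-landmark (x, y, z) coordinates (63 features, as produced by common hand-tracking tools), uses synthetic stand-in data, and picks a random-forest classifier purely as one plausible candidate model.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Synthetic stand-in for the ~100-sample, two-class dataset described in the
# abstract: each class clusters around a different "canonical" hand-landmark
# configuration (63 features = 21 landmarks x 3 coordinates).
n_per_class, n_features = 50, 63
centers = rng.normal(size=(2, n_features))
X = np.vstack([centers[c] + 0.1 * rng.normal(size=(n_per_class, n_features))
               for c in (0, 1)])
y = np.repeat([0, 1], n_per_class)  # hypothetical labels, e.g. "stop" / "proceed"

# Hold out a stratified test split so both gesture classes are represented.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0)

# One candidate model; in practice several classifiers would be compared here.
clf = RandomForestClassifier(n_estimators=50, random_state=0)
clf.fit(X_train, y_train)
acc = accuracy_score(y_test, clf.predict(X_test))
print(f"held-out accuracy: {acc:.2f}")
```

Swapping `RandomForestClassifier` for other estimators with the same `fit`/`predict` interface is how "various machine-learning models" would be compared on a dataset of this size.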
