Diego Castro Estrada

MIT Department: Electrical Engineering and Computer Science
Faculty Mentor: Prof. Asu Ozdaglar, Prof. Ashia Wilson
Undergraduate Institution: Florida International University
Website: LinkedIn
Research Poster
Lightning Talk


Hello! My name is Diego Castro Estrada. I’m a rising junior and an international student majoring in Computer Science at Florida International University. I’m originally from Costa Rica, and my native language is Spanish. In my free time I usually like to read, but I really enjoy playing video games as well, and I’m a soccer aficionado. I have some previous experience in natural language processing (I’ve worked at the Cognac Lab at FIU for over a year now), but my research this summer concerns the theoretical side of machine learning. Specifically, I’ll be working with Dr. Asu Ozdaglar, Dr. Ashia Wilson, and Alireza Fallah on applying transfer learning to problems with constraints. My goal is to pursue a PhD and, eventually, to go into academia with the hope of expanding our knowledge of machine learning (and artificial intelligence in general). I also hope to inspire future students, especially those from underrepresented backgrounds, to be passionate about research.

2021 Abstract

Transfer Learning with Constraints

Diego Castro Estrada¹, Alireza Fallah², Asuman Ozdaglar², and Ashia Wilson²
¹School of Computing and Information Sciences, Florida International University
²Department of Electrical Engineering and Computer Science,
Massachusetts Institute of Technology

Machine learning has proven to be a useful tool for solving problems in fields ranging from computer vision to medicine. However, the large amount of data required to train a model can make some problems difficult or outright impossible to solve. Transfer learning is a framework developed specifically to address this weakness. Generally, transfer learning leverages the knowledge held by source models previously trained on a task similar to the desired one. This knowledge then serves as a shortcut for creating a target model that performs satisfactorily on the desired task. However, simple transfer learning is still not enough to solve constraint-based problems (such as those requiring that a model be differentially private or fair). We propose a modification to the usual transfer learning framework that enables its use in such constrained settings. We train the source model and the target model simultaneously, minimizing a joint cost function consisting of each model's loss, a distance term that couples the two models, and indicator functions for constraint sets, which ensure that the target model complies with our specifications. Preliminary results are promising and indicate that our framework may expand the range of problems to which machine learning can be applied.
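The joint objective described above can be illustrated with a small sketch. The following is not the authors' implementation; it is a hypothetical toy version assuming least-squares losses for both tasks, a squared-Euclidean distance term coupling the source weights ws and target weights wt, and a Euclidean-ball constraint set whose indicator function is enforced by projection (its proximal map). All names, sizes, and hyperparameters here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 5
w_true = rng.normal(size=d)

# Source task: plenty of data. Target task: scarce data from a nearby model.
Xs = rng.normal(size=(200, d)); ys = Xs @ w_true
Xt = rng.normal(size=(20, d));  yt = Xt @ (w_true + 0.1 * rng.normal(size=d))

lam, radius, lr = 1.0, 1.0, 0.01   # coupling strength, constraint radius, step size
ws = np.zeros(d)                   # source model weights
wt = np.zeros(d)                   # target model weights

def project_ball(w, r):
    """Projection onto {w : ||w|| <= r}: the prox of the ball's indicator function."""
    n = np.linalg.norm(w)
    return w if n <= r else (r / n) * w

# Simultaneous (projected) gradient descent on
#   L_s(ws) + L_t(wt) + lam * ||ws - wt||^2 + indicator_{||wt|| <= radius}(wt)
for _ in range(500):
    grad_ws = Xs.T @ (Xs @ ws - ys) / len(ys) + 2 * lam * (ws - wt)
    grad_wt = Xt.T @ (Xt @ wt - yt) / len(yt) + 2 * lam * (wt - ws)
    ws -= lr * grad_ws
    wt = project_ball(wt - lr * grad_wt, radius)  # projection keeps wt feasible
```

After training, wt satisfies the constraint exactly while still benefiting from the source task through the distance term; swapping the ball for a fairness or privacy constraint set would only change the projection step.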