{"id":4557,"date":"2025-10-31T12:20:14","date_gmt":"2025-10-31T16:20:14","guid":{"rendered":"https:\/\/oge.mit.edu\/msrp\/?post_type=profiles&#038;p=4557"},"modified":"2025-12-09T12:06:04","modified_gmt":"2025-12-09T17:06:04","slug":"tanisha-shende","status":"publish","type":"profiles","link":"https:\/\/oge.mit.edu\/msrp\/profiles\/tanisha-shende\/","title":{"rendered":"Tanisha Shende"},"content":{"rendered":"<div class=\"wp-block-image\">\n<figure class=\"alignleft size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"2560\" height=\"2560\" src=\"https:\/\/oge.mit.edu\/msrp\/wp-content\/uploads\/sites\/2\/2025\/11\/ShendeTanisha-edited-scaled.jpg\" alt=\"\" class=\"wp-image-4558\" style=\"width:200px;height:auto\" srcset=\"https:\/\/oge.mit.edu\/msrp\/wp-content\/uploads\/sites\/2\/2025\/11\/ShendeTanisha-edited-scaled.jpg 2560w, https:\/\/oge.mit.edu\/msrp\/wp-content\/uploads\/sites\/2\/2025\/11\/ShendeTanisha-edited-300x300.jpg 300w, https:\/\/oge.mit.edu\/msrp\/wp-content\/uploads\/sites\/2\/2025\/11\/ShendeTanisha-edited-1024x1024.jpg 1024w, https:\/\/oge.mit.edu\/msrp\/wp-content\/uploads\/sites\/2\/2025\/11\/ShendeTanisha-edited-150x150.jpg 150w, https:\/\/oge.mit.edu\/msrp\/wp-content\/uploads\/sites\/2\/2025\/11\/ShendeTanisha-edited-768x768.jpg 768w, https:\/\/oge.mit.edu\/msrp\/wp-content\/uploads\/sites\/2\/2025\/11\/ShendeTanisha-edited-1536x1536.jpg 1536w, https:\/\/oge.mit.edu\/msrp\/wp-content\/uploads\/sites\/2\/2025\/11\/ShendeTanisha-edited-2048x2048.jpg 2048w\" sizes=\"auto, (max-width: 2560px) 100vw, 2560px\" \/><\/figure>\n<\/div>\n\n\n<div class=\"wp-block-group\"><div class=\"wp-block-group__inner-container is-layout-constrained wp-block-group-is-layout-constrained\">\n<p><strong>MIT Department:<\/strong> Media Arts and Sciences<br><strong>Faculty Mentor:<\/strong> Prof. 
Canan Dagdeviren<br><strong>Research Supervisor:<\/strong> Jason Hou, Shrihari Viswanath<br><strong>Undergraduate Institution:<\/strong> Oberlin College<br><strong>Website:<\/strong><\/p>\n<\/div><\/div>\n\n\n\n<div style=\"height:0px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<h4 class=\"wp-block-heading\"><strong>Biography<\/strong><\/h4>\n\n\n\n<p>Tanisha Shende is a rising senior at Oberlin College, double majoring in Computer Science and Mathematics with a minor in Sociology and an integrative concentration in Data Science. Tanisha is a human-computer interaction researcher in Dr. Shiri Azenkot\u2019s and Dr. Andrea Stevenson-Won\u2019s labs at Cornell University, studying ways to make technology, especially extended reality, more accessible to disabled people. Last summer, she conducted research at Gallaudet University under Dr. Abraham Glasser to explore the effectiveness of augmented reality in the technical education of d\/Deaf and hard-of-hearing students. Tanisha is a student worker in the Office of Undergraduate Research and the chair of the Mathematics Majors Committee, supporting students\u2019 success in STEM. She is also studying technology and foreign affairs through political research sanctioned by the U.S. State Department and as a Student Delegate to the Athens Democracy Forum. Tanisha is broadly interested in applying extended reality and artificial intelligence to healthcare, education, and accessibility while ensuring responsible technology development. 
She plans to pursue a PhD in Computer Science alongside a JD or a technology policy degree.<\/p>\n\n\n\n<h4 class=\"wp-block-heading\"><strong>Abstract<\/strong><\/h4>\n\n\n\n<p class=\"has-text-align-center\"><strong>A Mixed Reality System for Real-Time 3D Ultrasound: Improving Usability and Accuracy<\/strong><\/p>\n\n\n\n<div class=\"wp-block-group\"><div class=\"wp-block-group__inner-container is-layout-constrained wp-block-group-is-layout-constrained\">\n<div class=\"wp-block-group\"><div class=\"wp-block-group__inner-container is-layout-constrained wp-block-group-is-layout-constrained\">\n<div class=\"wp-block-group\"><div class=\"wp-block-group__inner-container is-layout-constrained wp-block-group-is-layout-constrained\">\n<div class=\"wp-block-group\"><div class=\"wp-block-group__inner-container is-layout-constrained wp-block-group-is-layout-constrained\">\n<p class=\"has-text-align-center\"><strong>Tanisha Shende<sup>1<\/sup>, Jason Hou<sup>2<\/sup>, Shrihari Viswanath<sup>2<\/sup>, and Canan Dagdeviren<sup>2<\/sup><\/strong><\/p>\n\n\n\n<div class=\"wp-block-group\"><div class=\"wp-block-group__inner-container is-layout-constrained wp-block-group-is-layout-constrained\">\n<div class=\"wp-block-group is-vertical is-content-justification-center is-layout-flex wp-container-core-group-is-layout-4b2eccd6 wp-block-group-is-layout-flex\">\n<p class=\"has-text-align-center\"><sup>1<\/sup>Department of Computer Science, Oberlin College<\/p>\n\n\n\n<p><sup>2<\/sup>Media Lab, Massachusetts Institute of Technology<\/p>\n\n\n\n<p class=\"has-text-align-center\"><\/p>\n<\/div>\n<\/div><\/div>\n<\/div><\/div>\n<\/div><\/div>\n<\/div><\/div>\n<\/div><\/div>\n\n\n\n<p>Ultrasound imaging is widely used for diagnosis and guidance due to its safety, cost-effectiveness, and real-time capabilities. However, traditional systems rely on 2D slices, demanding extensive training and spatial reasoning, which can lead to interpretation errors and operator variability. 
While 3D and 4D imaging have been introduced, interfaces for intuitive, real-time interpretation of volumetric ultrasound data remain lacking. We present a novel mixed reality interface that displays real-time 3D ultrasound data, aiming to enhance interpretability and usability. To evaluate its effectiveness, we are conducting a user study with both novices and experts in ultrasound imaging, comparing our system to conventional 2D, screen-based 3D, and mixed reality 2D interfaces. Participants perform object identification and structure reconstruction tasks; performance is measured by accuracy, speed, perceived workload (NASA Task Load Index), and qualitative feedback from semi-structured interviews. Results will inform improvements in user experience and system accuracy, with the goal of advancing ultrasound imaging interfaces toward greater interpretability and clinical utility.<\/p>\n","protected":false},"featured_media":4558,"template":"","profile_category":[23],"class_list":["post-4557","profiles","type-profiles","status-publish","has-post-thumbnail","hentry","profile_category-2025-interns"],"acf":[],"_links":{"self":[{"href":"https:\/\/oge.mit.edu\/msrp\/wp-json\/wp\/v2\/profiles\/4557","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/oge.mit.edu\/msrp\/wp-json\/wp\/v2\/profiles"}],"about":[{"href":"https:\/\/oge.mit.edu\/msrp\/wp-json\/wp\/v2\/types\/profiles"}],"version-history":[{"count":3,"href":"https:\/\/oge.mit.edu\/msrp\/wp-json\/wp\/v2\/profiles\/4557\/revisions"}],"predecessor-version":[{"id":4850,"href":"https:\/\/oge.mit.edu\/msrp\/wp-json\/wp\/v2\/profiles\/4557\/revisions\/4850"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/oge.mit.edu\/msrp\/wp-json\/wp\/v2\/media\/4558"}],"wp:attachment":[{"href":"https:\/\/oge.mit.edu\/msrp\/wp-json\/wp\/v2\/media?parent=4557"}],"wp:term":[{"taxonomy":"profile_category","embeddable":true,"href":"https:\/\/oge.mit.edu\/msrp\/wp-json\/wp\/v2\/profile_category?post=4557"}],"curies":
[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}