Host:
Eyal Heldenberg
Duration:
24:41
Release Date:
September 4, 2025
Dr. Gezzer Ortega, Assistant Professor of Surgery at Mass General Brigham, explores how AI-powered interpretation tools can expand language access in healthcare while preserving the human connection that builds trust. With over 26 million Americans speaking English “less than very well,” language barriers affect clinical accuracy, patient safety and provider wellbeing.
Nearly one in twelve Americans struggles with English proficiency, a population comparable in scale to the nation’s diabetes population. This group faces higher risks of miscommunication, medical errors, and reduced trust in care providers.
Dr. Ortega explains that language access isn’t just about translation; it’s about understanding meaning, culture, and emotion. He highlights how one word can hold opposite meanings across Spanish-speaking regions, underscoring the need for interpreters who grasp both language and context.
As the son of Dominican immigrants, Dr. Ortega grew up serving as an informal interpreter for his parents. That early experience now fuels his mission to make healthcare communication more equitable.
In his current work, he leads digital health initiatives for marginalized populations at Mass General Brigham, combining empathy and technology to address communication inequities.
Dr. Ortega envisions a future where artificial intelligence enhances, rather than replaces, human interpreters.
But he emphasizes that “small talk matters.” Technology must preserve the personal exchanges, such as greetings, reassurance, and humor, that strengthen trust between patient and provider.
Dr. Ortega’s message is clear:
“Technology should enhance, not replace, the human connections that define compassionate care.”
By pairing AI interpretation tools with professional interpreters, healthcare systems can achieve equity at scale without losing the empathy that makes healing possible.
In this insightful conversation, Dr. Gezzer Ortega, Assistant Professor of Surgery at Mass General Brigham, draws from both his research expertise and personal experience as the son of Dominican immigrants to explore the complex landscape of language access in healthcare. With 26 million Americans speaking English "less than very well" (a population nearly as large as those living with diabetes), Dr. Ortega illuminates how language barriers create cascading effects throughout the healthcare system.
Unlike many academic discussions that focus solely on policy, Dr. Ortega brings a practitioner's perspective to understanding how communication breakdowns impact both patient outcomes and provider wellbeing. From his childhood role as an informal interpreter for his parents to his current work leading digital health initiatives for marginalized populations, he offers a nuanced view of how technology can enhance rather than replace human connection in medical interpretation.
The conversation reveals the hidden complexities within language communities themselves, where a single word can have opposite meanings across Spanish-speaking regions, and how these subtleties affect both trust-building and clinical accuracy. Dr. Ortega's optimistic yet realistic vision for AI-powered interpretation tools provides a roadmap for maintaining the "small talk" that builds patient relationships while scaling language access across healthcare systems.
Key Moments
On the Scale of the Problem (02:20): "There are 31 million people in the United States with diabetes. And we think that's a big problem. So 26 million is not far from that. So it's a big problem."
On Technology Evolution (04:50): "There was a technology that I looked at last week, it's already outdated. You know, it's moving so fast. I can't keep up with it yet, but I'm trying to."
On Patient Autonomy (12:40): "They were discussing an increased sense of autonomy when they use an AI tool... because it's in their control. It's in their pocket, it's in their hands, and they can guide the conversation."
On Dialect Complexity (15:45): "A word I would say, ahorita, which to me meant later... And I realized that in Mexico and in many Central American countries... that word means in this moment, right? It means right now."
On Building Trust (19:45): "I truly believe that patient care happens in the small talk... when we work with an interpreter... we become very transactional... You don't tend to ask about the dog or the cat or who lives at home."
On Future Regulation (22:30): "We need the technology sector and the healthcare sector and the clinicians to work together on these policies... from the very beginning for everyone to be in the same room."
Key Takeaways:
Eyal Heldenberg (00:10) Hi, everyone. Welcome to Care Culture Talks. My name is Eyal Heldenberg. I'm the CEO of No Barrier, an AI medical interpreter for health care professionals. I'm very delighted to have Dr. Ortega with us today. Gezzer, how are you?
Dr. Gezzer Ortega (00:25) I'm doing well, as always. Thank you for having me. I'm excited to be here.
Eyal Heldenberg (00:30) Perfect. So we're going to talk about many things, of course, language access, cultural competency. But before that, Gezzer, would you like to give us your professional background and journey?
Dr. Gezzer Ortega (00:41) Yeah, I'm an assistant professor of surgery at Brigham and Women's Hospital, now Mass General Brigham, in the Department of Surgery, specifically within the Center for Surgery and Public Health. I lead a lot of the work focused on improving care for our patients who have been historically marginalized and or, I should say, kind of thinking about how we can use some of the digital health interventions that have come out to improve access and outcomes of surgical care.
Eyal Heldenberg (01:10) Amazing. You know, we've interviewed many providers, but I think you have a broader sense of the problem domain through your research, of the root cause maybe. So just on a general level, when we think about language barriers, what is the frontline like? What do providers and patients feel in those kinds of situations?
Dr. Gezzer Ortega (01:38) Yeah, no, I think it's a couple of things. One, the scope of the problem, right? There are 68 million people within the US that speak a language other than English, and of those, 26 million are considered speaking English less than very well. So that's usually the population that we're thinking about when we are trying to improve care or language access within our country.
And I bring that up because 26 million people sometimes doesn't sound like a lot, but there are 31 million people in the United States with diabetes. And we think that's a big problem. So 26 million is not far from that. So it's a big problem. It's a big issue that we have to confront. And then for clinicians in the healthcare space, it's extremely challenging when you can't effectively communicate with your patient. Because that is key.
It is vital to providing optimal care, right? Being able to communicate a diagnosis, communicate a treatment plan and what to do moving forward. And so there are a lot of things that happen when you can't do that, right? Everything from physician burnout to just this feeling of almost letting your patient down, right? Because you don't have the language or the resources necessary to communicate with them effectively. And so I think we feel an obligation to provide whatever resource we can to communicate with our patients. And I think it's part of the reason why there are so many workarounds that exist in this space.
Eyal Heldenberg (03:16) Yeah, there are many workarounds, these get-by solutions, and maybe they're more common than we would like to think. So when we think about the different solutions out there, right? There are interpretation services, the get-by solutions. How do you categorize them? What's best practice? What's not good, but we're doing it anyway? How do you see those?
Dr. Gezzer Ortega (03:42) Yeah, I think best practice in an ideal setting would be a human interpreter. I think that having a trained, certified medical interpreter in that language of the patient, but also someone that understands the medical language of that specific field that you're in is the ideal perfect scenario.
And it's important because there are so many languages where verbal and nonverbal cues are part of the communication. And so that is captured when you have a person in there with you. They can also capture the emotion, right? The facial expressions and some of these aspects. But that's not always available, right? We know that scaling that is extremely hard. And so there are other resources, right? Whether it be a phone, using a telephone to reach a remote interpreter and have them communicate. We've seen video and iPads increasing in use as well, which are a little bit better because you have video and can interact with a person in that setting. Now we have the advent of AI that is going to really transform how we look at interpreters. But, as I was telling someone earlier, there was a technology that I looked at last week, and it's already outdated.
You know, it's moving so fast. I can't keep up with it yet, but I'm trying to. But I think it's going to be a game changer for the interpreter field as well.
Eyal Heldenberg (05:07) I totally agree on that. I wonder like, if you try to look, you know, two years from now, maybe three years from now, what would be possible, I would say perceptions on technology, maybe both on the provider side and on the patient side, what would be around that?
Dr. Gezzer Ortega (05:27) Yeah, it's tough because I think right now the perception of technology in general is that it is a replacer, right? Like sometimes when we see technology, we think, oh, it's going to take something that we know and replace it, right? And it's hard not to think of it that way, right? Because you think of CDs being replaced by MP3s and all these other technologies that have come along. But I'm hoping that we see technology as an augmenter or an enhancer, right? And not so much a replacer. There are going to be different roles and different tasks that are going to now be done by technology, similar to what we see in manufacturing, right? There are a lot of robots that do jobs that humans used to do. But I think the big question within the next two or three years is how the technology will evolve and how it will be implemented into current workflows. Because to go back to your previous question of what's the ideal interpreter in the ideal setting? It depends, right? Because if I'm in a situation where we have a meeting scheduled on Thursday at three o'clock and I know that's happening and I know that you speak French, then I'm going to get that French interpreter and have them ready for our conversation.
But if it's something that's acute, unexpected, and I don't have an interpreter readily available, then maybe you use someone else that works in the hospital that speaks that language, or you go get the iPad, or in very extreme situations, you just use a family member, right? Because you need to communicate and you just need something, some resources. So the acuity sometimes, the urgency of the situation really makes it a little bit harder to do what's optimal.
Eyal Heldenberg (07:07) I totally agree. I think maybe in five or ten years there will still be some situations where you need humans. Imagine a senior citizen who maybe doesn't hear very well, or end-of-life decisions. You wouldn't want AI to interpret those because they're super sensitive. So maybe, like in many other verticals, AI would come and take some big chunk of things, but we'd still need humans for many use cases, right?
Dr. Gezzer Ortega (07:36) Absolutely, right? And you mentioned those end-of-life, goals-of-care planning situations, or even when you have to deliver bad news or hold a family meeting, right? It's helpful to have that person there with you serving as the interpreter and being able to read the room, right? What are the emotions? What's happening? What's going on? And so that's really helpful. But I agree that there are going to be many uses where technology is going to fill a lot of these gaps. And my hope is that in doing so, a patient who comes into the hospital or clinic or different healthcare settings won't have any gaps in their care and will receive it in a language they understand.
Eyal Heldenberg (08:18) Right, in every touch point there would be some solution that is relevant for that touch point.
Let's talk about something I get asked a lot about: low-literacy patients. I wonder if and how we could harness technology in those encounters, for example, when the patient doesn't read very well or has a limited level of understanding.
Dr. Gezzer Ortega (08:43) Yeah, the first thing I'll say about low health literacy is that it's a separate, unique concept from having a non-English preferred language, right? Because I always say, I can have a patient who comes in from another country who's a PhD, super high health literacy, but they just don't speak English, right? They were born in another country, a doctor in another country, and they come here because they have... and while they were visiting the United States, they got appendicitis or they got injured and now need care, right? So having a non-English preferred language doesn't necessarily mean that you have low health literacy, right? That's the first thing I always try to debunk, this assumed connection. But there are people who have a non-English preferred language and also have low health literacy, and that makes the challenge even greater, right? Because now you have to address not only a language that you need to navigate, but also how you're going to communicate with this patient, right? And do it in a way where they can understand what you're saying. And I think technology affords some of that, right? We've seen tools that can quickly adapt to the different levels of literacy that a patient may have in the moment, which is helpful. I think we as clinicians also need to get better at describing or explaining the things that we do in a way that others can understand. Sometimes we talk about a teach-back method, or I may say to the patient, tell me in your own words, what do you understand about what we're doing today and what the plan is? And one of the things I always tell the students and the people I work with is that the face of confusion is universal. No matter what language you speak, you look at someone's face and you can tell they either get it or they don't, right?
And if they don't, then you need to take a step back and say, okay, well, how do I explain this in a way where this patient will understand what the plan is? And it takes some work, some practice, but that's something that I'm hoping technology can help us do a little bit more efficiently, at least a little bit faster in those settings and think about ways where we could be more creative in describing what we want to communicate to our patients.
Eyal Heldenberg (11:01) Yeah, I totally agree. I think that technology could come and help simplify concepts. Some doctors' level of language is, you know, kind of medical language, and there are use cases where technology would come in and say, hey, we need to simplify that, or here's a simpler way to put it. Maybe even just the speech part, for example, when a patient cannot read, having the spoken, explainable part of it could come in handy. We talk a lot about provider autonomy and patient autonomy in the sense of freedom to operate. And I would say that when a language barrier exists in the room, both autonomies, both parties, are not at the optimal level. I wonder what's your position on technology around that, and can we, or should we, restore the same autonomy level as an English-speaking provider with an English-speaking patient?
Dr. Gezzer Ortega (11:59) Yeah, I think that's a great point. I was recently engaging with a couple of patients and they were discussing an increased sense of autonomy when they use an AI tool, like Google Translate or something like that, because it's in their control. It's in their pocket, it's in their hands, and they can guide the conversation, versus what traditionally happens in our healthcare system, where the clinician has to bring the interpreter to you, or they have to bring an iPad or a phone to you so that you can communicate with them. So there's this increased sense of control. But also, we've worked with technology that runs on mobile phones, and our clinicians have loved it because, they told us, if they have a mobile phone, it goes wherever they go. That means they have an interpreter wherever they go, no matter what setting they're in. And sometimes you may be in one part of a hospital where you can't find the iPad or the phone, but you have your mobile phone. And so your smartphone allows you to quickly use an app that lets you communicate with that patient at that time. And so there's this greater sense of empowerment on both ends, right? On the clinician side, the provider side, as well as the patient side, because they have access to this technology and they can initiate it whenever they need it.
Eyal Heldenberg (13:24) Yeah, I didn't mention that you are a Spanish speaker, right? You're coming from the community. Maybe we'll talk about this specific community. I wonder if you could describe a bit the special things providers should know when working with the Latino community, with Spanish-speaking communities, specific sensitivities, cultural aspects, family politics, or something like that.
Dr. Gezzer Ortega (13:52) Yeah, no, I mean, it's part of the inspiration for the work, right? My parents immigrated to the United States from the Dominican Republic, and they navigated our healthcare system, our country, as primary Spanish speakers. And so early on, I was that conduit, that ad hoc interpreter for them in various scenarios and situations as I was learning English as a child, navigating this multilingual, bilingual world and helping them get whatever resources they needed. And so it's the reason that I do a lot of the research and work that I do in making sure that I can provide that access. And I've learned so many things in doing this work, right? I am from New York City, but I trained in Washington, DC. And I remember being in DC where, although there's a large Latino community in both cities, in DC the Latino community is predominantly South American, while in New York you have a larger Puerto Rican and Dominican community. And there's a word I would say, ahorita, which to me meant later. So sometimes I would tell a patient, I'll come back ahorita, meaning I'll come back later. And then when I moved to DC and said the same word, the patients were confused. They'd ask, what do you mean you'll come back ahorita? I'd say, I'll come back later, and they'd say, no, ahorita means now. And I'm like, no, it means later. And I realized that in Mexico and in many Central American and South American countries, that word means in this moment, right? It means right now. And so I had to change the phrase I use when I leave an encounter where I'm speaking with someone in Spanish because of that. There are so many little nuances even within the Spanish language, right? The same exact word can have a totally different meaning. And so we have to be mindful of that.
And so my hope is that technology can pick up on those nuances, right? Because there could be a big difference between I need something now versus I need something later, right? It's the same word, but depending on where I'm from, I'm interpreting it in a different way. These are little things that I've learned along the way, how even within the same language you have so many different meanings for one word, and how people's use of a word varies. It's a humbling experience that makes me realize we need to be thoughtful about how technology addresses the nuances and dialects within the different countries that may speak the same language.
Eyal Heldenberg (16:38) Yeah. We heard different stories where you have a patient speaking one dialect of Spanish and a medical interpreter is brought in who speaks another dialect, with maybe a 50% overlap in the terminology. It kind of slows things down because it's not exactly your choice of words, your wording, the tone, the pitch, even the rhythm of the speech.
Dr. Gezzer Ortega (17:19) Absolutely. I mean, one of my favorite examples, and this is not the best word, but it depends on where you're from. And so it's the word bicho. And so in Argentina, that word is like a toddler. But in Mexico, it's an insect, like a bug. But in Puerto Rico and in Dominican Republic, it's slang for penis. And so the same word has a very different meaning depending on where you are, right? And so I think that, and there are many words like that. And so I think you're absolutely right with, you have to be mindful of where the person is coming from so that when you're communicating, you understand what their meaning of that word is so that you can make sure you're on the same page.
Eyal Heldenberg (18:06) Yeah, yeah, absolutely. I wonder what's your take on trust, as a patient, when you approach and have those touch points with the healthcare system, trust in general and trust for limited English proficiency patients. What have you learned throughout the years?
Dr. Gezzer Ortega (18:30) Yeah, I mean, my take on trust is that we as clinicians have dropped the ball sometimes on making a good first impression. I think to inherit trust and to work with trust, you need to make a really good first impression. I think when you don't, then you feel like you need to work to regain a person's trust or build it back because we've already lost it. And so I think that part of it is figuring out how to make that really strong first impression. And I think for patients with limited English proficiency, I think that making sure that you have the resources they need to effectively communicate makes a very strong first impression, right? Having an interpreter ready or having a resource or technology ready for them to use, but then also providing them with that resource throughout their entire care, lets them know that you want to hear what they have to say. And then the second thing I'll say about that is I truly believe that patient care happens in the small talk. And so when we have an opportunity to communicate effectively in English, I can ask you about your dog and your cat and who you live with and all these things that may not be directly related to why you're here in the hospital or in the clinic, but are important to you and are important for me to get a full picture of who you are, right? And so, for instance, if I know what your living situation is, then I may say, okay, well, maybe we need to get you rehab or a nurse at home or something because you live alone, right? Or, I may learn that you have a lot of support at home. And so, I will feel more comfortable sending you home and letting you do these things at home because you have a lot of support. But all of those things happen in small conversations and the small talk. And sometimes when we work with an interpreter or we're using another medium to communicate, we tend to forget that. We become very transactional, right? 
Like we just say, tell me what I need to know, I get what I need to get, and that's it, right? You don't tend to ask about the dog or the cat or who lives at home. And so what we have to be mindful of, to build trust with this community, is that we need to continue to do the same things that we do even when we're working through an interpreter or a technology, right? Not to lose that human touch in these conversations, right? Because that's where we'll be able to learn the full scope of the patient and what we can do to really take care of them optimally.
Eyal Heldenberg (21:07) Yeah, this is a great insight. I wonder, let's talk a bit about medical translation and regulation, policy, what's now? How do you think this will evolve in the next couple of years?
Dr. Gezzer Ortega (21:23) A lot. It's going to continue to evolve. I think that currently CMS requires that anything that is AI translated be audited by a human. And so right now we're using AI technology in various spaces, especially translation of documents, which is a common use of AI, but that still needs to be vetted and audited by someone, a linguist who can verify that the translation was correct. I think that will get better with time. I think that we will find various use cases where we may not need to have everything vetted by an interpreter or a linguist. I do think that there are certain aspects that are going to be a little bit more challenging than others, like the informed consent process, right? That is one where there's a legal component to it and there's a contract that's being signed by two parties. And what does it mean to have that translated or interpreted by AI, right? Or translated by AI and then vetted by a human? We still have to work through these things. But with a lot of technology, we tend to be very reactive when it comes to policies, right? We see what happens and then we say, okay, well now we need a policy to address this, versus being proactive. But I think what's important is that we need the technology sector and the healthcare sector and the clinicians to work together on these policies, right? We need, from the very beginning, for everyone to be in the same room thinking about these things, because if not, we could miss an opportunity, or end up with a policy that has unintended adverse consequences.
Eyal Heldenberg (22:48) Yeah.
Dr. Gezzer Ortega (23:09) we could miss an opportunity have a policy that's not going to, that may have a un-consequential adverse effects or.
Eyal Heldenberg (23:20) Yeah, yeah, makes sense. So maybe just one last question for today. Are you optimistic towards the future and your vision to it?
Dr. Gezzer Ortega (23:32) I'm super optimistic, but I'm also an optimist by nature. I think that the future is going to be one where technology allows us to provide better care to our patients if we use it right, right? If we use it effectively, if we're smart about how we incorporate it, I think it has an opportunity to allow us to provide better care. But also from an equity standpoint, an equity lens, I think it may allow us to really fill a lot of the gaps that inequities have caused within our healthcare system. And if we can do it in a way where we're thoughtful and mindful about the populations that we're caring for, I think we can have a great impact on the overall health of our patients.
Eyal Heldenberg (24:25) Amazing, this is a great closure. Thank you very much, Dr. Ortega for being in Care Culture Talks. I appreciate it.
Dr. Gezzer Ortega (24:33) Thank you for having me. I appreciate it.