Host:
Rivka Allouche
Duration:
21:01
Release Date:
May 13, 2026

In this episode of the Care Culture Talks podcast, Rivka turns the tables on the show's usual host, Dr. Aurelio Muzaurieta, a resident physician in Stanford Emergency Medicine, to explore what language access really looks like on the front lines of emergency care. Dr. Muzaurieta speaks five languages: Spanish, Brazilian Portuguese, French, Mandarin Chinese, and English, and uses them regularly with patients. Yet he estimates that roughly 40 percent of the people he treats have limited English proficiency (LEP), often in languages he doesn't speak, including Vietnamese, Russian, and Tagalog.
The conversation walks through the three main interpretation modalities used in large academic health systems today: in-person interpreters, iPad-based video interpreting and phone-based call center services. Dr. Muzaurieta is candid about where each one works and where each one breaks down, from the hardware limitations of audio-only phone interpreting, to the lack of continuity when a new interpreter joins partway through an encounter, to the moments when a minute of patient speech comes back translated as three words and the clinician is left wondering what was lost.
A significant portion of the episode is devoted to the hidden equity costs of language barriers in healthcare. Encounters with LEP patients can take two to three times longer than encounters in English, raising real questions about time to attention, time to pain medication and time to clinical intervention. These are disparities Dr. Muzaurieta is now exploring through his own research. He also speaks to the discomfort of seeing family members, sometimes children or grandchildren, pressed into service as ad hoc medical interpreters and the undue burden that places on them.
The discussion then turns to artificial intelligence. Dr. Muzaurieta does not yet use AI for medical interpretation in his current training environment but he is closely watching the space and collaborating with the team at No Barrier on what the future of AI-powered language access could look like. He shares a measured, practitioner's view of AI in healthcare more broadly, including his positive experience with clinical reasoning tools and his more mixed experience with AI scribes, which in his case actually slowed his documentation down rather than speeding it up. The takeaway is a nuanced one. AI has clear potential to ease the friction of multilingual care, but only when it is designed for the specific complexity of clinical environments, including rooms where multiple people are speaking multiple languages at once.
Throughout the conversation, one theme keeps surfacing. Patients with limited English proficiency notice when their providers make an effort to meet them in their own language, and that effort shapes trust, comprehension and ultimately the quality of care they receive.
This episode is essential listening for emergency physicians, healthcare leaders, medical interpreters, health equity researchers and innovators building the next generation of language access and AI tools for healthcare.
00:00 Introduction to Multilingual Patient Care
02:00 Language Proficiency in Healthcare
04:57 Tools for Overcoming Language Barriers
08:52 Experiences with Call Center Interpreters
13:04 The Role of AI in Medical Interpretation
18:00 Patient Perspectives on Language Access
23:50 Future of Language Interpretation in Healthcare
Rivka Allouche (00:08)
I'm really happy to host Dr. Aurelio Muzaurieta today. He's a resident physician in Stanford Emergency Medicine, whose work sits at the intersection of emergency care and multilingual patient communication. Aurelio, you are usually the one behind the mic, hosting healthcare leaders on the Care Culture Talks podcast. But today, we are turning the tables and stepping into your clinical world. So this leads to my first question. Tell us how many languages you speak.
Aurelio Muzaurieta (00:38)
Rivka, it's so nice to see you, and thank you for having me on the podcast today. I speak Spanish, Portuguese (Brazilian Portuguese), French, and Mandarin Chinese. I've studied these languages since I was in secondary school, for the most part, and had exposure in a multilingual and international community here in the United States that inspired me to learn more languages and to be curious about different cultures and different worlds, particularly with how linguistically and culturally diverse the U.S. is. So I often use these languages in the medical setting here in California. There are patients who speak far more diverse languages than I'm able to use, and so we are oftentimes using interpreting services as well, in conjunction with any doctors or other advanced providers who speak those languages. I'm very excited to be able to speak to and treat patients in their own language. I would say we are usually doing Spanish and Mandarin, but other languages that I wish I spoke in this area might be Vietnamese or Russian. Tagalog is very common, and many, many more.
Rivka Allouche (01:53)
So when you say very common, can you be a little bit more specific about your day-to-day? Can you say how many patients you meet who do not speak English?
Aurelio Muzaurieta (02:02)
I would say it's probably about 40% of the patients. And it's not necessarily that they don't speak English; I would rephrase that and think about it as limited English proficiency in the context of a healthcare setting. So many folks have a certain proficiency in operating in English, but when it comes to their health,
I don't know about you, but if I'm in a foreign country and I don't speak the language and something is going on with my health, I really would want to be able to use my own language to speak that to the other provider on the other side to make sure that they're able to understand what I'm saying truly because different nuances in a language around...
pain or discomfort or dizziness, headache, some of these concepts that we feel can be quite nuanced actually when it comes to translating them. And I find that that's one of the challenges of learning languages and practicing in a linguistically diverse health setting.
Rivka Allouche (03:06)
And so where you're practicing, what are the tools that are used to overcome language access or language barriers?
Aurelio Muzaurieta (03:16)
Absolutely. So it depends a bit on the environment that you're practicing in. Larger academic health systems, quaternary systems that have all of the highest technology and a lot of resources tend to use three different modalities, I would say, for the most part.
Most ideal is an in-person interpreter for a language. We do have these available 24-7 in Spanish, as well as Mandarin Chinese with some limited availability. That's very nice because they're able to see the patient in person and be side-by-side.
This is obviously a very difficult thing to make sure that you have 24-7. We also have iPads and phone interpreters that we use where we call in a live interpreter.
This is a great resource for mid to low acuity patients, not somebody who's coming in really in extremis, pretty sick. It's also a little limited in patients with age-related hearing loss who aren't able to engage well with the technology. So all that to say, there is a lot that we do now to try to fill the gap.
But it certainly isn't perfect and we're looking for different ways to expand language access for patients with limited English proficiency and really making them feel comfortable in the health setting, making them feel that all of their concerns, questions, and really what they're experiencing is being interpreted in the correct way, in an accurate way, and that they're getting the care that they need. So that's really the main goal here.
Rivka Allouche (04:58)
And so when you say that you have in-house Spanish interpreters, are they able to speak, like, Dominican Spanish and Cuban Spanish? How does it work to understand the nuances and also the dialects?
Aurelio Muzaurieta (05:11)
That's a great question. For most of the Spanish, there is sort of a neutral Spanish that can be used, and there are certain phrases that I think people try to get around. As I am in my training phase here at Stanford, most of what I'm experiencing is within that context, so my opinions are really my own observations and not those of the greater university. I will say that when I see patients from different Spanish-speaking backgrounds, most of the time we're able to understand each other, and there are going to be a few phrases or a few accent differentiators that make things more difficult to understand. When that's the case, I would say it slows down the conversation, because we have to use different words to go around them. An interpreter is just one person who has their one experience and that is.
Rivka Allouche (06:13)
So like the provider is trying to catch if something slows down and maybe try another method.
Aurelio Muzaurieta (06:19)
Yeah, for the most part. Let's say a word is used that the interpreter might not be familiar with in a different dialect of Spanish; then they might ask to clarify that word that they're not used to. And this happens in a lot of different languages. I've noticed when we're using interpreting services in Vietnamese, for example, there may be a word used by a particular generation that's different to describe a headache or dizziness. And what I've noticed, in the context of ensuring the correct interpretation, is that we pause and say, hello,
I'm just going to clarify with a few sentences what this patient means by that phrase. And yes, that's a more academic way to think about it. It's a bit of a teach-back method, a bit of a clarification moment that is slower. And really, we want to ensure accuracy and precision, but there's also that element of
Rivka Allouche (07:05)
The teach-back method?
Aurelio Muzaurieta (07:25)
whether you're in an emergency department or a fast-paced clinic, you want to be efficient and see everybody. You want to see as many patients as you possibly can while attending to all of the different needs of each individual patient. And it can be very challenging when you have a patient with language access needs. This can lengthen the encounter in the healthcare setting
two- to threefold to get the information that you would otherwise get. And so it really becomes a question of equity and access. Working alongside some of the innovators at No Barrier and thinking about the future of language interpretation, I find that there are so many opportunities to fill this gap. And that's one of the reasons why I'm working with this group and trying to help move things forward.
Rivka Allouche (08:19)
Thank you. And do you yourself have experience with calling call centers?
Did you happen to have this situation with your patient, or maybe a nurse or someone working with you, and share feedback about how it works, what could be improved there, and what actually works well?
Aurelio Muzaurieta (08:34)
Despite having studied and being credentialed in several languages, obviously they're never enough; there's always more. There are points, too, where I use interpreters to augment the languages that I do speak, if I feel like there's something I'm not understanding or that I need some clarification on. There's no shame in this. In fact, I think it's better for the patient in many ways. My experience with using call center interpreter services is a mixed one. I will say it's better than not having anything.
One aspect, I would say, is the technology itself, where it's all focused on audio for the most part. And so some patients really would appreciate maybe a visual as well as an audio aspect, with
Rivka Allouche (09:09)
like slowing the process or...
Aurelio Muzaurieta (09:29)
being able to read what is being said, I think, can help some patients. We don't really have that aspect when you're calling on a phone. And then there's just the speaker and how loud it can be, some of the hardware stuff. And then the actual connecting experience is a little bit choppy, because oftentimes you're getting
an interpreter for one aspect of the encounter, but it's not the same person throughout the entire encounter with that patient. So if you're going into a room initially meeting the patient, you have an interpreter on the iPad or the phone, and you're able to use that service.
You have that interpreter, and they might be a very nice person. And then you finish your questioning and you have to go; you're called for a code or a trauma and you have to go deal with the emergency. And then you go and see this patient again, and you're getting a whole different interpreter who has no idea of the context of the relationship, and you have to start over a little bit there. That can happen three, four, five times in an encounter. So there's a bit of a lack of continuity with the interpreter.
And then the quality can vary quite a lot, it feels. Sometimes
there will be a conversation that goes on back and forth, and you're like, I only said three words and this was a minute of conversation. Or you'll get something back that's only three or four words, and you wonder what was said or if you missed anything. So obviously you never really know, but you try your best. At the end of the day, my experience is mostly positive, but I feel like there could be so many more aspects to the service that is provided. And anything that could be automated or augmented with artificial intelligence, I think, really could improve the patient and doctor experience.
Rivka Allouche (11:23)
So the next question would be, as a transition: do you yourself use AI for medical interpretation in your day-to-day?
Aurelio Muzaurieta (11:31)
In my current practice, no. And part of that has to do with the fact that I'm in the training phase of my work and I operate within a larger system that I need to respect. So the ways that I use interpreting services are the more traditional ones: either the iPads or the phones, or the bilingual credentialing system that I am a part of within the larger institution. I've tried it outside of a real clinical context and found it to be pretty exciting and pretty new. And I think that once I start practicing on my own, I'm excited to practice more, learn more about it, and help advance some of the technologies from the medical provider perspective.
At the end of the day, my whole goal is let's make sure that we're providing the best care to our patients. And one aspect of that is being able to communicate with them accurately, precisely, smoothly, and compassionately.
Those barriers are automatically higher whenever you have a patient without English as their primary language, or who would prefer not to speak English. One of the other dynamics I think about often is the many patients who come in and elect not to use an interpreter, and will instead use a family member, such as a child or a grandchild, to interpret for them, because they feel more comfortable with that person interpreting. Unfortunately, the child or grandchild for the most part has no experience in medical interpreting. They know their family member very well, but in my opinion it puts an unfair, undue burden on that family member to try to interpret all of the different medical complexities and nuances that a patient might be presenting with. I find that most of this fear or discomfort with using an interpreter is based in the idea that they probably have had some negative experience in the past, and feel like it's not personal enough, or it's not the best quality, or the interpreter will come on speaking a dialect or a certain regionality that's different from what they're used to. And so it distracts the patient. There are so many different ways that it becomes difficult. And that's sort of why I find this space very exciting, because linguistically, if you're somebody who's lived in France and Israel and places that speak so many languages in Europe and the Middle East, you're always used to having all of these different linguistic...
Rivka Allouche (14:27)
and also language barriers as I have been an expat so I know
Aurelio Muzaurieta (14:30)
Yes.
And you feel that frustration sometimes, but also there can be, that's sort of what I'm thinking here.
Rivka Allouche (14:42)
Like you really want to understand, but you don't want to make a mistake. So I exactly know what this is.
Aurelio Muzaurieta (14:46)
Yeah, and you might say less. So, I mean, my whole goal is to keep expanding, keep learning, and keep using the technology in the ways that we safely can, and then recognizing where technology's and AI's barriers and limitations are, so we can be respectful of that as well.
Rivka Allouche (15:08)
So I assume that if AI is not in your day-to-day right now, maybe you use AI for other needs, for instance, AI scribes or other AI tools in healthcare.
Aurelio Muzaurieta (15:22)
I do. I've been learning and trying some of the different AI technologies. Some of the things that I've been using are clinical reasoning tools. The one that has been most useful for my personal practice, and I don't speak on behalf of my other colleagues within Stanford, this is mostly my own experience, is OpenEvidence, which I find very good for clinical reasoning. It typically pulls from resources that are well regarded and renowned in the medical community and synthesizes complex clinical decision-making. I think it's becoming very ubiquitous in practice. Then there are AI scribes, which I've actually had pretty mixed experiences with as a provider. I find that the experience actually makes the documentation a little more difficult, ironically. It's still not quite at the level where it's teasing out the nuances in a medical encounter, when a patient is explaining a certain story or part of their history that may not be fully related to the encounter. I found myself actually having to edit the AI's created interpretation of the encounter, and that takes me longer than just writing it myself. So I find it's not quite there. This could be just the version that I'm using, and maybe there's a better one that I haven't found, but I actually ceased using AI scribes because it makes me slower than documenting the visit myself. So that was a little bit of a bummer to realize, but it's also just part of the excitement around new technologies and the feedback loop that we need to be having as we try to bring AI efficiently and effectively into the medical space, realizing that some things are working and some things aren't. And maybe that's just my personal experience with it. I do know other providers who use it and feel comfortable with it.
Rivka Allouche (17:25)
It's great to hear your perspective. You know, we talk a lot about the AI that is brought in by the medical side, and you talked about the family members that are sometimes used as interpreters. But from the patient perspective, have you ever seen any patients just taking their Google Translate, or their phones, or any LLM to help the conversation, just to have their interpreter in their hands?
Aurelio Muzaurieta (17:52)
I have seen this. Most of the time it hasn't been the LLMs that are live back and forth. But I have seen them used for particular phrases, if a patient feels comfortable, and they say, I want to speak English, but I don't know how to say pancreatitis, or I don't know how to say appendicitis, or pneumonia. They'll look up how to say it, write it, and then show me. So there are ways that we go about that. The things that have been most interesting to me are the patients who come in with their family members, and then we try to use an interpreter, and multiple people are in the room trying to speak different languages. I know that both AI and in-person interpreters have difficulty sorting out who is saying what. Because the art of history taking in medicine typically doesn't rely on the patient alone; you'll get collateral history from a spouse or a child that really helps to put into context what's going on. And that can be a complex dynamic across languages. In a way, I'm optimistic that AI technology is going to be able to help sift through that a little bit more. And obviously, the more data we give these systems, the more they learn from it, and we can continue to make it an easier experience so that patients, the health systems, the doctors, and our nursing staff can all more easily work together. That's what I'm mostly excited about.
Rivka Allouche (19:31)
So do you feel on a more personal level that people who are not speaking English as their primary language feel that providers are really making an effort just to fill that gap and to overcome those language barriers? Do they feel that? Do they feel this effort?
Aurelio Muzaurieta (19:51)
I wish I could put myself in that position as a patient within our healthcare system in the United States, because I don't know if I have ever asked a patient whether they feel like they've been ignored, or have had to wait longer because it's more cumbersome. There are a few research studies that I've been getting involved in that are trying to look at some of these disparities in experience between patients who speak English primarily and patients with limited English proficiency: time to attention, time to pain medications, time to particular interventions, and whether or not there's a disparity there
based on the language barrier that's erected between the provider and the patient. I would venture to say there may be, but I don't have enough data to be able to tell you whether or not that's the case. We try our best, I'll say that.
Rivka Allouche (20:51)
Okay. That's true.