Jan. 13, 2023

What is the future of AI in surgery? Jawad Ali - Vality Partners

I always have a fun time speaking with Jawad Ali and it’s always a pleasure. Jawad is a practicing general surgeon, founder of Vality (a physician-led consulting firm specializing in digital surgery, patient engagement, and surgical devices) and founder of Austin MedTech Connect (a private community connecting founders, clinicians, investors, and healthcare stakeholders). We talk about success, surgical innovation, physician burnout, medtech in Austin, and goal setting. I hope you all enjoy this conversation as much as I did.

1
00:00:00,000 --> 00:00:07,000
Hi, everyone. I'm really excited to bring you this conversation with Jawad Ali. I always have a lot of fun talking to Jawad and it's always a pleasure.

2
00:00:07,000 --> 00:00:16,000
He is a practicing general surgeon, founder of Vality, which is a physician-led consulting firm specializing in digital surgery, patient engagement and surgical devices.

3
00:00:16,000 --> 00:00:24,000
He is also the founder of Austin MedTech Connect, a private community connecting founders, clinicians, investors, and healthcare stakeholders.

4
00:00:24,000 --> 00:00:35,000
We talk about success, surgical innovation, physician burnout, the medtech scene in Austin, and goal setting. I hope you guys enjoy this conversation as much as I did.

5
00:00:35,000 --> 00:00:50,000
Thanks so much for being here today, Jawad. It's always a pleasure speaking with you. I think to get started, let's start with your childhood. How has your childhood shaped you into who you are today?

6
00:00:50,000 --> 00:01:07,000
Excited to be here, Rashad. That's a good opening question. A little bit of a long answer. I was born in Pakistan. We left Pakistan when I was about six, moved to Saudi Arabia, lived in England for a year, moved to the US when I was in eighth grade, kind of moved around in the States for a little bit.

7
00:01:07,000 --> 00:01:21,000
I would say one defining aspect of my childhood was always moving and never being in one place for more than maybe three or four years, and it exposed me to different cultures, different environments.

8
00:01:21,000 --> 00:01:31,000
One, it kind of gave me a sense of being part of a global community. I feel like I look at the world as, you know, it's all one place versus like, this is my place, this is your place.

9
00:01:31,000 --> 00:01:58,000
Also, it gave me kind of this lack of permanence in a way, like things are always changing, you've got to keep moving with it. And so I would say those are the two big things. It also gave me an appreciation for connection with people, because I always had to go to different places, connect with new people, and it just gave me an interest and a passion for meeting people and talking to them. And I'm always interested in people's stories. I would say those are kind of some big ways that it affected me.

10
00:01:58,000 --> 00:02:11,000
Thanks for sharing that, Jawad. We are creatures of habit, and I have a similar path. I moved about 19 times by the time I graduated high school, growing up in India as well.

11
00:02:11,000 --> 00:02:16,000
What I found is since then, I always want to continue moving every two years.

12
00:02:16,000 --> 00:02:17,000
Yeah.

13
00:02:17,000 --> 00:02:27,000
Is that something you found as well? And if so, how do you find peace and stability in staying in one place and not wanting to move every so often?

14
00:02:27,000 --> 00:02:44,000
Yeah, starting off in the deep end. So I would say, you know, my wife, for example, she grew up in San Antonio, born and raised there. And for me, we always moved houses, whereas for her a home was like a very stable thing, like, this is our home.

15
00:02:44,000 --> 00:02:57,000
This is, you know, like this sacred place. And for me, it's like, this is where I go to sleep at night, you know, and so we moved from Austin to California and back.

16
00:02:57,000 --> 00:03:11,000
But now that we're back, I feel like I've really enjoyed settling down, establishing roots, establishing patterns, and maybe realizing that some of that aspect of always moving does have a negative part to it.

17
00:03:11,000 --> 00:03:26,000
And trying to counteract that intentionally, where like I do try to like create, you know, long term connections with my community and with my local environment, you know, I mean, I always feel like we could always move at any time.

18
00:03:26,000 --> 00:03:38,000
But at the same time, I'm trying to like actively counter that by just thinking like, you know, having like a 10 year plan that doesn't involve moving, for example, you know, I think we're very similar in that regard.

19
00:03:38,000 --> 00:03:52,000
Let's go back to your undergrad and talk me through the decision to go to med school and just give me a brief synopsis on your experience in undergrad as well.

20
00:03:52,000 --> 00:04:02,000
Sure. Yes, I went to Texas A&M. For those of you that are in Texas, they know, it's kind of like a cult, it's kind of, you know, people make Aggie jokes all the time, but it's also a great school.

21
00:04:02,000 --> 00:04:11,000
My experience was not the traditional A&M experience, so like, I got a good scholarship to go there, that was a big part of the deciding factor.

22
00:04:11,000 --> 00:04:19,000
And then I lived, you know, off campus during, you know, what's called fish camp, which is like the freshman camp where everyone goes and kind of drinks the Kool-Aid a little bit.

23
00:04:19,000 --> 00:04:30,000
We were, like, visiting family in Pakistan the whole time. And then even when I did move on campus, I lived mainly in the engineering area with a lot of international students.

24
00:04:30,000 --> 00:04:40,000
And so it was a different experience than I think a lot of, you know, people have in undergrad, but it was also really cool because it was such a diverse community.

25
00:04:40,000 --> 00:04:57,000
As you know, the international engineering community involves people from all over the world, you know, not just Asia but also Africa, South America, and things like that, so it kind of gave me a really good crowd to hang out with, even

26
00:04:57,000 --> 00:05:09,000
though the other guys, you know, some were undergraduates, some were graduate students, and things like that. I was also very involved in the Islamic community there. And so it was a different aspect to, I think, my undergraduate years.

27
00:05:09,000 --> 00:05:19,000
But it was, you know, a cool time, living away from home, kind of getting to be more independent, kind of exploring both socially and also academically.

28
00:05:19,000 --> 00:05:23,000
You have a background in engineering.

29
00:05:23,000 --> 00:05:33,000
I think the same way. I do not have a background in engineering, but I like breaking things down to first principles, thinking in a structured approach.

30
00:05:33,000 --> 00:05:35,000
That is both good and bad.

31
00:05:35,000 --> 00:05:43,000
It's good for decision making in business. It can be bad for decision making in social settings and relationships.

32
00:05:43,000 --> 00:05:53,000
How has that thinking framework (and I'm making an assumption here that you do have that thinking framework) helped you, and how has it hurt you?

33
00:05:53,000 --> 00:05:54,000
Yeah.

34
00:05:54,000 --> 00:06:00,000
So, let me just take a second to process that. I mean, I think,

35
00:06:00,000 --> 00:06:15,000
when I was younger, I would overthink a lot of things, especially socially. And that would lead to either paralysis or almost like anxiety, or, you know, not taking chances, if you want to call them chances.

36
00:06:15,000 --> 00:06:29,000
And then as I got older, I began very conscientiously trying to overcome that. And part of that was just educating myself on some of those things, and a lot of people don't have to educate themselves on, like, social things, but, you know,

37
00:06:29,000 --> 00:06:48,000
for me it was actually very useful to learn about, not just, you know, what works and what doesn't, but how to put yourself in a mental framework that allows you to interact in social environments easily and effectively and naturally, in a way that was true to

38
00:06:48,000 --> 00:07:01,000
me, without trying to copy somebody else or something else. And so, you know, there's this model of, you know, there's the world, and there's your perception, and between them there's a prism, which is your mind.

39
00:07:01,000 --> 00:07:17,000
And so, your view of the world inevitably goes through some kind of distortion. And I think one model was just breaking that down and using, you know, almost like CBT, cognitive behavioral therapy, in ways that allowed me to

40
00:07:17,000 --> 00:07:35,000
realize that I don't need to overthink things and things are generally going to be fine. And then also implementing this thought model of, it's more how you say it than what you say that is effective in social situations.

41
00:07:35,000 --> 00:07:52,000
And so it doesn't really matter what you say; if you're just natural and say something in a good way, then, you know, it's almost always fine. And so I don't know if that answers the question, but I feel like I kind of used models to help overcome some of the overthinking that I was doing.

42
00:07:52,000 --> 00:07:54,000
I agree.

43
00:07:54,000 --> 00:08:00,000
We think of the world and reality in an objective manner.

44
00:08:00,000 --> 00:08:05,000
But I think our perception of reality is reality.

45
00:08:05,000 --> 00:08:07,000
Right, yeah.

46
00:08:07,000 --> 00:08:12,000
We're getting into the Matrix a little bit. Yeah.

47
00:08:12,000 --> 00:08:17,000
Tell me about the decision to go into general surgery.

48
00:08:17,000 --> 00:08:25,000
Why general surgery, and was that path clear on day one of medical school or not?

49
00:08:25,000 --> 00:08:36,000
Yeah, so I mean, I'll answer a little bit about why med school first. I think, you know, you can probably relate to this: I feel like for people that come from the Indian subcontinent, it's either you go to medical school and that's the best thing, or you do engineering

50
00:08:36,000 --> 00:08:50,000
and, you know, that's okay, or, you know, you're kind of a failure. I feel like that's the mindset, which is not a good mindset, but, you know, a lot of it was like, I didn't really open myself up to all the possibilities; it was either I do medicine or

51
00:08:50,000 --> 00:09:03,000
I do engineering. And so I did bioengineering and I was like, I'm going to let myself pick, you know, sometime. And I really liked the environment, actually, of a practicing clinician, shadowing in the ER and with an orthopedic surgeon. I liked, you know, you're interacting with people

52
00:09:03,000 --> 00:09:14,000
a lot. You're making a lot of decisions, and you're doing like procedural stuff right so that's why I went into medicine. And then as far as general surgery I actually explored a lot of different specialties.

53
00:09:14,000 --> 00:09:28,000
I liked cardiology, I liked orthopedics, I liked ENT. And then what really pushed me into general surgery was I liked trauma, because it was exciting and you also dealt with the whole body.

54
00:09:28,000 --> 00:09:39,000
So everything from head to toe. And then I liked critical care, because there was so much physiology in there, and, you know, physiology in a lot of ways, as I was telling you earlier.

55
00:09:39,000 --> 00:09:47,000
I was almost a physics major, and physiology is a lot of physics, you know; you're kind of dealing with equations, dealing with physical principles.

56
00:09:47,000 --> 00:09:53,000
And so I thought that was really cool and you get to do surgery so that's why I went into general surgery.

57
00:09:53,000 --> 00:09:59,000
Thanks for sharing that, Jawad. I can completely relate. The options for me were doctor or doctor.

58
00:09:59,000 --> 00:10:07,000
Engineering wasn't even on the table. Okay, there you go. That sort of goal setting, while very motivating.

59
00:10:07,000 --> 00:10:13,000
And kind of keeps you going to an extent, does not...

60
00:10:13,000 --> 00:10:18,000
It is not in line with happiness.

61
00:10:18,000 --> 00:10:28,000
How do you define happiness now? Is it defined by continuous growth? Is it defined by meeting different goals?

62
00:10:28,000 --> 00:10:39,000
Or do you have a different definition? It's something I find myself struggling with: I always want more. My life was somewhat decided for me, to an extent.

63
00:10:39,000 --> 00:10:44,000
And I had these goals to achieve: med school, residency.

64
00:10:44,000 --> 00:10:46,000
And when I was there.

65
00:10:46,000 --> 00:10:49,000
I found myself deeply unhappy.

66
00:10:49,000 --> 00:11:00,000
I didn't have any further goals and I didn't really know who I was, and what I enjoyed because I had never been given the freedom or the opportunity to explore that side of myself.

67
00:11:00,000 --> 00:11:05,000
Is that something that resonates with you, and how do you define happiness?

68
00:11:05,000 --> 00:11:06,000
Yeah.

69
00:11:06,000 --> 00:11:20,000
So, I would say it definitely resonates with me I mean I think there was definitely parts of my life where I struggled with some of those things. I think, you know,

70
00:11:20,000 --> 00:11:30,000
I mean obviously if, if you are looking for external things like certain like monetary goals to be happy then you know it's never going to be enough.

71
00:11:30,000 --> 00:11:38,000
I think to me, if I am living aligned with my values.

72
00:11:38,000 --> 00:11:54,000
Then that provides an internal satisfaction that supersedes any kind of superficial goals. And obviously, you know, you have Maslow's hierarchy of needs: if I'm not safe, I don't have shelter and don't have food, it doesn't matter if I'm, you know, living

73
00:11:54,000 --> 00:12:09,000
in a nice place, I'm not going to be that happy. But I think once you can establish, you know, some basic security around yourself and some basic relationships, then I think after that, to me, what provides happiness is being aligned with my values

74
00:12:09,000 --> 00:12:13,000
and living according to them you know if that makes sense.

75
00:12:13,000 --> 00:12:14,000
That makes perfect sense.

76
00:12:14,000 --> 00:12:28,000
So back to general surgery. When I was in medical school, the DaVinci robot, I believe was just coming out and there was a lot of buzz around it. Since then, I haven't been that involved in the surgical fields.

77
00:12:28,000 --> 00:12:34,000
What's the biggest advancement you feel since then, and what are you most looking forward to in the future of surgery?

78
00:12:34,000 --> 00:12:48,000
Yeah, so when I was in residency, you know, DaVinci was coming out, and it was really polarizing. A lot of surgeons were all in: you know, this is kind of the next big thing, and if you're not doing robotics you're kind of a caveman.

79
00:12:48,000 --> 00:13:00,000
The other surgeons were like, this is just total hype. The data isn't behind it, there's no real difference in outcomes, and it adds so much cost. Sometimes outcomes are worse; it's really just a consumer-driven technology.

80
00:13:00,000 --> 00:13:17,000
And now I think we're finally coming into the era where robotics is showing its value beyond the mechanical benefits, right? So, you know, for people who aren't that familiar with robotics, it allows a surgeon to control four arms, it allows you to have a steady

81
00:13:17,000 --> 00:13:23,000
camera, allows you to have wristed instruments, allows you to decrease tremor and scale motion.

82
00:13:23,000 --> 00:13:40,000
I think those benefits, for robotic surgeons, they're like, okay, but we're looking to the future, and the future is integrating the surgical episode into the whole episode of care.

83
00:13:40,000 --> 00:14:02,000
So obviously we're in the era of big data, right? So you're using population-level data to create a plan for each individual patient that involves, you know, a specific pre-op regimen, that involves, you know,

84
00:14:02,000 --> 00:14:18,000
integrating the data around the operation, such as, you know, operative length, and not only that but specific motions: every single instrument the surgeon uses, every single motion the surgeon makes is quantified. And then you have additional functionality,

85
00:14:18,000 --> 00:14:31,000
whether it's augmented reality or, you know, some kind of partial automation, or some kind of video-based guidance or assistance to help the surgeon. Everything is recorded. The data is fed back.

86
00:14:31,000 --> 00:14:43,000
And then the patient outcomes are integrated, so you can see, you know, when the patient got this pre-op regimen and had this specific operation, in a very nuanced way, the outcomes were this. And then, you know, you put it all together, you feed it back.

87
00:14:43,000 --> 00:15:01,000
And so, you can really improve surgical care in a way that goes beyond the specific, like, mechanical modality, and also allows you to like really share best practices, you know democratize for lack of a better word, a really high standard of surgical care.

88
00:15:01,000 --> 00:15:04,000
That is a fascinating answer.

89
00:15:04,000 --> 00:15:08,000
I would like to go deeper in a few points.

90
00:15:08,000 --> 00:15:15,000
First of all, are you able to provide an example? This is not... it sounds like a fellowship question.

91
00:15:15,000 --> 00:15:26,000
But it's fine. It's fine. Are you able to provide an example of how population health can help individuals with pre-op optimization?

92
00:15:26,000 --> 00:15:33,000
And my second question is around efficiency within the OR, but let's get to the first question first.

93
00:15:33,000 --> 00:15:42,000
Yeah, so I mean, you know, this is not classically population health, but for example, like body mass index, right? So my fellowship was in bariatrics, and so, you know, we use BMI a lot.

94
00:15:42,000 --> 00:15:58,000
And anyone who uses BMI knows how insensitive of a tool that is, you know; one person's healthy BMI is another person's unhealthy BMI, and, you know, so many factors aren't taken into account. But for hernia surgery, right,

95
00:15:58,000 --> 00:16:13,000
you know, you can use BMI as a broad tool: if BMI is below 35, in general it's satisfactory for surgery; if it's over 35, you know, it's not. But that's not good enough at all, you know. For one person a BMI of 40 could be fine because

96
00:16:13,000 --> 00:16:27,000
they're just a very muscular person; for another person a BMI of 30 could not be fine because they're generally a frail, thin person but they have a lot of abdominal obesity, for example. And that's not even talking about other risk factors, right? Like maybe someone

97
00:16:27,000 --> 00:16:40,000
who has diabetes or Crohn's or something like that has different BMI criteria. And then that's still not getting into the specific patient, right, where maybe for one person it's 33.4 and for another person it's, you know, 37.8.

98
00:16:40,000 --> 00:16:52,000
And so that's what I'm talking about in terms of specific metrics that can lead to individualized pre-op targets for patients, you know, stuff like that.

99
00:16:52,000 --> 00:16:57,000
So let's go back to surgery, and let's go back to the OR.

100
00:16:57,000 --> 00:17:10,000
So there are certain things that we do as humans that AI or machine might classify as inefficient, and I'll label two things and I may be off base here because I'm not a surgeon.

101
00:17:10,000 --> 00:17:12,000
One is field of vision.

102
00:17:12,000 --> 00:17:22,000
The first is that the field of vision in the periphery isn't as sharp as central vision, and sometimes you move the camera to get a better field of vision.

103
00:17:22,000 --> 00:17:32,000
And the second is resting during surgery. Surgery is very tiresome, a lot of focus, a lot of concentration. And I wonder if there are certain movements

104
00:17:32,000 --> 00:17:36,000
you make as a rest movement.

105
00:17:36,000 --> 00:17:39,000
And if those would be classified as inefficient.

106
00:17:39,000 --> 00:17:50,000
How do you balance efficiency and just being human in this environment, and where do you say, okay, that may be more efficient, but that is not human,

107
00:17:50,000 --> 00:17:56,000
and as long as humans are operating, it is not reasonable to expect that?

108
00:17:56,000 --> 00:18:01,000
That's an interesting question. I mean I think some of those things you know like

109
00:18:01,000 --> 00:18:17,000
are better understood if you take surgery as a paradigm apart from how humans classically do it. People talk about, you know, the chess analogy, and like, an AI can learn how to play chess in a totally different way than a human would play chess and

110
00:18:17,000 --> 00:18:20,000
so come up with moves that a human would never think of.

111
00:18:20,000 --> 00:18:32,000
I think surgery is like that, you know, because the way that surgeons operate is based on the way that humans think, and so I think the operation may be totally different

112
00:18:32,000 --> 00:18:43,000
if an AI was controlling it; you know, it may not even have the same steps. And so I think right now the way surgery works is that it's step by step, and each step is very narrow.

113
00:18:43,000 --> 00:18:52,000
A small surgery, like, let's say, open umbilical hernia repair, you know, probably has like 50 steps if you think about it.

114
00:18:52,000 --> 00:19:04,000
And something super big, like a heart transplant, probably has, you know, like 1,000 steps. And so each one of those steps, if you think about it step by step,

115
00:19:04,000 --> 00:19:14,000
it's not like an AI can really change it. But if you think of the whole thing, and you think of the outcome as being the final result, and then you feed in the information, it may have a whole different way of doing it, you know.

116
00:19:14,000 --> 00:19:17,000
So, to get to your question.

117
00:19:17,000 --> 00:19:30,000
Right now, like, the field of vision thing isn't that relevant because the surgeon is focused on one small thing. And so the periphery isn't really, you know, there's nothing happening in the periphery, because the operation is designed to be focused on one thing at a time.

118
00:19:30,000 --> 00:19:44,000
And then, you know, your question about the rest movements: I mean, there aren't really rest movements, I would say. I mean, you know, sometimes you might have your assistant doing something that's static, like retraction or something like that, you know,

119
00:19:44,000 --> 00:19:46,000
and then.

120
00:19:46,000 --> 00:19:55,000
But, you know, the surgeon is not doing that. Like for gallbladder surgery, for example, the assistant is retracting the gallbladder the whole time, pretty much, you know.

121
00:19:55,000 --> 00:20:03,000
And so, you know, you don't have to do that, but that's a good role for the robot, for example, because you can just place an arm there; it retracts the gallbladder, lifts it up, and just stays there.

122
00:20:03,000 --> 00:20:15,000
You know, you don't have to think about it. But if you had a human assistant, for example, sometimes they're retracting the gallbladder in a dynamic way that improves the planes, for example, and so

123
00:20:15,000 --> 00:20:27,000
I think AI could learn how to be a dynamic retractor. That's probably one of the lower hanging fruits for it. But I don't think there are necessarily rest movements that the surgeon does, at least not that we acknowledge, you know, to ourselves, that this

124
00:20:27,000 --> 00:20:29,000
is like a rest movement.

125
00:20:29,000 --> 00:20:30,000
Okay.

126
00:20:30,000 --> 00:20:39,000
Yeah, AI and explainability is a fascinating subject and liability plays a big role in that.

127
00:20:39,000 --> 00:20:53,000
How far are we from AI completely autonomously performing a surgery? When will that happen, do you think, and which surgery would it be?

128
00:20:53,000 --> 00:21:11,000
Yeah, I mean, I think, in general, we are super far away, farther away than people think, you know, because we're not even to where AI can annotate the surgery properly, for example, in terms of recognizing the anatomy

129
00:21:11,000 --> 00:21:26,000
and seeing what the steps are and all that stuff. I mean, we're kind of there, but not quite, and even that work took a lot of human effort to manually annotate surgeries to train the algorithm to recognize, you know, this is the liver,

130
00:21:26,000 --> 00:21:40,000
this is the gallbladder, this is the cystic duct, you know, and then thousands of surgeries later, now the algorithm kind of recognizes that. And so to get it to where... you know, if you take the standard of self-driving cars, I mean, the standard has got to be higher for

131
00:21:40,000 --> 00:21:50,000
the surgery right because like instead of, you know, scraping your car on the side, it could be a bad outcome for a patient. And so I think we're actually farther away than we think.

132
00:21:50,000 --> 00:22:08,000
I'll throw out like 30 years as a number. I think, you know, surgeries that are easier to do will be, for example, surgeries that involve solid structures, so, you know, there's no real motion, and then surgeries that involve basically

133
00:22:08,000 --> 00:22:24,000
some kind of removal, like a prostatectomy, for example: you know, a solid structure, a fixed environment, where you're just kind of taking it out, if you will. And so I think that's a good one.

134
00:22:24,000 --> 00:22:35,000
I mean, maybe, and I hesitate to say this because it's not my specialty, but, you know, things like spine, where you're intervening on a bony structure that's close to the skin.

135
00:22:35,000 --> 00:22:38,000
So, you know, things like that.

136
00:22:38,000 --> 00:22:44,000
Let's go back to your experience in residency as a chief resident.

137
00:22:44,000 --> 00:22:50,000
Tell me, what did you learn about managing people?

138
00:22:50,000 --> 00:23:00,000
Do people change a lot? And how do you know if someone is amenable to change, or if they are more static in their ways?

139
00:23:00,000 --> 00:23:16,000
So, you know, background on me: I really struggled my intern year as a surgery resident, but then I developed a mission-statement-based approach to my training, where every day I went into work and was like, what is going to make me a better surgeon, versus

140
00:23:16,000 --> 00:23:25,000
in the beginning, when I was like, what is going to make people happy. When I went in with the mentality of what's going to make me a better surgeon and deliver better care, I started doing better.

141
00:23:25,000 --> 00:23:34,000
And then as a result people were happy as well. And so I was selected to be chief resident out of the other residents in my program, who were also great.

142
00:23:34,000 --> 00:23:49,000
And I took it on pretty seriously. I started doing things like the orientation presentation for the interns, I revitalized our surgery skills lab and our curriculum,

143
00:23:49,000 --> 00:24:01,000
and I did biannual meetings with each resident to just see what their goals were for the year and see if they were meeting them.

144
00:24:01,000 --> 00:24:16,000
I kind of changed the way we rounded. I read some HBR books and things like that, and tried to approach it from an I'm-leading-a-team perspective versus an I'm-trying-to-get-through-residency perspective.

145
00:24:16,000 --> 00:24:21,000
And so I really enjoyed that aspect; it was something I hadn't really done at that level before.

146
00:24:21,000 --> 00:24:28,000
So, you know, as far as that experience, I thought it was really good. As far as do people change...

147
00:24:28,000 --> 00:24:36,000
You know, a friend of mine, he's a psychiatrist, once told me, people never change.

148
00:24:36,000 --> 00:24:53,000
Like, basically, once you know somebody, you know them, and I think to some extent that's true. But I think people can, maybe not change who they are, but change what they do.

149
00:24:53,000 --> 00:25:08,000
And so, I think that's maybe the same thing, you know, I think, I think everybody has a potential for immense change, whether that is changing who they are or not, you know, is up for debate but I mean I think myself as example I feel like I've changed dramatically,

150
00:25:08,000 --> 00:25:20,000
you know, over the last 15 years. So I think people can definitely change who they are; whether that's them changing everything about themselves, probably not. But I think people have a large potential for change.

151
00:25:20,000 --> 00:25:34,000
Yeah, I agree. Your thoughts and ideas about who you are are who you are to yourself, but your actions define who you are to others.

152
00:25:34,000 --> 00:25:42,000
And there is some tension there at times. I was in remediation in residency, and it's primarily because I wanted to be a hospitalist.

153
00:25:42,000 --> 00:25:57,000
My residency wanted me to be an outpatient family doctor. I said no, I'm going to be a hospitalist, and they said, oh, you're on remediation. I said yes sir, yes ma'am, and I passed remediation, and that was kind of the end of that.

154
00:25:57,000 --> 00:26:01,000
Talk to me about the decision to start Vality Partners.

155
00:26:01,000 --> 00:26:08,000
And tell me about your favorite client that you've helped.

156
00:26:08,000 --> 00:26:20,000
Yeah, so I always wanted to have medical innovation and technology as part of my career, and I actually worked with a friend of mine, Dr. Dan Peterson, when I was a resident; he was starting a company called LFR Biosciences.

157
00:26:20,000 --> 00:26:31,000
And, you know, we did surgeries together, I would assist him, and he would kind of do, you know, product development and clinical trials, and I got to be a part of that journey.

158
00:26:31,000 --> 00:26:44,000
And so that exposed me to the world of startup medtech. And as I went through fellowship, I began meeting more people; at the conferences I would go to the booths and talk to them and, you know, developed a few relationships as far as working with companies.

159
00:26:44,000 --> 00:27:00,000
And then after that it was kind of just a stepping-stone approach of building out the infrastructure and the engagements, like the specifics. And it gave me not just another dimension to my career, but a lot of autonomy, and my practice is very

160
00:27:00,000 --> 00:27:16,000
satisfying. Also a lot of relationships, and I just feel like the growth potential is really exciting, you know, as far as having that additional aspect. And I feel like I can relate to what you're doing in the investing world, as far as, you know, you're building

161
00:27:16,000 --> 00:27:20,000
and growing something that's yours, you know.

162
00:27:20,000 --> 00:27:23,000
And so I think that's really good.

163
00:27:23,000 --> 00:27:28,000
Yeah, it's a different feeling having something that's yours and growing it.

164
00:27:28,000 --> 00:27:34,000
Talk to me about digital surgery, and we've spoken a bit about it.

165
00:27:34,000 --> 00:27:43,000
What new tools have you used in the past two years? How have surgical tools changed in the past five years?

166
00:27:43,000 --> 00:27:51,000
And within surgery and within the OR, what is a tool you wish you had that doesn't exist?

167
00:27:51,000 --> 00:28:01,000
Yeah, and you know, going back to your last question, you asked about my favorite customer, and I just totally didn't answer it. I feel like your multi-part questions are throwing me off.

168
00:28:01,000 --> 00:28:11,000
But that's definitely on me. So, you know, I work with a company called CareSense, and I've worked with them for, gosh, probably four or five years now.

169
00:28:11,000 --> 00:28:32,000
And they do digital patient engagement. And so we have a weekly call with the CEO and another clinician, and we talk about things that are coming up for the company, issues with, you know, expansion, and new ideas as far as, you know, trends

170
00:28:32,000 --> 00:28:41,000
and engaging with certain clients. It's, you know, Monday morning at seven, and so it's a great way to start the week, just thinking about it.

171
00:28:41,000 --> 00:28:47,000
And I've really enjoyed that long term relationship and kind of helping Charlie and his company.

172
00:28:47,000 --> 00:29:02,000
you know, grow and engage with customers and kind of see what we can do to improve healthcare, essentially. And so it's been a really nice experience because it's combined with almost like a long-term friendship, if you will.

173
00:29:02,000 --> 00:29:15,000
So yeah, just to answer that second part of the question. And then as far as digital surgery: so, you know, I'm working on a paper with SAGES on digital surgery. It's kind of an abstract term, and it can mean different things to different people, but we break it

174
00:29:15,000 --> 00:29:34,000
down into robotics, data capture, data analytics, connectivity, and advanced visualization, and some things within that, for example, are, you know, AI, telementoring, telesurgery, augmented reality, and then also teleoperation and autonomous surgery.

175
00:29:34,000 --> 00:29:38,000
And so, you know, that's kind of a little breakdown of what digital surgery is.

176
00:29:38,000 --> 00:29:56,000
The most exciting thing for me is really, like we talked about a little bit before, integrating the entire episode of care with the surgical operation. So people talk about the OR black box, where, you know, right now...

177
00:29:56,000 --> 00:30:11,000
And so, you know, right now the record of the surgery is the surgeon's op report, which, number one, can be subjective, and number two, is very limited; you can't even see what happened. And so, opening that up in a way that is comfortable for the

178
00:30:11,000 --> 00:30:27,000
surgeon and allows for taking that immense amount of data and integrating it into the episode of care, you know, I think that's the most exciting thing, and then using that to improve outcomes, because that's what's going to improve outcomes more than a specific

179
00:30:27,000 --> 00:30:35,000
tool. You know what I mean. Yeah, it's really just doing things better and understanding what we're doing, you know, and kind of iterating on that.

180
00:30:35,000 --> 00:30:47,000
And do you think the actual surgical process has been fully optimized within the OR itself, or is there more room for improvement there as well?

181
00:30:47,000 --> 00:31:04,000
Yeah, I mean, you know, there's so much room for improvement; you can tell by looking at the variation among different places. So, like, turnover time can be 15 minutes one place, an hour another place, you know; cost can vary so much.

182
00:31:04,000 --> 00:31:20,000
You know, even from surgeon to surgeon, the same operation can take half an hour or can take two hours. And so there's so much opportunity there, both from the facility side, the team side, the actual operation, you know. And then that's not even talking about the

183
00:31:20,000 --> 00:31:32,000
management, as far as, you know, enhanced recovery after surgery gives us a good idea of how to optimize pain control, for example, with some pre-op meds, a basic post-op regimen, things like that.

184
00:31:32,000 --> 00:31:46,000
Let's talk about the refugee coalition you're involved in. Just tell me a bit more about it, and then let's dig deeper into how we can best help developing countries.

185
00:31:46,000 --> 00:31:52,000
Sure. Yes, I was looking for an organization to be involved in for the long term.

186
00:31:52,000 --> 00:32:02,000
And, you know, being an immigrant I can relate somewhat to the refugee population even though obviously it's very different.

187
00:32:02,000 --> 00:32:06,000
A lot of the refugees here are from, like, Afghanistan, for example, very close to Pakistan.

188
00:32:06,000 --> 00:32:19,000
And so I wanted to work with a refugee organization, and what spoke to me about MRC was that they kind of have the startup mentality, where they actually take on some venture debt, they have these business models where they employ refugees,

189
00:32:19,000 --> 00:32:32,000
and it's not like, you know, they get donations and hand out stuff; it's basically they create businesses to employ refugees over the long term. And then I also like that it's not just giving people things; it's kind of the classic teach-a-person-to-fish

190
00:32:32,000 --> 00:32:46,000
approach. And so they actually utilize pre-existing skills, whether it's at the farm or at the textile studio. And so those are the things I really liked about it. Then I met Meg, who was awesome, I met David, so I really liked the team.

191
00:32:46,000 --> 00:32:55,000
I really liked the mission, the specifics around the operation, and then the people. And yeah, I decided to take the opportunity to work with them.

192
00:32:55,000 --> 00:33:14,000
There are two schools of thought on how to help developing nations. One says we should go there, build the infrastructure, stay there, and leave maybe in 20 to 30 years, once the country has developed to an extent.

193
00:33:14,000 --> 00:33:22,000
The other thought is we can teach them how to improve their infrastructure and then give them the funds to do so.

194
00:33:22,000 --> 00:33:36,000
Which school of thought do you fall under, and which industries do you think we should focus on the most? Is it healthcare? Is it education? Is it good plumbing, toilets, electricity?

195
00:33:36,000 --> 00:33:51,000
Yeah, I mean, and thanks again for mentioning it; I keep not answering the last parts of your questions. So, you know, as you know, you have institutions like the World Bank that go in and they quote-unquote help, but with so many strings

196
00:33:51,000 --> 00:34:05,000
attached, you know, and sometimes it leads to situations that are terrible in the long term. And so, in my opinion, which admittedly is very uneducated, this is not my space to talk about...

197
00:34:05,000 --> 00:34:20,000
But as basic building blocks for a successful country: number one, you have to have safety, right? In a lot of countries, people just live in insecurity. Number two, you have to have your basic needs met, right, like you have a roof, you have a

198
00:34:20,000 --> 00:34:30,000
toilet, you have food to eat. And then number three, you have to have a basic level of education so that the population can take advantage of opportunities and grow.

199
00:34:30,000 --> 00:34:43,000
And so I think, to me, those are the building blocks, and then from there, you know, you can talk about growing industries and things like that. You know, the microlending model, I think, is really interesting, where

200
00:34:43,000 --> 00:34:54,000
I think first you kind of contribute to developing those things if you can, and then you kind of incentivize people in that community to build businesses with some logistical support from your end.

201
00:34:54,000 --> 00:35:05,000
And to me that's a good start, you know. I think that's more than we do most of the time, and I think the problem happens when the resources go to places where you don't want them to go.

202
00:35:05,000 --> 00:35:23,000
And then that just becomes a weird situation sometimes. But, you know, I think if we don't look at it from the perspective of, we know what's right for you, and just look at it from the perspective of, here are some basic humanitarian

203
00:35:23,000 --> 00:35:33,000
goals that we have. And then from there it's kind of creating an environment that's going to make you successful with what you want to do. I mean maybe that's a good foundation.

204
00:35:33,000 --> 00:35:38,000
But like I said, you know, I don't really know.

205
00:35:38,000 --> 00:35:41,000
I'm not in a position to really speak in an educated way on that.

206
00:35:41,000 --> 00:35:44,000
I like that answer.

207
00:35:44,000 --> 00:35:55,000
Is failure a prerequisite for success? And does the magnitude of your previous failures determine the magnitude of your future successes?

208
00:35:55,000 --> 00:36:02,000
I don't think so. I mean, I don't see why, if you win every single time,

209
00:36:02,000 --> 00:36:13,000
that's a problem. I mean, I think you can do that, you know what I mean. It's a balance, but I don't necessarily glorify failure. I don't think

210
00:36:13,000 --> 00:36:25,000
you have to have failed in order to succeed. You know, and I think conversely I don't think you have to have failed big in order to succeed. I mean I think it helps to learn, but you can learn from a win.

211
00:36:25,000 --> 00:36:30,000
You can also learn from a fail. So, I agree.

212
00:36:30,000 --> 00:36:39,000
I think we learn the same amount from our successes as our failures, and there aren't inherently more lessons in a failure.

213
00:36:39,000 --> 00:36:49,000
Let's talk about Austin MedTech Connect. Talk to me about the journey to where you are now. Congratulations on putting together so many great events.

214
00:36:49,000 --> 00:36:53,000
And what's the future of Austin medtech?

215
00:36:53,000 --> 00:37:08,000
Thanks so much. Yeah, it's been a super exciting journey. I mean, the way it came about is, during COVID I was networking with a bunch of people on Zoom, and I kept seeing that this person's in Austin, that person's in Austin, and there are so many

216
00:37:08,000 --> 00:37:18,000
kinds of people who are working in this field that live down the street, and they don't know each other or if they do there's no one place for them to gather and communicate.

217
00:37:18,000 --> 00:37:28,000
And I interacted with a lot of the other orgs in town, and I realized this was a gap we had, as far as an organization purely focused on breaking down the silos and offering that connectivity.

218
00:37:28,000 --> 00:37:41,000
And so, yeah, we formed as a 501(c)(3) nonprofit. I got Jeff Levine and Christian Norton to join me on the board, and we put on our first events, and it's been really fun. The response has been, you know, better than I could have hoped for.

219
00:37:41,000 --> 00:37:55,000
And, you know, it's another opportunity for me to learn, you know, I never put on events before I never formed a nonprofit before, you know, I never raised funding for anything before, you know, and so all of those were learning experiences.

220
00:37:55,000 --> 00:38:05,000
And, you know, I think the fact that we're successful speaks to the need more than anything, because people were looking for something like this to go to.

221
00:38:05,000 --> 00:38:17,000
And so even if you know our events, like, weren't perfect. It was still received well because, you know, there was really not a better alternative, if you will.

222
00:38:17,000 --> 00:38:33,000
And so, yeah, it's been fun kind of learning and putting on bigger events. And, you know, I keep having conversations with people both locally and nationally and internationally, actually; you know, I've had conversations with people in Ireland and France, and in London today.

223
00:38:33,000 --> 00:38:53,000
And so the other goal, you mentioned the future, is to allow us to be a connected community internally, but then also interface with other regions like San Antonio, Houston, Dallas, and also allow us to interface with other regions in the country and internationally.

224
00:38:53,000 --> 00:39:10,000
And we're seeing more and more of that, because I think Austin's strength is combining advanced technology and software with an entrepreneurial startup spirit. And then when you add healthcare to that, you know, it's kind of this triple threat of the future

225
00:39:10,000 --> 00:39:26,000
of technology, because that's what it's going to take: people that have an understanding of technology, are willing to take chances in an educated and proficient way, and have access to, you know, real healthcare systems and people. Because we have such a huge healthcare

226
00:39:26,000 --> 00:39:43,000
market, like in San Antonio and Houston, for example, to me that's the perfect mix for creating a really important place for medical technology in the world, you know. And I've talked to people at international companies who are

227
00:39:43,000 --> 00:39:50,000
establishing their first global headquarters or first US headquarters in Austin now. So, you know, it feels validating in a way.

228
00:39:50,000 --> 00:39:57,000
So again, congratulations. It's been amazing seeing the growth you've had.

229
00:39:57,000 --> 00:40:06,000
Let's go a little bit deeper into how it grew so fast. How much of that growth was inbound, and how much of it was outbound?

230
00:40:06,000 --> 00:40:17,000
And what was your outbound strategy that worked so well, if it was primarily outbound-led? Outbound meaning, like, just sending out messages and reaching out to people.

231
00:40:17,000 --> 00:40:34,000
Yeah, so I think the initial success was just based on relationships, you know, people that I knew personally, people that, you know, Christian and Jeff knew, because they've been in this town for over, you know, 10 years, and Jeff's a medtech

232
00:40:34,000 --> 00:40:47,000
CEO, and Christian Norton is the CEO of a digital technology consulting company. And so they both had deep connections here, and I had a lot of connections in the clinical community.

233
00:40:47,000 --> 00:40:56,000
And so I think the initial success was based on that, just our relationships. And then from there, it was

234
00:40:56,000 --> 00:41:07,000
based on the first event being successful, and then I think a lot of it was just due to LinkedIn, for example. You know, you put on the first event, you make some posts about it, there's some hype, people want to come to the next one, and it's a little

235
00:41:07,000 --> 00:41:19,000
snowball effect. And we're really seeing that because initially we had you know, 100 people, then 150 then 250 on our list and these are all kind of like people that we know to some extent.

236
00:41:19,000 --> 00:41:35,000
And then the other thing that we did was we tried to have what I've probably made up a term for, called relevance density, where, you know, I think people sometimes don't go to an event because they don't know if anyone's going to be there that they'll find

237
00:41:35,000 --> 00:41:49,000
like social or professional value in. And so if it's invite only, you kind of have that right density of people that almost everybody finds it worth it to come to, you know, and so that was part of the success as well.

238
00:41:49,000 --> 00:41:56,000
Yeah, making connections is arguably the biggest impact you can have in this world.

239
00:41:56,000 --> 00:41:59,000
And I just want to commend you and congratulate you on that.

240
00:41:59,000 --> 00:42:10,000
Yeah, thanks so much. I mean, you know, we connected through LinkedIn also, so I feel like that's been part of it, with people like yourself and Zane and Kasim and, you know, so many other people.

241
00:42:10,000 --> 00:42:16,000
Yeah, I mean everyone talks about LinkedIn, it's such a great platform and I think for me it's been really kind of transformative.

242
00:42:16,000 --> 00:42:18,000
Yeah.

243
00:42:18,000 --> 00:42:21,000
Do you think healthcare should be profitable?

244
00:42:21,000 --> 00:42:39,000
So, which parts of healthcare should be profitable? Would you divide it into primary care versus surgical care, or would you say innovation in healthcare should be profitable, but care delivery should not?

245
00:42:39,000 --> 00:42:48,000
I see no problem with health care being profitable. You know, I think it's just making the profits align with what's best for people.

246
00:42:48,000 --> 00:42:59,000
You know if that makes sense. I mean, I think just because you get the profits out of it doesn't mean it's going to be good, necessarily, or aligned with the right things.

247
00:42:59,000 --> 00:43:05,000
I think, to me, in an ideal case scenario you align the profits with the right outcomes.

248
00:43:05,000 --> 00:43:13,000
And then, you know, you always have to iterate on it; it's never going to be right the first time. You always have to keep up with what's changing.

249
00:43:13,000 --> 00:43:16,000
But I don't see a problem with healthcare being profitable.

250
00:43:16,000 --> 00:43:18,000
Okay.

251
00:43:18,000 --> 00:43:25,000
What's the future for you personally, Jawad? What are you looking forward to in your life over the next 10 years?

252
00:43:25,000 --> 00:43:28,000
And what's the end goal for you?

253
00:43:28,000 --> 00:43:36,000
How do you see your retirement, if you want to use that phrase, looking for you?

254
00:43:36,000 --> 00:43:43,000
Yeah, I mean, so, you know, I love doing surgery and caring for patients.

255
00:43:43,000 --> 00:43:51,000
There was a time in my life where I was like, is this the right avenue to go down forever or not, but I've decided that it is.

256
00:43:51,000 --> 00:44:07,000
I think, you know, as Vality Partners and Austin MedTech continue to grow, I'm excited about, you know, learning more about being a leader, about building business operations, and kind of connecting in a more structured way, whether it's putting on,

257
00:44:07,000 --> 00:44:12,000
we talked about putting on a conference today you know, building out stuff like that.

258
00:44:12,000 --> 00:44:25,000
We've had some conversations, you and I, about different activities and things, you know; I think that's super exciting. So I think just being more engaged and continuing to learn about that kind of thing. You know, I don't know if I'd ever get an MBA or something like that,

259
00:44:25,000 --> 00:44:31,000
but probably some kind of you know, formal training just to kind of help with that process.

260
00:44:31,000 --> 00:44:50,000
And then in the long run, you know, we're growing Vality Partners, and I would love to kind of have, like, this is our firm that we built with our team, we go out and do this work, and just really be known for excellence and be known for being a good group of people

261
00:44:50,000 --> 00:44:52,000
to work with.

262
00:44:52,000 --> 00:44:56,000
So that's kind of what I'm looking forward to in the future. I mean,

263
00:44:56,000 --> 00:45:10,000
I used to be excited about retirement but I feel like the more I get into this kind of work, like, I don't even think about retiring you know because I'm like, I, you know, it's not about saving up this X amount of dollars and then not having to work again you know like that

264
00:45:10,000 --> 00:45:12,000
doesn't excite me anymore.

265
00:45:12,000 --> 00:45:25,000
That makes me really happy, to hear that you have truly found your ikigai, which, for those who don't know, is the intersection of what you love to do, what you're good at, and what people will pay you for.

266
00:45:25,000 --> 00:45:33,000
Yeah, I mean I don't know if I've truly found anything but I feel like there's things I'm excited about and you know so maybe that maybe that's as good as it gets.

267
00:45:33,000 --> 00:45:35,000
Yeah, I think so.

268
00:45:35,000 --> 00:45:47,000
I am still looking to leave clinical medicine. Yeah. And it's a path where I don't have a clear picture of what that looks like.

269
00:45:47,000 --> 00:45:54,000
But I'm trying different things and enjoying all the things I'm trying right now so I'm excited about that.

270
00:45:54,000 --> 00:45:56,000
Yeah.

271
00:45:56,000 --> 00:46:05,000
No, I mean, unfortunately we're in a time where, you know, so many clinicians are having a hard time with the clinical work.

272
00:46:05,000 --> 00:46:19,000
And it's a huge problem. Actually, you know, I'm working with Olivia and David Morris over at Doctors Living. If you're not familiar with them, you want to check them out; they're doing a lot in terms of what's needed in this area to make things better for

273
00:46:19,000 --> 00:46:22,000
clinicians you know, because like

274
00:46:22,000 --> 00:46:38,000
There are too many clinicians who want to leave to make that a sustainable thing, as far as having enough of a clinical workforce, you know. And so not that everyone has to be a clinician, but we've got to make things better for clinicians so that enough people want to do it.

275
00:46:38,000 --> 00:46:44,000
Let's get into it. I'm pessimistic here, very. Okay, I don't think it will change.

276
00:46:44,000 --> 00:46:46,000
Zero hope.

277
00:46:46,000 --> 00:46:51,000
And I'm generally an optimist; it's hard for me to say that. Yeah.

278
00:46:51,000 --> 00:47:01,000
The reason is, the reason we're leaving is a lack of value and a lack of respect, and that goes both for finances and for respect from society.

279
00:47:01,000 --> 00:47:14,000
Yeah, it's part of a bigger trend of devaluing expertise with the rise of the internet, Google, and now ChatGPT. That trend I don't see improving.

280
00:47:14,000 --> 00:47:16,000
Yeah.

281
00:47:16,000 --> 00:47:24,000
The business of medicine is about improving productivity and efficiency.

282
00:47:24,000 --> 00:47:29,000
And that's just business, I think.

283
00:47:29,000 --> 00:47:36,000
It's about providing the most access to care in the most cost effective manner.

284
00:47:36,000 --> 00:47:50,000
And that's a noble thing, but that does not involve physicians, I don't see physicians having room in that space. We will, until we're not needed. So I see primary care eroding fairly fast.

285
00:47:50,000 --> 00:47:55,000
I will not be surprised if in 20 years there are no family doctors.

286
00:47:55,000 --> 00:48:06,000
Again, I have a bias I'm pessimistic on this, but that's kind of my take on it at this point in my life, but I'd love for you to challenge that and change my mind if possible.

287
00:48:06,000 --> 00:48:24,000
Yeah, I mean, like I said, I don't have any answers. I do think, you know, physicians have to be open to objectively analyzing their value and their contribution, like you said. I think too often we make the statement, like, I trained

288
00:48:24,000 --> 00:48:32,000
for, you know, X amount of years, therefore I should be paid Y amount of dollars, which, like, I'm like, come on, when has it ever worked like that?

289
00:48:32,000 --> 00:48:48,000
You know, but I think we do bring a ton of value in terms of, I think, just who we are. The people that get into medical school are not your average person, both in terms of capability, in terms of drive, and in terms of passion for taking

290
00:48:48,000 --> 00:48:51,000
care of people, you know.

291
00:48:51,000 --> 00:49:03,000
I think that's as true now as it has ever been, you know. And so you take this group of people who are uniquely positioned in terms of that combination of things, and then, I think you have to think about what pathway you put them

292
00:49:03,000 --> 00:49:18,000
through. I think this eight years of school, plus like three to seven years of further training, is crazy. You know, it's crazy. I mean, I think people combine that in other places into five years total.

293
00:49:18,000 --> 00:49:31,000
Right. So what we take at least 11 years to do in the US, they do in five, right? And that's a total game changer. You know, I think if you gave people a streamlined pathway to be a physician,

294
00:49:31,000 --> 00:49:45,000
and that came with, you know, way less debt, way less time, I think we should consider that. Especially because, like, way back in the day, a family doctor was not just seeing patients but delivering babies and taking out appendixes

295
00:49:45,000 --> 00:49:55,000
and things like that, you know; that's a different world. That's not the case anymore, so I think we have to be open to reconsidering the training pathway, right?

296
00:49:55,000 --> 00:50:10,000
And I think after that we have to be thinking about what is the real role of the physician, especially when you integrate having NPs and PAs as part of the infrastructure, having, you know, AI-enabled decision support as part of the infrastructure.

297
00:50:10,000 --> 00:50:28,000
You know, to me, I think the real role is having that human connection and enabling behavioral change, because I think the AI tools, they can tell you what antibiotics to take, but they can't convince you to, you know, take care of

298
00:50:28,000 --> 00:50:42,000
yourself. And for most physicians, that's where they find the most satisfaction, in the human connection with their patients, you know. So I think if you can take all these other things and allow the physician to be this human

299
00:50:42,000 --> 00:51:01,000
connection, that's also, by the way, very educated and capable and, you know, respected, to me that's where I see it going. And you take away the, you know, 10-minute visits to get X amount of RVUs thing, and then you take away, you know, $500,000 of debt, to me that becomes

300
00:51:01,000 --> 00:51:07,000
more sustainable from a primary care perspective. I completely agree.

301
00:51:07,000 --> 00:51:20,000
Removing that debt, making medical education cheaper, shortening residency are all things that will help with burnout.

302
00:51:20,000 --> 00:51:23,000
I think this is a good place to end.

303
00:51:23,000 --> 00:51:40,000
It is. And the reason is, this is a much longer discussion, because each one of those points involves institutions losing revenue, which will be challenging to mandate without legal changes.

304
00:51:40,000 --> 00:51:46,000
Thanks so much for coming on today; it was an absolute pleasure talking to you.

305
00:51:46,000 --> 00:51:56,000
I hope we continue to grow our relationship and work together more. And I think soon we'll meet in person as well.

306
00:51:56,000 --> 00:52:17,000
Thank you so much for having me. I love your very insightful approach to these conversations. Thank you.