How to raise money as a first-time founder: Joshua Liu - SeamlessMD


Josh is the CEO and co-founder of SeamlessMD, a patient engagement platform which improves patient satisfaction while reducing hospital length of stay, readmissions, and costs. They have raised over $7.5 million and are used by various health systems including Stanford Health. He is intelligent, humble and possesses incredible grit. We talk about his childhood, his journey from an interest in coding to medical school and then being a founder. We discuss ChatGPT, AI in healthcare, the future of private healthcare in Canada, raising money for his startup, how 90% of the VCs he met initially said no, and his end goal in life.
He was previously the Chair of the Canadian Medical Association’s Joule Innovation Council and on the Advisory Group to the Office of the Chief Health Innovation Strategist for the Ontario Ministry of Health. Joshua has also served on the Shad Valley Board of Directors and as a Startup Advisor to Northeastern University.
Joshua has received numerous honours, including being named Digital Health Executive of the Year by Digital Health Canada, Forbes 30 Under 30 in Science and Healthcare, Canadian Top 20 Under 20, TD Canada Trust Scholar, Creative Destruction Lab Alum and NEXT Canada Alum.
00:00:00,000 --> 00:00:04,000
Hey everyone, I'm really excited to bring you this conversation with Josh.
2
00:00:04,000 --> 00:00:06,000
I had so much fun talking to Josh.
3
00:00:06,000 --> 00:00:10,000
Josh is the founder and CEO of SeamlessMD.
4
00:00:10,000 --> 00:00:17,000
SeamlessMD is a patient engagement platform which is used by numerous hospitals, including Stanford Health.
5
00:00:17,000 --> 00:00:22,000
It reduces patient length of stay, readmission rates, and costs.
6
00:00:22,000 --> 00:00:27,000
They have raised over $7.5 million in funding so far.
7
00:00:27,000 --> 00:00:34,000
Josh has won numerous awards, including the Digital Health Executive of the Year by Digital Health Canada,
8
00:00:34,000 --> 00:00:41,000
Forbes 30 under 30 in science and healthcare, the Canadian Top 20 under 20, TD Canada Trust Scholar,
9
00:00:41,000 --> 00:00:46,000
Creative Destruction Lab alum, and he's also an alum of Next Canada.
10
00:00:46,000 --> 00:00:51,000
He was the chair of the Canadian Medical Association's Innovation Council,
11
00:00:51,000 --> 00:00:59,000
and he was also on the advisory group to the Office of the Chief Health Innovation Strategist for the Ontario Ministry of Health.
12
00:00:59,000 --> 00:01:07,000
He has also served on the board of directors at Shad Valley and as a startup advisor to Northeastern University.
13
00:01:07,000 --> 00:01:14,000
Josh is intelligent, humble, and possesses incredible grit and perseverance.
14
00:01:14,000 --> 00:01:21,000
We talk about his childhood, his journey from an interest in coding to medical school and then to being a founder.
15
00:01:21,000 --> 00:01:26,000
We discuss ChatGPT, AI in healthcare, the future of private healthcare in Canada,
16
00:01:26,000 --> 00:01:31,000
raising money for his startup, and how 90% of the VCs he met initially said no.
17
00:01:31,000 --> 00:01:35,000
We also get into his end goal in life and with his startup.
18
00:01:35,000 --> 00:01:39,000
I hope you guys enjoy the conversation as much as I did.
19
00:01:39,000 --> 00:01:49,000
Before we get started, I do have one simple ask. If you enjoy this podcast and this channel, please subscribe and it will help support the channel even more.
20
00:01:49,000 --> 00:01:53,000
Thanks so much for listening and watching everyone. Hey Josh, thanks so much for being here today.
21
00:01:53,000 --> 00:02:00,000
I've been really looking forward to this. I think let's go back in time and start at your childhood.
22
00:02:00,000 --> 00:02:05,000
Our childhood defines who we are to a fairly big extent.
23
00:02:05,000 --> 00:02:14,000
Talk me through your childhood and what are some learnings you took from your childhood to now and what are some things you had to unlearn from it?
24
00:02:14,000 --> 00:02:21,000
Yeah, I would say for me a couple of things. So one is I've always been very much a builder.
25
00:02:21,000 --> 00:02:26,000
Like I won't say I've always been an entrepreneur per se, but I've always been very much a builder.
26
00:02:26,000 --> 00:02:35,000
So for me, when I was 11, the internet was still kind of new, but I had a friend in school who got really into building websites.
27
00:02:35,000 --> 00:02:42,000
So then I started learning how to use, and I'm going to date myself, Microsoft FrontPage to build a website.
28
00:02:42,000 --> 00:02:47,000
And then from there, I started learning how to do basic coding with HTML.
29
00:02:47,000 --> 00:02:56,000
And so by the time I was in high school, and my teachers had no idea what it took to build a website, I was building websites for school projects.
30
00:02:56,000 --> 00:03:07,000
Back then that was amazing. So I did that. And then I got to high school; up until then, I had been very much academically oriented.
31
00:03:07,000 --> 00:03:11,000
So I was very much focused on grades in school and all that.
32
00:03:11,000 --> 00:03:20,000
But it was in high school when I really started to, I think, break out of my shell from a social skills, interpersonal skills point of view.
33
00:03:20,000 --> 00:03:27,000
And so I think going again back to being very much a builder, I started really getting not only involved in my community, but in building stuff.
34
00:03:27,000 --> 00:03:32,000
So I started school clubs and nonprofit groups and things like that.
35
00:03:32,000 --> 00:03:41,000
And that really got me exposed to leadership and teamwork and communication and public speaking.
36
00:03:41,000 --> 00:03:53,000
And so I think I went into school being very academically inclined and then graduated still, I mean, caring about academics, but having become a much more well-rounded person.
37
00:03:53,000 --> 00:03:56,000
And then going from there.
38
00:03:56,000 --> 00:04:03,000
Interesting. I'm trying to put you in a box and maybe I shouldn't because I'm finding it incredibly difficult.
39
00:04:03,000 --> 00:04:18,000
As you said, you're a builder. You look at things as a builder, from first principles, breaking things down, but you're also into public speaking and fitting in socially.
40
00:04:18,000 --> 00:04:21,000
Those are two different people in most people's eyes.
41
00:04:21,000 --> 00:04:38,000
How did you go about marrying the two, and which one do you enjoy the most? Do you enjoy the socializing, public-speaking part of your life, or do you enjoy the solo building part of your life?
42
00:04:38,000 --> 00:04:52,000
It's funny, I enjoyed both for different reasons. So I think, you know, I very much like the, let's call it the zero to one phase of building things and I think, you know, building websites was going from zero to one as a solo, you know, individual creator, let's say.
43
00:04:52,000 --> 00:05:08,000
And I would say where things like public speaking and teamwork and leadership come in is, how do you go from zero to one as an organization, right. So when I started getting into like, hey, let's build a school club to solve this problem or this nonprofit group.
44
00:05:08,000 --> 00:05:22,000
Well, it's not about just me anymore, right, you have to interact with teammates, you have to sell other people on a vision for why you're starting this club, you have to start, you know, convincing other people in the ecosystem to support your vision.
45
00:05:22,000 --> 00:05:39,000
And so then to go from zero to one and build a team or organization, you have to develop all those other skills. And certainly today, building a company, I would say I'm spending a lot more time building from a social, interpersonal point of view and not so much from an individual
46
00:05:39,000 --> 00:05:51,000
point of view. But actually, even then, and we'll get to this later, when we started my company SeamlessMD with a very small team of, you know, three founders initially, a lot of going from zero to one was individual work. And now that, you know, we're at
47
00:05:51,000 --> 00:05:58,000
almost 40 people, a lot of my work now is interpersonal communication, teamwork, etc.
48
00:05:58,000 --> 00:06:13,000
Yeah, I completely agree. I was surprised how much of building a startup, and my startup was fairly small at 10 people, is managing people, and how much time that took.
49
00:06:13,000 --> 00:06:21,000
Let's talk about hiring. And let's look at it from a frame of structured decision making versus intuition.
50
00:06:21,000 --> 00:06:27,000
Tell me your weightage for each, structure and intuition, when hiring.
51
00:06:27,000 --> 00:06:35,000
And how do you go about selecting, A, your co-founders, and then, B, your first hires?
52
00:06:35,000 --> 00:06:47,000
Yeah, so I mean, selecting co-founders, I actually can't give you great advice on that, because I went through a unique process where I didn't completely select my co-founders. So just for context:
53
00:06:47,000 --> 00:06:53,000
when we started SeamlessMD, we did it through an incubator called The Next 36.
54
00:06:53,000 --> 00:07:01,000
And the way that worked was individuals across Canada, who were interested in getting involved in entrepreneurship could apply.
55
00:07:01,000 --> 00:07:13,000
And then they would pick 36 people across the country for this program. And at the time at least, they would decide, based on your prior interests and skill sets,
56
00:07:13,000 --> 00:07:22,000
what group of three you should be in to start this journey together. Now you can imagine that, because chemistry is so important, many of these groups of three did not work out.
57
00:07:22,000 --> 00:07:34,000
And I would say that my two co-founders and I, we were one of the few that actually survived as a team and went on to build a venture that's still around today, you know, about 10 years later.
58
00:07:34,000 --> 00:07:41,000
So unfortunately I am not great at telling you the right way to select co-founders. I will say one thing I learned though.
59
00:07:41,000 --> 00:07:52,000
Because actually, the year before we started SeamlessMD and before I did this business incubator, I tried starting, and failed at building, a completely different healthcare technology venture.
60
00:07:52,000 --> 00:07:56,000
And in that one, I had started it with two friends from med school.
61
00:07:56,000 --> 00:08:06,000
And I think one of the reasons why we weren't successful is because we were all clinical people so our skill sets overlapped. None of us were technology people or could build a product.
62
00:08:06,000 --> 00:08:15,000
And I think one of the reasons why I did the business incubator was I wanted to find people who had skill sets that I didn't have around engineering, technical and otherwise.
63
00:08:15,000 --> 00:08:23,000
And so for me the big learning was, okay, you've got to have complementary skill sets; you can't have too much overlap or else you just can't move quickly enough.
64
00:08:23,000 --> 00:08:31,000
I completely agree. And that's advice I got, and ignored, from some mentors in my startup as well.
65
00:08:31,000 --> 00:08:36,000
Let's go back to high school. So you're a coder.
66
00:08:36,000 --> 00:08:39,000
You're a public speaker as well.
67
00:08:39,000 --> 00:08:46,000
It seems like the two paths I would see based on that would be software engineer or politician.
68
00:08:46,000 --> 00:08:53,000
But you ended up in medical school. Talk to me about the journey to med school. Was that always in the cards?
69
00:08:53,000 --> 00:08:56,000
Or how did you reach that decision?
70
00:08:56,000 --> 00:09:09,000
Yeah, I would say it was in the cards for a while. I mean, I grew up, you know, really interested in medicine. To be fair, I grew up in, I think, a traditional Chinese household where medicine was one of the career opportunities
71
00:09:09,000 --> 00:09:18,000
that, you know, was recognized and admired and all that. I was certainly influenced by some of those ideas.
72
00:09:18,000 --> 00:09:29,000
But I'll tell you, even when I was doing undergrad and I was preparing for medical school, doing my MCATs and my prereqs and all that kind of stuff, I still wasn't really sure if I wanted to do medicine.
73
00:09:29,000 --> 00:09:39,000
But I wasn't sure what else I wanted to do, so I applied, saying, hey, I don't have a better plan. So I did, and I was lucky to get into medical school.
74
00:09:39,000 --> 00:09:50,000
But you know, even in medical school, I wasn't really sure what I wanted to do. You know, some folks know they want to go into family medicine or surgery or cardiology.
75
00:09:50,000 --> 00:09:58,000
I thought I wanted to do neurology, and then I shadowed some neurologists and realized I wasn't as excited about it as I thought I would be.
76
00:09:58,000 --> 00:10:09,000
And where I got lucky was I had some great mentors who got me more interested in quality, safety and patient experience; they took me under their wing.
77
00:10:09,000 --> 00:10:25,000
And so, at UHN, I ended up doing some research in med school on preventing readmissions after hospital discharge. And I became very passionate about that topic, and quality and safety, and ultimately trying to solve the readmissions problem
78
00:10:25,000 --> 00:10:41,000
that led to me, you know, starting SeamlessMD, which we can get to later. And so it's funny, because I never planned to go into healthcare tech entrepreneurship, but if I hadn't done medical school there's no way I would have gone down this path in the first place.
79
00:10:41,000 --> 00:10:49,000
What advice do you have for medical students who want to follow in your footsteps?
80
00:10:49,000 --> 00:10:59,000
Yeah, I do find there are a lot more medical students nowadays who are very interested in this atypical career path. The advice I always give folks in med school is, number one:
81
00:10:59,000 --> 00:11:07,000
it's totally normal if you don't know what you want, if you don't know what specialty you want. In fact, you might change your mind after you start it, so there's no sure thing.
82
00:11:07,000 --> 00:11:24,000
I always tell people, the important thing is to follow your curiosity and see where that leads you. And that means, if you get curious about something, read more about it, shadow, do research on it, go to a conference, work on a project there, and either you'll find out that you actually
83
00:11:24,000 --> 00:11:42,000
are curious about it and you'll keep doing more in it, or you'll realize you don't like it and you can cross it off your list. But just keep following your curiosity. That's what I did, right? I was curious about readmissions, and then I was curious about startups, and, by the way, it didn't always work out; I had a failed startup.
84
00:11:42,000 --> 00:11:50,000
Yeah, and then another one that's still around today. And so if you follow your curiosity, I feel like you get to a good place.
85
00:11:50,000 --> 00:11:54,000
And did you do a residency, Josh?
86
00:11:54,000 --> 00:12:01,000
I did not. So I actually had gotten into a family medicine residency in Toronto.
87
00:12:01,000 --> 00:12:04,000
And what happened was, that normally starts, you know, July 1, right?
88
00:12:04,000 --> 00:12:18,000
The incubator that I had started doing in my last year of med school, for SeamlessMD, ran until the end of August. So originally my residency program said, okay, well, just start September 1. I said great.
89
00:12:18,000 --> 00:12:31,000
And then when we got to the end of August, they said, well, Josh, what do you want to do? Do you want to do like 50/50 time between residency and SeamlessMD? Do you want to do 90/10 time? They were incredibly supportive.
90
00:12:31,000 --> 00:12:40,000
But what I decided was that I didn't think I could do two things really well, and I didn't want to do two things with, you know, half effort.
91
00:12:40,000 --> 00:12:53,000
And so I said, hey, you know what, I want to see where the SeamlessMD thing goes. How about we talk again in a year? And then, you know, maybe we'd be dead by then, maybe it didn't work out, or maybe I'll do something else.
92
00:12:53,000 --> 00:13:04,000
And then basically, every year for several years, I would come back and they would say, well, Josh, are you coming back to residency now? And I'd say, well, no, because I still want to see where this is going; it's still growing.
93
00:13:04,000 --> 00:13:13,000
And then at that point, you know, they were so generous. After five years, they were like, hey, Josh, we can't keep this annual thing going anymore. Are you coming back or not?
94
00:13:13,000 --> 00:13:23,000
And I was like, I'm so grateful for all the opportunities you gave me, but I'm just so all in on this, there's no way I could, you know, do this part time or anything.
95
00:13:23,000 --> 00:13:25,000
So, there you go.
96
00:13:25,000 --> 00:13:30,000
That is a fascinating answer and a very brave decision from my perspective.
97
00:13:30,000 --> 00:13:37,000
What advice do you have for yourself as a med student, if you were to go back in time?
98
00:13:37,000 --> 00:13:53,000
Yeah, I don't know that there's too much I would change. I think one of the things I've come to believe is that my situation now is a result of all these compounding experiences, right? And so even if I think I can go back in time and optimize something, that actually
99
00:13:53,000 --> 00:14:11,000
could have played out in a way that may not necessarily have been a good thing for me. That being said, though, I do wish I'd spent more time learning other skill sets that would be relevant to what I'm doing now. So for example, you know, I know enough
100
00:14:11,000 --> 00:14:25,000
technology stuff that I can speak to SeamlessMD, but I kind of wish I was more technical; in some ways I wish I had a better technical mind. And so going back, given that, I always believe that as you get older, your schedule to do things gets smaller
101
00:14:25,000 --> 00:14:27,000
in terms of like your daily time window.
102
00:14:27,000 --> 00:14:41,000
I feel like I had more time back then to explore stuff, so I wish I had spent more time learning more basic technical skills around coding and development, or just technical architectures, because now I'm just either too lazy or too busy to build a more
103
00:14:41,000 --> 00:14:46,000
technical foundation, but I wish I spent more time learning about technology back then.
104
00:14:46,000 --> 00:14:53,000
Do you think, with the rise of ChatGPT writing code, to an extent poor code from what I know,
105
00:14:53,000 --> 00:15:01,000
are the new coders people who are using no-code tools or AI to write their code?
106
00:15:01,000 --> 00:15:06,000
Or what do you think the market will be like for software engineers of the future?
107
00:15:06,000 --> 00:15:14,000
Yeah, I mean, this is out of my lane, because I'm not technical, so I don't really live in that world from a software development point of view.
108
00:15:14,000 --> 00:15:29,000
I think it's likely, and I think people have used this phrase already, that, you know, ChatGPT or AI in general is not going to replace a lot of what humans do, but I think those who use it will have an advantage, so you might as well
109
00:15:29,000 --> 00:15:40,000
learn how it could benefit you. So I think, to your point, there may be a lot of repetitive tasks where similar AI tools could be used to streamline development.
110
00:15:40,000 --> 00:15:54,000
And I think probably one of the lower hanging fruits, if you're learning, is the fact that you can use something like ChatGPT to understand, you know, how should I code something, or how should I troubleshoot this issue. The fact that it can scan a lot of, you know,
111
00:15:54,000 --> 00:16:02,000
past data and language and actually help generate, well, here's a framework or here's code that others have used to solve this problem.
112
00:16:02,000 --> 00:16:11,000
And the fact that you can learn faster and improve your skills faster, I think that's probably the lower hanging fruit that people are already taking advantage of.
113
00:16:11,000 --> 00:16:27,000
And in some ways, to your point, some of these tools, like no-code tools, make producing software more accessible to the average person. So, you know, in the future, how much code people will have to know, or what formal coding takes, could be very, very different.
114
00:16:27,000 --> 00:16:42,000
And I think people who have a sound technical understanding and mind will always have an advantage. So I think the best people in the development space will always have incredibly strong fundamentals, so that they can design the technical world with as much detail as possible,
115
00:16:42,000 --> 00:16:47,000
and those who don't have that technical foundation will probably have some limitations at the end of the day.
116
00:16:47,000 --> 00:16:53,000
So I think it's still important but I think you're right I think it will make development more accessible to more people.
117
00:16:53,000 --> 00:16:57,000
Whether you know code or not.
118
00:16:57,000 --> 00:17:10,000
Yeah, I agree with that. I'll ask another question about ChatGPT. For those who don't know, it recently passed the USMLEs, our board exams in the States, and also a test by a Wharton MBA professor.
119
00:17:10,000 --> 00:17:12,000
Were you surprised by that?
120
00:17:12,000 --> 00:17:28,000
Yeah, and the second part of the question is: there is a decline in the value of expertise in general, and then by default a decline in the value of physician expertise. Do you think ChatGPT will exacerbate that decline?
121
00:17:28,000 --> 00:17:43,000
So I'll take a step back first, and then I'll get into the specific ChatGPT example, the USMLEs. So if you think about decades ago, right, we had schools, and the big focus of school was that we had to teach students knowledge, right, because knowledge was what was
122
00:17:43,000 --> 00:17:45,000
powerful at the time.
123
00:17:45,000 --> 00:17:54,000
And now, knowledge is at your fingertips, so it's not as important that students go to schools and universities to learn facts, because you can search for answers. So clinically,
124
00:17:54,000 --> 00:18:06,000
I don't have to know everything; I can look at UpToDate to find clinical information, whereas maybe 200 years ago I had to memorize all this information because it was not at my fingertips. So we've democratized access to knowledge.
125
00:18:06,000 --> 00:18:20,000
And so being able to make good judgments and clinical assessments and decision making is far more valuable than just having knowledge. And then when I think about something like ChatGPT, it's able to synthesize this information,
126
00:18:20,000 --> 00:18:31,000
and, you know, do more than just recite knowledge, maybe in some ways synthesize, you know, good insight with that knowledge. But even with the USMLE or some of these clinical exams,
127
00:18:31,000 --> 00:18:40,000
they're very often, I would guess, I haven't read the USMLE, but a lot of very straightforward scenarios, with very clear solutions to questions.
128
00:18:40,000 --> 00:18:55,000
Whereas in the real clinical world, when you're making judgments, where there's often imperfect information, you have to incorporate way more context than what's in a, you know, five-sentence vignette.
129
00:18:55,000 --> 00:19:13,000
You know, I think tools like ChatGPT are going to be helpful for us to maybe pull insight from the evidence or from data more quickly, but I think very often you still need that great clinical judgment that, so far, only a human can provide.
130
00:19:13,000 --> 00:19:31,000
I do think there's some point in the future where AI will be good enough that you can reach that. I don't know if you've ever seen Star Trek, but in Star Trek they have this AI hologram doctor who's human-like, and it looks and feels
131
00:19:31,000 --> 00:19:46,000
like a human physician, and they basically act as the doctor for people, for humans. Can we get to that point? I think that's totally possible, but I think that's quite far away. In the meantime, I don't think something like ChatGPT could replace that
132
00:19:46,000 --> 00:19:59,000
clinical judgment that can understand the complexity needed for real clinical care. But when you're talking about these vignettes in a written exam, yeah, I'm not that shocked that it could score pretty well.
133
00:19:59,000 --> 00:20:14,000
So, what I'm getting at is, knowledge is not as valuable anymore because of the democratization of access, but the application of knowledge and decision making is as valuable as ever.
134
00:20:14,000 --> 00:20:27,000
That being said, our testing in medical school, law school, different schools is still based on knowledge. It's based on recollection, memorizing facts.
135
00:20:27,000 --> 00:20:37,000
So should we move away from that testing, and if so, should we move more towards an OSCE-based scenario of testing?
136
00:20:37,000 --> 00:20:52,000
So, I do think it's hard for me to say how much knowledge is sufficient or necessary, but you do need some foundational clinical medical knowledge, so that you can even be resourceful in looking up more information as needed.
137
00:20:52,000 --> 00:21:06,000
So we could probably debate how much of the detail currently taught in medical school is necessary, but I do think some foundational, a significant amount of initial, knowledge is needed so that you can actually interpret data well and use it properly,
138
00:21:06,000 --> 00:21:10,000
and you're not starting from scratch with every clinical encounter.
139
00:21:10,000 --> 00:21:14,000
But I do think that in the future,
140
00:21:14,000 --> 00:21:30,000
however the clinical encounter changes, like let's say in the future the clinical encounter is a situation where maybe you do have an AI assistant, where you're able, at your fingertips, to access information or access predictive analytics about a patient or anything
141
00:21:30,000 --> 00:21:31,000
like that.
142
00:21:31,000 --> 00:21:47,000
So, whatever the encounter looks like, that should probably be, at some point, what the OSCE is like. The OSCE should match the real clinical encounter, and if in the future we're relying on AI assistants to treat patients, then the OSCE at some point should evolve to match that.
143
00:21:47,000 --> 00:22:01,000
So as long as that is similar enough, I think, you know, we'll be doing medical education in the right way. It's kind of like, it's funny, think about, let's call it, cars and autonomous driving and all that, right? So you can argue at some point,
144
00:22:01,000 --> 00:22:13,000
maybe driver's licenses are obsolete because no one's driving anymore, and at that point the only people with driver's licenses are professional race car drivers, because that's still a sport or something like that.
145
00:22:13,000 --> 00:22:30,000
So let's say in the meantime we have a lot of AI-assisted driving. Then, probably, when you're doing a driver's license test right now, they should actually expect and appreciate that you may be using assisted driving, and that might be the right thing to do,
146
00:22:30,000 --> 00:22:36,000
like, can you use the AI properly? If not, maybe you shouldn't get your license at some point. I don't know.
147
00:22:36,000 --> 00:22:40,000
But I think the real world should mimic the educational world, or vice versa.
148
00:22:40,000 --> 00:22:46,000
Yeah, this is an interesting debate where, as AI does more and more of our work,
149
00:22:46,000 --> 00:22:53,000
at some point do we solely exist to offload liability from the AI and the patient?
150
00:22:53,000 --> 00:22:56,000
When do you see that transition happening?
151
00:22:56,000 --> 00:23:06,000
And say we have the Star Trek version of this AI physician, and no human physician is involved.
152
00:23:06,000 --> 00:23:12,000
Where does the liability rest? On the AI, on the patient, or is there a shared liability model?
153
00:23:12,000 --> 00:23:22,000
It's a great question. So I think the analogy I tend to think about is, let's take medical imaging, because that's probably one of the lowest hanging fruits when it comes to AI and medicine.
154
00:23:22,000 --> 00:23:37,000
So, you know, there's a question of, well, on the one hand, if everything was purely a human assessment by radiologists, and let's say there was a radiologist who was making countless errors in their reports.
155
00:23:37,000 --> 00:23:43,000
Well then someone could say well, this one human radiologist had a massively high error rate.
156
00:23:43,000 --> 00:23:52,000
If we remove that radiologist from the system, then we've gotten rid of the error, or we retrain them and they fix their problems, but no one else is affected; all the other radiologists are fine.
157
00:23:52,000 --> 00:24:04,000
Whereas if you had an algorithm, an AI model for radiology, that had errors, then every image is read incorrectly, and the massive scale of that impact is bad until you fix the problem.
158
00:24:04,000 --> 00:24:10,000
I think that's where people get worried about AI thinking that, yes, benefits get scaled but so do errors.
159
00:24:10,000 --> 00:24:21,000
So the way I look at it is, what is the total error rate or the total benefit? So for example, you know, even if an AI model in imaging does have errors,
160
00:24:21,000 --> 00:24:34,000
if it in aggregate has fewer errors... First of all, the error rate would never be zero, right? I think people keep thinking that AI must be perfect. Well, humans are imperfect; the human radiology error rate is not 0%.
161
00:24:34,000 --> 00:24:43,000
I hope it's small, but it's not 0%. So in the same way, to think that AI would be 0% is flawed thinking in the first place; not even humans can get to 0%.
162
00:24:43,000 --> 00:24:56,000
But if the total error rate from an AI is lower than the human error rate in aggregate, as close to 0% as possible, I think that's actually a good thing. We should strive to get closer to 0%, and if it gets us closer, that's better.
163
00:24:56,000 --> 00:25:08,000
So if it's a liability question, then the way I see it is, at the end of the day the software is an assist, but the healthcare system should own the outcome.
164
00:25:08,000 --> 00:25:20,000
Right. So for example, the healthcare system has to decide: is the AI good enough, or so much better than us, that we shouldn't have humans looking at it at all?
165
00:25:20,000 --> 00:25:38,000
So I think that's to your point: should AI be an assist that just lets us do more at scale? And at the end of the day, I don't know the legal stuff, so it's probably something in law that decides when liability passes off from a provider to technology;
166
00:25:38,000 --> 00:25:50,000
there's probably some precedent set in law. And probably, at the end of the day, both parties get sued or in trouble; to be honest, that's probably what really happens, and then they end up negotiating who takes more of the blame than the other.
167
00:25:50,000 --> 00:25:58,000
But I think ultimately the provider has to decide the level of risk they're comfortable with and how much confidence they have in the AI.
168
00:25:58,000 --> 00:26:01,000
But yeah, there's probably legal precedent so that's a good question.
169
00:26:01,000 --> 00:26:18,000
Yeah, generally from what I've seen it falls on the standard of care, determined by what your peers would do, which is good and bad because it sets a baseline for not practicing aggressive or risky medicine.
170
00:26:18,000 --> 00:26:27,000
But it also curbs innovation, because any deviation from the standard of care by default is opening yourself up to malpractice.
171
00:26:27,000 --> 00:26:35,000
Okay, so that basically means if none of your peers are innovating or using AI in medical imaging, then the standard of care is not using it.
172
00:26:35,000 --> 00:26:43,000
Which, it sounds silly when you say it, but I don't think that law will change anytime soon.
173
00:26:43,000 --> 00:26:46,000
For better or worse, mostly for worse I'd say.
174
00:26:46,000 --> 00:27:00,000
So my whole thing is that, and maybe you're getting at this anyways, our healthcare system needs to focus more on what are the goals, what are the outcomes, and having clear accountability tied to outcomes.
175
00:27:00,000 --> 00:27:13,000
And then, based on that, the right innovation, the right tools, AI or not, get pulled into the system to achieve those goals, right? So if there were certain targets in the healthcare system where it was like, hey, I'm making this up, but let's say there's
176
00:27:13,000 --> 00:27:26,000
a huge backlog in medical imaging, we need to get more imaging done faster, and the healthcare system said, okay, well, the only way we could do that would be to invest more heavily in AI to accelerate reading images.
177
00:27:26,000 --> 00:27:37,000
Well, now you have an aligned incentive to actually innovate. Otherwise, if the incentive was not to be more efficient, then why are we expecting people to change?
178
00:27:37,000 --> 00:27:42,000
You know, everyone's pretty rational about their motivations.
179
00:27:42,000 --> 00:27:47,000
Yeah, I don't know if I want to get started on our healthcare system.
180
00:27:47,000 --> 00:27:56,000
I'll ask you one question. If you could change one thing about our, and by our I mean Ontario's, healthcare system, what would it be?
181
00:27:56,000 --> 00:28:13,000
Oh, it would be exactly that. We've talked for decades about shifting to a more performance-based healthcare system, where reimbursement and funding are tied to outcomes, but it's been mostly talk and very little
182
00:28:13,000 --> 00:28:28,000
progress. You know, in Ontario and in Canada, we like to think sometimes that we are in some ways better than our American counterparts in the healthcare system, because we're, for the most
183
00:28:28,000 --> 00:28:32,000
part, publicly funded, and they're more of a mixed system.
184
00:28:32,000 --> 00:28:46,000
We both have our own challenges, and one of the things the US healthcare system has done better than us, I think, is being a lot more progressive on different reimbursement models and value-based care arrangements, and at least trying to see what works and what
185
00:28:46,000 --> 00:28:55,000
doesn't. They've succeeded and failed in different ways with those models, but we've made very, very limited progress. And so we've been mostly talk, not very much action.
186
00:28:55,000 --> 00:29:04,000
And my concern is that, you know, our population is living longer, our elderly population is growing.
187
00:29:04,000 --> 00:29:22,000
There's also, in general, probably going to be population decline happening around the world, and so our healthcare workforce is going to get smaller. And COVID has caused all this burnout and even worse working conditions
188
00:29:22,000 --> 00:29:34,000
in the workplace. And so what you're having is a growing elderly population with complex medical diseases, and then a shrinking healthcare workforce, exacerbated by COVID now.
189
00:29:34,000 --> 00:29:43,000
And this shrinking healthcare workforce is not going to be able to care for this massively growing elderly population, in Ontario, in Canada, and around the world.
190
00:29:43,000 --> 00:29:57,000
And if you think we're in a crisis today, it's going to get far, far worse in the coming decades, unless we act now to not only do payment reform but re-engineer how we do healthcare delivery and get more people back into healthcare.
191
00:29:57,000 --> 00:30:02,000
I'm deeply concerned about the future of healthcare delivery in this country.
192
00:30:02,000 --> 00:30:13,000
And I think there's not enough action being taken to try and change how we deliver care and pay for it. And if that doesn't happen, I just fear for what's going to happen 10, 20 years from now.
193
00:30:13,000 --> 00:30:20,000
Why do you think that is? Why is not enough action being taken?
194
00:30:20,000 --> 00:30:23,000
What is the missing piece of the puzzle here?
195
00:30:23,000 --> 00:30:33,000
So I think that the reality for us, especially since healthcare is primarily publicly funded, is that it does start from the top, from policy and government, which sets,
196
00:30:33,000 --> 00:30:39,000
you know, the performance expectations, the metrics that matter, how the reimbursement is done.
197
00:30:39,000 --> 00:30:54,000
And it takes a lot of courage and conviction to do things very differently. I mean, to be honest, if you did a lot of payment reform, you're probably going to upset a lot of physicians in particular, who, let's be honest, when
198
00:30:54,000 --> 00:31:04,000
you're paid on a fee-for-service basis, where performance doesn't really matter besides volume, are in kind of a pretty good situation, right? I mean, in any other industry,
199
00:31:04,000 --> 00:31:19,000
let's say business or anything else, if you don't perform, if your organization can't hit targets, you could go out of business, right? In healthcare we tolerate so much. And in fact, frankly, I think
200
00:31:19,000 --> 00:31:30,000
your highest performing providers and organizations, the ones delivering the best care at the lowest cost, you should probably be paying them better, and the ones who aren't, you should be paying them less, and create an incentive structure where better
201
00:31:30,000 --> 00:31:34,000
performance is rewarded.
202
00:31:34,000 --> 00:31:39,000
But we don't have that here; there's a lot less of a meritocracy in healthcare delivery.
203
00:31:39,000 --> 00:31:49,000
And frankly, for most people that's probably a pretty good thing, maybe; like, who wants the added pressure of having to achieve better health outcomes?
204
00:31:49,000 --> 00:32:02,000
I think this idea gets confused sometimes, about democracy and meritocracy, and they are different things; the best ideas are often not the most popular ones or the most democratic ones.
205
00:32:02,000 --> 00:32:11,000
The one thing I will add to the value-based care point, and I'll go based on Adam Grant's framework,
206
00:32:11,000 --> 00:32:29,000
I'll add him to my defensive corner, is that the way we measure value should be the process, not solely the outcomes, because poor processes can create good outcomes, and if we solely focus on the outcomes we incentivize perverse processes.
207
00:32:29,000 --> 00:32:38,000
And weight loss is a good example: if you incentivize just reducing BMI, people will starve themselves, versus, you know, incentivizing exercise and healthy eating.
208
00:32:38,000 --> 00:32:42,000
So I'll just leave that as a statement.
209
00:32:42,000 --> 00:32:48,000
This is a good point to get into: is medicine a calling or a job?
210
00:32:48,000 --> 00:32:51,000
And if it's a calling, how do you know?
211
00:32:51,000 --> 00:33:03,000
This is probably one where I disagree with some folks, and to be fair, I'm actually not practicing, so take what I say with a grain of salt at this point; maybe if I was practicing I'd have a very different opinion. I think when most
212
00:33:03,000 --> 00:33:08,000
people go into a healthcare profession, medicine or otherwise,
213
00:33:08,000 --> 00:33:11,000
it very much starts off for a lot of people as a calling.
214
00:33:11,000 --> 00:33:19,000
Right, you go in because you're trying to make a difference, you're trying to make an impact on patient care or the healthcare system or something else.
215
00:33:19,000 --> 00:33:28,000
And I think what unfortunately happens is, the reality of healthcare and medicine is that, man, is it frustrating, is it stressful.
216
00:33:28,000 --> 00:33:33,000
So, working conditions have gotten worse, especially the last few years.
217
00:33:33,000 --> 00:33:46,000
And so I think it's very easy for folks in healthcare and medicine to, and understandably so, get jaded, get unhappy, dissatisfied, and sometimes, more recently, want to leave.
218
00:33:46,000 --> 00:33:58,000
And so I think there's a lot of work that has to be done, and important work has been done, to turn it back into a calling and make it stay a calling once you're in it.
219
00:33:58,000 --> 00:34:10,000
Because I think it's such an important part of society. Not only can we not afford to lose such great people from the healthcare profession, because we need them,
220
00:34:10,000 --> 00:34:23,000
we have to find a way to recruit more people. I actually think the government, for example, and healthcare organizations need to do more to actually recruit; we should be recruiting our best and brightest to go into healthcare more
221
00:34:23,000 --> 00:34:32,000
than we do now, like we need it. But to your point, we have to fix the culture, we have to make it a calling again and keep it a calling.
222
00:34:32,000 --> 00:34:41,000
And part of that has to do with improving working conditions. Some of it has to do with improving the way we do reimbursement, and doing it well.
223
00:34:41,000 --> 00:34:46,000
Part of it, you know, might be that maybe there's more training we could do that we're not doing.
224
00:34:46,000 --> 00:35:00,000
I mean, I don't have all the solutions, but I believe the problem exists: there's a culture issue that has to be improved and fixed, so that it is a calling again and it stays a calling.
225
00:35:00,000 --> 00:35:02,000
Okay.
226
00:35:02,000 --> 00:35:09,000
I'm not sure if I 100% agree with that, but I get your point, and I think we need more people in medicine; that's as clear as ever.
227
00:35:09,000 --> 00:35:12,000
Let's talk about SeamlessMD.
228
00:35:12,000 --> 00:35:22,000
Talk to me, and feel free to go into as much detail as you'd like, about how you went from idea to landing your first client.
229
00:35:22,000 --> 00:35:39,000
Happy to. So, just to give more background context to it: the initial idea of SeamlessMD was, what if we could build a technology platform to monitor a patient after they left hospital, you know, monitor their symptoms, how they're doing,
230
00:35:39,000 --> 00:35:44,000
and then allow the care team to catch those issues earlier and prevent something like a readmission.
231
00:35:44,000 --> 00:35:59,000
Now, when I first started working on this with my team, my initial interest was in preventing readmissions for patients with complex chronic diseases, so think heart failure, COPD, because that's actually what my research was in, so that's the world that I knew.
232
00:35:59,000 --> 00:36:06,000
So then when I started talking to people in internal medicine, like my mentors, my mentors' colleagues and all that,
233
00:36:06,000 --> 00:36:15,000
none of them were interested in the idea. They were like, no, this isn't going to work, or, you know, it's not that interesting to me. So personally, I was taking a lot of rejections.
234
00:36:15,000 --> 00:36:27,000
And then one day, one of those individuals said to me, you know, Josh, you should go talk to surgeons; they are more tech savvy, they're more innovative, they like this kind of stuff, you might get more traction there. I said, okay, I'm kind of disappointed
235
00:36:27,000 --> 00:36:33,000
to hear that, but I've got nothing to lose at this point, everyone's just rejected us. And so I started talking to surgeons.
236
00:36:33,000 --> 00:36:43,000
And they started saying, oh yeah, Josh, come work with us, we have readmission problems too, and yeah, we are more innovative. And I said, okay, that's kind of interesting.
237
00:36:43,000 --> 00:36:53,000
And actually, you know, when we first started finding surgeons, one of the things I did was I would go on PubMed and I would type in surgery readmissions.
238
00:36:53,000 --> 00:36:57,000
And I would find papers about it and I would look up who wrote those papers.
239
00:36:57,000 --> 00:37:16,000
And I remember one of the first three people that I cold emailed, one of the first three, his name was Dr. David Berger. At the time he was the vice chair of surgery at Baylor College of Medicine in Houston, Texas, and he had written a paper on preventing
240
00:37:16,000 --> 00:37:29,000
readmissions. And in that paper, they had come up with a framework for, these are the symptoms that we'd want to monitor after a colorectal surgery to potentially prevent readmissions.
241
00:37:29,000 --> 00:37:41,000
And I emailed him said hey you know like I'm working on this idea for an app, you know, and your paper got me thinking well what if we just use an app to, you know, check in on those symptoms with the patient, you know, could that be a cool idea.
242
00:37:41,000 --> 00:37:55,000
And he responded to me. And then we just started talking, and eventually Baylor College of Medicine ended up becoming our very first pilot site. They actually did a study with us with the very first, horrible version of SeamlessMD, so you can imagine the results of that
243
00:37:55,000 --> 00:38:04,000
study were not at all fantastic. But it was a great first step, and a great first partner to get something going.
244
00:38:04,000 --> 00:38:16,000
How did you move from there to your second, third, fourth client? Was it more word of mouth, more cold emailing? And what advice do you have for founders trying to sell into health systems?
245
00:38:16,000 --> 00:38:36,000
Great question. The first handful were all cold outreach and networking, so a mixture of cold emailing hospitals and going through our networks, like me asking my mentors, hey, who do you know, let's say, what surgeons do you know
246
00:38:36,000 --> 00:38:48,000
who you think this might be up their alley for. And then everyone that you meet, asking them again, asking those people, hey, who else do you think might be interested in the idea? We still use those approaches today to try and build out awareness and networks,
247
00:38:48,000 --> 00:39:00,000
but especially back then, when we had zero proof points, zero clout, or anything for us, no one was looking for us. So we were always having to reach out and look for people ourselves.
248
00:39:00,000 --> 00:39:11,000
And how did you monetize your service? One thing I look for in my angel investing is a clinical ROI and a financial ROI, and a lot of founders,
249
00:39:11,000 --> 00:39:17,000
I find they have a very clear clinical ROI, but their financial ROI needs some work.
250
00:39:17,000 --> 00:39:29,000
So you've found considerable success, and I was just wondering, you know, how were your conversations with your first customers or buyers, and what financial ROI did you demonstrate?
251
00:39:29,000 --> 00:39:43,000
I'll answer your question with a story, and then I'll get more specific. It's a great point, and I think one of the things that I've learned is that ROI is viewed differently by different stakeholders in the system, and oftentimes I think people in the same system
252
00:39:43,000 --> 00:39:57,000
have miscommunication internally about what actually matters. I'll give you an example. We had a partner in Ontario years ago where they said, hey, Josh, come bring SeamlessMD here and do a pilot study with us, and we guarantee you that if
253
00:39:57,000 --> 00:40:09,000
you can help us physicians reduce readmissions significantly, for sure the hospital CEO will buy into this and we can turn this into a real commercial contract. And of course you believe these things, right?
254
00:40:09,000 --> 00:40:19,000
So we go, we implement it, we reduce readmissions by over 50%, we present... Was there a contract before you implemented, or was it more of a verbal contract?
255
00:40:19,000 --> 00:40:32,000
We still had a contract, because, you know, we're a software platform, we're managing PHI, etc.; we have to be secure and we have to have a contract with the hospital. But it was more that, I think from their point of view, like, hey, once we prove it, we can go to the
256
00:40:32,000 --> 00:40:46,000
hospital leadership to sustain and grow this even further, beyond the initial scope. And so they knew they were using it, but it was confined to a very defined group or department in the hospital.
257
00:40:46,000 --> 00:40:55,000
So then we measured the results: we cut readmissions by over 50%. And then together we co-presented the results to the CEO of the hospital.
258
00:40:55,000 --> 00:41:00,000
And the CEO says, like, wow, these are amazing results.
259
00:41:00,000 --> 00:41:06,000
But given how, you know, global funding at the time worked in Ontario with hospitals,
260
00:41:06,000 --> 00:41:19,000
he said, actually, Josh, you know, if you reduce these readmissions and you save me costs on the readmissions, the government will actually think I don't need as much money in the following years' budgets.
261
00:41:19,000 --> 00:41:33,000
So actually, it doesn't really save me money; it just means the government thinks I need less. And so there's not a real financial incentive for me to scale this out, even though, he's like, yes, this clearly would reduce readmissions further.
262
00:41:33,000 --> 00:41:42,000
And then the aha moment, being like, wait, so hold on: this whole time we assumed, you know, based on what we were told by the physicians, that if we just improved this outcome metric,
263
00:41:42,000 --> 00:41:57,000
the CEO would see the value. What we really should have been doing was talking to everyone, not just the physicians but also the CEO and the other executives, to understand, well, what ROI do you need to see? Because frankly, the executive
264
00:41:57,000 --> 00:42:08,000
is the one paying for this. And what you realize over time is that there can be a disconnect between what they actually want to see results on and what a clinical person wants.
265
00:42:08,000 --> 00:42:15,000
And so now we have to satisfy multiple parties. I'll put it this way, I'll give you an example.
266
00:42:15,000 --> 00:42:29,000
Quality and clinical people love seeing how this can reduce hospital length of stay. And then some of the folks on the finance side view that as an opportunity to increase throughput and bed capacity for the hospital.
267
00:42:29,000 --> 00:42:35,000
You know, related topics, but kind of different still. Yeah.
268
00:42:35,000 --> 00:42:38,000
Who do you find...
269
00:42:38,000 --> 00:42:41,000
First of all, that must have been incredibly frustrating.
270
00:42:41,000 --> 00:42:44,000
Yeah.
271
00:42:44,000 --> 00:43:00,000
I think the ROI from your point person's perspective is so vastly different from the CEO's perspective. So I'm sorry you had that experience, because I can't imagine, especially early on, when you're riding on this contract financially and emotionally.
272
00:43:00,000 --> 00:43:02,000
I'm sorry you went through that.
273
00:43:02,000 --> 00:43:17,000
Okay, well, the biggest concern I have is what happens: you have all these frontline clinicians and staff who put their own blood, sweat and tears into this, and despite it being successful from a clinical outcomes point of view, it wasn't valued by the system.
274
00:43:17,000 --> 00:43:32,000
And then that means they're demotivated to maybe improve quality in the future, or perhaps now they don't believe in digital health, because they had this one experience where they put all this effort into it, it worked, it did what was intended, and it still didn't matter.
275
00:43:32,000 --> 00:43:34,000
That's where I get more concerned for the system.
276
00:43:34,000 --> 00:43:36,000
Yeah.
277
00:43:36,000 --> 00:43:45,000
And this is different from the States, where readmissions within 30 days are not paid for. Is that an accurate statement I made there?
278
00:43:45,000 --> 00:43:52,000
It's been a bit more complicated than that; it's like that for certain cases,
279
00:43:52,000 --> 00:44:09,000
depending on the payer. So for Medicare, or CMS the payer, which is the government in the US, and that covers folks who are over 65 years old, there are certain conditions like heart failure, COPD, etc., where if the hospital exceeds a certain expected readmission
280
00:44:09,000 --> 00:44:24,000
rate, then they get penalties for it. But technically, not every readmission in the US is being penalized. So it does matter, but I'll tell you, even in the US, part of the challenge is that there are some executives at hospitals who are kind of gaming the system,
281
00:44:24,000 --> 00:44:34,000
right? So I've had conversations with hospitals where they might say, you know, Josh, I want you to help us bring our readmission rate from like 15% to, let's say, 10%.
282
00:44:34,000 --> 00:44:42,000
But we don't want you going lower than that, because at some point if it's lower than that, then our financial model shows that we actually start losing money.
283
00:44:42,000 --> 00:44:53,000
So there are some strange complexities, even in the US, where they have these incentives, like reverse incentives, where
284
00:44:53,000 --> 00:44:57,000
they actually want you to only improve it to a certain extent sometimes.
285
00:44:57,000 --> 00:45:07,000
What are your thoughts on the payvider model, a fairly recent trend, where the insurance company is the provider?
286
00:45:07,000 --> 00:45:17,000
So the incentives are clear: they want to reduce costs to increase their profits. Now, I know their profits are currently capped in the States.
287
00:45:17,000 --> 00:45:21,000
Is that the same in Canada? And then, what are your thoughts on the payvider model?
288
00:45:21,000 --> 00:45:30,000
Yeah, so I don't know too much about the details of that and how it relates to how things are in Canada, because you have to think, in some ways,
289
00:45:30,000 --> 00:45:44,000
the public healthcare system here is, I guess, not quite that. So on the payvider model, and correct me if I'm wrong because I'm not as technically sound on this, but I assume in the payvider model usually the payer also owns care delivery, right? So if I'm a
290
00:45:44,000 --> 00:45:54,000
payvider, I'm the payer, I'm the insurance provider, but I also own the hospitals, the physician practices and the whole care delivery, so it's fully integrated.
291
00:45:54,000 --> 00:46:10,000
Whereas, let's say in Ontario, the government publicly funds healthcare, but a lot of the care providers here, like the hospitals, are technically separate; you don't typically work for the government, and as a family practice you're your own entity. So I guess you don't really have
292
00:46:10,000 --> 00:46:13,000
that here.
293
00:46:13,000 --> 00:46:20,000
It sounds very good in theory, right? I'm guessing there are some clear limitations that I haven't really dug into just yet.
294
00:46:20,000 --> 00:46:30,000
But I think it comes down to whether you have the right leadership in the organization. I think if you have great leadership that's doing things right,
295
00:46:30,000 --> 00:46:34,000
it can work very well. I suspect if you have bad leadership,
296
00:46:34,000 --> 00:46:46,000
it may not be so great; it may not lead to great outcomes for patients. And here's the other thing too: are they going to serve every patient in the region, or are they going to hand-select patients? Right, part of the benefit of the Ontario
297
00:46:46,000 --> 00:46:56,000
healthcare system is that everyone gets served, right? No one's cherry-picking their patients, no one's excluding patients from insurance coverage or anything.
298
00:46:56,000 --> 00:47:06,000
So in the ideal world, great leadership, fully integrated care delivery and funding model, and you serve every patient in the region, so that you have to deliver good care to everyone.
299
00:47:06,000 --> 00:47:08,000
Sounds good in theory.
300
00:47:08,000 --> 00:47:14,000
Yeah, I think it could work in certain verticals and pockets.
301
00:47:14,000 --> 00:47:29,000
I wonder if primary care should be more like direct primary care with private pay, while oncology, surgery, and more expensive procedures should be covered through a government insurance plan.
302
00:47:29,000 --> 00:47:40,000
It seems like the opposite will happen, to be frank with you, where more expensive services will become privatized and primary care will remain public.
303
00:47:40,000 --> 00:47:51,000
"No one ever went bankrupt from a $50 visit, but they did from a $100,000 surgery" was kind of my rebuttal there. What are your thoughts on private care in Canada?
304
00:47:51,000 --> 00:47:53,000
Is it needed?
305
00:47:53,000 --> 00:48:09,000
Should it exist? Will it help alleviate our backlog and improve access? And by private care I mean true private care, where patients are paying for care, not the independent surgical centers, which is a whole other conversation.
306
00:48:09,000 --> 00:48:21,000
Yeah, so I mean, those are great questions. There's a lot of debate online right now, and trying to alleviate the surgery backlog with more private surgical facilities is a really hot topic lately.
307
00:48:21,000 --> 00:48:36,000
My perspective tends to be that it's been an oversimplification in the dialogue on social media to say that either private is always bad or public is always good.
308
00:48:36,000 --> 00:48:49,000
I think the right answer ends up being somewhere in between. For me, the important thing is: do we have clarity on what the goals are for the healthcare system? If I were in charge, it would be: how do we get better access, better
309
00:48:49,000 --> 00:48:54,000
quality, lower cost? That's the big picture; the details need to be worked out.
310
00:48:54,000 --> 00:49:11,000
And for me, if certain providers or provider groups can deliver better care at lower cost, and if
311
00:49:11,000 --> 00:49:26,000
there's a way to do that better with some aspect of privately paid-for healthcare, then I think the province should be open-minded about what that would look like and what the impact would be. But I do think it's an oversimplification to say,
312
00:49:26,000 --> 00:49:41,000
well, in every possible situation in the world private healthcare has no place, or to argue the opposite, that everything should be private. I think both extremes are probably the wrong answer.
313
00:49:41,000 --> 00:49:45,000
And so I think people just need to be open minded and actually have a dialogue.
314
00:49:45,000 --> 00:49:48,000
Because I think we need to focus on what the goal is.
315
00:49:48,000 --> 00:50:01,000
And everything else in between is just different tools or mechanisms to hit the goal. Being closed-minded about tools is missing an opportunity to improve the healthcare system.
316
00:50:01,000 --> 00:50:11,000
I agree, it's too bad that mentioning private healthcare is political kryptonite in Canada.
317
00:50:11,000 --> 00:50:20,000
But it seems like it could definitely help in some ways. And there is nuance and complexity here, which are often hard to find in today's world.
318
00:50:20,000 --> 00:50:35,000
Yeah, and I think that if you are a key stakeholder in how healthcare is delivered and organized, you have a duty to explore all the reasonable options to improve healthcare here, whether it's public or private or something else. You may not end up deciding
319
00:50:35,000 --> 00:50:41,000
to go down that route, but to ignore the options,
320
00:50:41,000 --> 00:50:48,000
that does not seem like you're being responsible to patients and the healthcare system.
321
00:50:48,000 --> 00:50:51,000
One thing I hear from founders
322
00:50:51,000 --> 00:50:55,000
often is that it's really hard to raise money from Canadian VCs,
323
00:50:55,000 --> 00:50:58,000
and that it's much easier from American VCs.
324
00:50:58,000 --> 00:51:06,000
Talk to me a bit about your experience raising money for your startup. Did you find that to be true?
325
00:51:06,000 --> 00:51:17,000
So when it comes to raising money, I think what I've learned is that raising money is like selling a product, except the product is your company's shares. So it's very much a sales process.
326
00:51:17,000 --> 00:51:30,000
And in the same way that I tell founders, hey, if you can get one customer you can get five, if you can get five you can get 10, if you get 10 you can get 100, it's the same with raising capital: if you can get one investor to believe in you, and assuming
327
00:51:30,000 --> 00:51:42,000
they're not crazy, you can get five; if you can get five, you can get 10, and so forth. You know, when we raised our seed round back in, I think it was 2015, we raised $1.1 million.
328
00:51:42,000 --> 00:51:47,000
It was tough. I actually had this whole spreadsheet of every investor I talked to.
329
00:51:47,000 --> 00:52:02,000
And most of it was just cold outreach. I mean, that's a whole other topic we can get into if you want, but I did not have a bunch of investors I got intros to, or a bunch of inbound interest. Most of my investors I got through just cold
330
00:52:02,000 --> 00:52:08,000
outbound contacting: going down LinkedIn, going down AngelList, Google.
331
00:52:08,000 --> 00:52:12,000
What was the message you were sending them?
332
00:52:12,000 --> 00:52:28,000
I'd have to look it up, but it was probably something like: hey, I'm a physician turned digital entrepreneur, a few lines on traction, we're raising capital, I thought this might be up your alley because you did X, like you
333
00:52:28,000 --> 00:52:32,000
invested in this company, etc.
334
00:52:32,000 --> 00:52:43,000
And I'll tell you, there were two profiles that worked out really well for us. One was physicians who had angel invested in other companies.
335
00:52:43,000 --> 00:52:59,000
The second profile that was good was actually people who were part of hospital foundation boards, because clearly they were very successful financially and had donated a lot to hospitals, and
336
00:52:59,000 --> 00:53:04,000
that's why they're heavily involved in the foundation; they clearly care about healthcare.
337
00:53:04,000 --> 00:53:15,000
And so we had a number of physician investors and some hospital foundation board member investors who were both excited about the company from a mission point of view.
338
00:53:15,000 --> 00:53:23,000
And so I think part of it is figuring out what the right ideal investor profile is for your company. It varies; this is just what worked for us.
339
00:53:23,000 --> 00:53:39,000
But I can tell you, of the investors who actually met with me, and a lot wouldn't even meet with me, who knows how many, probably over 100 that I met, more than 90% rejected us after meeting me.
340
00:53:39,000 --> 00:53:46,000
And so people don't realize, at least for companies like ours, how many investors I had to speak to to get to a yes.
341
00:53:46,000 --> 00:53:49,000
And why did they say no?
342
00:53:49,000 --> 00:53:54,000
Oh my gosh, back then? I mean, there were so many good reasons to say no back then.
343
00:53:54,000 --> 00:54:05,000
You know, we had limited traction, right. And I mean, digital health itself was still relatively early; it's gotten a lot better in the last few years, and it's really rolling now.
344
00:54:05,000 --> 00:54:19,000
In 2015, I can tell you, it was tiny. Besides the Epics and Meditechs of the world, there were no other major healthcare IT success stories, not many at the time.
345
00:54:19,000 --> 00:54:32,000
We had fairly limited traction. We didn't have all that many case studies or anything, so even back then most people were investing in the vision, the team, and the mission.
346
00:54:32,000 --> 00:54:35,000
People don't want to invest in that.
347
00:54:35,000 --> 00:54:52,000
They want traction, they want data. You know, what made you successful? Because the idea, to me, isn't particularly novel, which means it's the team, the execution, the grit, the perseverance that made it a success.
348
00:54:52,000 --> 00:55:01,000
Do you think that's an accurate representation? And why do you think others failed at doing this, or why did your biggest competitors fail?
349
00:55:01,000 --> 00:55:11,000
I think there are a couple of things, and part of it is how we've chosen to differentiate. So when people ask me, well, what makes SeamlessMD different?
350
00:55:11,000 --> 00:55:19,000
You know, investors want to hear about companies that have some unique moat, maybe a network effect or something else like that.
351
00:55:19,000 --> 00:55:31,000
And the differentiation for the company is the same whether I'm telling investors or customers, which is: we choose to differentiate in ways that minimize friction to adoption. Because, as you know very well, in healthcare,
352
00:55:31,000 --> 00:55:41,000
it's easy to build the technology; it's very hard to get adoption of new technology. And so for us, for example, it was investing early on in things like integrations with the major EHRs.
353
00:55:41,000 --> 00:55:54,000
It was helping our customer partners continue to build out the largest data set of evidence in the industry, to give clinicians confidence that this can work in the healthcare setting.
354
00:55:54,000 --> 00:56:01,000
It's finding better and better best practices for training staff, and for getting patients and families to adopt this novel technology.
355
00:56:01,000 --> 00:56:11,000
And none of that is the sexy network-effect moat that an investor cares about. It was all about how do we get better adoption and success with our clinical partners.
356
00:56:11,000 --> 00:56:18,000
That's what our customers value. They don't care about network effects; they want stuff that works, that patients use, that drives outcomes.
357
00:56:18,000 --> 00:56:35,000
And so I think we really focused on how do we deliver results and minimize friction to adoption. That's what stood out, whereas a lot of our competitors were focused on, oh, we're going to integrate our platform with Uber, or we're going to do some fancy AI that doesn't really improve
358
00:56:35,000 --> 00:56:44,000
anything for the patient outcome, but we want to say that we have AI. Who cares? I don't care about that stuff; I want to actually drive value and make adoption easier.
359
00:56:44,000 --> 00:56:59,000
And that makes a big difference. Our focus is very clear, and I think that shows in our team construction: about a quarter of our team actually comes from a clinical background of some sort, whether from pharmacy, or they're a dietitian or a health
360
00:56:59,000 --> 00:57:01,000
communication specialist.
361
00:57:01,000 --> 00:57:09,000
If you look at our typical competitors historically, it was mostly engineers and business people, with very few clinical people involved.
362
00:57:09,000 --> 00:57:15,000
That shows in how they interact with customers and providers. There's a disconnect there.
363
00:57:15,000 --> 00:57:20,000
That is an invaluable tip, and I completely agree with that.
364
00:57:20,000 --> 00:57:30,000
Josh, would you rather have a life full of many successes, or chase outrageous goals with massive failures?
365
00:57:30,000 --> 00:57:46,000
You know, I think for me those two are not really mutually exclusive. For example, a lot of the goals we have for building SeamlessMD are in some ways outrageous; even some things that we've done now felt outrageous back then.
366
00:57:46,000 --> 00:57:57,000
But along the way we've had both many successes and many failures throughout our journey. And so for me now, it's about: let's go for really ambitious goals,
367
00:57:57,000 --> 00:58:04,000
but we have to celebrate the many successes along the way.
368
00:58:04,000 --> 00:58:06,000
Yeah, that's what's worked out well for us.
369
00:58:06,000 --> 00:58:16,000
What's the end goal for you, Josh? Are you living life in the now, being mindful of everything?
370
00:58:16,000 --> 00:58:19,000
Are you still chasing your next goal?
371
00:58:19,000 --> 00:58:26,000
And what does 90-year-old Josh want to have accomplished in this life?
372
00:58:26,000 --> 00:58:35,000
This topic reminds me of a great talk that I saw from a gentleman named Jordan Banks. Jordan Banks used to be the managing director of Facebook Canada.
373
00:58:35,000 --> 00:58:47,000
He had this great slide in one of his talks about career journeys. And he talked about how, when he looked back 10 years at the time,
374
00:58:47,000 --> 00:58:58,000
if you had asked him 10 years before, hey, what do you think you'd be doing 10 years from now, it was, I think, something like: oh, I'm going to be a lawyer or something like that.
375
00:58:58,000 --> 00:59:08,000
What happened in reality is he zigzagged: from, I think it was law, then he went to work for, it was eBay or something, then Facebook Canada, with no plans to go into technology, right?
376
00:59:08,000 --> 00:59:21,000
But he kept himself open-minded and curious and accidentally found his way into a career that he absolutely loved. And so his whole point, which I agree with, is that humans are very bad at predicting the future for ourselves.
377
00:59:21,000 --> 00:59:36,000
And so for me, I stopped trying to think about five- or 10-year goals and horizons, and I try to focus more on what I'm curious about, what excites me, and living in that moment, knowing that as long as I'm open-minded about great opportunities,
378
00:59:36,000 --> 00:59:47,000
I'm going to end up in a great place, even though I have no idea today what that's going to look like. So, for example, when I got into medical school, did we plan to start SeamlessMD?
379
00:59:47,000 --> 00:59:54,000
No, right? I thought maybe I'd do something entrepreneurial 10 or 20 years down the line, but this was never planned.
380
00:59:54,000 --> 01:00:01,000
And so when I think about success in the future, I don't know what that holds. I do know that what matters to me is impact.
381
01:00:01,000 --> 01:00:18,000
And so when I think about SeamlessMD, for example, our biggest goals are around how do we get this into more clinical areas, to more health systems, into the hands of more patients, and how do we support not only acute care,
382
01:00:18,000 --> 01:00:28,000
but at some point primary care and community care, the whole continuum. And so for us, our goals are really centered around how do we accelerate impact.
383
01:00:28,000 --> 01:00:37,000
And if we do those things, I think a lot of great outcomes will come along the way, whether financial or clinical or otherwise.
384
01:00:37,000 --> 01:00:45,000
But yeah, I don't plan in 10-year horizons. I can't see that far into the future; I'm just not that good. So, I think that's a good answer.
385
01:00:45,000 --> 01:00:55,000
And too many of us get caught up in creating goals, and then trying to chase them for the wrong reasons.
386
01:00:55,000 --> 01:01:09,000
And even as a company, I think sometimes we set goals and then realize it wasn't really a good goal to set. It's okay to reset, get a new perspective on it, and change the goal.
387
01:01:09,000 --> 01:01:19,000
I mean, you don't want to sandbag and change the goal because you didn't perform well, but it's more the big-picture goals. Sometimes they're wrong, and we should change them.
388
01:01:19,000 --> 01:01:27,000
Do you think healthcare is a good fit for the traditional venture model, where they're looking for an exit in maximum 10 years?
389
01:01:27,000 --> 01:01:35,000
Or do you think venture should extend their timeline or fund horizon to 20 years to fit healthcare?
390
01:01:35,000 --> 01:01:49,000
Yeah, so the way I look at it now is, well, let's back up. First of all, most companies should not be venture-backed, in healthcare or not. I think we've seen over the last several years that there are probably too many
391
01:01:49,000 --> 01:02:02,000
companies that got venture funding that frankly should never have been venture-backed in the first place. And now we're seeing the great reset in venture capital, which is probably healthy, and it was important that it happened.
392
01:02:02,000 --> 01:02:20,000
And that's probably emphasized even more in healthcare. Given how healthcare works, there are probably even fewer companies that should be venture-backed in healthcare compared to, let's say, the general technology market.
393
01:02:20,000 --> 01:02:36,000
I mean, if you look at the biggest financial successes as companies, the biggest companies by far are for the most part not, let's call it, healthcare technology companies. Your biggest health technology company, let's say in
394
01:02:36,000 --> 01:02:40,000
software, you could say is Epic or something, right?
395
01:02:40,000 --> 01:02:53,000
And if you compare that to the size of a Microsoft or a Google or an Apple, it's not even close, right? It's never going to be as big as those. So if you think about it, if you were going to go into venture, healthcare in general is probably not the industry you want
396
01:02:53,000 --> 01:02:54,000
to be in.
397
01:02:54,000 --> 01:03:10,000
But there are clearly some investors who just want to have an impact in healthcare, because they love the industry, they love the impact. In the same way, I tell founders: hey, if your goal is to get the best financial outcome for yourself,
398
01:03:10,000 --> 01:03:19,000
you should never go into healthcare; you're more likely to build a much bigger financial success with a company outside healthcare than in healthcare.
399
01:03:19,000 --> 01:03:24,000
So, whether you're investing in healthcare or trying to build a company in healthcare,
400
01:03:24,000 --> 01:03:31,000
if your primary goal is to get the best financial return, you just shouldn't be in healthcare. It makes no sense, as an investor or an entrepreneur.
401
01:03:31,000 --> 01:03:40,000
Part of the reason you're doing it is because you care about the industry or are passionate about it. And so you just have to incorporate that thinking into your decision-making.
402
01:03:40,000 --> 01:03:50,000
That's not to say there aren't venture-backable things in healthcare; it's just that there are probably a lot fewer of them, and even the potential exit sizes will never be as great as, you know, consumer markets or something else.
403
01:03:50,000 --> 01:03:52,000
Yeah, I completely agree.
404
01:03:52,000 --> 01:03:57,000
The reasons I stick to healthcare for investing are: A, I have industry expertise;
405
01:03:57,000 --> 01:04:05,000
B, I can actively help my founders with contracts and sales and give them whatever experience I have;
406
01:04:05,000 --> 01:04:13,000
and C, I want to make an impact. I'll plug a book here: The Psychology of Money by Morgan Housel. I think it's a book everyone should read.
407
01:04:13,000 --> 01:04:25,000
It talks about how your financial decisions are not completely based on maximizing returns and being rational; emotion plays a big part in them, for better or worse.
408
01:04:25,000 --> 01:04:28,000
It's not worth fighting that part of it.
409
01:04:28,000 --> 01:04:34,000
It's better to embrace it and then realize that we make emotional decisions and accept that.
410
01:04:34,000 --> 01:04:43,000
This has been a great pleasure, Josh. I thoroughly enjoyed this conversation, and I hope we can have part two someday.
411
01:04:43,000 --> 01:04:45,000
You got it.
412
01:04:45,000 --> 01:04:49,000
Rashad, thank you so much. I had a great time on your show. Thank you for the opportunity.
413
01:04:49,000 --> 01:04:52,000
Where can people find you, Josh?
414
01:04:52,000 --> 01:04:57,000
Yeah, you can find me on Twitter at @JoshuaPLiu.
415
01:04:57,000 --> 01:05:07,000
Awesome. Thank you.