Dec. 1, 2022
How far are we from general AI in healthcare? Amit Garg - Tau Ventures

Amit is a founding partner at Tau Ventures, an $85 million fund which invests in early-stage healthcare and enterprise software startups with an AI focus.
00:00 Introduction
02:50 Future of the fund
04:15 How are you thinking about allocation?
08:15 Time to exit
09:10 Secondary market
12:15 Healthcare and profits
15:25 Value based care
18:25 One piece of advice for your past self
23:00 What tailwinds are you investing on?
28:45 AI and biotech
31:15 Will AI replace us?
40:00 What would you do if money was not an issue
44:00 What makes you resilient?
1
00:00:00,000 --> 00:00:02,960
Amit, thanks so much for being here.
2
00:00:02,960 --> 00:00:08,240
You have been very generous with your time with me and I'm honored for that.
3
00:00:08,240 --> 00:00:11,560
Thanks for taking the time to come to this podcast today.
4
00:00:11,560 --> 00:00:15,640
I think to start, if you can give us a brief introduction and then we can get into it.
5
00:00:15,640 --> 00:00:18,760
Well, Rishad, thank you for the kind words.
6
00:00:18,760 --> 00:00:20,040
I'd say the same words back to you.
7
00:00:20,040 --> 00:00:25,520
It's been a pleasure getting to know you and collaborating and at some point we will do
8
00:00:25,520 --> 00:00:26,520
more deals together.
9
00:00:26,520 --> 00:00:30,920
But for today, I'm just excited to be here and have a conversation.
10
00:00:30,920 --> 00:00:35,480
My background for those listening in, for those watching, I am a venture capitalist
11
00:00:35,480 --> 00:00:36,920
in Silicon Valley.
12
00:00:36,920 --> 00:00:39,600
I run a fund called Tau Ventures.
13
00:00:39,600 --> 00:00:45,320
As of recording this, we have 85 million total that we're investing primarily at the seed
14
00:00:45,320 --> 00:00:48,680
stage writing typically $500,000 checks.
15
00:00:48,680 --> 00:00:50,800
I focus on the healthcare side.
16
00:00:50,800 --> 00:00:56,000
My co-founder and partner Sanjay focuses on enterprise and everything we invest in, we
17
00:00:56,000 --> 00:00:59,320
look for AI, artificial intelligence.
18
00:00:59,320 --> 00:01:03,480
Companies we have invested in include Iterative Health, which is computer vision for colon
19
00:01:03,480 --> 00:01:04,480
cancer.
20
00:01:04,480 --> 00:01:06,480
Now it does a lot more than that.
21
00:01:06,480 --> 00:01:10,400
We have companies that do machine learning for drug discovery, companies that help with
22
00:01:10,400 --> 00:01:11,400
pre-auth.
23
00:01:11,400 --> 00:01:18,960
In general, what we're looking for is how can AI really empower both folks in healthcare
24
00:01:18,960 --> 00:01:22,960
and technology and the intersection to make a really big difference?
25
00:01:22,960 --> 00:01:25,040
We have so many problems in healthcare.
26
00:01:25,040 --> 00:01:29,640
I know Rishad, you're based in Canada, but I'm going to pick on the US here.
27
00:01:29,640 --> 00:01:33,880
In the US, we spend just over 18% of our GDP on healthcare.
28
00:01:33,880 --> 00:01:38,280
We have worse outcomes than countries that are comparable to the US.
29
00:01:38,280 --> 00:01:45,160
We honestly have created a very tragic situation in the US where we have both the best treatments
30
00:01:45,160 --> 00:01:50,160
and the best doctors in the world, but the costs are just out of hand.
31
00:01:50,160 --> 00:01:52,400
The waits are out of hand.
32
00:01:52,400 --> 00:01:53,400
There's a lot to be done.
33
00:01:53,400 --> 00:01:57,320
I think technology is in many ways a very powerful tool.
34
00:01:57,320 --> 00:01:59,760
It's not the only tool to make a difference.
35
00:01:59,760 --> 00:02:03,800
Before all of this, I co-founded a startup called Health IQ.
36
00:02:03,800 --> 00:02:08,320
Publicly, you'll see that the company raised $140 million.
37
00:02:08,320 --> 00:02:12,200
Before that, I worked at a couple of other VC funds, started my career at Google.
38
00:02:12,200 --> 00:02:16,960
This was pre-IPO days at Google, and I guess I'm dating myself.
39
00:02:16,960 --> 00:02:21,360
I was a product manager there, learned a whole bunch and very grateful for my experience
40
00:02:21,360 --> 00:02:22,520
there.
41
00:02:22,520 --> 00:02:25,400
My training is in computer science and biology.
42
00:02:25,400 --> 00:02:28,080
My master's is at the intersection of those two.
43
00:02:28,080 --> 00:02:29,920
Then I also went to business school.
44
00:02:29,920 --> 00:02:36,200
I'm trying to bring all of this, my life experiences, having worked in a big company, having started
45
00:02:36,200 --> 00:02:40,040
companies, having worked at VC funds, and trying to bring it all together here at Tau
46
00:02:40,040 --> 00:02:46,520
Ventures so that we can truly, truly help our entrepreneurs succeed, make an impact,
47
00:02:46,520 --> 00:02:47,520
and make money.
48
00:02:47,520 --> 00:02:50,680
I believe in the intersection of all of those.
49
00:02:50,680 --> 00:02:51,680
Perfect.
50
00:02:51,680 --> 00:02:52,680
Thanks for that introduction.
51
00:02:52,680 --> 00:02:59,160
I told Sametra when I was talking to him that you're going to have more than a billion under
52
00:02:59,160 --> 00:03:00,760
management in five years.
53
00:03:00,760 --> 00:03:02,640
Wow, you are very kind.
54
00:03:02,640 --> 00:03:05,400
I don't think we will, to be honest.
55
00:03:05,400 --> 00:03:12,840
Not trying to be falsely modest here, but raising funds takes time and you can't run
56
00:03:12,840 --> 00:03:14,280
before you walk.
57
00:03:14,280 --> 00:03:18,320
I don't think it's even the right thing for us to get to a billion within five years.
58
00:03:18,320 --> 00:03:23,160
I think more likely we'll raise another fund and then another fund every two or three years.
59
00:03:23,160 --> 00:03:24,560
That's the norm.
60
00:03:24,560 --> 00:03:27,400
You roughly maybe double in size.
61
00:03:27,400 --> 00:03:32,400
There's exceptions and there's reasons to do something different.
62
00:03:32,400 --> 00:03:39,160
If you follow the trajectory here, the biggest expectation is that we will grow in size
63
00:03:39,160 --> 00:03:40,160
for sure.
64
00:03:40,160 --> 00:03:43,680
I always want to be an early stage fund.
65
00:03:43,680 --> 00:03:48,560
At least that's what we're thinking right now: our differentiator is in how we help
66
00:03:48,560 --> 00:03:51,440
entrepreneurs build companies. At the late stage,
67
00:03:51,440 --> 00:03:57,000
it's a lot more about financial modeling, and there's a lot of value in that too for sure,
68
00:03:57,000 --> 00:03:59,680
but it's not what we are focused on.
69
00:03:59,680 --> 00:04:05,040
It's also much easier to do a 10x on a $500 million fund than on a $5 billion fund.
70
00:04:05,040 --> 00:04:11,400
There's practical reasons to also keep your size within a certain range.
71
00:04:11,400 --> 00:04:17,280
Let's talk about when you're thinking of allocating this $85 million, how do you manage risk and
72
00:04:17,280 --> 00:04:18,280
reward?
73
00:04:18,280 --> 00:04:23,320
Are you looking for say a 10x return on every single deal you put into?
74
00:04:23,320 --> 00:04:28,680
Are you okay with maybe a 1000x return, but a 5% chance of that on some deals?
75
00:04:28,680 --> 00:04:34,840
How are you thinking about that risk reward to eventually return back, as you said, 10x
76
00:04:34,840 --> 00:04:36,880
on the whole $85 million?
77
00:04:36,880 --> 00:04:40,200
10x by the way is very ambitious.
78
00:04:40,200 --> 00:04:47,080
You said 10x, but if you look at the data, overwhelmingly good funds do 3x and exceptionally
79
00:04:47,080 --> 00:04:49,320
good funds do 5x or higher.
80
00:04:49,320 --> 00:04:56,680
So there is a long tail of distributions for sure, but what we're hoping for is 3x at least
81
00:04:56,680 --> 00:04:58,280
and 5x ambitiously.
82
00:04:58,280 --> 00:05:02,360
Anything over that I'll be extremely, extremely happy about.
83
00:05:02,360 --> 00:05:08,280
I can share today our fund is at 2.5x in just over two years.
84
00:05:08,280 --> 00:05:09,780
That's the first fund.
85
00:05:09,780 --> 00:05:11,600
So we seem to be on track.
86
00:05:11,600 --> 00:05:14,480
Now the law of small numbers helps me.
87
00:05:14,480 --> 00:05:20,560
As I was talking earlier, if you have a smaller fund, it's easier to actually get outsized
88
00:05:20,560 --> 00:05:21,560
returns.
89
00:05:21,560 --> 00:05:24,320
You have more flexibility when you buy and sell.
90
00:05:24,320 --> 00:05:30,560
You're also hungry, and you only need a few exits in order for the needle to really
91
00:05:30,560 --> 00:05:31,560
move.
92
00:05:31,560 --> 00:05:38,500
So for all those three reasons, having a manageable fund size is a good thing.
93
00:05:38,500 --> 00:05:41,680
So when we look at deals, that's partly what we look at also.
94
00:05:41,680 --> 00:05:45,860
We're investing primarily at the seed stage and specifically late seed.
95
00:05:45,860 --> 00:05:49,880
So we're looking for companies that are a little bit more than two people in a garage.
96
00:05:49,880 --> 00:05:52,600
They typically have a pipeline of customers.
97
00:05:52,600 --> 00:05:54,320
That's the key thing to look for.
98
00:05:54,320 --> 00:05:55,800
If you have pilots, wonderful.
99
00:05:55,800 --> 00:05:58,080
If you have paid pilots, even better.
100
00:05:58,080 --> 00:06:02,360
So if you're making money, revenue is amazing, but that's not our expectation.
101
00:06:02,360 --> 00:06:07,340
Our expectation is that you have a roster of potential clients and that you're able
102
00:06:07,340 --> 00:06:12,040
to close on them and get to recurring contracts and eventually to a Series A within nine to
103
00:06:12,040 --> 00:06:13,040
18 months.
104
00:06:13,040 --> 00:06:16,920
A Series A is when you have product-market fit, here in the US at least.
105
00:06:16,920 --> 00:06:19,400
A million in ARR is kind of what people look for.
106
00:06:19,400 --> 00:06:24,080
So we'll look for, can this company get there?
107
00:06:24,080 --> 00:06:27,120
And can we really help them get there?
108
00:06:27,120 --> 00:06:30,760
And can this company have an explosive growth?
109
00:06:30,760 --> 00:06:35,620
So what you want is not just a company that has good revenues and good profitability,
110
00:06:35,620 --> 00:06:38,120
but also in what time horizon it achieves that.
111
00:06:38,120 --> 00:06:42,960
If a company is raising a seed and they have been around for 10 years, it's not a good
112
00:06:42,960 --> 00:06:43,960
fit for me.
113
00:06:43,960 --> 00:06:46,720
They may be a very good company, but it's not what I'm looking for.
114
00:06:46,720 --> 00:06:52,880
I'm looking for perhaps a Silicon Valley mold of companies where you're raising every couple
115
00:06:52,880 --> 00:06:54,160
of years.
116
00:06:54,160 --> 00:07:01,400
You are ideally doubling every year in terms of your revenues or very soon in your ARR,
117
00:07:01,400 --> 00:07:03,160
annual recurring revenue.
118
00:07:03,160 --> 00:07:06,760
But we don't expect every company to hit 10X, obviously.
119
00:07:06,760 --> 00:07:12,320
If every company hit 10X, then there would be no need for venture capitalists, I should
120
00:07:12,320 --> 00:07:13,320
say.
121
00:07:13,320 --> 00:07:15,560
But we look for the odds that a company could get there.
122
00:07:15,560 --> 00:07:22,240
So in our portfolio construction, we do hope and expect, and so far are seeing this, that
123
00:07:22,240 --> 00:07:25,680
about 10% of the companies will actually do 10X or better.
124
00:07:25,680 --> 00:07:33,080
And then maybe 50% or 60% will do somewhere between three or four X and five and six X.
125
00:07:33,080 --> 00:07:39,080
Some companies may do one or two X, and you may have some companies that lose money.
126
00:07:39,080 --> 00:07:40,080
It's possible.
127
00:07:40,080 --> 00:07:43,680
And in our portfolio construction, we said maybe 10% of the companies will actually end
128
00:07:43,680 --> 00:07:44,680
up making less.
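For concreteness, here is a rough back-of-the-envelope sketch of how a distribution like that rolls up into an overall fund multiple. The buckets and multiples below are illustrative assumptions for the example, not the fund's actual model.

# Illustrative only: assumed outcome buckets loosely based on the percentages above.
buckets = [
    (0.10, 12.0),  # ~10% of companies return 10x or better (assume 12x on average)
    (0.55, 4.5),   # ~50-60% land between 3-4x and 5-6x (assume 4.5x)
    (0.25, 1.5),   # some do one or two x (assume 25% at 1.5x)
    (0.10, 0.3),   # ~10% end up returning less than invested (assume 0.3x)
]

# Gross multiple on invested capital, assuming equal check sizes per company.
gross_multiple = sum(weight * multiple for weight, multiple in buckets)
print(f"Gross multiple on invested capital: {gross_multiple:.2f}x")  # ~4.1x

Under these assumed numbers the portfolio lands around 4x gross, within the 3x-to-5x range mentioned earlier, before fees and reserves.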
129
00:07:44,680 --> 00:07:50,680
I'm happy to say that so far we have way over indexed on the successes.
130
00:07:50,680 --> 00:07:55,520
And once again, that's in some ways a function of having a small fund and having the flexibility,
131
00:07:55,520 --> 00:08:00,700
how we play pro rata, when we play pro rata, how we help other portfolio companies in terms
132
00:08:00,700 --> 00:08:05,480
of getting customer traction and in terms of getting investor traction and then helping
133
00:08:05,480 --> 00:08:06,480
them with exits.
134
00:08:06,480 --> 00:08:07,720
I'm very proud to say this.
135
00:08:07,720 --> 00:08:11,780
We have had four exits so far, and the first fund is just about two years old, just over
136
00:08:11,780 --> 00:08:13,240
two years old.
137
00:08:13,240 --> 00:08:14,240
Congratulations.
138
00:08:14,240 --> 00:08:15,240
That is impressive.
139
00:08:15,240 --> 00:08:20,440
I was reading some statistics on AngelList, and the average time to exit after VC money
140
00:08:20,440 --> 00:08:22,440
was 5.6 years.
141
00:08:22,440 --> 00:08:26,200
So you guys are doing better than half that.
142
00:08:26,200 --> 00:08:27,200
So it depends.
143
00:08:27,200 --> 00:08:31,920
First of all, the 5.6, I wasn't familiar with that figure, I've heard higher figures than
144
00:08:31,920 --> 00:08:32,920
that.
145
00:08:32,920 --> 00:08:35,060
Different industries have different time horizons.
146
00:08:35,060 --> 00:08:37,560
When you invest also has a different time horizon.
147
00:08:37,560 --> 00:08:42,120
Like if you're investing at a series C, well, you're probably looking more like a three
148
00:08:42,120 --> 00:08:47,200
or four X and probably within three or four years rather than a 10X in 10 years when you
149
00:08:47,200 --> 00:08:48,640
invest at the seed stage, right?
150
00:08:48,640 --> 00:08:51,240
It's risk reward based on time.
151
00:08:51,240 --> 00:08:57,160
But a couple of things have been beneficial to us: we have had M&As, and in some cases
152
00:08:57,160 --> 00:09:01,360
the companies that got acquired, the acquirer went on to IPO, and we have stock in
153
00:09:01,360 --> 00:09:03,000
the acquirer also.
154
00:09:03,000 --> 00:09:08,840
But the other instrument that we have besides IPO and M&A is to do secondaries.
155
00:09:08,840 --> 00:09:14,880
So I'm open to selling my shares to somebody else, and I'm open to buying shares from somebody
156
00:09:14,880 --> 00:09:15,880
else.
157
00:09:15,880 --> 00:09:19,640
I've actually bought a lot more so far than I've sold.
158
00:09:19,640 --> 00:09:25,880
I've actually bought four or five times more so far from angel investors, from other VC
159
00:09:25,880 --> 00:09:29,360
investors, and I'm open to selling to somebody else.
160
00:09:29,360 --> 00:09:34,160
I have only done it once so far, but in the near future, I'll do more of it.
161
00:09:34,160 --> 00:09:36,600
And that's what's called a secondary.
162
00:09:36,600 --> 00:09:41,840
So the advantage of a secondary is that you can recognize exits a little bit earlier and
163
00:09:41,840 --> 00:09:46,080
return money to your LPs, your own investors a little bit earlier.
164
00:09:46,080 --> 00:09:50,080
Now how and when you do that, there's art and science to it.
165
00:09:50,080 --> 00:09:55,440
And how much you sell is also obviously there's a lot of art and science to it.
166
00:09:55,440 --> 00:10:00,160
So inherently, I would think if your company is doing well, you would want to buy as much
167
00:10:00,160 --> 00:10:01,160
as you can.
168
00:10:01,160 --> 00:10:02,160
When?
169
00:10:02,160 --> 00:10:03,160
Not necessarily.
170
00:10:03,160 --> 00:10:04,280
Not necessarily.
171
00:10:04,280 --> 00:10:07,200
We are big believers in supporting our companies.
172
00:10:07,200 --> 00:10:11,800
But mind you, we are at the moment at $85 million AUM.
173
00:10:11,800 --> 00:10:16,120
So if my company is already worth a billion, even if I put in 2 million, 3 million, which
174
00:10:16,120 --> 00:10:19,960
for me is a big check right now, it doesn't move the needle that much.
175
00:10:19,960 --> 00:10:23,120
It's not the kind of capital that that company is looking for.
176
00:10:23,120 --> 00:10:27,000
And it may also not be the amount of risk reward that I want.
177
00:10:27,000 --> 00:10:34,040
So when I decide to play my pro rata, and pro rata is just a fancy term that means investing
178
00:10:34,040 --> 00:10:38,980
enough to maintain your ownership in the next round, we like doing that.
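As a rough illustration of what playing pro rata means in practice, here is a small sketch with purely assumed numbers (the ownership, round size, and valuation are made up for the example).

# Illustrative pro-rata math with assumed numbers, not a real deal.
ownership = 0.10              # assume the fund owns 10% after the seed round
new_round_size = 10_000_000   # assume a $10M Series A
post_money = 50_000_000       # assume a $50M post-money valuation

# The new round sells 20% of the company, which would dilute a 10% stake to 8%.
diluted_ownership = ownership * (1 - new_round_size / post_money)

# Buying 10% of the new round keeps ownership at 10%: that is the pro-rata check.
pro_rata_check = ownership * new_round_size

print(f"Ownership if the fund sits out: {diluted_ownership:.1%}")   # 8.0%
print(f"Pro-rata check to stay at 10%: ${pro_rata_check:,.0f}")     # $1,000,000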
179
00:10:38,980 --> 00:10:43,040
But sometimes we don't, even if the company is doing well, because there's enough interest
180
00:10:43,040 --> 00:10:47,160
around the table and we want to make sure really good investors come in.
181
00:10:47,160 --> 00:10:53,000
Or because the company is already valued so highly that the opportunity cost for me is
182
00:10:53,000 --> 00:10:54,000
too high.
183
00:10:54,000 --> 00:10:56,960
I may say, look, I could put more money here, but I could also put it in a company that's
184
00:10:56,960 --> 00:11:02,220
worth one-tenth the valuation where I may have a higher risk reward.
185
00:11:02,220 --> 00:11:07,200
So there's many motivations and many things you have to consider, not just when you make
186
00:11:07,200 --> 00:11:09,680
the investment, but when you do follow-ons.
187
00:11:09,680 --> 00:11:10,680
That makes sense.
188
00:11:10,680 --> 00:11:16,840
Amit, you've had a window into healthcare in various different countries.
189
00:11:16,840 --> 00:11:22,760
You grew up in Brazil, you've invested in India, and obviously North America as well.
190
00:11:22,760 --> 00:11:29,080
The question I want to ask is, there's a theory that I've been playing around with that primary
191
00:11:29,080 --> 00:11:35,280
care and first access to healthcare is where profits should lie, and acute care, chronic
192
00:11:35,280 --> 00:11:41,240
care to an extent, and cancer care likely should not be where profits should lie to
193
00:11:41,240 --> 00:11:43,760
build the best model of healthcare.
194
00:11:43,760 --> 00:11:50,100
I think we still haven't figured out healthcare in terms of pricing and reimbursement anywhere.
195
00:11:50,100 --> 00:11:58,160
How do you look at it: if you were to design your health system, would it be profitable or not,
196
00:11:58,160 --> 00:12:01,320
and if so, where would most of the profits be derived from?
197
00:12:01,320 --> 00:12:05,960
Yeah, no, that's a very simple and very hard question.
198
00:12:05,960 --> 00:12:09,960
Healthcare has obviously a moral component to it.
199
00:12:09,960 --> 00:12:16,160
I think many people, perhaps you included, Rishad, you're a doctor, would agree that
200
00:12:16,160 --> 00:12:21,040
you have to provide some kind of baseline of care for everyone.
201
00:12:21,040 --> 00:12:23,000
I certainly believe in that.
202
00:12:23,000 --> 00:12:29,360
But at the same time, I do see the benefits of having a profit motive because that ensures
203
00:12:29,360 --> 00:12:35,980
accountability, that ensures innovation, that ensures alignment of interests in many ways.
204
00:12:35,980 --> 00:12:38,960
It can also create misalignment of interests.
205
00:12:38,960 --> 00:12:39,960
I am a capitalist.
206
00:12:39,960 --> 00:12:44,800
I mean, venture capitalist has two words, and I did go to business school.
207
00:12:44,800 --> 00:12:51,240
I do believe actually that if you do capitalism in the right way, it is the single most powerful
208
00:12:51,240 --> 00:12:56,480
way of impacting our societies, creating jobs.
209
00:12:56,480 --> 00:13:00,120
I do think there's a way to do that in medicine.
210
00:13:00,120 --> 00:13:01,840
It's not easy.
211
00:13:01,840 --> 00:13:05,840
If I were to design a healthcare system from scratch, first of all, I would surround myself
212
00:13:05,840 --> 00:13:08,400
with a lot of good people because I don't know everything.
213
00:13:08,400 --> 00:13:09,400
I don't know enough.
214
00:13:09,400 --> 00:13:13,980
Healthcare is too big for any one person to try to design by themselves.
215
00:13:13,980 --> 00:13:17,800
What I would try to do is align the incentives as much as possible.
216
00:13:17,800 --> 00:13:22,520
There are companies out there, med device companies, pharma companies, where you need
217
00:13:22,520 --> 00:13:24,080
to provide a profit motive.
218
00:13:24,080 --> 00:13:25,080
Absolutely.
219
00:13:25,080 --> 00:13:26,520
Otherwise, there will not be innovation.
220
00:13:26,520 --> 00:13:29,280
There will not be discoveries.
221
00:13:29,280 --> 00:13:37,100
There is a part of healthcare where you're providing care to perhaps disadvantaged populations,
222
00:13:37,100 --> 00:13:43,400
people who are below the poverty line, where it doesn't make sense to necessarily charge
223
00:13:43,400 --> 00:13:45,680
them money.
224
00:13:45,680 --> 00:13:47,680
Perhaps what you do is you create tiers.
225
00:13:47,680 --> 00:13:52,000
Different countries have tried doing that, including here in the US, or Brazil, or India,
226
00:13:52,000 --> 00:13:56,240
or Canada, or UK.
227
00:13:56,240 --> 00:14:01,260
I think that the answer is you have to have a public system that is good, that attracts
228
00:14:01,260 --> 00:14:08,960
the best talent, that does pay well, that does serve everyone, and that perhaps is managed
229
00:14:08,960 --> 00:14:11,040
at a national level.
230
00:14:11,040 --> 00:14:17,120
I know there's pluses and minuses to this, but I do think there's no way around it.
231
00:14:17,120 --> 00:14:22,120
At the same time, you have to have a good private system because there's other things
232
00:14:22,120 --> 00:14:25,960
in healthcare that absolutely need a good private system.
233
00:14:25,960 --> 00:14:28,320
I think you have to attack from both fronts.
234
00:14:28,320 --> 00:14:34,520
If I look at the countries or the states that have done this the best, it's usually a combination.
235
00:14:34,520 --> 00:14:37,760
This requires far more than just policymakers.
236
00:14:37,760 --> 00:14:42,720
It actually requires what I call all the P's, the letter P, in healthcare.
237
00:14:42,720 --> 00:14:48,440
It requires providers, it requires payers, it requires patients, it requires policymakers
238
00:14:48,440 --> 00:14:51,340
or politicians, it requires pharma.
239
00:14:51,340 --> 00:14:55,880
There's other P's out there, but those are the five big ones.
240
00:14:55,880 --> 00:15:01,680
I think that you have to create a regulatory framework that allows people to actually make
241
00:15:01,680 --> 00:15:03,480
the right choices.
242
00:15:03,480 --> 00:15:08,920
If you create frameworks where the interests are misaligned, no matter how good people
243
00:15:08,920 --> 00:15:11,800
are, they will take decisions that are suboptimal.
244
00:15:11,800 --> 00:15:17,000
Something I've been thinking about in value-based care is that the incentives are
245
00:15:17,000 --> 00:15:24,360
often aligned to outcomes, which actually creates perverse processes to obtain those
246
00:15:24,360 --> 00:15:25,360
outcomes.
247
00:15:25,360 --> 00:15:29,560
A weight-based example: if you incentivize a strict BMI, people will starve themselves.
248
00:15:29,560 --> 00:15:33,920
You should incentivize the process or the structures in place instead.
249
00:15:33,920 --> 00:15:39,840
What are your thoughts on value-based care and how to best incentivize the right processes
250
00:15:39,840 --> 00:15:42,560
so the outcomes are achieved that we desire?
251
00:15:42,560 --> 00:15:44,520
You said it better than me, Rishad.
252
00:15:44,520 --> 00:15:49,360
If you pick just one variable and you optimize around that variable, it doesn't necessarily
253
00:15:49,360 --> 00:15:52,160
capture everything.
254
00:15:52,160 --> 00:15:58,840
You could optimize on price, you could optimize on outcomes, you could optimize on the amount
255
00:15:58,840 --> 00:16:02,080
of time it takes for you to get seen.
256
00:16:02,080 --> 00:16:06,840
You could optimize on any one variable and you will not actually optimize for everyone
257
00:16:06,840 --> 00:16:08,280
in every single situation.
258
00:16:08,280 --> 00:16:11,000
That's the problem in healthcare, honestly.
259
00:16:11,000 --> 00:16:13,960
The answer is you can't optimize just on one variable.
260
00:16:13,960 --> 00:16:19,520
I think for better or for worse, outcomes is the least worst metric that I can think
261
00:16:19,520 --> 00:16:21,840
of, but it's not perfect.
262
00:16:21,840 --> 00:16:26,840
You have to temper it exactly as you said by looking at process.
263
00:16:26,840 --> 00:16:29,160
Let me create another situation here.
264
00:16:29,160 --> 00:16:33,520
Doctors that deal, and not just doctors, healthcare providers really, doctors, nurses, physician
265
00:16:33,520 --> 00:16:37,880
assistants, everyone who's involved in the care of a patient, who deal with complicated
266
00:16:37,880 --> 00:16:44,560
cases are presumably going to have worse outcomes than those who are focused on
267
00:16:44,560 --> 00:16:47,840
easier cases, as a percentage.
268
00:16:47,840 --> 00:16:53,160
Doctors who are perceived, and this one I will pick on doctors, who are perceived as nice,
269
00:16:53,160 --> 00:16:59,640
with good bedside manner, get rated higher even if they're not necessarily the best doctors.
270
00:16:59,640 --> 00:17:04,680
The best doctors may actually be a little bit rough around the edges and they may have patients
271
00:17:04,680 --> 00:17:07,600
do things that the patients don't like.
272
00:17:07,600 --> 00:17:11,720
It makes for a little bit of friction.
273
00:17:11,720 --> 00:17:14,560
If you optimize just on patient satisfaction, it's not correct.
274
00:17:14,560 --> 00:17:16,520
If you just optimize on outcome, it's not correct.
275
00:17:16,520 --> 00:17:20,280
If you optimize on keeping costs low, it's not correct.
276
00:17:20,280 --> 00:17:26,320
Unfortunately, you have to pick one of them and focus on it, but not lose sight of everything
277
00:17:26,320 --> 00:17:27,320
else.
278
00:17:27,320 --> 00:17:31,360
I think that's why you need to look at patients holistically and also at healthcare systems
279
00:17:31,360 --> 00:17:32,520
holistically.
280
00:17:32,520 --> 00:17:39,360
I'm fully in favor of healthcare administrators being much more proficient about the challenges
281
00:17:39,360 --> 00:17:45,120
they're dealing with, for providers to be much more proficient about the administrative
282
00:17:45,120 --> 00:17:51,960
challenges, for patients to be very engaged in their own care, for pharma companies to
283
00:17:51,960 --> 00:17:56,320
change their business models, improve their business models, which are honestly quite
284
00:17:56,320 --> 00:18:00,840
crazy right now, like $10 billion or $1 billion to develop a drug.
285
00:18:00,840 --> 00:18:01,840
That's crazy.
286
00:18:01,840 --> 00:18:05,160
What we need is to really listen to each other.
287
00:18:05,160 --> 00:18:09,060
I know this sounds a little bit cliche, but it is really true: listen to each other
288
00:18:09,060 --> 00:18:11,480
to see things from each other's perspective.
289
00:18:11,480 --> 00:18:15,920
I think if there was one other P I would focus on: you said process, and I
290
00:18:15,920 --> 00:18:17,240
would say perspective.
291
00:18:17,240 --> 00:18:23,180
If you could go back in time to yourself 10 years ago, 20 years ago, what is one piece
292
00:18:23,180 --> 00:18:25,600
of advice you would give yourself?
293
00:18:25,600 --> 00:18:30,160
Oh boy, only one?
294
00:18:30,160 --> 00:18:32,000
As many as you want, Amit.
295
00:18:32,000 --> 00:18:37,800
I think this gets a little philosophical, but what you think now and what ends up happening
296
00:18:37,800 --> 00:18:42,080
10 years later are invariably very different things.
297
00:18:42,080 --> 00:18:46,320
Practically every plan I have made in my life didn't work out or didn't work out the way
298
00:18:46,320 --> 00:18:50,880
I thought it would, but having the plan was very important.
299
00:18:50,880 --> 00:18:56,340
If you'd asked me 10 years ago, I could have never predicted that I would be sitting today
300
00:18:56,340 --> 00:18:59,840
running Tau Ventures, doing early-stage investments.
301
00:18:59,840 --> 00:19:06,920
However, if you had asked me what I would like to do in terms of principles, those principles
302
00:19:06,920 --> 00:19:09,480
have stayed very much the same.
303
00:19:09,480 --> 00:19:14,200
I hear this from a lot of people that my principles have stayed constant.
304
00:19:14,200 --> 00:19:20,080
Obviously you evolve over time, but the principle is what you're grounded on.
305
00:19:20,080 --> 00:19:25,240
I've always wanted to make an impact, do good, do well.
306
00:19:25,240 --> 00:19:29,880
I've always believed more specifically that the intersection of life sciences and technology
307
00:19:29,880 --> 00:19:33,640
is an area where I can make a disproportionate impact.
308
00:19:33,640 --> 00:19:40,680
With my engineering mindset and my life sciences mindset, I could bring those two together and
309
00:19:40,680 --> 00:19:42,680
provide the best treatments for everyone.
310
00:19:42,680 --> 00:19:44,000
Make a lot of money in the process.
311
00:19:44,000 --> 00:19:46,760
I don't think those are mutually exclusive, by the way.
312
00:19:46,760 --> 00:19:51,560
I don't think that you have to choose between making money and doing good.
313
00:19:51,560 --> 00:19:55,200
There are ways in which you can actually do both.
314
00:19:55,200 --> 00:19:57,600
That principle has stayed constant for me.
315
00:19:57,600 --> 00:20:02,540
If I were to remind myself 10 years ago, it would be this: everything you're thinking right
316
00:20:02,540 --> 00:20:06,480
now specifically will probably not work like you're thinking.
317
00:20:06,480 --> 00:20:11,440
But the underlying motivations, make sure you keep holding onto those.
318
00:20:11,440 --> 00:20:13,640
I agree with that.
319
00:20:13,640 --> 00:20:18,940
I would add that money is status, and status brings the power to effect change.
320
00:20:18,940 --> 00:20:24,440
Take me back to when you were a child, when you were five years old, 10 years old.
321
00:20:24,440 --> 00:20:27,720
What did you want to become at that point?
322
00:20:27,720 --> 00:20:34,000
That's making an assumption; some people would argue I still have the mindset of a child.
323
00:20:34,000 --> 00:20:40,720
But jokes aside, as far as I can remember, I wanted to do something in healthcare.
324
00:20:40,720 --> 00:20:42,480
This is for many reasons.
325
00:20:42,480 --> 00:20:46,280
I did grow up in a smaller city in Brazil.
326
00:20:46,280 --> 00:20:48,440
I did my high school in a larger city.
327
00:20:48,440 --> 00:20:52,540
I've spent some time in India, which is where my roots are.
328
00:20:52,540 --> 00:20:58,280
When you grow up in emerging countries, you do see lots of social problems.
329
00:20:58,280 --> 00:21:01,600
You see both the best and the worst of humanity in some ways.
330
00:21:01,600 --> 00:21:05,080
I wanted to try fixing some problems.
331
00:21:05,080 --> 00:21:09,200
Growing up, I do remember very specifically people not having access to education and
332
00:21:09,200 --> 00:21:10,800
healthcare.
333
00:21:10,800 --> 00:21:14,080
My parents came from education and they're professors.
334
00:21:14,080 --> 00:21:17,960
I always thought maybe something along those lines or something along the lines of healthcare.
335
00:21:17,960 --> 00:21:23,560
I would gravitate more towards healthcare because I felt that if you don't have good health,
336
00:21:23,560 --> 00:21:25,800
then you can't do anything in life really.
337
00:21:25,800 --> 00:21:32,200
Healthcare is really a foundation of who you are.
338
00:21:32,200 --> 00:21:36,520
If you're not healthy yourself, you can't afford even to go to school in some ways.
339
00:21:36,520 --> 00:21:41,200
It's truly, truly foundational for you as an individual, for a family, and for society
340
00:21:41,200 --> 00:21:42,200
in general.
341
00:21:42,200 --> 00:21:46,320
I'm pretty sure my parents would have said that I wanted to be a doctor.
342
00:21:46,320 --> 00:21:54,320
I'm sure I entertained being an astronaut or being a jet fighter pilot, but those were
343
00:21:54,320 --> 00:21:55,320
whims.
344
00:21:55,320 --> 00:21:56,320
They were not real plans.
345
00:21:56,320 --> 00:22:00,160
I think I truly wanted to be a doctor.
346
00:22:00,160 --> 00:22:02,480
As I grew older, I actually applied to be a doctor.
347
00:22:02,480 --> 00:22:06,520
I got into med school, but I decided not to go.
348
00:22:06,520 --> 00:22:15,320
It wasn't a change in goals in some ways because my ultimate goal wasn't to be a doctor.
349
00:22:15,320 --> 00:22:18,720
My ultimate goal was to make an impact.
350
00:22:18,720 --> 00:22:20,640
My ultimate goal was to fix problems.
351
00:22:20,640 --> 00:22:25,600
My ultimate goal was to create a better life for myself and for other people.
352
00:22:25,600 --> 00:22:28,280
I don't want to come across here as not being self-interested.
353
00:22:28,280 --> 00:22:29,980
I'm very self-interested.
354
00:22:29,980 --> 00:22:31,320
I do want to make money.
355
00:22:31,320 --> 00:22:32,600
I do want to live comfortably.
356
00:22:32,600 --> 00:22:37,880
I do want to have the words you mentioned, power and status, but I don't want those necessarily
357
00:22:37,880 --> 00:22:41,040
at the exclusion of being able to make an impact.
358
00:22:41,040 --> 00:22:45,280
In fact, if anything, I want to use both in service of each other.
359
00:22:45,280 --> 00:22:49,200
In my mind, at least those things are aligned.
360
00:22:49,200 --> 00:22:54,720
My ultimate motivation behind it, that has stayed constant as far as I can remember.
361
00:22:54,720 --> 00:22:55,720
Okay, perfect.
362
00:22:55,720 --> 00:22:58,240
Thanks for sharing that, Amit.
363
00:22:58,240 --> 00:23:04,880
The next question I want to ask is, building on Bill Gross's Idealab study of why timing
364
00:23:04,880 --> 00:23:10,920
is the best predictor of success in a startup, one could argue tailwinds drive a considerable
365
00:23:10,920 --> 00:23:16,080
amount of success, COVID, regulatory changes, wars to an extent.
366
00:23:16,080 --> 00:23:22,680
A tailwind I'm banking on is hybrid at-home care, really growing over the next five years,
367
00:23:22,680 --> 00:23:25,600
including hospital care at home.
368
00:23:25,600 --> 00:23:28,840
What are some tailwinds you're banking on right now?
369
00:23:28,840 --> 00:23:31,760
I'm going to ask you to predict the future.
370
00:23:31,760 --> 00:23:37,440
What are some tailwinds you think will emerge in the next five to 10 years?
371
00:23:37,440 --> 00:23:38,440
Sure.
372
00:23:38,440 --> 00:23:41,520
I'm going to give the example of a company called Iterative Health, which I know you're familiar with
373
00:23:41,520 --> 00:23:42,520
also, Rishad.
374
00:23:42,520 --> 00:23:48,320
It was very coincidentally the very first investment we did through the fund, officially.
375
00:23:48,320 --> 00:23:51,800
They've done phenomenally well.
376
00:23:51,800 --> 00:23:57,960
They've raised more than $200 million, and this is all public, in just over two years.
377
00:23:57,960 --> 00:24:03,360
What the company had started off with was training their algorithm to detect cancer
378
00:24:03,360 --> 00:24:04,600
using computer vision.
379
00:24:04,600 --> 00:24:10,920
More specifically, taking video feeds of a colonoscopy and in real time, being able to
380
00:24:10,920 --> 00:24:16,760
detect if there was a polyp that was cancerous, and helping physicians, specialists, especially
381
00:24:16,760 --> 00:24:22,680
gastroenterologists, be able to detect colon cancer earlier, quicker, and better.
382
00:24:22,680 --> 00:24:28,320
Cancer is, unfortunately, something all of us have been touched
383
00:24:28,320 --> 00:24:33,360
by in some way, shape, or form, and I think this will resonate with all of us: detecting it early enough
384
00:24:33,360 --> 00:24:36,400
can mean the difference between making it or not making it.
385
00:24:36,400 --> 00:24:38,080
Detect early, you save a patient's life.
386
00:24:38,080 --> 00:24:42,040
You detect it too late, maybe you can't do anything about it.
387
00:24:42,040 --> 00:24:47,760
Where I'm going with this is we bet on a company that was able to take massive amounts
388
00:24:47,760 --> 00:24:52,520
of data and make sense of it very quickly, and that would not have been possible even
389
00:24:52,520 --> 00:24:53,520
a few years back.
390
00:24:53,520 --> 00:24:57,360
I dare say even three or four years back, because now we have a lot more computational
391
00:24:57,360 --> 00:24:58,360
power.
392
00:24:58,360 --> 00:24:59,360
We have a lot more data.
393
00:24:59,360 --> 00:25:03,440
We have a lot more understanding of how to analyze this data.
394
00:25:03,440 --> 00:25:05,280
We have algorithms that have been built by others.
395
00:25:05,280 --> 00:25:10,000
We have tech stacks that you can leverage, and I'm giving the example of Iterative
396
00:25:10,000 --> 00:25:13,240
Health, but that's representative of most of our portfolio.
397
00:25:13,240 --> 00:25:15,720
We have found really good folks.
398
00:25:15,720 --> 00:25:17,600
Sometimes they're doctors, sometimes they're technologists.
399
00:25:17,600 --> 00:25:23,120
More often than not, they are partnerships between these people, who are able to take
400
00:25:23,120 --> 00:25:26,200
cutting-edge tools and apply them to solve these problems.
401
00:25:26,200 --> 00:25:28,240
I'll give you another example.
402
00:25:28,240 --> 00:25:37,840
We have a company called Signos, which repurposed a hardware device, a continuous glucose monitor,
403
00:25:37,840 --> 00:25:47,240
that is typically used to track your glucose and hopefully avoid a glucose shortage.
404
00:25:47,240 --> 00:25:54,080
It's often paired with an insulin pump and diabetics use it, but they have repurposed it for obesity.
405
00:25:54,080 --> 00:25:56,880
You can see what's going on inside your body in real time.
406
00:25:56,880 --> 00:26:00,680
As you eat, drink, or exercise, it helps you make better decisions.
407
00:26:00,680 --> 00:26:05,080
If you can measure it, you can monitor it, and if you can monitor it, you can modify
408
00:26:05,080 --> 00:26:08,880
it, your behaviors.
409
00:26:08,880 --> 00:26:15,640
Once again, if you think about it, we bet on a company that has done really well
410
00:26:15,640 --> 00:26:20,080
so far and we continue very much believing in it, that can take massive amounts of data
411
00:26:20,080 --> 00:26:22,800
and try to make sense of it.
412
00:26:22,800 --> 00:26:27,360
No single human being could have made sense of all of this until recently.
413
00:26:27,360 --> 00:26:28,920
You do need machines.
414
00:26:28,920 --> 00:26:34,080
I am a firm believer that machines are not here to replace us.
415
00:26:34,080 --> 00:26:43,760
They're here to help us, to augment what we can do as physicians, as nurses, as investors,
416
00:26:43,760 --> 00:26:45,120
as a society.
417
00:26:45,120 --> 00:26:46,320
It will create jobs.
418
00:26:46,320 --> 00:26:47,680
It will destroy jobs.
419
00:26:47,680 --> 00:26:54,800
Overall, what technology has done throughout human history is make us more efficient,
420
00:26:54,800 --> 00:26:58,840
able to do things that we couldn't do as well, or at all.
421
00:26:58,840 --> 00:27:06,560
I do believe that AI done in a smart way can make our societies incredibly better.
422
00:27:06,560 --> 00:27:12,560
It can lead us, the whole gospel of prosperity, it can lead us to a brighter future.
423
00:27:12,560 --> 00:27:13,940
The key is to do it well.
424
00:27:13,940 --> 00:27:21,280
If you do it in the wrong ways, we can create a lot of imbalances in society.
425
00:27:21,280 --> 00:27:23,480
That is going on to your second question.
426
00:27:23,480 --> 00:27:27,320
What I do expect in the next 10 years, I expect more of this.
427
00:27:27,320 --> 00:27:29,080
I expect a lot more of this.
428
00:27:29,080 --> 00:27:37,960
I expect AI to really make a transformational change in how we develop drugs, how we discover
429
00:27:37,960 --> 00:27:39,480
drugs.
430
00:27:39,480 --> 00:27:45,000
The number of molecules that we know today is a fraction of all the possible molecules
431
00:27:45,000 --> 00:27:49,000
that can exist in the universe.
432
00:27:49,000 --> 00:27:52,120
I've heard the figure 0.01%, by the way.
433
00:27:52,120 --> 00:27:59,360
That's the share of molecules that we know today out of all the possibilities out there.
434
00:27:59,360 --> 00:28:04,240
I'm hinting here at a revolution in biotech, which I would love to do more of.
435
00:28:04,240 --> 00:28:05,880
I'm not there yet.
436
00:28:05,880 --> 00:28:12,000
We invest in digital health, but in the near future, I do expect biotech, with CRISPR
437
00:28:12,000 --> 00:28:19,000
and CAR-T and AI and all of these new technologies that are emerging, to become as fast
438
00:28:19,000 --> 00:28:25,680
in terms of development, more similar to how tech is done today.
439
00:28:25,680 --> 00:28:27,400
Digital health is already there, by the way.
440
00:28:27,400 --> 00:28:32,760
Digital health, two people in a garage can now build digital health companies the same
441
00:28:32,760 --> 00:28:35,680
way that Google and Apple were created.
442
00:28:35,680 --> 00:28:40,160
So digital health already operates very much like tech, and I expect another industry to
443
00:28:40,160 --> 00:28:43,960
start operating that way in the next 10 years.
444
00:28:43,960 --> 00:28:49,920
With biotech in particular, the time to exit on average from what I remember is over eight
445
00:28:49,920 --> 00:28:51,320
to 10 years.
446
00:28:51,320 --> 00:28:56,640
A lot of that is because of the trials that need to happen in a stepwise fashion, waiting
447
00:28:56,640 --> 00:28:58,000
for results to come.
448
00:28:58,000 --> 00:29:01,800
Do you think that process will be expedited through AI as well?
449
00:29:01,800 --> 00:29:02,800
Absolutely.
450
00:29:02,800 --> 00:29:03,800
Absolutely.
451
00:29:03,800 --> 00:29:11,920
I don't have a PhD in biochemistry, so I'm remiss in saying this, but I am operating at the
452
00:29:11,920 --> 00:29:17,240
intersection of AI and life sciences, and that's where I've built my career for the last
453
00:29:17,240 --> 00:29:18,760
20 plus years.
454
00:29:18,760 --> 00:29:26,840
I firmly believe that working in silico allows us to speed things up, not by one or two X,
455
00:29:26,840 --> 00:29:28,880
but by 10 or 100 X.
456
00:29:28,880 --> 00:29:35,680
Think about being able to simulate molecules, to be able to simulate protein-protein bonds,
457
00:29:35,680 --> 00:29:43,600
to be able to simulate how drug-to-drug interactions would be, to be able to simulate how a particular
458
00:29:43,600 --> 00:29:46,080
drug would behave in your body.
459
00:29:46,080 --> 00:29:49,000
Ultimately, you will still need human clinical trials.
460
00:29:49,000 --> 00:29:50,000
Absolutely.
461
00:29:50,000 --> 00:29:51,000
Animal trials.
462
00:29:51,000 --> 00:29:52,000
Absolutely.
463
00:29:52,000 --> 00:30:01,040
But instead of looking at a search space that is the entire universe, you'll be able to hone in.
464
00:30:01,040 --> 00:30:03,480
You'll be able to hone in: these are the things we need to test.
465
00:30:03,480 --> 00:30:05,040
This is how we need to test.
466
00:30:05,040 --> 00:30:07,840
These are the things you have discovered that are not issues, and these are potential things
467
00:30:07,840 --> 00:30:08,840
we need to go deeper in.
468
00:30:08,840 --> 00:30:09,840
Okay.
469
00:30:09,840 --> 00:30:10,840
Yeah.
470
00:30:10,840 --> 00:30:14,720
I think what AI will allow us to do, and is allowing us to do, is to be able to focus
471
00:30:14,720 --> 00:30:16,040
a lot better.
472
00:30:16,040 --> 00:30:18,040
We do have some portfolio companies doing this already.
473
00:30:18,040 --> 00:30:24,240
We have one called ArpeggioBio that's focused on mRNA and helping develop more around mRNA
474
00:30:24,240 --> 00:30:25,240
as a platform.
475
00:30:25,240 --> 00:30:26,880
RNA in general, I should say.
476
00:30:26,880 --> 00:30:34,720
We have another one called Teco.Bio that is focused on how drugs interact with your
477
00:30:34,720 --> 00:30:37,680
blood and with cancer in general.
478
00:30:37,680 --> 00:30:39,640
How do you make sense of all of that?
479
00:30:39,640 --> 00:30:46,520
Because the blood is actually the way your entire body transports substances
480
00:30:46,520 --> 00:30:47,520
around.
481
00:30:47,520 --> 00:30:51,160
Ultimately, it's the blood for the vast majority of things.
482
00:30:51,160 --> 00:30:55,320
How you deliver drugs to the right place and how they interact with each other is absolutely
483
00:30:55,320 --> 00:30:57,560
crucial for how we make sense of cancer.
484
00:30:57,560 --> 00:30:58,560
Yeah.
485
00:30:58,560 --> 00:31:03,960
I think targeted chemo, especially immunotherapy, has just made such big advancements over the
486
00:31:03,960 --> 00:31:04,960
past two decades.
487
00:31:04,960 --> 00:31:10,280
I don't want to get too deep into this because a good chunk of our listeners are not physicians.
488
00:31:10,280 --> 00:31:18,280
I think there's a comfort with AI replacing analytical tasks, but there is a discomfort
489
00:31:18,280 --> 00:31:26,640
with AI replacing creative tasks, especially with DALL-E and what it's doing in the art world.
490
00:31:26,640 --> 00:31:33,760
We seem to protect the artistic side of humanity from AI, and we seem to want to say what makes
491
00:31:33,760 --> 00:31:37,520
us human is our creative endeavors.
492
00:31:37,520 --> 00:31:44,480
What are your thoughts if AI can be more creative than us, if AI can be more human than humans?
493
00:31:44,480 --> 00:31:48,520
Should AI replace creative tasks?
494
00:31:48,520 --> 00:31:54,820
Art and music are probably easier to answer than being a counselor, being a friend, or even
495
00:31:54,820 --> 00:31:56,440
being a parent.
496
00:31:56,440 --> 00:32:00,520
If AI can do it better than us, should we let it?
497
00:32:00,520 --> 00:32:03,260
Once again, very simple and very hard question.
498
00:32:03,260 --> 00:32:06,680
You talk to different folks, they'll have different opinions, and these are folks who
499
00:32:06,680 --> 00:32:07,680
are experts.
500
00:32:07,680 --> 00:32:11,360
You talk to people who have spent their entire lives with AI, and they'll give you different
501
00:32:11,360 --> 00:32:13,240
answers around this.
502
00:32:13,240 --> 00:32:18,440
There's very famous Canadians, or people based in Canada, I should say, Geoffrey Hinton and
503
00:32:18,440 --> 00:32:23,480
Yoshua Bengio, and some of the foremost names in AI are actually based in Canada or working
504
00:32:23,480 --> 00:32:25,200
out of Canada.
505
00:32:25,200 --> 00:32:31,680
But I fall under the school of thought that general AI is really, really far.
506
00:32:31,680 --> 00:32:34,280
It's not going to happen in my lifetime.
507
00:32:34,280 --> 00:32:38,400
An intelligence that is more human than human is very far.
508
00:32:38,400 --> 00:32:43,800
An intelligence that is better than humans in specific tasks, that's already here, and
509
00:32:43,800 --> 00:32:45,260
that's going to expand.
510
00:32:45,260 --> 00:32:48,340
But think about how AI in general learns today.
511
00:32:48,340 --> 00:32:53,920
By and large, it's because we throw a lot of data at it, and we do reinforcement learning,
512
00:32:53,920 --> 00:32:54,920
supervised learning.
513
00:32:54,920 --> 00:33:00,200
There is such a thing as unsupervised learning, but by and large, the inherent intelligence
514
00:33:00,200 --> 00:33:06,360
of the fastest, strongest, biggest computer in the world, which you could argue is the
515
00:33:06,360 --> 00:33:13,320
internet itself, is not bigger than an invertebrate, than a worm.
516
00:33:13,320 --> 00:33:18,520
There are tasks that human babies learn by themselves or by watching that computers can't
517
00:33:18,520 --> 00:33:19,520
do.
518
00:33:19,520 --> 00:33:24,760
Fundamentally, how human brains are structured and learn is very different from how computers
519
00:33:24,760 --> 00:33:25,760
are structured.
520
00:33:25,760 --> 00:33:28,940
It doesn't mean that we need to build computers like humans, by the way.
521
00:33:28,940 --> 00:33:32,560
It leads to a whole philosophical question of what is intelligence, how do you define
522
00:33:32,560 --> 00:33:35,680
it, and can intelligence be replicated in different ways?
523
00:33:35,680 --> 00:33:41,200
I fall under the school of thought that there's not a single path towards intelligence.
524
00:33:41,200 --> 00:33:47,080
Going to your question really more specifically, look, we've been going through revolutions
525
00:33:47,080 --> 00:33:49,160
for thousands of years at this point.
526
00:33:49,160 --> 00:33:54,480
We started off thinking that the Earth was the center of the universe, and then we realized,
527
00:33:54,480 --> 00:33:56,760
no, the Earth orbits a star.
528
00:33:56,760 --> 00:33:59,000
We thought that that star was the center of the universe.
529
00:33:59,000 --> 00:34:02,520
Then we said, no, no, that star is just one in a galaxy.
530
00:34:02,520 --> 00:34:04,960
Then we said that galaxy is the only thing in the universe.
531
00:34:04,960 --> 00:34:09,640
Now we know that galaxy is one of thousands, millions, trillions in the universe.
532
00:34:09,640 --> 00:34:14,560
We have gone from an anthropocentric view of the world to a heliocentric view of the
533
00:34:14,560 --> 00:34:22,040
world to perhaps a pantheistic view of the world.
534
00:34:22,040 --> 00:34:23,680
Why not?
535
00:34:23,680 --> 00:34:30,760
Why can't we also accept that things that we create, AI is a creation of human beings,
536
00:34:30,760 --> 00:34:32,860
can actually create things that we can't?
537
00:34:32,860 --> 00:34:40,440
We already do automation in factories, and those factories will do production at a higher
538
00:34:40,440 --> 00:34:45,520
quality and at higher speed than we do.
539
00:34:45,520 --> 00:34:47,000
Think about cars.
540
00:34:47,000 --> 00:34:50,600
We invented cars, and then we used machines to perfect those cars.
541
00:34:50,600 --> 00:34:54,560
We built machines that built machines that built the machines that built the cars.
542
00:34:54,560 --> 00:34:59,240
We have created layers upon layers upon layers upon layers.
543
00:34:59,240 --> 00:35:07,800
I'm not personally afraid of AI diminishing my humanity if AI can do something better.
544
00:35:07,800 --> 00:35:08,920
If anything, I celebrate that.
545
00:35:08,920 --> 00:35:13,420
We created AI, and AI can do things that I couldn't even dream of.
546
00:35:13,420 --> 00:35:16,160
What I think needs to happen is for us not to lose purpose.
547
00:35:16,160 --> 00:35:17,720
I think that's a different question.
548
00:35:17,720 --> 00:35:22,880
If humans lose their purpose as individuals and as a society and as a civilization, then
549
00:35:22,880 --> 00:35:24,400
yes, then they're screwed.
550
00:35:24,400 --> 00:35:30,280
There is a danger then in recognizing that things that we create are better than us in
551
00:35:30,280 --> 00:35:35,480
so many different ways that we start losing purpose, but it doesn't need to be that way.
552
00:35:35,480 --> 00:35:39,360
I'm totally okay with AI doing things better than me.
553
00:35:39,360 --> 00:35:44,800
Sure, I just don't need to lose sight of purpose in that conversation.
554
00:35:44,800 --> 00:35:46,160
I'm a big fan of you, man.
555
00:35:46,160 --> 00:35:47,920
I wish more people thought like that.
556
00:35:47,920 --> 00:35:50,280
There's a lot of fear in this space.
557
00:35:50,280 --> 00:35:54,580
You said something very interesting that I want to pick at more, that we don't need to
558
00:35:54,580 --> 00:35:57,440
build computers like humans.
559
00:35:57,440 --> 00:36:03,200
As per my understanding of the current regulatory space, there's still a need for us to understand
560
00:36:03,200 --> 00:36:05,480
what the AI is doing.
561
00:36:05,480 --> 00:36:10,280
And I'm not a computer engineer, but there needs to be some transparency over the neural
562
00:36:10,280 --> 00:36:12,640
networks or whatever the learning process is.
563
00:36:12,640 --> 00:36:18,920
At what point do we let go of that need for control and say, if there is correlation between
564
00:36:18,920 --> 00:36:25,960
outcomes or the outcomes the AI is producing are great and better than ours, then we don't
565
00:36:25,960 --> 00:36:30,680
need to understand the process because we might not be able to, and maybe we're limiting
566
00:36:30,680 --> 00:36:36,120
the scope of AI by saying, okay, we need to understand what's going on here on the backend.
567
00:36:36,120 --> 00:36:41,720
Yeah, so you're talking about explainability and that's tough.
568
00:36:41,720 --> 00:36:46,120
Every single one of your questions has been tough because there are no clear-cut answers.
569
00:36:46,120 --> 00:36:47,480
Once again, things evolve.
570
00:36:47,480 --> 00:36:53,080
You have to also be humble enough, and I'm speaking to myself here, that with new data, with
571
00:36:53,080 --> 00:36:57,800
new knowledge, with new expertise, you factor that into your thinking and your thinking
572
00:36:57,800 --> 00:36:58,800
may change.
573
00:36:58,800 --> 00:37:01,680
Once again, thinking will change, will evolve.
574
00:37:01,680 --> 00:37:06,800
I think the underlying motivations and principles, those can stay constant.
575
00:37:06,800 --> 00:37:15,200
So I think explainability is crucial for us to be able to justify, rationalize, make sense
576
00:37:15,200 --> 00:37:16,440
of things.
577
00:37:16,440 --> 00:37:23,520
We're not there in terms of being comfortable with the unexplainable when we have created
578
00:37:23,520 --> 00:37:26,000
the unexplainable.
579
00:37:26,000 --> 00:37:29,720
There's plenty of things that happen in the world that are unexplainable to us.
580
00:37:29,720 --> 00:37:36,480
Life is in many ways, arguably, random.
581
00:37:36,480 --> 00:37:44,440
There's so many variables to track that we don't have the capacity to decide that
582
00:37:44,440 --> 00:37:45,920
this caused this, this caused this.
583
00:37:45,920 --> 00:37:48,240
There's so many different factors at play.
584
00:37:48,240 --> 00:37:49,880
We're comfortable with randomness.
585
00:37:49,880 --> 00:37:55,000
We're comfortable with non-explainability to a certain degree when we are not behind
586
00:37:55,000 --> 00:37:56,760
it.
587
00:37:56,760 --> 00:38:03,680
When we think about, oh, I was born in a rich family or I was born in a poor family.
588
00:38:03,680 --> 00:38:05,960
Well, I can't explain that.
589
00:38:05,960 --> 00:38:07,760
It happened.
590
00:38:07,760 --> 00:38:09,360
We're reasonably comfortable with that.
591
00:38:09,360 --> 00:38:10,760
We may argue about it.
592
00:38:10,760 --> 00:38:11,760
We may debate about it.
593
00:38:11,760 --> 00:38:17,000
We may at the end of the day say, I don't understand how it happened, but we know that
594
00:38:17,000 --> 00:38:18,000
it happens.
595
00:38:18,000 --> 00:38:23,080
And we have over the course of hundreds of generations gotten comfortable with that.
596
00:38:23,080 --> 00:38:27,360
But we're not comfortable when we create that unexplainability.
597
00:38:27,360 --> 00:38:32,720
We're not comfortable with the fact that I created a box and I can't explain what the
598
00:38:32,720 --> 00:38:33,900
box does.
599
00:38:33,900 --> 00:38:38,280
So that is ultimately on us eventually getting comfortable.
600
00:38:38,280 --> 00:38:43,920
If the AI is doing a good job and we have tested it hundreds or millions of times and
601
00:38:43,920 --> 00:38:49,140
it is leading to better patient outcomes, it's making our societies better.
602
00:38:49,140 --> 00:38:50,500
Maybe we should consider it.
603
00:38:50,500 --> 00:38:56,960
Maybe we should consider it that I can't explain how, but I know that this is better for us.
604
00:38:56,960 --> 00:39:00,760
We're not there and I'm not advocating we get there anytime soon.
605
00:39:00,760 --> 00:39:05,800
Right now I think it is very important for us to go from one step to another step to
606
00:39:05,800 --> 00:39:06,800
another step.
607
00:39:06,800 --> 00:39:09,760
As things stand, I'm not comfortable with that.
608
00:39:09,760 --> 00:39:14,240
If I were being treated for something and an AI did something and we couldn't explain
609
00:39:14,240 --> 00:39:15,960
how it did it, I would worry.
610
00:39:15,960 --> 00:39:17,960
I would be grateful, but I would worry.
611
00:39:17,960 --> 00:39:21,180
What if we do follow-ups here?
612
00:39:21,180 --> 00:39:26,160
If something happens down the road and we can't explain what the AI did, then how do I
613
00:39:26,160 --> 00:39:28,520
ensure my future treatment goes well?
614
00:39:28,520 --> 00:39:33,520
I think we do need the regulation and we need to create frameworks.
615
00:39:33,520 --> 00:39:35,040
This will be an evolving conversation.
616
00:39:35,040 --> 00:39:40,280
It will take perhaps far more than your or my lifetime.
617
00:39:40,280 --> 00:39:44,680
I'll ask what I think hopefully is an easier question.
618
00:39:44,680 --> 00:39:51,680
If your LPs came to you tomorrow and said, we don't want any more returns, keep all the
619
00:39:51,680 --> 00:39:54,960
money we've committed, what would you do the day after?
620
00:39:54,960 --> 00:39:58,880
What if that amount was a billion?
621
00:39:58,880 --> 00:40:01,000
Wow.
622
00:40:01,000 --> 00:40:05,400
That is not a situation that will happen.
623
00:40:05,400 --> 00:40:10,640
That I have a billion tomorrow and my LPs say keep it.
624
00:40:10,640 --> 00:40:12,240
I know where you're going with this.
625
00:40:12,240 --> 00:40:15,440
You're asking the hypothetical question: if money wasn't an object, what would you
626
00:40:15,440 --> 00:40:16,440
do?
627
00:40:16,440 --> 00:40:17,440
That's the question you're asking.
628
00:40:17,440 --> 00:40:23,120
If money wasn't an object and you had a lot of money at your disposal, it's not just that
629
00:40:23,120 --> 00:40:26,840
you don't need any more money for whatever you want to do, but you have quite a bit of
630
00:40:26,840 --> 00:40:27,840
capital that-
631
00:40:27,840 --> 00:40:33,400
The second part of your question is an assumption that the world is the same as it is right
632
00:40:33,400 --> 00:40:37,140
now, meaning tomorrow hasn't changed anything.
633
00:40:37,140 --> 00:40:44,080
If tomorrow's world is significantly different than today's world, then I would perhaps take
634
00:40:44,080 --> 00:40:45,840
different actions.
635
00:40:45,840 --> 00:40:50,440
If there's an asteroid hurtling toward the earth tomorrow and I have a billion dollars
636
00:40:50,440 --> 00:40:56,280
and I can do something about it, well, yes, then that's what I should do.
637
00:40:56,280 --> 00:40:58,640
Because the principle is to value human life.
638
00:40:58,640 --> 00:41:00,160
The principle is to value intelligence.
639
00:41:00,160 --> 00:41:05,440
The principle is to make sure that all of us on this little spaceship traveling through
640
00:41:05,440 --> 00:41:10,400
the universe, that we have the ability to reach the stars one day.
641
00:41:10,400 --> 00:41:15,260
Getting a little poetic here, but the principle here is to preserve intelligence and human
642
00:41:15,260 --> 00:41:18,680
life, and not just human life, but life in general.
643
00:41:18,680 --> 00:41:25,160
Being more specific here, I would continue running what I do today as a VC fund and I
644
00:41:25,160 --> 00:41:26,580
would do a lot more of it.
645
00:41:26,580 --> 00:41:31,760
If I have a billion dollars, well, then I can do almost 10 times more than I'm doing
646
00:41:31,760 --> 00:41:32,920
right now with 85.
647
00:41:32,920 --> 00:41:37,080
I would try to invest beyond what I'm investing in right now.
648
00:41:37,080 --> 00:41:40,600
I'm focused on AI in healthcare, my partner AI in enterprise.
649
00:41:40,600 --> 00:41:44,840
We might start doing AI in biotech, AI in climate tech.
650
00:41:44,840 --> 00:41:46,080
We're focused on the US right now.
651
00:41:46,080 --> 00:41:47,560
We're open to Canada.
652
00:41:47,560 --> 00:41:49,080
We might start doing other geographies.
653
00:41:49,080 --> 00:41:51,920
I would love to invest more in emerging countries.
654
00:41:51,920 --> 00:41:56,440
India and Brazil are natural fits for me, given my heritage, given my connections, given
655
00:41:56,440 --> 00:41:58,000
my knowledge of those two markets.
656
00:41:58,000 --> 00:42:02,920
But I would love to be able to do more things in Singapore, in the UK, perhaps even in other
657
00:42:02,920 --> 00:42:05,840
countries where we are comfortable operating.
658
00:42:05,840 --> 00:42:08,200
I would love to build a bigger team.
659
00:42:08,200 --> 00:42:11,280
That's a necessity actually, not a desire.
660
00:42:11,280 --> 00:42:13,320
It would be absolutely a requirement.
661
00:42:13,320 --> 00:42:15,640
I would like to perhaps expand.
662
00:42:15,640 --> 00:42:19,640
We want to be early stage, but with a billion dollars, it becomes tricky to be an early
663
00:42:19,640 --> 00:42:20,760
stage fund.
664
00:42:20,760 --> 00:42:25,280
Maybe we would become a seed, series A, and series B fund and start leading those deals.
665
00:42:25,280 --> 00:42:29,240
With a billion, you probably will have to go beyond series B. That is not what I was
666
00:42:29,240 --> 00:42:31,960
thinking, but we would have to seriously consider it.
667
00:42:31,960 --> 00:42:35,560
I would like to build the type of fund, and that's what I'm focused on right now, that
668
00:42:35,560 --> 00:42:37,720
does real good and does really well.
669
00:42:37,720 --> 00:42:44,640
So 10X returns, but also in some ways, the type of companies we're investing in create
670
00:42:44,640 --> 00:42:45,640
value.
671
00:42:45,640 --> 00:42:50,960
I believe very much that if you create value, you get valuation.
672
00:42:50,960 --> 00:42:56,120
Unfortunately for capitalism, those two things are not one and the same, but we
673
00:42:56,120 --> 00:43:00,840
can make sure that we operate so that they both help each other.
674
00:43:00,840 --> 00:43:04,320
There are ways of making money in this world that do not create value.
675
00:43:04,320 --> 00:43:09,600
It sounds like you have found your ikigai, which is the Japanese concept of purpose.
676
00:43:09,600 --> 00:43:13,720
And it's the intersection of what you're good at, what you love to do, and what people
677
00:43:13,720 --> 00:43:14,960
will pay you for.
678
00:43:14,960 --> 00:43:17,160
I want to be mindful of the time, Amit.
679
00:43:17,160 --> 00:43:18,760
Do you have time for one more question?
680
00:43:18,760 --> 00:43:20,480
If not, we can end it here.
681
00:43:20,480 --> 00:43:21,480
Let's do it.
682
00:43:21,480 --> 00:43:22,480
Let's do it.
683
00:43:22,480 --> 00:43:23,480
It's been an honor, Rishad.
684
00:43:23,480 --> 00:43:24,480
You're very kind.
685
00:43:24,480 --> 00:43:27,120
You're very tough with your questions and very kind with your comments.
686
00:43:27,120 --> 00:43:28,360
What makes you resilient?
687
00:43:28,360 --> 00:43:31,240
This is something I've been thinking about for my kids.
688
00:43:31,240 --> 00:43:37,440
My previous answer would be you have to go through adversity to be resilient.
689
00:43:37,440 --> 00:43:42,720
Looking at the studies and the data out there, from what I've found, resilience comes from
690
00:43:42,720 --> 00:43:49,880
a good internal and external support system and doesn't necessarily require experiencing
691
00:43:49,880 --> 00:43:55,240
obstacles and overcoming them in a healthy fashion without maladaptive behavior.
692
00:43:55,240 --> 00:43:57,960
What makes you resilient?
693
00:43:57,960 --> 00:44:00,240
I might be changing your answer by saying that.
694
00:44:00,240 --> 00:44:01,240
No, no, no.
695
00:44:01,240 --> 00:44:02,940
I think you're right.
696
00:44:02,940 --> 00:44:04,720
No man is an island.
697
00:44:04,720 --> 00:44:10,200
I think part of it is, yes, your internal makeup, somewhat shaped by your experiences, not always.
698
00:44:10,200 --> 00:44:14,880
I do think you can learn from other people's experiences, by the way. I'll
699
00:44:14,880 --> 00:44:21,400
take a very stupid example: you wouldn't go and jump into a well.
700
00:44:21,400 --> 00:44:23,440
That's by the way a proverb in Hindi.
701
00:44:23,440 --> 00:44:26,640
The reason you don't do that is because you know that jumping into a well for the last
702
00:44:26,640 --> 00:44:29,000
guy who did it didn't turn out so well.
703
00:44:29,000 --> 00:44:31,640
You learn from somebody else's experience.
704
00:44:31,640 --> 00:44:36,440
I think there's a component here of what you experience yourself, what you learn from others,
705
00:44:36,440 --> 00:44:39,760
what you learn from others by watching them, what you learn from others by reading about them,
706
00:44:39,760 --> 00:44:43,800
what you learn from others by just what other people share with you.
707
00:44:43,800 --> 00:44:49,240
There's a component of resilience that is absolutely the village around you.
708
00:44:49,240 --> 00:44:54,520
Now there's obviously situations, incredible stories of people who beat all kinds of odds
709
00:44:54,520 --> 00:44:56,960
with very few support systems.
710
00:44:56,960 --> 00:45:01,100
There are also stories of people with a lot of support systems who don't get the amount
711
00:45:01,100 --> 00:45:04,800
of resilience perhaps that they were hoping for.
712
00:45:04,800 --> 00:45:05,800
There's a spectrum.
713
00:45:05,800 --> 00:45:06,840
It's very contextual.
714
00:45:06,840 --> 00:45:10,560
If you have resilience in one area, you may not have as much resilience in another.
715
00:45:10,560 --> 00:45:16,000
I'm not aware of those studies, you might be, but I am willing to bet you that with physical
716
00:45:16,000 --> 00:45:21,720
and mental resilience, there's a correlation, but they're not necessarily completely connected.
717
00:45:21,720 --> 00:45:29,600
I may be great at handling stress at work, but be terrible at handling stress when I'm
718
00:45:29,600 --> 00:45:30,600
running.
719
00:45:30,600 --> 00:45:32,800
It's not a perfect correlation.
720
00:45:32,800 --> 00:45:36,200
For me personally, I think it's all of the above.
721
00:45:36,200 --> 00:45:38,400
I think resilience is a muscle in some ways.
722
00:45:38,400 --> 00:45:40,320
You have to keep exercising it.
723
00:45:40,320 --> 00:45:43,080
It's easy to get too comfortable.
724
00:45:43,080 --> 00:45:45,760
I'm grateful for all the people around me.
725
00:45:45,760 --> 00:45:52,000
First and foremost, my wife, she's my rock and she didn't pay me to say all of this.
726
00:45:52,000 --> 00:45:56,200
When she hears this, I'll hopefully make some brownie points, but it's really true.
727
00:45:56,200 --> 00:46:01,820
She gives me a lot of wisdom and she gives me a lot of direction and she shows me how
728
00:46:01,820 --> 00:46:02,820
to be better.
729
00:46:02,820 --> 00:46:08,000
Obviously, my parents, they were the ones who gave me the foundation.
730
00:46:08,000 --> 00:46:13,320
My teachers, my mentors, both in the past and in the present, my friends, both in the
731
00:46:13,320 --> 00:46:17,040
past and in the present.
732
00:46:17,040 --> 00:46:22,400
There's people that I've never met, some of them alive, some of them not alive, who are
733
00:46:22,400 --> 00:46:23,800
role models.
734
00:46:23,800 --> 00:46:27,960
Obviously, I'm building Tau Ventures with a team.
735
00:46:27,960 --> 00:46:29,640
My co-founder for sure.
736
00:46:29,640 --> 00:46:34,600
The reason we are building this fund together is because we know we are a good team.
737
00:46:34,600 --> 00:46:35,920
We are a good partnership.
738
00:46:35,920 --> 00:46:41,400
We can both keep each other accountable, but also bring out the best in each other.
739
00:46:41,400 --> 00:46:43,120
Big shout out here to you, Sanjay.
740
00:46:43,120 --> 00:46:46,160
I don't think I'm perfect at this, Rishad.
741
00:46:46,160 --> 00:46:47,760
Nobody is, to be honest.
742
00:46:47,760 --> 00:46:53,760
The day I believe I am or that I've hit my limits, then that means that I'll start failing.
743
00:46:53,760 --> 00:46:56,120
It's a good reminder to myself, there's always more to learn.
744
00:46:56,120 --> 00:46:57,920
There's always more to unlearn.
745
00:46:57,920 --> 00:47:04,880
I discover every day, as time goes by, that for something I knew, there's far more to learn about it.
746
00:47:04,880 --> 00:47:05,880
Resilience included.
747
00:47:05,880 --> 00:47:07,760
It's been great talking to you, Amit.
748
00:47:07,760 --> 00:47:10,480
Thanks so much for coming on the show today.
749
00:47:10,480 --> 00:47:15,760
We didn't get to talk too much about investing or other topics I had in mind.
750
00:47:15,760 --> 00:47:18,760
Would love to do it again in the new year.
751
00:47:18,760 --> 00:47:19,760
Absolutely.
752
00:47:19,760 --> 00:47:20,760
Thank you for having me.
753
00:47:20,760 --> 00:47:22,520
Thank you to all of you watching us.
754
00:47:22,520 --> 00:47:25,080
We are TauVentures.com.
755
00:47:25,080 --> 00:47:28,480
Feel free to check us out.
756
00:47:28,480 --> 00:47:31,480
We read everything that reaches our inbox.
757
00:47:31,480 --> 00:47:32,960
So you're welcome to reach out.
758
00:47:32,960 --> 00:47:36,640
I'm not able to respond to everyone, but I will certainly read it.
759
00:47:36,640 --> 00:47:40,840
Once again, we're a seed-stage focused fund, primarily enterprise and healthcare, investing in the
760
00:47:40,840 --> 00:47:42,680
US, but very much open to Canada.
761
00:47:42,680 --> 00:47:43,680
Awesome.
762
00:47:43,680 --> 00:47:55,320
Thanks, Amit.
34
00:01:57,320 --> 00:01:59,760
It's not the only tool to make a difference.
35
00:01:59,760 --> 00:02:03,800
Before all of this, I co-founded a startup called Health IQ.
36
00:02:03,800 --> 00:02:08,320
Publicly, you'll see that the company raised $140 million.
37
00:02:08,320 --> 00:02:12,200
Before that, I worked at a couple of other VC funds, started my career at Google.
38
00:02:12,200 --> 00:02:16,960
This was pre-IPO days at Google, and I guess I'm dating myself.
39
00:02:16,960 --> 00:02:21,360
I was a product manager there, learned a whole bunch and very grateful for my experience
40
00:02:21,360 --> 00:02:22,520
there.
41
00:02:22,520 --> 00:02:25,400
My training is in computer science and biology.
42
00:02:25,400 --> 00:02:28,080
My master's is at the intersection of those two.
43
00:02:28,080 --> 00:02:29,920
Then I also went to business school.
44
00:02:29,920 --> 00:02:36,200
I'm trying to bring all of this, my life experiences, having worked in a big company, having started
45
00:02:36,200 --> 00:02:40,040
companies, having worked at VC funds, and trying to bring it all together here at Tau
46
00:02:40,040 --> 00:02:46,520
Ventures so that we can truly, truly help our entrepreneurs succeed, make an impact,
47
00:02:46,520 --> 00:02:47,520
and make money.
48
00:02:47,520 --> 00:02:50,680
I believe in the intersection of all of those.
49
00:02:50,680 --> 00:02:51,680
Perfect.
50
00:02:51,680 --> 00:02:52,680
Thanks for that introduction.
51
00:02:52,680 --> 00:02:59,160
I told Sametra when I was talking to him that you're going to have more than a billion under
52
00:02:59,160 --> 00:03:00,760
management in five years.
53
00:03:00,760 --> 00:03:02,640
Wow, you are very kind.
54
00:03:02,640 --> 00:03:05,400
I don't think we will, to be honest.
55
00:03:05,400 --> 00:03:12,840
Not trying to be falsely modest here, but raising funds takes time and you can't run
56
00:03:12,840 --> 00:03:14,280
before you walk.
57
00:03:14,280 --> 00:03:18,320
I don't think it's even the right thing for us to get to a billion within five years.
58
00:03:18,320 --> 00:03:23,160
I think more likely we'll raise another fund and then another fund every two or three years.
59
00:03:23,160 --> 00:03:24,560
That's the norm.
60
00:03:24,560 --> 00:03:27,400
You roughly maybe double in size.
61
00:03:27,400 --> 00:03:32,400
There's exceptions and there's reasons to do something different.
62
00:03:32,400 --> 00:03:39,160
If you follow the trajectory here, that is the biggest expectation, we will grow in size
63
00:03:39,160 --> 00:03:40,160
for sure.
64
00:03:40,160 --> 00:03:43,680
I always want to be an early stage fund.
65
00:03:43,680 --> 00:03:48,560
At least that's what we're thinking right now that our differentiator is in how we help
66
00:03:48,560 --> 00:03:51,440
entrepreneurs build companies at the late stage.
67
00:03:51,440 --> 00:03:57,000
It's a lot more about financial modeling and there's a lot of value in that too for sure,
68
00:03:57,000 --> 00:03:59,680
but it's not what we are focused on.
69
00:03:59,680 --> 00:04:05,040
It's also much easier to do a 10x on a $500 million fund than on a $5 billion fund.
70
00:04:05,040 --> 00:04:11,400
There's practical reasons to also keep your size within a certain range.
71
00:04:11,400 --> 00:04:17,280
Let's talk about when you're thinking of allocating this $85 million, how do you manage risk and
72
00:04:17,280 --> 00:04:18,280
reward?
73
00:04:18,280 --> 00:04:23,320
Are you looking for say a 10x return on every single deal you put into?
74
00:04:23,320 --> 00:04:28,680
Are you okay with maybe a 1000x return, but a 5% chance of that on some deals?
75
00:04:28,680 --> 00:04:34,840
How are you thinking about that risk reward to eventually return back, as you said, 10x
76
00:04:34,840 --> 00:04:36,880
on the whole $85 million?
77
00:04:36,880 --> 00:04:40,200
10x by the way is very ambitious.
78
00:04:40,200 --> 00:04:47,080
It says 10x, but if you look at the data, overwhelmingly good funds do 3x and exceptionally
79
00:04:47,080 --> 00:04:49,320
good fund does 5x or higher.
80
00:04:49,320 --> 00:04:56,680
So there is a long tail of distributions for sure, but what we were hoping is 3x at least
81
00:04:56,680 --> 00:04:58,280
and 5x ambitiously.
82
00:04:58,280 --> 00:05:02,360
Anything over that I'll be extremely, extremely happy about.
83
00:05:02,360 --> 00:05:08,280
I can share today our fund is at 2.5x in just over two years.
84
00:05:08,280 --> 00:05:09,780
That's the first fund.
85
00:05:09,780 --> 00:05:11,600
So we seem to be on track.
86
00:05:11,600 --> 00:05:14,480
Now the law of small numbers helps me.
87
00:05:14,480 --> 00:05:20,560
As I was talking earlier, if you have a smaller fund, it's easier to actually get outsized
88
00:05:20,560 --> 00:05:21,560
returns.
89
00:05:21,560 --> 00:05:24,320
You have more flexibility when you buy and sell.
90
00:05:24,320 --> 00:05:30,560
You're also hungry and you also need to get a few exits in order for the needle to really
91
00:05:30,560 --> 00:05:31,560
move.
92
00:05:31,560 --> 00:05:38,500
So for all those three reasons, having a manageable fund size is a good thing.
93
00:05:38,500 --> 00:05:41,680
So when we look at deals, that's partly what we look at also.
94
00:05:41,680 --> 00:05:45,860
We're investing primarily at the seed stage and specifically late seed.
95
00:05:45,860 --> 00:05:49,880
So we're looking for companies that are a little bit more than two people in a garage.
96
00:05:49,880 --> 00:05:52,600
They typically have a pipeline of customers.
97
00:05:52,600 --> 00:05:54,320
That's the key thing to look for.
98
00:05:54,320 --> 00:05:55,800
If you have pilots, wonderful.
99
00:05:55,800 --> 00:05:58,080
If you have pay pilots, even better.
100
00:05:58,080 --> 00:06:02,360
So if you're making money, revenue is amazing, but that's not our expectation.
101
00:06:02,360 --> 00:06:07,340
Our expectation is that you have a roster of potential clients and that you're able
102
00:06:07,340 --> 00:06:12,040
to close on them and get to recurring contracts and eventually to a Series A within nine to
103
00:06:12,040 --> 00:06:13,040
18 months.
104
00:06:13,040 --> 00:06:16,920
A Series A is when you have product market fit here in the US at least.
105
00:06:16,920 --> 00:06:19,400
A million ARR is kind of what people look for.
106
00:06:19,400 --> 00:06:24,080
So we'll look for, can this company get there?
107
00:06:24,080 --> 00:06:27,120
And can we really help them get there?
108
00:06:27,120 --> 00:06:30,760
And can this company have an explosive growth?
109
00:06:30,760 --> 00:06:35,620
So what you want is not just a company that has good revenues and good profitability,
110
00:06:35,620 --> 00:06:38,120
but in what time horizon it does that.
111
00:06:38,120 --> 00:06:42,960
If a company is raising a seed and they have been around for 10 years, it's not a good
112
00:06:42,960 --> 00:06:43,960
fit for me.
113
00:06:43,960 --> 00:06:46,720
They may be a very good company, but it's not what I'm looking for.
114
00:06:46,720 --> 00:06:52,880
I'm looking for perhaps a Silicon Valley mold of companies where you're raising every couple
115
00:06:52,880 --> 00:06:54,160
of years.
116
00:06:54,160 --> 00:07:01,400
You are ideally doubling every year in terms of your revenues or very soon in your ARR,
117
00:07:01,400 --> 00:07:03,160
annual recurring revenue.
118
00:07:03,160 --> 00:07:06,760
But we don't expect every company to hit 10X, obviously.
119
00:07:06,760 --> 00:07:12,320
If every company hit 10X, then there would be no need for venture capitalists, I should
120
00:07:12,320 --> 00:07:13,320
say.
121
00:07:13,320 --> 00:07:15,560
But we look for the odds that a company could get there.
122
00:07:15,560 --> 00:07:22,240
So in our portfolio construction, we do hope and expect, and so far are seeing this, that
123
00:07:22,240 --> 00:07:25,680
about 10% of the companies will actually do 10X or better.
124
00:07:25,680 --> 00:07:33,080
And then maybe 50% or 60% will do somewhere between three or four X and five and six X.
125
00:07:33,080 --> 00:07:39,080
Some companies may do one or two X, and you may have some companies that lose money.
126
00:07:39,080 --> 00:07:40,080
It's possible.
127
00:07:40,080 --> 00:07:43,680
And in our portfolio construction, we said maybe 10% of the companies will actually end
128
00:07:43,680 --> 00:07:44,680
up making less.
129
00:07:44,680 --> 00:07:50,680
I'm happy to say that so far we have way over indexed on the successes.
130
00:07:50,680 --> 00:07:55,520
And once again, that's in some ways a function of having a small fund and having the flexibility,
131
00:07:55,520 --> 00:08:00,700
how we play Parada, when we play Parada, how we help other portfolio companies in terms
132
00:08:00,700 --> 00:08:05,480
of getting customer traction and in terms of getting investor traction and then helping
133
00:08:05,480 --> 00:08:06,480
them with exits.
134
00:08:06,480 --> 00:08:07,720
I'm very proud to say this.
135
00:08:07,720 --> 00:08:11,780
We have had four exits so far, and the first fund is just about two years old, just over
136
00:08:11,780 --> 00:08:13,240
two years old.
137
00:08:13,240 --> 00:08:14,240
Congratulations.
138
00:08:14,240 --> 00:08:15,240
That is impressive.
139
00:08:15,240 --> 00:08:20,440
I was reading some statistics on AngelList, and the average time to exit after VC money
140
00:08:20,440 --> 00:08:22,440
was 5.6 years.
141
00:08:22,440 --> 00:08:26,200
So you guys are doing better than half that.
142
00:08:26,200 --> 00:08:27,200
So it depends.
143
00:08:27,200 --> 00:08:31,920
First of all, the 5.6, I wasn't familiar with that figure, I've heard higher figures than
144
00:08:31,920 --> 00:08:32,920
that.
145
00:08:32,920 --> 00:08:35,060
Different industries have different time horizons.
146
00:08:35,060 --> 00:08:37,560
When you invest also has a different time horizon.
147
00:08:37,560 --> 00:08:42,120
Like if you're investing at a series C, well, you're probably looking more like a three
148
00:08:42,120 --> 00:08:47,200
or four X and probably within three or four years rather than a 10X in 10 years when you
149
00:08:47,200 --> 00:08:48,640
invest at the seed stage, right?
150
00:08:48,640 --> 00:08:51,240
It's risk reward based on time.
151
00:08:51,240 --> 00:08:57,160
But a couple of things that have been beneficial to us is we have had M&As and whatever comes
152
00:08:57,160 --> 00:09:01,360
out of companies that got acquired actually the acquirement IPO, and we have stock in
153
00:09:01,360 --> 00:09:03,000
the acquire also.
154
00:09:03,000 --> 00:09:08,840
But the other instrument that we have besides IPO and M&A is to do secondaries.
155
00:09:08,840 --> 00:09:14,880
So I'm open to selling my shares to somebody else, and I'm open to buying shares from somebody
156
00:09:14,880 --> 00:09:15,880
else.
157
00:09:15,880 --> 00:09:19,640
I've actually bought a lot more so far than I've sold.
158
00:09:19,640 --> 00:09:25,880
I've actually bought four or five times more so far from angel investors, from other VC
159
00:09:25,880 --> 00:09:29,360
investors, and I'm open to selling to somebody else.
160
00:09:29,360 --> 00:09:34,160
I have only done it once so far, but in the near future, I'll do more of it.
161
00:09:34,160 --> 00:09:36,600
And that's what's called a secondary.
162
00:09:36,600 --> 00:09:41,840
So the advantage of a secondary is that you can recognize exits a little bit earlier and
163
00:09:41,840 --> 00:09:46,080
return money to your LPs, your own investors a little bit earlier.
164
00:09:46,080 --> 00:09:50,080
Now how and when you do that, there's art and science to it.
165
00:09:50,080 --> 00:09:55,440
And how much you sell is also obviously there's a lot of art and science to it.
166
00:09:55,440 --> 00:10:00,160
So inherently, I would think if your company is doing well, you would want to buy as much
167
00:10:00,160 --> 00:10:01,160
as you can.
168
00:10:01,160 --> 00:10:02,160
When?
169
00:10:02,160 --> 00:10:03,160
Not necessarily.
170
00:10:03,160 --> 00:10:04,280
Not necessarily.
171
00:10:04,280 --> 00:10:07,200
We are big believers in supporting our companies.
172
00:10:07,200 --> 00:10:11,800
But mind you, we are at the moment 85 AUM.
173
00:10:11,800 --> 00:10:16,120
So if my company is already worth a billion, even if I put in 2 million, 3 million, which
174
00:10:16,120 --> 00:10:19,960
for me is a big check right now, it doesn't move the needle that much.
175
00:10:19,960 --> 00:10:23,120
It's not the kind of capital that that company is looking for.
176
00:10:23,120 --> 00:10:27,000
And it may also not be the amount of risk reward that I want.
177
00:10:27,000 --> 00:10:34,040
So when I decide to play my prorata, and a prorata is just a fancy word that means investing
178
00:10:34,040 --> 00:10:38,980
enough to maintain your ownership in the next round, we like doing that.
179
00:10:38,980 --> 00:10:43,040
But sometimes we don't, even if the company is doing well, because there's enough interest
180
00:10:43,040 --> 00:10:47,160
around the table and we want to make sure really good investors come in.
181
00:10:47,160 --> 00:10:53,000
Or because the company is already valued so highly that the opportunity cost for me is
182
00:10:53,000 --> 00:10:54,000
too high.
183
00:10:54,000 --> 00:10:56,960
I may say, look, I could put more money here, but I could also put it in a company that's
184
00:10:56,960 --> 00:11:02,220
worth one-tenth the valuation where I may have a higher risk reward.
185
00:11:02,220 --> 00:11:07,200
So there's many motivations and many things you have to consider, not just when you make
186
00:11:07,200 --> 00:11:09,680
the investment, but when you do follow-ups.
187
00:11:09,680 --> 00:11:10,680
That makes sense.
188
00:11:10,680 --> 00:11:16,840
Amit, you've had a window into healthcare in various different countries.
189
00:11:16,840 --> 00:11:22,760
You grew up in Brazil, you've invested in India, and obviously North America as well.
190
00:11:22,760 --> 00:11:29,080
The question I want to ask is, there's a theory that I've been playing around with that primary
191
00:11:29,080 --> 00:11:35,280
care and first access to healthcare is where profits should lie, and acute care, chronic
192
00:11:35,280 --> 00:11:41,240
care to an extent, and cancer care likely should not be where profits should lie to
193
00:11:41,240 --> 00:11:43,760
build the best model of healthcare.
194
00:11:43,760 --> 00:11:50,100
I think we still haven't figured out healthcare in terms of pricing and reimbursement anywhere.
195
00:11:50,100 --> 00:11:58,160
How do you look at if you were to design your health system, would it be profitable or not,
196
00:11:58,160 --> 00:12:01,320
and where would most of the profits be derived from if so?
197
00:12:01,320 --> 00:12:05,960
Yeah, no, that's a very simple and very hard question.
198
00:12:05,960 --> 00:12:09,960
Healthcare has obviously a moral component to it.
199
00:12:09,960 --> 00:12:16,160
I think many people, perhaps you included, Rishad, you're a doctor, would agree that
200
00:12:16,160 --> 00:12:21,040
you have to provide some kind of baseline of care for everyone.
201
00:12:21,040 --> 00:12:23,000
I certainly believe in that.
202
00:12:23,000 --> 00:12:29,360
But at the same time, I do see the benefits of having a profit motive because that ensures
203
00:12:29,360 --> 00:12:35,980
accountability, that ensures innovation, that ensures alignment of interests in many ways.
204
00:12:35,980 --> 00:12:38,960
Can also do misalignment of interests.
205
00:12:38,960 --> 00:12:39,960
I am a capitalist.
206
00:12:39,960 --> 00:12:44,800
I mean, venture capitalist has two words, and I did go to business school.
207
00:12:44,800 --> 00:12:51,240
I do believe actually that if you do capitalism in the right way, it is the single most powerful
208
00:12:51,240 --> 00:12:56,480
way of impacting our societies, creating jobs.
209
00:12:56,480 --> 00:13:00,120
I do think there's a way to do that in medicine.
210
00:13:00,120 --> 00:13:01,840
It's not easy.
211
00:13:01,840 --> 00:13:05,840
If I were to design a healthcare system from scratch, first of all, I would surround myself
212
00:13:05,840 --> 00:13:08,400
with a lot of good people because I don't know everything.
213
00:13:08,400 --> 00:13:09,400
I don't know enough.
214
00:13:09,400 --> 00:13:13,980
Healthcare is too big for any one person to try to design by themselves.
215
00:13:13,980 --> 00:13:17,800
What I would try to do is align the incentives as much as possible.
216
00:13:17,800 --> 00:13:22,520
There are companies out there, med device companies, pharma companies, where you need
217
00:13:22,520 --> 00:13:24,080
to provide a profit motive.
218
00:13:24,080 --> 00:13:25,080
Absolutely.
219
00:13:25,080 --> 00:13:26,520
Otherwise, there will not be innovation.
220
00:13:26,520 --> 00:13:29,280
There will not be discoveries.
221
00:13:29,280 --> 00:13:37,100
There is a part of healthcare where you're providing care to perhaps disadvantaged populations,
222
00:13:37,100 --> 00:13:43,400
people who are below the poverty line, where it doesn't make sense to necessarily charge
223
00:13:43,400 --> 00:13:45,680
them money.
224
00:13:45,680 --> 00:13:47,680
Perhaps what you do is you create tiers.
225
00:13:47,680 --> 00:13:52,000
Different countries have tried doing that, including here in the US, or Brazil, or India,
226
00:13:52,000 --> 00:13:56,240
or Canada, or UK.
227
00:13:56,240 --> 00:14:01,260
I think that the answer is you have to have a public system that is good, that attracts
228
00:14:01,260 --> 00:14:08,960
the best talent, that does pay well, that does serve everyone, and that perhaps is managed
229
00:14:08,960 --> 00:14:11,040
at a national level.
230
00:14:11,040 --> 00:14:17,120
I know there's pluses and minuses to this, but I do think there's no way around it.
231
00:14:17,120 --> 00:14:22,120
At the same time, you have to have a good private system because there's other things
232
00:14:22,120 --> 00:14:25,960
in healthcare that absolutely need a good private system.
233
00:14:25,960 --> 00:14:28,320
I think you have to attack from both fronts.
234
00:14:28,320 --> 00:14:34,520
If I look at the countries or the states that have done this the best, it's usually a combination.
235
00:14:34,520 --> 00:14:37,760
This requires far more than just policymakers.
236
00:14:37,760 --> 00:14:42,720
It actually requires what I call all the P's, the letter P, in healthcare.
237
00:14:42,720 --> 00:14:48,440
It requires providers, it requires payers, it requires patients, it requires policymakers
238
00:14:48,440 --> 00:14:51,340
or politicians, it requires pharma.
239
00:14:51,340 --> 00:14:55,880
There's other P's out there, but those are the five big ones.
240
00:14:55,880 --> 00:15:01,680
I think that you have to create a regulatory framework that allows people to actually make
241
00:15:01,680 --> 00:15:03,480
the right choices.
242
00:15:03,480 --> 00:15:08,920
If you create frameworks where the interests are misaligned, no matter how good people
243
00:15:08,920 --> 00:15:11,800
are, they will take decisions that are suboptimal.
244
00:15:11,800 --> 00:15:17,000
This is something I've been thinking about in value-based care, is the incentives are
245
00:15:17,000 --> 00:15:24,360
often aligned to outcomes, which actually creates perverse processes to obtain those
246
00:15:24,360 --> 00:15:25,360
outcomes.
247
00:15:25,360 --> 00:15:29,560
A way better example, if you incentivize a strict BMI, people will starve themselves.
248
00:15:29,560 --> 00:15:33,920
You should incentivize the process or the structures in place instead.
249
00:15:33,920 --> 00:15:39,840
What are your thoughts on value-based care and how to best incentivize the right processes
250
00:15:39,840 --> 00:15:42,560
so the outcomes are achieved that we desire?
251
00:15:42,560 --> 00:15:44,520
You said it better than me, Rishad.
252
00:15:44,520 --> 00:15:49,360
If you pick just one variable and you optimize around that variable, it doesn't necessarily
253
00:15:49,360 --> 00:15:52,160
capture everything.
254
00:15:52,160 --> 00:15:58,840
You could optimize on price, you could optimize on outcomes, you could optimize on the amount
255
00:15:58,840 --> 00:16:02,080
of time for you to get seen quickly.
256
00:16:02,080 --> 00:16:06,840
You could optimize on any one variable and you will not actually optimize for everyone
257
00:16:06,840 --> 00:16:08,280
in every single situation.
258
00:16:08,280 --> 00:16:11,000
That's the problem in healthcare, honestly.
259
00:16:11,000 --> 00:16:13,960
The answer is you can't optimize just on one variable.
260
00:16:13,960 --> 00:16:19,520
I think for better or for worse, outcomes is the least worst metric that I can think
261
00:16:19,520 --> 00:16:21,840
of, but it's not perfect.
262
00:16:21,840 --> 00:16:26,840
You have to temper it exactly as you said by looking at process.
263
00:16:26,840 --> 00:16:29,160
Let me create another situation here.
264
00:16:29,160 --> 00:16:33,520
Doctors that deal, and not just doctors, healthcare providers really, doctors, nurses, physician
265
00:16:33,520 --> 00:16:37,880
assistants, everyone who's involved in the care of a patient that deal with complicated
266
00:16:37,880 --> 00:16:44,560
cases are going to have worse outcomes, very presumably, than doctors who are focused on
267
00:16:44,560 --> 00:16:47,840
easier cases as a percentage.
268
00:16:47,840 --> 00:16:53,160
Doctors who are perceived, and this one I will pick on doctors, who are perceived as nice,
269
00:16:53,160 --> 00:16:59,640
bedside manners, get rated higher even if they're not necessarily the best doctors.
270
00:16:59,640 --> 00:17:04,680
The best doctors may actually be a little bit rough on the edges and they may have patients
271
00:17:04,680 --> 00:17:07,600
do things that the patients don't like.
272
00:17:07,600 --> 00:17:11,720
It creates for a little bit of friction.
273
00:17:11,720 --> 00:17:14,560
If you optimize just on patient satisfaction, it's not correct.
274
00:17:14,560 --> 00:17:16,520
If you just optimize on outcome, it's not correct.
275
00:17:16,520 --> 00:17:20,280
If you optimize on keeping costs low, it's not correct.
276
00:17:20,280 --> 00:17:26,320
Unfortunately, you have to pick one of them and focus on it, but not lose sight of everything
277
00:17:26,320 --> 00:17:27,320
else.
278
00:17:27,320 --> 00:17:31,360
I think that's why you need to look at patients holistically and also at healthcare systems
279
00:17:31,360 --> 00:17:32,520
holistically.
280
00:17:32,520 --> 00:17:39,360
I'm fully in favor of healthcare administrators being much more proficient about the challenge
281
00:17:39,360 --> 00:17:45,120
they're dealing with, for providers to be much more proficient about the administrative
282
00:17:45,120 --> 00:17:51,960
challenges, for patients to be very engaged in their own care, for pharma companies to
283
00:17:51,960 --> 00:17:56,320
change their business models, improve their business models, which are honestly a stretch
284
00:17:56,320 --> 00:18:00,840
very crazy right now, like $10 billion or $1 billion to develop a drug.
285
00:18:00,840 --> 00:18:01,840
That's crazy.
286
00:18:01,840 --> 00:18:05,160
What we need is to really listen into each other.
287
00:18:05,160 --> 00:18:09,060
I know this sounds a little bit cliche, but it is really true to listen into each other
288
00:18:09,060 --> 00:18:11,480
to see things from each other's perspective.
289
00:18:11,480 --> 00:18:15,920
I think if there was one variable, one other P I would focus on, you said process and I
290
00:18:15,920 --> 00:18:17,240
would say perspective.
291
00:18:17,240 --> 00:18:23,180
If you could go back in time to yourself 10 years ago, 20 years ago, what is one piece
292
00:18:23,180 --> 00:18:25,600
of advice you would give yourself?
293
00:18:25,600 --> 00:18:30,160
Oh boy, only one?
294
00:18:30,160 --> 00:18:32,000
As many as you want, Amit.
295
00:18:32,000 --> 00:18:37,800
I think this gets a little philosophical, but what you think now and what ends up happening
296
00:18:37,800 --> 00:18:42,080
10 years later are invariably very different things.
297
00:18:42,080 --> 00:18:46,320
Practically every plan I have made in my life didn't work out or didn't work out the way
298
00:18:46,320 --> 00:18:50,880
I thought it would, but having the plan was very important.
299
00:18:50,880 --> 00:18:56,340
If you'd asked me 10 years ago, I could have never predicted that I would be sitting today
300
00:18:56,340 --> 00:18:59,840
running Tao Ventures, doing early stage investments.
301
00:18:59,840 --> 00:19:06,920
However, if you had asked me what I would like to do in terms of principles, those principles
302
00:19:06,920 --> 00:19:09,480
to stay very same.
303
00:19:09,480 --> 00:19:14,200
I hear this from a lot of people that my principles have stayed constant.
304
00:19:14,200 --> 00:19:20,080
Obviously you evolve over time, but the principle is what you're grounded on.
305
00:19:20,080 --> 00:19:25,240
I've always wanted to make an impact, do good, do well.
306
00:19:25,240 --> 00:19:29,880
I've always believed more specifically that the intersection of life sciences and technology
307
00:19:29,880 --> 00:19:33,640
is something that I can make a disproportionate impact.
308
00:19:33,640 --> 00:19:40,680
In my engineering mindset and my life sciences mindset, I could bring both those two and
309
00:19:40,680 --> 00:19:42,680
provide the best treatments for everyone.
310
00:19:42,680 --> 00:19:44,000
Make a lot of money in the process.
311
00:19:44,000 --> 00:19:46,760
I don't think those are mutually exclusive, by the way.
312
00:19:46,760 --> 00:19:51,560
I don't think that you have to choose between making money and doing good.
313
00:19:51,560 --> 00:19:55,200
There are ways in which you can actually do both.
314
00:19:55,200 --> 00:19:57,600
That principle has stayed constant for me.
315
00:19:57,600 --> 00:20:02,540
If I were to remind myself 10 years ago, it would be everything you're thinking right
316
00:20:02,540 --> 00:20:06,480
now specifically will probably not work like you're thinking.
317
00:20:06,480 --> 00:20:11,440
But the underlying motivations, make sure you keep holding onto those.
318
00:20:11,440 --> 00:20:13,640
I agree with that.
319
00:20:13,640 --> 00:20:18,940
I would add money is status and status brings the power to affect change.
320
00:20:18,940 --> 00:20:24,440
Take me back to when you were a child, when you were five years old, 10 years old.
321
00:20:24,440 --> 00:20:27,720
What did you want to become at that point?
322
00:20:27,720 --> 00:20:34,000
To make an assumption there was, some people would argue is the mindset of a child.
323
00:20:34,000 --> 00:20:40,720
But jokes aside, as far as I can remember, I wanted to do something in healthcare.
324
00:20:40,720 --> 00:20:42,480
This is for many reasons.
325
00:20:42,480 --> 00:20:46,280
I did grow up in a smaller city in Brazil.
326
00:20:46,280 --> 00:20:48,440
I did my high school in a larger city.
327
00:20:48,440 --> 00:20:52,540
I've spent some time in India, which is where my roots are.
328
00:20:52,540 --> 00:20:58,280
When you grow up in emerging countries, you do see lots of social problems.
329
00:20:58,280 --> 00:21:01,600
You see both the best and the worst of humanity in some ways.
330
00:21:01,600 --> 00:21:05,080
I wanted to try fixing some problems.
331
00:21:05,080 --> 00:21:09,200
Growing up, I do remember very specifically people not having access to education and
332
00:21:09,200 --> 00:21:10,800
healthcare.
333
00:21:10,800 --> 00:21:14,080
My parents came from education and they're professors.
334
00:21:14,080 --> 00:21:17,960
I always thought maybe something along those lines or something along the lines of healthcare.
335
00:21:17,960 --> 00:21:23,560
I would gravitate more towards healthcare because I felt that if you don't have good health,
336
00:21:23,560 --> 00:21:25,800
then you can't do anything in life really.
337
00:21:25,800 --> 00:21:32,200
Healthcare is really a foundation of who you are.
338
00:21:32,200 --> 00:21:36,520
If you're not healthy yourself, you can't afford even to go to school in some ways.
339
00:21:36,520 --> 00:21:41,200
It's truly, truly foundational for you as an individual, for a family, and for society
340
00:21:41,200 --> 00:21:42,200
in general.
341
00:21:42,200 --> 00:21:46,320
I'm pretty sure my parents would have said that I wanted to be a doctor.
342
00:21:46,320 --> 00:21:54,320
I'm sure I entertained being an astronaut or being a jet pilot fighter, but those were
343
00:21:54,320 --> 00:21:55,320
whims.
344
00:21:55,320 --> 00:21:56,320
They were not real plans.
345
00:21:56,320 --> 00:22:00,160
I think I truly wanted to be a doctor.
346
00:22:00,160 --> 00:22:02,480
As I grew older, I actually applied to be a doctor.
347
00:22:02,480 --> 00:22:06,520
I got into med school, but I decided not to.
348
00:22:06,520 --> 00:22:15,320
It wasn't a change in goals in some ways because my ultimate goal wasn't to be a doctor.
349
00:22:15,320 --> 00:22:18,720
My ultimate goal was to make an impact.
350
00:22:18,720 --> 00:22:20,640
My ultimate goal was to fix problems.
351
00:22:20,640 --> 00:22:25,600
My ultimate goal was to create a better life for myself and for other people.
352
00:22:25,600 --> 00:22:28,280
I don't want to come across here as not being self-interested.
353
00:22:28,280 --> 00:22:29,980
I'm very self-interested.
354
00:22:29,980 --> 00:22:31,320
I do want to make money.
355
00:22:31,320 --> 00:22:32,600
I do want to live comfortably.
356
00:22:32,600 --> 00:22:37,880
I do want to have the words you mentioned, power and status, but I don't want those necessarily
357
00:22:37,880 --> 00:22:41,040
at the exclusion of being able to make an impact.
358
00:22:41,040 --> 00:22:45,280
In fact, if anything, I want to use both in service of each other.
359
00:22:45,280 --> 00:22:49,200
In my mind, at least those things are aligned.
360
00:22:49,200 --> 00:22:54,720
My ultimate motivation behind it, that has stayed constant as far as I can remember.
361
00:22:54,720 --> 00:22:55,720
Okay, perfect.
362
00:22:55,720 --> 00:22:58,240
Thanks for sharing that, Amit.
363
00:22:58,240 --> 00:23:04,880
The next question I want to ask is, building on Bill Gross's idea lab study of why now
364
00:23:04,880 --> 00:23:10,920
is the best predictor of success in a startup, one could argue tailwinds drive a considerable
365
00:23:10,920 --> 00:23:16,080
amount of success, COVID, regulatory changes, wars to an extent.
366
00:23:16,080 --> 00:23:22,680
A tailwind I'm banking on is hybrid at-home care, really growing over the next five years,
367
00:23:22,680 --> 00:23:25,600
including hospital care at home.
368
00:23:25,600 --> 00:23:28,840
What are some tailwinds you're banking on right now?
369
00:23:28,840 --> 00:23:31,760
I'm going to ask you to predict the future.
370
00:23:31,760 --> 00:23:37,440
What are some tailwinds you think will emerge in the next five to 10 years?
371
00:23:37,440 --> 00:23:38,440
Sure.
372
00:23:38,440 --> 00:23:41,520
I'm going to invest in a company called Irritative Health, which I know you're familiar with
373
00:23:41,520 --> 00:23:42,520
also, Rishad.
374
00:23:42,520 --> 00:23:48,320
It was very coincidentally the very first investment we did through the fund, officially.
375
00:23:48,320 --> 00:23:51,800
They've done phenomenally well.
376
00:23:51,800 --> 00:23:57,960
They've raised more than $200 million, and this is all public in just over two years.
377
00:23:57,960 --> 00:24:03,360
What the company had started off with was training their algorithm to detect cancer
378
00:24:03,360 --> 00:24:04,600
using computer vision.
379
00:24:04,600 --> 00:24:10,920
More specifically, taking video feeds of a colonoscopy and in real time, being able to
380
00:24:10,920 --> 00:24:16,760
detect if there was a polyp that was cancerous, and helping physicians, specialists, especially
381
00:24:16,760 --> 00:24:22,680
gastroenterologists, be able to detect colon cancer earlier, quicker, and better.
382
00:24:22,680 --> 00:24:28,320
If you don't detect it early enough, cancer is, unfortunately, all of us have been touched
383
00:24:28,320 --> 00:24:33,360
by it in some way or shape or form, and I think all of us will resonate with this, that
384
00:24:33,360 --> 00:24:36,400
it can mean the difference between making it or not making it.
385
00:24:36,400 --> 00:24:38,080
Detect early, you save a patient's life.
386
00:24:38,080 --> 00:24:42,040
You detect it too late, maybe you can't do anything about it.
387
00:24:42,040 --> 00:24:47,760
Where I'm going with this is we betted on a company that was able to take massive amounts
388
00:24:47,760 --> 00:24:52,520
of data and make sense of it very quickly, and that would not have been possible even
389
00:24:52,520 --> 00:24:53,520
a few years back.
390
00:24:53,520 --> 00:24:57,360
I dare say even three or four years back, because now we have a lot more computational
391
00:24:57,360 --> 00:24:58,360
power.
392
00:24:58,360 --> 00:24:59,360
We have a lot more data.
393
00:24:59,360 --> 00:25:03,440
We have a lot more understanding of how to analyze this data.
394
00:25:03,440 --> 00:25:05,280
We have algorithms that have been built by others.
395
00:25:05,280 --> 00:25:10,000
We have tech stacks that you can leverage from, and I'm giving the example of iterative
396
00:25:10,000 --> 00:25:13,240
health, but that's symptomatic of most of our portfolio.
397
00:25:13,240 --> 00:25:15,720
We have found really good folks.
398
00:25:15,720 --> 00:25:17,600
Sometimes they're doctors, sometimes they're technologists.
399
00:25:17,600 --> 00:25:23,120
More often than not, there are partnerships between these people who are able to take
400
00:25:23,120 --> 00:25:26,200
cutting edge tools and apply it to solve these problems.
401
00:25:26,200 --> 00:25:28,240
I'll give you another example.
402
00:25:28,240 --> 00:25:37,840
We have a company called Signos, which repurposed a hardware device, a continuous glucose monitor,
403
00:25:37,840 --> 00:25:47,240
that is typically used for you to track and hopefully avoid a glucose shortage.
404
00:25:47,240 --> 00:25:54,080
It's affiliated with an insulin pump and diabetics use it, but they have repurposed it for obesity.
405
00:25:54,080 --> 00:25:56,880
You can see what's going on inside your body in real time.
406
00:25:56,880 --> 00:26:00,680
You eat, drink, or exercise and help you take better decisions.
407
00:26:00,680 --> 00:26:05,080
If you can measure it, you can monitor it, and if you can monitor it, you can modify
408
00:26:05,080 --> 00:26:08,880
it, your behaviors.
409
00:26:08,880 --> 00:26:15,640
Once again, if you think about it, we bet it on a company that has done really well
410
00:26:15,640 --> 00:26:20,080
so far and we continue very much believing in it, that can take massive amounts of data
411
00:26:20,080 --> 00:26:22,800
and try to make sense of it.
412
00:26:22,800 --> 00:26:27,360
No single human being could have made sense of all of this until recently.
413
00:26:27,360 --> 00:26:28,920
You do need machines.
414
00:26:28,920 --> 00:26:34,080
I am a firm believer that machines are not here to replace us.
415
00:26:34,080 --> 00:26:43,760
They're here to help us, to augment what we can do as physicians, as nurses, as investors,
416
00:26:43,760 --> 00:26:45,120
as a society.
417
00:26:45,120 --> 00:26:46,320
It will create jobs.
418
00:26:46,320 --> 00:26:47,680
It will destroy jobs.
419
00:26:47,680 --> 00:26:54,800
Overall, what technology has done throughout human history is make us more efficient, is
420
00:26:54,800 --> 00:26:58,840
able to do things that we couldn't do as well at all.
421
00:26:58,840 --> 00:27:06,560
I do believe that AI done in a smart way can make our societies incredibly better.
422
00:27:06,560 --> 00:27:12,560
It can lead us, the whole gospel of prosperity, it can lead us to a brighter future.
423
00:27:12,560 --> 00:27:13,940
The key is to do it well.
424
00:27:13,940 --> 00:27:21,280
If you do it in the wrong ways, we can create a lot of imbalances in society.
425
00:27:21,280 --> 00:27:23,480
That is going on to your second question.
426
00:27:23,480 --> 00:27:27,320
What I do expect in the next 10 years, I expect more of this.
427
00:27:27,320 --> 00:27:29,080
I expect a lot more of this.
428
00:27:29,080 --> 00:27:37,960
I expect AI to really make a transformational change in how we develop drugs, how we discover
429
00:27:37,960 --> 00:27:39,480
drugs.
430
00:27:39,480 --> 00:27:45,000
The amount of molecules that we know today is a fraction of all the possible molecules
431
00:27:45,000 --> 00:27:49,000
that can exist in the universe.
432
00:27:49,000 --> 00:27:52,120
I've heard the figure 0.01%, by the way.
433
00:27:52,120 --> 00:27:59,360
That's the amount of molecules that we know today of all the possibilities out there.
434
00:27:59,360 --> 00:28:04,240
I'm hinting here at a revolution in biotech, which I would love to do more of.
435
00:28:04,240 --> 00:28:05,880
I'm not there yet.
436
00:28:05,880 --> 00:28:12,000
We invest in digital health, but in the near future, I do expect biotech to be as fast
437
00:28:12,000 --> 00:28:19,000
in terms of development with CRISPR and CAR-T and with AI and with all of these new technologies
438
00:28:19,000 --> 00:28:25,680
that are emerging to become more similar to how tech is done today.
439
00:28:25,680 --> 00:28:27,400
Digital health is already there, by the way.
440
00:28:27,400 --> 00:28:32,760
Digital health, two people in a garage can now build digital health companies the same
441
00:28:32,760 --> 00:28:35,680
way that Google and Apple were created.
442
00:28:35,680 --> 00:28:40,160
So digital health already operates very much like tech, and I expect another industry to
443
00:28:40,160 --> 00:28:43,960
start operating in the next 10 years.
444
00:28:43,960 --> 00:28:49,920
With biotech in particular, the time to exit on average from what I remember is over eight
445
00:28:49,920 --> 00:28:51,320
to 10 years.
446
00:28:51,320 --> 00:28:56,640
A lot of that is because of the trials that need to happen in the stepwise fashion waiting
447
00:28:56,640 --> 00:28:58,000
for results to come.
448
00:28:58,000 --> 00:29:01,800
Do you think that process will be expedited through AI as well?
449
00:29:01,800 --> 00:29:02,800
Absolutely.
450
00:29:02,800 --> 00:29:03,800
Absolutely.
451
00:29:03,800 --> 00:29:11,920
I have a PhD in biochemistry, so I'm remiss in saying that, but I am operating at the
452
00:29:11,920 --> 00:29:17,240
intersection of AI and life sciences, and that's what I've built my career for the last
453
00:29:17,240 --> 00:29:18,760
20 plus years.
454
00:29:18,760 --> 00:29:26,840
I firmly believe that what in silico allows us to speed things up, not by one or two X,
455
00:29:26,840 --> 00:29:28,880
by 10, 100 X.
456
00:29:28,880 --> 00:29:35,680
Think about being able to simulate molecules, to be able to simulate protein-protein bonds,
457
00:29:35,680 --> 00:29:43,600
to be able to simulate how drug-to-drug interactions would be, to be able to simulate how a particular
458
00:29:43,600 --> 00:29:46,080
drug would behave in your body.
459
00:29:46,080 --> 00:29:49,000
Ultimately, you will still need human clinical trials.
460
00:29:49,000 --> 00:29:50,000
Absolutely.
461
00:29:50,000 --> 00:29:51,000
Animal trials.
462
00:29:51,000 --> 00:29:52,000
Absolutely.
463
00:29:52,000 --> 00:30:01,040
But you'll be able to hone in instead of looking at a search space that is the entire universe.
464
00:30:01,040 --> 00:30:03,480
You'll be able to hone in, and these are the things we need to test.
465
00:30:03,480 --> 00:30:05,040
This is how we need to test.
466
00:30:05,040 --> 00:30:07,840
These are the things you have discovered that are not issues, and these are potential things
467
00:30:07,840 --> 00:30:08,840
we need to go deeper in.
468
00:30:08,840 --> 00:30:09,840
Okay.
469
00:30:09,840 --> 00:30:10,840
Yeah.
470
00:30:10,840 --> 00:30:14,720
I think what AI will allow us to do, and is allowing us to do, is to be able to focus
471
00:30:14,720 --> 00:30:16,040
a lot better.
472
00:30:16,040 --> 00:30:18,040
We do have some portfolio companies doing this already.
473
00:30:18,040 --> 00:30:24,240
We have one called ArpeggioBio that's focused on mRNA and helping develop more around mRNA
474
00:30:24,240 --> 00:30:25,240
as a platform.
475
00:30:25,240 --> 00:30:26,880
RNA in general, I should say.
476
00:30:26,880 --> 00:30:34,720
We have another one called Teiko.bio that is focused on how drugs interact with your
477
00:30:34,720 --> 00:30:37,680
blood and with cancer in general.
478
00:30:37,680 --> 00:30:39,640
How do you make sense of all of that?
479
00:30:39,640 --> 00:30:46,520
Because the blood is actually the way your entire body transports substances from one
480
00:30:46,520 --> 00:30:47,520
place to another.
481
00:30:47,520 --> 00:30:51,160
Ultimately, it's the blood for the vast majority of things.
482
00:30:51,160 --> 00:30:55,320
How you deliver drugs to the right place and how they interact with each other is absolutely
483
00:30:55,320 --> 00:30:57,560
crucial for how we make sense of cancer.
484
00:30:57,560 --> 00:30:58,560
Yeah.
485
00:30:58,560 --> 00:31:03,960
I think targeted chemo, especially immunotherapy, has just made such big advancements over the
486
00:31:03,960 --> 00:31:04,960
past two decades.
487
00:31:04,960 --> 00:31:10,280
I don't want to get too deep into this because a good chunk of our listeners are not physicians.
488
00:31:10,280 --> 00:31:18,280
I think there's a comfort with AI replacing analytical tasks, but there is a discomfort
489
00:31:18,280 --> 00:31:26,640
with AI replacing creative tasks, especially with DALL-E and what it's doing in the art world.
490
00:31:26,640 --> 00:31:33,760
We seem to protect the artistic side of humanity from AI, and we seem to want to say what makes
491
00:31:33,760 --> 00:31:37,520
us human is our creative endeavors.
492
00:31:37,520 --> 00:31:44,480
What are your thoughts if AI can be more creative than us, if AI can be more human than humans?
493
00:31:44,480 --> 00:31:48,520
Should AI replace creative tasks?
494
00:31:48,520 --> 00:31:54,820
Art and music are probably easier to answer than being a counselor, being a friend, or even
495
00:31:54,820 --> 00:31:56,440
being a parent.
496
00:31:56,440 --> 00:32:00,520
If AI can do it better than us, should we let it?
497
00:32:00,520 --> 00:32:03,260
Once again, very simple and very hard question.
498
00:32:03,260 --> 00:32:06,680
You talk to different folks, they'll have different opinions, and these are folks who
499
00:32:06,680 --> 00:32:07,680
are experts.
500
00:32:07,680 --> 00:32:11,360
You talk to people who have spent their entire lives with AI, and they'll give you different
501
00:32:11,360 --> 00:32:13,240
answers around this.
502
00:32:13,240 --> 00:32:18,440
There are very famous Canadians, or people based in Canada, I should say: Geoffrey Hinton and
503
00:32:18,440 --> 00:32:23,480
Yoshua Bengio. Some of the foremost names in AI are actually based in Canada or working
504
00:32:23,480 --> 00:32:25,200
out of Canada.
505
00:32:25,200 --> 00:32:31,680
But I fall under the school of thought that general AI is really, really far.
506
00:32:31,680 --> 00:32:34,280
It's not going to happen in my lifetime.
507
00:32:34,280 --> 00:32:38,400
An intelligence that is more human than human is very far.
508
00:32:38,400 --> 00:32:43,800
An intelligence that is better than humans in specific tasks, that's already here, and
509
00:32:43,800 --> 00:32:45,260
that's going to expand.
510
00:32:45,260 --> 00:32:48,340
But think about how AI in general learns today.
511
00:32:48,340 --> 00:32:53,920
By and large, it's because we throw a lot of data at it and do reinforcement learning and
512
00:32:53,920 --> 00:32:54,920
supervised learning.
513
00:32:54,920 --> 00:33:00,200
There is such a thing as unsupervised learning, but by and large, the inherent intelligence
514
00:33:00,200 --> 00:33:06,360
of the fastest, strongest, biggest computer in the world, which you could argue is the
515
00:33:06,360 --> 00:33:13,320
internet itself, is not greater than that of an invertebrate, of a worm.
516
00:33:13,320 --> 00:33:18,520
There are tasks that human babies learn by themselves or by watching that computers can't
517
00:33:18,520 --> 00:33:19,520
do.
518
00:33:19,520 --> 00:33:24,760
Fundamentally, how human brains are structured and learn is very different from how computers
519
00:33:24,760 --> 00:33:25,760
are structured.
520
00:33:25,760 --> 00:33:28,940
It doesn't mean that we need to build computers like humans, by the way.
521
00:33:28,940 --> 00:33:32,560
It leads to a whole philosophical question of what is intelligence, how do you define
522
00:33:32,560 --> 00:33:35,680
it, and can intelligence be replicated in different ways?
523
00:33:35,680 --> 00:33:41,200
I fall under the school of thought that there's not a single path towards intelligence.
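As a small illustration of the learning regimes mentioned a moment ago, the sketch below contrasts supervised learning (a classifier that is given labels) with unsupervised learning (clustering the same synthetic data without labels); reinforcement learning, which learns from reward signals, is not shown. The data is synthetic, and the use of NumPy and scikit-learn is an assumption made for illustration, not something discussed in the conversation.

```python
# Supervised vs. unsupervised learning on the same toy data.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# Two synthetic "patient groups" in a 2-D feature space.
group_a = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(100, 2))
group_b = rng.normal(loc=[3.0, 3.0], scale=0.5, size=(100, 2))
X = np.vstack([group_a, group_b])
y = np.array([0] * 100 + [1] * 100)  # labels: only the supervised model sees these

clf = LogisticRegression().fit(X, y)                       # supervised: learns from labels
clusters = KMeans(n_clusters=2, n_init=10).fit_predict(X)  # unsupervised: finds structure on its own

print("supervised accuracy:", (clf.predict(X) == y).mean())
print("cluster sizes found without labels:", np.bincount(clusters))
```

Both approaches do well here only because the toy data is cleanly separated; the contrast is in what information each one is given.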
524
00:33:41,200 --> 00:33:47,080
Going to your question really more specifically, look, we've been going through revolutions
525
00:33:47,080 --> 00:33:49,160
for thousands of years at this point.
526
00:33:49,160 --> 00:33:54,480
We started off thinking that the Earth was the center of the universe, and then we realized,
527
00:33:54,480 --> 00:33:56,760
no, the Earth orbits a star.
528
00:33:56,760 --> 00:33:59,000
We thought that that star was the center of the universe.
529
00:33:59,000 --> 00:34:02,520
Then we said, no, no, that star is just one in a galaxy.
530
00:34:02,520 --> 00:34:04,960
Then we said that galaxy is the only thing in the universe.
531
00:34:04,960 --> 00:34:09,640
Now we know that galaxy is one of thousands, millions, trillions in the universe.
532
00:34:09,640 --> 00:34:14,560
We have gone from an anthropocentric view of the world to a heliocentric view of the
533
00:34:14,560 --> 00:34:22,040
world to perhaps a pantheistic-centric view of the world.
534
00:34:22,040 --> 00:34:23,680
Why not?
535
00:34:23,680 --> 00:34:30,760
Why can't we also accept that things that we create, AI is a creation of human beings,
536
00:34:30,760 --> 00:34:32,860
can actually create things that we can't?
537
00:34:32,860 --> 00:34:40,440
We already do automation in factories, and those factories do production at a higher
538
00:34:40,440 --> 00:34:45,520
quality and at higher speed than we do.
539
00:34:45,520 --> 00:34:47,000
Think about cars.
540
00:34:47,000 --> 00:34:50,600
We invented cars, and then we used machines to perfect those cars.
541
00:34:50,600 --> 00:34:54,560
We built machines that built machines that built the machines that built the cars.
542
00:34:54,560 --> 00:34:59,240
We have created layers upon layers upon layers upon layers.
543
00:34:59,240 --> 00:35:07,800
I'm not personally afraid of AI diminishing my humanity if AI can do something better.
544
00:35:07,800 --> 00:35:08,920
I don't celebrate that.
545
00:35:08,920 --> 00:35:13,420
We created AI, and AI can do things that I couldn't even dream of.
546
00:35:13,420 --> 00:35:16,160
What I think needs to happen is for us not to lose purpose.
547
00:35:16,160 --> 00:35:17,720
I think that's a different question.
548
00:35:17,720 --> 00:35:22,880
If humans lose their purpose as individuals and as a society and as a civilization, then
549
00:35:22,880 --> 00:35:24,400
yes, then they're screwed.
550
00:35:24,400 --> 00:35:30,280
There is a danger then in recognizing that things that we create are better than us in
551
00:35:30,280 --> 00:35:35,480
so many different ways that we start losing purpose, but it doesn't need to be that way.
552
00:35:35,480 --> 00:35:39,360
I'm totally okay with AI doing things better than me.
553
00:35:39,360 --> 00:35:44,800
Sure, I just don't need to lose sight of purpose in that conversation.
554
00:35:44,800 --> 00:35:46,160
I'm a big fan of you, man.
555
00:35:46,160 --> 00:35:47,920
I wish more people thought like that.
556
00:35:47,920 --> 00:35:50,280
There's a lot of fear in this space.
557
00:35:50,280 --> 00:35:54,580
You said something very interesting that I want to pick at more, that we don't need to
558
00:35:54,580 --> 00:35:57,440
build computers like humans.
559
00:35:57,440 --> 00:36:03,200
As per my understanding of the current regulatory space, there's still a need for us to understand
560
00:36:03,200 --> 00:36:05,480
what the AI is doing.
561
00:36:05,480 --> 00:36:10,280
And I'm not a computer engineer, but there needs to be some transparency over the neural
562
00:36:10,280 --> 00:36:12,640
networks or whatever the learning process is.
563
00:36:12,640 --> 00:36:18,920
At what point do we let go of that need for control and say, if there is correlation between
564
00:36:18,920 --> 00:36:25,960
outcomes or the outcomes the AI is producing are great and better than ours, then we don't
565
00:36:25,960 --> 00:36:30,680
need to understand the process because we might not be able to, and maybe we're limiting
566
00:36:30,680 --> 00:36:36,120
the scope of AI by saying, okay, we need to understand what's going on here on the backend.
567
00:36:36,120 --> 00:36:41,720
Yeah, so you're talking about explainability and that's tough.
568
00:36:41,720 --> 00:36:46,120
Every single one of your questions has been tough because they're not clear cut answers.
569
00:36:46,120 --> 00:36:47,480
Once again, things evolve.
570
00:36:47,480 --> 00:36:53,080
You have to also be humble enough, and I'm speaking to myself here, that with new data, with
571
00:36:53,080 --> 00:36:57,800
new knowledge, with new expertise, you factor that into your thinking and your thinking
572
00:36:57,800 --> 00:36:58,800
may change.
573
00:36:58,800 --> 00:37:01,680
Once again, thinking will change, will evolve.
574
00:37:01,680 --> 00:37:06,800
I think the underlying motivations and principles, those can stay constant.
575
00:37:06,800 --> 00:37:15,200
So I think explainability is crucial for us to be able to justify, rationalize, make sense
576
00:37:15,200 --> 00:37:16,440
of things.
577
00:37:16,440 --> 00:37:23,520
We're not there in terms of being comfortable with the unexplainable when we have created
578
00:37:23,520 --> 00:37:26,000
the unexplainable.
579
00:37:26,000 --> 00:37:29,720
There's plenty of things that happen in the world that are unexplainable to us.
580
00:37:29,720 --> 00:37:36,480
Life is in many ways, arguably, random.
581
00:37:36,480 --> 00:37:44,440
There are so many variables to track that we don't have the capacity to determine that
582
00:37:44,440 --> 00:37:45,920
this caused that, and that caused this.
583
00:37:45,920 --> 00:37:48,240
There's so many different factors at play.
584
00:37:48,240 --> 00:37:49,880
We're comfortable with randomness.
585
00:37:49,880 --> 00:37:55,000
We're comfortable with non-explainability to a certain degree when we are not behind
586
00:37:55,000 --> 00:37:56,760
it.
587
00:37:56,760 --> 00:38:03,680
When we think about, oh, I was born in a rich family or I was born in a poor family.
588
00:38:03,680 --> 00:38:05,960
Well, I can't explain that.
589
00:38:05,960 --> 00:38:07,760
It happened.
590
00:38:07,760 --> 00:38:09,360
We're reasonably comfortable with that.
591
00:38:09,360 --> 00:38:10,760
We may argue about it.
592
00:38:10,760 --> 00:38:11,760
We may debate about it.
593
00:38:11,760 --> 00:38:17,000
We may at the end of the day say, I don't understand how it happened, but we know that
594
00:38:17,000 --> 00:38:18,000
it happens.
595
00:38:18,000 --> 00:38:23,080
And we have over the course of hundreds of generations gotten comfortable with that.
596
00:38:23,080 --> 00:38:27,360
But we're not comfortable when we create that unexplainability.
597
00:38:27,360 --> 00:38:32,720
We're not comfortable with the fact that I created a box and I can't explain what the
598
00:38:32,720 --> 00:38:33,900
box does.
599
00:38:33,900 --> 00:38:38,280
So that is ultimately on us eventually getting comfortable.
600
00:38:38,280 --> 00:38:43,920
If the AI is doing a good job and we have tested it hundreds and millions of times and
601
00:38:43,920 --> 00:38:49,140
it is leading to better patient outcomes, it's making our societies better.
602
00:38:49,140 --> 00:38:50,500
Maybe we should consider it.
603
00:38:50,500 --> 00:38:56,960
Maybe we should consider: I can't explain how, but I know that this is better for us.
604
00:38:56,960 --> 00:39:00,760
We're not there and I'm not advocating we get there anytime soon.
605
00:39:00,760 --> 00:39:05,800
Right now I think it is very important for us to go from one step to another step to
606
00:39:05,800 --> 00:39:06,800
another step.
607
00:39:06,800 --> 00:39:09,760
To be honest, I'm not comfortable with that yet.
608
00:39:09,760 --> 00:39:14,240
If I were being treated for something and an AI did something and we couldn't explain
609
00:39:14,240 --> 00:39:15,960
how it did it, I would worry.
610
00:39:15,960 --> 00:39:17,960
I would be grateful, but I would worry.
611
00:39:17,960 --> 00:39:21,180
What if we do follow-ups here?
612
00:39:21,180 --> 00:39:26,160
If something happens down the road and we can't explain what the AI did, then how do I
613
00:39:26,160 --> 00:39:28,520
ensure my future treatment goes well?
614
00:39:28,520 --> 00:39:33,520
I think we do need the regulation and we need to create frameworks.
615
00:39:33,520 --> 00:39:35,040
This will be an evolving conversation.
616
00:39:35,040 --> 00:39:40,280
It will take perhaps far more than your or my lifetime.
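For a sense of what practical explainability work can look like today, one common model-agnostic technique is permutation importance: shuffle one input at a time and measure how much a black-box model's accuracy drops. The sketch below is a minimal illustration on synthetic data and assumes scikit-learn is available; it is not a clinical or regulatory-grade method, just one way to peek inside the box.

```python
# Permutation importance for a black-box classifier on synthetic data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 500
X = rng.normal(size=(n, 4))                    # four synthetic "clinical" features
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # the outcome truly depends on features 0 and 1 only

model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
baseline = (model.predict(X) == y).mean()

# Shuffle one feature at a time; a large accuracy drop means the model relies on that input.
for j in range(X.shape[1]):
    X_perm = X.copy()
    X_perm[:, j] = rng.permutation(X_perm[:, j])
    drop = baseline - (model.predict(X_perm) == y).mean()
    print(f"feature {j}: accuracy drop when shuffled = {drop:.3f}")
```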
617
00:39:40,280 --> 00:39:44,680
I'll ask what I think hopefully is an easier question.
618
00:39:44,680 --> 00:39:51,680
If your LPs came to you tomorrow and said, we don't want any more returns, keep all the
619
00:39:51,680 --> 00:39:54,960
money we've committed, what would you do the day after?
620
00:39:54,960 --> 00:39:58,880
What if that amount was a billion?
621
00:39:58,880 --> 00:40:01,000
Wow.
622
00:40:01,000 --> 00:40:05,400
That is not a situation that will happen.
623
00:40:05,400 --> 00:40:10,640
That I have a billion tomorrow and my LPs say keep it.
624
00:40:10,640 --> 00:40:12,240
I know where you're going with this.
625
00:40:12,240 --> 00:40:15,440
You're asking the hypothetical question: if money wasn't an object, what would you
626
00:40:15,440 --> 00:40:16,440
do?
627
00:40:16,440 --> 00:40:17,440
That's the question you're asking.
628
00:40:17,440 --> 00:40:23,120
If money wasn't an object and you had a lot of money at your disposal, it's not just that
629
00:40:23,120 --> 00:40:26,840
you don't need any more money for whatever you want to do, but you have quite a bit of
630
00:40:26,840 --> 00:40:27,840
capital that-
631
00:40:27,840 --> 00:40:33,400
The second part of your question is an assumption that the world is the same as it is right
632
00:40:33,400 --> 00:40:37,140
now, meaning tomorrow hasn't changed anything.
633
00:40:37,140 --> 00:40:44,080
If tomorrow's world is significantly different than today's world, then I would perhaps take
634
00:40:44,080 --> 00:40:45,840
different actions.
635
00:40:45,840 --> 00:40:50,440
If there's an asteroid hurtling toward the earth tomorrow and I have a billion dollars
636
00:40:50,440 --> 00:40:56,280
and I can do something about it, well, yes, then that's what I should do.
637
00:40:56,280 --> 00:40:58,640
Because the principle is to value human life.
638
00:40:58,640 --> 00:41:00,160
The principle is to value intelligence.
639
00:41:00,160 --> 00:41:05,440
The principle is to make sure that all of us on this little spaceship traveling through
640
00:41:05,440 --> 00:41:10,400
the universe, that we have the ability to reach the stars one day.
641
00:41:10,400 --> 00:41:15,260
Getting a little poetic here, but the principle here is to preserve intelligence and human
642
00:41:15,260 --> 00:41:18,680
life in general, and not just human life, life in general.
643
00:41:18,680 --> 00:41:25,160
Being more specific here, I would continue running what I do today as a VC fund and I
644
00:41:25,160 --> 00:41:26,580
would do a lot more of it.
645
00:41:26,580 --> 00:41:31,760
If I have a billion dollars, well, then I can do almost 10 times more than I'm doing
646
00:41:31,760 --> 00:41:32,920
right now with 85 million.
647
00:41:32,920 --> 00:41:37,080
I would try to invest beyond what I'm investing in right now.
648
00:41:37,080 --> 00:41:40,600
I'm focused on AI in healthcare, my partner on AI in enterprise.
649
00:41:40,600 --> 00:41:44,840
We might start doing AI in biotech, AI in climate tech.
650
00:41:44,840 --> 00:41:46,080
We're focused on the US right now.
651
00:41:46,080 --> 00:41:47,560
We're open to Canada.
652
00:41:47,560 --> 00:41:49,080
We might start doing other geographies.
653
00:41:49,080 --> 00:41:51,920
I would love to invest more in emerging countries.
654
00:41:51,920 --> 00:41:56,440
India and Brazil are natural fits for me, given my heritage, given my connections, given
655
00:41:56,440 --> 00:41:58,000
my knowledge of those two markets.
656
00:41:58,000 --> 00:42:02,920
But I would love to be able to do more things in Singapore, in the UK, perhaps even in other
657
00:42:02,920 --> 00:42:05,840
countries where we are comfortable operating.
658
00:42:05,840 --> 00:42:08,200
I would love to build a bigger team.
659
00:42:08,200 --> 00:42:11,280
That's a necessity actually, not a desire.
660
00:42:11,280 --> 00:42:13,320
It would be absolutely a requirement.
661
00:42:13,320 --> 00:42:15,640
I would like to perhaps expand.
662
00:42:15,640 --> 00:42:19,640
We want to be early stage, but with a billion dollars, it becomes tricky to be an early
663
00:42:19,640 --> 00:42:20,760
stage fund.
664
00:42:20,760 --> 00:42:25,280
Maybe we would become a seed, Series A, and Series B fund and start leading those deals.
665
00:42:25,280 --> 00:42:29,240
With a billion, you probably will have to go beyond series B. That is not what I was
666
00:42:29,240 --> 00:42:31,960
thinking, but we would have to seriously consider it.
667
00:42:31,960 --> 00:42:35,560
I would like to build the type of fund, and that's what I'm focused on right now, that
668
00:42:35,560 --> 00:42:37,720
does really good and really well.
669
00:42:37,720 --> 00:42:44,640
So 10X returns, but also in some ways, the type of companies we're investing in create
670
00:42:44,640 --> 00:42:45,640
value.
671
00:42:45,640 --> 00:42:50,960
I believe very much that if you create value, you get valuation.
672
00:42:50,960 --> 00:42:56,120
It's unfortunate for capitalism that those two things are not one and the same, but we
673
00:42:56,120 --> 00:43:00,840
can make sure that we operate so that they both help each other.
674
00:43:00,840 --> 00:43:04,320
There are ways of making money in this world that do not create value.
675
00:43:04,320 --> 00:43:09,600
It sounds like you have found your ikigai, which is the Japanese concept of purpose.
676
00:43:09,600 --> 00:43:13,720
And it's the intersection of what you're good at, what you love to do, and what people
677
00:43:13,720 --> 00:43:14,960
will pay you for.
678
00:43:14,960 --> 00:43:17,160
I want to be mindful of the time, Amit.
679
00:43:17,160 --> 00:43:18,760
Do you have time for one more question?
680
00:43:18,760 --> 00:43:20,480
If not, we can end it here.
681
00:43:20,480 --> 00:43:21,480
Let's do it.
682
00:43:21,480 --> 00:43:22,480
Let's do it.
683
00:43:22,480 --> 00:43:23,480
It's been an honor, Rishad.
684
00:43:23,480 --> 00:43:24,480
You're very kind.
685
00:43:24,480 --> 00:43:27,120
You're very tough on your questions and very kind on your comments.
686
00:43:27,120 --> 00:43:28,360
What makes you resilient?
687
00:43:28,360 --> 00:43:31,240
This is something I've been thinking about for my kids.
688
00:43:31,240 --> 00:43:37,440
My previous answer would be you have to go through adversity to be resilient.
689
00:43:37,440 --> 00:43:42,720
Looking at the studies and the data out there, from what I've found, resilience comes from
690
00:43:42,720 --> 00:43:49,880
a good internal and external support system and doesn't necessarily require experiencing
691
00:43:49,880 --> 00:43:55,240
obstacles and overcoming them in a healthy fashion without maladaptive behavior.
692
00:43:55,240 --> 00:43:57,960
What makes you resilient?
693
00:43:57,960 --> 00:44:00,240
I might be changing your answer by saying that.
694
00:44:00,240 --> 00:44:01,240
No, no, no.
695
00:44:01,240 --> 00:44:02,940
I think you're right.
696
00:44:02,940 --> 00:44:04,720
No man is an island.
697
00:44:04,720 --> 00:44:10,200
I think part of it is, yes, your internal makeup, somewhat shaped by your experiences, not always.
698
00:44:10,200 --> 00:44:14,880
I do think you can learn from other people's experiences, by the way. I'll
699
00:44:14,880 --> 00:44:21,400
take a very stupid example: you wouldn't go and jump into a well.
700
00:44:21,400 --> 00:44:23,440
That's by the way a proverb in Hindi.
701
00:44:23,440 --> 00:44:26,640
The reason you don't do that is because you know that jumping into a well for the last
702
00:44:26,640 --> 00:44:29,000
guy who did it didn't turn out as well.
703
00:44:29,000 --> 00:44:31,640
You learn from somebody else's experience.
704
00:44:31,640 --> 00:44:36,440
I think there's a component here of what you experience yourself, what you learn from others,
705
00:44:36,440 --> 00:44:39,760
what you learn from others by watching them, what you learn from others by reading about them,
706
00:44:39,760 --> 00:44:43,800
what you learn from others by just what other people share with you.
707
00:44:43,800 --> 00:44:49,240
There's a component of resilience that is absolutely the village around you.
708
00:44:49,240 --> 00:44:54,520
Now there's obviously situations, incredible stories of people who beat all kinds of odds
709
00:44:54,520 --> 00:44:56,960
with very little support systems.
710
00:44:56,960 --> 00:45:01,100
There are also stories of people with a lot of support systems who don't get the amount
711
00:45:01,100 --> 00:45:04,800
of resilience perhaps that they were hoping for.
712
00:45:04,800 --> 00:45:05,800
There's a spectrum.
713
00:45:05,800 --> 00:45:06,840
It's very contextual.
714
00:45:06,840 --> 00:45:10,560
If you have resilience in one area, you may not have as much resilience in another.
715
00:45:10,560 --> 00:45:16,000
I'm not aware of those studies, you might be, but I am willing to bet you that between physical
716
00:45:16,000 --> 00:45:21,720
and mental resilience there's a correlation, but they're not necessarily completely connected.
717
00:45:21,720 --> 00:45:29,600
I may be great at handling stress at work, but be terrible at handling stress when I'm
718
00:45:29,600 --> 00:45:30,600
running.
719
00:45:30,600 --> 00:45:32,800
It's not a perfect correlation.
720
00:45:32,800 --> 00:45:36,200
For me personally, I think it's all of the above.
721
00:45:36,200 --> 00:45:38,400
I think resilience is a muscle in some ways.
722
00:45:38,400 --> 00:45:40,320
You have to keep exercising it.
723
00:45:40,320 --> 00:45:43,080
It's easy to become too comfortable.
724
00:45:43,080 --> 00:45:45,760
I'm grateful for all the people around me.
725
00:45:45,760 --> 00:45:52,000
First and foremost, my wife, she's my rock and she didn't pay me to say all of this.
726
00:45:52,000 --> 00:45:56,200
When she hears this, I'll hopefully make some brownie points, but it's really true.
727
00:45:56,200 --> 00:46:01,820
She gives me a lot of wisdom and she gives me a lot of direction and she shows me how
728
00:46:01,820 --> 00:46:02,820
to be better.
729
00:46:02,820 --> 00:46:08,000
Obviously, my parents, they were the ones who gave me the foundation.
730
00:46:08,000 --> 00:46:13,320
My teachers, my mentors, both in the past and in the present, my friends, both in the
731
00:46:13,320 --> 00:46:17,040
past and in the present.
732
00:46:17,040 --> 00:46:22,400
There's people that I've never met, some of them alive, some of them not alive, who are
733
00:46:22,400 --> 00:46:23,800
role models.
734
00:46:23,800 --> 00:46:27,960
Obviously, I'm building Tau Ventures with a team.
735
00:46:27,960 --> 00:46:29,640
My co-founder for sure.
736
00:46:29,640 --> 00:46:34,600
The reason we are building this fund together is because we know we are a good team.
737
00:46:34,600 --> 00:46:35,920
We are a good partnership.
738
00:46:35,920 --> 00:46:41,400
We can keep each other both accountable, but also bring the best in each other.
739
00:46:41,400 --> 00:46:43,120
Big shout out here to you, Sanjay.
740
00:46:43,120 --> 00:46:46,160
I don't think I'm perfect at this, Rishad.
741
00:46:46,160 --> 00:46:47,760
Nobody is, to be honest.
742
00:46:47,760 --> 00:46:53,760
The day I believe I am or that I've hit my limits, then that means that I'll start failing.
743
00:46:53,760 --> 00:46:56,120
It's a good reminder to myself, there's always more to learn.
744
00:46:56,120 --> 00:46:57,920
There's always more to unlearn.
745
00:46:57,920 --> 00:47:04,880
I discover every day, as time goes by, that for something I knew, there's far more to learn about it.
746
00:47:04,880 --> 00:47:05,880
Resilience included.
747
00:47:05,880 --> 00:47:07,760
It's been great talking to you, Amit.
748
00:47:07,760 --> 00:47:10,480
Thanks so much for coming on the show today.
749
00:47:10,480 --> 00:47:15,760
We didn't get to talk too much about investing or other topics I had in mind.
750
00:47:15,760 --> 00:47:18,760
Would love to do it again in the new year.
751
00:47:18,760 --> 00:47:19,760
Absolutely.
752
00:47:19,760 --> 00:47:20,760
Thank you for having me.
753
00:47:20,760 --> 00:47:22,520
Thank you to all of you watching us.
754
00:47:22,520 --> 00:47:25,080
We are TauVentures.com.
755
00:47:25,080 --> 00:47:28,480
Feel free to check us out.
756
00:47:28,480 --> 00:47:31,480
We read everything that reaches our inbox.
757
00:47:31,480 --> 00:47:32,960
So you're welcome to reach out.
758
00:47:32,960 --> 00:47:36,640
I'm not able to respond to everyone, but I will certainly read it.
759
00:47:36,640 --> 00:47:40,840
Once again, we're a focused seed-stage fund, primarily enterprise and healthcare, investing in the
760
00:47:40,840 --> 00:47:42,680
US, but very much open to Canada.
761
00:47:42,680 --> 00:47:43,680
Awesome.
762
00:47:43,680 --> 00:47:55,320
Thanks, Amit.