Episode Transcript
Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.
0:00
Anaconda is a popular platform for data
0:02
science, machine learning, and AI. It
0:06
provides trusted repositories of Python and R
0:08
packages and has over 35 million users
0:11
worldwide. Rob Futrell is
0:13
the CTO at Anaconda, and he joins
0:15
the show to talk about the platform,
0:17
the concept of an OS for AI,
0:19
and more. This episode
0:21
is hosted by Lee Atchison. Lee
0:24
Atchison is a software architect,
0:26
author, and thought leader on
0:28
cloud computing and application modernization.
0:31
His bestselling book, Architecting for Scale,
0:33
is an essential resource for technical
0:35
teams looking to maintain high availability
0:37
and manage risk in their cloud
0:39
environments. Lee is the
0:41
host of his podcast, Modern Digital
0:44
Business, produced for people looking to
0:46
build and grow their digital business,
0:48
listen at mdb.fm. Follow
0:51
Lee at softwarearchitectureinsights.com and
0:53
see all his content
0:55
at leeatchison.com. Rob,
1:10
welcome to Software Engineering Daily. Thank
1:12
you, Lee. It is very exciting to be here. I
1:15
actually have listened to this podcast in the past, and
1:17
so it's a little bit of extra excitement in addition
1:19
to looking forward to a great conversation. So
1:22
why don't you start out by just
1:24
telling me, you know, what is Anaconda?
1:26
I'm sure most people on our podcast
1:28
have at least heard of it, but
1:31
I'm sure there are probably many
1:33
people who haven't yet. So let's start
1:35
out with setting the groundwork and
1:37
telling everyone exactly what Anaconda is. Oh
1:40
man, there's quite a few different answers to
1:42
that question. I guess at its core, you
1:44
know, Anaconda is a company that is really
1:46
focused on helping people innovate and doing that
1:48
by giving them a way of connecting to
1:51
the broader open source ecosystem specifically around Python,
1:53
but not just Python. And
1:55
Anaconda was originally named Continuum Analytics when
1:58
it was founded. They did a
2:00
lot of work in the data science space,
2:02
recognized the need to really empower Python programmers,
2:05
specifically in data science and other areas. And
2:07
in order to do that, basically they were
2:10
solving their own problems. These were Python developers
2:12
that were actually trying to help enterprises run
2:14
their numerical computing workloads, run other kinds of
2:16
workloads. And they realized, especially in like Windows
2:18
environments and other environments, that there was a
2:20
need to standardize how people got
2:23
access to all the broader open source ecosystems,
2:25
the Python packages that millions of people around
2:27
the world were producing. And so they created
2:29
this distribution of Python packages and Python
2:31
libraries (not all of them written in Python, by the way,
2:33
and that's kind of the tricky part, a lot of this stuff's
2:35
written in Fortran, C++, and other languages), and
2:38
getting the right bits onto people's computers, helping
2:40
them do that simply, manageably, et cetera, they
2:42
came up with the Anaconda distribution, which is
2:45
where actually the name came from. You want
2:47
a big distribution of Python, oh, big Python, Anaconda.
2:50
And then suddenly everybody knows Continuum
2:52
Analytics through the Anaconda distribution. So
2:54
that's how the name got changed
2:56
to Anaconda. And so at
2:58
its heart, we produce trusted repositories of
3:01
Python and R packages, actually. We
3:03
provide products and services around security
3:05
and governance of those packages.
3:08
And again, that connection to the broader
3:10
open source ecosystem. And we actually have
3:12
kind of enterprise products, as you'd imagine,
3:14
around AI, data science, et cetera. That's
3:16
our workbench product and others. And we
3:18
also do a tremendous amount around the
3:20
open source ecosystem. So we either actively
3:22
participate in, or fund or develop many
3:24
different open source projects, everything from Numba
3:26
to BeeWare to Panel to on and
3:29
on and on. And we donate quite
3:31
a bit of money to the various
3:33
open source projects as well. But
3:35
again, the real goal of the company
3:37
is to connect scientists, data
3:39
scientists, engineers, programmers, knowledge workers, actually.
3:41
We can talk about that later,
3:43
but Microsoft last year
3:45
announced that you are now
3:47
actually able to use the Python programming
3:50
language inside of Excel. That's
3:52
really exciting to hear. Yeah. I
3:54
remember hearing that. When
3:56
you have the broader knowledge workers, the kind of advanced
3:58
knowledge workers that want to have access... Again, not
4:00
just to the Python programming language, but
4:03
really to that broader ecosystem and all
4:05
the innovations those packages and community provides.
4:08
And Anaconda wants to connect people to
4:10
that. Yeah, I know that one of
4:12
the problems that lots of languages have,
4:14
I'm more familiar with the problem of
4:16
how packages work in Ruby and Rails,
4:18
especially when there are third party packages
4:20
that are not all in Ruby. And
4:22
so, how it has to build
4:24
in the background and the
4:26
installation of some of those packages can be quite
4:28
problematic. Python is a lot easier in general,
4:31
but a lot of that is because of
4:33
the Anaconda Packaging System. Is
4:35
that correct? Is that the best way to think of it? I think
4:37
so, yeah. And actually there are open source
4:40
efforts there; Anaconda created Conda specifically,
4:42
and that's what the majority of the user base
4:44
uses, actually. And I think it's funny, our
4:46
last measurement, it was actually 45 million
4:48
users, which I think a lot of that comes from
4:50
the explosion in AI and Python kind of
4:52
being the lingua franca of AI and
4:55
using Conda to set up your Python environment and
4:57
not just get the right bits on your computer
4:59
in terms of, you know, is this
5:01
NumPy, is this SciPy, is this scikit-learn, is
5:03
this Keras or PyTorch, but getting all the dependencies,
5:06
all the things, those things rely on and doing
5:08
that solving in order to make sure those environments
5:10
run. Yeah, that's exactly the problem you're talking
5:12
about. So quickly deploy small
5:15
scale apps to share. I think I got that
5:17
phrasing from your website and that sounds a little
5:19
bit like what you're describing, but it is probably
5:21
a lot more than that. But I know
5:24
one of the major use cases for your platform,
5:26
and you just mentioned it briefly here now,
5:29
is the growing field of artificial
5:31
intelligence and AI tooling and
5:34
data science in particular. You've
5:37
kind of taken that a step further and
5:39
you're actually pushing a concept
5:41
that you call OS for AI. Can
5:43
you talk a little bit about that? You
5:46
mean out of Anaconda, I'm sorry? Yeah.
5:49
Okay. I
5:51
guess it's more of a conceptual or philosophical
5:53
point than like a literal operating system. And
5:55
that does confuse some people, but the idea
5:57
here is: you have
5:59
this. The operating system you know and
6:01
love takes all the hardware and abstracts it. You
6:04
know, especially if you go back to the
6:06
seventies, and you'll have a similar
6:08
story up until this day,
6:10
but you had hardware, and you
6:12
needed the operating system in order to take
6:14
all of that complexity and actually make a
6:16
platform that enables
6:18
applications, that lets people actually use the hardware, and kind
6:20
of enables change. And conceptually speaking,
6:22
that's what we feel we deliver. By
6:24
providing that middle
6:26
layer there that connects people to that
6:29
broader open source ecosystem, it makes it simple
6:31
and easy to develop your applications. Everything
6:35
from data science to modern AI
6:38
models and applications.
6:40
But are there things specific to AI? I mean,
6:42
are there specific things
6:45
you do towards the AI
6:47
use case? Yeah, absolutely. So
6:49
it's all an extension of what we do. If you
6:49
look at it as, hey, Anaconda, I get my
6:51
Python packages from you, we
6:53
provide ways of doing this. You know, for the individual
6:55
user, that's what they care about: I want to get
6:58
my environment set up, only get access to what I
7:00
need, so I can train a model, I can deploy
7:02
a model, I can leverage that model inside
7:04
of my applications. Then you have the organizations, people
7:06
that want to make sure that they have some
7:08
control over the Wild West that those other users
7:11
are interfacing with. There's hundreds of
7:13
thousands, if not more, different packages that
7:15
can be used, and it's not a very
7:17
secure world. And so that
7:19
was the data science space,
7:21
and now, if you step back and say, okay,
7:24
well, Anaconda is really good at getting bits onto
7:26
your computer, the right bits for you,
7:28
simply, securely, auditably, etc., then we change
7:30
the definition of those bits. Now it's not just
7:32
Python and R packages. Now maybe it's
7:34
datasets that you want to use in
7:36
order to train your models or fine-tune your models,
7:38
or the models themselves that you need to get
7:40
installed and get up and running easily, again,
7:42
for your specific environment. Are you running some kind
7:45
of Linux server? Are you running a Windows
7:47
desktop? What kind of CPU do you have?
7:49
Do you even have a GPU? Is it a
7:51
mobile device? And so
7:53
if you broaden what Anaconda does and say it's
7:56
not just Python and R packages, again,
7:58
those bits happen to be models, they happen
8:00
to be datasets, etc., then you can see
8:02
why the kind of AI explosion is
8:04
a natural extension of what Anaconda does.
8:09
So you also do the data delivery for
8:12
use by AI? I
8:14
would say some of this stuff is
8:16
today, some of this stuff will be
8:18
coming soon. Yeah, so it's a lot
8:21
of the rapidly changing improvements
8:23
we're making right now, which, to be honest, I
8:25
can't even say what's released and what isn't; I
8:27
can't even keep on top of it. The point
8:29
is that, as I said, we have the
8:31
millions of users, we have people that understand who we are,
8:33
we help them get their environments up and
8:35
build applications and so on, and so that's a
8:37
very natural extension for them to come to us and
8:39
say, will you help us with these problems too, and for
8:41
us to solve them. So yes, some of this stuff is possible today,
8:43
some of this stuff we'll do in the near future, and
8:46
some stuff is gonna be coming.
8:48
Some of those datasets will be giant, right?
8:50
So are you still packaging those
8:52
for delivery, or are you also
8:54
looking at tooling to get access
8:56
to datasets in other locations?
8:59
It opens up a whole world of
9:01
possibilities there, and a whole world we
9:03
can address as well. And it's not
9:05
just access to the data itself, it's everything
9:07
around the actual provenance of the data. You have
9:09
the ongoing copyright discussions, and, kind of,
9:11
do I have access to the right
9:14
data? Do I even know that I'm
9:16
using the kind of data to train on that I have
9:18
a license to train on, or that I
9:20
have the right to train on? And so helping
9:22
people solve those problems... And
9:24
it's not just us, it's a very common
9:26
refrain that people are holding off on some of these technologies
9:28
because they don't understand the exposure. They don't understand
9:30
what they can do and how they can do
9:33
it. And then you'll even hear things about, like,
9:35
what data should even be trained on, what should
9:37
it be using. And so actually I
9:39
attended a very interesting presentation on the Phi models
9:41
from Microsoft, and they were
9:43
making a point that sometimes training these models is
9:43
a bit garbage in, garbage out; you know, you want
9:45
to use high quality data. So it's one thing to train
9:47
your models on all the
9:49
data on the internet; it's another thing to say we went
9:51
and actually took a subset of data and got
9:53
better output, because we don't need to
9:55
have every possible piece of fan fiction that was ever
10:00
written, incorporated into our model when we really want
10:02
it to be an expert in, you know, I
10:04
don't know, science or physics or something else. It's
10:06
not just about the size of the data sets.
10:08
It's about provenance. It's about security. It's about auditability,
10:10
reproducibility, and then collaboration and sharing. Cool.
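The provenance and auditability ideas Rob lists here map to a simple pattern in practice: content-hash each dataset and record its origin and license in a manifest that can be re-checked later. A minimal sketch in Python (the manifest format and field names are illustrative, not an Anaconda tool):

```python
import hashlib
import json
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Hash a file's bytes so the exact dataset used for training can be verified later."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def record_provenance(dataset: Path, source: str, license_id: str, manifest: Path) -> dict:
    """Append one dataset's provenance entry (origin, license, content hash) to a JSON manifest."""
    entry = {
        "file": dataset.name,
        "sha256": sha256_of(dataset),
        "source": source,        # where the data was obtained
        "license": license_id,   # the terms believed to permit training on it
    }
    entries = json.loads(manifest.read_text()) if manifest.exists() else []
    entries.append(entry)
    manifest.write_text(json.dumps(entries, indent=2))
    return entry
```

Auditing then reduces to re-hashing each file and comparing against the manifest, which also gives you the reproducibility and collaboration story: anyone holding the manifest can confirm they trained on exactly the same bytes.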
10:13
Cool. So two different topics, taking
10:15
from words that
10:17
you said, and I'd like to talk about
10:19
them independently. We could probably have a whole
10:21
conversation on both of them, but one
10:24
of them is secure and the other one
10:26
was copyright. So let's
10:28
take whichever one you want to cover first,
10:30
but I'd like to talk about both of
10:32
those areas of what you do. I
10:34
mean, flip a coin. They're both tremendously interesting. Then
10:37
let's start with security. All right.
10:39
So you, you mentioned securely deliver
10:41
to the desktop, the package,
10:44
whatever, whether the package is data, code,
10:46
whatever. So that implies
10:48
a lot of things and perhaps a
10:50
lot more than what you actually do.
10:53
I'm not exactly sure it can
10:55
be a wide spectrum of different things that
10:57
securely delivered to your desktop can mean. Why
10:59
don't you tell me what you mean by
11:01
that? Yeah, again, it could absolutely be
11:03
a lot of things, but I guess at its heart,
11:05
it's, you know, they talk a lot about the
11:07
software bill of materials. You want to think of it
11:09
as like chain of custody. Like you watch, you know,
11:12
Law & Order and other kinds of police procedurals, and
11:14
you know, you want to make sure that, you know,
11:16
that evidence, you understand what happened to it every step
11:18
of the way so that there's nothing that was corrupted,
11:20
there was nothing, no kind of broken trust or misuse.
11:22
You can think of packages and whatnot, the same thing.
11:24
So how do I know that this open source code
11:26
that I'm pulling into my own, you
11:28
know, my own computer and I'm trying to use,
11:31
or I'm trying to share with others doesn't have
11:33
anything nefarious in it. It wasn't the wrong code.
11:35
It wasn't something that somebody, you know, some malicious
11:37
actor injected something bad into it. You know, there've
11:39
been plenty of very high profile security incidents
11:41
over the years where maybe a company like, well,
11:44
I don't name any names, but you can look
11:46
them up, but lots of big name companies had
11:48
a supplier that provided them software, provided them
11:50
something. And the attack actually came through the
11:52
supplier to that major company. And so
11:54
didn't talk much about my background, but I spent
11:56
17 years in high performance computing space. So I've
11:59
started several companies over the years, and
12:01
the last one that I founded was acquired by Microsoft in
12:03
2017 and so I joined Azure
12:06
and I led their HPC and AI software infrastructure
12:08
team from the product side actually, as opposed
12:11
to the dev side, and then that's where I
12:13
left and then joined Anaconda. And so, you know,
12:15
while at Microsoft it was a very, very, very
12:17
major concern that anything we're pulling in from anywhere
12:19
else that we understood exactly what we were doing
12:21
because Microsoft was such a huge target for various
12:24
obvious reasons and our clients were such huge targets
12:26
and the last thing we wanted is for people
12:28
to be able to kind of backdoor into
12:30
our clients through us, through our suppliers and so on.
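The chain-of-custody idea described here is, at its most basic, checksum verification: before any bits pulled from a supplier are used, compare them against a digest published through a trusted channel. A minimal Python sketch (illustrative, not Anaconda's actual implementation; conda's repository metadata, for example, carries per-package SHA-256 digests for this kind of check):

```python
import hashlib

def verify_artifact(artifact_bytes: bytes, expected_sha256: str) -> bytes:
    """Return the bytes only if they match the digest published out-of-band; otherwise refuse them."""
    actual = hashlib.sha256(artifact_bytes).hexdigest()
    if actual != expected_sha256:
        raise ValueError(f"checksum mismatch: expected {expected_sha256}, got {actual}")
    return artifact_bytes
```

A tampered download then fails loudly instead of silently landing a backdoored dependency on the machine.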
12:33
All those same concepts apply to getting bits
12:35
onto your computer and so when it's Python
12:37
packages, whether it's a model that comes from
12:39
somewhere, whether it's a data set that's being used,
12:41
you want that same chain of custody
12:43
knowledge, you want that same provenance, you want
12:45
that same control. That could be for security
12:47
reasons, like I said, in terms of malicious
12:49
actors, but it can also even just be
12:51
for reproducibility later. If you're going to use
12:53
this stuff in pharmaceutical industries or financial industries
12:55
or other industries that have regulatory needs or
12:57
other kinds of, you know, especially life or
12:59
death situations, you know, if your model
13:02
makes certain decisions, if your system causes certain things
13:04
and something goes wrong, they're going to want to understand why,
13:06
and you better be able to reproduce your results. So
13:08
even just being able to track your data, track
13:11
your models, track your packages and control them over
13:13
time is very important. This
13:20
episode of Software Engineering Daily is
13:23
brought to you by HookDeck. Building
13:25
event-driven applications just got significantly easier
13:27
with HookDeck, your go-to event gateway
13:29
for managing webhooks and asynchronous messaging
13:32
between first and third party APIs
13:34
and services. With HookDeck, you
13:36
can receive, transform, and filter webhooks
13:38
from third party services and throttle
13:40
the delivery to your own infrastructure.
13:42
You can securely send webhooks triggered
13:45
from your own platform to your
13:47
customers' endpoints. Ingest events at
13:49
scale from IoT devices or SDKs
13:51
and use HookDeck as your asynchronous
13:53
API infrastructure. No matter your use
13:55
case, HookDeck is built to support
13:58
your full software development lifecycle. Locally,
14:00
use the HookDeck CLI to
14:02
receive events on your localhost, automate
14:05
dev, staging, and prod environment creation using
14:07
the HookDeck API or
14:09
Terraform provider, and gain full visibility of all
14:11
events using HookDeck's logging and
14:14
metrics and the HookDeck dashboard. Start
14:16
building reliable and scalable
14:18
event-driven applications today. Visit
14:21
hookdeck.com/sedaily and sign up
14:23
to get a three month trial
14:25
of the HookDeck team plan
14:27
for free. So,
14:35
putting limits on what
14:37
you do, and one of the things you
14:39
don't do: what you
14:41
do for security validates and
14:44
makes sure the chain of custody of the bits
14:46
is correct and validated, so that
14:48
what people get delivered to their computer
14:50
is what you intended it to be.
14:53
But you're not doing things
14:55
like packaging the code onto servers and
14:57
computers and running it in virtual environments,
14:59
and that sort of managing
15:02
that process. Not necessarily, yeah, that's
15:04
generally more what our users do. How would I put
15:06
this... the kind of stuff you can take from where we
15:08
are. We do things like CVE curation in some
15:10
of our paid-for products, so we can give
15:12
people the ability to say, look, these packages,
15:14
they do a lot, but they have
15:16
known vulnerabilities that we're okay with.
15:18
These other things we can mitigate, or they don't
15:20
apply to us, or we understand them. It's
15:22
okay, go ahead, use those. But these other CVEs, these
15:25
other problems: do not use those. And the fact is,
15:27
you're giving organizations the ability to just have a
15:29
bit of control over the wild west, and so
15:31
they want to empower their users, their data
15:33
scientists and engineers, but they have to have
15:35
some level of control over that. And so
15:37
yes, it's everything from building the source, as
15:39
you know, building it ourselves, so that we
15:42
know that what you're getting is what we
15:44
say we have, that we've actually produced the
15:46
artifacts from the source code itself, to CVE
15:48
curation, to making sure it integrates with
15:50
other security providers' tools. As a general philosophy,
15:50
I very much try to force myself to
15:52
not see this competitively, or not-invented-here, or somewhere in between.
15:55
So anytime someone comes by and says, what about
15:57
this, or there's this tool that we like
15:59
to use, what we do instead is ask: how can we
16:01
work together to give you a better experience,
16:03
a better solution. And so when it comes
16:05
to things like security and other capabilities, I
16:08
don't view what Anaconda has as having to do it all.
16:10
I don't view people who are part of the ecosystem, whoever
16:12
they are, as threats to Anaconda. It's
16:15
kind of growing the pie for everyone, so we
16:17
do look and say, we don't need to do every possible
16:19
thing for security; we need to make sure that
16:21
the pieces that we handle, we handle very, very well,
16:23
and we integrate with other tooling so that people
16:25
can get the solution they need. Good.
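The CVE-curation workflow described above boils down to a per-organization allow/deny decision over known vulnerabilities. A toy sketch of that policy check (package names and CVE IDs here are made up; a real tool would pull advisories from a vulnerability feed rather than a hard-coded table):

```python
# Known CVEs per package version (hypothetical data; a real system queries an advisory feed).
KNOWN_CVES = {
    ("examplepkg", "1.2.0"): ["CVE-2024-0001", "CVE-2024-0002"],
    ("otherpkg", "3.1.4"): [],
}

# CVEs the organization has reviewed and accepted (mitigated, or not applicable to its environment).
ACCEPTED_CVES = {"CVE-2024-0001"}

def is_allowed(name: str, version: str) -> bool:
    """Allow a package only if every known CVE against it has been explicitly accepted."""
    outstanding = [cve for cve in KNOWN_CVES.get((name, version), [])
                   if cve not in ACCEPTED_CVES]
    return not outstanding
```

The point of the design is that the accept list is an organizational artifact: security review happens once, and every environment solve afterward is checked mechanically against it.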
16:36
This episode of Software Engineering Daily
16:39
is brought to you by Vantage. Do you
16:41
know what your cloud bill will be for this
16:43
month? For many companies, cloud costs are
16:45
the number two line item in their
16:47
budget and the number one fastest growing
16:49
category of spend. Vantage helps
16:51
you get a handle on your
16:53
cloud bills, with self-serve reports
16:55
and dashboards built for engineers, finance,
16:57
and operations teams. With Vantage, you
16:59
can put costs in the hands
17:01
of the service owners and managers
17:04
who generate them, giving them budgets,
17:06
alerts, anomaly detection, and granular visibility
17:08
into every dollar. With native billing
17:10
integrations with dozens of cloud services,
17:12
including AWS, Azure, GCP,
17:14
Datadog, Snowflake, and Kubernetes,
17:16
Vantage is the one
17:19
FinOps platform to monitor and
17:21
reduce all your cloud bills. To
17:23
get started, head to vantage.sh,
17:25
connect your accounts, and get
17:27
a free savings estimate as part
17:30
of a fourteen-day free trial.
17:38
Well, it sounds like those two are
17:41
maybe more closely related than
17:43
I first thought. For copyright, you
17:45
implied that you do things to help
17:48
with copyright. I imagine
17:50
you're talking about the same sorts of
17:52
things, as far as trusting where a source
17:54
of information, whether or not, I'm
17:57
assuming, it comes from the original source
17:59
and is credited back to
18:01
the original source. But can you elaborate a
18:03
little bit more on what you meant by copyright? Yeah,
18:05
so we don't today, you know, if you
18:08
go vet a dataset, that's not
18:10
something we're doing. It's more something we're
18:12
very, very interested in leaning into, and actually
18:14
helping to solve that problem,
18:16
because we want to be part of the
18:18
solution there. It's really,
18:20
if we want to help people innovate by connecting
18:22
to open source and helping them do it simply and securely,
18:24
then we have to help solve that as well.
18:26
And so, Peter Wang,
18:29
you know, co-founder of Anaconda and
18:31
the CEO for a long time, again,
18:33
big shoes to fill here, he
18:35
was CEO and recently actually stepped over into
18:38
the role of Chief AI Officer, and
18:40
is deep into the copyright and the data
18:42
provenance questions and making sure that we're all doing
18:44
the right thing with how we use intellectual
18:47
property and data for training models and
18:49
whatnot. That is kind of a passionate subject
18:51
for him; it's actually one of the reasons why
18:53
he co-founded Anaconda. You can listen to him
18:56
talk about those topics with such intensity on plenty of
18:58
platforms. And so he is also leading
19:00
some kind of AI-specific technology and
19:02
other developments within Anaconda. And so a
19:04
lot of this is just around being aware
19:06
of that problem space and leaning in, everything from
19:08
those efforts to, we actually kind of joined the
19:11
recently formed AI Alliance consortium, to
19:13
even collaborate with people in the legal industry and others, to again
19:16
be aware of this problem and help it evolve
19:18
in a direction that we think is going to be best
19:20
for innovation. I don't think it's right to look at
19:22
this and say lock it down because we don't know
19:24
how to handle it. At the same point, you
19:27
have to use this responsibly so that we can
19:29
keep this kind of commons around for everybody to
19:31
benefit from. As
19:38
a listener of Software Engineering Daily, you
19:40
understand the impact of generative AI.
19:43
On the podcast, we've covered many
19:45
exciting aspects of generative AI technologies, as
19:47
well as the new vulnerabilities and
19:49
risks they bring. HackerOne's
19:51
AI red teaming addresses the
19:53
novel challenges of AI safety
19:55
and security for businesses launching new
19:57
AI deployments. Their approach involves
19:59
stress-testing AI models and deployments
20:02
to make sure they can't be tricked
20:04
into providing information beyond their intended use,
20:06
and that security flaws can't be exploited
20:08
to access confidential data or systems. Within
20:11
the HackerOne community, over 750 active
20:14
hackers specialize in prompt hacking and
20:16
other AI security and safety testing.
20:19
In a single recent engagement, a team of
20:21
18 HackerOne hackers quickly identified
20:23
26 valid findings within
20:25
the initial 24 hours and accumulated
20:28
over 100 valid findings in the two-week
20:31
engagement. HackerOne offers strategic
20:33
flexibility, rapid deployment, and
20:35
a hybrid talent strategy.
20:38
Learn more at hackerone.com/AI.
20:40
That's hackerone.com/AI.
20:52
So, it is safe to say that
20:54
you are, as a
20:57
company, skewing towards AI
20:59
versus the general package
21:01
delivery process. Is
21:03
that a fair statement? Yes. Okay.
21:07
So, our existing business, our existing community, our
21:09
existing efforts in user base, like that is
21:11
not being abandoned in any way, shape, or
21:13
form. A lot of this is, as I
21:16
mentioned, a natural extension of what our users have been
21:18
asking us to help them with, or in many cases, what
21:20
we've been helping them with. So, I know you know this,
21:22
and a lot of the listeners are going to know
21:24
this, but AI reminds me of those bands that
21:27
have been toiling for 10 or 15 years. And
21:29
then a hit album, and people are like, whoa, where did
21:31
this overnight success come from? And it's like, this
21:33
has been going on for a long time. And
21:36
we've been doing machine learning and artificial
21:38
intelligence almost from the beginning of the
21:40
company. It's just the new
21:43
kind of impact that LLMs
21:45
have had, and some of the advances they've
21:47
enabled in conversational programming, and chat interfaces, and
21:49
that kind of revolution. I think it's broadened
21:51
what people can apply it to. It's broadened
21:53
the number of scenarios people can actually, the
21:55
problems people can solve with the stuff. And
21:58
so, it feels a bit like an overnight shift. And it
22:00
is in the sense of those breakthroughs, but it's not
22:02
like the field of AI hasn't existed for decades. In
22:05
1997, I worked as a summer intern
22:07
for a company called Biocomp Systems in
22:09
Seattle that used neural nets to help
22:11
people do industrial optimization. People used it
22:13
for like financial stuff, which the CEO
22:16
was really unhappy with. And
22:18
I remember leaving that job and thinking, oh, neural nets are so
22:20
cool, I'll never see those again. Obviously
22:22
very wrong about that. But anyway, so this stuff's been
22:24
around for a while and Anaconda has been working on
22:26
it and helping people with these problems for a very
22:28
long time. And so it's really just, I think,
22:31
the explosion of interest, the kind of addressable
22:33
market, if you will, the number of problems
22:35
makes it seem like we're, I guess, you
22:37
know, moving over there. And I
22:39
think instead it's really more adding these capabilities
22:41
and bringing them to our community and then
22:43
adding all that energy and kind of growing
22:45
it for everyone. I got a
22:47
story to go along with your early AI story.
22:50
In 89, I was working for Hewlett-Packard. And
22:52
my boss at the time, along with his boss and
22:55
a couple other people from the group I was working
22:57
at, had moved not long
22:59
ago from working on a Lisp
23:01
machine, which was part of the HP
23:04
AI strategy. And that was, it was
23:06
all Lisp. Lisp was the AI language
23:08
of choice. That was back in the
23:10
80s. And it wasn't long where you
23:12
didn't hear about that anymore. And it
23:14
was like, well, AI obviously isn't gonna
23:16
happen. This is just, no, it's never
23:18
gonna come. This is unrelated to
23:21
anything that has any value anytime in
23:23
the future. And then bam, it hasn't
23:25
been a bam. It's just been worked
23:27
on little by little in the background.
23:29
And now suddenly it's come into its
23:31
stride. And it really has
23:33
made some great progress in recent years. A
23:35
good friend of mine, a gentleman named Ian
23:37
Fender, he actually has a Lisp machine. Really?
23:40
Oh, wow. Ian collects computers
23:43
and he has a mind blowing collection
23:45
of technology that he has. I've
23:47
worked with him for years previously and yeah, good friend.
23:49
But yeah, the Lisp world is definitely world I'm familiar
23:51
with. And one thing I try to keep in mind
23:54
with anything is things are never quite as good as
23:56
they seem, but they're never as bad as they seem
23:58
either. And with hype cycles, it's... It's the same
24:00
thing. And so, you know, AI is going
24:02
to take over the world. Am I, you know,
24:04
super worried about, you know, the impending doom
24:06
or the impending transformation? No, I don't think it's going
24:08
to be as good or as bad, but I think
24:10
you're right. People get caught up in those cycles, like
24:12
they did in the seventies and the eighties and the
24:15
nineties and the disappointment that it isn't as good as
24:17
they thought it was going to be leads to that
24:19
disillusionment. When in reality, it's just a long, steady march
24:21
towards progress. Exactly. Yep. Yep. And
24:23
AI is going to be a major part
24:25
of that, just like other technology has been,
24:27
but no more, no less than that. Eventually.
24:29
We just don't know exactly where yet. And
24:31
so I was speaking to a group
24:34
of interns the other day, and
24:36
their number one question was, will we have
24:38
jobs anymore? Software interns. Software interns. I said,
24:40
well, a couple of things to keep in
24:42
mind. One, AI is going to
24:44
change jobs, but it's not going to eliminate
24:46
jobs. It's going to change jobs. And just
24:48
like any other piece of technology, it's going
24:50
to change jobs. But the other thing
24:52
is, if any job is going
24:54
to be even more important than it
24:56
was before, when in the world of
24:58
AI, it's software developers. So don't worry.
25:00
And that actually helped a lot, I
25:02
think, but I think it's amazing how
25:04
many people are actually worried about AI
25:06
at this point. And not
25:08
really for valid reasons. Well, they're valid reasons,
25:11
of course, but they're not reasons that are
25:13
going to come to fruition. We just don't
25:15
know yet. It's just too early to
25:17
tell for sure what's going to happen. But
25:19
history has shown us what will likely happen.
25:22
I think it's that uncertainty that is what
25:24
drives the worry. If you have a high
25:26
trust environment, if you can rely
25:29
on your community, your friends, your coworkers, whatever
25:31
group you're thinking about, and you have that
25:33
support and you have that trust, I think
25:35
you can face that uncertainty and that adversity
25:37
together, and it removes some of the worry. I
25:39
think when you have a low trust environment, then you feel much more
25:42
responsible for yourself to solve those problems. And I think it leads
25:44
to that anxiety of, what if I do lose my job? How
25:46
am I going to pay my mortgage and feed my family? But
25:48
I do try to maintain a positive attitude that, technology,
25:52
industrial, and societal progress has
25:55
made the world better for everyone. I would not
25:57
trade my life today to go back and be
25:59
a Roman emperor. I absolutely would not. Absolutely
26:01
not. And so, as a result, again, I try
26:03
to keep in mind that yes, there's going to
26:05
be change, but we're going to navigate it together. And
26:07
my own purpose, you know, my own kind of goal,
26:10
is to make the world a better place, and
26:12
hopefully capture a bit of that betterment ourselves.
26:14
I am a capitalist, but I do want to make the
26:16
world a better place, and that means helping. That
26:18
sounds grandiose, but it's like
26:21
trying to make it so that, as we go through this revolution,
26:23
it is better for everyone. But to your point...
26:25
Yeah, I absolutely agree about jobs of the future. And one thing
26:27
that I... I made a mistake early on. I
26:30
took a lot of pride in being like a
26:32
hardcore C++ developer and having really
26:34
low-level knowledge, and as a result of that
26:36
I shied away from Python for a time. For
26:38
me it was like, if the tool wasn't hard,
26:40
was it a real tool? And I had kind
26:42
of lost the purpose, which is to solve a
26:44
problem. Like, why are you doing what you're doing?
26:46
And all of a sudden it was like, hey, approach
26:48
the problems with the best tool
26:51
for the job. And so if AI is going
26:53
to remove grunt work, if it's gonna be this
26:55
network of experts that I can have helping me
26:57
solve problems and answer questions and broaden my creativity,
26:59
help me learn things, like, how is that not better
27:01
for me? And so yeah, I think as long
27:03
as you view what you're doing as
27:05
helping people solve problems, address challenges, do things
27:07
like that, and you don't get caught up
27:10
in a specific skill set or
27:12
tool, or your knowledge, you know, kind of esoteric
27:14
knowledge or something, as being the reason why you
27:16
have value, then you'll be able to adapt. You'll
27:18
be able to learn, you know, whatever those tools
27:20
are, and keep helping people solve problems.
27:23
I think we have a similar background. I
27:26
spent most of my early career in C++
27:28
as well, and I was
27:30
a huge advocate of C++, in fact.
27:32
Someone actually wrote an
27:34
article about the work we were
27:36
doing, because we were
27:38
among the first ones
27:40
to use C++ code in
27:42
a Unix kernel. Like, that
27:45
was revolutionary back then. It's not that
27:47
hard, you know; this is how
27:49
it works. Of course, it's all
27:51
pretty common nowadays, but it was kind
27:53
of interesting at the time. I always imagined I
27:55
would never move away from C++.
27:57
The thing that moved me away
27:59
from C++ was the project that
28:01
ended up pointing me into learning Ruby.
28:04
And I realized Ruby was such a
28:06
simple language and yet really high-powered, and
28:08
now, of all the languages I've used,
28:10
Ruby is by far my favorite, just because
28:13
it's so easy to do what you
28:15
want to do. Now, it's not the best for a
28:17
lot of environments, a lot of the
28:19
projects I work on nowadays, but at the time
28:21
it was really interesting. And it's the language
28:23
that, like your Python, pulled
28:26
me away from C++. Yeah, I
28:28
totally get it. I took such pride.
28:30
I did a ton of that template metaprogramming,
28:32
never going back, in the early aughts, and
28:34
I was so proud. And, in turn, I'm
28:36
not gonna lie, I did come up with
28:38
a great design for a particularly challenging problem that
28:40
I was working on at the time. But I read
28:42
Alexandrescu's books and all that stuff, and
28:44
there was so much pride in the difficulty
28:46
of it. And it really was a revelation to say
28:48
that it's just about the right tool for that job.
28:50
There are plenty of other tools out there, and
28:52
yeah, I was introduced to Ruby too, and it
28:54
was fine, actually. Early on we had Python versus
28:56
Lisp arguments, but then we used Chef, back
28:59
when it was still Opscode, but the Chef
29:01
automation language, and that was my introduction to Ruby, and it
29:03
was the same thing of, like, oh, again,
29:05
right tool for the right job. It really
29:07
broadened my perspective. Makes
29:10
perfect sense. So, as I was doing
29:12
research for Anaconda, I came across a
29:14
phrase that I found very enlightening
29:16
at the time, very
29:18
interesting, that I don't think really
29:20
applies now that we've had this
29:23
conversation. Or I think a
29:25
better phrase might apply given that context.
29:27
Let me tell you what the phrase
29:29
was: it was that Anaconda is low-code
29:32
AI development. Now,
29:34
I don't see that in what you're
29:36
saying here, but maybe a better phrase,
29:38
and I'll offer you one for that same concept, might
29:40
be: faster on-ramp to AI.
29:42
Get started with AI quicker, easier.
29:44
Is that a better description of what
29:47
you're really doing? I think
29:49
those are two aspects that don't quite
29:51
map to the same thing, though they are
29:53
related. The low-code part: so
29:55
we acquired EduBlocks this year, these kind of coding
29:57
blocks. And just to get this out of the way, it's really
29:59
aimed at helping people understand and learn
30:01
Python. It's aimed at students and whatnot, and "Edu" is
30:04
kind of in the name. But the real focus
30:06
there, if you kind of take the abstraction out of it,
30:08
is low-code, no-code kind of composition of
30:10
capabilities to produce results. And if you look at
30:12
things like, I want to have models and I
30:15
want to incorporate them in my application, again, that's just
30:17
more Lego blocks that you're kind of using to
30:19
assemble and build your final structure. But yes, faster
30:21
on-ramp to AI: that is both what we're trying
30:23
to do today and, I guess, a guiding principle
30:26
of the company. And that's everything from
30:29
making it simple for, you know, me myself, but
30:31
also for people in corporate settings. And you're gonna see
30:33
some stuff coming from us later this year.
30:35
I think it'll embody those things much
30:37
more clearly. Good. I am
30:39
very much looking forward to seeing that. That's
30:39
great. Let's talk about
30:42
standardization. How does standardization fit
30:44
into your strategy? In
30:46
what way? What do you mean? So,
30:48
is AI closed or
30:51
open? That's,
30:53
well, I'm obviously biased in this perspective: for
30:55
us, open, for sure. Like, I
31:00
think open source ecosystems have clearly
31:03
demonstrated, over the last couple decades,
31:05
that they really drive innovation. And,
31:07
you know, look at the change in
31:09
Microsoft. Look at them under Nadella
31:12
and under Ballmer. Under Ballmer,
31:14
open source was absolutely feared there,
31:16
and fought, and all that kind of thing at
31:18
the time. And they're now one of the
31:20
largest, maybe the largest, contributor to open source projects
31:22
in the world. And it's really, it's not
31:25
just in words; it's in their actions, as
31:27
can be seen. And I think
31:29
being able to say, look, there are billions of people in this world,
31:32
and we want to empower all those people,
31:34
you know. Talent is evenly distributed; opportunity
31:36
is not. So let's make opportunity be equally
31:38
distributed, and let's give people the ability to
31:41
contribute all that innovation. I don't see how
31:43
any closed ecosystem, any one company, could possibly
31:45
hope to compete with that. And so, I work
31:48
at a company, and when it comes to this,
31:50
again, I'm a capitalist, but I think the
31:52
world is a better place when you have that open
31:54
innovation. And there is absolutely
31:57
a role for us to play there, and we're
31:59
excited to look at this and say, okay, what
32:01
are the kinds of problems the community doesn't want
32:03
to solve? What are the kinds of problems the
32:06
communities aren't good at solving? What are the kinds
32:08
of problems that we can collaborate with those communities?
32:10
And then what are the kinds of problems those
32:12
communities run into in organizing and collaborating and maturing
32:14
and operating over time? And how can we help
32:16
address those problems? And so when it comes to
32:18
that, I guess I'm a true believer that open
32:20
source and open collaboration, that's just, that's the main
32:22
driver. Standardization, I think really helps when it comes
32:24
to helping to enable that, you
32:27
know, if you want to get together with
32:29
a friend and go to dinner, that's trivial, right? You know,
32:31
now it's maybe a birthday party, and you got 10, 20
32:34
people takes a little bit more effort, might have to schedule
32:36
a time, might have to make a reservation, you have to
32:38
put stuff for that. Now you're doing a wedding, and
32:40
you got 100 people, you got 200 people, and
32:42
like that takes real planning. And then now you
32:44
have like a major concert, you have your Taylor
32:47
Swift, and you want to come to a city
32:49
and take over. And it's just it's just order
32:51
of magnitude more collaboration. And I think when you
32:53
want to have innovation happen at scale, and when
32:55
you want to enable people to build and to,
32:57
again, put those Lego blocks together, you have to
32:59
have some kind of definition of a Lego block,
33:01
right? If you had 50 different kinds of sizes
33:03
and connectors and whatnot, like no one's building, you
33:05
know, whatever the latest hot Lego model. And
33:08
so I think there's a role for standardization, which is really all people just
33:10
coming together and saying, look, let's define the, you know,
33:12
you're a programmer, so you get it, like you have an
33:14
API, let's define the interfaces. And let's define those interfaces
33:16
that we are all free to innovate within those interfaces.
33:18
But now we have those ways of collaborating together. So
33:21
that's where I believe standardization plays a role. Where I
33:23
don't like standardization, and I'd go so far as
33:25
to say I think you, and definitely listeners, would agree
33:27
with this, is when it's used for any kind of
33:29
capture, when it's used for any kind
33:31
of, in the old days, companies were trying to get into
33:33
standards bodies and make it so that their technology
33:35
was the only compliant one or things like that.
33:37
But again, getting back to supporting open source and
33:40
supporting the commons, I think there are ways of
33:42
having those standards that foster innovation and foster collaboration
33:44
without locking people out. So you
33:46
can imagine open
33:48
interfaces for AI, that
33:51
general comment can apply to a
33:53
couple of different layers within the
33:55
AI ecosystem. Certainly, companies
33:57
like OpenAI are trying
34:00
to create open interfaces to give you
34:02
access to AI. But
34:04
there's also interfaces to consistent uses
34:07
of the same datasets and
34:09
making datasets available for AIs
34:11
and making large language models
34:13
available for multiple use cases in different
34:15
ways. And standardization can happen
34:17
in those areas or openness, I should say.
34:20
Maybe it's a better term than standardization given the
34:22
language you're using, I'm not sure. We can talk
34:25
about that. But there's different layers
34:27
there. Where do you think we
34:29
are in that hierarchy as far as we
34:32
know how to do open software and
34:34
we know how to do open APIs? Do
34:37
we know how to do open datasets? Do we know
34:39
how to do open large language models? Do we know
34:41
how to do whatever the
34:43
next level is there? Are we
34:45
good at that yet? Are we starting out or
34:47
do we just not know where we're going yet?
34:50
That's an interesting question. I've never been asked that
34:52
question with that phrasing and framing, which is really
34:54
cool. I would say, on one hand, I
34:57
think we absolutely do know what to do. And
34:59
what I mean by that is you look at
35:01
how software has been distributed with the different licenses
35:03
and the evolution of how people open up their
35:05
code and the variety of ways people can do
35:08
it. And you say, OK, conceptually, we're probably
35:10
going to do the exact same thing for data. We're going to do
35:12
the exact same thing for models. So I think
35:14
in that sense, there's a bit of, I don't want to
35:16
say it's common sense. I'm not trying to oversimplify. But I
35:18
do think that there is a way of saying, look, these
35:20
problems are solved in this domain. Let's just broaden our perspective
35:22
and say, OK, they're probably going to be solved in a
35:24
similar way. The devil is in the details, though.
35:26
So the part that we haven't gotten right is what do those
35:28
data licenses look like? How do we make
35:31
sure that people who contribute are maybe
35:33
compensated or credited and whatnot? And so
35:35
it's not that those technical challenges and
35:37
those legal challenges and those philosophical challenges
35:39
aren't there to be solved. But I do
35:41
think that we can say we've solved these problems before
35:43
in a different domain, and we can just almost apply
35:45
those approaches in this new domain. I think there's a
35:47
tendency to look at something new and say, this is
35:49
new. And so therefore, it's entirely new. We don't know
35:52
how to do anything in this area. And
35:54
in reality, it's just a different view of the problem you've
35:56
already solved. And so you just kind of have to have
35:58
that adaptation. It's going to take a while
36:00
to shake through all that. And given the, again,
36:03
the massive uncertainty and that kind of changes there,
36:05
and honestly, the economic impact, it's probably going to
36:07
be a complicated discussion. Yeah, I 100%
36:10
agree with everything you said there, but there
36:12
is one aspect with things like
36:14
data and large language models that
36:16
use data that's different, typically, not
36:19
all the time, but typically compared
36:21
to just code. And I hate
36:23
saying the words "just code," but you know what I mean.
36:26
And that is privacy. And
36:28
whether we're talking about PII
36:30
or whatever, there's
36:33
information in data that is
36:35
specific and valuable just by
36:37
having the data, independent
36:39
of whether you have the right to use it
36:41
or not, having that information here may not be
36:44
appropriate and the privacy aspects that
36:46
go with that. That's different
36:48
than with code. There's rarely
36:50
a time where code itself,
36:53
your open source code, has to be
36:56
private. No, you're absolutely
36:58
right. It's rarely the situation with code, but it would
37:00
be a situation with data. Open
37:02
source data might still be private. Yeah,
37:05
that's exciting. The emergent risks or problems
37:07
that come up when, you know, any one
37:09
data set may be fine, but
37:12
you could put three or four of them together and suddenly
37:14
now you can identify people and suddenly now
37:16
you can do attribution. So there is an issue where,
37:18
you know, you get these emergent things that come out
37:20
of, okay, well, I've released my data set because it's
37:23
fine by itself. There's no PII or there's no way
37:25
of tying it back. But then you get, you know,
37:27
other people release their data sets and suddenly somebody realizes,
37:29
well, if I get data sets, A, B, C and
37:31
D, now I can do horrible things.
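The linkage risk being described can be sketched in a few lines of Python. This is a hypothetical illustration with invented datasets, fields, and names, not anything from the episode: each dataset is harmless on its own, but joining them on shared quasi-identifiers ties names to sensitive records.

```python
# Hypothetical illustration of re-identification by joining datasets.
# All records below are invented.

# Dataset A: an "anonymized" health dataset (no names).
health = [
    {"zip": "94107", "birth_year": 1985, "diagnosis": "asthma"},
    {"zip": "10001", "birth_year": 1972, "diagnosis": "diabetes"},
]

# Dataset B: a public roster (names, but no health information).
voters = [
    {"name": "Alice Smith", "zip": "94107", "birth_year": 1985},
    {"name": "Bob Jones", "zip": "10001", "birth_year": 1972},
]

def reidentify(health_rows, voter_rows):
    """Join the two datasets on (zip, birth_year) quasi-identifiers."""
    index = {(v["zip"], v["birth_year"]): v["name"] for v in voter_rows}
    linked = []
    for row in health_rows:
        name = index.get((row["zip"], row["birth_year"]))
        if name is not None:  # a unique match ties a name to a diagnosis
            linked.append((name, row["diagnosis"]))
    return linked

# Neither dataset alone names a diagnosis; the join does.
print(reidentify(health, voters))
```

Each provider here released something defensible in isolation; the harm only emerges from the combination, which is exactly the governance gap discussed above.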
37:34
Well, how do you deal with that problem? Because you
37:36
have to remember that each one of those data sets is
37:38
fine on its own. And so how do you... Yeah, that's
37:40
exactly a challenge. I remember the first time I learned
37:42
that, you know, this is obviously even before AI, this
37:45
is just when people were like, Oh, I can take,
37:47
I don't know, IP address information and I can take,
37:49
you know, health information. I can buy data sets from
37:51
credit card companies and, you know, your shopper's club card
37:53
for your grocery store. And I can put these things
37:55
all together and I can learn all kinds of interesting
37:57
stuff about you that you did not intend. But again,
37:59
every individual data set provider in that sense wasn't doing
38:01
anything wrong. Right, right. And so how do you then
38:03
deal with that in this world where the models can
38:05
be trained on that data and then people can actually
38:07
do stuff is a very interesting question. I
38:09
don't have an answer for that one. This is why I love being in
38:11
this space every day. So when we actually have a question like that, I get excited. Yeah,
38:14
no, that's great. Yeah, I think you
38:17
would agree that this is an important area that has
38:19
to be dealt with in open
38:21
data. Well, I do. I also
38:23
wonder how much data is actually not going to be
38:25
public. I think there are a couple of guiding principles
38:27
that I think I'll just say I have at the
38:30
moment, always open to evolving them. But one of them
38:32
is that small open source models are going to be
38:34
great, if not perfect for many, many people. But they're
38:36
going to want to be able to control the data.
38:38
Like I think the AI revolution
38:41
has really, I think, changed people's
38:43
view of the data, right? Like you see people putting
38:45
all their content behind paywalls, or locking it behind agreements saying
38:47
you're not going to use it to train models, because they
38:49
suddenly realize, hey, all that data is actually incredibly valuable in
38:51
a way that it was, it was not that it
38:53
wasn't before, but that, you know, there wasn't that direct
38:55
connection. And so I think you're going to have people keep
38:57
those data sets private, and they're going to want to
38:59
train their models internally. They're going to
39:01
want to govern them internally and probably run them
39:04
privately or at the edge or in some hybrid
39:06
fashion. And so I think that is a change.
39:08
I think you're going to see what people previously
39:10
kind of gave away, or at least it didn't
39:12
necessarily govern, and they aren't going to do that anymore.
39:15
And so that will actually hinder innovation. It'll be very
39:17
interesting to see, like, does that mean that only companies
39:20
that have massive resources like OpenAI and others and Microsoft
39:22
and the Googles of the world that can license content
39:24
are going to be able to train? Or,
39:26
you know, how is that going to evolve over time? I don't know. Yeah,
39:29
it's a great question. I think
39:31
that's one of the fundamental questions that's going to
39:33
come with the whole AI revolution.
39:36
Is AI going to be, in
39:39
general, giant data sets, or
39:41
is it going to be many, many, many
39:43
small data sets and training that comes from
39:45
that? I think there's a big
39:47
world in the
39:49
small data set AI. Let
39:52
me give you a simple example. I would love
39:54
more than anything. I write a lot of content.
39:56
I've got books and articles and everything that I've
39:58
written. I would love nothing
40:00
more than to take all my content, put
40:03
it into an AI, and have a
40:05
chat bot be on my website to
40:07
be able to have people ask questions
40:10
and respond with things that I
40:12
know. Now, I know I can
40:14
do this today. I just haven't taken the time to do
40:16
it. And I know there's companies that are
40:18
looking at or have started to do that sort of thing, but
40:21
that's really what I want. That's not a large dataset
40:24
problem. It's a large learning problem,
40:26
but it's not a large dataset
40:28
problem. Should we be
40:30
separating the learning and the ability
40:32
to learn from an AI model from
40:35
data to be able to do
40:37
things like that easier? To answer
40:39
your question, I think so, because I like
40:41
to think of these models as almost like
40:43
expert assistants, a grad student that
40:45
maybe is fresh out of college or even just
40:47
a, I don't know, computer science student fresh out
40:49
of college. If I'm building, okay, I'm a technical
40:51
co-founder of a startup and I'm trying to build
40:54
a team out, I don't look for one person
40:56
that can be the product manager, the technical lead,
40:58
the system architect, the IT person, security person, the
41:00
front end developer, back end developer, et cetera. I
41:02
look for a team of people and
41:04
I combine them together, and the sum is
41:06
greater than the individual parts, that's the power there.
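That team-of-specialists idea maps naturally onto routing requests to small expert models rather than one giant generalist. A toy sketch, where the "experts" are stand-in functions rather than real models, and the router is a simple keyword match (a real system might use a small classifier):

```python
# Toy sketch of routing to a "team" of small specialists.
# The experts are placeholder functions, not actual models.

def grammar_expert(text: str) -> str:
    return f"[grammar] reviewed: {text}"

def physics_expert(text: str) -> str:
    return f"[physics] explained: {text}"

def math_expert(text: str) -> str:
    return f"[math] worked through: {text}"

# Keyword-based routing table; a production router could be a
# lightweight classifier or a mixture-of-experts gating network.
EXPERTS = {
    "grammar": grammar_expert,
    "physics": physics_expert,
    "math": math_expert,
}

def route(question: str) -> str:
    """Send the question to the first matching specialist."""
    for keyword, expert in EXPERTS.items():
        if keyword in question.lower():
            return expert(question)
    return "[general] no specialist matched"

print(route("Can you explain this physics concept?"))
```

Each expert stays small because it only needs its own slice of knowledge; the combination, not any single model, provides the breadth.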
41:08
And I feel the same way about AI models.
41:10
And so I think you're right, being able to
41:12
say, look, we have these; you'll see that through
41:14
a mixture of experts and other systems that say,
41:16
let's take these individual pieces and put them together
41:18
and actually generate it that way. As I mentioned
41:20
earlier, if you train your model on the entirety of
41:22
data on the internet and you're gonna get every
41:25
piece of fan fiction, you're gonna get celebrity birthdays
41:27
and celebrity obituaries and you name
41:29
it, all in that model, do I need
41:31
those in order to ask it to help
41:33
me improve the flow of my
41:35
story or the grammar or to explain physics
41:37
concepts to me or I'm a big
41:40
fan of Khan Academy. If I'm trying to brush up
41:42
on different math concepts, I don't need all that. And
41:44
so I think you're exactly right, taking small data sets and small
41:46
models and putting them together so I have this team of experts
41:48
that really enable me and empower me, I do think that that
41:51
is a large part of the future. That's
41:53
great. So normally about this
41:55
time in an interview, I ask, so what's next
41:58
or what's the future? But we've been talking a
42:00
lot about that, but maybe the best
42:02
way for me to rephrase that question this
42:04
time is to say we've talked a lot
42:06
about where AI is going, but where
42:09
is Anaconda going next? Perfect.
42:12
Yeah. So we did touch on this earlier, but it's
42:15
taking what we've traditionally done, getting the right
42:17
bits on your computer, making it manageable and
42:19
governable, and then helping people solve kind of
42:21
higher level AI, machine learning, data science problems,
42:23
and expand that into data and models. I
42:25
mentioned serverless Python, things like PyScript, and whatnot
42:27
that allow you to actually execute this stuff
42:29
using the cheapest hardware, which is the hardware
42:31
you already own, i.e. your laptop or your
42:33
phone, things like that. We're also very, very
42:36
heavily focused on high performance Python. People
42:38
often talk about, oh, Python's not as fast as C++
42:40
or it's inefficient or whatever. And I think getting back
42:42
to the earlier point about why, what are you trying
42:45
to solve? Why is this or is it not the right tool for
42:47
the job? Performance was never the limiting factor. The
42:49
limiting factor was getting it into people's hands, making
42:51
it understandable. Again, collaboration, security, all this stuff that
42:54
Anaconda traditionally focused on. But
42:56
Python's the lingua franca of AI, and AI is
42:58
central to the world. And that Nvidia GPU isn't
43:00
cheap. And they're
43:02
in short supply. And so helping people get
43:04
the most out of their investment
43:07
in their infrastructure is actually a core concern.
43:09
And I think Anaconda is uniquely suited to
43:11
help solve Python performance problems. And so it's
43:13
a category of problems and it's a category
43:15
of technologies and approaches. So you're going to
43:17
see a lot of stuff from Anaconda around
43:20
that, both directly from Anaconda, but also what we're going
43:22
to foster in the open source ecosystem and the Python
43:25
ecosystem. Everything from the interpreter to the language to
43:27
the libraries, you name it. And
43:29
then you kind of combine those all together and
43:31
it's really about making it easy for
43:33
people to build these models, to
43:35
incorporate these models, to deploy these applications fast,
43:38
efficiently, effectively at scale. This
43:41
has been a great conversation, Rob. I really
43:43
appreciate it. We're so close to being out
43:45
of time. But I want to thank you
43:47
so much for coming on. This has been
43:49
a great conversation. Thank you. My
43:51
guest today has been Rob Futrich, the
43:54
CTO, not EVP of engineering, but
43:56
the CTO at Anaconda. Rob, thank
43:58
you for joining me and Software
44:00
Engineering Daily. Thank you, Lee. I
44:02
always love an exciting conversation. And thank you for
44:04
absolutely providing one. It's been fantastic talking to you.