AI in Pharmaceutical Supply Chains and Manufacturing - with Laks Pernenkil of Deloitte

Released Wednesday, 6th March 2024

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements may have changed.


0:07

Welcome, everyone, to the AI

0:09

in Business Podcast. I'm Matthew

0:11

DeMello, Senior Editor here at

0:13

Emerj Technology Research. Today's

0:16

guest is Laks Pernenkil, a

0:18

Principal in Life Sciences Consulting

0:20

at Deloitte. Laks joins

0:22

Emerj CEO and Head of

0:24

Research Daniel Faggella on today's

0:26

show to discuss AI in Pharmaceutical

0:28

Supply Chain Manufacturing, showcasing a

0:30

host of AI use cases

0:32

and applications across manufacturing and operational

0:35

lines of business. Later,

0:37

the two surmise the near future

0:39

of AI adoption in Pharmaceutical Supply

0:41

Chain Management and where business leaders

0:44

should focus their investments and attention.

0:47

Today's episode is sponsored by Deloitte,

0:49

and without further ado, here's their

0:51

conversation. So

0:58

Laks, welcome to the program. Thank you. Yeah, glad to

1:00

be able to dive in with you here today. We're

1:03

talking about a part of the life sciences world that you

1:05

are awfully up close and personal with,

1:07

which is supply chain manufacturing. As I mentioned in one

1:09

of our previous intros for the series. So

1:12

much of the AI focus in this area

1:14

is like clinical trials, drug development, but there's

1:16

all these other big operations in life sciences.

1:18

So because some of it will even be

1:20

new for our audience, I'd love to kick

1:22

off with what you're seeing boots on the

1:24

ground in terms of trends

1:27

and challenges within supply chain manufacturing

1:29

that's making AI relevant, making data relevant

1:31

today. Thanks. Well, first of all, thank

1:33

you for having me on. And of

1:35

course, I'll say this. So in my

1:38

two decades of being in

1:40

technical operations, manufacturing supply chain

1:42

quality and other aspects of

1:45

pharma and tech operations, I've

1:47

never seen a time where my

1:49

clients have been so surprised with

1:52

the pace at which AI

1:54

and data have taken over the

1:57

conversation and the focus

2:00

of organizations. Almost every single client

2:02

of ours is very indexed on

2:04

trying to think through what

2:06

does AI mean for our business, particularly

2:08

in operations. And so operations is not

2:10

far behind at all in that journey.

2:12

I'll say a couple of things. So

2:15

first, from a trend perspective, operations

2:17

has always been a game of

2:19

using data and facts in making

2:22

decisions, whether it's in

2:24

supply chains, whether it's in manufacturing,

2:26

whether it's in quality, because these

2:28

are highly engineered products and processes,

2:31

there has always been a need

2:33

for using that data to make

2:35

those decisions. What the new emerging

2:37

trends around AI has done is

2:40

put a shining light on data

2:42

that's not used. My thesis

2:44

advisor or my PhD would

2:46

often say the best

2:48

way to produce any variance

2:51

or any error in a process is

2:53

not to measure it. So in the

2:55

industry, it's always been the norm that

2:57

in the past and a long time before,

3:00

you don't measure if you can't explain why

3:02

the data is the way that it is.

3:04

But the industry has now come around that,

3:06

the regulators have come around that and there

3:08

is this mega trend in the industry in

3:10

operations on doing more with

3:13

data. The other tidbit I have, which

3:15

is along with this trend that you

3:17

see is that a typical manufacturing floor

3:20

creates about a terabyte of data every day. And

3:23

barely 5% of that is used

3:25

in operations. And so pharmaceutical

3:28

operations and our clients have

3:30

set out to change that. The second trend you will

3:32

see is that there's a resurgence of sorts

3:36

of industrial technologies in

3:39

pharmaceutical operations. And the emergence

3:41

of these technologies is creating

3:44

new sources of streaming data right from

3:46

the production floor, right from the supply

3:48

chains, where data is actively coming in,

3:51

whether it's in cell and gene therapies, the

3:53

data of where your particular therapy is

3:55

en route to the treatment

3:57

center, or in production,

4:00

where the manufacturing yield of the manufacturing process

4:02

is readily available on a dashboard that we

4:04

need to see to make sure that we're

4:06

tuning the process to get the most out

4:08

of the manufacturing process. The data

4:11

streaming from the production floor has

4:14

exponentially increased. That's because of

4:16

the new and emerging technologies that

4:19

are being deployed in manufacturing and

4:21

supply chain. The third

4:23

thing is regulators have become

4:25

more and more attuned

4:27

to leveraging these novel

4:30

technologies, including AI. So

4:32

all of these three trends, the

4:34

regulators, availability of data, the availability

4:36

of new technologies to capture the

4:39

data and do something with it,

4:41

have created a substrate for pharmaceutical

4:43

companies to bring AI to life.

4:45

And bring AI not only for

4:48

a rear-view, backward-looking analytics,

4:50

but a forward-looking one, taking

4:53

active decisions as the process is happening,

4:55

as the supply chain is executing to

4:57

deliver products to patients. So hopefully that

4:59

gives you a sense for what's happening

5:01

in the market. It does,

5:03

and I'd like to actually paint the picture. You're talking

5:05

about a terabyte of data a day, which is a

5:07

pretty overwhelming amount. I think I can

5:09

only imagine the number of pieces of equipment.

5:12

I can imagine in certain applications there's more computer

5:14

vision being used, whether it's detecting errors in the

5:16

machines themselves or the product or what have you,

5:18

and then all kinds of other movements and sensors,

5:21

et cetera. Give us an idea of what the

5:23

pie chart or some of the most important big

5:25

streams of data are and which of those are

5:27

really new? Which of those are growing? Because clearly

5:29

the expansion of this data is part of what

5:32

we need to wrangle to get value, and we're

5:34

gonna get into that. But talk about

5:36

what makes up that big pie of

5:38

a terabyte here. Yeah, yeah, that's great.

5:40

That's a great question. So generally speaking,

5:43

there's a lot of data that gets

5:45

generated in what are called L0, L1

5:47

systems. So the typical manufacturing architecture has

5:50

got from level zero to level five,

5:52

at level five are enterprise planning systems, and

5:55

level zero are sensors on the production

5:57

floor. And the L0, L1 layer is

6:00

where 70 to 80% of

6:02

the data gets generated. These are

6:04

pressure data, sensor data, recipe data,

6:07

equipment data, temperature of the environment, samples

6:09

that people take. There's a lot of

6:12

data that gets generated at that L0,

6:14

L1 level. And then

6:16

the remaining 20% is split between metadata associated

6:19

with all the other systems that are

6:21

trying to control the manufacturing process and

6:24

the ensuing business processes. So

6:26

that's typically what we see: a lot of that

6:28

data at the L0, L1 layer closest

6:30

to the production asset is not used

6:32

as much because these are high-volume data

6:34

sets and also these are regulated manufacturing

6:36

processes that we don't tend to change

6:38

as much. What we are seeing is

6:40

an incredible change in how that's

6:43

being transformed. There's an example of a

6:45

client of ours who built a manufacturing

6:47

AI tool so that on the production floor,

6:49

as things change, they can actually immediately

6:52

take an action, determining what

6:54

is the next best action they can take,

6:56

almost like a self-driving car bringing the car

6:59

back between the two lanes, and deliver the

7:01

product at the end of the manufacturing process.
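The lane-keeping analogy can be sketched in a few lines of Python. This is a hypothetical illustration, not the client's actual system; the variable, band limits, and action names are all invented for the example:

```python
# Illustrative "lane-keeping" controller sketch: watch a process variable and
# recommend the next best action whenever a reading drifts outside its target
# band. All names and limits here are hypothetical.

def next_best_action(reading: float, low: float, high: float) -> str:
    """Recommend an adjustment that steers the process back between the lanes."""
    if reading < low:
        return "increase"   # e.g. nudge temperature or feed rate up toward the band
    if reading > high:
        return "decrease"   # e.g. bring it back down toward the band
    return "hold"           # inside the band: no intervention needed

def drive_batch(readings, low=35.0, high=37.0):
    """Return the recommended action for each reading in a batch run."""
    return [next_best_action(r, low, high) for r in readings]

actions = drive_batch([36.1, 37.8, 36.5, 34.2])
# actions == ["hold", "decrease", "hold", "increase"]
```

A production version would replace the fixed band with a model of the process, but the shape of the loop (stream in readings, emit the next best action) is the same.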

7:03

I would say a lot of it is

7:05

on the lowest level of data generation. Got

7:08

it. And clearly leveraging that data

7:10

is part of how we bring to

7:12

bear the value of artificial intelligence. There are

7:15

probably innumerable applications beyond

7:17

even the vision

7:20

in manufacturing and other things that I might be

7:22

thinking about. You're deep in this game. You're looking

7:24

at which of these are adding

7:26

value today. You're looking at which of these

7:29

applications of AI could make

7:31

the biggest impact in the next couple of

7:33

years in terms of really driving efficiencies and

7:35

effectiveness within operations. What are some use cases

7:37

in this domain that for you are

7:39

really worth talking about that leaders should understand and that

7:41

you can help explain? Yeah, so

7:44

I'll break them into kind of

7:46

broadly with the emergence of

7:48

generative AI. I'll talk specifically about generative

7:50

AI and where generative AI applications are

7:52

in pharmaceuticals and that type. But let

7:54

me start with what was the journey

7:57

in the last half decade or so

7:59

around bringing large machine learning and AI

8:01

into the production floor as a starting point

8:03

for the use cases and then we'll step

8:05

over to the generative AI. On the machine

8:07

learning model, the

8:10

use cases and applications spread

8:12

between trying to get control

8:14

of your manufacturing process, trying to

8:17

get control of inventory, trying to

8:19

get control of distribution

8:21

and logistics of your products

8:23

in the marketplace. So each

8:25

of those domains has

8:27

a significant set of applications of using

8:29

AI. As an example, we talked

8:32

at length about manufacturing process data

8:35

and then controlling the process. Another

8:37

example on the production floor is

8:39

almost always inadvertently there are deviations

8:41

that happen or nonconformances

8:43

that happen in manufacturing. Most

8:46

of them, 80 to 90% of them are

8:48

benign nonconformances. Somebody forgot to

8:50

put a signature, somebody forgot

8:53

to change their temperature from

8:55

35 to 37. These

8:58

really don't have an impact on the

9:00

product because the process is usually robust.

9:02

But a lot of these deviations are

9:04

written up because of the regulated process,

9:06

then somebody has to adjudicate whether this

9:08

is a risk to the product and

9:10

to the patient. So using

9:12

AI to very quickly scan

9:14

these deviations, adjudicate them,

9:16

and then only give that

9:19

5% of significant critical deviations

9:21

to a human to look at is a

9:23

use case that almost 89%

9:26

of my clients are trying to deploy

9:28

right now actively in the marketplace. And

9:30

these are all just legacy, I'll call

9:32

it legacy, but machine learning type AI.
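The deviation-triage pattern described here can be sketched as follows. This is a hedged illustration only: a real deployment would use a trained classifier over historical deviation records, and the keyword rules and example texts below are invented stand-ins:

```python
# Hypothetical sketch of deviation triage: score each written-up deviation and
# route only the significant ones to a human reviewer. Keyword rules stand in
# for the trained model a real system would use.

CRITICAL_TERMS = {"sterility", "contamination", "out of spec", "potency"}

def is_critical(deviation_text: str) -> bool:
    """Flag a deviation as potentially significant to product or patient."""
    text = deviation_text.lower()
    return any(term in text for term in CRITICAL_TERMS)

def triage(deviations):
    """Split deviations into (needs human review, auto-closeable benign)."""
    critical = [d for d in deviations if is_critical(d)]
    benign = [d for d in deviations if not is_critical(d)]
    return critical, benign

critical, benign = triage([
    "Operator forgot to initial step 4 of the batch record",
    "Potency assay result out of spec for lot 123",
    "Incubator set to 35 C instead of 37 C for 10 minutes",
])
# critical holds only the potency deviation; the other two route to auto-closure.
```

The value lever is the routing itself: the bulk of benign write-ups never consume reviewer time, while the small critical fraction still gets a human decision.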

9:35

In the generative AI space, which

9:38

is picking up significantly right now,

9:40

there are assistive applications that clients

9:42

are thinking about and

9:45

generative applications. So a typical pharmaceutical

9:47

company will need to submit what's

9:49

called an annual report on that product

9:51

that kind of looks back at the

9:53

last year and say here are all the changes

9:55

that happened in the manufacturing process, the product and

9:57

so on and so forth and they compile a

9:59

lot of data and create a report out of

10:01

it. A powerful use case for

10:04

generative AI is one that can pick up data

10:06

sets and create a standard report that

10:08

then somebody, a human, can go and edit

10:10

and amend. That's a use case that a

10:12

lot of clients are thinking about. A

10:14

typical pharma company comes

10:16

to the point where you have a million documents

10:19

that are SOPs and manuals and all

10:21

of these things that people need to

10:24

understand. A large language model can read

10:26

all those documents and become an SOP

10:28

assistant for manufacturing and operations, for operators

10:31

on the production floor. There

10:33

are many of these generative AI use

10:35

cases that are coming to bear. Supplier management

10:38

is another generative AI use case:

10:40

being able to track and see which

10:43

suppliers are performing well, and then creating and

10:45

then communicating with those suppliers. If you're seeing

10:47

these trends, what are you going to do

10:49

in terms of adjusting your performance? That's another

10:51

use case. There are a plethora of use

10:54

cases. In fact, we put out

10:56

a few pieces on what those use cases

10:58

are in the public domain as well. Yeah,

11:01

I'm sure that some of that's going to be tied into

11:03

our broader show notes and the bigger sort of picture we're

11:05

painting here in life sciences with all of your experts. I

11:08

want to dive into a few of these that you mentioned

11:10

if that's all right just to paint a mental picture for

11:12

the listeners who are sort of tuned in and sort of

11:14

saying, okay man, what does this look like in real life?

11:16

So one example you talked

11:18

about was in the manufacturing process, we've got

11:20

temperatures and pressures and combinations of chemicals and

11:22

a thousand things that happen, all of which

11:25

need to be measured and managed because we

11:27

cannot have a negative impact on patients. And

11:30

we can have AI sort of sift through those

11:32

variances. We've got a lot of streams of data

11:34

here. I imagine some of this stuff, I don't

11:36

know if all of this is coming from amazingly

11:38

fancy new machinery. I imagine some of it is coming out

11:40

in a pretty, you know, gunky old

11:42

school format that somebody has to export

11:45

and harmonize and then load into something. Yeah,

11:47

it's a great point. I have no idea

11:49

what this looks like in practice, but it

11:51

seems like kluge is the word

11:53

that comes to mind. I mean, just from

11:55

what I know about manufacturing outside of life sciences,

11:57

kluge is pretty much the modus operandi. Life

12:00

sciences might be a little bit sharper, but you know, individual

12:02

pieces of equipment that are exactly the same, just the

12:06

temperature settings need to be different for this one than this

12:08

one. Stuff is crazy in the

12:10

world of physical stuff. Let me ask this,

12:12

you know, when it comes to finding that

12:14

variance, my guess is what has to happen

12:16

is we need to have enough history of

12:18

what that variance is and enough humans really

12:21

identify, hey, when these temperatures at

12:23

this point or these pressures at this

12:25

point or this combination that happens here

12:27

doesn't have these things involved and

12:29

this other thing happens later in the process, then

12:32

this has actually become a problem.

12:34

We need to figure out when these

12:36

combinations overlap enough where it would be a

12:38

challenge and that's got to be our training

12:41

information for us to be able to, let's

12:43

say to your point, we can flag the

12:46

green, yellow, red, right? You've got 5% red, we've

12:48

got 10% yellow, we've got the rest of it

12:50

is pretty clean even if there's a little bit

12:52

of variance. Is this the right way to understand

12:54

it? I want the leaders at home to kind

12:56

of think about what it might look like to

12:58

train such a system. That's great. That's

13:00

a great question. So that's actually two parts

13:02

to the question you raised. One is what

13:04

I call readiness of the data and the

13:07

other is the completeness of the data to

13:09

train and to use it in AI. The

13:12

readiness of the data in of itself is

13:14

a massive, massive undertaking. A lot of the

13:16

times what we see is that our clients

13:18

have invested in systems and data but,

13:21

for

13:23

good reasons, have not necessarily spent the time

13:26

to kind of clean it, to have master data

13:28

that's clean and can actually be used

13:30

in a usable format, to use

13:33

your words, not be kluged, as you

13:35

use it in your decision

13:37

making. So now there are data

13:39

sets that are regulated, that need

13:42

to be clean, so they have invested in

13:44

those. But for those that are not

13:46

required, why bother investing in cleaning those?

13:48

So a lot of the times many

13:51

of these programs either win

13:53

or lose on the back of

13:55

getting that front end cleaned up

13:57

really well. And there are some

14:00

very cool examples of where we went

14:02

in with some unique ways of cleaning

14:04

that with very little human intervention using

14:07

graph methodologies and so on and so forth

14:09

that kind of drives that cleaning very quickly.
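The graph idea can be sketched roughly as follows; this is a hedged illustration using fuzzy string matching as a stand-in for whatever matching logic a real master-data program would use, with invented supplier records:

```python
# Sketch of graph-based master-data cleaning: treat records as nodes, connect
# pairs that look like the same entity, and collapse each connected component
# into one master record. difflib's SequenceMatcher is a simple stand-in for
# production entity matching.

from difflib import SequenceMatcher

def similar(a: str, b: str, threshold: float = 0.85) -> bool:
    """True when two record strings likely name the same entity."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold

def dedupe(records):
    """Union-find over fuzzy-match edges; returns one representative per cluster."""
    parent = list(range(len(records)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    for i in range(len(records)):
        for j in range(i + 1, len(records)):
            if similar(records[i], records[j]):
                parent[find(i)] = find(j)  # merge the two clusters

    clusters = {}
    for i, rec in enumerate(records):
        clusters.setdefault(find(i), []).append(rec)
    # pick the shortest spelling as the master record for each cluster
    return [min(group, key=len) for group in clusters.values()]

masters = dedupe(["Acme Chemicals Inc.", "ACME Chemicals Inc", "Baxter Supply Co"])
# the two Acme spellings collapse into one master record
```

The appeal of the graph framing is that it needs little human intervention: matches propagate transitively through connected components rather than requiring every duplicate pair to be reviewed.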

14:11

So that's one part of the mountain, you

14:13

know, digging out of the mountain. The

14:16

second part of it is completeness of

14:18

data. And that's where you

14:20

kind of mentioned, you know, do you have

14:22

enough variation in your training data for the

14:24

AI models to detect when is it good,

14:27

when is it bad. And this is where

14:29

it's an art as much as it is

14:31

a science because when you train models,

14:33

you will want to make sure that

14:36

the data that you have is an

14:38

accurate representation of the reality that there

14:40

is. But also this is why we

14:42

look at AI as a human and

14:44

machine interface with an assistant in the

14:46

middle, because many

14:49

manufacturing processes change over time.

14:52

So every time a manufacturing process and

14:54

operations and supply chains change, you're

14:57

now in a new regime that the

14:59

old AI model will need to continue

15:01

to learn. So how do we build

15:03

training data sets and

15:05

train AI models to

15:07

recognize these regime changes

15:10

is another interesting topic that we've come up

15:12

with some unique ways of solving.
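One simple way a regime change on a single process variable can be flagged is sketched below. This is an illustration only, not the methods referenced in the conversation; the window sizes and the three-sigma threshold are assumptions:

```python
# Minimal regime-change sketch: compare the mean of a recent window against a
# baseline window and flag a shift when the gap exceeds a few baseline standard
# deviations. Window sizes and threshold are illustrative, not tuned values.

from statistics import mean, stdev

def regime_shift(series, baseline_n=10, recent_n=5, n_sigmas=3.0) -> bool:
    """True when the recent window's mean has drifted out of the baseline regime."""
    baseline = series[:baseline_n]
    recent = series[-recent_n:]
    mu, sigma = mean(baseline), stdev(baseline)
    return abs(mean(recent) - mu) > n_sigmas * max(sigma, 1e-9)

# A stable run, e.g. temperature readings hovering around 36 degrees...
stable = [36.0, 36.1, 35.9, 36.0, 36.2, 35.8, 36.1, 36.0, 35.9, 36.1,
          36.0, 36.1, 35.9, 36.0, 36.2]
# ...versus the same baseline followed by a step change in the process.
shifted = stable[:10] + [38.5, 38.6, 38.4, 38.7, 38.5]
```

A detector like this is what lets a model notice that the process has moved to a new regime and that its old training distribution no longer applies.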

15:15

And we've used new anomaly

15:17

detection algorithms, we've used trend detection algorithms

15:19

to kind of solve for that,

15:22

which is really cool. So yeah,

15:24

these two are probably the biggest

15:26

challenges in seeing the value from AI. Yeah,

15:29

and I want to get a little bit into

15:31

that. I mean, setting the data table and making

15:34

data infrastructure come to life, you

15:36

know, not an easy game. And then, you know,

15:38

being in this space for the last 10 years,

15:40

there was a long time in AI, which I'm

15:42

sure, you know, you recall well, where

15:45

we're really talking about kind of bandaid surface

15:47

level stuff, if it got beyond a POC,

15:49

it was kind of hanging out in one

15:52

specific corner, one specific workflow. We're now seeing

15:54

enough AI fluency from many

15:56

good experiments, many failed experiments and enough

15:58

excitement around the world around really being

16:00

able to see how powerful this next wave

16:03

of AI is for people to be open

16:05

to the data infrastructure question. What

16:07

do you think, because this is gonna apply to the

16:09

next use case I'll talk about on the supplier side,

16:11

but what do you think leaders are gonna have to

16:13

understand about making those kinds of

16:16

undergirding investments? Because there's gonna be some systems

16:18

that cannot stay the way they are if

16:20

we wanna do forecasting, if we wanna make

16:22

smart decisions, if we wanna be compliant. How

16:24

do some people think about that investment? Because

16:26

you're articulating some really important ideas that probably

16:28

haven't reached every leader who needs to hear them.

16:31

So there's one part of that

16:33

which basically says the more AI

16:36

is at the fingertips of the

16:38

frontline colleagues that

16:40

are manufacturing, that are supplying, that

16:43

are performing logistics activities, the

16:45

more it helps those

16:48

individuals on the front lines, the

16:50

more value that AI creates. A

16:53

lot of the times AI helps in correcting

16:55

for what might be a wrong

16:58

step that a particular operator might

17:00

be taking, or an adjustment or

17:02

immediate turn to what's called a

17:04

golden path of operations, trying to

17:07

get back to the golden path.

17:09

So pushing the AI decisions, AI

17:11

adoption down to the front lines

17:14

is one way to scale this and make this

17:17

beyond the one POC or the two

17:19

POCs and trying to deploy AI at

17:21

the front lines. The second part of

17:23

that is we

17:26

always ask our clients to think

17:28

about value as a starting

17:30

point. So the applications or the

17:32

use cases you reference: if there is no

17:35

value to be had from those use cases,

17:37

don't even go down that path even if

17:39

it's cool, right? And value is measured in

17:42

only a handful of ways, whether

17:44

you're driving up more capacity, more

17:46

revenue, more product being

17:48

made, you're driving cost down, you're

17:51

turning your assets faster, your

17:53

asset efficiencies higher, or

17:55

you're distinctly driving some

17:58

other non-tangible, sometimes tangible,

18:00

benefit around sustainability or quality or efficacy

18:02

or whatever it is. So it needs to tie

18:04

back to some things that the

18:06

operations leadership team cares

18:08

about as part of their objectives and goals.

18:10

So starting with that value and then driving

18:12

down to the use cases is another way

18:15

to ensure that you're delivering value back to

18:17

the patient. Absolutely, I mean, beginning with the

18:19

end in mind there.

18:21

Fear and FOMO are very bad guides for where

18:23

to apply emerging technology, but value is a really

18:25

good place to start. So hard to disagree with

18:27

you there. When you mentioned supply chain, just to

18:30

touch on this before a little bit of parting

18:32

advice for the leaders here, another great use case.

18:34

I can imagine, you know, you're talking about supplier

18:36

performance. Talk a little bit about,

18:38

you know, because there's a lot of players, I mean,

18:40

there's people that are supplying raw

18:42

chemicals here. There are people that are handling part

18:45

of the manufacturing process and giving us goods that

18:47

maybe aren't finished yet or whatnot. And there's complexity

18:49

here that some of the listeners and including myself

18:51

might not be familiar with. What are we measuring

18:53

for those folks where AI can

18:55

help us really keep expectations in

18:57

line? So supplier scorecarding,

19:00

using AI to perform supplier score

19:02

carding, supplier performance management is another emerging

19:04

use case. To

19:06

give you a little bit of flavor for this, there

19:09

are 170 points that typical pharma

19:11

companies use, data points, if you

19:13

will, for evaluating how a supplier

19:15

is doing. And a lot of the times, 89% of

19:17

that is manual: sort

19:21

of hard-charged sifting through a lot

19:23

of, you know, past performance,

19:25

you know, documents, you know, all of

19:27

that stuff, to come up with those data points.

19:30

And being able to automate and have,

19:32

you know, large language models

19:35

kind of sit on top of all

19:37

of these data sets, whether it's certificates

19:39

of analysis, supplier purchase orders, invoices, whatever

19:41

it is, and supplier communications, and being

19:44

able to kind of come up with,

19:46

how is the supplier performing? Have they

19:48

met these six or seven distinguished criteria,

19:52

qualitative and quantitative in

19:54

terms of performance is going to be the

19:56

art of the future. This supplier performance also

19:58

extends to, you know, we did an

20:01

example with a client where we

20:03

applied AI on all of their

20:05

certificates of analysis, C of A's as they're called, for

20:08

all of their chemicals that they get. And

20:10

when we ran that, we could

20:12

easily find issues where

20:15

some C of A's were approved, but actually should

20:17

not have been approved, because

20:19

the material was out of spec as it

20:21

was received from the vendor. So

20:23

some of these things can be caught using

20:25

AI, and that helps you go and have

20:27

meaningful conversations with suppliers to make sure that

20:29

your supply is not only the supply that

20:31

you paid for, the product that you paid

20:34

for, but also it doesn't have a downstream

20:36

impact on products that you are making using

20:38

those raw materials. The second thing

20:40

I'd say is there's an amazing amount

20:42

of information in the public

20:44

domain on like what is happening in

20:46

the marketplace for a variety of these supplied

20:48

products, right? And so being able to mine

20:51

that information from the public domain and then

20:53

generate insights is another sort of supplier

20:56

performance and insights

20:58

use case that clients are kind of

21:00

thinking about as well. Got it.
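The certificate-of-analysis check described above can be sketched as a spec comparison. The field names, spec limits, and lots below are invented for illustration and are not from the client engagement:

```python
# Hypothetical sketch of the C of A check: compare each certificate's results
# against its spec range and flag lots that were approved even though a value
# was out of spec as received. Attributes and limits are invented.

SPECS = {"assay_pct": (98.0, 102.0), "moisture_pct": (0.0, 0.5)}

def out_of_spec(coa: dict) -> list:
    """Return the attributes on this certificate that fall outside their spec range."""
    failures = []
    for attr, (low, high) in SPECS.items():
        value = coa.get(attr)
        if value is not None and not (low <= value <= high):
            failures.append(attr)
    return failures

def wrongly_approved(coas):
    """Lots marked approved despite at least one out-of-spec value."""
    return [c["lot"] for c in coas if c["approved"] and out_of_spec(c)]

flagged = wrongly_approved([
    {"lot": "A1", "approved": True,  "assay_pct": 99.5, "moisture_pct": 0.3},
    {"lot": "B2", "approved": True,  "assay_pct": 97.1, "moisture_pct": 0.2},
    {"lot": "C3", "approved": False, "assay_pct": 96.0, "moisture_pct": 0.1},
])
# flagged == ["B2"]: approved despite an out-of-spec assay value
```

Run across every incoming certificate, a check like this surfaces exactly the supplier conversations the passage describes: material that was accepted but should not have been.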

21:02

Is there information about pricing, timing,

21:05

whatever for what's happening with... Or even something

21:07

has happened, like for example, the earthquake that

21:09

happened in Japan, what is the impact of

21:12

all the supply issues that might come down

21:14

the pipe to me maybe six months down

21:16

the line because of this earthquake

21:18

that shut down a supplier that was a third

21:20

tier supplier for my, I think, first-

21:22

tier supplier. Being

21:25

able to sort of field

21:27

and forecast and search at scale the external data

21:29

that could help us know maybe we want to

21:31

steer clear from these two suppliers or maybe we

21:33

still think they're safe, something like that. Exactly. We've

21:35

had on folks like Signal and Meltwater and some

21:37

of the players that do like this broader media

21:39

monitoring, but you're obviously talking about in a very

21:42

narrow context of how is this going to affect

21:44

my supply chain, which I think is useful for

21:46

the audience to know. I know we're going to

21:48

be coming up on time in a second here,

21:50

Laks, but I want to get your

21:52

vantage point. You're seeing a lot of people now almost

21:54

certainly, at least in

21:56

a significant way, start their AI journey. Certainly

21:58

their Gen AI journey in the

22:01

supply chain manufacturing portion of life sciences.

22:04

What sort of parting advice do you have for leaders that are

22:06

looking at the different use cases? They maybe

22:08

have a good sense of where the value is

22:10

in their business, but they're not exactly sure where

22:12

AI would fit in. Where should they begin their

22:15

thought process of making this a really high ROI

22:17

endeavor into a new technology? Yeah,

22:19

I think some of the most

22:22

common starting points that I have

22:24

seen in the last couple of

22:26

years in both AI broadly and gen

22:29

AI specifically, have been in manual,

22:32

data-intensive parts of the

22:34

value chain. Whether it is,

22:37

as I mentioned before, mining

22:39

for information on complaints, mining

22:41

for information on nonconformances to

22:43

reduce the human effort involved

22:46

in removing them, or

22:48

getting more, squeezing more juice from

22:50

their manufacturing assets by using the data

22:52

that is coming from the manufacturing assets

22:54

to make the decisions. Those

22:57

are two common standard starting points for

22:59

many of our pharmaceutical clients. On

23:01

the supply chain side, the other common

23:04

starting point is inventory, deploying AI on

23:06

managing inventory in their network, because there

23:08

are so many moves that happen in

23:10

the manufacturing network, as well as in

23:12

the outbound distribution network. Optimizing

23:15

for that inventory goes a long way because these

23:17

are really, really, really expensive products. Those

23:20

three are the most common starting points

23:22

that net a lot of return for

23:24

our clients, because of either direct bottom-

23:27

line or direct top-line impacts of

23:29

the executed initiatives. But then beyond those,

23:32

we've compiled 45-plus, nearly 50

23:34

use cases where they can go and

23:38

deploy these AI use cases. Each

23:41

of them has a return tied to

23:43

it; the scale will differ between

23:45

clients. But those three would be my starting point for a

23:47

lot of my clients. That's cool. So

23:50

in terms of early conversations, early investigations of the quality

23:52

of our data, the potential impact of these technologies, we've

23:54

now got some low hanging fruit, hopefully for the listeners

23:56

tuned in around where they might be able to find

23:58

them. So I appreciate you being as practical as you've

24:00

been and I had a lot of fun learning with

24:02

you here, Laks. So thanks so much for being

24:05

with me. Before

24:14

we draw a close to today's

24:17

episode, some highlighted points I think

24:19

will go a long way for

24:21

our listeners, especially where they came

24:23

from our guest. First,

24:25

AI is helping shed light

24:27

on untapped data sources in

24:29

pharmaceutical manufacturing workflows and beyond.

24:32

Operations are now prioritizing the use

24:34

of data to make decisions rather

24:37

than relying solely on experience and

24:39

intuition as regulators and industry

24:41

leaders encourage more data driven decision

24:43

making. Laks

24:45

emphasizes that data needs

24:47

to be ready for training a

24:50

system to flag potential issues therein,

24:52

including identifying when combinations of

24:54

variables overlap and become problematic. Laks

24:57

also highlights the importance of clean

25:02

and complete data

25:04

for AI models to detect

25:06

variations in manufacturing processes, operations,

25:08

and supply chains. Leaders

25:08

must understand the importance of data

25:11

infrastructure to support AI adoption and

25:13

make informed

25:15

decisions. Deploying AI at the frontlines

25:24

and starting with value are key

25:27

to scaling AI and delivering tangible

25:29

benefits to operations leadership teams. AI

25:32

can help automate supplier performance management

25:35

by analyzing large data sets and

25:37

identifying issues, enabling meaningful

25:40

conversations with suppliers. And

25:43

on behalf of Daniel Fagella our CEO

25:45

and head of research, and the entire

25:47

team here at Emerj Technology Research, thanks

25:50

so much for joining us today, and we'll catch you

25:52

next time on the AI in Business Podcast. Thank

26:00

you.
