Implementing KYC and User Verification with Alex Grinman

Released Thursday, 9th May 2024

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.
0:00

Almost every application or system involves

0:02

some sort of user onboarding. Increasingly,

0:05

companies must implement Know Your Customer

0:07

and Know Your Business Compliance, or

0:10

KYC and KYB, as part of

0:12

that process. In addition,

0:14

they often handle Personally Identifiable Information,

0:16

or PII. Footprint

0:19

is a developer platform that was

0:22

co-founded by Alex Grinman for handling

0:24

identity, security, fraud, and authentication. Alex

0:27

joins the show to talk

0:30

about identity verification, security, compliance,

0:32

Footprint's front-end and back-end design,

0:34

and much more. Gregor

0:36

Vand is a security-focused technologist

0:39

and is the founder and

0:41

CTO of MailPass. Previously, Gregor

0:43

was a CTO across cybersecurity,

0:45

cyber insurance, and general software

0:48

engineering companies. He has

0:50

been based in Asia Pacific for almost

0:52

a decade and can be found via

0:54

his profile at Vand.HK. Hi,

1:09

Alex. Welcome to Software Engineering Daily.

1:12

Hey, Gregor. Great to be here. Thanks for having me. Yeah,

1:15

Alex. It's great to have you here.

1:17

You've been in the authentication and identity

1:20

space for kind of a little while

1:22

now, and you're the co-founder and CTO

1:24

now of Footprint. What

1:27

was your road to becoming the

1:29

co-founder and CTO of Footprint? You

1:31

know, I started programming when I was a

1:33

kid. Started building software. Actually really started building

1:35

iPhone apps in high school. It's kind of

1:37

where I really started. And I've always been

1:39

interested in identity and security. I mean, the

1:42

idea of, you know, how do

1:44

you prove something to somebody without, you know,

1:46

giving away all of your information was intriguing

1:48

to me. I think my real kind of

1:50

intro into cryptography was, I took

1:52

this very theoretical cryptography course at MIT,

1:55

taught by the inventors of zero knowledge proofs.

1:57

Just fell in love with, you know, how...

2:00

A little bit of simple math and a

2:02

little bit of programming can create these pretty

2:04

amazing guarantees. I started a company right out

2:06

of school in the authentication space. Kind of

2:08

merged my ideas of building iPhone apps with

2:10

security and identity. So I thought it was

2:12

pretty cool that the iPhone launched with a

2:15

secure enclave chip on device, initially

2:17

built for storing things like credit cards

2:20

or other passwords. But really, you could use it

2:22

to store any kind of key material. And so

2:24

I thought, hey, it'd be pretty cool if we

2:26

stored a public-private key pair on

2:28

one of these chips. And then the private key never

2:30

left your phone, kind of like a YubiKey, but in

2:32

a much more useful form factor that everyone already has.

2:35

Maybe we can use that to encrypt messages between

2:37

people or use it to authenticate using an SSH

2:39

key. And so I ended up starting a company

2:41

in that space, which led

2:44

me to things like FIDO2, U2F, Passkeys, and

2:47

built that first company. We sold that

2:49

company to Akamai. And the

2:51

startup bug kind of bit me pretty

2:53

early on. And here I am. Yeah,

2:55

awesome. So yeah, I mean, footprint, as we'll

2:57

hear, kind of very much builds

3:00

on kind of all of that. So

3:02

maybe at like a high level, like

3:04

what does footprint do? So

3:06

at footprint, we're helping companies automate onboarding. And

3:09

so onboarding is this kind of like opaque

3:11

thing. What is onboarding? Every product you use

3:13

has an onboarding form. The simplest

3:15

product maybe just asks you for your name and

3:17

email address. Almost every onboarding form has you set

3:19

up a credential, a way to authenticate yourself. So

3:22

when you come back later, your account is for

3:24

you and no one else can access it. But

3:27

there's this other range of onboarding that

3:29

actually many products that we use have.

3:31

Think of your investing app, your credit

3:33

card that you sign up for, a

3:36

bank account, maybe even when you're

3:38

renting a car or booking a

3:40

house to rent, right? These

3:43

products, these marketplaces, they need to know who you

3:45

are, maybe a little bit more than most other

3:47

applications. They collect a lot of other personal information

3:49

like your full name, your address, your date of

3:52

birth. Sometimes they ask you for your social security

3:54

number if you're in the States, or

3:56

a scan of your driver's license or your passport.

3:59

And what they do with this information is they

4:01

verify all of this, right? They try to verify

4:03

as much of it as possible to make sure

4:05

that the person signing up really is who they

4:07

say they are, and also that they're not a

4:09

criminal or a bad person or have a history

4:11

of defrauding people or things like that. And

4:13

that's a really complex process that many

4:15

companies have to solve. And they have

4:17

to solve it on multiple different layers,

4:19

everything from storing that data securely to

4:21

verifying it correctly. And so what

4:24

footprint does is we give companies a couple

4:26

lines of code, and we basically take care

4:28

of all of that, everything from collecting information,

4:30

verifying it, storing it securely, providing an audit

4:32

trail. If you're a regulated industry, you know,

4:35

sending that audit trail to third parties so

4:37

that you can be in compliance. We handle

4:39

all of that and basically make it so

4:41

that companies don't have to rebuild this onboarding

4:44

flow from scratch. And then for consumers, what

4:46

we're doing is once you onboard once to

4:48

any company or product that uses footprint, every

4:51

future time will just be one click. So lower

4:53

friction makes it easier to use products in a

4:55

trusted and verified way, which more and

4:57

more products on the internet today are requiring just

5:00

because of the nature of the really sensitive things that

5:02

you can do online. Yeah, so I

5:04

guess just to kind of read that back, footprint

5:07

is taking both sides of the coin here, which

5:09

is that both the user has to go through

5:11

a process that is a footprint process

5:13

to sign on. At the same time, the

5:15

platform has to be also in

5:18

some way, communicating with footprint or using footprint,

5:20

like behind the scenes, like how is it

5:22

API driven? Or is it sort of something

5:24

that you integrate into the platform? Or how

5:26

does it work? Yeah, that's a

5:28

great question. So a lot of KYC identity

5:31

companies, onboarding companies, they're typically just like a

5:33

back end API, they, you know, you send

5:35

them some data, they tell you if it's,

5:37

you know, verified or some additional scores or

5:39

something. And we at Footprint believe that to

5:41

really verify who the user is,

5:43

you have to be all the way at

5:45

the edge, right? You have to actually interact

5:47

with that end user directly. So

5:49

companies, really, the integration touches two points, they

5:51

embed some code in their front end, and

5:54

then they embed some code in their back end as

5:56

well. And our front end takes care of securely collecting

5:58

all that information, verifying it, and doing

6:00

things like dynamic step ups, right? If we

6:02

determine there's some high fraud, potential

6:05

risk signals, we can step up the user and

6:07

ask for some more information, all

6:09

without any code or any complexity that customers

6:11

have to implement. And on the backend, once

6:13

the front end finishes, they go ahead and

6:15

they verify and exchange sort of that transaction

6:17

for what we call a footprint ID or

6:19

an FPID, a single sort of token that

6:21

they can then store on their side, that

6:23

really encapsulates all the identity and the risk information

6:26

and the overall decisioning that happened, whether or not

6:28

that user is who they say they are. So

6:31

the integration kind of touches both sides, but

6:33

the end user really interacts directly with footprints

6:35

through kind of this embedded onboarding experience. And then

6:37

that user actually gets credentialed as well. So they

6:39

set up a passkey, and sure, we'll get

6:41

into like the details all there,

6:43

but really, so we interact with both the end

6:45

consumer and the business.
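As a rough sketch of the two-sided integration described here, the flow might look something like the TypeScript below. The SDK surface, endpoint path, and field names (footprint.open, /onboarding/validate, fp_id) are hypothetical placeholders rather than Footprint's actual API.

```typescript
// Hypothetical shapes; the real SDK and API may differ.
type Decision = "pass" | "fail" | "manual_review";

// --- Front end -------------------------------------------------------------
// The embedded component collects and vaults the PII itself, then hands back
// a one-time validation token. Your page only ever sees that token.
declare const footprint: {
  open(opts: { onComplete: (validationToken: string) => void }): void;
};

footprint.open({
  onComplete: async (validationToken) => {
    await fetch("/api/onboarding/complete", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ validationToken }),
    });
  },
});

// --- Back end ---------------------------------------------------------------
// Exchange the token for a durable footprint ID (fp_id) plus the decision,
// and persist only the fp_id; the raw PII stays in the vault.
async function completeOnboarding(
  validationToken: string
): Promise<{ fpId: string; status: Decision }> {
  const res = await fetch("https://api.onefootprint.example/onboarding/validate", {
    method: "POST",
    headers: { Authorization: `Bearer ${process.env.FOOTPRINT_SECRET_KEY}` },
    body: JSON.stringify({ validation_token: validationToken }),
  });
  const body = (await res.json()) as { fp_id: string; status: Decision };
  return { fpId: body.fp_id, status: body.status };
}
```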

6:48

Yeah, so I don't know if you like this analogy or

6:50

not, but is it kind of the stripe for

6:52

identity management? Is that a good analogy or is

6:54

it different? Because I mean, all the things you've

6:56

kind of just mentioned there where you're interacting with

6:58

the footprint platform and a

7:00

token is passed back and that's kind

7:02

of what's representing the user. Sounds like

7:04

Stripe. Yeah, exactly. Kind of. Yeah.

7:08

Exactly, yeah. I mean, it's interesting, like a lot

7:10

of people think of Stripe as this payments platform,

7:12

right? That does orchestrate payments and

7:14

billing. But the very kind of initial

7:16

product that Stripe launched was that they said,

7:18

okay, storing a credit card number and then

7:21

charging it is very complex. So we will

7:23

essentially tokenize the credit card and give you

7:25

back like a Stripe customer ID or

7:28

payment object, whatever it is. And then we take

7:30

care of charging that payment object.

7:32

And so that's effectively what footprint does as

7:35

well, except instead of just being a

7:37

credit card, which is actually a pretty complex object in

7:39

its own right, but the identity is

7:41

much more complicated, right? You have everything from the

7:43

basic sort of PII attributes like name, date of

7:45

birth, SSN. But then

7:48

you also have things like identity documents

7:50

and those all have information that can

7:52

be extracted and verified. You have information

7:54

about someone's driving history that is derived

7:56

from their identity, right? There's all sorts

7:58

of identity information that

8:01

we basically boil all of that down into

8:03

one simple string that customers store,

8:05

and we solve all that complexity for them.

8:08

Yeah, I get a sense there are

8:10

almost like three facets

8:12

to this, and also, you know, when I

8:14

look at the Footprint website, it's framed

8:16

the same way. So we've got security

8:19

as a whole thing, vaulting, and legal compliance.

8:21

Could you maybe speak to the security

8:23

side? That is an epic term generally, so

8:25

the security in the context of

8:27

Footprint, which might be interesting to

8:29

hear about. Then vaulting, and

8:31

that can be pretty deep, there's

8:33

just a ton of concepts in

8:35

there that would be great to hear about. And

8:37

then the compliance piece, which I really care

8:40

about. It feels like this whole problem

8:42

I wouldn't ever want to go and touch, but

8:44

you guys are doing that, which is

8:46

very cool. So yeah, let's maybe

8:48

hear more about the

8:50

technical side of these three areas.

8:53

Yeah. So in security, I would

8:55

maybe bundle risk and identity together,

8:57

and that layer is

8:59

very crucial for

9:01

our customers, especially the companies that

9:03

are really heavily regulated, right? They need

9:06

to make sure that they are

9:08

actually verifying that person's identity, and

9:10

this involves a bunch of things,

9:12

like complex integrations with the LexisNexises

9:14

of the world, Experian, different kinds of

9:16

data sources, and that actually is

9:18

a fairly commoditized industry.

9:20

Today, you know, there's a lot of

9:22

companies that do just this. At the very

9:24

end, they actually boil down to maybe

9:26

like five or six different sources, like

9:28

LexisNexis and

9:30

Experian, which we connect to directly. And

9:32

even though I say it's

9:34

commoditized, it's actually fairly complex, and most

9:36

companies should not be integrating directly

9:39

with Experian or LexisNexis, right? They have some

9:41

shared databases, data sources, some disjoint,

9:43

but really what it boils down to

9:45

is you're taking these fairly legacy

9:47

APIs, and sometimes it's like

9:49

a four-hundred-page PDF of all

9:51

the possible ID risk codes you can

9:54

get, and boiling that down to, hey,

9:56

does the information entered match each

9:58

other, right? So

10:00

think of like, does the name match the date of

10:03

birth match the address, right? Is

10:05

this a recent address or maybe it's

10:07

an older address. So kind of boiling

10:09

all of that down into

10:11

deciding whether or not the data is actually accurate, right?

10:14

So that's sort of maybe step number one is

10:17

does the data entered actually

10:19

is it correct and correct,

10:21

by the way, is not a binary decision, right? As you

10:23

can imagine, users sometimes mistype

10:25

things. And by the way,

10:27

this is like a core part of why

10:29

Footprints front end helps here

10:32

because we catch a lot of

10:34

the errors on the front end that a lot

10:36

of other KYC companies that are just back in

10:38

APIs cannot do. For instance, if the

10:40

user actually accidentally fat fingered, you know, their date of birth

10:42

and they're off by 100 years, like our front end will

10:44

catch that. It won't let you enter in a bad date

10:46

of birth. And

10:48

similarly, if you're entering in a PO box

10:50

or the address of a prison or something

10:53

like that, our front end will double check

10:55

that that is correct for you, right? And

10:57

tell you, hey, PO boxes are not allowed,

10:59

right? And so we improve the accuracy of

11:01

the overall sort of identity risk matching.
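To make the kind of front-end checks mentioned above concrete, here is a small illustrative sketch of the sort of validation an embedded form could run before anything is submitted; the thresholds and rules here are assumptions for illustration, not Footprint's actual checks.

```typescript
// Illustrative client-side sanity checks (not Footprint's actual rules).
function validateDateOfBirth(dob: Date, now = new Date()): string | null {
  const age = (now.getTime() - dob.getTime()) / (365.25 * 24 * 3600 * 1000);
  if (age < 0) return "Date of birth is in the future";
  if (age > 120) return "Date of birth looks off by roughly 100 years";
  if (age < 18) return "Applicant appears to be under 18"; // example threshold
  return null; // looks plausible
}

function validateAddressLine(line: string): string | null {
  // Very rough PO Box detection; a real flow would use an address
  // verification service rather than a regex.
  if (/\bP\.?\s*O\.?\s*BOX\b/i.test(line)) return "PO Boxes are not allowed";
  return null;
}

console.log(validateDateOfBirth(new Date("1891-04-02"))); // caught: ~100 years off
console.log(validateAddressLine("PO Box 123"));           // caught: PO Box
```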

11:04

And so that's sort of the baseline. The

11:06

next layer on top of that is we

11:09

have a fairly sophisticated decisioning and rules engine

11:11

that's built. And the reason is, you know,

11:13

different businesses, different industries have different

11:15

rules. Like if you're a bank, the, you

11:17

know, compliance policy that you need to follow

11:19

is very different than if you're, you know,

11:22

let's say, renting apartments to people. And

11:25

so we give companies, you know, based on industry,

11:27

you're in a default good set of rules, but

11:29

every business can customize them. And this

11:31

is really powerful because, once again,

11:34

our front end is integrated, you can create a

11:36

rule that actually does a dynamic step up,

11:38

which actually changes the front end, right? And

11:41

so the nice thing is that you

11:43

can go from just basic sort of

11:45

what's called EKYC or just non-documentary information

11:47

to then stepping up and doing a

11:49

document verification and collecting a selfie, matching

11:51

the selfie to the document. And the nice

11:53

thing about kind of the front end integration is

11:55

that this gets into the vaulting as well, is

11:58

that all of this data is collected directly,

12:00

kind of in the same way that Stripe

12:02

provides an iframe to collect a credit card

12:04

number, Footprint's kind of iframe securely collects

12:07

all this information and vaults it directly in

12:09

the Footprint vaulting infrastructure. We'll get into kind

12:11

of how that all works. But

12:13

that means that this data doesn't pass

12:15

through your servers or your

12:18

code, which essentially makes you out of scope

12:20

for a lot of security and compliance

12:22

regimes around touching this very sensitive data.

12:25

So footprint kind of handles all of

12:27

that as well. And that is sort

12:29

of I would say the first line

12:31

of the product. Vaulting really is the

12:33

foundation for everything. I mean all the

12:35

data that we touch is vaulted. Our

12:37

technology is built on top of what's

12:39

called AWS Nitro Enclaves. It's a fairly

12:41

recent technology from AWS. There's

12:44

a couple of competing technologies on Azure and

12:46

Google. If you're kind of in the

12:48

security space, you're probably thinking like Intel

12:51

SGX, right? That's probably

12:53

the go-to example, the most famous. Nitro Enclaves

12:55

have a couple of really cool advantages

12:58

over Intel SGX. And a

13:00

lot of it is really around like developer experience

13:02

and the fact that the isolation guarantees are really

13:04

nice. And so just kind of a quick primer

13:06

on what Nitro Enclaves

13:08

are and what guarantees they provide:

13:11

essentially, they let you take some code and run

13:13

it inside of an environment that is network

13:15

gapped, memory gapped, CPU isolated,

13:19

and attested. And so

13:21

this is running on special hardware where

13:23

you can get a cryptographic attestation that

13:25

it's running on untampered AWS Nitro Enclave

13:27

hardware that you've signed. So

13:29

the idea is that no one else can kind

13:32

of deploy additional malicious code. And

13:34

even if you have like a malicious

13:36

dependency that you accidentally introduced, well, because

13:38

of the network gap, that can't escape. And

13:41

so what we've done at Footprint is we've

13:43

designed essentially our entire vaulting infrastructure around Nitro

13:46

Enclaves. And at a very simple

13:48

level, we have what's called a user vault. So

13:50

every single user in the Footprint

13:52

system has their own what we call a vault.

13:55

And that really is just a public private key

13:57

pair. As soon as data enters

13:59

Footprint, it's... encrypted with the public key and

14:01

the corresponding private key that can be used

14:04

to decrypt that data is only accessible within

14:06

these nitro enclave environments. And

14:08

so whenever you need to process data, compute

14:10

on it, you actually have to send not

14:12

only the encrypted data, the end to end

14:14

encrypted data to the enclave, but also send

14:17

this encrypted private key that can only be

14:19

decrypted by the enclave itself.
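A heavily simplified model of this vault design, sketched with Node's crypto module: one key pair per user, fields sealed to the public key on ingestion, and a single function standing in for the enclave that decrypts, derives only what is needed, and returns it. Real Nitro Enclave deployments involve attestation and KMS-protected keys; everything named here is illustrative.

```typescript
import { generateKeyPairSync, publicEncrypt, privateDecrypt } from "node:crypto";

// One key pair per user "vault". In the real system the private key is itself
// encrypted and only decryptable inside the enclave; it is held in memory here
// purely to illustrate the data flow.
const { publicKey, privateKey } = generateKeyPairSync("rsa", { modulusLength: 2048 });

// Ingestion path: seal the field to the user's public key as soon as it arrives.
function vaultField(value: string): Buffer {
  return publicEncrypt(publicKey, Buffer.from(value, "utf8"));
}

// "Enclave" path: decrypt, derive only what is needed, return the derived value.
function enclaveComputeAge(encryptedDob: Buffer): number {
  const dob = new Date(privateDecrypt(privateKey, encryptedDob).toString("utf8"));
  const ms = Date.now() - dob.getTime();
  return Math.floor(ms / (365.25 * 24 * 3600 * 1000)); // only the age leaves
}

const sealedDob = vaultField("1990-06-15");
console.log(enclaveComputeAge(sealedDob)); // the plaintext date of birth never leaves
```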

14:21

And so this lets us in

14:23

a very small footprint that is

14:25

isolated by the kind of nitro

14:27

enclave isolation guarantees, with our signed

14:29

code, do computations on that

14:31

data. So, a very simple example:

14:33

If you need to compute somebody's age, right,

14:36

you don't need to send their date of

14:38

birth and then do that computation inside of

14:40

a bunch of application code that might accidentally

14:42

exfiltrate that data. You can actually send the

14:44

encrypted date of birth, decrypt it,

14:47

compute the age from that, send

14:49

the age back out. And so we essentially can

14:51

kind of guarantee that data, the

14:53

minimum amount of information that is needed

14:55

is what leaves the enclave. Furthermore,

14:58

the enclave can decrypt and then re-encrypt.

15:00

So if you need to send data to

15:02

a third party, you can decrypt it and

15:04

then re-encrypt it to a new public key

15:06

where only the third party has access to

15:08

the corresponding private key. And so this way

15:10

you can ensure that as data moves from

15:12

the enclave through untrusted, unisolated

15:15

sources, it is still end-to-end

15:17

encrypted to its destination. And

15:20

so this really powerful building block lets us

15:22

provide things like vault proxy functions. So

15:24

lets our customers send data to third

15:26

parties securely without ever having to see

15:28

that data or touch that data and

15:31

ensure that it's end-to-end encrypted to

15:33

the actual destination.
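Continuing the same toy model, the decrypt-and-re-encrypt step described here might look roughly like this; the key handling and function names are stand-ins, not the actual vault proxy implementation.

```typescript
import { generateKeyPairSync, publicEncrypt, privateDecrypt, KeyObject } from "node:crypto";

// Toy model of the enclave's re-encrypt primitive: plaintext exists only inside
// this function, and what leaves is sealed to the destination's public key.
function enclaveReEncrypt(
  ciphertext: Buffer,
  userPrivateKey: KeyObject,      // in reality, only decryptable inside the enclave
  destinationPublicKey: KeyObject // e.g. a partner bank's or auditor's key
): Buffer {
  const plaintext = privateDecrypt(userPrivateKey, ciphertext);
  return publicEncrypt(destinationPublicKey, plaintext);
}

// Example wiring: the customer's servers shuttle opaque ciphertexts around and
// never handle the plaintext value themselves.
const user = generateKeyPairSync("rsa", { modulusLength: 2048 });
const partnerBank = generateKeyPairSync("rsa", { modulusLength: 2048 });

const vaultedSsn = publicEncrypt(user.publicKey, Buffer.from("123-45-6789")); // sample value
const forBank = enclaveReEncrypt(vaultedSsn, user.privateKey, partnerBank.publicKey);
// `forBank` can now move through untrusted services; only the bank, holding its
// private key, can decrypt it.
```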

15:35

And that's really important because companies that

15:37

do onboarding, that touch this really

15:40

sensitive data, they often,

15:42

what they really end up doing is

15:44

sending that data somewhere else anyways. They have banking

15:47

partners, they have compliance auditors, third party auditors that

15:49

look at all this information. And so making it

15:51

really easy to move data securely, in a way that doesn't

15:54

put you in scope for a lot of security

15:56

compliance regimes is pretty important. And

15:58

so I said the word compliance many times. Compliance kind

16:00

of is boiled through everything in

16:03

footprint and also really in what

16:05

our customers need. This is everything

16:07

from data security compliance, making sure

16:09

that if you're holding onto credit

16:11

card numbers or SSNs, you're compliant

16:13

with the associated privacy framework or

16:15

PCI framework that you need to in

16:17

order to run your business. But it also

16:19

touches the identity and risk kind of

16:22

compliance part of it. So our

16:24

customers typically have what are called

16:26

CIPs; it's an

16:28

industry phrase, which is customer identification policy.

16:31

It is essentially a set of rules

16:33

that says, these are the things that

16:36

we've checked before we've opened a new

16:38

account for a user. And banks, for

16:40

instance, have very intense CIP

16:43

policies, brokerages have pretty intense

16:45

CIP policies. Certain

16:47

businesses, like apartment rentals,

16:49

might have to create their own that

16:51

is not based in financial regulations. And

16:54

so what footprint lets you do is we

16:57

have this concept called playbooks, which not only

16:59

define the information you collect, but also as

17:01

mentioned, the rules and decisioning

17:03

logic that you use to actually decide whether

17:05

you want to approve a user or reject

17:08

a user or maybe reject the user, but

17:10

put them in a manual review queue, because

17:12

that decision might be

17:14

overridden by one of your customer support folks.

17:16

So the idea is that these playbooks

17:18

let you essentially map the bank's or your

17:21

third party's, or whoever's deciding, maybe

17:23

it's just you, whatever your CIP policy is;

17:25

you map that to a playbook and footprint

17:27

not only takes care of collecting

17:30

all that information, but also putting that

17:32

user in the right stack of approved,

17:34

rejected or manual review, to make

17:36

sure that you're compliant.
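As a sketch of what mapping a CIP to a playbook could look like, here is a hypothetical, declarative example; the field names, signal names, and rule syntax are invented for illustration and are not Footprint's real playbook schema.

```typescript
type Outcome = "pass" | "step_up_document" | "manual_review" | "fail";

interface Rule {
  description: string;
  when: (signals: Set<string>) => boolean; // risk signals raised during onboarding
  then: Outcome;
}

// Hypothetical bank CIP playbook: what to collect, plus ordered decision rules.
const bankCipPlaybook: { collect: string[]; rules: Rule[] } = {
  collect: ["name", "dob", "address", "ssn9", "phone", "email"],
  rules: [
    { description: "Hard fail on watchlist hits",
      when: (s) => s.has("watchlist_hit"), then: "fail" },
    { description: "Step up to document + selfie on address mismatch",
      when: (s) => s.has("address_does_not_match"), then: "step_up_document" },
    { description: "Route thin-file applicants to manual review",
      when: (s) => s.has("thin_file"), then: "manual_review" },
  ],
};

// First matching rule wins; if nothing fires, the onboarding passes.
function decide(signals: Set<string>): Outcome {
  for (const rule of bankCipPlaybook.rules) {
    if (rule.when(signals)) return rule.then;
  }
  return "pass";
}

console.log(decide(new Set(["address_does_not_match"]))); // "step_up_document"
```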

17:38

And there's a new product, and I'll stop talking in

17:41

a second, but the new product that we're

17:43

launching pretty soon is a

17:45

compliance product, you know, similar to like the

17:47

Vantas and Dratas of the world, but really

17:49

focused on onboarding, which lets you show evidence

17:51

that you're following that CIP policy. So you

17:54

can basically give access to a

17:56

third party auditor, and without them having to

17:58

see any sensitive data, they could actually

18:00

audit your manual review process. They can

18:02

audit how things are being approved

18:04

and verify that hey, the bank's CIP

18:06

policy is actually being met by sampling

18:09

a bunch of different onboarding cases and

18:11

providing that on a real time basis.

18:13

Basically alleviating the company from doing all

18:15

this manual work, proving their compliance, which

18:17

they have to do basically on a

18:20

quarterly basis, sometimes more often even. Yeah,

18:22

very fascinating. So yeah, just to touch on

18:24

that last bit there to start with the

18:27

new product, was that kind of a

18:29

clear ask from existing customers? Or is

18:31

it just a sort of natural evolution

18:33

of where this technology can go or

18:35

a combo? Yeah, I

18:38

mean, it's a combination. I think what we've seen is

18:40

that there's two things that we've seen in the kind

18:42

of the market that let us build

18:44

this. The first is that data

18:46

security and security in general used to

18:48

not really be part of the conversation

18:50

with financial compliance. But it

18:52

really is now. Banks not only want to see

18:54

that you're identifying customers properly, but that you're also

18:57

storing that data properly. And that's because of all

18:59

the breaches and leaks that have been happening. As

19:01

you can imagine, the more data that's leaked, the

19:04

harder it is to actually prove that the person

19:06

that's onboarding is who they say they are. And

19:08

so these things are really married at the hip.

19:10

And that's why we believe in footprint. It makes

19:12

sense to not only prove that a

19:14

customer is storing that data securely, identifying that

19:16

person correctly, but also they're storing

19:18

that data properly as well. And that's why the vaulting

19:20

technology is so core to what we do. The

19:23

second piece that has been pretty interesting and mostly

19:25

in effect of the market is there's been a

19:27

lot more scrutiny. You can see all the

19:30

banks that have kind of gone under,

19:32

that have kicked off fintechs from their

19:34

platform, the banking as

19:36

a service providers that have struggled kind of losing

19:38

partner banks. We've seen a lot of

19:41

our customers have to switch partner banks, for instance,

19:43

in order to keep their business running. And so

19:45

it's been pretty obvious that if

19:47

you switch to a different banking provider, now you have

19:49

to go through a completely new audit

19:52

process. Footprint can really make that really

19:55

simple, and it makes it easy even to be onboarded

19:57

to multiple banking partners as well, without having to change

20:00

and do a bunch of manual work to prove that you're in

20:02

compliance. Yeah, because just to kind

20:04

of, again, maybe read back: on the

20:06

user side, the user has used

20:09

footprint once through some

20:12

platform. If another platform is

20:14

also using footprint, does that become a pretty

20:16

kind of a breeze for that user to

20:18

move through? Is that kind of how it works? Yeah,

20:21

exactly. And so, you

20:23

know, this works actually both from the end user

20:25

and consumer kind of point of view, but

20:28

also from the footprint customer, the

20:30

business, onboarding onto multiple partner

20:32

banks. So from the end consumer

20:34

point of view, if you're onboarding to multiple products

20:37

that use footprint, you don't have to re-enter or

20:39

confirm your information because we authenticate you and

20:41

we basically can auto fill a lot of

20:43

your information. We skip past that, but we

20:45

also get a lot of feedback and labels

20:47

from how you are as a user across

20:49

different platforms. So if you do

20:52

fraudulent activity in one place, you

20:54

shouldn't just be able to create a new account and

20:56

redo that same fraudulent activity somewhere else. And

20:59

there's basically infinite number of bad actors, right?

21:01

Because a single sort of fraudster or bad

21:03

actor can create a bunch of different synthetic

21:06

identities or steal a bunch of real people's

21:08

identities and onboard a bunch of different accounts.

21:11

But good actors are much more valuable to

21:13

spot. So the way to think about it

21:15

is if you're a good actor across a

21:17

number of platforms, you should have a much

21:20

easier time onboarding somewhere else and

21:22

basically jumping through less hoops because you've proven

21:24

yourself to be a good actor and footprint can

21:26

do that in a way that's both secure and

21:29

also compliant because we can also prove that in

21:31

a lot of our reporting. 10

21:38

seconds on the clock. How many things can you

21:40

name? They're always growing your skills as a

21:42

software developer, open source, the impact of

21:44

AI. How about your businesses

21:46

on Shopify? Shopify

21:48

is a global commerce platform that helps you sell

21:51

at every stage of your business, from the launch

21:53

your online shop stage to the first real life

21:55

store stage, all the way to the did we

21:57

just hit a million orders stage, Shopify is there.

22:00

to help you grow. Whether you're delivering

22:02

daily digest or serving sensational scoops, Shopify

22:04

helps you sell everywhere, from their all-in-one

22:06

e-commerce platform to their in-person POS system.

22:09

Wherever and whatever you're selling, Shopify's got

22:11

you covered. Shopify helps you turn browsers

22:13

into buyers with the internet's best converting

22:15

checkout, up to 36% better compared to

22:19

other leading commerce platforms, and sell more with

22:21

less effort thanks to Shopify Magic, your AI-powered

22:23

all-star. What I love about Shopify is no

22:26

matter how big you want to grow, Shopify

22:28

gives you everything you need to take control

22:30

and take your business to the next level. Shopify

22:32

powers 10% of all e-commerce

22:34

in the United States, and Shopify is the

22:36

global force behind Allbirds, Rothy's, and

22:40

Brooklinen, and millions of other

22:40

entrepreneurs of every size across 175 countries.

22:43

Plus, Shopify's extensive help resources are there

22:45

to support your success every step of

22:47

the way, because businesses that grow grow

22:49

with Shopify. Sign up for the $1

22:51

a month

22:53

trial at shopify.com/sedaily all lowercase.

22:55

Go to shopify.com/sedaily now to

22:57

grow your business no matter

22:59

what stage you're in. shopify.com/sedaily.

23:05

I'm going to come back to the technology in

23:08

a second because I've jumped away, but this is

23:10

just really interesting. So is that almost like a

23:13

user can have, and I know this is a

23:15

slightly loaded term, so I appreciate it, like a

23:17

credit score, a credit score almost, with footprint,

23:19

so like that now suddenly

23:21

someone is, we're able

23:24

to see the stats on someone

23:26

across multiple platforms, multiple companies, and

23:28

understand this user has a pretty

23:30

good history of being a reputable

23:35

user, I guess. I would say it's not

23:37

so much around, we're not trying to build

23:39

like a credit score across, you know, like

23:41

a social credit score or anything like that.

23:43

It's much more around, if you're doing obviously

23:45

bad activity, right? So for instance, if you're

23:47

onboarded to one platform, and you've

23:49

created this identity, and it's associated essentially

23:51

with your devices, and then

23:53

you go ahead and you onboard onto another

23:55

platform using the same device, but now you're using

23:57

a different identity or using different information, we can

23:59

actually detect that you're using

24:01

the same devices that were previously associated with a

24:03

different identity in footprint. So that's one kind of

24:06

way we can catch bad actors, which you know,

24:08

the typical sort of frauds that we call duplicate

24:10

fraud, right, is I go on Telegram and I

24:12

go into some group and I purchased like a

24:14

packet of 50 stolen identities.

24:16

And that might even include driver's licenses, by the way,

24:18

right. And then I go into like Robinhood and I

24:20

go into a couple of different apps, maybe there's some

24:22

signup perks like, you know, invest in your first

24:25

stock for free or something and I go and I

24:27

create a bunch of accounts and then I do a

24:29

bunch of bad activity with those accounts and then essentially

24:32

just abandon them later. It's something that's really hard

24:34

to stop. There's a lot of different tools that

24:36

companies have to string together in order to even

24:38

try to stop them. And then there's always that

24:40

risk that they're preventing a good person

24:43

from just using their product, which is a horrible

24:45

experience. And so what footprint does

24:47

is because we have this kind of concept

24:49

of like shared identity and we credential the

24:51

user, and we actually associate that

24:53

through various kind of device intelligence that we

24:55

collect as well using app clips and intern

24:57

apps, you know, we can get into kind

24:59

of the tech specs of how we're

25:01

building that. But the idea is that

25:04

we can essentially detect duplicate fraud happening

25:06

in real time. And we can

25:08

determine that the same sort of

25:10

person, you know, because we're at the front

25:12

end, because we're all the way at the

25:14

edge of where the user actually participates in

25:16

onboarding flow, we can detect that the same

25:19

person is onboarding multiple different identities and essentially

25:21

block them really quickly, right and provide those

25:23

risk signals to our customers that this might

25:25

be a duplicate fraud attack.
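A toy illustration of the duplicate-fraud signal described here: if a device fingerprint previously bound to one verified identity shows up onboarding with a different one, raise a signal. Real device intelligence and signal taxonomies are far richer than this; all names below are invented.

```typescript
// Maps a device fingerprint to the footprint IDs it has onboarded with before.
const seenDevices = new Map<string, Set<string>>();

function checkDuplicateFraud(deviceFingerprint: string, fpId: string): string[] {
  const priorIdentities = seenDevices.get(deviceFingerprint) ?? new Set<string>();
  const signals: string[] = [];

  // Same device, different verified identity than before: a classic
  // stolen or synthetic identity pattern worth flagging in real time.
  if (priorIdentities.size > 0 && !priorIdentities.has(fpId)) {
    signals.push("device_linked_to_other_identity");
  }

  priorIdentities.add(fpId);
  seenDevices.set(deviceFingerprint, priorIdentities);
  return signals;
}

checkDuplicateFraud("device-abc", "fp_id_alice");            // [] on first sighting
console.log(checkDuplicateFraud("device-abc", "fp_id_bob")); // ["device_linked_to_other_identity"]
```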

25:27

And that actually helps prevent a

25:29

lot of bad acting. The

25:31

other sort of spectrum and this is something

25:33

that you know, we're still developing is around

25:35

fraud labels. So you can tell us, hey,

25:37

we've offboarded this user for fraud.

25:40

And the reason this is a tricky

25:42

problem in general, is sometimes people commit

25:44

fraud with their real identities, right. And fraud

25:46

on one platform might not mean you

25:48

want to not allow that user on

25:50

to a different platform, right? Like, if I

25:52

rent a car and then you know,

25:54

don't return it or something, I mean, that's

25:57

pretty bad, right. And so there's this

25:59

actual translation of, like, how you commit

26:01

fraud on different platforms. For us, what we're

26:03

trying to do is really solve like the

26:05

onboarding fraud. We wanna determine the

26:07

information that somebody's providing really is their information.

26:10

So if we can detect that, you know,

26:12

you're using data from somebody else across

26:14

the same device, those are the kinds of fraudulent actions

26:16

that we want to stop in real time. And they're

26:19

pretty hard to stop with today's tools. Yeah,

26:22

got it. This

26:29

episode of Software Engineering Daily is brought

26:31

to you by Vantage. Do you

26:33

know what your cloud bill will be for

26:35

this month? For many companies, cloud costs are

26:37

the number two line item in their budget

26:40

and the number one fastest growing category of

26:42

spend. Vantage helps you get

26:44

a handle on your cloud bills

26:46

with self-serve reports and dashboards built

26:48

for engineers, finance and operation teams.

26:51

With Vantage, you can put costs

26:53

in the hands of the service

26:55

owners and managers who generate them,

26:57

giving them budgets, alerts, anomaly detection

26:59

and granular visibility into every dollar.

27:01

With native billing integrations with

27:04

dozens of cloud services, including

27:06

AWS, Azure, GCP, Datadog, Snowflake

27:08

and Kubernetes, Vantage is the

27:10

one FinOps platform to monitor

27:12

and reduce all your cloud

27:14

bills. To get started,

27:16

head to vantage.sh, connect your accounts

27:19

and get a free savings estimate

27:21

as part of a 14-day free

27:23

trial. Okay,

27:31

so back to the technology

27:33

side. The vaulting is such

27:35

a core piece to the

27:37

whole footprint platform. It sounds

27:40

like from your past experience, etc., you

27:42

were pretty familiar with even

27:45

just the idea of the concept of enclaves.

27:48

But I imagine

27:50

finding other engineers with

27:53

this knowledge. How

27:55

easy is that and how easy has it been

27:57

to get on board other engineers into what sounds

27:59

like a... fairly esoteric

28:02

system or having to think quite a different

28:04

way about how to build a platform. It's

28:08

interesting. I think one of the

28:10

toughest problems with working in security

28:12

and cryptography in general

28:14

is this concept of threat models. A

28:16

lot of people operate with different threat models

28:19

in their mind. What is a threat model?

28:21

Threat model is basically what powers an adversary

28:23

has. It's a set of assumptions you make

28:25

that if an adversary could

28:27

break, let's say, Amazon's root of, you

28:30

know, nitro enclave root of trust, if

28:33

they could manufacture those signatures, forge those

28:35

signatures, well, that would defeat the nitro

28:37

enclave guarantees, right? Or at least some part of

28:39

them. So, you know, we assume

28:41

in our threat model that, you know, the

28:43

adversary doesn't have the ability to forge, let's

28:46

say, global kind of, you know, CA's like

28:48

Amazon's root CA for nitro enclave trust. But

28:50

the real threat model, you know, that I think

28:53

affects a lot of businesses today that is very

28:55

reasonable that I think a lot of businesses don't

28:57

actually consider is this idea

28:59

of, okay, database encryption, data encryption,

29:01

right? So, the today kind of

29:03

standard is you store some data in your

29:05

database and then you turn on, you

29:08

know, encryption at rest, which is essentially a

29:10

switch on whatever database provider is using

29:13

that says, okay, when the hard disk

29:15

starts up and loads into memory, it

29:17

will decrypt this data, usually with a

29:19

symmetric key. And so what that

29:21

really means is that if somebody runs into the data

29:23

center, sticks their hand, you know, in the cage,

29:25

rips out a hard drive, hopefully that hard drive

29:27

will be end to end encrypted with a key

29:30

that is not anywhere to be found. And

29:32

so this protects that data in those kinds

29:34

of cases. The reality is that

29:36

these data centers are actually fairly well protected.

29:38

I mean, it's really unlikely: there are cameras everywhere,

29:40

they've got locked cages. So, while

29:42

this protection is great and important, and we should

29:45

still do it, by the way, we do use

29:47

this protection as well, even though it's redundant, in

29:49

some cases, it doesn't really

29:51

solve the much more realistic problem, which

29:54

is how many different services are

29:56

talking to your database. And in a modern

29:58

kind of cloud-based company, yes,

30:00

you have your applications, your microservices that all

30:03

have access to connect to the database. But

30:05

you might also have third party tools like

30:07

Retool or business intelligence dashboards, and they all

30:09

essentially have access to this data. And so

30:12

it becomes really difficult for businesses to limit,

30:14

okay, no one should be able to see

30:17

the social security number or the credit card number. And

30:19

so that's why kind of in general vaulting

30:22

is important. And so the threat model that

30:24

footprint operates under is that if we made

30:26

our database read only on the internet, publicly

30:28

available to everybody, you should not be able to

30:31

learn about people's private information, their

30:34

SSN, their documents. And so that's

30:36

sort of the core threat

30:38

model that we operate under. And so I would

30:40

say to go back to the original question, I

30:42

think the first kind of hurdle when you're bringing

30:44

on folks into working in security and enclaves and

30:46

the entire kind of area in general, is

30:48

just to be really clear about what threat model you're

30:50

fighting. And the main threat model

30:53

that we're trying to fight with enclaves is

30:55

the application surface area, right? Like the

30:57

99% of the code

30:59

or 95% of the code that everyone on

31:02

the team is constantly writing is sort

31:04

of out of scope for being able to

31:06

really abuse sensitive data. And

31:08

a malicious kind of dependency there, or

31:11

some bad code that was committed, right?

31:13

We do the maximal amount of, you

31:15

know, protection to separate that

31:17

from leaking any kind of

31:19

sensitive PII data, card data, things

31:22

like that. The good news about nitro

31:24

enclaves is that the actual surface area is really

31:26

small. Like I would say like our nitro enclave

31:28

related code is six to eight thousand

31:30

lines of code, which, you know, in

31:33

the grand scheme of things is pretty small. So

31:35

the number of engineers that actually have to write enclave

31:37

code is fairly small. They're really nice

31:39

things that when you hire a third party, you know,

31:41

pentest and auditing firm, it's a small surface area for

31:43

them to go through as well. And so

31:45

the nice thing is that it's not so much about

31:47

arming a team of 100 engineers to go

31:50

and work on this one part of the

31:52

code, just having a small team that really

31:54

understands the threat model, really understands the security

31:56

guarantees that you're trying to build. And then,

31:58

of course, going deep into the fundamentals. like

32:00

actually understanding like, you know, the elliptic

32:02

curves, you know, algorithm that you're using

32:04

and like making sure that everything is

32:07

up to spec and correct.

32:15

a listener of Software Engineering Daily, you

32:17

understand the impact of generative AI.

32:20

On the podcast, we've covered many exciting

32:22

aspects of Gen AI technologies, as well

32:24

as the new vulnerabilities and risks they

32:27

bring. HackerOne's AI Red

32:29

teaming addresses the novel challenges of

32:31

AI safety and security for businesses

32:33

launching new AI deployments. Their

32:36

approach involves stress testing AI models and

32:38

deployments to make sure they can't be

32:40

tricked into providing information beyond their intended

32:42

use, and that security flaws can't be

32:45

exploited to access confidential data or systems.

32:48

Within the HackerOne community, over 750 active

32:50

hackers specialize in prompt

32:53

hacking and other AI security and safety

32:55

testing. In a single recent

32:57

engagement, a team of 18 HackerOne

32:59

hackers quickly identified 26 valid findings

33:02

within the initial 24 hours and

33:04

accumulated over 100 valid findings in the

33:08

two-week engagement. HackerOne offers

33:10

strategic flexibility, rapid deployment,

33:12

and a hybrid talent

33:14

strategy. Learn more at

33:16

hackerone.com/AI. That's hackerone.com/AI.

33:29

I'm not sure if you're completely familiar

33:31

with this, but like MongoDB,

33:33

you know, they released queryable encryption. That

33:36

feels like they're almost trying to kind

33:38

of get into this

33:40

space, but maybe from a different angle

33:42

where it's not as much because the

33:45

AWS Enclaves sounds like it's quite a hardware-

33:48

heavy hurdle to get over

33:51

for someone else to kind of come into that space.

33:53

But yeah, I don't know if you are familiar with

33:55

any kind of comparisons you could, or

33:57

differences you could make there. Yeah,

34:00

I'm not familiar with the fundamentals of it, but I have

34:02

read a little bit about it. I think

34:04

the major questions I would have you know if I were to go

34:06

and dig into this and compare it, is how

34:09

those keys are managed, how it actually

34:11

does the encryption? Basically, there's two important

34:13

things to consider when you're writing data What does that

34:15

process look like, like how are you encrypting it in

34:17

a way that it can be, you know, queryably decrypted

34:20

later and then of course like

34:22

how those underlying keys are stored because one of the

34:24

nice things about nitro enclaves is that

34:26

you get these hardware security guarantees

34:28

of isolation that says that this key

34:31

you know, this master private key

34:33

for that user vault cannot be decrypted

34:36

outside of you know footprint signed

34:38

enclave code running on legitimate enclave

34:40

hardware. And potentially, you know, you

34:42

could even write maybe an adapter for MongoDB

34:44

that talks to a nitro enclave in real time

34:47

or something; you might be able to do that as

34:49

well. There's a really good, and this

34:51

is a pretty old paper that came out of

34:53

MIT and I read it when I was there

34:56

called CryptDB, which actually tries to

34:58

do a lot of this. You know, fully homomorphic

35:00

encryption, if you go really far into, like, the

35:02

theory field, there are companies that are

35:04

actually trying to build this in practice. It's

35:07

pretty slow; like, this is sort of a generalizable

35:09

way of computing on encrypted data. And

35:11

there's sort of two interesting things about that. The

35:13

first is that it's pretty slow, but there's a

35:15

lot of research happening to optimize it, and the

35:17

CryptDB paper is sort of a way of

35:19

saying, okay, let's take a bunch of different partially

35:21

homomorphic cryptographic algorithms and encrypt

35:24

data in some special ways that we

35:26

can still do a bunch of queries

35:28

without fully decrypting all that data

35:30

I think the problem in general, there's

35:32

two problems: one is that it's fairly complicated

35:34

for most developers to interact directly with

35:36

these systems because it requires a deeper

35:39

level of understanding about things like Homomorphic functions like

35:41

how to actually encrypt data in a way that

35:43

you can query what you want later and most

35:46

companies as they're building products They actually don't necessarily

35:48

know all the queries that they're going to want

35:50

to run eventually. And so kind

35:52

of footprint takes the more opinionated approach of:

35:54

you rarely will ever need to query SSN,

35:57

and we know what PII data looks like, we know what

35:59

identity data looks like, so we can fingerprint things

36:01

and make things searchable in a way that

36:03

you're using us as your identity source, secure

36:05

identity source. So we already know all the

36:07

kinds of queries you're going to want to

36:09

run on searching for

36:12

identity information. And we're going

36:14

to make that secure versus a more

36:16

generic approach, which then kind of punts

36:18

that decision and architecture to the developers.

36:21

And the second thing, which is also pretty important is

36:23

how are the keys managed, right? So this is something

36:25

that footprint solves with Nitro Enclaves. But at the end

36:27

of the day, cryptography doesn't really solve any problems. It

36:29

just kind of boils it down to the length of

36:31

the key. And those keys are pretty short,

36:33

but you still have to store them somewhere. So even

36:35

in the world of fully homomorphic encryption,

36:37

even if it was efficient, you still need

36:39

a secure place to do those computations.

36:42

And so, you know, if fully homomorphic encryption becomes

36:44

super performant tomorrow, I think our Nitro Enclaves

36:46

are the best place to do them. And

36:48

we would essentially be able to roll out

36:50

a bunch of new abilities for doing more

36:52

computations efficiently in a way that

36:55

secures the key outside of the rest of

36:57

the application surface area. Yeah,

36:59

it's a very good point to make just about working

37:02

with encrypted data, which is yeah, the speed

37:04

factor. And I think it's a great point

37:06

to make, that it is

37:09

okay, actually, in cases such as this, to

37:11

have opinions about the kind of data that

37:13

is going to need to be worked with

37:15

often. And then the kind that

37:17

isn't and developers should maybe kind

37:19

of try and think a little bit more ahead

37:21

on these ones where they can. And

37:24

I say this more from the, I guess, the perspective

37:26

of developers should start getting really used

37:28

to having to work with encrypted data, because

37:30

to this point in time, probably

37:33

I'm just going to hazard a guess that

37:35

like at least 50% out there listening have

37:37

never, like, the databases that they work

37:39

with are just wide open: they

37:41

can see all the data and they know exactly what's going

37:43

on, or they can see it. But you

37:45

know, fast forward two years, I hope

37:48

it's only as long as two years. But

37:50

you know, most databases, if you're a developer, you look

37:52

at it, you have no idea what's going on.

37:55

And you're maybe working with some sort of fake data,

37:57

fake test data, which we've got a couple of

37:59

episodes coming up on, or that maybe have

38:01

already been aired. But generally speaking, so the

38:03

age of this database that everyone

38:06

can kind of just see and

38:08

work with really fast, no, it's going

38:10

to have to change. And it

38:12

sounds like Nitro Enclaves are kind of

38:14

the absolute hardened version of that, which

38:16

is obviously needed for such a platform

38:18

that is a vaulting platform. But there

38:20

are other step measures that developers can

38:22

take that maybe isn't quite as far

38:24

as Nitro Enclaves, but it's good

38:27

to be aware of all the different approaches. Yeah,

38:30

absolutely. We've seen already that a lot of

38:32

our customers and companies we talk to, they

38:34

don't even want to store SSNs. You wouldn't

38:36

believe the amount of architecture hoops that people

38:38

jump through so that they don't need

38:41

to store things like SSNs. And

38:43

because they don't want to have to worry

38:45

about encrypting it and auditing it later, so

38:47

they sometimes require users to retype their SSN

38:49

multiple times in order to then send

38:51

it in memory without actually ever storing it

38:53

down. And I actually applaud those approaches.

38:56

I mean, they're kind of thinking about it the

38:58

right way. And we're just trying

39:00

to make that developer experience much easier. And some

39:02

of the businesses actually do need to store this

39:04

data for audit purposes, and

39:06

we can solve it for them. Absolutely.

39:09

So moving forward to

39:11

how Footprint is a platform that

39:13

a consumer or user is going

39:15

to be aware of and interact

39:17

with, as well as the provider,

39:19

the one that needs all this

39:21

technology so that they can verify.

39:24

It's a kind of a double-sided problem, or at least that's how I

39:26

kind of looked at it, that you kind of

39:28

have to get both sides in agreement

39:30

that they're going to use this thing. I guess

39:32

the user consumer maybe has way less choice. If

39:35

they're coming onto a platform that uses Footprint, well,

39:37

that's what they need to use. But

39:39

yeah, how are you kind of approaching this? Are

39:42

there other services using Footprint? And sort of

39:44

are there any, what kind of hurdles are

39:46

you facing there, if any? Yeah,

39:49

so I think you said it right, which is

39:51

users, they don't really care, I think.

39:54

We'd like to think that most people

39:56

love to perfectly architect their entire world

39:58

of privacy and security. But

40:00

most don't understand it, and they shouldn't really understand it.

40:02

I've always been a believer that security

40:04

without user experience is pointless. If it's

40:07

not easy to use, no one's going to use it, so it doesn't matter how secure

40:09

it is. And so part of the overall

40:12

ethos of Footprint is actually be

40:14

very design and consumer-focused, even though

40:16

we're selling to enterprises, businesses. And

40:19

part of that is because the end user interacts

40:21

with these embedded onboarding modules that we built. And

40:24

so from the user's point of view, they

40:26

think that the first time they're onboarding Footprint

40:29

via a product, they're not really thinking about

40:31

Footprint. They're not really seeing it. We do

40:33

a couple of things to help the user

40:35

understand if they're curious, because even though a

40:38

lot of users might not care so much

40:40

about their privacy and security, any time

40:42

you ask them for very sensitive information,

40:44

they always pause. And that's the worst

40:46

thing. You're trying to convert a user

40:48

to buy some stock in your investing app, and

40:51

they don't want to give you the core information you need to open an

40:53

account for them. That's tough. And the

40:55

drop-off rate is huge. One

40:57

of our missions is actually to lower the friction,

40:59

make it easier for people to use trusted applications

41:01

on the internet. And so to

41:03

do that, we actually automatically generate certain privacy pages,

41:06

and we explain to users how their data is

41:08

stored and secured. And it's akin to the very

41:11

old days of the badges on

41:14

websites of your credit card information stored

41:16

securely, but I think on steroids and

41:18

where we actually explain to users much

41:20

better. So from the user's

41:22

perspective, the customers, the businesses, the

41:24

products that they're onboarding to have already made a decision

41:27

to use footprint, and so they kind of go along

41:29

with the flow. The one crucial step

41:31

that we do to that onboarding is we have

41:33

users set up their passkey, and we verify their

41:35

phone number. So we do a couple of credentialing

41:37

steps. So when the user comes back,

41:40

it kind of is a magical experience, where they

41:42

do a Face ID or they do a Touch

41:44

ID. They basically use their

41:46

passkey, which from the user's perspective is just

41:48

a biometric. But as we understand it, the

41:50

biometric is really a safeguard to the cryptographic

41:52

key that's stored on that device or synced

41:54

across all their devices. And so

41:56

from the user's perspective, they do this very secure operation

41:58

and all of their data is

42:01

autofilled. And so that actually

42:03

is a much better feeling than if,

42:05

for instance, some competing technologies like, I think

42:07

you enter the last four of your SSN

42:09

and then verify an OTP and

42:12

then it fills in your information, that to users

42:14

I think feels a little bit more sketchy of

42:16

like, how do you get all my information? Maybe

42:18

I'll drop off, maybe I'll convert a little bit

42:20

less, clearly, but it's also less secure, because SIM swaps,

42:23

as we know, are happening all the time. Man-

42:25

in-the-middle phishing happens all the time

42:27

with OTP codes. So passkeys really help us in that,

42:30

when a user onboards the first time, we

42:32

create that passkey for them, and that lets us not

42:34

only secure that account, but also kind of slowly

42:37

start onboarding them into this idea of footprint and

42:39

this portable identity across the internet. For

42:41

businesses kind of on the other side, they also

42:43

don't actually necessarily care about the portable identity, especially

42:46

when we were just starting out and we had

42:48

zero portable identities, right? Why would they care, right?

42:50

It's kind of a chicken or the egg network

42:53

bootstrapping problem. And so we actually

42:55

knew that, you know, that was part of

42:57

our design from day zero. We knew that

42:59

companies will not care about that. And so

43:01

the reality is we actually set out to

43:03

build the best onboarding platform regardless of the

43:05

portable identity, right? Even if portable identity never

43:07

existed, companies should still choose Footprint, because it

43:10

simplifies developer experience. It's much more

43:12

accurate. It has security baked in

43:14

and solves all these kinds of

43:16

compliance regulatory problems as well. And

43:18

so that's sort of how we convince

43:20

businesses to use Footprint: just giving them

43:22

a much superior product. And then eventually

43:24

the portable identities start rolling in and then

43:26

it actually creates an even higher value prop

43:28

because not only can you securely onboard your

43:31

users, you can also onboard

43:33

them with less friction, which means a higher

43:35

conversion rate, better user experience. And

43:38

so for businesses, things like the passkeys,

43:40

the credentialing actually end up playing a pretty

43:42

positive role too. And that was something that

43:44

wasn't necessarily obvious to us when we first

43:46

set out to build footprint. And this is

43:48

part of the security pillar as well,

43:50

which is, you know, every platform

43:53

that we talked about early on, like they need to have

43:55

the user log back in. And it turns

43:57

out that account takeover has a detrimental effect,

43:59

not only, of course, on the end

44:01

user, but also on the business who

44:04

sometimes is responsible for remediating the effects

44:06

of that account takeover. Take, for example,

44:08

a gambling app. Gambling

44:10

regulations are really strict. If you break them,

44:12

you could lose your gambling license, which can totally

44:14

shut the business down. Imagine

44:16

a case where a parent creates an account, they're

44:18

doing everything legally, they KYC themselves,

44:21

kid picks up their phone, does the OTP challenge

44:23

because they picked up the phone, and

44:25

then logs back in. And now things are bad.

44:27

Now you have an underage person gambling,

44:30

the parent complains that they lost a bunch of

44:32

money and it wasn't them, it's this gray area

44:34

situation. So strong authentication

44:36

that proves that the person who verified their

44:38

identity initially is the same person unlocking that

44:40

device, all of a sudden becomes

44:43

very important. That's true,

44:45

not only for this kind of

44:47

friendly fraud scenario where someone you know

44:50

takes your phone, maybe they don't even know any

44:52

better, but this is true as well if somebody

44:54

breaks into your account, somebody, like an actual fraudster

44:56

hacker or whoever it might be. So,

44:59

really having that strong identity and authentication

45:01

tied to the identity that created that account

45:03

becomes really important. The same goes for account recovery: if

45:05

you lose access to your devices, how do

45:07

you prove who you are again?

45:10

To do that, we know how to verify someone's

45:12

identity, we have a bunch of authentication mechanisms that

45:14

are beyond just username and password, which you might

45:16

forget or lose access to. These

45:19

kind of properties that make portable identity

45:21

possible also have become pretty

45:24

valuable to companies that aren't even thinking

45:26

about the portable identity aspect, just some

45:29

kind of core onboarding things that they

45:31

have to solve. Yeah, I

45:33

really like how you guys have thought

45:35

about this, which is, yeah, the portable

45:37

identity, yeah, that would be great once

45:40

hundreds of thousands are using it, but if

45:42

you focus on that on day one and that's how you

45:44

try and sell it, then of course, unless the product is

45:47

just a superior product, yeah, it's chicken

45:49

and eggs. So I really like that. Passkeys,

45:52

I'm really interested on this one.

45:54

Some listeners will probably know

45:56

by now that I love passkeys,

45:58

but I still find. And

46:00

I love talking about them because I think there's

46:02

just so many bits to them and nuances of

46:04

the experience especially. So it's really

46:06

interesting you said that actually passkeys have kind

46:09

of really helped here and sped up some

46:11

of the parts of the process. I'm

46:13

making a guess here that actually when

46:15

we're talking about identity and documents, then

46:17

when we're then asking people to do

46:20

a kind of an extra step of

46:22

authentication, they're quite, perhaps they're

46:24

quite happy to do that. I'm coming

46:26

from a slightly different side,

46:28

which is actually

46:30

currently building a platform that is passkey

46:32

only. And actually it's become a

46:35

bit of a hindrance for us where users are

46:37

like, what is this thing? What

46:39

is this passkey? And it's because it's one of the first times they've

46:41

ever seen a passkey, perhaps, or in most

46:44

cases. Do you find that your users, have

46:46

they seen passkeys before? Is it something to

46:48

do with like, yeah, the biometric or however

46:50

they do it, it could be via even

46:53

their password vault. Do they just kind

46:55

of match it up and say, yeah, well, this all makes

46:57

sense? Or have there still been some UX

46:59

hurdles to get over there? Yeah, I

47:01

mean, I think you've totally hit the nail

47:03

on the head. Like the user experience part

47:05

here is really the hard problem. I mean,

47:07

we've passed a lot of the other hurdles,

47:09

the technology problems, the cryptographic keys, the hardware

47:11

devices, storing them. This is

47:13

sort of the really difficult one. And

47:16

there are a couple of things that we've done that actually

47:18

make passkeys a little bit easier for our users to adopt.

47:20

So one of the cool

47:22

things: part of the onboarding process in

47:25

Footprint sometimes includes document verification. And

47:28

when you're scanning a document and then doing

47:30

a selfie scan, I mean, users are fairly

47:32

familiar with this process just because tons of

47:34

platforms like Coinbase require it regularly.

47:37

And one thing that we do, when

47:39

you onboard on the desktop or web,

47:42

we show a QR code to that user,

47:44

and they scan it with their phone, the camera on

47:46

their phone, and that actually opens up a native experience

47:49

on their device. So we're leveraging this

47:51

new technology called Instant Apps on Android and App Clips on iOS.
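As a rough illustration of this desktop-to-phone handoff pattern (the domain, endpoints, and token format below are placeholders, not Footprint's actual implementation), the desktop session can mint a short-lived, single-use token, render it as a QR code pointing at an https URL associated with the App Clip or Instant App, and poll until the phone finishes:

// Sketch of a QR-code handoff between a desktop session and a phone.
// Assumes the open-source "qrcode" npm package; endpoints are hypothetical.
import { randomBytes } from "node:crypto";
import QRCode from "qrcode";

// Server side: mint a short-lived, single-use handoff token tied to the session.
function createHandoffToken(onboardingSessionId: string) {
  return {
    token: randomBytes(32).toString("base64url"),
    sessionId: onboardingSessionId,
    expiresAt: Date.now() + 5 * 60 * 1000, // five-minute window
  };
}

// Desktop client: render the handoff URL as a QR code. An https URL that is
// associated with an App Clip / Instant App opens the lightweight native
// experience when scanned, instead of a plain browser tab.
async function renderHandoffQr(canvas: HTMLCanvasElement, token: string) {
  const url = `https://handoff.example.com/v1?tok=${token}`; // placeholder domain
  await QRCode.toCanvas(canvas, url, { width: 240 });
}

// Desktop client: poll until the phone completes document scan + passkey setup.
async function waitForPhoneCompletion(token: string): Promise<void> {
  for (;;) {
    const res = await fetch(`/handoff/status?tok=${token}`); // hypothetical endpoint
    const { state } = await res.json();                      // "pending" | "complete"
    if (state === "complete") return;
    await new Promise((resolve) => setTimeout(resolve, 2000));
  }
}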

47:53

And so you might have seen this if you went

47:55

to a restaurant and you got like a QR code

47:57

on a bill, and you scanned

48:00

that QR code with your phone, and it brought up this

48:02

native Toast app or this app to pay. We're

48:05

actually using that exact same tech. And so

48:07

when you scan that QR code, we're loading

48:09

an app just in time. It's a really

48:11

small application that

48:14

really gives us, the business and the user, a

48:16

bunch of different benefits. First of all, it's much

48:18

more native user experience, much more trusted. The user

48:20

feels that they're doing something that is okay

48:23

versus pulling up a sketchy website. We've

48:25

all done those. I checked into a

48:27

hotel recently and they made me go to this

48:29

random URL and upload a picture of my driver's license. So

48:33

this feels like a really native experience. It's

48:35

branded to the product that they're onboarding to.

48:37

So they'll see Coinbase or they'll see Flexcar

48:39

there. And so the idea is we do

48:42

the document scanning, we do the selfie scanning.

48:44

And by the way, it's much more accurate

48:46

because we can use rectangle detection, use all

48:48

the native ML capabilities on the device to

48:50

have much better scanning than you would get

48:53

through a webcam. We do

48:55

the face detection, selfie detection. We can

48:57

scan the barcodes and the different attributes,

48:59

vault it directly, all that good stuff.

49:01

But the other benefit is that we

49:03

can tap into the native passkey biometric

49:06

API as part of

49:08

the liveness check. Users don't necessarily

49:10

even know that they're registering a passkey.

49:13

From their perspective, it seems like they're setting up

49:15

Face ID for this product. And

49:17

so it actually feels a lot more native because you're in

49:19

a native app, there are a lot fewer

49:21

screens to go through and

49:23

you're already giving some information. So maybe

49:25

emotionally, it's easier to also do a

49:27

Face ID there, because that's actually much

49:30

faster than scanning a document and whatever

49:32

else. So we couple that

49:34

piece with the identity verification piece

49:36

for documents. And then it works

49:38

standalone too. And if you're not scanning a document, you could

49:40

just scan that QR code to use your phone to set

49:42

up that PASQI. So one of the things that we do

49:44

is when we onboard a PASQI, we always do it with

49:47

a phone. And our reason

49:49

for that is that maximizes the chance

49:51

that the user will go through that flow

49:53

successfully, but also that that passkey will then

49:56

be synced across all their devices. Who knows

49:58

where they're onboarding, on desktop or

50:00

web. And even if, you know, sometimes

50:02

you're renting an apartment let's say you rent an

50:04

apartment in person, you use the realtor's laptop

50:06

or whatever, right? At least you're using your own

50:08

phone, right? So we try to make sure you

50:10

use your own device every single time to do

50:12

these kinds of really sensitive things, like setting up a

50:14

passkey or scanning a document. And so that's just part

50:17

of the regular flow. We've seen pretty good success

50:19

with that. You know, I think different platforms, when

50:21

they introduce passkeys, they try to explain a

50:23

passkey to a user, and I'm not sure

50:25

exactly, you know, I think different

50:27

users maybe react differently to those kinds of

50:29

things. I just want to call out

50:31

to listeners: I think that's one of the most

50:33

interesting and fantastic ways to use passkeys, the

50:36

way that's just been explained there. I was gonna

50:38

ask anyway yeah you know how have maybe

50:40

mobile devices, or has that come into it, but this is it.

50:42

I think that passkeys

50:44

on desktop currently are the big fall

50:46

down for all the reasons you've just mentioned

50:48

you know you could be on someone else's

50:50

laptop much more likely than a phone for

50:52

example but also just the general

50:55

experience across OSs, across browsers, is

50:58

very disjointed right now. I

51:00

really love the way that you've explained

51:02

how you approached it. So yeah, that's

51:04

fantastic. So just kind of

51:06

starting to wrap up a bit I mean

51:08

we're recording this probably around a month

51:11

before listeners will be hearing it but you

51:13

mentioned one kind of product development towards the

51:15

start of this episode. But what else can

51:17

you share you know across say the next

51:20

six to twelve months? Like, what can people

51:22

potentially expect from Footprint? Yeah,

51:24

first of all thank you appreciate the kind

51:27

words there, and we're pretty excited about it. Yeah,

51:30

there are two kind of real updates coming pretty

51:32

soon probably sooner but probably the next six

51:34

months or so. In addition to this

51:36

compliance product we've been working really hard to integrate a

51:38

bunch of new fraud tools. And so one of the

51:40

really cool things about using Footprint on the front end

51:42

is that because we can do

51:45

a bunch of behavioral analysis of the user

51:47

we can actually bake that into our rules

51:49

engine, and so bake that into different step-

51:51

ups. So what you'll see with Footprint is

51:53

that we're now starting to integrate and

51:55

provide a lot of risk signals around fraudulent behavior.

51:57

Think of, like, a user typing their SSN

52:00

very hesitantly or like they're typing in two

52:02

digits and then they're pausing and typing in

52:04

more right to suggest that maybe they're copying

52:06

it from somewhere. You know, usually people know

52:08

those by heart. So taking into account a bunch

52:10

of different variables and user behaviors and how

52:13

they interact with the different inputs maybe

52:15

they're, you know, fingerprinting their mobile

52:17

web-based session, all sorts of the

52:19

front-end bells and whistles you can imagine

52:21

that can tie into the KYC the

52:23

onboarding experience, and exposing those as fraud risk signals.
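As a sketch of the kind of front-end behavioral capture being described (the field, thresholds, and signal shape are made up for illustration, not Footprint's real instrumentation), one way to record hesitation and paste behavior on a sensitive input looks like this:

// Sketch: capture coarse behavioral signals on an SSN input field.
interface BehaviorSignals {
  keystrokeGapsMs: number[]; // time between successive keystrokes
  longPauses: number;        // pauses over 1.5s mid-entry (possible copying)
  pasted: boolean;           // value was pasted rather than typed
}

function instrumentSsnInput(input: HTMLInputElement): BehaviorSignals {
  const signals: BehaviorSignals = { keystrokeGapsMs: [], longPauses: 0, pasted: false };
  let lastKeyAt = 0;

  input.addEventListener("keydown", () => {
    const now = performance.now();
    if (lastKeyAt > 0) {
      const gap = now - lastKeyAt;
      signals.keystrokeGapsMs.push(gap);
      if (gap > 1500) signals.longPauses += 1; // e.g. typed two digits, paused, typed more
    }
    lastKeyAt = now;
  });

  input.addEventListener("paste", () => {
    signals.pasted = true;
  });

  return signals; // later attached to the onboarding payload as fraud risk signals
}

// Usage: const signals = instrumentSsnInput(document.querySelector<HTMLInputElement>("#ssn")!);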

52:25

We're also integrating third-party tools.

52:27

You know, I think, unlike KYC,

52:30

which is commoditized and largely can be solved

52:32

for any kind of business with a simple set of

52:34

rules I think fraud is much

52:36

more complicated because different businesses experience really different

52:39

types of fraud. Payments fraud, or crypto for

52:41

instance, experiences horribly different fraud. And so the

52:43

idea is being able to respond without having to,

52:45

you know, when a new fraud attack pops up,

52:48

put three engineers on it, spend a month

52:50

building the integration, and by that time a new

52:52

fraud attack hits your company. The

52:54

idea with Footprint is, because we own

52:56

that onboarding flow for you, you can

52:59

just turn on different tools and make it

53:01

really easy to deploy anti-fraud techniques without writing

53:03

a single line of code in response to

53:05

it, in real time. We can even tell

53:07

you that these things are happening before you

53:09

even realize they're happening. And this is

53:11

something that we've seen be really effective to a lot

53:13

of customers that we're talking to and our

53:15

existing customers. So that's one. And

53:18

kind of related to that you know we've talked

53:20

about this, and I've kind of maybe taken for

53:22

granted a little bit how companies just

53:24

use our onboarding experience. And actually, you know,

53:26

if you're head of product at a company and

53:28

onboarding is part of your domain you're probably

53:30

a little bit hesitant when you hear that.

53:32

You say, okay, we spent, you know, three

53:35

months building a perfect onboarding flow that like

53:37

explains the product and then bounces into you

53:39

know collecting some information something very

53:41

intricate. How can we just replace all of

53:43

that with like an embedded you know credit

53:45

card form like Stripe would do but for

53:47

identity? And, you know, we've built from

53:49

the ground up a very customizable onboarding form that

53:51

makes it so that it looks and matches, you

53:53

know, the look and feel of your product.

53:56

But we've even hit kind of the upper

53:58

bound limits of that as well. And so

54:00

one of the things that we're rolling out really soon is

54:03

we've taken everything that we've done in our

54:05

embedded onboarding experience and we're breaking that out

54:07

into components. So effectively if you

54:09

want to adopt Footprint, instead of having to

54:11

rebuild your onboarding flow or, you know, delete

54:13

the thing that you've already built and replace

54:15

it with Footprint, what you can really

54:18

do is go to all your different PII inputs and

54:20

just replace them with the Footprint components, and we'll

54:23

magically take care of the rest. And so

54:25

that includes all that sort of validation, errors

54:27

like you know the PO box example I gave or like

54:30

the you know fat fingering the date of birth all

54:32

the way to all the fraud behavioral

54:35

capturing that we do, because you're now

54:37

using our inputs and all the

54:39

one-click abilities. So, you know, if we detect that

54:41

we already know who that user is, we can

54:43

just pop a modal up and say hey authenticate

54:45

your passkey, and then autofill the rest

54:47

of those form fields, have the user confirm that

54:49

information and skip past the next step. So we've

54:51

effectively taken our Embedded onboarding experience

54:53

and broken it out into components. We're

54:56

calling it onboarding components and kind

54:58

of like what you would imagine, you know an SDK,

55:00

right? So instead, if you're a new company

55:02

and you're building your onboarding flow, you can still

55:04

build it, just use the components from Footprint. You

55:06

can customize them however you'd like but you get

55:08

these sorts of superpowers right from day one.
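To picture what a component-style integration like this could look like, here is a purely hypothetical React sketch; the package name, component names, and props are invented for illustration and are not Footprint's actual SDK surface:

// Hypothetical sketch only: "@example/onboarding-components" and its exports
// are invented to illustrate the drop-in PII input idea described above.
import React from "react";
import { OnboardingProvider, NameInput, DobInput, SsnInput } from "@example/onboarding-components";

export function SignupForm() {
  return (
    <OnboardingProvider
      publishableKey="pk_test_placeholder"        // identifies your tenant (placeholder)
      onAutofillAvailable={(profile: Record<string, string>) => {
        // If a returning identity is recognized, prompt for a passkey and
        // prefill the remaining fields from the already-verified profile.
        console.log("Recognized user, prefilled:", profile);
      }}
    >
      {/* Each input vaults its value directly and reports validation and
          behavioral risk signals, instead of posting raw PII to your backend. */}
      <NameInput label="Legal name" />
      <DobInput label="Date of birth" />
      <SsnInput label="SSN" />
    </OnboardingProvider>
  );
}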

55:11

So those are the two things that we're excited about

55:13

probably in the next few months, and

55:16

yeah a lot more coming later But I don't know if I can share

55:18

that just yet. Yeah, I like

55:20

where I think I hear that's going almost sort

55:22

of Kind of SSO but

55:24

in a nice way where the user hasn't

55:27

quite realized it, it's the

55:29

experience that's driven it. And it's not sort of

55:31

a maybe quite so much a brand but they

55:33

realize this experience of onboarding there. They're used

55:35

to it and then yeah, they kind of the

55:37

magic bit there, you know I just use my

55:40

passkey and boom, all these details. Obviously,

55:43

slowly they kind of know why they came up

55:45

and where they've come from. But yeah, I use my

55:47

passkey, oh wow, they already know about me, and in a

55:49

good way fantastic. I could just move through this process very

55:51

fast. Yeah, I like the sound of that. Yeah,

55:54

our goal isn't to teach the user about Footprint when

55:56

they're going through that onboarding flow for the first time,

55:58

but rather to teach

56:00

them about it when they come back, so they understand

56:02

why it's being auto-filled the next time or

56:04

the third time and fourth time. That's really

56:06

where I think people will start seeing Footprint

56:08

directly, but right now we're focused

56:10

on building that initial onboarding experience and making that

56:13

the best. Yeah, the example of that I guess

56:15

I've seen more closely, but it's

56:17

kind of on the end of an

56:19

experience, is the Shop Pay model

56:22

where, yeah, users weren't really aware they

56:24

were using Shopify across all these different

56:26

sites, but they were aware that this

56:28

payment method, which was just, I believe,

56:30

the first iteration was just put in

56:32

your phone number and we've got the

56:34

rest basically. And that kind

56:37

of, yeah, people then started to clock,

56:39

oh, this means that's Shopify, but

56:41

the experience was so great. They didn't really

56:43

care to begin with. And then now Shop

56:45

Pay has been rolled out

56:48

outside of Shopify, which is super interesting. But yeah,

56:50

it kind of sounds like the same approach, very

56:52

different place to position it for the user. But

56:55

yeah. Yeah. Shop Pay is incredible.

56:57

I mean, I think they're also rolling out passkeys as

56:59

well, I believe. So,

57:01

yeah, I think it's a very similar model. I think they're

57:03

probably innovating there, like maybe them and Stripe Link,

57:05

I think are trying to do

57:07

something similar as well with the payments provider.

57:09

But yeah, we definitely have a

57:11

lot of respect for those giants. They're doing awesome

57:14

work. Yeah. Yeah. Likewise. Alex,

57:17

it has been great to have

57:19

you on. There's been just so many interesting

57:22

things here. And I'm sure a lot of

57:24

inspiration for developers out there as well, especially

57:26

around sort of the vaulting side, as well

57:28

as, you know, all this stuff towards the

57:30

end about passkeys and how to

57:33

implement them in a nice way. So

57:35

yeah, I just want to say thank you so

57:37

much for coming on. Really appreciate the time. Thanks

57:40

for this, it has been super fun. Really glad to do it.

57:42

Thank you.
