TechStuff Classic: The Secrets of Tor and the Deep Web

Released Friday, 9th April 2021

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.

0:04

Welcome to Tech Stuff, a production

0:06

from iHeartRadio. Hey

0:12

there, and welcome to Tech Stuff. I'm your

0:14

host, Jonathan Strickland. I'm an executive producer

0:16

with iHeartRadio and I love all things tech,

0:18

and it's time for a classic episode.

0:21

This episode originally published

0:23

on April six. It

0:26

is titled The Secrets of Tor

0:28

and the Deep Web. I've covered

0:31

these topics a few times over the years.

0:33

This one was a pretty fun discussion. Hope

0:35

you enjoy. The Mighty Thor

0:38

is one of the Avengers. He wields

0:40

the hammer Mjolnir, and his

0:42

brother is Loki. She's

0:45

not even rolling her eyes. She's just staring me down

0:47

this time. Okay. So seriously, though,

0:49

what Tor is is free software. It's

0:52

an open network, and it helps you defend

0:54

against traffic analysis. In other words,

0:56

people trying to figure out what you are

0:58

doing and who you're communicating with. Traffic

1:01

analysis is a form of network surveillance

1:04

that threatens personal freedom and privacy. Uh,

1:06

it threatens confidential business activities

1:08

and relationships, and it threatens state security.

1:11

Therefore, some folks got

1:13

together and said, hey, you know what we should do is we should

1:15

come up with the means to allow

1:18

people to communicate over the Internet,

1:20

but do so in a private,

1:22

anonymous fashion, so that you can

1:24

set up these anonymous channels. Perhaps

1:26

the most popular way to access this is

1:28

through a customized build of Firefox

1:30

called the Tor Browser Bundle. Right,

1:33

Yeah, because just using Tor on its

1:35

own is one thing to do

1:37

to allow you to have a little more of an anonymous

1:39

presence, but it requires

1:42

more than that, because if you access Tor through

1:44

some other means, if you don't have say Flash

1:47

disabled in your web browser, then

1:49

you're still kind of broadcasting where

1:52

you are because Flash often involves

1:54

uh identification information

1:57

in order for it to work. So it

1:59

is open source. So if you feel like getting in there

2:01

and doing your own thing, you're absolutely able

2:04

to. Um,

2:06

and a lot of people do use it in one

2:08

form or another. At its peak, more

2:11

than half a million people were using it every day.

2:13

Yeah, oddly enough, I think as

2:16

I recall, in that year, there was some news that broke

2:19

about government agencies.

2:23

Yeah, Edward Snowden had that leak

2:25

about the NSA, and suddenly

2:27

people were thinking. You know, I think it like

2:29

doubled. Yeah. Yeah, it was one

2:31

of those things where people began to get very

2:34

concerned. And it's not necessarily

2:36

that these people are doing anything wrong. In fact,

2:38

that's not the point at all. The point is that

2:41

they have an expectation of privacy

2:44

and being able to hold this kind of anonymous

2:47

communication with other people. The

2:49

communication itself isn't necessarily anonymous,

2:52

but the channels are. Uh,

2:54

you know, that's just that's just an expectation we have.

2:57

It's not that, you know, I'm planning

2:59

something nefarious. It's just, if I want

3:01

to send a message to Lauren, and

3:04

it's just for Lauren's eyes, I

3:06

don't think anyone else has the right

3:08

to look in on that. So yeah,

3:10

and in normal internet traffic, that's absolutely a

3:13

possibility. Yes, Because we've

3:15

talked a lot about how information travels

3:17

across the internet. You know, it all gets divided

3:20

up into these little packets. Then the packets

3:22

go across the network and then get

3:24

put together Willy Wonka style on

3:26

the other side, so that you get whatever it is you

3:28

were trying to send, which is unfortunately probably

3:31

not a delicious chocolate bar. No, or Mike

3:33

Teavee either. It's neither of those

3:35

things. What it might be is, like, if I were to send

3:37

that email to Lauren, and it's a sizeable

3:40

email, that email gets divided up into

3:42

numerous packets. The packets go

3:44

across the Internet, not necessarily taking the

3:46

same path, and they eventually reassemble

3:49

on the other side and then Lauren can read it. But

3:52

in order for that to happen, these packets

3:54

have to have little bits of information so the

3:56

routers know where to send the information

3:58

on to next. So it's

4:01

kind of like an address on a piece

4:03

of mail. So let's say

4:05

that you've got a snoop in your neighborhood

4:08

and this person is getting into everybody's

4:11

business. And the way this person

4:13

does it is they look at all the mail

4:15

that's going in and out of

4:17

a person's mailbox. And even if they're not

4:20

opening that mail and reading

4:22

all of it, just the fact that you're sending

4:24

it to particular people at particular times

4:26

can tell that snoop a lot about what's going

4:29

on. Right, So if you're sending out, uh,

4:31

you know, envelopes to say

4:34

a medical facility,

4:36

that could give a lot of information to a snoop

4:38

if they're seeing that stuff from various

4:41

insurance companies coming in to you, that

4:43

could, you know... I'm going with a medical thing here,

4:45

but really this applies to any sort

4:48

of communication. So

4:50

so what we're saying is that it's not enough for

4:52

the content of what you send over

4:54

the internet, uh, necessarily.

4:56

I mean, you, our hypothetical you, maybe you're

4:58

fine. It's not enough for you to encrypt

5:00

the content, but the actual transfer

5:03

of the content in some cases needs

5:05

to be encrypted. Exactly. And there are a lot

5:07

of legitimate cases where you

5:09

would want that to happen. I mean, let's

5:12

talk about journalists for example.

5:14

So you might have a journalist who is pursuing

5:17

some major story,

5:19

perhaps they're in unfriendly territory

5:22

to do so, and they want to be able

5:24

to contact sources that

5:26

might be in danger otherwise,

5:28

if this communication were publicly known,

5:31

or really anything that could endanger

5:33

the journalist, a source, or the story

5:35

itself, then you would want to have a

5:37

way of securely communicating and

5:40

making sure that no one's really snooping

5:42

in on you. Well, that's a

5:45

perfectly legitimate source. There are governments

5:47

that use this kind of thing in order so

5:49

that they can gather information and

5:51

disseminate information. Uh, you've

5:53

got companies that use this kind of stuff

5:55

in order to have secure

5:58

communications about upcoming products

6:00

or services that are not part of the public knowledge

6:02

and don't need to be oh sure, I mean even if

6:05

you're just doing r and D about something you

6:07

know, like, let's say that you're the...

6:09

the example that you used in our notes here is Apple.

6:11

Like, if you're creating a new product and you

6:14

start researching patents online,

6:16

um, the right person could could

6:19

find your searches and figure out

6:21

what you were looking for, and that

6:23

sucks for you. Yeah, yeah, if you

6:25

had the next big idea and

6:28

you were waiting, because you know, like

6:30

the company of Apple, they get a lot of

6:34

a boost from folks whenever they

6:36

announce something brand new that surprises

6:38

everyone, which of course is exactly

6:40

why you have so many news agencies

6:44

scrutinizing everything Apple

6:46

does in order to try and guess what's

6:48

coming next. So the more

6:50

you're able to keep that secret, the bigger

6:52

the impact is when you unveil it. Because

6:55

the worst, the worst feeling

6:58

is when you tune into an Apple event and

7:00

it ends up being exactly what you expected it

7:02

to be. Right. Everyone still

7:04

tunes in but then they're like, oh, but that's exactly

7:07

what they were talking about last week. I know,

7:09

and you read what they wrote last week,

7:11

so stop it. Me?

7:16

Sure, and and lots of other people who could

7:18

generally be considered to be working

7:21

for non-nefarious purposes, but

7:23

nonetheless would like a little bit of secrecy, uh,

7:26

for example, activists or whistleblowers,

7:28

um or you know Chinese citizens who

7:30

really just want to use Facebook or read news from other

7:32

countries. Sure, and we've seen plenty

7:34

of examples also, things like the Arab Spring.

7:37

You know, places in the world where you

7:39

have people who are

7:41

trying to enact change in a very

7:44

harsh environment where if

7:46

their activities were picked up on by

7:48

official sources, government sources,

7:50

state sponsored sources, they

7:52

could face some serious consequences.

7:55

And it's not necessarily the again, like you

7:57

said, that they're doing anything nefarious, it's just they

7:59

can't do it at all without fear

8:02

of some form of consequence

8:04

unless that can remain secure. So

8:06

you've got to figure out how do we make this secure.

8:10

Also, we have to figure out how

8:12

do we frame this in such a way where we

8:14

also admit some people do

8:16

use it for nefarious purposes. Oh, sure,

8:18

of course. I mean there are plenty of

8:21

people out there who are going to use this

8:23

kind of anonymous connection in order to conduct

8:26

illegal or otherwise illicit activities.

8:28

We've talked about some of them in previous episodes,

8:31

in fact, and we'll mention some more as

8:33

we go along. So again,

8:35

it's one of those things where you would probably argue

8:37

that it's a relatively small

8:40

percentage of the population using it for

8:42

these purposes, but they're the ones who get

8:44

the most press, uh, and

8:46

so therefore it kind of

8:48

creates this public perception that people

8:50

who use Tor are up to something. Also,

8:54

you know, we mentioned the fact that in a

8:56

normal Internet communication, the

8:59

you know, what amounts to

9:01

the address on the label is perfectly

9:04

visible, because it needs to be so that it can route

9:06

across and get to the place it's going. Yeah,

9:08

and with Tor, they had to figure out a way

9:11

around that so that you could have it be

9:13

obfuscated, so that if

9:15

someone were to snoop in on communication,

9:18

they would not be able to determine what the origin

9:20

or the destination was. And

9:22

that is pretty amazing

9:25

stuff because you gotta you gotta figure out a way of implementing

9:28

that where it can still work, Like, how

9:30

do you disguise the address and

9:32

still hope that it gets to where it's going,

9:35

Because if we did that to the U.S. Postal

9:37

Service, our stuff would never get

9:40

anywhere. And it wouldn't

9:42

be their fault either, because you just wouldn't

9:44

be following the rules. Oh sure, Yeah, if you don't write your

9:46

address on something, then how does

9:48

it get to that place? So here's

9:50

another funny thing, Lauren, Um,

9:53

who was it that came up with

9:55

this whole Tor idea?

9:58

I mean it must have been like, um,

10:00

like hackers, you know at

10:03

a DEF CON convention, who

10:05

all got together and said, we don't want the government looking

10:07

in on our stuff, right, you know? It was the

10:09

government. It was,

10:11

it was the U.S. Naval Research Laboratory,

10:14

um, back in the day, actually,

10:17

which makes it extra hilarious that

10:19

that the NSA has

10:21

kind of been trying

10:23

to crack it, because you've got a government agency

10:26

doing its best to figure out how to intercept

10:30

information that goes across the Tor network,

10:32

and another US government

10:35

entity that's responsible in

10:38

large part for its creation,

10:40

and furthermore, other governmental agencies

10:43

that are responsible for funding it. At one point,

10:45

one point two four million dollars, half

10:48

of Tor's revenue, uh, came

10:50

from government grants, including a large

10:52

part from the Department of Defense. So this

10:55

is an example of two different parts

10:57

of the United States government working at odds

10:59

against each other, one part saying

11:01

this is absolutely necessary for us to

11:03

be able to operate in a

11:06

secure way, and the other part saying,

11:08

we want to be able to see what's going on here. So

11:11

So, yeah. But this all got its start back

11:15

with the U.S. Navy. Um, it

11:17

was part of an onion rooting

11:19

project. Routing project. Yeah,

11:21

rooting if you're in England; it's usually routing

11:24

here in the US. Either

11:26

way. Why would you even call it

11:28

an onion? It's because it relies

11:31

upon quote a layered object

11:33

to direct the construction of an anonymous,

11:35

bidirectional real time virtual circuit

11:37

between two communicating parties, an initiator

11:39

and responder. And that's as clear as

11:42

day. Yeah, we can just end the podcast now,

11:44

guys, don't worry. We're going to explain the

11:46

whole layered thing a little bit later

11:48

on. So we will. We will make

11:51

sure that you understand why

11:53

an onion. It's actually a pretty clever

11:56

way to describe what's going on. But the

11:58

project had specific goals

12:01

to research and develop and build

12:03

anonymous communication systems, to

12:05

analyze other anonymous communications

12:07

systems, and to create low latency

12:10

Internet based systems that resisted traffic

12:12

analysis, eavesdropping, and other attacks

12:15

from outsiders, as in Internet

12:17

routers, or insiders, as in

12:19

onion routing servers. I

12:22

have more to say about the secrets of Tor and the Deep

12:24

Web. Got a lot of layers

12:26

to peel off that onion. But before we

12:28

get to that, let's take a quick break.

12:38

So the idea was to create some

12:40

form of distributed system

12:42

where you could have two parties communicating

12:45

with one another and no one would be

12:47

able to know that those two parties were in

12:49

communication. They would know the communication

12:51

is going on because traffic is moving across the network,

12:54

but because of the network's design, they

12:56

would have no way of knowing which two end

12:58

parties were actually communicating with one another. Because,

13:00

just as we were saying with that snoop, even

13:03

if you can't see what the information

13:05

itself is, just knowing who is talking

13:07

to whom gives you a lot of

13:10

info. Right. Because of this,

13:12

and funnily enough, the Navy actually had to step

13:14

back from the project in order to make it actually

13:17

useful because the network needs to be open,

13:19

right. Um. So, I mean if

13:21

if you know, if you can see that everything is

13:23

coming through, if

13:26

only the Navy used it, then you would know

13:29

whenever communication was happening that the Navy

13:31

was communicating with people. Like,

13:34

you would have limited the number of

13:36

people that could possibly be the ones

13:38

communicating. By making it open and

13:40

saying this is a playground where everyone

13:42

can come in. Suddenly you can't tell

13:44

who's communicating with whom, because there's so

13:46

many people. There's too much noise in the traffic,

13:49

right. Um. So, the project incorporated

13:51

as a nonprofit in two thousand six, and

13:53

it currently depends a whole lot on crowdsourcing.

13:55

UM. There are only nine full-time

13:57

Tor employees as of this podcast, which

14:00

we are recording in April, by

14:03

the way. Um, and, uh, the

14:05

rest of the development is spread across dozens

14:07

of part time assistants and hundreds of volunteers.

14:10

The code is open source, which

14:13

actually makes it harder to mess with. Um. You

14:15

know, like if someone, say, the NSA,

14:17

tried to create a vulnerability deliberately,

14:20

then anyone could catch

14:22

it, right, Yeah, it's not like it's hidden

14:25

away behind closed doors in a way that it

14:27

gets overlooked and you suddenly have this back

14:29

door entrance into the Tor network. No,

14:32

it's much more likely for someone to catch

14:34

it if lots of people are looking. Yeah exactly. Yeah,

14:36

you've got lots of people checking on it all the time.

14:38

So it's actually more secure by being

14:40

in plain sight in that way. So here's

14:43

how it used to work. Because you

14:45

know, I mentioned that Tor

14:48

had an onion in the, oh, but it doesn't really

14:50

involve onions anymore. And

14:52

then we've mentioned onions. Yeah, so yeah,

14:54

so we're gonna we're gonna go back to how

14:56

it worked originally because the way it works

14:58

now is not that much different, but it doesn't involve

15:01

the onion metaphor anymore. So, first

15:04

of all, to achieve anonymity, the Tour Network

15:06

uses something called Privoxy filters,

15:08

which prevent client information from reaching

15:11

servers. So this means

15:13

that a client, you know, that's that's your computer.

15:16

When you are trying to access anything,

15:18

Let's say you're using your browser

15:21

to access your email, because I love that example.

15:23

It's an easy one. So your computer

15:25

is the client. It's sending a request

15:28

to another computer. It's asking for

15:30

data from this computer that

15:32

hosts the email service

15:35

that you use, and that is called the

15:37

server. Now, normally the server

15:39

receives information that can identify the

15:41

client, so you have some sort of

15:43

address that identifies

15:46

this is the machine that's asking for that information.

15:48

So then the server knows exactly who it's

15:50

talking to. Well, Privoxy filters

15:53

prevent that from happening, so it's possible for

15:55

a client's identity to remain unknown

15:58

to the server and also

16:00

to the rest of the network as these

16:02

requests go across the network. Also,

16:05

one of the other things that it has, and we'll talk more about

16:07

this in a bit, is the ability

16:09

to create hidden services. But you

16:12

know, I'm not going to spoil that because the discussion

16:14

we have later on will really kind of bring that to

16:17

light and it will make much more sense after

16:19

we talk about exactly how this communication occurs.

16:22

Yes, so it's possible

16:24

to use onion routing software to send information

16:27

completely anonymously. In other words, you could use

16:29

it so that you could send an anonymous

16:32

message to someone else, they would not know the

16:34

identity of that person. But that's not the purpose

16:37

of Tor. The purpose, like I said

16:39

before, is to allow anonymous channels

16:41

of communication. So you

16:44

and the person with whom you're communicating know each

16:46

other's identity, but nobody else does, right,

16:48

So this allows you to have that honest,

16:51

open expression of information without

16:53

fear of someone else snooping in on you or

16:56

any other consequences apart from whatever

16:58

consequences come from just that communication

17:00

between two parties. If you tell

17:03

someone that they dressed like a slob, there's

17:05

going to be consequences is what I'm saying. It doesn't

17:07

have to be someone snooping in on you. Good

17:10

point. I get that a lot. Uh.

17:13

So it uses proxy servers,

17:15

and a proxy server acts as an intermediary

17:17

between a client and some other server.

17:20

So you can kind of think of it as this is the go between.
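The go-between idea just described can be sketched in a few lines of Python. Everything here is hypothetical, made-up names standing in for a real client, proxy, and mail server; it only illustrates what each party can observe:

```python
# Toy sketch of the proxy "go-between" described above. All names are
# hypothetical; this is not how Tor or Privoxy is actually implemented.

def mail_server(request):
    # A plain server can always see who handed it the request.
    return {"seen_by_server": request["from"], "body": "inbox contents"}

def proxy(request):
    # The proxy re-issues the request as coming from itself,
    # so the server never learns the original client's identity.
    return mail_server({"from": "proxy", "payload": request["payload"]})

reply = proxy({"from": "jonathan", "payload": "GET /mail"})
print(reply["seen_by_server"])  # prints "proxy", not "jonathan"
```

The point of the sketch is just the last line: the only identity the server ever records is the proxy's.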

17:23

So if I were to send a

17:25

request to get my email,

17:28

but I wanted to go through a proxy server, I

17:30

would log into the proxy server. The

17:32

proxy server would then send my request

17:35

onto the email server, and

17:38

from the email server's perspective, it looked like

17:40

the proxy server was the origin of that

17:42

request. It isn't able to see back

17:48

to... Exactly. There's

17:48

a hop missing there. So

17:50

that's really important in this. And uh,

17:53

the communication part is

17:55

the tricky part. Like I said, So you've got this information,

17:58

it's passing between nodes or little

18:01

routers within the tour network. Okay,

18:03

so think of these nodes as rest stops

18:05

between the client, the sender,

18:08

and the recipient the server. Right. Each

18:10

node only knows the identity of

18:13

the node before it and the node after it,

18:15

right. So, uh, and the node before it and

18:17

after it is completely dependent

18:20

upon when you're sending the message, because you're

18:22

you're going to create new pathways every

18:24

time you create a connection, so it's not like

18:26

you have a set path each

18:28

time. It's like the Internet. It's very

18:30

flexible. So when you

18:32

send a message, and let's say it's going through letters

18:35

A through G, we're just designating these

18:37

nodes as A through G and for some reason it's

18:39

going in an A, B, C, D, E, F, G order. So

18:42

node D only knows about

18:44

nodes C and E. The information

18:46

came from C. It knows it has to send the information

18:49

onto E. It has no awareness of

18:51

A, B, or, you know, F or G. So

18:55

that's it. And that means that if

18:57

you were to intercept information

19:00

passing between two nodes, you would just know which

19:02

node it came from and which node it went to. You wouldn't

19:04

know the actual person who sent it, nor would

19:06

you know the person to whom it went. Ultimately,

19:08

on top of that, the nodes

19:11

encrypt the communication as it's passed

19:13

along. Yes, and this is where you get

19:15

that layer and layer and layer of

19:17

encryption. And because there's so many

19:19

layers of encryption, well, what else

19:22

has lots of layers? An onion?

19:24

I was going to think of Game of Thrones, but yes,

19:26

Onion is right. Onion is exactly the thing

19:28

that they went with because Game of Thrones really wasn't

19:31

that popular. Also, it's proprietary. I

19:33

mean, you know, yeah, that probably would have gotten George

19:35

R. R. Martin a little upset about that. But

19:37

yeah, so so Onion is in fact

19:40

what they went with because there's so many different layers

19:42

of encryption. Still a little

19:44

bit more to talk about with the secrets of Tor and the

19:46

Deep Web, but before we get to that, let's take

19:48

another quick break. Okay,

19:57

so here's my example, and I think

20:00

it's a doozy of an example. Because it's completely

20:02

believable. I decided to

20:05

use as an example two of our beloved

20:07

co-workers here at HowStuffWorks.

20:09

Uh And when you start thinking to yourself,

20:12

who would be so paranoid

20:14

that they would need an incredibly secure communication

20:17

process? Two names leap to

20:19

mind from the shadows and then back

20:21

into the shadows, because that's where they belong. One

20:24

of them wearing a gremlin mask. Yeah, and maybe

20:26

a fedora on top of it. It's not a fedora, I

20:28

know. A Ben-dora. No, it's a trilby,

20:31

I'm going to call it a fedora anyway. So Ben Bowlin

20:33

and Matt Frederick, the Stuff They Don't

20:35

Want You to Know hosts. Yes, and if you've

20:38

never ever listened to that show, go

20:40

check it out. Watch the

20:42

show. Yeah, that's great. So let's

20:44

say that Ben wants to contact Matt

20:46

and he wants the communication to be secure, so he sends

20:48

it across the Tor network using this

20:51

freely available software. He's got the Tor bundle

20:53

installed and he sends the message

20:55

along. So here's what happens. Ben would

20:57

contact a proxy server on the Tor

21:00

network. Now, that proxy server

21:02

would then determine the route of

21:04

nodes or the number of hops

21:06

that it will take to get from the

21:08

proxy server to Matt's computer. So

21:11

for argument's sake, let's say again that it's

21:13

just uh five nodes,

21:16

So it's A, B, C, D, E.
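A fresh route like that can be sketched as a random draw from the available nodes. This is only an illustration; real Tor route selection weighs relays by bandwidth and role, and the node names and hop count here are placeholders:

```python
import random

# Hypothetical sketch of per-circuit route selection as described above.
# A real Tor client is far more careful; this toy version just draws a
# fresh random path of distinct nodes each time a circuit is built.
DIRECTORY = ["A", "B", "C", "D", "E", "F", "G"]

def build_route(directory, hops=5, rng=random):
    # sample() never repeats a node within one route
    return rng.sample(directory, hops)

route = build_route(DIRECTORY)
print(route)  # five distinct nodes, in a different order each run
```

Because each circuit gets its own random ordering, no fixed path exists for an observer to learn over time.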

21:18

Those are the nodes that it's

21:20

going to go through. Now, each

21:23

hop becomes an encryption layer

21:26

on this onion, and the core of the

21:28

onion is Ben's original message

21:30

to Matt, So that's the very center. Now

21:33

Ben's proxy server starts to construct layers

21:35

of encryption based upon the path

21:38

that this onion is going to take

21:41

journeying from the proxy server all

21:43

the way to Matt's computer, and the innermost

21:46

layer will be the encryption for Matt's proxy.

21:48

Yes, so the next layer out would

21:50

be the node just before it

21:52

gets to Matt's proxy. The next

21:54

layer out would be the node before that, and

21:57

so on and so forth until you got to the first

21:59

node that the proxy server sends

22:01

this onion onto. Now, every

22:03

time the onion travels to a new node, it

22:06

decrypts that layer. The corresponding

22:09

layer of encryption strips off. Yeah,

22:11

so that that layer of the onion gets pulled away,

22:14

and that's how the node knows where

22:16

to send it on to next. So

22:18

the proxy server sends it on to Node

22:21

A. Node A strips away that

22:23

encryption and sees that it needs to send it on to Node

22:25

B. Node B gets

22:28

this onion. Now, Node B only

22:30

knows that Node A sent the onion. It doesn't

22:32

know where the onion originally came from, and

22:34

it decrypts that next layer, strips

22:37

it free, uh, finds the identification

22:39

of Node C, and sends it along. Yep. Node

22:41

C doesn't know about Node A, just knows

22:43

about Node B, so so on and so forth till

22:45

it gets to Matt. By the time it gets to Matt,

22:48

all those layers of encryption have been stripped away and Matt

22:50

can actually read what the message is. Therefore,

22:52

anyone who's trying to analyze all of this traffic

22:54

would just see a message passing between

22:57

two seemingly random routers with

22:59

with no way of knowing either where

23:02

that information came from or what the ultimate destination

23:04

is. Yep. And because you've encrypted it

23:06

so many times, they probably can't even tell

23:09

what the information is. They can't read it, they

23:11

don't know where it's going. They're

23:13

in the dark. So to them, it's just all

23:15

they know is that traffic is going across this

23:17

network, but they don't have any way of deriving

23:19

meaning from that. Now, once

23:22

Matt's proxy receives that onion, a

23:25

virtual circuit forms along the

23:27

nodes. Think of it as like a temporary

23:30

pathway that solidifies

23:32

between uh Ben's proxy

23:34

and Matt's final computer,

23:37

and it allows for encrypted communication

23:39

to pass both ways. So you have

23:41

two different kinds of encryption. You've got one

23:44

kind whenever Ben sends a message

23:46

to Matt, and essentially you have the

23:48

inverse of that when Matt

23:50

sends it to Ben. So unless

23:52

you have the key to that encryption, you

23:54

can't figure out what's going on either.
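The wrap-and-peel process in the Ben-to-Matt walkthrough can be sketched with a toy cipher. XOR stands in for real per-hop keys, and the node names, keys, and message format are all made up; this is an illustration of the layering idea, not Tor's actual protocol:

```python
# Toy onion layering, per the Ben-to-Matt example above. XOR is a
# stand-in for real encryption; names and keys are hypothetical.

def xor(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def wrap(message: bytes, route_keys):
    # Build layers from the inside out: the innermost layer is for the
    # last hop, the outermost for the first node the proxy contacts.
    onion, next_hop = message, "EXIT"
    for node, key in reversed(route_keys):
        onion = xor(next_hop.encode() + b"|" + onion, key)
        next_hop = node
    return onion

def peel(onion: bytes, key: bytes):
    # Each node strips exactly one layer and learns only the next hop.
    plain = xor(onion, key)
    next_hop, _, inner = plain.partition(b"|")
    return next_hop.decode(), inner

route_keys = [("A", b"key-a"), ("B", b"key-b"), ("C", b"key-c")]
onion = wrap(b"hello Matt", route_keys)
hops = []
for _, key in route_keys:
    next_hop, onion = peel(onion, key)
    hops.append(next_hop)
print(hops, onion)  # ['B', 'C', 'EXIT'] b'hello Matt'
```

Note how Node A's peel reveals only "B", and only the final peel exposes the message, which is exactly the property described above.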

23:57

So it's pretty

23:59

secure. Now, there are some vulnerabilities.

24:03

Mainly we're talking about vulnerabilities when you send

24:05

it from your computer to that proxy server

24:08

and when that last proxy sends it to the

24:10

destination, because this is when

24:12

you don't have the protection of

24:15

the network itself. You can

24:17

think of it as the information is leaving the network

24:19

to get to wherever it's going or entering.

24:23

Yeah, and again, if you're using

24:25

a browser that still has certain things

24:28

enabled like Flash or Java, then

24:30

you may end up sending

24:33

along some information that people could identify

24:35

you by, based on that, but within the network

24:37

itself, it's incredibly secure,

24:40

right. And so this, this circuit that you've

24:42

created will last as long as both parties

24:45

want it to. You can send a command to collapse

24:47

it at the end of your session, you say

24:49

destroy, and it collapses.

24:51

This uh, this virtual circuit, and then if

24:53

you wanted to create a new one, you could, and it

24:56

would be a new virtual circuit, probably

24:58

taking a totally different pathway through the

25:00

nodes. And you know, I made the example

25:02

of ABC D E that kind of

25:04

stuff, but really, you

25:07

know, it could be any order. You know, it's it's

25:10

and it will be any order, right? That's

25:12

all. That's one of the whole points because if it were the same pathway

25:14

each time, then you would ultimately be able to determine

25:17

who sent it and who it went to. So it

25:19

has to be uh, you know. And of course,

25:21

the more the more routers you have available,

25:23

the more of these relay nodes you have, the

25:26

more secure the communication becomes, so

25:28

that's also really important. Then there's

25:30

also a concept called loose routing, which

25:33

adds another layer of security on this because

25:35

like I said, you know, you ultimately you

25:37

have these proxies that know way

25:40

more information than all the nodes do. They

25:42

have to in order to be able to make that layer

25:44

of encryption and have this onion pass

25:47

from one spot to the next. So

25:49

one thing you could do with loose routing

25:52

is that the proxy ends up sending

25:55

the onion on to the

25:57

first node. But that's all the proxy

25:59

knows about, probably. And then the first node's responsibility

26:02

is to create the rest of that pathway.

26:05

So even that first stop isn't

26:07

aware of where, how, what path it's

26:09

gonna take to get to its destination.

26:11

It just knows this is the first step

26:13

of that path, but beyond that, I don't know. So

26:15

it adds another layer of security to it that

26:17

way. Now, again, if you were able to target that first

26:20

node, you might be able to figure some

26:22

stuff out, but really you just know that it came from a proxy.

26:24

You wouldn't know who sent the

26:26

information to the proxy in the first place. But

26:29

yeah, so we've got these these

26:31

endpoints that have some vulnerabilities, but other

26:33

than that, it's it's pretty secure. Uh,

26:36

I've got to We've got a great little bit about how secure

26:38

it is, and a little in just a little while. But

26:40

today nodes or relays

26:43

within the system still don't know the origin

26:45

or ultimate destination of information, and

26:47

you still create virtual circuits between

26:49

the initiator and the recipient for

26:52

encrypted anonymous channels. But there's no more

26:54

use of this onion metaphor.

26:57

I mean, it's not it's not the same implementation.

27:00

You get the same result, but it's a different

27:02

implementation that does it. But it's

27:04

this, you know, it's following a lot of the same

27:06

philosophies. And you've got a Tor directory

27:09

that keeps track of all the available nodes

27:11

that are on the system at any given moment.

27:14

As of January, there were about

27:16

five thousand computers around the world operated

27:19

by those volunteers that I mentioned serving as potential

27:21

nodes in this system. Right, And when you send

27:23

a message to a recipient across the Tor network,

27:26

your Tor browser or whatever

27:28

consults this directory, which

27:31

then gives it a route

27:33

of nodes, and then you can send the encrypted

27:35

information across and each node further

27:37

encrypts the message again and only

27:39

knows the node immediately before and after, kind of like

27:41

the previous version we just talked about.
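The circuit-building idea just described can be made concrete with a toy Python sketch. This is not real Tor; the five-relay directory, the XOR "cipher," and every name here are invented purely for illustration. The client picks a fresh random path for each message and wraps the payload in one layer per relay, so any single relay in the middle only ever sees ciphertext:

```python
import os
import random

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """Toy symmetric cipher: XOR against a repeating key (illustration only)."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# Hypothetical relay directory: relay name -> key the client shares with it.
directory = {name: os.urandom(16) for name in "ABCDE"}

def build_circuit(directory, length=3):
    """Pick a random ordered path of distinct relays, a new one each time."""
    return random.sample(list(directory), length)

def wrap_onion(message: bytes, path, directory):
    """Encrypt innermost-first, so the entry relay peels the outermost layer."""
    onion = message
    for relay in reversed(path):
        onion = xor_bytes(onion, directory[relay])
    return onion

def unwrap_onion(onion: bytes, path, directory):
    """Each relay in order strips exactly one layer; only the exit sees plaintext."""
    for relay in path:
        onion = xor_bytes(onion, directory[relay])
    return onion

path = build_circuit(directory)
onion = wrap_onion(b"hello", path, directory)
assert unwrap_onion(onion, path, directory) == b"hello"
```

Because the path is re-randomized per message and each relay holds only its own key, no single relay can link sender to destination, which is the property discussed above.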

27:44

So it's not that different. It's just

27:46

this whole layer metaphor is kind

27:48

of no longer as accurate. But

27:51

um, yeah. One thing you've got to remember

27:53

is that because you've got this extra layer of encryption

27:56

going on and it's purposefully

27:59

obfuscating the origin

28:01

by hopping around a lot, communication

28:04

is not as quick, right. It's going to

28:06

take longer, necessarily. So if you're using

28:08

Tor in order to send instant messages, your

28:12

definition of instant may be a little different

28:12

than what it normally would be. It may just

28:15

be pretty darn quick, but not as instant

28:17

as this other method. Yeah.

28:19

Um. Furthermore, it is not the

28:21

most secure thing that you can do. No.

28:24

I actually read a great article on

28:26

the best way of using Tor as

28:28

as part of an approach

28:31

to securely using the Internet and maintaining

28:33

your anonymity, and I thought about including

28:36

it in this podcast. I really did, guys. I

28:38

was gonna go all into the

28:40

tips this guy had, and then I realized that it was

28:42

so in depth and there was so

28:45

much to take into consideration that

28:47

really we could just do a full podcast

28:49

just on that, and perhaps in the future we will.

28:51

If you guys in particular, want to know. Seriously,

28:54

I want to be as anonymous and secure

28:57

as possible. Tell me what I need to do. Well,

28:59

we'll we'll give you podcast. We should we should

29:01

do that episode. UM, I'll tell you right now. It's

29:03

crazy, but but right because because

29:05

even if you're using the most recent version of Tor

29:08

I mean, which, as we have just detailed,

29:10

is an incredibly uh complex

29:13

and encrypted process,

29:15

a determined party could exploit vulnerabilities

29:18

in Firefox itself, which Tor is based

29:20

in. UM, it could attempt to set up

29:22

monitoring nodes in the network, or

29:25

it could just methodically work on key

29:27

decryption in order to spy on your activities

29:29

so stuff can still

29:32

happen. Yeah, we'll think about doing

29:34

a full security episode. I mean, I

29:36

kind of think we'll have to pull Ben in for that one.

29:38

Oh, that would be great. We should totally do more crossovers.

29:41

We'll we'll see if we can get Ben to be

29:43

available for an episode where we really

29:45

talk about and you know it's going to

29:47

sound paranoid and crazy, but the thing

29:49

is technology in order for it to work, UH

29:52

needs to have certain information so

29:54

it can allow you to have this communication.

29:57

But because it needs that certain information. It

29:59

means that your anonymity is at

30:01

risk, so you've got to do these kind of crazy things.

30:03

Also there are wacky bugs like Heartbleed.

30:07

Yeah actually, um okay, go

30:09

ahead and mention this. So, Heartbleed. If you

30:12

listen to our previous episode, we talked

30:14

all about this vulnerability that was in

30:16

OpenSSL versions one

30:18

point zero point one through one

30:20

point zero point one f and

30:23

UH and how that ended up meaning that

30:25

people who use the heartbeat

30:27

method could get access to encryption

30:30

keys and thus see everything that's

30:32

going across the server. So you might wonder does

30:34

this work on the Tor network, this

30:37

crazy relay node network, And

30:39

the short answer is, technically it works,

30:41

but it doesn't help anybody out because

30:45

even if you were to see the

30:47

information moving across a node, it

30:50

still has multiple layers of encryption, so

30:53

it's not as vulnerable. Vulnerable, Yeah,

30:55

although I mean Tor, the Tor people,

30:58

did say that you know, if you if you only

31:00

want to be secure, you might just want to stay off the internet for

31:02

a few days, right, And they did say that

31:04

they had planned on rolling out patches

31:06

of the OpenSSL UH

31:09

software because the upgrade the

31:11

newest patch does patch that vulnerability.

31:14

So they are going to be fixing

31:17

up those nodes over time anyway.

31:19

In fact, by the time this podcast comes out, most of them

31:21

may already be addressed. But

31:24

yeah they said that, Um that worst case

31:26

scenario, you're probably still pretty

31:29

okay, you know in the

31:31

grand scheme of things.

31:32

That Heartbleed

31:34

story was a real eye opener.

31:37

Yeah. Then we have the other

31:39

thing we alluded to earlier, oh right, hidden

31:41

services, and that's where that dark net or

31:43

deep web kind of thing comes in. Um

31:46

okay. So, so Tor also

31:48

provides a way to to offer up access to

31:50

a server or to run an entire service

31:52

without revealing your IP address to your users

31:55

and from behind a firewall. Um,

31:57

sites and services set up like this are are off

32:00

the beaten Internet path. You can't even find them

32:02

using Google or other web searches.

32:04

You have to be using Tor in order to find

32:06

them. And um they're they're all using what's

32:08

called the dot Onion extension because

32:11

onions. Um okay. So, so basically

32:13

how this works. The hidden service has

32:16

a public Tor listing, and

32:18

so when a client wants to access that service,

32:21

the client sets up a rendezvous node

32:23

and sends along an access request via

32:25

the usual Tor encryption routing

32:28

process UM through a

32:30

random introduction node that the service

32:32

has set up UM, and

32:35

then the client and service can contact

32:37

each other through that rendezvous node, again

32:39

using the usual Tor circuits UM.
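The rendezvous setup just described can be sketched in miniature. This is purely hypothetical Python with no networking or encryption; the Node class, the deposit/collect calls, and the fixed "intro" label are all invented for illustration. The point it shows: an introduction point relays the client's request, and afterwards both parties talk only through the rendezvous node, so neither ever learns the other's location:

```python
import secrets

class Node:
    """A relay acting as a meeting point; it forwards payloads it can't read
    and never learns who the two endpoints are."""
    def __init__(self):
        self.mailbox = {}

    def deposit(self, cookie, payload):
        self.mailbox[cookie] = payload

    def collect(self, cookie):
        return self.mailbox.pop(cookie, None)

# The hidden service publishes an introduction point it polls for requests.
intro_point = Node()
service_inbox = []

def service_listen(intro):
    req = intro.collect("intro")
    if req:
        service_inbox.append(req)

# Client side: pick a rendezvous node and a one-time cookie, then ask the
# service (via the introduction point) to meet at that rendezvous.
rendezvous = Node()
cookie = secrets.token_hex(8)
intro_point.deposit("intro", {"rendezvous": rendezvous, "cookie": cookie})

# Service side: learn only the rendezvous node and cookie, never the client.
service_listen(intro_point)
req = service_inbox[0]
req["rendezvous"].deposit(req["cookie"], b"service says hi")

# Client picks up the reply at the rendezvous point.
reply = rendezvous.collect(cookie)
assert reply == b"service says hi"
```

In real Tor, every one of these hops would itself run over a full onion-routed circuit, which is what keeps the rendezvous node from learning either party's address.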

32:42

It's it's like the introduction and the rendezvous

32:44

nodes are translators, right. It

32:46

protects the service and the client because

32:48

neither knows where the other is.

32:50

The translators are the recipients for

32:52

each party's communications. And so

32:55

this this deep web

32:57

or darknet hosts

32:59

a lot of different stuff, some things that

33:01

are definitely in the nefarious category,

33:04

like the Silk Road, although Silk Road

33:06

still has some legit. Sure

33:11

of the stuff that was on Silk Road was completely

33:13

legal, the other not

33:15

so much. Yeah, so a silk

33:17

Road, of course that got shut down, but it

33:20

existed on Tor and

33:22

this kind of hidden web because you know,

33:24

you wouldn't want it to

33:26

be easily accessible, uh,

33:28

and then everything would come

33:30

crashing down, you know, ultimately came crashing

33:32

down anyway, but it was hidden better than

33:35

just sitting there and on the web. So

33:37

yeah, that's that's definitely one

33:39

of the other issues. And again there are

33:41

other things that are on this deep net, this

33:44

this dark net, or rather deep web, that

33:47

again not nefarious at all. They

33:49

have very legitimate purposes for existing.

33:52

It's completely legal, but it's also designed

33:54

in such a way as to protect the identity

33:56

of the people who need to use the services.

33:59

So it again, just

34:01

because we have some really high profile

34:03

examples of naughtiness

34:06

doesn't mean that the entire network

34:08

is naughty, just like there are other

34:10

services that people have used where some

34:12

people are using it in order to get

34:14

like illegal downloads of whatever

34:17

content they want, but most people

34:19

aren't. A lot of the focus is on the people

34:21

who are the pirates, and thus the entire

34:23

service gets painted as yeah,

34:25

yeah, it's I I read a really

34:28

great quote and I don't have it open right now, and um.

34:30

Bloomberg Businessweek did a really great

34:32

article in January about

34:35

about tour in general and the

34:37

kids who are running it and all that kind of stuff, and uh,

34:40

the the example that I think they used was that,

34:42

you know, you don't hear about someone

34:45

whose stalker couldn't find them. You

34:47

you hear about the kid who got drugs or

34:50

the child porn ring or something,

34:53

right, Right, So you know, there are

34:55

some very, very legitimate uses. The

34:57

Navy wouldn't have been interested in making this

35:00

uh in order just to have crime happen,

35:02

because as low as your opinion of

35:04

the Navy may be, depending on if you're

35:06

a Marine or not, it's it's

35:08

really not in that business. No. But

35:11

but certainly the fact that this kind of

35:13

illegal activity can go on means that

35:15

it attracts attention from, for example,

35:18

the NSA. Yes, uh,

35:20

I love the stories about the NSA and

35:23

Tour because they're both

35:25

infuriating and funny at the same time.

35:27

So infuriating in that uh, the

35:30

NSA has attempted. We know

35:32

the NSA has attempted to try and crack

35:34

because some of those

35:36

slides that have come out from Snowden's

35:38

leaks specifically mentioned Tor, yep

35:41

and UH. One of

35:43

the documents within the NSA is

35:46

titled Tor Stinks. And

35:48

the reason they say Tor stinks is because

35:50

it's so gosh darn hard to figure

35:52

out what information is within

35:55

the Tor network. Now, they

35:57

do note that if you are able to target

36:00

those points where information is

36:02

coming into the network or coming out of the network,

36:05

then you are more likely to be able to determine

36:07

what is going on and who was talking

36:09

to whom. But if it's

36:12

within the network itself, there's no

36:14

report that has leaked so far that

36:16

has indicated the NSA has been able to crack that,

36:19

which has not stopped a whole lot of

36:21

theorists from saying that they

36:23

have totally cracked it, and that

36:25

the reports saying that they haven't cracked it

36:27

are just so that people feel, yeah, that they

36:30

people will feel a false sense

36:32

of security using Tor. Here's

36:34

the thing about conspiracy theories, and again, I wish we had

36:36

Ben on here right now. Uh you know, you

36:38

can. You can have a lack of evidence and that

36:40

becomes evidence, or if you have a denial,

36:43

then that becomes hard evidence. You

36:45

know. So I

36:47

I think, I really do think, because

36:49

I don't think the NSA ever intended

36:52

for all the information to leak out based upon I

36:54

don't know everything that's happened since then.

36:57

Uh so I'm pretty willing to

36:59

believe that they have not yet cracked

37:02

how to get look

37:04

at information in a meaningful way on the Tor

37:06

network itself. In general, I would say

37:08

that Tor seems for many

37:11

purposes pretty secure. Now

37:13

keep in mind you still have to uh

37:15

practice good internet security on

37:17

your own, even if you're using Tor. UH

37:21

And like I said, well, maybe we'll do a full episode

37:23

on that. If you're interested in that, let us know. Because

37:25

you know, maybe our listeners are thinking, wow,

37:27

they did a Heartbleed episode and a Tor episode.

37:30

Go back to talking about Nintendo. And

37:32

that wraps up this classic episode from hope

37:36

you enjoyed it. If you have any topics

37:38

that you think I should tackle for future episodes

37:40

of tech Stuff, or maybe there's one that you've

37:42

listened to and you think that really needs an

37:44

update, it's seriously overdue. Let

37:47

me know the best way to do that is over on Twitter.

37:49

The handle I use is TechStuffHSW,

37:52

and I'll talk to you again really

37:55

soon. Tech

38:00

Stuff is an I Heart Radio production.

38:02

For more podcasts from I Heart Radio,

38:05

visit the i Heart Radio app, Apple Podcasts,

38:07

or wherever you listen to your favorite shows.
