Supper Club × Bun with Jarred Sumner

Released Friday, 2nd December 2022
Episode Transcript

0:00

I sure hope you were hungry.

0:02

Cool. I'm starving. Wash those

0:05

hands, pull up a chair, and secure that

0:07

feedbag, because it's time to listen

0:09

to Scott Tolinski and Wes Bos attempt

0:12

to use human language to converse with

0:14

and pick the brains of other developers.

0:17

I thought there was gonna be food. So

0:19

buckle up and grab that Bun

0:21

because this ride is going to

0:24

get wild.

0:26

This is the Supper

0:29

club.

0:37

Welcome to supper

0:40

club. Today, we have a

0:42

very special guest. We have

0:44

Jarred Sumner, the Bun

0:47

creator. I was trying to think of some

0:50

way to rhyme Sumner with Bun, and

0:52

fun, creator, all these things, lovely

0:54

things, supper club, whatever we got

0:56

here. My name is Scott Tolinski. I'm a developer

0:58

from Denver, Colorado. With me, as always, is

1:00

Wes Bos, as he always is.

1:03

And today, we're talking JavaScript

1:06

runtimes, we're talking Bun, and

1:08

Jarred. How's it

1:10

going? Pretty good. This episode is sponsored

1:13

by Tuple at TUPLE

1:15

dot app, and Tuple is

1:17

the remote pair programming tool

1:19

for developers who are tired of pairing

1:22

over Zoom. This is

1:24

just an incredible pair programming

1:27

tool to share your code with a lot

1:29

of really, really desirable features

1:31

more on them later on in the episode.

1:33

Awesome. So Jarred, do you wanna

1:35

give us a little bit of, before we get into

1:38

anything too intense or anything

1:40

code, do you want to give us maybe a little bit of background

1:42

of who you are, what you

1:45

do for a career, what you're doing now,

1:47

and maybe how you got there? My name

1:49

is Jarred, and I'm the founder of

1:51

Oven and the creator of Bun. Bun

1:53

is an all-in-one JavaScript runtime with

1:56

a built-in package manager, JavaScript

1:58

transpiler, bundler,

1:59

and a

2:01

really fast script runner. It

2:03

takes a lot of existing tools, puts it all in

2:05

one and makes it really fast. Awesome. Yeah.

2:07

We we are honestly really excited

2:09

about this because it probably

2:11

hasn't even been eight months, and it

2:13

feels like half of our episodes are talking

2:15

about just, like, new JavaScript

2:17

runtimes like, the edge and,

2:20

like, what is, like, the next step

2:22

of all of this. So

2:24

it's really exciting to see Bun on

2:26

the scene with this type of stuff, and we're kinda hopefully,

2:29

with this podcast, we're gonna sort of dig into

2:31

what it is and and what it's for. Mhmm.

2:33

Why did you make Bun? Like, is Node

2:35

not good enough? What's the answer there?

2:38

I was just really frustrated with how long

2:40

everything takes

2:41

when you're building something, especially

2:45

on the front end side, like, that iteration

2:47

cycle time is just really slow. It's

2:50

the point where, like, I was

2:52

just checking Hacker News a lot of the time

2:54

just waiting for things to build. And,

2:56

like, when something takes longer than, like, a few

2:58

seconds, you sort of just immediately lose

3:00

focus, you get distracted, and then you go

3:02

do the thing that helps you when you're distracted,

3:04

which is read Twitter, Reddit,

3:07

whatever, and it makes you more distracted.

3:09

Okay. So

3:11

Bun is setting out to replace

3:13

like, a couple things. Maybe let's let's go

3:15

through that. I think probably the biggest one is

3:18

it's replacing your your JavaScript runtime.

3:20

Right? So most people probably right now use

3:23

Node.js. So the idea

3:25

is that you'd be able to run a

3:27

server with Bun on it. Yeah.

3:29

So you'd be able to run a server with Bun on it. And

3:31

you'd be, and

3:33

Bun also exposes a built-in transpiler

3:35

API.

3:36

So you can use Bun's transpiler. But

3:40

actually, on every single file in

3:42

Bun, when it runs in the runtime, it runs

3:44

the transpiler too. And that's how it's able

3:46

to do, like, built-in TypeScript support,

3:48

where transpilation is just

3:51

removing the types.
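Since Bun's built-in TypeScript support boils down to stripping the type annotations, the idea can be shown with a deliberately naive sketch. Bun's real transpiler is a full native-code parser; `stripSimpleTypes` below is a made-up regex helper that only survives this one trivial input, purely to illustrate that removing annotations leaves runnable JavaScript:

```javascript
// Toy illustration of type stripping: for simple annotations,
// deleting the ": type" parts of a TypeScript function yields
// plain JavaScript with identical runtime behavior.
// A real transpiler parses the source; this regex only handles
// bare ": identifier" annotations like the ones below.
function stripSimpleTypes(source) {
  return source.replace(/:\s*[A-Za-z]+/g, "");
}

const tsSource = "function add(a: number, b: number): number { return a + b }";
const jsSource = stripSimpleTypes(tsSource);
// jsSource === "function add(a, b) { return a + b }"

// The stripped output is ordinary JavaScript we can evaluate.
const add = new Function(`${jsSource}; return add;`)();
console.log(add(2, 3)); // 5
```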

3:53

Also, it has a built-in plug-in system.

3:56

The API for the plug-in system is very

3:58

similar to esbuild's. I

3:59

almost

3:59

just directly

4:00

copied it.

4:03

That lets you have,

4:06

like, override, like, have your own

4:08

transpiler plugins, and then you can

4:10

also have macros too, which

4:13

have a small amount

4:15

of access to the AST. It's

4:18

all done the fastest way

4:20

possible, using direct integrations

4:22

with the engine. The

4:25

transpiler and all of that is written in

4:27

native code, but it

4:29

has very fast JavaScript

4:32

bindings. Okay. And is

4:34

that using V8 under the hood

4:36

that you are binding to? Bun uses

4:38

JavaScript Core, which is the JavaScript

4:40

engine used by Safari slash

4:42

WebKit. Okay. Yeah. Interesting.

4:44

This was something where I experimented

4:46

with a bunch of of run times before

4:48

choosing JavaScriptCore. And it

4:51

just consistently, on the benchmarks

4:53

I ran, JavaScriptCore was

4:55

the fastest to start up while

4:57

also having a really fast JIT. And

5:00

their team is just really good at optimizing

5:02

the engine. And not that V8 isn't,

5:04

V8 is obviously great too. It's really

5:06

cool just like I often tweet about

5:08

the various performance improvements that

5:10

the WebKit slash Safari team does, and it's

5:12

really cool. Like, they made a

5:15

recent one was they

5:17

made String.prototype.replace,

5:19

like, I think it was two times faster. It

5:21

would have been more than that in a bunch of different

5:23

cases. And it's,

5:25

like, this really crazy optimization where,

5:28

basically, the

5:30

JIT

5:32

will actually look

5:34

at the constant string you're replacing

5:36

and it'll do, like, constant

5:38

propagation in

5:41

how the different, I

5:43

forget what they're called, the

5:45

JS nodes, basically, are represented.

5:48

The result is that it

5:50

can basically, it's basically like memoizing

5:52

it, sort of.

5:54

But automatically. Yeah. Without

5:56

user input, without the user having to do anything.

5:58

And, but it's gotta be great

5:59

to just, I mean, for users, the

6:02

average user is not going to know what's different.

6:04

Right? They're gonna fire up Safari and then just be like,

6:06

alright, Safari performs better, or

6:08

any browser that's using WebKit, or,

6:11

you

6:11

know, I think there's Tauri, which is, like,

6:13

the

6:14

application

6:16

framework for this type of thing.

6:18

And, like, nobody's gonna necessarily know

6:20

what's happening, but they're gonna experience the

6:22

web as being faster, and developers get that

6:24

for free. So, yeah, that sounds really great

6:26

to hear. I actually had no idea that

6:29

Bun wasn't using V8. I

6:31

just figured it was for some reason. So

6:33

that's interesting. That's actually really

6:35

interesting because I forget who

6:37

it was we were talking to. The folks from

6:39

Igalia, because we talked

6:41

about WebKit a while back and we're

6:43

like, yeah, does anybody other than Apple

6:46

use it? And the folks from

6:48

Igalia were like, yeah,

6:50

we work on it. We put this thing

6:52

in the PlayStation. Like, it's a pretty

6:54

big thing, and it needs to

6:56

go everywhere. So I was kinda

6:58

excited to see that. Do you know like, do people

7:01

outside of Apple

7:03

work on JavaScriptCore as well?

7:05

Or is that more like a web kit thing

7:07

that you sometimes get outside contributions?

7:09

They have outside contributors. And

7:11

and also some of the the people who work on

7:13

WebKit now, like, at Apple,

7:15

originally were just outside contributors.

7:17

Oh,

7:19

It makes sense. And PlayStation

7:21

in particular is a big

7:23

outside contributor. You can see on GitHub,

7:25

there are Windows builds

7:28

of WebKit that are done by the

7:30

PlayStation team. Interesting. And

7:33

and in general, I think there's also,

7:35

like, a refrigerator company. I'm

7:37

I don't quite remember. Also, a lot of the

7:39

ReadableStream and WritableStream,

7:42

like, the Web Streams APIs, have

7:44

Canon in the copyright, like, the

7:46

camera company. Really?

7:48

This is really random. I don't actually know the

7:50

story there, because

7:52

WebKit is effectively, like, a monorepo. And

7:56

so, like, Bun can directly use code

7:58

from WebKit for, like,

8:00

various web APIs. Like, one example of

8:02

that is, like, ReadableStream and WritableStream.

8:04

Most of the code is just

8:06

directly copy-pasted from WebKit.

8:08

Same for, like, URL and

8:10

a lot of the DOM event stuff, like,

8:13

EventTarget. It makes it much easier for

8:15

Bun to be, like,

8:17

web API compatible because --

8:19

Yeah. -- we don't even have to implement the actual

8:21

web APIs. We can just use it from directly

8:23

from a browser. Yeah. I think that's

8:25

great, right? Because, like, when I think about

8:28

projects like Bun, I think, how

8:30

in the world, like, can

8:32

can there be all of this

8:35

feature

8:37

completeness with a tool like

8:39

this that's just getting started from

8:41

scratch essentially now, but it's pretty

8:43

amazing that you have additional help. Here's a

8:45

kind of odd

8:47

question here, but how did you come up with

8:49

the Bun name? The

8:51

whole Bun and Oven thing. That's

8:53

it. So I didn't come up with

8:55

either of the names. Bun, it was originally

8:57

named after a friend of mine who has a

8:59

bunny who she named

9:02

-- Right. -- Bun.

9:03

And and

9:05

at first, I was like, well, I really wanna name

9:07

this after your bunny, but then I thought about

9:09

it more. And Bun, it makes sense because

9:11

it's, like, also because I

9:13

think, like, cute names and logos

9:15

are good. But -- Yeah. --

9:17

in Bun's case, it's like, because Bun

9:19

is both a bundler

9:21

and also a bundling of

9:23

many tools in the JavaScript ecosystem.

9:26

So it's,

9:26

like, good in both

9:29

ways. And then I thought Oven

9:30

just sounded good. Our lead

9:33

investor suggested it. Yeah. Especially because

9:35

you already had the Bun thing going on. You get

9:37

it. I never put the bun in bundler or

9:39

any of that together, but it also

9:41

tracks. Now you're cool.

9:43

Yeah. That's that's awesome.

9:45

So

9:46

I

9:47

wanna talk about, like, Node compatibility

9:51

and I guess, like,

9:53

web API compatibility, because this is

9:55

kind of we've talked about this a little bit on the

9:57

podcast. There's,

9:59

like, this sort of, like, new wave of JavaScript.

10:01

Right? And part of

10:03

it is, like, let's not just

10:06

write Node apps. Let's write it towards the

10:08

the fetch API and the streams

10:10

API and stuff that's just, like, web

10:12

standards. And then there's this other thing that's, like,

10:14

well, we kinda spent in the last ten years

10:16

writing stuff with node modules,

10:18

and that'd be nice to support that. So,

10:20

like like, where are we going with that? Does does

10:22

Bun support Node stuff? Or are we trying

10:24

to focus on the WinterCG stuff?

10:27

Bun is trying really hard to focus on

10:29

on being Node

10:31

compatible, while also supporting

10:33

web stuff really well.

10:35

Basically, I think people

10:37

shouldn't

10:37

have to rewrite that code. And

10:40

I think it's like, I think,

10:42

like, the bar for having to

10:45

rewrite code is it

10:47

has to be so much better before

10:49

it remotely makes sense and even then it almost never

10:51

makes sense. So Bun,

10:54

really, it intends to be a drop-in replacement

10:56

for Node. But

10:58

I think, like, the future is

11:01

closer to what looks like web

11:03

APIs

11:03

where, like,

11:06

most of JavaScript should just run the

11:08

same or or, like, it should

11:10

run successfully on many runtimes

11:12

and many environments. And

11:14

but there's just this gigantic ecosystem. Like,

11:16

I think NPM is the largest software repository

11:18

in the world. I've I've heard that, wow, a few

11:20

times. Just by, like, package

11:22

count. And I think it would just be

11:24

crazy to not support it. Yeah. It also

11:26

seems like pretty instrumental for its success to

11:28

be able to you know,

11:30

be a drop in replacement to that

11:33

regard where, like, how many people are

11:35

going to be able to start

11:37

fresh from anything or rewrite

11:39

major swaths of things

11:41

to support something. So,

11:43

yeah, that's interesting. So, obviously,

11:45

Bun's a JavaScript runtime. It's a

11:47

transpiler, so you can take your

11:49

TypeScript and output regular

11:52

JavaScript; a task runner. Are

11:55

you going to be touching any of the linting,

11:58

formatting space that's sort

11:59

of

12:00

ESLint, Prettier right now? We have no

12:02

plans for it right now. Okay. I think this is

12:05

honestly a personal bias, and that is I tend

12:07

to not use linters very much. Exactly.

12:11

So so I have, like, fewer

12:13

opinions about them. I

12:15

use Prettier a lot. I do think

12:17

Prettier is great. I think it

12:19

was, like, my second job,

12:21

we would have

12:23

the PR comments. They were always, like,

12:26

formatting issues and, like, typos, and,

12:28

like, all of that

12:30

just, like, a colossal waste

12:32

of engineering time. Yeah.

12:34

Yeah. And like, that is

12:36

a perfect example of where tooling

12:39

has a big impact. I still feel

12:41

like formatting time

12:43

is enough of, like, a big, like,

12:45

Prettier definitely has a performance

12:47

issue when it's, like, larger files. But I don't feel

12:49

like it's enough of an issue relative

12:51

to, like, transpilers or bundlers

12:53

or run time or any of the

12:55

other things. Yeah. I can see a

12:57

stronger case for a linter, but I

12:59

think

13:00

before it makes sense for

13:02

Bun to do something in that area,

13:04

I think we

13:05

need a much better editor integration than

13:07

what we have right now.

13:09

That needs to be like

13:12

part

13:12

of it, where, like, the editor is

13:14

integrated with the runtime. Like, for example,

13:16

we should I think it would be really cool

13:18

to do, like, automatic type generation.

13:21

with, like, the plug-in system. I

13:23

don't know exactly the specifics of that,

13:25

but that's awesome. And so with your

13:27

node support, do we

13:29

still use npm, or

13:32

do we just require something and it'll

13:34

npm install it? Like, is

13:36

npm even a thing in here, or does it just reach out

13:38

to the registry on its own? Actually,

13:40

the version of Bun we're about to release, you

13:42

know, it's most likely gonna

13:44

be today. It might be tomorrow. It does

13:46

automatic bun install,

13:48

but not using npm, using

13:50

Bun's package manager.

13:52

And it'll and it just works if

13:55

you import, like, if you import lodash, if you

13:57

import React or or whatever, and there's no

13:59

node modules folder. then -- Yeah. -- it'll

14:01

automatically install it and use a shared

14:03

global cache. And

14:05

this shared global cache means

14:07

that instead of installing node

14:09

modules repeatedly for every project, you only install

14:11

it once. So you save a lot of disk space, you

14:13

save a lot of time spent installing. Then

14:15

if you do need to, like,

14:17

debug your dependencies because there's some

14:20

issue or whatnot, you can still use

14:22

node modules. So Bun has full

14:24

support for the regular, like, node modules

14:26

folder because it really started out

14:28

before it was a runtime. It was

14:30

a front-end dev server, node module

14:32

resolver, and transpiler. So

14:34

all of that is kinda just baked in.
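The install behavior described above (use node_modules when present, otherwise the shared global cache, auto-installing on a miss) can be sketched as a small resolution policy. `resolvePackage` and its flags are invented for illustration; Bun's actual resolver works against the filesystem and lockfile:

```javascript
// Toy resolution policy: prefer a local node_modules copy, then the
// shared global cache, and only then schedule an install into the cache.
function resolvePackage(name, { inNodeModules, inGlobalCache }) {
  if (inNodeModules) return { source: "node_modules", install: false };
  if (inGlobalCache) return { source: "global-cache", install: false };
  // Nothing on disk anywhere: auto-install into the shared cache,
  // so every future project reuses the same copy.
  return { source: "global-cache", install: true };
}

console.log(resolvePackage("lodash", { inNodeModules: true, inGlobalCache: true }));
// { source: "node_modules", install: false }
console.log(resolvePackage("react", { inNodeModules: false, inGlobalCache: false }));
// { source: "global-cache", install: true }
```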

14:36

So so, yeah, you can use node

14:38

modules and there's no,

14:41

like, it's designed to work

14:43

out of the box with that. So, like, you don't need

14:45

to, like, change any of the import

14:47

specifiers or anything like that. Okay. Yeah. What

14:49

about, like, CommonJS? Is

14:51

it just ES modules?

14:54

Or... It's both. Cool. And on the

14:56

CommonJS side in particular, the way

14:58

that works internally is it

15:00

actually transforms, if you

15:02

have CommonJS, it'll transform it

15:04

into ESM,

15:05

and that just runs automatically. But

15:08

it actually has a special

15:10

thing that makes it so, like, internally,

15:13

the CommonJS becomes

15:15

synchronous ESM, which

15:17

isn't part of the spec, but that's

15:20

like a good way to make it.

15:22

Basically, the asynchronousness

15:24

of ESM

15:26

when it's unnecessary ends up

15:28

causing applications to load a little bit slower.

15:31

because you have all these extra microtasks. Mhmm.

15:33

Yeah. This way kind of lets

15:35

you use ESM internally as

15:37

an implementation detail that's, like, not

15:39

visible to the user. But it avoids that

15:41

overhead of all the microtask ticks.

15:44

Wow. I'm just looking at the docs here,

15:46

and I noticed you have support for

15:48

HTMLRewriter, which is something

15:50

from Cloudflare workers, which

15:52

allows you to basically intercept the

15:55

request, fiddle with it, and

15:57

send it along the way. So

15:59

did you have to, like, re implement that yourself,

16:01

or is that something that Cloudflare makes

16:03

available? So they have

16:05

their lol-html parser

16:07

that

16:08

is open source from

16:11

Cloudflare. And so -- Okay. --

16:13

and that's the same one that Cloudflare

16:15

Workers uses. So I

16:17

just implemented the same bindings, like matching

16:19

their API and copied a bunch of their tests. And

16:21

just, like, that it's always a global was

16:23

kinda random, honestly. That's

16:25

a cool API, honestly. Like, I

16:27

I've used it a couple times. I

16:32

have a couple of SaaS, Software as

16:34

a Service, and they've removed features.

16:34

And I was like, well, screw you guys. Like,

16:37

I have a domain name. I

16:39

basically just wrote a little bit of code in

16:41

between the software as a service and

16:43

myself and added those

16:45

features back in. It's a

16:47

beautiful API. Is that where you

16:49

see Bun being used,

16:51

in, like, an edge location?

16:53

What are your thoughts on serverless,

16:55

edge, all that stuff? I think Bun is gonna be really

16:57

good for applications

16:59

that do lots of server rendering and

17:01

and eventually edge as

17:03

well, but I think right now

17:05

server-side rendering and

17:07

APIs are gonna be more of the focus,

17:09

and also CLI apps. I think those

17:11

two categories right now.

17:13

On the server-side rendering side of things, for

17:16

server-side rendering React, it's

17:18

more than three times faster than Node.

17:20

Yeah. It's, like,

17:22

four times. No, it's five

17:24

times. And so

17:27

it's a number now. Yeah. Where

17:29

does that speed come from? Yeah. It's, like,

17:31

it's a few things. I spent, like, basically,

17:33

a whole month on making React

17:35

fast in Bun, and also in general,

17:38

making streaming, web

17:41

streams, faster in Bun. The three

17:43

things basically are: Bun has

17:45

a slightly faster JSX transform.

17:48

That JSX transform improves

17:51

rendering performance for React by around

17:53

thirty percent, and it's just built in when

17:55

you have it in production mode. Bun has

17:57

a really fast HTTP server. It's to

17:59

the point where when people benchmark

18:01

Bun using Node

18:03

JS-based HTTP clients, the

18:05

bottleneck is the benchmarking tool

18:08

and not Bun. In most

18:10

cases, if their throughput is high enough.

18:12

Yeah. And the third thing is

18:14

that Bun has a bunch of

18:17

optimizations

18:19

for the actual web streams implementation.

18:22

Most of it is around, like, moving

18:24

all the code for, like, queuing data

18:27

into native code, and

18:29

there's a lot of overhead

18:31

in the default Web Streams API

18:34

that

18:34

Bun had to, like, work

18:35

with. So the JSX

18:37

transform, did you just write

18:39

that yourself to convert it?

18:42

Yeah. Yeah. The JSX

18:44

transform, essentially all it does

18:46

is it inlines what React

18:48

itself does. When you call, in

18:50

the newer JSX transform,

18:52

that jsx function --

18:54

Yeah. -- it just returns an

18:56

object in a specific format, but

18:58

it turns out most of that can be

19:00

done by the transpiler, if

19:02

you just return

19:04

object literals.

19:05

And

19:07

then in the case where

19:09

it can't do that, it has,

19:11

like, a

19:12

separate function that it runs. But, yeah,

19:14

that

19:15

was a pretty good performance improvement.
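The inlining can be shown side by side. With the automatic JSX runtime, `<a href="/x">hi</a>` compiles to a call like `jsx("a", { href: "/x", children: "hi" })`, and since `jsx` mostly just builds a plain object, a transpiler can emit that object literal directly and skip the call at runtime. A rough sketch; the element shape here mirrors React's classic `$$typeof` layout but is illustrative, not Bun's actual output:

```javascript
// What the runtime helper does: build a plain element object.
function jsx(type, props, key = null) {
  return { $$typeof: Symbol.for("react.element"), type, props, key, ref: null };
}

// What <a href="/x">hi</a> compiles to with the automatic runtime:
const viaCall = jsx("a", { href: "/x", children: "hi" });

// What an inlining transpiler can emit instead: the same object,
// as a literal, with no function call at render time.
const inlined = {
  $$typeof: Symbol.for("react.element"),
  type: "a",
  props: { href: "/x", children: "hi" },
  key: null,
  ref: null,
};

console.log(viaCall.type === inlined.type); // true
```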

19:17

and the TypeScript stuff.

19:21

So Bun just

19:23

strips types. Right? Is that what you

19:25

do? Okay. So nobody,

19:27

nobody except for Microsoft has created

19:29

an actual, like,

19:31

TypeScript type checker. Do

19:33

you foresee any future where somebody will

19:35

be able to write that? I understand it's a

19:37

massive... Is it a massive project?

19:41

Yeah. I think the version that is likely that

19:43

somebody will do is

19:45

not exactly a TypeScript

19:48

type checker, but I think that something that

19:50

is interesting is the

19:52

type annotations proposal.

19:54

Right? Yes. They call it types as comments.

19:57

Types as comments, yeah. And I think the

19:59

the long term there, maybe with

20:01

that, will open up, is a TypeScript

20:04

alternative, where you could have maybe

20:06

eighty percent of of or ninety

20:08

percent compatibility with TypeScript itself.

20:10

And then you have

20:12

some differences otherwise. But you sort

20:15

of, if you're not a hundred, if

20:17

you don't need to be a hundred percent

20:19

compatible, then you can change

20:20

how it works a little bit and change some

20:22

of the rules, and you could have something that's a lot

20:25

faster and a lot simpler. But

20:28

I think that

20:31

would be interesting. And I think that's also

20:33

pretty plausible. of course, the

20:35

challenge there is, like,

20:37

probably a lot of why

20:39

people use TypeScript is the really good editor

20:41

integration and editor tooling. Yeah.

20:44

So it's really it's not as

20:46

much like a language problem or, like, even,

20:48

like, the type checker itself is not the hard part.

20:50

It's making it so the developer

20:52

experience with all the tooling is really

20:54

good. And that's just, like, a

20:56

huge investment. You just need a bunch of people

20:58

working on that. Yeah. No

21:01

kidding. I'm I'm often just

21:03

blown away by, even if you look

21:05

at the VS Code release every single month, what they're

21:07

putting out, it's like, how many freaking people

21:09

are working on this thing?

21:12

It's massive. That's it.

21:15

They do such a good job.

21:17

Yeah. Yeah. They do. This episode is

21:19

sponsored by Tuple. Tuple is

21:21

the screen sharing tool

21:23

dedicated to frictionless remote

21:25

pairing. We all know

21:27

that the things that are a huge

21:29

pain when trying to pair over

21:31

Zoom or or Google Hangouts like

21:33

a a high latency sharing or really

21:35

bad resolution Bun just in general

21:37

clunkiness in the UI. I just wanna show you

21:39

this code and have you look at it while we work

21:41

through this together. Well, with

21:43

Tuple, you get access to

21:46

five k screen sharing

21:48

without destroying your CPU.

21:50

The app is extremely performant. Also,

21:53

you get the ability to have very

21:55

low latency screen sharing as

21:57

well. So just without you noticing,

21:59

you can get five k screen sharing and low

22:02

latency, so that way you can make sure

22:04

that everyone's on the exact

22:06

same page. So if you want to try this

22:08

tool out, head on over to

22:10

tuple dot app forward slash

22:12

syntax, that's TUPLE dot

22:14

app forward slash syntax, and

22:16

give it a try today, this thing very

22:19

well may solve all of your pair

22:21

programming woes. Let's talk about

22:23

Zig. What the heck is Zig? And

22:25

and how many times have

22:27

you been asked that question since releasing

22:30

Bun? It's a

22:32

a lot. I I think I mean,

22:34

okay. So I'll so Zig is a really

22:37

fast programming language, really

22:39

low level. Yet, you have to manage

22:41

memory all yourself. There's no garbage

22:43

collector. There's no baro checker.

22:46

It's just it's just up

22:48

to you. And this

22:51

is

22:51

it's really good when you

22:53

need to understand everything that's going on in

22:55

a program, because there's

22:57

no hidden behavior. So for

22:59

example, in

23:01

C++, it's really common to

23:03

have constructors and destructors

23:05

that run automatically.

23:08

A JavaScript version of that kind of would be like

23:10

if you had try, catch,

23:13

and then finally everywhere. That

23:15

finally could just be inserted basically

23:18

everywhere. So it

23:20

makes it really hard to know what

23:22

is actually being run when you call a function

23:24

or you create an object or whatnot.

23:26

And Zig doesn't have

23:29

that. And

23:30

that ends up being, like, a really good thing

23:32

for performance because you know

23:34

everything

23:34

that is going on. And,

23:37

also, it

23:39

has this, it

23:41

sort of actively discourages

23:43

allocating memory.

23:48

And what I mean by that specifically is

23:50

they make it so every time

23:52

you allocate memory, it could

23:54

potentially throw an error, and you have to handle that

23:57

error, which makes it really

23:59

annoying to allocate

23:59

memory. So

24:00

then you

24:03

sort of are encouraged to do

24:06

static allocation, which basically

24:08

means getting

24:11

all the memory you need at once

24:13

and then not using very

24:15

much of it.
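Zig expresses this with explicit allocators whose calls return error unions the caller must handle. As a loose JavaScript analogy of that pattern (`tryAlloc` and `BUDGET` are invented for the sketch, not any real API), every allocation is a fallible call, which nudges you toward grabbing one buffer up front:

```javascript
// Loose analogy of Zig-style fallible allocation: the allocator can
// refuse, and every call site has to deal with that possibility.
const BUDGET = { remaining: 1024 }; // pretend heap, in bytes

function tryAlloc(bytes) {
  if (bytes > BUDGET.remaining) {
    throw new Error(`out of memory: wanted ${bytes}, have ${BUDGET.remaining}`);
  }
  BUDGET.remaining -= bytes;
  return new Uint8Array(bytes);
}

// The friction pushes you toward "static allocation": one up-front
// buffer that gets reused, instead of many allocations in hot paths.
const scratch = tryAlloc(512);

let failed = false;
try {
  tryAlloc(4096); // would exceed the budget
} catch {
  failed = true; // the caller must handle the failure case
}
console.log(scratch.length, failed); // 512 true
```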

24:17

So

24:18

it sort of encourages

24:20

you stylistically and also

24:23

through its other features

24:25

to write fast

24:28

software. That's wild. And, like,

24:30

how did you find out about this? How do you how do

24:32

you learn this type of thing? The

24:34

first time I saw Zig was on Hacker News,

24:36

it was somebody had posted their

24:38

docs, and I just read the entire

24:40

thing, like, top down. And

24:42

That sounds like it's, like, really complicated, but

24:45

it's really not, it's not a huge thing,

24:47

because the language is really simple. It doesn't have

24:49

a ton of syntax. You can just read it

24:51

in one sitting. Yeah. And then also, it's all

24:53

one page. I really like that style of docs where it's

24:56

all just one page. Yeah. Just

24:58

search it. Command f. Command f. To

25:00

find anything. Yeah. What what's

25:02

your history with languages? So, like, I I would assume

25:04

you have a a large history of

25:06

of different languages. Is is

25:09

that your

25:11

vibe? Yeah. Kinda. Well, not

25:13

totally, like, medium.

25:15

My first language was

25:18

Ruby I the

25:20

first thing I learned was,

25:23

like, Ruben Rios. And then

25:25

and then after that, I learned JavaScript.

25:28

a little bit at that point I didn't know if Bun dropped

25:30

it very well at that time.

25:33

And then I spent and

25:36

and and Bun

25:38

then after that, I built a few iPhone

25:41

apps,

25:41

which was where

25:42

it's an objective c.

25:44

them And then after

25:46

that, I mostly worked as a on

25:48

front I mostly spent I

25:50

spent more time as a front end engineer like,

25:53

system stuff. A little after that, I spent

25:55

about a year building a game in a it it

25:57

was building like a multiplayer box of browsers.

26:00

And that was really performance

26:03

intense.

26:03

Like, because we had to do the

26:06

rendering, we had to do the

26:09

multiplayer part we had to do

26:11

and

26:11

with voxels, you have to fit a

26:13

ton of data into the smallest

26:15

amount of, into, like, a

26:17

very limited amount of memory, because in

26:19

browsers, you really can't use more than a gig of memory before the tab

26:21

just crashes. It it was I spent

26:23

probably, like, three weeks when I was working on the

26:25

game just on, like, how do we

26:27

design the object for storing, like, how we

26:30

store the voxels -- Yeah.

26:32

-- because it also needed to be sent over

26:34

the network, and it needs

26:36

to be, like, synced between multiple

26:38

players as they edit. It was

26:40

really hard. So I learned a lot

26:42

doing that. So is that where your understanding of, like,

26:44

memory management comes from? Because it seems to

26:46

me like like getting

26:48

into something like Zig is pretty

26:50

intimidating for a developer like me

26:52

who's only ever used languages

26:54

like JavaScript, which are garbage

26:56

collected. Even Rust is

26:58

fairly intimidating for me. Right? So,

27:00

like, is that where

27:02

you really got your hands dirty

27:04

with that kind of more

27:06

intense memory management?

27:08

I guess, why wasn't I more

27:11

intimidated? I

27:13

think also I did spend a little bit of

27:15

time with Go. I always say Go was like a good

27:17

warm-up. Go is still garbage collected, but

27:19

in Go, you still sort of

27:22

you still think about --

27:24

like, you still

27:25

think about bytes more directly. You

27:28

can you can do, like, memory

27:31

unsafe things in a way that's generally

27:33

not what you can do in JavaScript. But

27:35

I would say that for

27:38

the most part, with programming stuff, the

27:40

way I think about it

27:41

is it's all just code. And,

27:43

like, somewhere when you do something,

27:45

there's a function being called,

27:47

there's some maybe some

27:50

assembly being generated at

27:52

some point. And and

27:53

pretty much everything is just like

27:55

sugar on top of that.

27:57

So

27:58

even if -- and also it's, like,

28:00

not like, you can

28:02

in Zig's case in particular,

28:04

the hardest thing really is the memory management

28:06

stuff. But for the most part, the language is

28:08

really simple. Like, if you if you could

28:10

ignore that, then it probably has

28:12

it probably has a similar amount of syntax,

28:15

Interesting.

28:16

Wow. This is

28:18

maybe a crazy

28:20

question to ask, but it's something that I'm I'm

28:22

curious about. Have you ever thought

28:24

about putting bun on

28:27

hardware? I know there's Espruino, which is like

28:29

a JavaScript interpreter that runs in low

28:31

memory and things like that.

28:33

And we've never really we kind

28:35

of, but you've never really been able to run

28:37

node on small hardware other than

28:39

like a Raspberry Pi. Is that something that

28:41

could maybe happen at some point? The

28:44

the very literal answer is,

28:46

like, yes, it is possible. And

28:48

-- Yeah. -- WebKit even has

28:50

a WebKit port, WPE, specifically

28:54

for embedded devices.

28:56

Really? So, like, they have a

28:58

a build of WebKit designed for this that

29:00

bun could in theory use. And also

29:04

Zig is very good for embedded programming

29:06

environments too, like, because of the

29:08

a lot --

29:10

because of this philosophy of of

29:12

statically allocating memory and low

29:14

level memory management. That's, like, perfect for

29:17

for embedded. And a lot of Bun code

29:19

itself kind of looks like what you would

29:21

do in an embedded environment. But

29:23

I think it's like

29:26

a

29:26

whole different focus and something that we're just not focused

29:28

on. Yeah. Yeah. Totally. Oh, man.

29:30

I'm looking at this WPE. This is supported

29:32

by Igalia. And

29:35

they're showing photos of the fridge. I

29:38

think this is a BMW

29:40

console. Makes sense. That's why the photos

29:42

of the fridge. I remember

29:44

back in the day that a

29:46

lot of, like, infotainment systems were

29:48

running on, like, QNX or something like

29:50

that. It was, like, a BlackBerry product

29:52

I'd I'd like to do a show on that anyways. That's

29:54

not this show, but somewhere.

29:59

What

30:00

about, like, back to the node

30:02

stuff? Is there a package JSON

30:04

file in Bun projects? I guess there can be? What

30:06

what are your thoughts on that? Sort

30:08

of -- by the way, I think this is somewhat, like, slightly

30:11

controversial, but I

30:12

think

30:14

Package

30:16

JSON is kind of -- package JSON is

30:18

sort of just like a better import map than

30:20

the import maps that you see

30:22

in browsers in a lot of ways. And I think and

30:25

that's kind of how Bun sees it. So Bun has

30:27

very much has built in support for package JSON. So

30:29

if you use the exports field,

30:31

You can use main, module, all

30:33

that stuff. If you if you wanna write a

30:35

library and target Bun specifically, you

30:37

can use the bun export condition.

30:39

Bun will pick that up. Yeah. So, like, Bun

30:42

has great support for that. And with the new

30:44

changes to the module resolution and the next

30:46

version of bun, where it automatically

30:48

installs NPM packages, it still supports package

30:50

JSON. So the idea there is you would

30:52

have your package JSON as

30:55

you did before. And if it just doesn't

30:57

have a node modules folder, then your

31:00

dependencies, Bun's package manager will

31:02

automatically read your dependencies, install

31:04

them into the shared global cache and

31:06

use them. And and and it's intended to

31:08

behave a lot like

31:11

npm and pnpm and

31:13

all of those. And in fact, you

31:15

can actually use bun install as a separate

31:17

package manager, and that will

31:19

install a node modules folder similarly to the

31:21

-- Mhmm. -- the other package managers. And that node

31:24

modules, that's completely designed to

31:26

work with node. Interesting. Yeah.
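As a concrete illustration of the exports-condition support described above, a library's package JSON might look like the following. This is a hypothetical example: the fields are the standard Node resolution fields (`main`, `module`, `exports`) plus the `bun` condition Jared mentions, and the file paths are made up.

```json
{
  "name": "my-lib",
  "main": "./dist/index.cjs",
  "module": "./dist/index.mjs",
  "exports": {
    ".": {
      "bun": "./src/index.ts",
      "import": "./dist/index.mjs",
      "require": "./dist/index.cjs"
    }
  }
}
```

A runtime that honors the `bun` condition can point it straight at TypeScript source, since Bun transpiles TypeScript natively, while Node falls back to the `import`/`require` entries.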

31:28

Like, no relation actually to Bun itself.

31:31

It's just now that the Bun runtime is starting to use

31:33

it too. Why can't node just get

31:36

faster? That's, you know,

31:38

like, one -- why are people

31:40

obviously creating new JavaScript runtimes

31:42

to sort of get around that.

31:44

Is there anything stopping it? Is it just

31:46

it's just too big of a project? Too much cruft. I

31:48

think it's really hard. I think

31:51

things sort of need to be fast from the beginning

31:53

because otherwise

31:56

you basically need to rewrite everything. There's

31:58

a lot of --

31:59

for Bun, it's like a lot of the time we

32:02

spend is on how

32:03

do we implement this API in

32:06

a way that enables it to be

32:08

really fast? And

32:10

and designing APIs to be

32:12

really fast. It's just really hard. Like, it's possible. It's

32:14

just really hard. Yeah. It's just

32:16

that's so much momentum in one direction,

32:18

so much code. What about like,

32:21

okay. What's what's the future of

32:22

Bun look like? In,

32:25

like, short term future or long term

32:27

future? Like, where do you

32:29

see Bun evolving?

32:31

In the short term, our focus right now is getting to

32:33

a one point o stable release. And

32:35

I think that's gonna be another three

32:37

or four months. And

32:39

then we're

32:40

gonna have hosting too as a

32:42

built in part of the product. And

32:46

and that hosting will work with existing frameworks like

32:48

Next and Remix and a lot

32:50

of a lot

32:51

more. In the longer

32:53

term, the

32:54

the goal is to become the default way

32:56

people write JavaScript and and

32:58

run JavaScript. The reason why is it, I think

33:00

we can just make it a lot faster. and

33:02

and make it a much simpler developer experience by

33:06

putting everything into one tool by

33:08

making it really fast. Another

33:10

thing I'm really excited about is our testing

33:13

library. We we have a built in testing library

33:15

and it's like, last I

33:17

checked, for

33:18

small scripts, it's something like forty times faster

33:20

than Jest. Mhmm. And

33:23

if you're doing, like,

33:25

TypeScript -- and

33:27

it's like a lot of TypeScript because the TypeScript

33:30

transpiler is really fast in Bun. It

33:32

ends up being something

33:32

like two hundred times faster,

33:34

which is kind of a big number

33:36

I don't feel comfortable

33:39

saying two hundred, but

33:41

that's what it was when I tried it last. That's

33:43

what it looked like. And I

33:45

think a lot of it comes down to just when

33:47

you put everything into one tool

33:50

and you can make the pieces

33:53

work really really well together. Yeah.
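For readers who haven't used it, the Jest-style API that Bun's built-in runner mirrors looks roughly like the sketch below. The tiny `expect` here is a hand-rolled stand-in so the snippet runs anywhere; in Bun you would import `test` and `expect` from `bun:test` instead, and at the time of this conversation its matcher coverage was still incomplete.

```javascript
// Hand-rolled stand-in for the Jest-style matcher API (illustrative only;
// in Bun you would `import { test, expect } from "bun:test"` instead).
function expect(actual) {
  return {
    toBe(expected) {
      if (!Object.is(actual, expected)) {
        throw new Error(`expected ${String(expected)}, got ${String(actual)}`);
      }
    },
  };
}

function add(a, b) {
  return a + b;
}

// The shape of a typical assertion in a Jest or bun:test suite:
expect(add(2, 2)).toBe(4);
```

The point of the speed comparison above is that the runner, the TypeScript transpiler, and the assertion library all live in one process, so a suite of small test files avoids most per-file startup cost.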

33:56

Wow. That that's awesome. So

33:57

for Bun, like, transpiling TypeScript

33:59

is basically free, because it

34:04

always

34:04

transpiles JavaScript, and the actual TypeScript

34:06

part doesn't really add much

34:07

overhead at all. And and you have,

34:09

like, the whole Jest library API is

34:12

available in Bun. So, like, if you have a large

34:14

test suite, you'd be able to move that

34:16

over? We're not

34:18

there yet.

34:20

Okay. We don't have that many of the

34:22

matchers implemented yet, but that is the plan.

34:25

We're

34:25

just not done

34:27

with it. In fact, the command for

34:29

the test command is

34:31

literally wiptest, to emphasize

34:34

that is a work

34:36

in progress. When

34:38

you're writing code to write that test,

34:40

are you writing that in Zig? Or are you

34:42

writing that in JavaScript, TypeScript? Like, is there

34:44

any part of Bun that's

34:46

written? Okay, how much of -- is there any part of Bun that

34:48

itself is written in JavaScript? Oh, yeah.

34:50

So right now, most

34:52

of the Node polyfills are

34:56

written in JavaScript. So if you, like --

34:58

and some of these will eventually be moved

35:00

to native code. Otherwise, like, the readable streams

35:02

a lot of it is in JavaScript, but the

35:04

readable stream implementation uses, like, the JavaScriptCore

35:06

built-in. So it has this kind of weird @

35:08

syntax that looks like

35:09

it's decorators, but it's

35:12

not. Yeah.

35:12

Most of the JavaScript in Bun right now

35:15

is Node JS Polyfills.

35:17

Yeah. And then some

35:20

of the FFI code -- Bun has a built in FFI library

35:22

that's sort of like it's

35:24

sort of like N-API, but

35:28

a lot faster in a bunch of cases.

35:30

Wait, sorry. What's FFI? Yeah. So

35:32

FFI stands for foreign function interface,

35:34

and the idea is it lets

35:38

you call code written in a different

35:40

language from JavaScript. And in BUNS case,

35:41

it's Bun really fast.

35:44

It tends to be

35:48

what were the numbers? I think it was, like, three times

35:50

faster -- three times lower overhead, function

35:52

call overhead. Let me double check. Yeah. I

35:54

think it was three times

35:56

lower overhead than N-API, Node-API. And Bun also supports

35:59

N-API too. So you can use Node

36:01

native modules in Bun.
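To make "foreign function interface" concrete: the portable, built-in analogue in JavaScript is WebAssembly, which likewise calls non-JS code from JS (Bun's `bun:ffi` goes further and loads native shared libraries directly). The bytes below hand-encode a minimal wasm module exporting a single `add(a, b)` function; everything here is illustrative, not Bun's implementation.

```javascript
// Calling non-JS code from JS -- the idea behind an FFI. WebAssembly is built
// into every modern JS engine, so this snippet is portable. The bytes
// hand-encode a wasm module whose single export is add(a: i32, b: i32) -> i32.
const wasmBytes = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d, 0x01, 0x00, 0x00, 0x00,       // magic + version
  0x01, 0x07, 0x01, 0x60, 0x02, 0x7f, 0x7f, 0x01, 0x7f, // type: (i32, i32) -> i32
  0x03, 0x02, 0x01, 0x00,                               // function 0 uses type 0
  0x07, 0x07, 0x01, 0x03, 0x61, 0x64, 0x64, 0x00, 0x00, // export "add" = func 0
  0x0a, 0x09, 0x01, 0x07, 0x00,                         // code section, one body
  0x20, 0x00, 0x20, 0x01, 0x6a, 0x0b,                   // local.get 0/1, i32.add, end
]);

const instance = new WebAssembly.Instance(new WebAssembly.Module(wasmBytes));
const sum = instance.exports.add(2, 3); // a non-JS function called from JS
```

The "function call overhead" being benchmarked above is exactly the cost of crossing this kind of boundary on every call.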

36:04

Really? So if you have

36:06

an eight year old compiled

36:08

project that needs to be

36:11

natively built, you

36:13

can do that, potentially. The thing with it

36:15

is that the downside with it right now is there's not

36:18

really good test coverage for our

36:20

N-API implementation. We need to

36:22

fix that. But

36:24

the code is written.

36:26

I don't need tests. What

36:30

what are your personal thoughts

36:32

on types in JavaScript? Like, you

36:35

We've mentioned very briefly

36:38

the types as comments

36:40

proposal for types in JavaScript. Do you think

36:42

that JavaScript and TypeScript, like,

36:44

there will eventually be some sort

36:47

of -- great, JavaScript now has

36:50

types and we don't need to have

36:52

TypeScript. Or what are your thoughts there?

36:54

I think, like, I

36:56

think TypeScript as it is today

36:58

is mostly good

37:00

from, like, a technical perspective. It's

37:02

mostly for developer experience

37:05

and for,

37:05

like, editor tooling and

37:07

for, like, type safety.

37:09

But I think the

37:12

version of -- like, if I

37:14

were to, like, wave a magic wand and say,

37:16

like, what is what would, like, the best way to

37:18

do types from, like, a

37:20

technical perspective in JavaScript.

37:22

I

37:23

wouldn't -- what I

37:25

would do is so

37:26

so you know how, like, there's strict mode and

37:28

there's, like, sloppy mode. Yeah. I would I would do, like,

37:30

types or, like, a typed mode. Like,

37:33

you do, like, use types or

37:35

something -- Yeah. -- at the top of the file,

37:37

and then have you

37:40

specify sized

37:42

types. So you wouldn't be saying

37:44

the type of the number, you would be saying the type is

37:47

like an i sixteen, like a signed

37:49

sixteen bit integer or

37:52

Yeah. -- or an i sixty

37:54

four, and it's, like, a sixty

37:56

four bit integer or

37:59

whatever. And and you would

38:01

have, like,

38:02

sized structs. Like, I

38:04

think

38:04

there even is, like, a

38:06

size types proposal. I don't think

38:08

it's called sized types -- it's things like structs. Oh,

38:10

really? Something like that. But it's like stage one. So -- Yeah. -- it's

38:12

not -- Yeah. -- clear if it's ever gonna happen. I think something

38:14

like that would be really interesting. Because

38:18

then that would be

38:20

a lot easier for JavaScript to have

38:22

much better performance than

38:25

it is right now. And, like, it

38:27

that would make it potentially competitive with, like, more like,

38:29

much more competitive with, like, Rust and

38:31

natively compiled languages. If you can

38:33

have types that

38:36

are known sizes and fixed sizes.
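The closest thing JavaScript has today to the sized types described above lives in typed arrays: an element of an `Int16Array` really is a signed sixteen-bit integer, with the wraparound behavior to prove it. A small demonstration:

```javascript
// Typed-array elements are real fixed-size integers, unlike ordinary JS
// numbers (which are 64-bit floats). Int16Array stores signed 16-bit values,
// so out-of-range writes wrap around modulo 2^16.
const i16 = new Int16Array(1);

i16[0] = 32767;          // the Int16 maximum fits exactly
const max = i16[0];

i16[0] = 40000;          // out of range: wraps to 40000 - 65536 = -25536
const wrapped = i16[0];
```

The hypothetical "typed mode" in the discussion would extend this kind of known-size guarantee to individual variables and struct fields, which is what lets an engine compile them like a natively typed language.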

38:40

Mhmm. But that's like

38:44

a huge break from the -- Yeah. -- everything. And

38:46

it it it would be pretty

38:48

niche. Like, unless, like, the

38:50

editor tooling was really good.

38:52

Yeah. Yeah. And it also was

38:54

like a huge it it would be a huge investment

38:56

from various JavaScript engines to support

38:58

it because -- Yeah. -- now you

39:00

have, like, then it's like,

39:02

well, what about web assembly? Web assembly is pretty good. And

39:04

it has and that's and

39:06

and it works with many languages instead

39:10

of just one language. So I don't I don't really know. But I

39:12

think if, like, a if

39:14

types were to be like a built in thing to JavaScript,

39:16

I think something like that would be

39:18

really cool. Nice. What what

39:20

about -- other than types, like, what's

39:22

missing from JavaScript? What would you love to

39:24

see be added to JavaScript in general?

39:26

Especially, I love hearing this from people

39:28

who, like, use other languages and what they would love to see

39:30

in JS. I think something

39:32

about making

39:34

promises faster. I still think too

39:36

much overhead there. I don't exactly that's

39:38

not exactly your question, but Bun- Yeah.

39:40

-- I think there's

39:42

something where basic because

39:44

await always causes a microtask, it

39:46

it has this overhead. That that means

39:48

you can't just use it everywhere without

39:50

-- Yeah. -- really slowing down your

39:53

code. there that seems kind of

39:55

like a language slash like spec problem. And

39:57

there should be some way to be,

39:59

like, make

39:59

this async, but only if it

40:02

actually needs to be. And I

40:04

think that would be good. This

40:06

is something that you can do in Zig

40:08

and other languages where it's

40:10

much easier when you have a little bit

40:13

lower level control over how you suspend and resume. In

40:16

Zig's case, those are actual keywords,

40:18

suspend and resume. And you can

40:20

so you can do async and

40:22

await in Zig, but you can also

40:26

-- the async and await effectively

40:28

are suspend and resume, but with, like, a little bit lower level control. Another

40:30

thought there too is that there's something with

40:33

I feel like generators --

40:36

like, if generators didn't expose an

40:38

object in the same way where you had

40:40

this, like, next and return

40:44

and value and done. Yeah. But

40:47

if they

40:49

just didn't have that as,

40:52

like, a user visible -- observable is the word

40:55

that is often used -- as

40:59

being observable, then I think it would be quite a

41:01

bit faster. I think

41:02

if it was just pure syntax. Yeah. I guess,

41:04

mostly, I think,

41:05

in terms of performance and

41:07

not about -- Yeah. DX

41:08

as much. But

41:09

I do think about DX a lot. It's

41:12

just -- Yeah. -- my mind is

41:14

defaults to performance right now.
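Both performance points above are easy to observe from plain JavaScript. The first part of this sketch shows that `await` always schedules a microtask, even when the awaited value is not a promise; the second shows the `{ value, done }` result object that every generator `next()` call allocates.

```javascript
// 1. `await` of a plain value still suspends the function and queues a
//    microtask, so synchronous code after the call runs first.
const order = [];

async function step() {
  order.push("before await");
  await 0; // 0 is not a promise, but execution still yields here
  order.push("after await");
}

const pending = step();
order.push("sync code"); // observed before "after await"

// 2. Every generator next() returns a fresh { value, done } object --
//    the observable allocation discussed above.
function* counter() {
  yield 1;
  yield 2;
}

const iter = counter();
const first = iter.next(); // an object of the shape { value: 1, done: false }
```

A hypothetical "only async when it actually needs to be" rule, or generators whose result objects were not user-observable, would let engines skip the microtask and the allocation in the common case.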

41:16

What about like, you build a

41:18

SQLite client right

41:20

into Bun. How come -- like, why

41:22

is SQLite so good? And

41:24

why is that built right into the language? What

41:26

I really like about SQLite is it's self contained, because it's

41:28

one file. You don't need, like, a

41:31

full server. It's

41:34

like really easy to reason about. I think there's a future

41:36

where maybe bun can you can, like, bundle

41:38

SQLite databases with your

41:41

code into, like, the same JavaScript

41:44

bundle. That's

41:44

not implemented right now, but that's, like, a

41:46

thing I think would be good to do. I think

41:48

For Bun,

41:48

we're probably gonna also have a

41:50

built in Postgres and built in MySQL client

41:53

as well. I

41:54

think all these

41:55

the database libraries need to

41:57

be really fast. And I

42:00

think

42:00

it makes sense for the runtime to just include

42:02

it as a built in. So long as the

42:04

runtime's implementation is very

42:06

fast and Yeah.

42:07

I think that's

42:08

just, like, really -- by

42:11

putting those things in it as defaults, it's

42:13

really high impact to to

42:16

ensure that everybody has a fast experience.

42:18

Totally. I I love that, honestly. Like,

42:20

that a lot of this stuff is just

42:22

included and and nice

42:24

and fast. It's honestly, I wish something that Node would do more often.

42:26

I even remember -- Yeah. --

42:28

one of the guys that works on Node, like, saying,

42:30

should we

42:32

really just strip types and run JavaScript? Like, right now,

42:35

running TypeScript in Node is a pain in

42:37

the butt, you know. And it's just like, I just wanna

42:39

be able to make a quick Node script

42:42

with dot t s on the end. And now I'm reaching for

42:44

Bun or Deno for that type of thing. Because it's

42:46

it's way easier. So I

42:49

really appreciate having that a

42:51

little bit more batteries included totally.

42:54

Cool. Well, let's get into it. Do you wanna

42:56

get into the supper club questions? Yeah.

42:58

Cool. So this is the part of the show where we ask the same kind of

43:00

questions to every guest that comes on. We call

43:02

these the supper club questions. They're just

43:05

kind of general computing questions. So first and foremost,

43:07

what kind of computer do you use and

43:10

prefer to develop on? I use an m

43:12

one. Like,

43:14

the fourteen inch,

43:16

like, maxed out one. Yeah.

43:18

Yeah. They're pretty amazing. What

43:20

about, like, keyboard? Anything fancy there?

43:22

It's a

43:23

Razer keyboard. That's like a

43:25

mechanical keyboard. Oh, okay. And then I do

43:28

I use

43:30

an LG UltraFine five k monitor.

43:32

It's nice. Mhmm. But it's like

43:34

it has some issue where it will the

43:36

the keyboard and mouse will sometimes just

43:40

disconnect randomly. It's really annoying. Oh,

43:42

because you use it as, like, a dock. You use the

43:44

the whole thing as a dock. Oh, yeah. Interesting.

43:49

But it's otherwise very

43:51

nice. I think it it's like when you

43:53

try to use a big monitor and

43:55

it's not, like, four k or five k, it's

43:57

just really hard to read. Yeah. It's just really

43:59

big. A lot of people are always like,

44:02

I'm gonna get a TV to code on.

44:04

It's like, It's a ten eighty

44:06

p, you know, resolution. Yeah. Yeah. I've had a

44:08

better Samsung than that twelve

44:10

years ago. What about text editor? What

44:12

text editor are you using? I use VS

44:15

code. It's pretty

44:16

it's like pretty standard. I

44:19

think I use -- my editor theme,

44:21

I think it's like dark

44:24

one, which I think is the Atom theme.

44:26

Mhmm. Yeah. I I

44:27

and then, like, on the terminal stuff,

44:29

I use fish. and

44:31

I use starship dot r s, which makes it

44:34

prettier. Yeah. Yeah. And

44:36

then for the actual terminal itself, I currently

44:38

use Alacritty.

44:40

Alacritty is really fast. There's actually a noticeable

44:42

difference in benchmarks when you run it in

44:44

Alacritty versus, like, iTerm, which is really

44:48

funny. But it's like the time it takes for your terminal to

44:50

print the standard out or standard error

44:52

actually does have

44:54

an impact. The downside of Alacritty

44:56

is it doesn't have built in tab support.

44:58

So it

44:59

that's

45:00

just, like, kind of annoying. So you

45:02

just have a lot of windows.

45:05

And then it also has a few rough edges. Like, if

45:07

you do a command h, it will just

45:09

quit the full application instead

45:11

of hiding the

45:14

window. And I think each one shows up as a separate instance

45:16

in, like, when you open a new window, which is

45:18

just not really how Mac apps are supposed to work.

45:20

Yeah. That's so funny. Yeah. Yeah.

45:24

I've never heard of this one before.

45:26

Have you have you checked out warp at

45:28

all? Yeah. I tried it, but

45:30

it was, like, early in

45:32

the beta. It's been a while. I really should

45:34

try it. Yeah. What I really think somebody

45:36

should do is, like, a terminal with,

45:40

like, a co pilot style autocomplete built

45:42

in. Yes. Like, that should,

45:44

like, drive the entire terminal.

45:46

Because most of

45:48

the time, you're

45:48

you're running the same commands over and over. And

45:50

even the commands you do wanna run that are like one

45:52

off have have some stack overflow

45:56

post somewhere -- Yeah. -- that tells you exactly what you wanna do. And it

45:58

should just -- like, you

45:59

start writing some of the command in, like,

46:02

the the language model

46:04

just finishes it for you. That's

46:06

gotta happen. I I keep thinking that. Like,

46:08

I I have to install ESLINT and

46:10

these four other ESLINT

46:12

plugins altogether. Like, can

46:14

somebody just detect that

46:16

up? That's exactly what I'm trying to do. And I

46:18

guarantee it probably even

46:20

already works inside of, like, a shell file, inside

46:22

of VS Code. So why can't

46:24

we get that in the terminal? Like,

46:26

A really hacky thing I think would be funny

46:28

that somebody

46:30

could probably do would be, like, just reversing

46:32

just for, like, as a short term hack to

46:34

test it, just, like, reverse engineer

46:37

Copilot's API and then just, like, stick it

46:40

into, like, some, like, fish

46:42

integration and just have

46:44

it, like -- and just to,

46:46

like, see -- does it produce useful autocomplete? Is it, like, even the

46:48

right product there? And the answer

46:50

probably is yes.

46:52

Because, like, it often --

46:54

if I do, like, if I'm, like, writing a

46:56

markdown doc -- like, probably docs for

46:58

bun, and it has, like, a shell script in

47:00

the, like, code fence,

47:03

then, like, Copilot will complete the shell

47:05

script correctly. So Warp has this

47:08

thing called commands dot

47:10

dev, where you write what you wanna do in

47:12

plain English and it will use AI to

47:14

try to figure it out, but it's

47:16

not quite there. It's not there.

47:18

So, like, I think if we

47:20

could just pipe Copilot directly

47:22

into the terminal, that would be sick. Just pipe

47:24

it into my brain.

47:26

What about -- what would you -- if

47:28

you had to start coding today, like or maybe you're talking to somebody who's

47:31

learning web programming today,

47:34

What would you tell them to learn? Probably

47:37

JavaScript. Probably not TypeScript, actually. Probably

47:38

just JavaScript. Well -- Yeah. --

47:40

maybe TypeScript. I don't

47:41

know. I think I

47:44

think maybe TypeScript with strict mode off and just, like,

47:46

the loose --

47:49

like, the config

47:51

that has the least amount

47:53

of red squiggles.

47:56

Yeah. And just like

47:58

helping you, not making you cry. And

48:01

use tooling that has as much autocomplete

48:03

as possible because I think a lot of

48:05

what's frustrating

48:08

is something is broken and it's really hard to figure out why. And

48:10

I think just having really good auto complete

48:12

helps a lot. Yeah.

48:16

I think, like, editor tooling matters almost as much as programming

48:18

language or maybe more

48:20

for for developer productivity

48:23

and, like, happiness. I also think it's it's important to

48:25

remember that, like, at the end of the day, it's all

48:27

just code. So it's okay to, like,

48:30

not understand everything

48:32

that's happening and that

48:35

like some things are like

48:37

you can sort of treat it

48:39

as a black box like,

48:42

you you can't do that in all cases,

48:44

but you can

48:46

you can -- it's okay to, like,

48:48

just it's important to not feel overwhelmed.

48:50

because it's just impossible to know everything. It's okay

48:52

to not know things. Awesome. So the next

48:55

section is we have two things.

48:57

You have a sick pick and

49:00

a shameless plug. A sick pick is something that you

49:02

pick that is sick. It could be

49:04

literally anything keyboard, a mouse, a chocolate

49:07

bar, a hat, a desktop app -- you

49:09

name it. And then a a shameless plug

49:12

is anything you wanna plug to the

49:14

audience. Shameless

49:16

plug: Oven

49:18

and Bun is hiring. We're hiring

49:20

we're hiring for two roles mostly

49:22

right now, Zig and c plus plus engineers.

49:25

And we also

49:27

need some help with writing

49:30

JavaScript Polyfills

49:32

for compatibility with node and

49:35

improving our test coverage. And that's mostly in TypeScript

49:37

and JavaScript. If

49:40

you're interested, please

49:42

email jobs at oven dot s h or

49:45

DM me on Twitter Jared

49:47

Sumner. Awesome. Super. Awesome.

49:49

Yeah. Cool. Hopefully, we can find somebody out

49:52

there. Yes. Did

49:54

you have a a sick pick? Let me

49:56

just look at, like, maybe there's, like,

49:59

a meme that somebody

49:59

sent in a while

50:01

ago. Maybe this meme, I don't

50:03

know. So

50:06

you're your sick pick is

50:08

the hilarious meme of the Spider

50:10

Man meme -- Bun, you know, and

50:12

Node all pointing at each other. That's

50:16

funny. Where's Cloudflare

50:18

Workers being thrown in there?

50:20

I guess when they made that, they

50:22

didn't get the results at

50:25

that point. Yeah. Cool. Well, awesome.

50:27

Thank you so much for coming on. Really appreciate all

50:29

your time. I know you're super

50:31

busy building this entire thing, but I

50:33

appreciate you coming on to explain

50:35

to audience what it is and how it works and all that.

50:38

It's always very interesting. Thanks for

50:40

having me. Yeah. Thanks so much.

50:42

Yeah. Alright. Peace.

50:44

Head on over to syntax dot f

50:46

m for a full archive of all of

50:48

our shows.

50:50

And don't forget to subscribe in your podcast

50:52

player or drop a review if you like this show.
