ChatGPT prompt engineering mastermind: debate preparation prompt, travel guide, business research

Released Wednesday, 3rd May 2023

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.

0:15

Who was the person who submitted the

0:17

debater? The

0:20

debater was Michael. Is

0:22

Michael here? Michael, try

0:24

raising your hand yeah. I just

0:26

wanted to put any topic in that I

0:29

don't know it's totally fine. I've actually run

0:31

across someone there may

0:33

have been a high school or a college student, but who is

0:35

using ChatGPT for debate

0:38

practice. So here's my

0:40

side. Give me the other side, tell

0:42

me the problems so then I can

0:44

come up with how to defend against them. So I think

0:46

this is a great prompt. What

0:48

topic do you want to have ChatGPT

0:50

do some debate prep for? Could

0:55

be anything. If a prompt

0:58

the topic would be does

1:00

prompt engineering make sense for

1:02

large language models? Okay,

1:05

cool. I also saw someone

1:07

had pineapple pizza, yes or no in the chat.

1:09

So I, I appreciate the absurdity everyone

1:11

is putting down. Does prompt engineering

1:14

make sense? Can't spell make

1:16

sense for large language

1:20

models. And I

1:22

got some feedback that there are some blind

1:26

listeners in the audience so I just wanna read

1:28

off the prompt really quick. Your task is to act

1:30

as a debater one. In debater two, you'll

1:32

be given a topic debater one will have to persuade

1:34

Debater two and vice versa. You have to

1:36

present valid arguments and reflect on

1:38

them. Notice the use of reflection.

1:41

Your goal is to achieve a bias-free understanding

1:43

of the topic and an agreement. Cool.
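
For listeners who want to wire this up in code rather than in the ChatGPT window, here is a minimal sketch of that debate-prep prompt sent through the API. The model name, client setup, and topic below are assumptions for illustration; the session itself used the ChatGPT 3.5 web interface.

```python
# Hypothetical sketch: the debate-prep prompt from the episode, sent through the
# openai Python package (assumed version 1.x or later).
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

DEBATE_PROMPT = (
    "Your task is to act as Debater One and Debater Two. You will be given a "
    "topic. Debater One will have to persuade Debater Two and vice versa. You "
    "have to present valid arguments and reflect on them. Your goal is to "
    "achieve a bias-free understanding of the topic and an agreement."
)

topic = "Does prompt engineering make sense for large language models?"

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # assumption; use whatever model you have access to
    messages=[
        {"role": "system", "content": DEBATE_PROMPT},
        {"role": "user", "content": f"Topic: {topic}"},
    ],
)
print(response.choices[0].message.content)
```
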

1:46

Okay. So it started,

1:49

it gave a somewhat

1:52

meandering response.

1:55

It's particularly talking about prompt engineering mitigates

1:57

ethical concerns, and that doesn't really make

2:00

a lot of sense. Then Debater two

2:03

comes back with actually that's,

2:05

nope, that doesn't make any sense either. Prompt engineering

2:07

can lead to overfitting also. No, that's

2:10

not something that happens in this stage.

2:12

Is there anything useful out of this? Okay,

2:17

this point right here, prompt engineering is

2:20

not intended to replace more advanced AI

2:22

tech. It's to complement it,

2:24

you how this, oh, by using a combination

2:27

of prompt engineering and other techniques, we can

2:29

develop more powerful and robust AI

2:31

systems. That's a good point.

2:33

It's pretty much the only good point. But hey

2:36

so one thing I would say

2:38

is I would probably,

2:40

actually I don't need to rerun this in a

2:42

separate one. Please present

2:45

a brand new set of arguments

2:49

focusing on what's

2:51

the right way to say this? Avoiding discussions

2:55

of overfitting and

2:58

what was the other one that it was just BSing

3:00

us on? Yeah, mini and

3:03

that and prompt

3:06

engineering reducing

3:09

complexity is, okay. So you know what? I'll just say avoid

3:12

discussions over fitting and focusing

3:14

on how do I say this?

3:18

Best uses of a

3:21

prompt engineers skills.

3:25

See if that comes up with anything useful. Jason

3:30

is asking, you're in 3.5? Yes,

3:32

I am in 3.5 because if I use four,

3:35

this will be a very short mastermind cuz

3:37

we'll only be able to do I think it's 20

3:39

now or maybe 25

3:42

prompts. So yeah, I'm using

3:44

3.5. I'm also using

3:46

3.5 because that's what most people

3:48

have access to in the API. Obviously, I

3:51

and other people have access to 4.0

3:54

just in the chat, but not

3:56

in the API. That's much rarer. Okay.

3:58

Wait where did I, oh, there it is.

4:01

Okay. So better?

4:06

A little better. Okay. What is, what does Debater

4:08

two say? Why it's bad Debater

4:12

two point is just that it's not

4:15

better than building a better system. Okay.

4:19

I feel like that's getting to

4:21

your point, Michael, of how do we find the good

4:23

and find the bad in a debate

4:25

prep tool. Let's see. Oh

4:28

yeah, the travel guide one. Okay. This is a

4:30

long one. So I am not going to read

4:33

all of this, but Zeeshan,

4:36

let's see if I can find you in the chat and unmute

4:39

you. Okay, Zeeshan,

4:41

you should be unmuted. Hey Greg,

4:43

how you doing? Good. Nice

4:45

to hear from you again. Alright. So

4:48

we have a ton of stuff. This

4:50

is having

4:52

never visited Turkey. I have

4:55

no idea if any of this is accurate, but

4:57

it at least looks good.

5:00

Yeah. So this is actually based on something

5:02

I was actually trying to do. Cause I'm actually

5:04

gonna be doing this trip. Oh, okay. Yeah.

5:07

So I thought when I had put

5:09

it in, it did a pretty good job. But

5:12

the one thing I was hoping to get more from

5:14

it was like specific

5:16

things. So for example, if you look at day two, it says, do

5:18

a boat tour. Who, is there a

5:21

company they can recommend, or or

5:24

a specific kind of organization to go through.

5:27

Or for example, hiking in the nearby mountains.

5:29

That's not as descriptive as I would hope,

5:31

especially if somebody's going to a foreign country.

5:33

Like I need, like it spelled out

5:36

as simply as possible. So

5:38

I'm not sure what else I could do to maybe to

5:41

refine this. Yeah. Okay. So

5:43

let me pause for one second. Amanda said,

5:45

could you please throw the prompt into

5:48

the chat and I will do that.

5:51

It is in there. Okay, so we

5:54

want, it sounds like what you're looking for partly

5:57

GPT cannot provide, which is tell me

5:59

exact company names, give me their website,

6:02

gimme their phone number, whatever. But

6:04

if it could at least tell us some details of,

6:06

where to go hiking, what to do, hiking,

6:09

that kind of thing. Sorry, one quick

6:11

question. Why wouldn't it be able to tell me specific

6:13

like companies? Because if I were to go on like Yelp

6:16

or like some of these review sites or whatever, like

6:18

you can pull it up through Google. So why wouldn't

6:20

ChatGPT be able to do that? Because

6:23

ChatGPT isn't connected to Google.

6:26

It mostly, as far as I know, did not

6:28

do indexing on deep links. So

6:30

like Yelp, it would not have indexed Wikipedia,

6:32

it would've indexed, oh if we were using

6:35

Bard, which I'm by no means promoting,

6:37

but just as an example, if we were using Bard,

6:39

it would be able to do that. Sorry.

6:42

And same thing for Microsoft

6:45

Bing. Bing in theory should be able to

6:47

do the same thing, and that's actually

6:49

something I've been wanting to experiment with, probably

6:51

won't today. Let me start the 10 minute

6:54

timer before I get distracted. Okay.

6:57

So let's work in

7:00

some requests for detail here.

7:02

You're an experienced travel guide specializing in

7:05

Turkey. I need you to plan a detailed

7:07

itinerary. First: five days are here.

7:09

Second. Okay. Is

7:11

there, yeah, this is what I'm looking for. You

7:13

need to plan every aspect of my trip, including travel

7:16

from hotel to destinations and back again.

7:18

You'll need to, you'll also

7:20

need to include restaurants recommended to eat and

7:24

for activities. We are interested in these things. Okay, so

7:27

then that's perfect right there. For

7:30

all activities, please

7:33

provide, how many was it

7:35

providing? It's providing. Looks

7:38

like one activity

7:40

per day. Two, sometimes. Okay. So

7:43

for all activities, please provide at

7:46

least three concrete details.

7:48

And I don't know if this language is gonna work, that's why I'm running

7:50

it in another screen. About where

7:54

the where to go for the

7:57

activity and

8:00

how to do it, how

8:02

to, what's the right way to say this? Where to

8:04

go for it and let's just say what

8:06

to bring. That'll be a little bit easier. Or

8:09

how to prepare for a ma or, yes,

8:11

that's what I'm trying to say. Thank you.

8:13

All right. So let's just run this and see how it does.
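
As a rough sketch of the same iterate-on-the-prompt loop outside the browser, assuming the openai package and a placeholder model name; the prompt text is paraphrased from what was typed on screen, not copied verbatim.

```python
# Sketch of the refine-and-rerun loop from this segment: send the itinerary
# prompt, keep the conversation, then add the extra detail request as a new turn.
from openai import OpenAI

client = OpenAI()
MODEL = "gpt-3.5-turbo"  # assumption

messages = [{
    "role": "user",
    "content": (
        "You are an experienced travel guide specializing in Turkey. Plan a "
        "detailed five-day itinerary, including travel from the hotel to each "
        "destination and back, and recommended restaurants. For all activities, "
        "please provide at least three concrete details about where to go and "
        "how to prepare."
    ),
}]

first = client.chat.completions.create(model=MODEL, messages=messages)
itinerary = first.choices[0].message.content
print(itinerary)

# Refine in the same conversation, the way it was done live on the call.
messages += [
    {"role": "assistant", "content": itinerary},
    {"role": "user", "content": "For each day, list at least four activities."},
]
second = client.chat.completions.create(model=MODEL, messages=messages)
print(second.choices[0].message.content)
```
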

8:16

Okay. Yeah. So it's, gimme more detail. It's like providing names

8:18

of places. Yeah. So that's what's

8:20

helpful. I don't know how to pronounce this, but Kas

8:23

Diving, book a trip with

8:25

them. Go to a specific

8:27

beach for swimming. It's

8:29

not including anything

8:31

like you're gonna need sunscreen

8:34

and a towel. And again, I don't know what else, but

8:36

for these things, but, oh, okay. Wear

8:38

comfortable shoes and clothes for the hike that's

8:41

getting there. All

8:43

right. So by the way, the reason I'm, I

8:45

have this open in two different windows is twofold.

8:48

Number one, I want you to be able to see

8:50

the sort of before and after, and I'm just gonna keep bouncing

8:52

back and forth. But number two, if

8:54

I ask it, okay,

8:56

now tell me blank, that

8:59

is a good way to get the information, but

9:01

it's a bad way to make a prompt because

9:03

then you have to try and figure out, how do I mush these two

9:06

prompts of, do this stuff now, tell me

9:08

more together. That said, prompt

9:10

chaining is a thing that I am doing

9:12

a bunch of exploration with and frankly,

9:14

it's come up a bunch on Reddit posts I

9:17

think it was Lee that I

9:19

interviewed who was using prompt

9:21

chaining to do "is

9:23

this a question that my AI can answer?"

9:25

If yes, answer it. But if no, don't

9:28

try and answer it cuz then the AI just starts

9:30

going crazy and instead just say "I'm

9:32

sorry Dave, that's not a question I can answer. Sorry."
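
That guard pattern is easy to sketch as two chained calls: one call classifies whether the question is in scope, and only then does a second call try to answer. The scope wording, model name, and helper below are illustrative assumptions, not Lee's actual prompts.

```python
# Minimal prompt-chaining sketch: classify first, then answer or refuse.
from openai import OpenAI

client = OpenAI()
MODEL = "gpt-3.5-turbo"  # assumption

def ask(prompt: str) -> str:
    resp = client.chat.completions.create(
        model=MODEL, messages=[{"role": "user", "content": prompt}]
    )
    return resp.choices[0].message.content.strip()

def guarded_answer(question: str, scope: str) -> str:
    # Step 1: a classification prompt that only returns YES or NO.
    verdict = ask(
        f"You answer questions about {scope} and nothing else. "
        f"Can the following question be answered within that scope? "
        f"Reply with only YES or NO.\n\nQuestion: {question}"
    )
    if verdict.upper().startswith("YES"):
        # Step 2: only now attempt the real answer.
        return ask(question)
    return "I'm sorry, that's not a question I can answer."

print(guarded_answer("What's your refund policy?", scope="an online shoe store"))
```
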

9:35

So that's a helpful error checking thing. I

9:38

just saw in the chat closing

9:41

out the debate thing one keyword that may help is to

9:43

specify the debaters need to take opposite

9:45

and opposing viewpoints. Thank

9:47

you, Jason. I think it was doing that,

9:49

but I think it might have helped to call that

9:51

out. Good point. And then

9:54

Eric said by putting in location

9:56

you're staying and the activity, you

9:59

can then ask for how much is the

10:01

transport and some of

10:03

that stuff. That is a good point. I

10:06

want to focus a little more actually. No.

10:08

Zeeshan, this is your prompt. What would

10:10

you like to focus more on? Would you like more tell me

10:12

the details of, you need to bring, I don't

10:14

know, sunscreen, or are you

10:16

feeling more like you want some more

10:19

breadth: give me more ideas per day?

10:21

I think probably more ideas per day. I'm not too worried

10:23

about what to bring. The

10:25

more more idea to be helpful. One

10:28

thing I wanna show, I'm not gonna use it

10:30

because it doesn't do what I am

10:32

wanting for you all to be able to see the sort of changes,

10:35

but if you click this edit button, you

10:37

can make changes and let's just arbitrarily

10:39

say no, no day should go past now 7:00

10:42

PM and then it'll give

10:44

this little threading, sorry, threading

10:46

interface right here so I can use this

10:48

to go back to the previous version of the prompt.

10:50

The problem is that for you all on the screen

10:53

share, you then can't see it

10:55

at the same time. Like you can only see, here's

10:57

this version where it's 7:00 PM Let me scroll

10:59

up now it's 5:00 PM and

11:01

I want you to be able to see the progression over time.

11:04

And so does that save as one thing in the,

11:07

the left window? Paint it. Save. Okay. Yeah.

11:09

Yeah, it's I'm keeping it out of the view,

11:11

but it's still just the first turkey

11:14

itinerary. This other one is this window.

11:16

Sorry. Yeah, this window right here.

11:18

Gotcha. All right, so we're

11:20

gonna come back over here, going to

11:22

paste the exact same prompt, make

11:25

sure that I actually got it correct. All

11:29

right. And you said

11:31

more ideas for activities. Okay. So

11:34

for all, so

11:36

for every day list,

11:39

at least five

11:42

let's go with four cuz it'll probably break it up evenly

11:44

activities and then

11:46

provide the concrete details about them and blah, blah, blah,

11:48

blah, blah. All

11:51

right. Ooh, it's not listening.

11:55

Oh. Actually no, I take it back. It is listening. Okay. So

11:59

I'm gonna ignore the checkout of the hotels, but it's giving me

12:02

scenic, hike, stunning views of

12:04

the Mediterranean. That's not really

12:06

an activity, but Okay. Stop for lunch.

12:09

In the afternoon, explore the ancient ruins of

12:11

the Lycian city of Phellos.

12:14

I'm hoping I'm pronouncing that right. And then recommended

12:16

restaurant for dinner. Eh, it's,

12:18

that's not quite enough, but we're getting

12:20

there. So

12:23

I think let's

12:26

open another tab over here so

12:29

we can just keep bouncing back and forth. And

12:31

that's not the updated version. I need the updated

12:33

version. Updated

12:36

version is here. And

12:39

three concrete details about it. These

12:42

details for

12:45

each activity should

12:48

be at least two sentences. So

12:51

length guidelines are a thing that can help

12:54

sometimes it backfires it,

12:57

like Yeah, exactly the way I was worried

12:59

it would, yeah. So what it's doing is

13:02

it's taking the, at least two sentences

13:04

to be the day instead

13:07

of each activity. So

13:09

let's I'm not even gonna keep this visible.

13:11

Let's just try that one more time and

13:14

rephrase this as let's

13:18

see. For

13:20

each day, list at least four activities

13:25

describing them in

13:28

fine detail. Of

13:30

at least two sentences

13:33

per activity, then

13:37

oops. Then please provide at

13:39

least three concrete details about it. Blah, blah.

13:41

All right, let's see if that works. Ooh,

13:46

it's not liking that. Nope, it's not liking that

13:48

at all. Okay, so

13:51

I'm gonna try a bit of a different thing here. We're

13:53

gonna try some shot prompting. So just as

13:56

a reminder, shot prompting is

13:58

explaining what

14:00

you want by providing

14:02

some number of examples. To be clear,

14:05

zero examples is a way

14:07

of doing, it's called zero shot prompting. But

14:09

in this case, I'm gonna go for one. And

14:12

I'm going to go back to

14:14

where I liked the output. So let's find

14:17

one that was on the longer side here.

14:21

Yeah, so let's do this one and

14:25

then we're gonna say example

14:27

output. And there's a bunch of ways of doing

14:29

the shot prompting. Totally.

14:31

Totally varies. But we're just gonna do

14:34

this day two,

14:37

that's, and by the way, including these

14:39

bullets, hopefully will cause it to start doing the bulleting

14:42

again. Cause that was really nice. And

14:44

then what was the other one?

14:48

No, that's the one I just copy and pasted. That

14:51

was good. And it doesn't actually

14:54

matter if, as much

14:56

if these things go together. Although

14:58

they are, because they're both in the same location.

15:03

Great. Alright. Now, how would you do zero shot prompting?

15:06

So zero shot is what we've actually been doing. We haven't

15:08

been providing an example.
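
Spelled out for listeners, the difference is just whether a worked example rides along in the prompt. A minimal sketch with placeholder text (not the actual itinerary from the screen):

```python
# Zero-shot vs. one-shot prompting, sketched with placeholder text.
instruction = (
    "For each day, list at least four activities, describing each one in at "
    "least two sentences."
)

# Zero-shot: the instruction alone, which is what had been tried so far.
zero_shot_prompt = instruction

# One-shot: the same instruction plus one example of the desired output shape,
# e.g. a day copied from an earlier response that happened to look right.
example_output = """\
Day 2:
- Morning boat tour from the harbor. Book the day before and bring sunscreen.
- Afternoon hike in the nearby hills. Wear comfortable shoes and carry water.
"""
one_shot_prompt = f"{instruction}\n\nExample output:\n{example_output}"

print(one_shot_prompt)
```
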

15:12

All right. Let's see if this works as intended

15:16

better. Okay. Still not getting

15:19

the sentence length. Interesting. So this

15:21

is something I've noticed with ChatGPT

15:23

and other things. It sometimes just

15:25

gets very fixated on not responding

15:27

the way that you want it to. So honestly,

15:30

I'm gonna have it I'm gonna make a new

15:33

tab. This is silly and weird,

15:35

but this I think will work. And

15:38

that's the 10 minute timer. All right, I'm gonna do one more

15:40

minute on this cause I wanna see if this actually works.

15:43

Rewrite these rewrite

15:46

this itinerary. Ooh,

15:49

can't spell that. All

15:51

right. Travel schedule with

15:55

three sentences per bullet.

16:00

Because honestly I just don't feel like actually doing

16:02

it. Ooh, wow. It refused.

16:04

Oh, that's so interesting. This is a great

16:06

example of when I'm gonna do edit because I don't actually

16:09

really for each bullet.

16:12

You gonna take that? Maybe not.

16:15

Oh no, it actually provided an example

16:17

of a company that I

16:20

could go with, so that's really helpful. Yeah,

16:22

I don't know if it's I don't know if it's real,

16:25

that, that would be a great example actually of

16:27

why you have to be careful with this kind of thing cuz hallucinations

16:30

are very common. But just

16:32

cuz I'm curious, yes, they exist.

16:35

I would check and make sure they still exist after the pandemic,

16:37

but they at least existed at one point, cool. Alright,

16:41

so since that was the timer, I'm gonna move on.

16:43

What I was trying to do was get

16:46

ChatGPT to write longer

16:49

so that I could then plug that in as my

16:51

one shot example, because what keeps

16:53

happening is I'm feeding it two sentences,

16:55

telling it that I want it to produce more than

16:57

two sentences and it's basically being

16:59

like, oh, I'm gonna listen to your example

17:02

more than your command: no. All

17:05

right. Let's see here. You're welcome.

17:08

Got a next one. I

17:10

only see one Eric, so I'm hoping this is

17:12

the right Eric. I'm unmuting you. If

17:15

you are not the Eric that just submitted this. Yeah.

17:18

Just CEO of a startup. Okay. Yeah. Yeah. Thanks

17:21

for hosting this, by the way. You are

17:23

welcome. I have a lot of fun doing

17:25

these. And I will put it in the

17:27

chat. It has requested. Hang on. There

17:29

we go. All right. So

17:32

here is your prompt and tell us

17:34

about what the output is that you were looking for.

17:37

Oh boy, that's some weird output. But

17:39

anyway, go ahead. I just gave an example if you

17:42

like, they have five, or actually six different ideas

17:44

there. And basically I'm just interested in business research.

17:46

Oh, got it. Okay. In general.

17:49

And I just think because the amount of time you can

17:51

research different ideas using Google in the past or you

17:53

would really need to aggregate all the data points.

17:56

And like you said, ChatGPT has a lot of

17:58

hallucinations. You can't take it a hundred percent,

18:00

but the amount of like

18:03

research you can do compared to what you were

18:05

able to do four months ago, five months ago is

18:07

just like amazing, right? You can really get ideas, experiment

18:09

with them, look at competitors in like seconds.

18:12

So I just wanted to bring that up. I didn't know if anybody

18:14

else is, experimenting with those kind of ideas

18:16

or have played around with it. But the

18:19

the business idea, business solutions analyzing

18:21

competitors all that stuff I think is super interesting

18:24

and I think it's just gonna get further along.

18:26

This is just like the first we're seeing.

18:29

Oh yeah. I've already research

18:31

and like ahead business research. Sorry.

18:34

Yeah, I've already seen I don't remember the

18:36

name and actually, I'm not sure I'm allowed to say the name,

18:38

but I have seen someone who was

18:40

working on a plugin that pulled

18:43

in SEMrush, which is a social

18:45

media marketing optimization

18:48

tool (mouthful) into

18:50

ChatGPT as an official plugin, as

18:52

a few others as well. So yeah,

18:55

there's definitely a lot of interest in this.

18:57

Okay, so rereading, actually these

19:00

are actually different prompts. Okay. It was

19:02

just like if you played it through from top to bottom, just to see how,

19:04

how ChatGPT would do it. Got it. Yeah.

19:06

Okay. So let me just throw

19:08

in one at a time and

19:12

then, okay. This is

19:14

interesting. Not, I

19:17

don't know enough about this area.

19:19

So would you say this is

19:21

a reasonable response or is this kind of a

19:24

reason? Yeah, if you want to, just for like fun, we can just

19:26

make up an idea. But I was just, just

19:28

Sure. Bring, this is just something I thought of, but whatever you

19:30

wanna do. Yeah. Or we can roll with this, but yeah,

19:32

that sounds, that actually seems pretty like

19:34

on point. Okay, cool. So you

19:36

wanna go onto the second prompt then the

19:39

customer segments? I think it was, yeah,

19:44

there we go. Oh, and

19:46

I see in the chat someone saying you don't need to

19:48

read it all loud. The reason I'm reading it

19:50

aloud again, is for the people who are

19:52

listening because they're not able

19:54

to see the screen either because they are blind

19:57

or they are driving or whatever the

19:59

case may be. Sorry if it slows things down a little

20:01

bit, but I wanna try and make it accessible to everybody.

20:05

So basically the idea is, how can you bring automation to

20:07

smaller businesses, right? Small medium enterprises.

20:10

So now we're saying, business to customers or business

20:12

to business. And I just think it's just

20:14

spits out 10 ideas for both, right? Which

20:16

I think is obviously you can think some of these are things

20:18

in your own, but yeah, just the speed

20:20

of ideas is just what blows me away.

20:23

So I think what tends to be more interesting

20:25

about this kind of thing is I don't know,

20:27

I'm just gonna pick out law firms arbitrarily

20:30

cuz my dad's a lawyer. Tell me more

20:32

about the possibilities for

20:36

law firms and then let

20:38

it run. And

20:41

I'm not a lawyer so I, I can't be like, seven's

20:44

totally wrong, but like being able

20:46

to do this iterative narrow

20:49

down now, generate a bunch of ideas,

20:51

things like that tends to help a lot.

20:53

And this is. At least from what my dad

20:55

tells me, attorneys on the call feel

20:58

free to pipe up and say, this is junk. But yeah.

21:00

It at least seems reasonable. Case management

21:02

is one suggestion. Cuz there's

21:05

just a lot of tracking deadlines and generating status

21:07

updates. Document management

21:09

is basically the same thing. Billing is the same

21:11

thing. Yeah. A lot of these actually make a

21:13

lot of sense. What was the third chunk?

21:16

Yeah, so your third one was exactly where

21:19

I was just thinking do.

21:22

So the third one is, what kinds of

21:24

products and services do my competitors offer

21:27

and how do they compare to my services?

21:29

Now, obviously that in fact actually it

21:31

should say, yeah, as an AI language

21:33

model, I don't have access to specific information

21:36

about your competitors or your product slash service,

21:39

but, you can, you could put in something

21:42

I guess Eric is the, is there an example

21:44

you could say of what you might hypothetically

21:46

offer? So give it a little more context. If you go

21:48

back to the the questions or the prompts

21:50

I posted, if we go with the same

21:53

like business idea the, with the

21:55

total addressable market. And maybe that then

21:57

follow up with the question of competitors. So what's the total

21:59

addressable market for an automation company targeting

22:02

SMEs in upstate New York? What services,

22:04

I should say, are offered by

22:07

those companies? We'll just say those

22:09

is not clear enough by those

22:11

service providers. Hopefully

22:15

that'll be clear enough that I'm talking about the

22:17

companies, not the SMEs

22:20

the clients basically. All

22:23

right. So yeah it still did the I don't actually have access

22:25

to the internet, so I don't know, but

22:28

some possible services are workflow automation,

22:30

data management, customer relationship management, all

22:32

that stuff. I'm not a business major,

22:34

but this sounds right, Eric.

22:37

Yeah, I was just basically giving you a breakdown,

22:39

but yeah but it's not doing the

22:41

calculation, which is what you actually wanted.

22:43

And that's, I don't think we're gonna get there with ChatGPT.

22:46

You definitely won't get accurate numbers. You'll,

22:48

you might get a number, but it's gonna be like 37

22:52

million. Yeah. But Exactly. But plugins,

22:54

like you were talking about with access to software

22:57

with that do have that data combined

22:59

with like ideas and computer

23:01

you can talk to is just like blowing

23:04

my mind. What's gonna come. So that

23:07

was the idea with these prompts. Got

23:37

another minute or two left on

23:39

this. I could see

23:41

plugging in a couple of the other things

23:44

that you had asked about, but is there

23:46

another direction that you would be more interested to

23:48

take this, Eric? No, that was

23:50

pretty much it. Okay. Then

23:52

I'm actually curious to plug in

23:55

these two things here.

23:58

So please give me some ideas for revenue

24:00

models and please give me the cost structure

24:02

of such a company and we'll see.

24:04

We're not giving it a whole lot of clarity, but

24:07

we'll see if it comes up with, yeah.

24:10

Okay, cool. So subscription

24:12

models, pay per use, commission, consulting,

24:16

performance fees. These all make sense.

24:18

They're pretty generic. Cost

24:21

structure, again, these make sense to

24:23

me, but I guess Eric sanity check as the

24:26

at least more expert than I am. I'm

24:28

not that much more of an expert, but it's, for the most

24:30

part it sounds about right. It

24:32

looks good. Cool. Yeah, it looks good and

24:34

I'm coming from the startup world. I'm yeah

24:37

that's what I know about building a business, but I don't

24:39

know about this industry, okay. Cool. All

24:43

right. Let's see. I'm

24:46

gonna guess I know who this is. Is this

24:48

Yulee? Yes, it is.

24:51

I thought I recognized this prompt. Let me unmute

24:53

you. Okay. This is an

24:55

enormous prompt. So let's just

24:57

open a new one. I'm going to

24:59

paste this in, and as you can see, it is enormous.

25:02

I'm not gonna read all of it. Yeah. Basically.

25:05

Yeah. And this is a really cool idea.

25:07

I interviewed Lee sometime

25:09

this week for the podcast. What

25:11

I did is I took the same strategy, but

25:13

this one is for any type of business

25:16

content. So what it does is it creates

25:18

a series of questions for the

25:20

business owner or the person with a product,

25:23

idea, service, whatever, anyone

25:25

who needs to generate content. And what it does

25:27

is it creates a series of questions.

25:29

They then give the answer, then it creates a

25:31

prompt for them which

25:33

they can then use to generate

25:36

anything. But I just felt

25:38

like it needed some tweaking because

25:41

I don't know, I just think it's too long and

25:44

maybe it has information that is

25:46

unnecessary. So one thought is

25:48

just 11 questions is a

25:50

lot. At minimum, just asking them

25:52

in groups of five or six would help, but.

25:58

I wonder if the type of intelligence

26:00

thing might be a

26:03

little much,

26:06

but I see what you're getting at. Okay. Let me throw a couple

26:08

of examples in. One nice thing about meta

26:10

prompting like this, and, sorry,

26:12

let me back up. What Lee

26:15

is doing here is crafting

26:17

a prompt that will then generate

26:19

a prompt that you can then run

26:21

to generate the content. So

26:24

that's called meta prompting. It's

26:27

also called prompt generating prompts, but let's not go there.
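
In code, meta prompting is just two calls: the first generates a prompt, the second runs it, ideally in a fresh conversation so no earlier context leaks in. Everything below is an abbreviated, hypothetical stand-in for Lee's much longer prompt.

```python
# Meta prompting sketch: call 1 writes a prompt, call 2 runs that prompt.
from openai import OpenAI

client = OpenAI()
MODEL = "gpt-3.5-turbo"  # assumption

def ask(messages):
    resp = client.chat.completions.create(model=MODEL, messages=messages)
    return resp.choices[0].message.content

# Call 1: the prompt-generating prompt (heavily abbreviated placeholder).
generated_prompt = ask([{
    "role": "user",
    "content": (
        "Generate the perfect ChatGPT prompt for creating marketing content "
        "for my project. Product: an educational game that teaches recorded "
        "human history. Audience: school kids. Please generate the prompt now."
    ),
}])

# Call 2: run the generated prompt with a brand-new message list, so none of
# the earlier context comes along for the ride.
content = ask([{"role": "user", "content": generated_prompt}])
print(content)
```
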

26:30

And basically the trick with

26:32

this is you're trying to

26:34

not only make sure the prompt that you

26:36

get built is good, but then that the output of the

26:38

prompt is good. So there's some iterative

26:41

testing here. One nice thing though

26:43

is you don't have to answer all its questions even though

26:45

they are there. So I'm just gonna say

26:47

I don't know, Lee what product or idea

26:49

do you want to do and what target audience? And we'll just not

26:51

answer the rest. Oh, just

26:54

God, I don't know. Just say an educational

26:56

game that teaches all

26:58

of recorded human history

27:01

and we'll do obviously that's gonna be

27:03

school kids. We're

27:05

just gonna leave it at that. Is

27:08

there anything else that we should throw in here? Oh, yes,

27:10

actually, number six. Somebody in

27:12

the chat throw out a famous

27:14

person. I don't really even care who, just

27:16

a famous person so that we can say, talk

27:19

in the style of, I don't know,

27:21

Arnold Schwarzenegger or yeah,

27:23

I don't know. Shakespeare. That's a

27:25

good one. All right. Gonna be rhyming couplets

27:28

of assuming I can spell Shakespeare right, whatever. It'll

27:30

recognize it even if I did mangle it. All

27:33

right. Ooh, it's still

27:35

giving me those questions. Generate, anyway,

27:37

let's see if it does it. All

27:40

right. Great. So it's

27:42

doing the keywords, it's

27:45

not super

27:47

pulling in anything Shakespeare

27:50

related, although it is focusing on, Nope,

27:53

actually it's not. Okay. Optimize your content.

27:56

Okay. So then let's add

27:58

in I can go here

28:01

and reply: sophistication

28:05

going to be elementary

28:08

school age. I don't actually know,

28:10

so I'm not gonna answer that. Multiple intelligence. We're

28:12

gonna say music and

28:15

goal is, What

28:17

is the goal of the educational game? Teach specific

28:19

historical events or concepts or to encourage

28:21

general knowledge about history. Let's go

28:23

with general knowledge and

28:28

let's see if we run that.

28:31

Generate. Anyway, that's

28:35

getting better. Okay. So it's it's

28:37

still mostly just history, but there is music

28:40

and history, making history

28:42

come alive. That's,

28:45

I think, pretty much it though. And then here

28:47

it should give you your prompt. There

28:49

it is. Okay. Okay, cool.

28:52

So that's really what I wanted. I was

28:54

eventually just gonna be like, all right don't bother giving me

28:56

the SEO. Just give me the prompt.

28:58

But it worked. Okay,

29:00

so now let's run this

29:04

and these

29:08

seem, speaking as a different kind of educator,

29:10

these seem pretty good. Obviously age-appropriate

29:12

language is something you need, but incorporating

29:15

music and sound effects since we said music

29:18

is the way these kids learn. Talking

29:20

about that interactivity is a good

29:22

general thing. Visuals,

29:25

that's interesting. It's it's repeating

29:27

itself. Use visuals to enhance learning, make it

29:29

visually appealing. In fact,

29:32

actually, I think there was another visual thing too. It's

29:34

weird because it's creating instructions

29:36

for someone on how to put together

29:38

not the content for their website where

29:41

they have a game or a product, but it's

29:43

creating things that they

29:45

will need to create the product. Oh,

29:47

I see what you're saying. Okay. So so

29:51

is design prompt window here? Yeah.

29:53

So let me tweak this and

29:56

then I have an idea for how to do

29:58

that. So instead of saying, can you help

30:00

me create, just tell it, create content

30:03

for the game that is

30:05

user-centered, blah, blah, blah, blah, blah, blah, blah.

30:08

Visuals is fine. Okay.

30:13

And then it should also say, making it fun.

30:16

Make sure that the content

30:18

includes music and

30:21

interactive learn

30:23

learning. And then just

30:25

to add a little spice on this

30:28

I'm gonna say think step by step.

30:30

I'm not sure if that's gonna help here,

30:33

but I'm hoping. Nope.

30:35

Didn't work. My hope was that

30:38

telling it think step by step would get it out of

30:40

the tell you the stuff to

30:42

do and get it into the

30:44

do the stuff did not work. So

30:46

let's just try taking that out. I don't think it actually

30:48

is gonna make a difference though. Think step by step

30:50

is mostly a logic or math tool.
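
For context, "think step by step" is the classic chain-of-thought nudge, and it earns its keep on reasoning problems rather than content generation, which is why it did nothing here. A toy illustration, with a made-up word problem:

```python
# Toy illustration of where "think step by step" tends to help: arithmetic and
# logic word problems, not "write me some content" requests.
question = (
    "A tour costs 40 euros per person, and a group of 7 gets a 10% discount "
    "on the total. What does the group pay?"
)

plain_prompt = question
cot_prompt = question + "\n\nLet's think step by step."

# The reasoning the nudge encourages the model to write out:
# 7 * 40 = 280 euros; 10% of 280 is 28; 280 - 28 = 252 euros.
for name, prompt in (("plain", plain_prompt), ("chain-of-thought", cot_prompt)):
    print(f"--- {name} ---\n{prompt}\n")
```
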

30:55

Yeah. Okay. So still

30:57

not doing it, it's still giving you instructions.

31:00

So one more tweak here and then I wanna

31:02

go back to the prompt. Yeah,

31:05

it's okay. Super involved if you wanted

31:07

to move on to someone else's, because

31:09

if you, cuz what I, when I did this the first

31:12

time and I went through the step by step because

31:14

I was answering each question one at a time, it

31:17

was then giving me the kind of content I needed, but

31:19

I just thought there's gotta be an easier or simpler

31:21

way to do this. Yeah.

31:24

I think part of the problem here is

31:27

whether or not it's intentionally doing this. I

31:29

think this is probably too

31:32

too, lemme say this a different way.

31:35

It's too much content that you're

31:37

asking it to generate because

31:39

this isn't, one sentence bullet point

31:41

questions. At least that's not the way it's reacting.

31:43

It's reacting of oh, you want an

31:46

entire lesson. And so I

31:48

think if we maybe try.

31:50

Oh yes. Sarah. Hang on, let me paste that prompt

31:52

in here. Yep, there you go. So

31:55

maybe if we rerun

31:57

this yet again and say

32:02

basically I want to give it Yeah. Create the

32:04

content of the game. 20

32:06

questions. Okay.

32:09

So least confusing. We're

32:11

confusing something here because this

32:14

is for this would be for someone who already created

32:16

the game and they just need the content for the website.

32:20

The content then promotes

32:22

it. So this is supposed to be general use

32:25

for anyone with website content.

32:28

Got it. Okay. So this isn't yeah. So we're

32:31

instructing it to, yeah. So I don't need it to,

32:34

it's too it, but

32:36

what it does is it generates, 10

32:38

questions, you answer

32:40

'em one at a time and then it creates a problem. And

32:42

now you run that prompt and you get all

32:45

of your website content for anything.

32:48

It's general use, but I just, yeah. So

32:50

I can definitely say right here, that's part

32:52

of the challenge cuz create the perfect content

32:55

for their project. Yeah. Just

32:57

thinking in terms of English structure,

32:59

that's the project's content,

33:02

not the project's marketing

33:05

content. Oh,

33:08

I doubt this tweak is gonna make the difference,

33:11

but let's just try it really quick. Okay.

33:13

Yeah, go. So all I'm doing is changing it to

33:15

create the marketing

33:17

for the project, and now it's gonna ask

33:20

all the same questions again. Of course. So lemme

33:22

just go back, where did I have those so I

33:24

don't have to retype a whole bunch of stuff? Yeah.

33:27

Because I wanted it to be relevant, to anyone

33:29

who has website content, general

33:33

use. All right. So

33:35

it's generating the SEO.

33:38

And then can

33:40

you ah, yeah. Can you

33:42

help me create engaging and fun content to market

33:44

my educational game for school kids

33:47

that teaches all of recorded history? Awesome. Okay.

33:50

Let's go run that in another

33:52

window. By the way, this is something

33:54

that came up on the last mastermind, but you

33:56

don't want to run you

33:58

don't wanna run this prompt in the same discussion,

34:01

number one, because it's going to confuse

34:03

your prompt generator. But number two, because

34:06

it's going to have a bunch of context that it's not supposed

34:08

to. The whole point is you want to run

34:10

this in a brand new, excuse

34:13

me, environment, so that it doesn't have any

34:15

of the other context to make sure this prompt works on its

34:17

own. Well,

34:22

it's certainly getting closer now. It's not doing lesson

34:24

plan kind of stuff. It's doing social

34:27

media content and, choose a writing style,

34:30

determine the best format of blog versus

34:32

video versus email, and here

34:35

are specific kinds of content. So

34:38

I think probably to do this, in a single

34:41

prompt obviously is challenging, but if

34:43

you were gonna do that, probably

34:45

the nuance would be giving

34:48

it more output constraint

34:51

in terms of not just for

34:53

this prompt over here, but that this prompt should

34:55

be asking for a specific to

34:58

generate a specific number of pieces

35:01

of content. Give me 10

35:03

Facebook posts, give me

35:05

20 tweets, give me five

35:07

ideas for YouTube videos, whatever.
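
Written out, an output constraint is just naming the exact artifacts and counts you expect, so the model can't fall back to a vague outline. The numbers and content types below are the ones mentioned here; the wrapper code is only illustrative.

```python
# Sketch of an output constraint: ask for specific counts of specific pieces.
constraints = {
    "Facebook posts": 10,
    "tweets": 20,
    "ideas for YouTube videos": 5,
}

constraint_clause = ", ".join(f"{n} {kind}" for kind, n in constraints.items())
prompt = (
    "Create engaging and fun content to market my educational history game "
    f"for school kids. Generate exactly: {constraint_clause}. "
    "Number each item and keep each one under 50 words."
)
print(prompt)
```
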

35:10

But that might be the thing to try and get this

35:12

to be a little more concrete and a little

35:14

less generic. Lemme just

35:16

throw that in really quick here just to see

35:20

generate 10 Facebook posts and 10 ideas

35:23

for YouTube videos. That's

35:25

the only tweak I'm making. Yeah.

35:29

So Lee, I think this is really what you were looking for,

35:31

right? Yeah. That's a lot better. Okay,

35:34

going back to the prompt

35:36

generating prompt way

35:39

up at the top, and yes, that is the

35:41

10 minutes. I'm gonna do another minute or two on

35:43

this and then we will wrap up. Let

35:45

me see. All right, so copy this,

35:47

make a new window, run

35:50

it again and

35:53

Okay. Generate the perfect

35:55

prompt to create. In

35:58

fact, actually I think it might work if I just put it right here.

36:00

So generate the perfect prompt for them to use

36:02

on ChatGPT to create 10

36:05

Facebook posts and 10

36:08

YouTube videos with

36:10

the perfect content for blah, blah.

36:13

And then we'll just throw in the same

36:15

answers before please

36:17

generate and

36:20

Nope. Doesn't really want to

36:22

do it. Generate anyway. Wow.

36:29

Yes, it worked. Okay,

36:32

so it is outputting a table.

36:35

Yeah. Of, I don't entirely

36:37

understand. Oh, this is basically the feedback

36:40

prompt matrix. Interesting. This is

36:42

basically the summary of the 11

36:44

questions, which we didn't really answer. So it's just

36:46

made up stuff, which is kind of cool

36:48

and kind of weird. And then here's a long set

36:51

of SEO keywords, and then

36:53

there are the Facebook prompts, sorry,

36:55

posts bring history to life

36:57

with our new educational game. Perfect for school, kids

36:59

of all ages. Teach history in a whole new

37:01

way with our innovative educational software

37:04

and the YouTube videos and

37:07

yeah. Okay. I think that

37:09

got to where you wanted, is that right? Yeah. Yeah.

37:11

You could either do it that way or just instead

37:14

of social media, just replace that with generate

37:16

website content. But now

37:18

I see that you have to be very, you can't just say content.

37:20

Yeah, you have to be specific. I don't think even generate

37:22

website content would work because website

37:24

content could be your about page or it could

37:26

be your sales page, or it could be your FAQs

37:29

unless you're asking it to format

37:31

it first, maybe. Okay, got it. Yeah,

37:34

you'd need a little more clarity around

37:36

the output. Again, output constraint,

37:38

that's a specific phrase. I

37:40

specifically want these pieces of content.

37:43

Plus also, I

37:45

would guess either one or

37:47

two pages of the website content would

37:50

be the limit that ChatGPT

37:52

would be able to output. You see how

37:54

long this is. I would expect

37:57

most websites in this marketing context

37:59

would be, yeah, like almost

38:01

that length or maybe two. Yeah. Screens

38:03

could be. So you'd really want to be like one,

38:06

you could write something. In fact, actually

38:09

just for the heck of it, I'm gonna try writing this. You

38:11

could do something where it would be like, generate

38:13

all the content and then offer me pages

38:15

to generate. Okay. From

38:18

perfect content perfect

38:20

website

38:23

content for marketing their project.

38:26

That should be five

38:28

pages landing

38:31

page, FAQs

38:33

concerns. And I don't know what it's gonna come up with

38:35

for that. And actually, given

38:37

that we're running short on time, I'm just gonna say it's those

38:40

three pages. And

38:42

please output each page,

38:46

pages content, go,

38:50

and it's gonna ask the same questions. As always, please

38:54

generate now. Go.

38:59

Oh, yeah, there we go. It's doing it. Okay, now

39:02

it's throwing out the oh, interesting.

39:04

Okay. Yeah, it's not quite doing it. It's,

39:07

it did give the prompt and

39:10

then it started in with, here are my

39:12

best guesses for what the pages

39:15

should be. And the prompt

39:17

did not keep

39:19

the content context that

39:22

I want you to generate pages for a website.

39:25

Yeah. It's close, it's website content, but it's

39:27

not specific enough. So I bet if I run

39:29

this over here, it is probably

39:31

gonna return some much more generic,

39:36

actually, this isn't bad. Yeah. Okay. This

39:38

is it's not doing the, here's the

39:40

FAQ page and the sales page and the whatever

39:43

page is. But, welcome to our revolutionary

39:45

new educational game. We're learning history becomes an

39:47

exciting adventure filled with puzzles, quizzes, and

39:49

interactive features. And then it's

39:51

going on to talk about, learn about

39:53

significant historical figures. And

39:56

what are you waiting for? Join us. Yeah.

39:59

It's very markety copy, but it is, it's

40:01

copy, but it works, but it's fast.

40:04

Yeah. Cool. Okay. Okay.

40:06

All give you some good ideas. Thank you. Appreciate

40:08

your help on that. You're welcome. Awesome.

40:11

Okay let's see. If this has been helpful for

40:13

you subscribe on YouTube like

40:15

it, five stars, that's podcast

40:17

anyway. Like, subscribe, five stars,

40:20

all that stuff, you know the jam. That would be super helpful. I'm

40:22

gonna keep doing these masterminds cause I love them

40:24

and I have a lot of fun. I'm

40:26

gonna produce a course cuz I've already given a talk on

40:28

it in San Diego. Thank you all so

40:30

much for coming. And the

40:32

link is feedback, suggestions,

40:35

whatever would make this better for you.

40:38

I would love to hear, cause

40:40

that's the whole reason I'm doing it. And

40:42

let's see. Oh, questions in the chat. You

40:45

are welcome Amanda. Can

40:47

you ask ChatGPT to generate something simple

40:49

like a recipe for you as an exercise? Totally.

40:54

And would the course be on Udemy?

40:56

Probably. I know there's a million courses

40:59

on Udemy. I don't necessarily know if that's the best

41:01

way. If people are like, don't go there, go

41:03

to my spiffy course.com.

41:06

I don't know. Let me know in the chatter, in

41:08

the feedback form. Let's see. So recipe

41:11

I have what do I have actually

41:13

at the moment? I have sweet potatoes and

41:17

frozen chicken breasts

41:19

and guacamole. What is a

41:22

recipe I could make

41:24

and how long will it

41:26

take? Actually,

41:29

just to illustrate, I'm

41:32

gonna run this also in ChatGPT

41:34

4, because

41:36

it usually gets way better results.

41:40

It's just yeah I, sorry, I don't remember

41:42

who was asking, but yeah, I'm capped

41:44

at 25 messages every hour, so we would've long

41:47

exceeded that. Interestingly

41:49

though, it is giving me pretty much the exact

41:51

same results in this case. I guess it's a pretty

41:54

constrained set of ingredients. If

41:56

you wanna throw something in the chat of a

41:58

list of things to

42:02

put in, go for it. Interesting

42:04

though, it's giving me different instructions. 400

42:07

degrees here, 425 there, 20

42:10

to 25 minutes. In both cases looks

42:13

like the ingredients are the same, though. They

42:17

are the same, but they're, no, they're not the same

42:20

and they're in different order. That's hilarious. So

42:22

this is an example of the temperature

42:25

setting, which admittedly we can't change

42:27

in this interface, but

42:30

in other interfaces you can actually change

42:32

the temperature to make it more creative

42:34

or less creative. And that's actually,

42:36

if you wanna know more about that, go to the podcast cause we

42:38

just covered that in the Quest GPT

42:40

episode. But you

42:42

can't play with that in here. You

42:44

can play with it in a more technical interface

42:47

called the Playground.
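
For anyone who wants the knob itself: in the Playground and the API, temperature is an explicit parameter; lower values make the output more deterministic, higher values more varied. A minimal sketch with an assumed model name and the recipe prompt from this segment:

```python
# Minimal sketch of the temperature parameter, which the ChatGPT web UI hides
# but the API and the Playground expose.
from openai import OpenAI

client = OpenAI()
prompt = (
    "I have sweet potatoes, frozen chicken breasts, and guacamole. "
    "What is a recipe I could make, and how long will it take?"
)

for temperature in (0.2, 1.0):
    resp = client.chat.completions.create(
        model="gpt-3.5-turbo",  # assumption
        messages=[{"role": "user", "content": prompt}],
        temperature=temperature,  # lower = more deterministic, higher = more varied
    )
    print(f"--- temperature={temperature} ---")
    print(resp.choices[0].message.content)
```
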

42:49

Oh, and I see a chat message: specify not to be longer than

42:51

X Minutes. Sure. What's

42:54

a recipe I could make

42:56

that takes less than I

42:59

was already saying 20 minutes. So I guess, let me say 15

43:01

minutes. I don't actually know if that would

43:03

work with a frozen chicken, but maybe

43:05

it's smart enough to try throwing it in the microwave. Oh

43:09

yeah. Sorry. This

43:11

error is because it is generating in this window,

43:13

I have to wait until the other window is

43:15

finished. ChatGPT and OpenAI

43:18

are aware of that. I really

43:22

should have started with the 3.5 one cuz it's so much faster.

43:24

Sorry about that. Yeah,

43:28

it's pretty, ah,

43:31

okay. So it since I said 15 minutes,

43:33

it just said you should start with two thawed chicken

43:36

breasts and I specified

43:38

they were frozen, so that's an interesting hack,

43:42

air quotes, but whatever, it's

43:44

at least at least trying to give me something. Okay.

43:48

Now regenerate and,

43:52

oh, interesting. The

43:54

three five is giving me a very different answer. It's saying

43:57

avocado and tomato salad, whereas four

43:59

was suggesting the chicken

44:01

and guacamole wrap, which

44:04

actually now that I think about it, didn't I specify they were

44:06

frozen chicken breasts?

44:09

Okay, good. Yeah. Thought

44:12

it was saying something different here. Cool.

44:15

All right oh, apologies. Looks like I'm a minute

44:18

over, but I hope everybody

44:20

got a lot out of this. It is super

44:22

fun for me to do All right everybody,

44:25

thank you so much and hope you have

44:27

an awesome rest of your week. Talk

44:29

to you soon and see you on the podcast,

44:32

hear you on the podcast, and see you on YouTube. Whatever.

44:36

Bye everybody.
