Activity

Released Thursday, 18th June 2020

Episode Transcript
0:00

[Music]

0:15

Everything can be improved, iterated and refined and if you don't think that's true,

0:19

maybe you haven't analyzed it enough. Calculated choices carefully considered. Absolutely analytical.

0:25

Analytical is part of the Engineered Network. To support our shows, including this one,

0:28

head over to our Patreon page and for other great shows visit engineered.network today.

0:32

Activity. Are you working today? Are you being productive today? Are you demonstrating activity

0:40

today? You might think activity is a thing that others need to see, but I wonder if that's really

0:48

completely true. When we combine produce with activity we get productivity. On balance though,

0:57

it's more about expectation over a certain time horizon to get a result.

1:03

So, let's look at the activity of a major corporation. Let's take, for example, Apple,

1:10

because they're a good example, why not? And let's say that they release a new phone every year, or

1:15

maybe an incremental update every year and then a major update every two years. So, when people look

1:22

at what Apple is doing in between those two release dates, every 12 months

1:27

to 24 months, there's a lot of speculation, a lot of guesswork, a few leaks of information

1:33

sometimes from the inside. But ultimately, though, there's very, very rarely any actual

1:39

evidence that there's any activity happening. There's no proof that anything is going on.

1:47

We all assume, we all know, but there's no evidence to suggest it. There's no external

1:52

visibility that anything is happening. So, so far as we know, Apple's not doing anything.

1:57

Well, we know that that's not true though, because after 12 months or 24 months, we see

2:00

the results. So we know, and we can infer in the interim that activity was happening.

2:06

When the next device is released, it's probably amazing. People love it. Well, probably, maybe

2:11

they do, maybe they don't, but faith is restored. There was activity all that time. Of course

2:17

there was and we didn't need evidence of that to get a good result at the end.

2:22

So our expectations are set because we understand implicitly and we're trained by observing

2:29

a cycle of productivity that these things, like making a new phone, take a certain amount

2:35

of time, effort and energy and that's normal and that's okay.

2:40

And a lack of visible activity isn't indicative at all of a lack of productive progress.

2:48

So as we sort of break that down, it comes to the underlying trust that actions are occurring.

2:55

So let's consider the opposite extreme.

2:58

We're now going to look at micromanagement and metric-driven reporting.

3:02

Let's say that you're a new manager coming into a business and you're trying to understand

3:06

what a team is actually doing.

3:09

You have no long-term view, no real time horizon, no set expectations, no evidence of a productive

3:16

output and hence no evidence of an end result.

3:20

On the surface, the team isn't visibly doing much, it's not producing much visible activity

3:26

and the pressure is on to demonstrate that activity is happening.

3:31

To do that, the new manager sets some activity-based KPIs and starts reporting those KPIs to the

3:37

next layer of management above them. The team shifts from a trust-based system of work handover

3:43

and progress internally to pushing to close bug reports faster, open change control requests,

3:50

formalize interactions with other team members, and everything starts to become very transactional

3:58

and takes on a transactional overhead. The problem is that overhead isn't free, and that

4:03

overhead costs, let's say, 25% of their overall efficiency, where energy previously spent

4:09

doing the doing is instead spent proving the doing, and the final delivery will be proportionally

4:14

late, or the hours worked by the team increase to compensate for all that extra overhead.
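As a rough sketch of that arithmetic (the 25% figure is the hypothetical above; the function name and the 12-week baseline are mine, purely for illustration):

```python
# If reporting overhead consumes a fraction of the team's effort, only the
# remainder goes into actual delivery, so the same work takes longer.

def delivery_weeks(baseline_weeks: float, overhead_fraction: float) -> float:
    """Weeks to deliver the same work once overhead eats into productive effort."""
    productive_fraction = 1.0 - overhead_fraction
    return baseline_weeks / productive_fraction

# A 12-week deliverable with 25% overhead now takes 16 weeks: proportionally
# late by a third, or a third more hours worked to hold the date.
print(delivery_weeks(12.0, 0.25))  # 16.0
```

Note the overhead compounds worse than it looks: 25% of effort lost means roughly 33% more calendar time or hours, not 25%.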

4:37

The manager though is demonstrating that activity in the team is happening and their management

4:42

above have great visibility that activity is happening but it's come at a huge cost.

4:49

So trust, trust is eroded, interactions in the team and between teams become transactional

4:57

and impersonal, actual productive work is slowed down, the result is actually a net

5:03

loss overall. Morale starts to drop, people become more distant, more hostile, and then

5:10

there's an unhealthy tension that builds between teams and silos begin to be built for some

5:17

perceived self-preservation or superiority versus inferiority or this team is doing better

5:23

than that team, so we're not going to help them, they're not going to help us.

5:27

That works directly against teamwork in the context of the whole organization.

5:34

The infection of KPI-driven or metric-driven activity measurement seems to come from ignorance,

5:41

I think, and a belief that humans can be evaluated as discrete productive entities with different

5:48

levels of performance for different activities.

5:52

That kind of mindset completely ignores the fact that humans are complicated.

5:56

Invisible work, like tidy-up activities, is a real thing, as is a desire to help others in need,

6:03

along with learning curves. And as humans, we change focus based on our outside lives and pressures and influences.

6:10

That changes any equation that people might have and could possibly ever document to say,

6:15

"This person can produce this amount of work in this amount of time," because all of it

6:19

is extremely variable. But ultimately, if you're going down the path of metric-driven

6:25

activity measurement, and there's a push down the chain to demonstrate that movement is

6:30

occurring in a team, and activity is happening, then the longer that time horizon is for those

6:37

final deliverables, the more the pressure will build. You get comments like, "Oh, are

6:43

they still working on that? It's been two weeks, it's been two months, it's been a year."

6:49

When in fact it's always taken a year between phones, just like every other year for the

6:55

past decade.

6:58

Your ignorance of the past is not a measure of the team's performance, it's just a measure

7:04

of your own ignorance.

7:07

All it takes is a lack of trust, or even a so-called respectful challenge of "Why is this

7:12

taking so long? Can we go faster?" That drives evidence-based management versus trust-based

7:19

management. Now don't misunderstand, I just think it's very important to acknowledge that

7:26

cost tracking is important, sure. Milestone reporting is also important and I think reporting

7:34

up the management chain about roadblocks and disruptions is very, very important.

7:40

But I do think that reporting on activity alone is entirely and completely pointless on every

7:47

single level. And yet it seems to persist, probably because it's easier for some people

7:54

to quantify and understand, despite the fact that it doesn't actually add any value.

7:58

Now, you may be wondering, since I'm talking in very general terms, why we don't talk about something

8:05

specific. So let's talk about an example in programming terms. One of the activity-based

8:10

metrics is bugs closed over a time period. I mean, we could just as easily be

8:15

talking about service tickets and so on. It doesn't really matter, but let's go with bugs closed over

8:20

time. So assume there are three teams, all working on different sections of the code,

8:25

maybe different code libraries, or physically different controllers; it really doesn't

8:28

matter. The business, though, is trying to determine performance between those teams, to

8:34

grade them by measuring activity metrics. Team A closes 5 bugs, Team B closes 10 and Team C closes

8:42

50 bugs and that's all over the same time period. So Team C, 50 bugs? Well they're the highest

8:48

performing team, aren't they? Well, maybe they are, maybe they're not. What if I actually told you

8:54

that Team A objectively was the highest performing team? Because when we deep dive into it, well,

9:02

it turns out that Team A worked for 25% more hours

9:06

than the other two teams did, and resolved issues that had caused system crashes

9:10

and catastrophic failures.

9:13

And both teams B and C had been trying

9:16

to fix them for months, and they'd failed many times to resolve those issues.

9:20

Turns out that Team C's bug fix count

9:24

consisted of 30 typographical errors,

9:27

and the balance of the bugs they closed were very straightforward, low-priority bugs that really didn't add any great value,

9:35

and yet they counted against their metric, and it looked amazing, but it just wasn't. So the problem

9:43

with the metric, you might say, is actually the lack of depth of detail, since not all bugs are

9:47

created equal in terms of effort to resolve them and their impact on the system. Which is true,

9:52

but then we reach the point: how do we honestly, objectively and fairly assess

9:58

just how difficult a bug is to resolve? Sometimes we measure the priority of bugs, like a P1, priority

10:04

one, you know, can't ship without it, P2 can wait for the next service pack and so on, but estimating

10:08

effort to fix a bug is extremely difficult even for experienced developers. There's always an

10:14

element of interpretation, professional judgment and balance to determine which team was actually

10:19

the highest performing. Ultimately, only a manager who technically understands what their

10:24

team is doing is able to interpret this, and if they can't understand it, then their only option

10:31

is to metricize their team's work. Of course, there are other options: maybe they don't want to

10:38

spend the time to understand, as in they're too lazy, or maybe their management layer above

10:44

demands that they must have a metric. So when I come across the situation of activity measurement

10:50

as a measure of output, the things that strike me are: A) the manager either doesn't

10:57

understand what their team does, or B) they're too lazy to do their job, or C) they've allowed

11:02

their management to push irrelevant activity metrics upon them. For any or all

11:09

of these reasons, they shouldn't be in management. Try something else, like cross stitch or something.
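The bugs-closed comparison earlier can be sketched in a few lines. To be clear, the impact weights below are entirely invented for illustration; the whole point of the example is that assigning them honestly takes technical judgment, not that any particular weighting is correct:

```python
# Hypothetical bug counts per team, broken down by kind of fix, with an
# invented "impact" weight per kind (crash fixes matter far more than typos).
teams = {
    "A": {"crash fixes": (5, 20.0)},
    "B": {"minor fixes": (10, 2.0)},
    "C": {"typo fixes": (30, 0.5), "low-priority fixes": (20, 1.0)},
}

def raw_count(bugs: dict) -> int:
    """The activity metric: bugs closed, regardless of difficulty or impact."""
    return sum(count for count, _weight in bugs.values())

def weighted_impact(bugs: dict) -> float:
    """A judgment-based view: bugs closed, scaled by how much each one mattered."""
    return sum(count * weight for count, weight in bugs.values())

for name, bugs in teams.items():
    print(f"Team {name}: {raw_count(bugs)} closed, impact {weighted_impact(bugs)}")

# Raw counts rank C (50) ahead of B (10) and A (5), but the weighted view
# ranks A (100.0) far ahead of C (35.0) and B (20.0).
```

The metric and the judgment disagree completely on which team performed best, which is exactly the argument above: the number alone tells you almost nothing.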

11:15

Think of it like a mathematical equation, just for a second.

11:18

Productive plus activity equals productivity, which is what we care about. Whereas,

11:25

activity minus productive equals, well, just activity, which is a waste of time,

11:33

a waste of money, just a waste of everything.

11:37

If you're enjoying Analytical and want to support the show, you can via Patreon at

11:43

patreon.com/johnchidgey, all one word.

11:46

With a thank you to all of our patrons and a special thank you to our silver producers

11:49

Mitch Bjorgar, John Whitlow, Joseph Antonio, Kevin Koch, Oliver Steele and Shane O'Neill.

11:55

And an extra special thank you to our gold producer known only as R.

12:00

Patrons have access to early-release, high-quality, ad-free episodes, as well as bonus

12:03

episodes, all via Patreon.

12:06

Visit engineered.network/analytical to learn how you can help this show to continue to

12:10

be made. Of course, there's lots of other ways to help like favoriting the episode in your podcast

12:15

player app or sharing the episode or the show with your friends or via social.

12:19

Some podcast players let you share audio clips of episodes, so if you have a favorite segment,

12:23

feel free to share that too. All of these things help others discover the show and can make a big difference too.

12:29

You can follow me on the Fediverse at [email protected], on Twitter @johnchidgey, or the network at engineered_net.

12:38

Accept nothing. Question everything.

12:41

It's always a good time to analyze something. I'm John Chidgey. Thanks so much for listening.

12:46

[Music]
