You have to prioritize!


If you want to have the most impact you have to prioritize! But how? Based on what criteria?

Do a little of everything? Pick what’s at the top of your list and work your way down? Do whatever comes to mind and grabs your interest at the moment? Or, not knowing what to pick, stall and do nothing at all?

Well, the above approaches don’t seem like great strategies, so let’s take a more structured approach instead.

At Amazon there are limitless opportunities – always – and as a result, we constantly need to prioritize and make trade-offs. In fact, when candidates ask me in interviews what the hardest part of working at Amazon is, I tell them: “it’s deciding what not to do”.

There is often considerable ambiguity about how one should make such decisions. I see this across individuals and organizations, far more often than I would have expected. Most people have a good grasp of how they should prioritize, but then they mix and muddy things as they get into the details.

Prioritization is about discipline – both in thinking, as well as in execution.

The operational discipline is something you need to develop for yourself. The mental model is easier to share though. Here is a prioritization framework that works in most cases.

How to prioritize

  1. P0: Things that HAVE to be done to support a strategic goal or prevent a strategic risk. These are typically set top-down as company or organizational goals. If things MUST be done to support those organizational priorities, they need to be treated as non-negotiable (P0s). The important thing, though, is that this only applies to blockers (!) for such goals; it doesn’t include all of the nice-to-have things one could do in that space. Nice-to-have work must stand on its own cost-benefit analysis. It’s not a P0 if it’s not a blocker without a feasible workaround!
  2. P1-3: Things that provide the highest ROI (return on investment) / best cost-benefit ratio, in sorted order. Everything else you do needs to be evaluated under the criterion of ‘most bang for the buck’. Don’t spend energy on something that will (hopefully) be useful in the future, but not just yet. If something will yield a higher return than what you’re doing right now, stop what you’re doing and switch over; if it won’t, double down and finish what you started. Sort the things you need to do by ROI, nothing else.
  3. Exceptions to the rule. There are some reasons why you might have to invest in projects with lower ROI. The clearest is hitting a scaling limit: if adding more people to a project doesn’t scale your delivery pace (close to) linearly, you should deploy them somewhere else. Similarly, if you need to make investments to lower your operational cost or substantially increase future delivery speed (e.g., re-architecture), you need to prioritize those accordingly. However, I would argue that those effects can and should also be quantified and expressed in an ROI decision. The other reason to keep some capacity for work that is not ROI-prioritized is to diversify your opportunities and/or make room for experiments to explore new areas. Be very conscious, though, of how much time and energy you want to devote to such activities.
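The framework above can be sketched in a few lines of code. This is a minimal illustration, not a real planning tool – the WorkItem fields, and the idea that benefit and cost collapse into single numbers, are simplifying assumptions:

```python
from dataclasses import dataclass

@dataclass
class WorkItem:
    name: str
    benefit: float            # expected return, in whatever unit you care about most
    cost: float               # the investment (people, time, money) required
    is_blocker: bool = False  # True only if a strategic goal fails without it

    @property
    def roi(self) -> float:
        return self.benefit / self.cost

def prioritize(backlog):
    """P0 blockers first (non-negotiable), then everything else sorted by ROI."""
    p0 = [item for item in backlog if item.is_blocker]
    rest = sorted((item for item in backlog if not item.is_blocker),
                  key=lambda item: item.roi, reverse=True)
    return p0 + rest
```

With this ordering, a low-ROI compliance fix that blocks a company goal still ranks ahead of a quick win with many times its ROI, while everything else falls into strict ROI order – exactly the discipline described above.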

Pitfalls to avoid

Don’t mix criteria. If you make ROI decisions, make ROI decisions. Don’t mix ROI and opportunity or something else.

If you go to a supermarket and shop for oranges, all other things equal, you will pick the ones at the lowest price. You will not pick a bunch of the lowest priced ones and another bunch of the expensive ones, just because they are there. ROI is your metric, stick to it. Opportunity only tells you that you can buy oranges, it doesn’t tell you that the price is right.

Side note: ROI doesn’t need to mean dollars – it means the impact (return) of your resources (investment) on the metric you care about most (e.g. cost, speed, quality, precision, satisfaction).

Don’t take previous decisions as gospel. Don’t block yourself by perceived constraints or previous decisions. As you get more data and understanding, challenge previous assumptions! For example, a goal is not a value in itself, it might have been set based on an incomplete understanding of the total opportunity. As you understand the opportunity space better, re-examine previous goals – if they no longer express the most important thing to do, make a pitch to change the goal!

Elephants get chained when they are young and too weak to break those chains. They learn that chains define their limits. As they get older, they don’t even try to break those chains anymore, even though they easily could. Don’t be chained by previous assumptions, re-evaluate what you know and question what you believe as you learn more!

Invest the intellectual energy to set strong and data-driven priorities. Exercise the operational discipline to focus on those goals without distraction. Nurture the curiosity, flexibility, and courage to revisit those decisions and underlying data to verify that you are still pursuing the right goals.

 


Did you like this article? Want to read more?

I will keep posting articles here and I have them lined up way into summer 2020. However, if you want to get it all in one comprehensive, structured, and grammar-checked (!) view, check out our new book:

 


Put On Your Own Oxygen Mask First

A practical guide to living healthier, happier and more successful in 52 weekly steps

By Alfons and Ulrike Staerk

ISBN 9781077278929

Find it on Amazon: Paperback, Kindle

 

If you like what you’re reading, please consider leaving a review on Amazon. If you don’t like it, please tell us what we can do better the next time. As self-published authors we don’t have the marketing power of big publishing houses. We rely on word of mouth endorsements through reader reviews.

Throwing Spaghetti on the Wall…


Have you ever heard someone say, “Don’t just throw spaghetti on the wall and see what sticks”?

Well, obviously that’s not a good strategy to understand priorities and inform a future course of action. It’s also messy and a little disgusting…

A much better approach is to understand problems, drill down to root causes, identify cause-and-effect relationships, and then formulate a set of hypotheses on how to influence those root causes. But let’s start from the beginning…

The spaghetti approach

Here is how I know when a PM interview doesn’t go well:

Me: “Interesting problem. How did you find out how to solve it?”
Aspiring PM: “I did A/B testing and looked at the results.”
Me: “Sounds cool, how did you know what to test?”
Aspiring PM: “Well, we tried out a bunch of things, and then picked the one that showed the best results.”

That’s not experimentation – at least not in a scientific sense. That’s classic throwing spaghetti on the wall and seeing what sticks. It’s expensive: with this method, you will find the right solution only by brute force or sheer luck. More often than not, the true solution and needle mover will remain elusive.

If you want to get to the true best global solution through experimentation, you need to have a plan first!

Drop any preconceived notions of ‘the right solutions’. In fact, burn your list. Instead, start by identifying the root causes and focus your experiments on understanding what drives those root causes.

The scientific method

Experimentation is like throwing pebbles. If you have a plan for where to throw them, you will likely hit your targets with a few throws. If you don’t, you will need a LOT of pebbles to hit anything worthwhile.

Here’s how you develop a plan before you start throwing your precious stack of pebbles:

Step 1: Root causes – What is the problem?

Start with identifying the problem. Then ask yourself what causes that problem. List all the drivers that you can identify from the data and observations that you have available.

Check for causation. Are those drivers really causing the root problem, or are they just correlated? Drill all the way down until, based on the data you have available, you can no longer draw clear cause-effect relationships.

Step 2: Hypothesis – Enter the unknown!

Up to here, causations were directly supported by existing data and observations. Beyond this point they are not, and you need to find ways to fill your data and knowledge gaps. You start making a plan for throwing your pebbles.

Start to develop hypotheses for the cause-effect relationships for which you don’t have clear data. Check if there are any drivers that you might have missed. Where do you have hunches (informed guesses), but no data?

Step 3: Experimentation – Closing the data gaps.

You have several brilliant but untested hypotheses. Now is the time to come up with a plan to put those hypotheses to the test. It’s time to develop experiments that can validate your hypotheses and provide you with the missing data.

Be clear as to what data specifically you need to get from an experiment to validate your root cause hypothesis. You can get a lot of data out of experiments, but not all of it will truly correlate to the specific needle that you want to move.

Think creatively and broadly as you get into designing your experiment. Not every experiment needs to be a big engineering project.

There are many ways to get data. Experiments can be product implementations, but they can also be very simple initial and manual tests with small groups of users or user research studies. Of course, the closer your experiment is to a large scale production roll-out, the more precise your data will be. However, you don’t always need that precision for the initial validation of an idea that will inform the next steps in a project.

The faster you can get results, the better. Sometimes you need to build something out in scale to get the right data; more often, you don’t. There are no bonus points for expensive and slow tests.
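As a sketch of what ‘validating a hypothesis’ can mean in practice: for a conversion-style metric, a standard two-proportion z-test (textbook statistics, not something specific to this article) tells you whether the difference you observed between control and treatment is likely real or just noise:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference in conversion rates between
    control (A) and treatment (B). Returns (z, p_value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under the null hypothesis
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value
```

If the p-value is small (conventionally below 0.05), the experiment supports the hypothesis that the treatment moved the metric; if not, you either need more data or a sharper experiment – which is itself a useful learning.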

Step 4: Refinement – What have you learned?

Look at the data. See which hypotheses are validated and which are not.

Don’t leave it at that simple checklist, though. Reflect on how your cause-effect framework might have changed with the new data and insights. Does the experiment’s data indicate new root causes that you were previously unaware of?

Finally, ask yourself if you have answered enough of your root cause questions to build your MVP, or if you need more experimentation and data to ensure you will head out in the right direction.

Sticking points

  • Experimentation is great!
  • More specifically, targeted experimentation is invaluable to get missing data and understand your space.
  • Just trying out stuff, on the other hand, is wasteful and will likely increase confusion instead of reducing it!

 



The Fallacy of Measuring Everything


I have written many times that you cannot manage what you don’t measure. While I still agree with that principle for most of the things we do, especially those we need to drive towards a certain goal, I will make a counterpoint in this post today.

The counterpoint is that, in today’s culture, we overdo measuring ourselves and pushing ourselves towards goals. We’re mechanizing every single part of our lives.

As always, the magic lies in the balance, and balance is what we are often losing sight of.

We push and measure ourselves at work. We track every single minute, make ROI (return on investment) decisions for everything we do, and don’t allow any slack or waste (i.e., idle time or downtime).

Then we come home from work and do the same all over again. We track the time we spend on different activities, run through our task and priority lists, make sure every evening for the kids is booked and planned with some enrichment activity, and even when we go for a walk in nature we’re tracking our steps, distance, and how we rank against our buddies.

We deprive ourselves of downtime, time to go with the flow, time to think and let our thoughts go free, time to recharge and recover.

Everything must be in balance to thrive. Respect that balance.

Let go, as much and as often as you push and focus.

Contrary to previous posts and recommendations, I’ve lately stopped tracking my steps and recreational activities. I’m not measuring ‘fun’, ‘recovery’, and ‘relaxation’ anymore, as I realized that measuring those, and pushing myself to do more and better, only turns them into another chore. ‘Recovery’ becomes another drain instead of something that recharges us.

I’m still pushing hard against goals at work, and I have a list of things I need to do in my private life. I still have clear goals and outcomes I want to achieve. However, I am now also clearly identifying areas where none of those measurements matter, and I can just go with whatever happens in the moment.

I have a general framework for how I want to spend my time (family, mindfulness, sports, and nature), but I won’t sweat it or be mad at myself if I don’t do all of them every week. I also don’t worry anymore whether I spent 5 minutes or 30 on a walk with my dog. It’s the quality that counts, and how much it helped me unwind and recharge.

I have very clear goals and metrics for work; however, I have also identified areas, especially in my personal life, where I go only with loose frameworks and personal values.

It is liberating, and it gives me more focus and energy to measure and manage the things that need to be managed.

If all you have is a hammer, everything looks like a nail. Make sure ‘measurement’ and ‘achievement’ are not the only tools you have in your toolbox.

 



What do Lord Kelvin and Peter Drucker have in common?


Probably the most famous quote from management guru Peter Drucker is:

If you can’t measure it, you can’t manage it. – Peter Drucker (1909-2005)

However, scientist Lord Kelvin beat him to the punch and called out a similar principle even earlier:

If you can’t measure it, you can’t improve it. – Lord Kelvin (1824–1907)

For one thing, this shows again that physicists usually beat everyone else to exciting insights about how the world works. 🙂

Lord Kelvin also formulated the second law of thermodynamics, which postulates that everything will eventually end in unstructured chaos anyway – but that’s another story, so let’s not get distracted.

While I wholeheartedly agree with the above principle about measuring, I would extend it to:

If you don’t know what you want to manage, you’re wasting your time measuring. Likewise, if you’re not committed to doing what it takes to improve a metric, you might as well not bother measuring it at all.

All right, after that motivational downer, I want to reflect a little bit on metrics and reporting, what we should measure, and how we should think and talk about those numbers.

How to think about metrics

Why we care about metrics

Impact and outcomes (output metrics) – In all we do, we prioritize and target our energy on the few things that we believe have the most impact on a given customer or business outcome. There are many things we decide not to do to keep that focus. Once we’ve done what we set out to do, we’d better know if our beliefs and assumptions were right (i.e., if we can build upon them) or wrong (i.e., what we can learn from them). Metrics help us track whether our actions lead to the anticipated outcomes. They help us identify where we need to course-correct.

Defects and actions (input metrics) – As hinted above, not all plans work as anticipated. Looking at the right (input) metrics helps us see where things don’t develop according to plan and prediction. Once we are aware of those areas, we can assess the impact and develop strategies to fix the issues. Input metrics are typically leading indicators, and while we care about the effects and outcomes, input metrics are where we can learn why things don’t quite work and take the proper actions. Here is a quick read-up on input versus output metrics: https://www.linkedin.com/pulse/focus-inputs-alfons-staerk/

Early warning (health metrics) – Last but not least, metrics help us avoid being blindsided. Like a canary in the coal mine, a good set of ‘early warning’ metrics helps us avoid discovering an issue through an escalation, and instead identify it proactively ourselves. No one wants to get an angry email from a customer or their boss.

When you develop the set of metrics that you want to track for your product and program, you want to make sure to track all three categories. Each of them is equally important and serves a specific purpose. However, you need to think about how you use them strategically and intentionally.
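One way to make the three categories explicit is to tag each metric with its category and its review-versus-reporting cadence, mirroring the guidance in the next section. The metric names and cadences below are purely illustrative assumptions:

```python
from dataclasses import dataclass
from enum import Enum

class Category(Enum):
    OUTPUT = "impact and outcomes"  # lagging; tracks progress against goals
    INPUT = "defects and actions"   # leading; where you learn and act
    HEALTH = "early warning"        # canaries; should stay dull

@dataclass
class Metric:
    name: str
    category: Category
    review_cadence: str   # how often the owning team looks at it
    report_cadence: str   # how often it goes to a broader audience

# Hypothetical dashboard – the point is that every metric carries its purpose.
dashboard = [
    Metric("monthly active users", Category.OUTPUT, "weekly", "monthly"),
    Metric("signup funnel drop-off", Category.INPUT, "weekly", "weekly"),
    Metric("p99 latency", Category.HEALTH, "daily", "on alert"),
]
```

Writing the cadence down next to the metric forces the strategic-use question early: an output metric reviewed weekly but reported monthly, and a health metric that only reaches leadership on an alert, are deliberate choices rather than accidents.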

How we track and report metrics

Outcome metrics (impact and outcomes) are the ultimate goal, but they typically lag and move slowly. You want to present them to leadership and stakeholders, but do you want to do it weekly? Do they change fast enough? What’s the right forum for them (quarterly, monthly, or weekly reports)? If you present them weekly for reference, are they the data that you want to draw attention to every week?

Input metrics are critical to managing your product and program. They are leading indicators and most often change weekly. You do want to look at input metrics (defects and opportunities) weekly, but you also want to focus on the ones where you would consider taking action. If you are not willing to take action, or to ask for support to take action, it’s just noise and distracts everyone. Make input metrics actionable, or think harder about what the right actionable input metric would be! Last but not least, some input metrics are noisy. If that is the case, think about how you can report them differently to separate noise from a real trend. Metrics are all about learning, not about showing that you have many numbers.
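For noisy input metrics, one simple way to report them differently is a trailing moving average, which smooths week-to-week noise so the underlying trend shows. A sketch (the window size is an assumption you would tune to your data):

```python
def rolling_mean(series, window=4):
    """Trailing moving average over `window` points.
    The first window-1 points average whatever history is available."""
    out = []
    for i in range(len(series)):
        chunk = series[max(0, i - window + 1): i + 1]
        out.append(sum(chunk) / len(chunk))
    return out
```

Reporting the smoothed series next to (or instead of) the raw weekly numbers keeps the review focused on the trend rather than on explaining the same random wiggle every week.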

Health metrics are critical, and you should look at them weekly or even daily. They are your insurance against being caught on the wrong foot. However, by definition, they should be very dull and not change much. No news is good news! If you have a story to tell about your warning indicators every week, then there is a more fundamental issue at hand. In most cases, these metrics are something you and your team need to look at very frequently; however, you don’t want to report them to a broader group frequently (e.g., through regular reports). Instead, these are the metrics that, while not looked at by a large group regularly, should kick off an immediate heads-up to your leadership when you see things going sideways.
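A health-metric check can be as simple as comparing each metric against a threshold and escalating on any breach. This sketch assumes ‘higher is worse’ metrics with hypothetical names and thresholds; in practice the escalation would be a page or an email to leadership, not a return value:

```python
def check_health(metrics, thresholds):
    """Return the metrics that crossed their threshold.
    metrics: {name: current value}; thresholds: {name: upper limit}."""
    breaches = []
    for name, value in metrics.items():
        limit = thresholds.get(name)
        if limit is not None and value > limit:
            breaches.append((name, value, limit))
    return breaches
```

The point of automating this is exactly the ‘no news is good news’ property: no one reads a dull dashboard daily, but everyone hears about it the moment a canary stops singing.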

How to talk about metrics

For me, the most frustrating experience in metric reviews is to see a sea of data with no apparent focus or structure. In those cases, it takes me a while to catch on to the slide structure, and by the time I have, I have missed the call-outs. The second worst thing is seeing the same call-out on the same unchanged data every week. The third most frustrating thing is a call-out that’s not related to the data on the slide – it utterly confuses me every time. Bonus frustration: a new slide whose structure needs to be explained instead of being self-evident.

Slides are stories. They need to be able to speak for themselves without additional explanation. The stories they tell need to engage and focus on the ‘news.’

Let’s start with the simple thing – the presentation.

Visual presentation – Make it digestible

As we think about how we can turn slides into stories and data into statements, we need to give focus to presentation. A sea of data is not a story; it’s a distraction. One hundred rows don’t convey insight, but chaos. Data that isn’t organized along a logical flow doesn’t provide a signal; it increases noise.

The flow of your data – First of all, think about the right logical flow of your data. What is the correct organizing principle that will guide viewers along and help them make their findings? Often this is obvious (e.g., funnel steps, or input metrics that feed into an output metric), but give it a hard thought. Having the right structure is the difference between a strong slide and story, and a weekly struggle to get through your WBR (weekly business review) section.

Help drive focus – Most times, less is more. What data is needed? What data would you take action on? What data is critical versus supplemental, and how can you visually highlight the critical data? Can you bold specific data? Can you visualize a funnel structure in how you present it? Make it easy for viewers to see for themselves what you can see in the data. Also, make the hard choice not to show data that doesn’t matter. We’re all proud of all the data we can find; however, focus wins the game in a presentation every time. Metrics meetings are crisp and focused presentations of the state of the union, not word-search puzzles.

Be clear what you talk about – When you talk about something that is not on the slide, be clear about it before you get into your story. Try to avoid that situation though – if you launched something new and it doesn’t show in the data yet, then talk about it once it does. When you talk about something on the slide, make sure to refer to where the data is – searching for the needle in the haystack is no fun in a fast-paced meeting.

Data with Intent! – WHY? SO WHAT?

We don’t talk about the data just because we have it. We have an intent! Be sure to talk about that intent. Why should I care? Why does this particular data matter? You know it, but I don’t, so please explain it to me!

Here are some categories that usually lead to compelling stories about data. Don’t feel you have to tell each of them every week; tell a story when things have progressed or changed in a meaningful way.

Progress – We all want to know if we are making progress against our goals. A key to seeing that on a weekly or even monthly basis is to have a ramp plan. If we want to achieve a particular goal by the end of the year, where should we be this month, next month? How close are we to that ramp goal, and if there is a gap, what can we do to close the gap? An output metric without a ramp plan is useless! We need to know if we can feel good or should be worried. These callouts are typically related to the outcome metrics we committed to.
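A linear ramp is the simplest possible ramp plan (real ramps are often S-shaped, so treat the straight line as an assumption): given a starting point, a goal, and the current week, it tells you where you should be and how big the gap is:

```python
def ramp_target(start, goal, total_weeks, week):
    """Where the metric should be in `week` on a straight-line path to `goal`."""
    return start + (goal - start) * week / total_weeks

def ramp_gap(actual, start, goal, total_weeks, week):
    """Positive means ahead of plan; negative means behind plan."""
    return actual - ramp_target(start, goal, total_weeks, week)
```

Halfway through the year (week 26 of 52) on a path from 0 to 1,000, the target is 500; an actual of 450 means you are 50 behind plan, which is exactly the call-out and gap-closing discussion the weekly review should focus on.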

Learnings – What did we learn from the data? Are there any surprises or positive trends that we didn’t anticipate? This is where input metrics come in. Focus on the differentials, though; don’t repeat the same insight every week (without taking action). This is where we should talk about surprises and experiments, and their related learnings and outcomes. Focus on what’s new; don’t tell the same story again and again without changing the game from one meeting to the next (if you decide something is not worth managing, remove the metric from the official deck). Most learnings come from the input metrics that we are tracking.

Attention needed – Sometimes, trends turn in the wrong direction without warning, and without us having done something specific to anticipate that development. Those are the canaries in the coal mine. Be sure to call it out when such things happen. Also, be sure to have some insights into what happened, or at least a plan and a timeline for getting those insights. These alerts are essential data points, not bad news. They tell us when we need to focus on something. Take them as an opportunity to fix something early, before it gets bad. However, don’t wait for a pre-scheduled meeting if you notice that health metrics are eroding – kick off an email thread with your leadership immediately and take action!

A lot of the follow-up questions from leaders usually poke into one of the above areas. By looking at the categories in that framework, thinking through your story along those lines, and presenting it succinctly, you will convey that you are on top of your game. You will show your confidence, ability, and success, instead of being caught off-balance.

Last but not least:

Don’t try to fill the space/time – If you don’t have a story in any of the above areas, don’t make one up. Power usually lies in not giving in to the temptation to fill empty space. Just say that nothing happened to the data that is worthy of a call-out, and make a short statement about what changes you expect to see shortly and why (if you don’t anticipate any, then your program is dead, and the slide should be removed). Relatedly, there is no rule that you have to have three call-outs. I would much rather hear one strong call-out and finding than three repetitive ones.

Learning from the data can be fun if we let it be! Use metrics, reports, and reviews as an opportunity to learn about your space and tell compelling stories to leadership.

 

