
The two hypotheses that bridge day-to-day product work with business impact

Steve Klein

How often do we brainstorm, build, and launch something—then realize it didn’t actually move the needle on the business outcomes we care about? In an “output-first” culture, it’s easy to rack up shipped features without addressing the real questions:

  • Are the things we're shipping driving the change in user behavior they're intended to create?
  • Is this change in behavior valuable enough to our users that it results in some business impact?

If this feels all too familiar, it might be time to take a closer look at how you connect your day-to-day work with the business impact you're hoping to drive.

I recently sat down with two experts to discuss exactly this:

  • Jon Harmer - Product Manager at Google Cloud (and Techstars mentor)
  • Scott Sehlhorst - Product strategy consultant and founder of Tyner Blain

They’ve both spent decades helping teams clarify their product strategy, identify which problems to solve, and communicate the business impact of that work. Below is a short recap of our conversation, with some top takeaways for Product Managers and Product Leaders.


1. Focus on "Landings," Not Just "Launches"

Jon highlighted a concept they use internally at Google: don’t measure success at “launch”—measure success at “landing.”

  • “Landing” means your release actually resonates with users.
  • “Launching” just means you deployed something.

In other words, when a release lands, your team has done more than ship a new feature: you've solved something. If customer behavior isn't changing in a measurable way, it hasn't landed, no matter how slick the code or how fast you shipped it.

Why this matters: If you're only celebrating what was shipped, you're missing the big picture. You need to identify, and be able to measure, how customer behavior will change as a result of shipping it (e.g. the % of people who consistently use it, the time it takes to use it, reported satisfaction with using it).

Make sure you have a baseline to track against before launch, and put events on your calendar 1 week, 1 month, 3 months, and 6 months out to check in on the goal you set.
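As a concrete sketch of that habit (the metric names, dates, and thresholds here are illustrative, not from the conversation), the baseline comparison and check-in schedule can be as simple as:

```python
from datetime import date, timedelta

def checkin_dates(launch: date) -> list[date]:
    """Check-in schedule: 1 week, 1 month, 3 months, 6 months after launch."""
    return [launch + timedelta(weeks=1),
            launch + timedelta(days=30),
            launch + timedelta(days=90),
            launch + timedelta(days=180)]

def landed(baseline: float, observed: float, target_lift: float) -> bool:
    """A release has 'landed' only if observed behavior beats the
    pre-launch baseline by at least the relative lift we committed to."""
    return observed >= baseline * (1 + target_lift)

# Example: baseline of 20% weekly usage, goal of a 25% relative lift.
print(checkin_dates(date(2025, 1, 15))[0])              # → 2025-01-22
print(landed(baseline=0.20, observed=0.26, target_lift=0.25))  # → True
```

The point isn't the arithmetic; it's that "landed" is defined before launch, so the check-in meetings have a yes/no answer instead of a debate.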


2. Build "Outside-In" with Customer and User Journey Maps

A recurring theme was the importance of stepping back to understand the entire customer journey, not just the slice where your product gets used.

  • Customer Journey Map: Looks at everything the user does to accomplish a broader goal—even the parts that have nothing to do with your product.
  • User Journey Map: Focuses on how a user navigates your product’s specific workflows.

Scott emphasized the outside-in perspective: Instead of asking, "How do we get people to press this button?" ask, "What are they trying to achieve overall?” and "What are the other steps involved in achieving their goal?" This helps you spot new opportunities to solve real customer problems, not just polish the existing flow.

Why this matters: If you don’t see the bigger picture of why people use your product—and what else they do outside of it—you risk optimizing the wrong things and missing larger opportunities.


3. Explicitly State Your Solution and Outcome Hypotheses

A big takeaway: explicitly state your “solution” hypothesis and your “outcome” hypothesis.

  1. Solution Hypothesis:

    • “We believe that by introducing [Feature/Improvement X], we’ll solve [Customer Problem Y], and see [Behavioral Change Z]”
    • This is about what you’re building and why you believe it will address the need.
  2. Outcome Hypothesis:

    • “If we see [Behavioral Change Z], it will lead to [Business Result W].”
    • This is about the measurable impact on user behavior—and the value it creates for the business.

Why this matters: When something fails, it’s tough to know whether you picked the wrong problem or built the wrong solution. Defining both hypotheses up front helps you test your assumptions more precisely—and pivot faster if you ship something that doesn't have the expected results.
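The two templates above can be made concrete as a pair of records linked by the behavioral change (the feature, problem, and metric below are hypothetical examples, not from the conversation):

```python
from dataclasses import dataclass

@dataclass
class SolutionHypothesis:
    """We believe that by introducing <feature>, we'll solve <problem>
    and see <behavior_change>."""
    feature: str
    problem: str
    behavior_change: str

@dataclass
class OutcomeHypothesis:
    """If we see <behavior_change>, it will lead to <business_result>."""
    behavior_change: str
    business_result: str

def linked(s: SolutionHypothesis, o: OutcomeHypothesis) -> bool:
    # The behavioral change is the bridge: the solution hypothesis must
    # produce exactly the change the outcome hypothesis depends on.
    return s.behavior_change == o.behavior_change

s = SolutionHypothesis("saved search filters",
                       "re-finding reports is slow",
                       "time-to-report drops below 30s")
o = OutcomeHypothesis("time-to-report drops below 30s",
                      "higher renewal rate")
print(linked(s, o))  # → True
```

Writing them as two separate statements makes it obvious, when results disappoint, which bet failed: the solution didn't create the behavior, or the behavior didn't create the business result.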


4. Use Coherence Checks So Leadership and IC PMs Are Aligned

Even if you personally are convinced of an outcome-driven approach, you still have to bring the rest of the organization along—especially if you report to someone who prioritizes “output” or “velocity.”

Scott offered a simple technique: the coherence check. When a leader asks you to build something specific, trace it back to the business goal with a simple question:

“If we build this, which user problem are we solving, and how will we know if that helps the business?”

  • If there’s a clear chain of logic (problem → solution → measurable change in behavior → business impact), then great—get building.
  • If not, it’s a sign you need to clarify the rationale before your team invests time and energy.

Why this matters: Coherence checks let you influence “top-down” decisions without simply saying “That’s a bad idea.” or "But customers are asking for this other thing." Instead, you’re inviting leadership to confirm that you’re both on the same page about the ultimate goal.
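A coherence check is really just asking whether every link in the chain has been named. A minimal sketch (the request fields and example values are illustrative):

```python
def coherence_check(request: dict) -> list[str]:
    """Return the links missing from the chain:
    problem -> solution -> measurable behavior change -> business impact."""
    required = ["user_problem", "proposed_solution",
                "behavior_change", "business_impact"]
    return [link for link in required if not request.get(link)]

# A top-down ask where only the solution is actually specified:
ask = {"proposed_solution": "add an AI chatbot",
       "user_problem": "",                    # nobody has named the problem
       "behavior_change": None,               # no measurable change defined
       "business_impact": "more revenue, somehow"}
print(coherence_check(ask))  # → ['user_problem', 'behavior_change']
```

An empty list means the chain of logic is complete and the team can start building; a non-empty list is the list of questions to bring back to leadership.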


The Bottom Line

Every team wants to move faster, but speed without direction is just a great way to burn budget and time. The core question is always: are your customers doing something differently that ultimately drives your business metrics?

  • Shift from focusing on product “launches” to product “landings.”
  • Map outside-in: start with the customer's broader journey, then zoom into your product's core workflows.
  • Explicitly define your Solution and Outcome hypotheses.
  • Use Coherence Checks to gain organizational buy-in and ensure you’re moving the right metrics.

At the end of the day, what matters isn’t how many things you build; it’s whether those things deliver sustainable business impact. And that starts with linking each day’s product decisions directly back to your top-level goals.

Want to see how Vistaly helps teams define these hypotheses and connect product outcomes to business KPIs? Check us out at vistaly.com or drop me a line!


Big Thanks to Jon and Scott

  • Jon Harmer: Product Manager at Google Cloud, Techstars Mentor, and creator of a top Maven course for product managers on driving business impact. You can connect with Jon on LinkedIn or explore his Maven course for an in-depth dive into these concepts.

  • Scott Sehlhorst: Founder of Tyner Blain, a product strategy consultancy, and host of “The Problem with AI” podcast. Connect with Scott on LinkedIn to follow his latest writing and product strategy insights.


Full Transcript

Steve Klein

So I want to start with the mindset that a lot of product teams have.

I think a lot of product teams are starting from the perspective of "what cool things are we going to build?" instead of "what will customers be doing differently because we launched this thing?" Jon, you've talked a lot about this and done a few events on it. Can you talk a little more about these ideas and what other color you can add?

Jon Harmer

Yeah, absolutely. We're all smart people; jumping to the solution space, to how we're going to solve Problem X, is natural for us. But it's really important to understand and fall in love with the problem space more. Scott has a whole bunch of material about problem statements and how to do that better, and I'm sure he'll talk about it in a second.

But one of the things we talk about at Google, specifically around this outside-in thinking, is talking about landings instead of launches. So yes, you have to launch your product, but if it doesn't land with customers, who cares?

Dave McClure, of 500 Startups, says customers don't care about your solution; they care about their problems. And so if you aren't solving a problem in a way that is meaningful to that customer and is going to change their behavior, it doesn't matter. It's useless.

Steve Klein

Yeah, for sure.

I don't know what it is. I think this intuition of starting from the things we're going to build is somehow hard-coded in the way our brains work. And so I think for a lot of teams, it takes an intentional perspective shift.

Jon Harmer

Because you want to be a problem solver, right?

Steve Klein

Yeah, and we've talked about this a bunch, and there are a bunch of stats out there that maybe you can speak to, but I think the impact we've seen when you're starting from "Hey, what cool things are we going to build?" is that most launches fail.

Most products don't get adopted and don't drive the results teams are hoping for. Scott, any thoughts around that? You've worked with tons of teams on the strategy and product side. What experience do you have with the impact of starting from that perspective?

Scott Sehlhorst

Yeah. So first of all, thanks for having me. I'm thrilled to be here with you guys talking about this stuff. It feels like there's a lot of pressure on teams within organizations to run downhill, right? You're just going as fast as you can, you feel like you're going to fall forward or trip, and you can't do anything but keep running: take the idea that we've got and keep running with it. And the problem with that is most ideas are bad ideas. Most of the ideas in our backlog are unvetted. And when somebody hands you the baton and says, get running downhill as fast as you can, you don't get a chance to think about the purpose or the outcome or the benefit or the need, or any of the reasons why somebody handed you that baton.

It comes back to that customer question. We all cite the old Ford "I'd build a faster horse" anecdote. Really, even when customers are asking us to do stuff, I don't care what they're asking for. What I care about is why they're asking for it. What is their unmet need?

What is the opportunity to make the world better for them that lives behind the request? So many people, at the surface level, take the request and run with it. Or there's another pathology we can talk about, especially for folks on IT teams working on internal systems: somebody just told me to build the thing.

And the ideas aren't always good ones. Every time a bad idea comes in and we don't do anything to vet it, it's a cascade of waste all the way down the system.

Steve Klein

Yeah. And I feel like, in the general product management discourse, we go through these waves of really promoting the idea of being outcome-driven. Right now I feel like we're a little bit at the opposite end: the whole post about founder mode came out, and I think a lot of CEOs took that to mean, okay, cool, I just need to have the strong vision and tell teams what to build.

"I know the customers best; I'm just going to go out and direct the teams on what to build." I'm sure you guys have been following that. I'm curious if you have any thoughts on that whole idea and what the net effect of it has been.

Scott Sehlhorst

I'll throw out that there's a survivor bias in the way we see this stuff, because, at best, one out of 100 of those people who had that brilliant idea were right.

So what are the odds that an idea formed by one person, all by themselves, is a good one? There's stuff you do to manage that, right? And to manage the risk of telling your CEO, "No, that's a bad idea." There's a way to do that. Jon's laughing because we've worked with clients before who struggled with that.

But most of those ideas are bad ones, and those companies are gone, and we never hear about them.

Jon Harmer

There's data available on the internet, so I'm not sharing anything proprietary, but at Google it's been something like 10 to 20 percent of experiments that end up generating positive results, right?

So that's a lot of ideas you had that were essentially bad.

Steve Klein

Right. Yeah. The whole founder mode thing to me was so interesting in that, if anything, it highlighted the fact that really sharing context and getting your product teams aligned on "what is our product strategy? What problems are we solving, and for whom?" is hard.

And so one reaction to that could be: okay, I'm just going to tell people what to build. But I think, and I'd venture to say you guys are similarly minded, that it just means you need to be more intentional and spend more time getting alignment on those things ahead of time.

Jon Harmer

Yeah. You hired smart people, Mr. or Mrs. Founder Mode. If you just tell them what to do, then you're overpaying them. Let them contribute. Tell them what the problem space is and your understanding of it, but let them contribute to all of the thinking around it, right?

Steve Klein

A hundred percent.

Scott Sehlhorst

And if I could add, sorry for interrupting, Steve: maybe even more important than doing this stuff upfront, the discovery before you get going, is changing the conversation in real time, all along the way. Have your decision cycles change from "did I finish?" to "did it work?", and orient the teams throughout the entire process toward alignment on purpose: this is the outcome we're pursuing, this is the change we're trying to create, this is our hypothesis for why we're building these things, why we're improving these systems, why we're introducing these features.

Having that captured in your artifacts, central to your discussions, and informing your process is, I think, probably even more important than having it improve your upfront thinking. There's a mindset shift. Instead of taking the existing linear process (we do some discovery, we write requirements, we do a design, we execute) and treating it as a serial thread with functional silos where you throw things over the wall, you want the purpose piece front and center throughout that process, whether it's serialized or concurrent engineering. There's a lot of wonky stuff in how you actually go about doing it.

But I think the most important thing is that at every step of the way you're saying: this is what we're trying to accomplish, and here's how we'll know if we accomplished it. And that gets infused into all of the progressive elaboration decisions in the different domains of design and execution.

Steve Klein

That totally makes sense.

Going to add a question. There was a question that got deleted; here we go, it got reworded. A question from Ashley around what words you would use to educate founders on how evolving a new product is different.

So, in terms of this problem of PMs feeling like they're in founder mode being very different: any thoughts around that? Just the difference in how teams operate at a very early stage compared to further down the line.

Scott Sehlhorst

I can take a crack at a different interpretation, and Ashley can let us know if I'm misreading her question. I often got asked, "When should a startup hire their first product manager?" And my answer was: six months before you launch your second product. So if she's trying to get at how thinking about a brand-new product is different from all of the thinking the founder did in order to launch the company with the first product, then I start with the initial framing: are we taking a product into new markets, where there are customers we don't understand and competitors we aren't familiar with, so that the needs to be addressed have to change and we need a new product?

Or are we saying: these are the people we understand, this is the competitive environment in the market we operate in, and because we're in that space, we know they have other problems we're not solving with the current product? So we're going to try to expand our footprint and evolve to solve additional new problems for them. That's what "brand-new product" triggered for me: we've already got a connection with the people we're trying to help, and we're trying to do something else as well. Those are two pieces out of the Ansoff matrix, if anybody wants to dig into it more than my fly-by, hand-wavy stuff.

Jon Harmer

Looking at Ashley's previous version of that question, it looks like she owns a brand-new zero-to-one product inside a company with existing products, and the comparison to the other PMs is difficult.

So I do think you have to measure PMs differently for zero-to-one than for more mature products. Hopefully your executives have been part of a zero-to-one effort, but if not, then you definitely have education to do about the incredible level of uncertainty that exists at that stage, right? You don't have as much known problem-space material at that point.

Steve Klein

Yes, totally. Perfect segue, maybe, to get into uncovering some of those unknowns. Jon, you talk a lot about using user journey maps and customer journey maps as a precursor to uncovering your solution and outcome hypotheses. I don't know if you have one of the slides you've used recently to illustrate these, but could you talk a little more about how you think about using those tools, and maybe give an example of what that actually looks like in practice?

Jon Harmer

Yeah, I might be able to share a thing. There are two maps we're talking about here: the user journey map versus the customer journey map. The way I differentiate them: the user journey map is about the touch points with your product, whereas the customer journey map is everything that customer needs to do to accomplish the broader goal they have, which could have nothing to do with your product. Your product could be a small part of it, or a big part of it.

But understanding the context surrounding where they end up touching your product or service is important. Let's see if I can share a thing.

Here we go. This is a slide from the class about the customer journey map. You have the different sections: who's the person, and what is the goal they're trying to achieve? What are the various stages they go through in that full journey? What are the steps? And then the problems.

This is all pretty traditional journey-map stuff. And then the cool thing that Scott introduced, to me at least, is the criticality of each step and the user's satisfaction with that process. How important is it for this to be awesome? How important is it to my ability to get through? Is it a blocker, et cetera?

And then the current and target satisfaction for any given step. So you don't have to gold-plate everything, and understanding what "solved enough" means is super duper useful.

Steve Klein

Yeah. Do you find customer journey maps to be more relevant when you're thinking about new products, or maybe whole net-new sets of functionality in your product? I could see a user journey map being better at identifying the ways your existing product could be slightly better, whereas this is more "let's take a step back and look at the whole process."

Jon Harmer

Yeah. Intuit has a thing called Follow Me Home that their product people do, where they go and sit in a business owner's office for a day and watch them work.

And if the owner grabs a Post-it note and writes down a number or whatever, that's a step happening outside of the tooling, one that Intuit could potentially provide a solution for. That's not necessarily going to show up in a user journey map, because that's just what you're clicking inside the app.

So the customer journey map can be useful both for an existing product and for a new space. It's just that the way you look at it ends up being different, if that makes sense. Scott, do you have any thoughts about that?

Scott Sehlhorst

Yeah, I'm glad that you cited ethnographers, 'cause they are awesome sauce.

But fundamentally it's the difference between an outside-in and an inside-out view of what's going on. If we take the inside-out view of "this is how somebody's interacting with our product," or even the pathology of "this is what I want someone to do with my product," that inside-out view is limited to being transactional.

You're going to get an understanding of the operations people go through, and maybe the sequence they go through given the constraints you've put in front of them because of how your product works, or how a competitor's product works if you're doing competitive analysis. With the outside-in view, you need to jump back and recognize that those operations are being done, for the user, in the context of the activity they're trying to pursue, which may itself sit in the context of a broader activity. There is some sort of hierarchy of purpose as the person tries to achieve their goals. Understand something about what the person is trying to do, and use that as a lens for evaluating the tool sets they use. When you do a Follow Me Home or a ride-along or any of the different ethnographic, survey-type sessions, you need to be able to discern the way they have to do things, and the context in which they're doing them, to understand why they're doing those things.

So I think ethnographers come in with an intuitive outside-in mindset, and that's a shift we have to make as product folks, especially folks like Jon and me who came from an engineering background, where we're thinking about building a solution to the problem. That's a set of intuitions that competes for cognitive space with the empathetic intuitions of understanding why somebody needs a solution.

Steve Klein

Yes. One question here that's right in line with this: any perspective on how to change the thought process for product teams that have been solving problems in fairly mature markets via journey maps? For example, product teams that have been in market for 15 to 20 years.

And I think this is relevant because the longer you've had some success, the more people's thinking tends to orient around "how can we improve the thing we have now?" And there's always some room for that.

But how do you balance improving the thing you have now with, at times, taking a step back and looking across the whole journey? Any thoughts on the percent of time you should spend on each, or really just how to balance the two?

Jon Harmer

For the percent of time, it totally depends on the maturity of your product and the appetite for that kind of thing at your org. If you need to change that appetite, increase it or decrease it, that's a different conversation. But looking as broadly as possible, as often as you can, is going to be useful for finding major pivots, new product offerings, or related offerings, as opposed to just iterating on features to minimize and maximize in the small, if that makes sense.

Scott Sehlhorst

Yeah, if I can add a complementary idea, coming at this situation from a strategic frame.

There's a concept Clayton Christensen introduced of progressively improving the value of the product in terms of solving somebody's problems. If you're a Kano analysis person, you can think of it as the "more is better" kind of thing.

If we keep making these incremental improvements in a product, the reality is that there are diminishing returns. You might be able to keep improving some specification or some measurement of performance, but you're going to run into diminishing returns in terms of the value it creates for the customer, right?

So, flipping back to that outside-in view, you're going to reach a point where the incremental investments don't make sense. You have to ask: are we nearing that point? Do we need to come back and rethink how we're framing the problem? Jon alluded to the problem-statement stuff. Instead of staying fixated on the current industry's definition of what it means to incrementally improve, and on that competitive jockeying and horse race, you say: we're at an inflection point, whether technology has driven it or an innovative idea has shown up, and what we should do is pursue this as something disruptive.

Steve Klein

Yes. An example that comes to mind is Stack Overflow. If Stack Overflow were to keep thinking about "How could we make it easier for people to submit programming questions and for people to answer them?", they'd be missing the bigger picture: there's a new technology, a whole new way for those people to get answers to their problems. So they need to take a step back and think about it from that perspective.

Jon Harmer

Even the Netflix pivot, right? They could have optimized DVD delivery to the end, forever, and become a complete failure.

Steve Klein

Yeah, that's a great one.

Scott Sehlhorst

Or had modest success, which is even worse. I had a client, a CMO I supported, who said: "I don't have a charter, because everything's on fire. We're just not getting better fast enough. I don't have a platform for blowing everything up, because we're doing okay. But we ought to be doing better than okay." Sometimes you have to think about that, to Jon's point.

Steve Klein

Yep, totally. Jon, to take the journey mapping one step further: you talked a little about defining steps in the journey that are particularly painful and particularly critical.

Can you keep going on that idea, and on how it relates to ultimately being able to define your solution and outcome hypotheses for what you're going to build and the impact you hope to drive?

Jon Harmer

Yeah. The journey map is one method of doing some version of customer discovery, ultimately to identify those underlying needs and user problems.

I'm happy if people do it a different way, or multiple ways, or whatever. But once you identify those needs, you have to figure out what you're going to do to address them: which ones to address, and what you're going to build to address those needs.

And what do you think is going to happen as a result? Which is where impact mapping comes in.

Steve Klein

Yes. I know you have this big graphic that connects all those pieces together. Maybe just walk through that?

Jon Harmer

Yeah. Let me do that really quick.

All right. So you understand your users. You develop some problem statements for them that are potentially high-criticality and low-satisfaction.

You take those, and they become the needs in the impact map. Then there's the thing you're going to do or build to address a need: a pair of things, in this case.

How will I know if I'm successful? I will see some measurable change in user behavior. And then, if I see that measurable change in user behavior, how do I know it was useful for my business? Those are the two hypotheses, and we're going to get into a lot more detail about all of this coming up.

And then you have confidence scores for those bits. The idea is to test the things you're least confident about, or the ones that are going to have the most impact on the outcome if you're wrong.
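[Editor's note: the prioritization Jon describes, testing what you're least confident about and what hurts most if you're wrong, can be sketched as a simple score. The claims and numbers below are illustrative, not from the impact map shown on screen.]

```python
# Score each assumption in the impact map by how unsure you are and
# how much it matters, then test the riskiest ones first.
assumptions = [
    {"claim": "users will adopt saved filters", "confidence": 0.8, "impact": 0.5},
    {"claim": "faster reports drive renewals",  "confidence": 0.3, "impact": 0.9},
    {"claim": "setup takes under 5 minutes",    "confidence": 0.6, "impact": 0.4},
]

def risk(a: dict) -> float:
    # Risk = how likely we are to be wrong, weighted by the cost of being wrong.
    return (1 - a["confidence"]) * a["impact"]

for a in sorted(assumptions, key=risk, reverse=True):
    print(f"{risk(a):.2f}  {a['claim']}")
```

Here the low-confidence, high-impact outcome assumption ("faster reports drive renewals") sorts to the top, which is exactly the bet to test first.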

Steve Klein

Scott, do you want to jump in? How do you really explicitly define a solution hypothesis and an outcome hypothesis? What does a good one look like? How do you socialize them within the company?

Scott Sehlhorst

I think, broadly, the solution hypothesis is that problem-solving headspace: I'm not building something because somebody told me to build it.

I'm building it because it's going to address a need, because I need to solve a problem with it. And getting into that outside-in, almost scientific-method mindset: you can't declare success when you finish building the thing. Just because you've created a widget doesn't mean it actually solved the problem.

So you have to ask: if I solve this problem, what will be different? Then comes that inversion: this is a change I could observe, an observable change that will result as a consequence of solving the problem. And that gives me a feedback loop into two things I don't know how to tease apart otherwise: did I pick the right need to solve, and did the thing I built solve the problem? Those are intercoupled. I don't know how you can say, "Oh, I picked the right problem but failed to solve it," or "This is a great solution, but to the wrong problem." I don't know how to tease those apart. So there is a pairing, but the mindset is that the solution hypothesis captures your problem solving.

It's the bet you're placing: I've got the right problem and the right solution, and I executed on it effectively. It's the intellectual integrity of your model, right? So, Jon showed this; I'll actually share my screen and move my mouse as well.

I've got you. Okay, so you can see the idea that there is some causation in place: building things addresses needs, which causes an observable change. That's what you orient toward to give you a feedback loop on whether you solved the problem, and that's where the hypothesis piece, specifically the solution hypothesis, comes in. The second hypothesis is: if I can create that change, it's going to get me to value. And this is where, when I'm talking about the business design of placing bets, your business acumen really kicks in.

A great example for me: if you guys remember Groupon back in the day, they offered services to small businesses built around these insane discounts. You would offer a 50 percent discount on whatever your product or service was, and then of that 50 percent, Groupon would keep anywhere from 30 to 60 percent.

So really you were giving something like an 80 percent discount in order to get somebody in the door. The problem they were solving, the need they were trying to address, was conversion, or customer awareness. Groupon built a solution that said: hey, we do these crazy discounts, we've got a platform that gets you visibility you can't get on your own as a small business, and it's going to get you exposure, get you lead gen, introduce new customers, and get a crazy high conversion rate of people coming in and buying your service.

The presumption was that doing that would lead to value somehow. And they really did some hand-wavy stuff around that, because pretty consistently it didn't work for the companies who followed that model. Some of them even went out of business. The problem is, there is a separate outcome hypothesis, and Jon showed it as well: the idea that if I increase conversion rate, it should lead to value for the business. But that's only a leading indicator.

It's not a guarantee...and maybe it didn't.

Steve Klein

Talk to me about... I feel like, in a lot of ways, that's the crux of a startup: can I solve painful enough problems that it leads to value for customers? And that can be hard to measure, especially if you're building something that makes a company's employees more efficient. In some ways it's a little hand-wavy.

Measuring it is a little sentiment-driven, right? How much does the buyer really feel like we're actually helping their team move faster and do things better? Any tips on good ways to quantify that?

Jon Harmer

Definitely use the impact map. As you talk about the thing you're going to build to address the need, the next blue box on the map is how the user's behavior is going to change as a result of that need being addressed: to use Josh Seiden's phrase, a measurable change in human behavior. You've created user value as part of that, so you have to come up with something to measure the success of that feature for users.

Adoption: they make it through your funnel better, they make it through a process workflow better, they stop getting blocked, bugs stop kicking them out. Most product people already have those kinds of metrics, right? I launched my feature and I'm going to drive these product metrics, these user behavior metrics.

I think the bigger gap for most product people I've talked to is connecting that to revenue or cost savings, or whatever the top-line, CEO-level, board-level metrics are that they care about.

Steve Klein

Yes. Yeah. So I'm hearing you use these basically as proxy metrics for whether people are actually getting value. People don't log into B2B SaaS tools and use them for fun; they use them because they're getting something out of them. So measure these proxy metrics.

And at the end of the day, there's some leap of faith, some hypothesis, that the higher the percentage of people we get using this feature, or the faster they can use it, or the more times we get them to do it, the more value they're getting.

Does that sound about right?

Jon Harmer

That is the goal. Yeah.

Scott Sehlhorst

Yeah. And the piece I'll add: during the lightning round, for anybody who joined after, I mentioned How to Measure Anything by Douglas Hubbard. If you want to get mathy about it, he looks at Bayesian inversion, but here's the soundbite for those of us who aren't crazy math heads. Instead of asking "what can I measure to tell me that it happened?", you ask "if it happened, what would be different?" You just invert the mindset and say: if people are getting more value out of using this, what would change?

In the early days of mobile apps, when people were scrambling to figure out what to measure, there's a very good example of applying that. I'm not looking at first-use or download metrics; those are vanity metrics. What I'm looking at is sustained use, because if we did something valuable, people would continue to use the product, especially in a field with a lot of competitors. So we look for continued usage, not first use.

That's the mindset shift: what would be different if the hypothesis we're testing were true?
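Hubbard's inversion can be made concrete. Below is a minimal Python sketch, with an invented event log and a hypothetical `week4_retention` helper (none of this is from the talk), measuring sustained use as the share of a cohort active again four weeks after first use, instead of counting first uses:

```python
from datetime import date, timedelta

# Hypothetical event log: (user_id, day the user was active in the product).
events = [
    ("alice", date(2024, 1, 2)), ("alice", date(2024, 1, 30)),
    ("bob",   date(2024, 1, 3)),                 # tried it once, never came back
    ("carol", date(2024, 1, 5)), ("carol", date(2024, 2, 5)),
]

def week4_retention(events, cohort_start, cohort_days=7, check_after=28):
    """Share of users first seen in the cohort window who are active again 28+ days later."""
    first_seen = {}
    for user, day in sorted(events, key=lambda e: e[1]):
        first_seen.setdefault(user, day)
    cohort = {u for u, d in first_seen.items()
              if cohort_start <= d < cohort_start + timedelta(days=cohort_days)}
    retained = {u for u, d in events
                if u in cohort and d >= first_seen[u] + timedelta(days=check_after)}
    return len(retained) / len(cohort) if cohort else 0.0

print(week4_retention(events, date(2024, 1, 1)))  # 2 of the 3 cohort users retained
```

A download count would score all three users the same; the inverted question ("if it were valuable, they'd come back") is what makes `bob` visible as a miss.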

Steve Klein

I can see it being hard, because all this stuff we're talking about is really about changing how people think, right?

It's switching from "what cool things are we going to build?" to "how is user behavior going to be different as a result of the things we launch?" For product leaders, any tips on how they can get individual product teams thinking more this way?

Jon Harmer

I think metric trees are an interesting way to tie those product-level leading metrics to business-level metrics.

As a leader, make the chain more explicit: if we increase engagement on this feature, that leads to better retention, which leads to increased revenue. Work your way up, and start talking with your teams about that whole path when they bring product ideas. Then start measuring the success of products on both the user behavior changes and the business changes.

I think that's a great way to get PMs thinking this way. And maybe you use an impact map to make those multiple hypotheses explicit, so everybody's on the same page: this is a guess, we're going to test it, and if we see this change, that's how we know we're right.
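One lightweight way to write those hypotheses down, sketched here with invented metric names and assumed elasticities rather than anything from the session, is to encode the metric tree as data, so each guessed link from a product metric up to the business metric is explicit and reviewable:

```python
# Hypothetical metric tree. Each edge carries an *assumed* elasticity: the
# guessed % change in the parent metric per 1% change in the child metric.
# These numbers are the hypotheses a team commits to testing, not facts.
METRIC_TREE = {
    "revenue":   [("retention", 0.6)],
    "retention": [("feature_engagement", 0.3), ("time_to_value", -0.2)],
}

def expected_lift(tree, leaf, pct_change, target="revenue"):
    """Propagate a % change in a leaf metric up the tree to the target metric."""
    for parent, children in tree.items():
        for child, elasticity in children:
            if child == leaf:
                lift = pct_change * elasticity
                return lift if parent == target else expected_lift(tree, parent, lift, target)
    raise KeyError(f"{leaf} is not in the tree")

# "If engagement on this feature rises 10%, our model predicts ~1.8% more revenue."
print(expected_lift(METRIC_TREE, "feature_engagement", 10.0))
```

The point is not the arithmetic; it's that when the launch lands but revenue doesn't move, you can see exactly which edge of the tree (which hypothesis) was wrong.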

Steve Klein

Yes. And a quick obligatory shout-out to Vistaly. Vistaly is designed around doing exactly that, right? It lets you define that metric tree, map the product metrics that feed into it, and manage the whole discovery process around it.

For IC PMs, who I think often feel this problem a bit more acutely because their product leaders don't really operate from that perspective: what can they do? I'd venture that most product leaders have some of these ideas in their heads already. What can IC PMs do to get their product leaders to be more explicit about these assumptions?

Scott Sehlhorst

So I'll jump in with one of the things I start with on teams. Jon said it earlier: we hire smart people, right? These are smart, well-intentioned people, and ultimately what we're trying to do is change the culture of a dynamic from dictation to collaboration.

Dictation tends to take the form of "go build me this thing because I want it, and I'm your boss, or your boss's boss." What I introduce is a coherence-check conversation. Pushing back and saying "we shouldn't build the thing you asked for, all of our customers are asking for something else" is not a helpful conversation. Instead, you use the exact techniques we've talked about, but run them backwards: if we build this thing, what is the need we imagine it would address? Who has that need, and at what scale? If it's a process improvement, what efficiency, what frequency, what magnitude of impact?

Then identify an observable change, use your metric tree or value driver tree (I waved my hands earlier at a Markov model of how your business makes money), and ask: what is the mechanism of value realization that comes from the change I expect to see, given that I think this is the problem the widget would solve? Then make sure, talking with your product leader or whoever's asking for the thing, that you can align the level of effort with the scope of ambition.

The framing is: "CEO, you have a very broad perspective, and maybe I'm missing something, but I have a very deep perspective. I get to look at things you don't have the opportunity to look at anymore. So let me combine my narrow view with your broad view and say: this thing will lead, in our best understanding, to the following estimated range of outcomes. I want to make sure we're on the same page, that I'm not missing something you didn't share, and that your breadth of perspective didn't cause you to overlook some reason the idea might not be as good as you imagined."

That coherence check culturally shifts that conversation, and the conversations that follow. And as a bonus, it'll get you invited into more strategic conversations, because far too few product managers think to have that conversation, much less can have it.

Steve Klein

Yes. Yeah. A hundred percent. I think it relates back to this idea that everyone, from product leadership down to IC PMs, has a vague mental model of how these things connect, right?

It's "if I build this, we're solving a customer problem, and ultimately it's going to lead to growth." This coherence check sounds like an opportunity for everyone to get their assumptions out on the table in a more explicit, visual way, using a driver tree. I think it can lead to much better conversations, where everyone has the chance to be explicit about their assumptions: why I think this is the way to go, or why we should focus on improving this product metric versus that one.

Scott Sehlhorst

Yeah, you really have to make the connection. We mentioned Dave McClure and pirate metrics and so on; people think about churn and say, "Oh, I want to retain my customers. That's obviously good, an objective in and of itself. So let's go build things that help with customer retention, or increase customer lifetime value." Wait a second: those are different things. How much am I increasing customer lifetime value by retaining them? You need some beliefs here, which in an established company are actually a lot easier to inform: how much repeat business is there, what percentage of sales go to existing customers?

You can build a model that says: this is our belief, based on the information we have right now, and we can only make today's decisions on today's beliefs. This is our belief about the magnitude, the quantified benefit, of reducing churn, instead of just running downhill as fast as you can toward a target like "we're going to go from 90 percent retention to 95 percent retention."

Okay, great. What's that worth? How much should you be willing to spend? You have to make that other connection. You just have to; that's part of what product management is.
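Scott's "what's that worth?" question can be put in numbers with the standard geometric lifetime-value model, LTV = margin contributed per period / churn rate. This sketch uses invented figures (ARPU, margin, customer count) purely for illustration:

```python
def ltv(arpu_per_year, gross_margin, retention_rate):
    """Geometric lifetime value: margin contributed per year / annual churn rate."""
    churn = 1.0 - retention_rate
    return arpu_per_year * gross_margin / churn

ARPU, MARGIN, CUSTOMERS = 1200.0, 0.8, 5000  # assumed inputs for illustration

before = ltv(ARPU, MARGIN, 0.90)  # 90% retention -> 10% churn -> ~9,600 per customer
after  = ltv(ARPU, MARGIN, 0.95)  # 95% retention -> 5% churn  -> ~19,200 per customer

# In this model, moving retention from 90% to 95% roughly doubles LTV.
print(before, after, (after - before) * CUSTOMERS)
```

The last number, the per-customer delta times the customer count, is roughly the most the retention initiative could be worth, which is one way to answer "how much should you be willing to spend?"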

Steve Klein

Yes. Any tips around tools, or maybe not tools but processes? A lot of folks are working at somewhat established SaaS companies that have a data team, but they still don't have a good grasp on these ideas or this data. This is probably honestly a whole other webinar on how to work with your data team to get a grip on this, but any quick tips or thoughts?

Scott Sehlhorst

There's a methodology answer I would use: GQM, the Goal-Question-Metric methodology. You say: this is what we're trying to accomplish; what are the questions we would ask to tell us whether or not we accomplished it; and what are the measures we would use to answer those questions? GQM is a way of being just a little more rigorous about saying "this is what I need to measure to know if I'm on track to achieve what I'm trying to achieve." It's applicable at the OKR level, the KPI level, individual performance, team flow metrics. It's a useful methodology for thinking about what you're going to measure.

The other half of it is AIM, adverse impact mapping: what are the unintended consequences of choosing those measures? You need to do some second-order thinking so that you don't do harm right alongside the good you do by picking something to measure.
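As a sketch of GQM plus an adverse-impact check, here is the structure written down as plain data; the goal, questions, metrics, and counter-metrics are invented examples, not ones from the session:

```python
from dataclasses import dataclass, field

@dataclass
class GQM:
    """Goal-Question-Metric, with adverse-impact counter-metrics alongside."""
    goal: str
    # Each entry pairs a question with the metrics that would answer it.
    questions: list = field(default_factory=list)
    # AIM: guards watching for unintended consequences of optimizing the above.
    counter_metrics: list = field(default_factory=list)

onboarding = GQM(
    goal="New users reach first value faster",
    questions=[
        ("Are users completing setup?", ["setup completion rate"]),
        ("How long does first value take?", ["median time to first key action"]),
    ],
    counter_metrics=[
        "support tickets per new user",  # did we speed setup up by hiding needed steps?
        "week-4 retention",              # did a faster start trade away durable use?
    ],
)

for question, metrics in onboarding.questions:
    print(f"{question} -> {metrics}")
```

Writing the counter-metrics next to the goal is the second-order-thinking step: whoever optimizes the headline metrics also owns the harm they might do.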

Steve Klein

Got it. Okay. Yeah. That last one sounds like setting good countermeasures: just because we improve one thing in one part of the product, we still need to see how it affects the whole customer journey.

Jon Harmer

Yeah. You need a basket of metrics that you correlate against each other. Just because you increased time on site doesn't mean you made it better; you could have made it harder to accomplish the goal.

Scott Sehlhorst

Yeah. Yeah. Christina Wodtke talked about that. I always assumed it was in her book; I went back and looked, and it wasn't. It was a talk of hers I saw probably 20 years ago: you should have three metrics, because people bias their behavior toward whatever is measured.

I had a CEO 25 years ago who said, "whatever ruler you use, people will try to look as tall as possible standing next to that ruler." And Wodtke's point was that you need complementary, but basically contrasting, measures. They make it harder to go off the rails with what I call malicious compliance, like the toddler who follows the rules without following the spirit of the rules.

Steve Klein

Yes. Yeah. I love that. That's great. I think this is a little related: we had a couple of pre-submitted questions, and I want to throw one on stage. This person's product team reports to the CTO, they feel like they're always being pushed to move faster, and they feel discovery is slowing things down.

This ties back to measuring and setting good countermeasures. How can folks in this position get their leadership team to understand the value of doing this?

Jon Harmer

"Faster," in this case, almost always means deliver more things, or deliver things in a shorter time frame.

If you do something like impact mapping, where you trace work through to business outcomes, then you can show: yeah, I hit on budget, on time, or ahead of schedule, and that was great, and the world didn't change at all, because we didn't do enough discovery to understand the problems we needed to be solving.

So by being explicit about those hypotheses and measuring the business outcome, you can start to push back and say: okay, we're going to do things that move the business forward faster, not necessarily ship more features.

Steve Klein

Yeah. Would you suggest going as far as coming with examples, saying: hey, you had us build A, B, C, and D, and here were the results, so we need a shift in how we think about this?

Jon Harmer

Yeah, it is super awesome if you have the counterexample: we shipped 20 features because we were told to, but customers complained, there was this much rework, and revenue didn't go up, usage didn't go up, or it actually went down. There's a Microsoft stat we hinted at at the beginning: only a third of features actually improve the target metric, a third are neutral, and a third make it worse. If you have those "third that make it worse" stories, those are awesome for pushing back to the CTO and talking more about business value.

Yes. Thanks.

Scott Sehlhorst

Yeah, I was going to say, I agree that trying to shift the mindset from "do more" to "do better" is important. It can be particularly tricky with a CTO or CIO, because they're not held accountable for the market impact of what they build; they're held accountable for building more stuff.

And so it becomes an incentives question. How far you push that, and how, is a little different in every organization and depends on the personalities involved. The other angle is introducing the notion of course correction. Don't delay the start of obvious activity with big upfront discovery; instead, imbue the ability to course-correct into the organization. We're still running downhill, but maybe we've got skis; we're still going fast, but we've introduced something that gives us the opportunity to learn while we're doing, to improve the plan in parallel with executing it. Those course-correction practices slow you down a little in terms of outputs, but they accelerate you massively in terms of outcomes, because you cut waste out of the system by identifying that it's waste.

And the earlier you can kill a bad idea, the fewer the consequences, right? If you can kill an idea that's solving the wrong problem before you invest your critical design and engineering resources in a solution, that's a win. At a company I worked at (it wasn't my company), one of my colleagues pointed out that we don't get rewarded for preventing forest fires. That's an interesting conversation to have with a COO who's responsible for operations: taking a more elegant approach to building things that matter, not just building more things.

How you get there, how you infuse course correction into your decisions, your artifacts, and your people's skills: it's actually pretty easy to talk about in a webinar like this, and a really pretty complicated change for an entrenched team, environment, or leadership-created culture.

Steve Klein

Yes. Yes, that makes sense. It's related to this other question I'm going to throw up real quick: how can we align incentives across product, design, and engineering to reward this kind of thinking? One quick thought I want to share first: our last webinar was with Jeff Gothelf, about OKRs.

One of the things he talked about that I thought was interesting was that planning seasons should be short, and the things we plan to build should be relatively near-term. Teams get into this situation where they set year-long roadmaps of all the stuff they're going to build, and they mentally commit to it. Even if data or feedback comes up that makes you want to course-correct, like you were talking about, you've already pre-committed to building that stuff. I'm curious if you have any thoughts on that.

Scott Sehlhorst

I do.

It's something that either makes me really good friends or really disinterested clients, and that is to say: the plan is not the goal. The goal is the goal. This is how I got kicked out of a client once; I asked, "how long are we willing to wait, after we know it's wrong, before we stop doing it?"

But that's really what it comes down to. And then it's bedside manner, how you effectively get that shift to happen.

Jon Harmer

Yes. And there's some sense of shifting your roadmaps from the exact list of things you're going to build to the list of problems you're looking to solve for customers, or the business outcomes you're looking to drive, right?

That's a huge cultural shift. But it is awesome if you can pull it off.

Scott Sehlhorst

And yes, please, God, don't conflate a feature release plan with a product roadmap. They're fundamentally different things, used by different people for different jobs.

Steve Klein

Yeah, I think it's almost natural for people to gravitate toward year-long roadmap planning because it feels safe.

It feels like something I can control; I don't have to introduce these hypotheses that if we build this, it's going to solve this problem, which is going to drive this outcome, which is going to drive this business impact. So there's some amount of getting people comfortable with the uncertainty inherent in putting a customer problem, or an outcome you want to drive, on your roadmap or in your plans.

Okay, cool. I could talk about this stuff all day. Before you go, what are the best ways to connect with you both, follow your work, and learn more about you?

Jon Harmer

You can find me on LinkedIn, Jon Harmer, spelled like you see it on the screen, without the H in Jon, or on Maven, where I teach a course about all of this. You saw some of the slides here. Those are the best places to find me.

Scott Sehlhorst

Yeah. And for me, LinkedIn slash Sehlhorst, spelled just like my name, except I've got both H's, so don't skip one. I'm also a co-host of a podcast called The Problem with AI, where we look at the challenges and consequences of using AI: not how to use it, but why we do. I'm working on a couple of books, and I've got the Tyner Blain blog, but LinkedIn is the right place to start. Anything I'm doing, like this wonderful session today, I'll promote there, and it's an easy place to reach out and have a conversation.

Steve Klein

Thanks so much. Okay, cool. With that, we'll say see ya.

And thanks so much for coming. Thanks so much, guys, for your time. Appreciate it.

Jon Harmer

Thanks for having us. This was awesome.

Scott Sehlhorst

Thank you. Thanks, everyone.

Try Vistaly: the operating system for Outcomes-focused product teams

Identify and assign Outcomes, discover and prioritize opportunities (with OSTs), track assumption testing, now/next/later roadmap designed around Outcomes
