Schedule a Consultation

Have you read our blog and still have questions? We offer no-cost consultations to portfolio firms and organizations seeking further advice or assistance in strengthening and growing their Product and Tech teams. 

Sign up now to schedule your session with one of our expert principals. 

What's next?

Interna principals present at events worldwide. We send out a monthly newsletter with information on where to find us next and how to stream our talks to wherever you are. Our newsletter is filled with additional tips and tricks for executive leadership and the latest stories in global tech news.

 

Stay up-to-date by subscribing to our newsletter using the button below.  

I'm David Subar,
Managing Partner of Interna.

 

We enable technology companies to ship better products faster, to achieve product-market fit more quickly, and to deploy capital more efficiently.

 

You might recognize some of our clients. They range in size from small, six-member startups to the Walt Disney Company. We've helped companies such as Pluto on their way to a $340MM sale to Viacom, and Lynda.com on their path to a $1.5B sale to LinkedIn.

Metrics for CTO Success



David's notes:

As Engineering groups get bigger, the metrics used to report up and across, and to manage down, become more important for ensuring that teams are empowered, independent, and able to scale. Often, CTOs mistake the metrics they know best for the ones they should be communicating, and that mistake can be dangerous.


This discussion tackles:

  • How engineering metrics can be a danger to CTOs;

  • Why business metrics matter for engineering delivery;

  • How presenting the wrong metrics can cause a rift with Product; and

  • Why engineering metrics can cause CEOs to doubt the worth of their CTOs.

While still necessary, engineering-driven metrics (like code coverage and churn) have little value outside of the Engineering team. Velocity, for example, is important to Engineering and partly to Product, but is completely the wrong thing to discuss with CEOs. Communicating results using the wrong metrics can lead CEOs to believe that their CTO is poorly focused, feckless, or worse, an impediment to the organization.


In this presentation, CTOs will learn:

  • Customer-driven metrics that CTOs should present to their CEOs: what they are, and why they are valuable;

  • How customer-driven metrics, in addition to engineering-driven metrics, can align rather than separate Product Management and Engineering; and

  • Why it is critical to supplement engineering-driven metrics with customer-driven metrics, even when communicating with the Engineering team.

In short, CTOs will learn how to use metrics to create value for the company and to create alignment with the CEO.

 

Transcript:

David Subar:

Well, thanks for having me. I'm excited to talk with you all today. I'm going to start by giving you a little bit of background about me and what I do, and I'm telling you this so you have context about the talk, my opinions, and how I came to them. I am the Founder and one of the four folks in a firm called Interna, and we do, in fact, work with CPOs (Chief Product Officers) and CTOs (Chief Technology Officers) on Product Management and Engineering. And the main thing we think about is how to make Product Management and Engineering more effective, how to help CTOs and Chief Product Officers lead those groups, and how you get to product-market fit faster. It's all very much around the lean startup model and how you think about that.


So I've been doing this for about eight years. Before this, as mentioned, I was CTO and Chief Product Officer at a bunch of different companies. I started my career doing research and development in AI and ML at a military think tank in DC. I went from there to being a developer at proper technology companies, leading larger and larger groups, being a tech lead, eventually being CTO and leading product. And the only reason I tell you that story is because doing R&D in AI and ML sounds very exciting-- I thought it was going to be, and it's terrible. And the reason it's terrible is that if you do a good job, you write a paper. And if it's an interesting paper, you present it at a conference. And if it's a good conference, maybe 200 people listen to you. And then, nothing happens.


And so, that was a pivot point for me in my career-- deciding I wanted to use technology to build things that actually affected people. And that's what got me on this path to first building things, and then building teams that built things, and then thinking about how do you do that in scale. And that's how Interna got started.


The last thing I'll mention about that is... Well, there are actually two things. Sorry. It turns out the patterns you see from small to large companies are oftentimes very similar. So I've done a bunch of work with startups, but I've also worked with companies as large as The Walt Disney Company. You see a lot of patterns that are the same, and that has helped shape our thinking about metrics and communication patterns between CTOs and CEOs-- what works, what doesn't work, and how to be effective about it.


Just one more piece of background and I'll get off this. The kind of work that we do, and the way we see this-- I tell people we do three and a half things. One is we'll go and evaluate Product Management and Engineering teams, and this is particularly germane to our subject today. We'll go in, we'll take a look, and we'll say, "Here's some stuff that's going well. Here's some stuff you might want to do differently. Here's how." The second thing, as Nandu mentioned, is coaching and mentoring CTOs and Chief Product Officers: now that you've got the picture of the world, how do you help someone be more effective doing that? The third is interim work-- if there's no one in place, we fix the teams ourselves.


And the half thing is diligence. The reason I'm telling you all this-- and I'm sorry I'm belaboring it so much-- is that we see lots of different companies longitudinally, and across those companies we see which metrics work and which don't. So here's the ironic thing about CTOs and metrics: engineers-- the background most CTOs grew up in-- are really, really good at numbers and really bad at metrics. And I hold the reason for this is that they don't understand what the metrics are for and what kinds of things to use those metrics for, right? Metrics are a tool of communication.


There's a bunch of metrics that you can read about. You can go on the web and find a million of them, but what you're communicating, and how you decide to communicate it, is the difference between a CEO-CTO relationship that is good and one that is bad. In my experience, the average life expectancy of a CTO at a job is about two and a half or three years. And I hold that one of the big reasons for this is that what's being communicated, and the way it's being communicated, is problematic. There's a skew between what the CEO wants to understand, or can understand, and what the CTO is communicating.


By the end of this presentation, I'm hoping to leave you with three tools that you can use to talk with your CEO. And as Nandu said, I very much encourage you to ask questions as I'm talking, interrupt... all of that is fair game, but particularly as we're talking about these tools, I will be highlighting them and you should feel free to ask questions.


Before I dig in, I want to know a little bit more about you guys so I can shape the presentation. There are 21 of us here. So tell me about your company's size. What's a big size for some of your Engineering teams, and what are smaller ones? A couple of you can just put it in chat.


Nandu:

Just to clarify, you want to know Engineering team size or company size?


David Subar:

Actually, I'd like to know both, engineering and company. So people can put in chat: 20 engineers, 60 people at the company, or whatever. Can I get a few more? There you go. Thank you. Great. Okay. So we're basically all over the map. Some small, some larger. Great. Thank you. Here's why I asked that question: the maturity of the communication patterns changes as companies change in size and as Engineering changes in size. When you're starting out, typically people are thinking about how do I build something? Teams are small. Communication is easy. Oftentimes the CEO can just talk directly with the engineers-- not literally across the room anymore-- and it's very easy to communicate.


And the things that people are concerned about are very tactical, day-to-day pushes forward: How do we land this particular client? How do we get this revenue going? How do we decrease churn in this small way? And frankly, you don't need a lot of process and you don't need a lot of metrics, because you're talking all the time and everything is very tactical. Everything's very upfront. As the group starts getting bigger-- Engineering getting to 20-ish people-- people tend to get more focused on how to manage at scale. And so the metrics they're looking at are the more typical things: velocity, and just how to get the teams organized.


And at some point you outgrow that-- you outgrow the ability of that to be meaningful-- and people start thinking about how the engineering organization creates customer value. And I very much talk about Engineering and Product Management as one combined organization, and the reason is because I hold that they have the same mission, right? You're delivering a product that creates value. I also hold-- and I tell you these philosophical things because they set context-- that the goal of a product company is not to generate revenue. At least, that's not the primary goal. The primary goal is to create value for the customers; revenue and profit are a side effect of that. And this maturity model, from small to medium to large, is the way people generally go up, but they generally get to the next step too late. It's not until they're in pain that they find they need to get to the next step. And it's often at these transition points that CTOs and CEOs develop a skew.


So back to what metrics people use to communicate. I see in the chat someone talked about this-- James mentioned some of these metrics-- and these are the typical metrics that CTOs talk about: velocity, the increase in velocity or the standard deviation of velocity, commits, test coverage over the code, things of that nature. And I hold that these are interesting, but I hold that these metrics, if you talk to the CEO about them, will kill you. And they'll kill you because A) the CEO generally won't understand what they mean unless the CEO was a CTO himself or herself, and B) it's not what the CEO actually cares about.


The CEO is concerned about moving the business forward by some metrics that are important to them. Now, if you're in the beginning-- a Series A or Series B company, maybe even a Series C company-- typically they're looking at how to get to some milestone by which the next round can be raised at a better valuation: showing some tangible way the business has moved, either penetration into some market with a bigger team, some risk reduction, something like that. And these metrics talk about how fast you're running, but not what goals you're accomplishing as a business. In fact, not only do they talk about how fast you're running, they talk about how fast you're running in a way that the CEO cannot evaluate; they are, if you will, trust-based metrics for the CEO. And the CEO doesn't care how fast you're running if you're running in the wrong direction. And so my argument is, you do not want to have these conversations with your CEO.


The only conversations that make sense are the ones about pushing the customer forward. And even in this case, CTOs often have the wrong conversation about who the customers are. There's a bunch of patterns I've seen where technology groups talk about someone internal as being their customer: marketing or sales. We're building this because marketing wants it. We're building this because sales wants it. And the question there is, if that is your customer, are you a CTO or a CIO? And I don't at all mean to denigrate CIOs. They're super important. A friend of mine is CEO of Trader Joe's, and he was looking for a CIO. This was years ago, before I started Interna. And he asked me if I'd be interested. And I said to him, "Brian, no, because if I were your CIO, you'd only see me if the cash registers stopped working, and then you'd yell at me." The job of a CIO is to do something well and then, the next year, because you're a cost center, do more of it for cheaper.


The CTO-- and this is the advantage that you guys have-- builds a product that generates profit. And so your customer is not the sales group, is not the marketing group, and you shouldn't think about your metrics that way. Your customer is the customer of the company. You share the customer with marketing and sales, right? And so this is the basis on which you should be having the conversation with your CEO. So I'll typically get called-- it depends; I talked about the three and a half kinds of engagements we do-- but in places where they don't have a CTO or Chief Product Officer and we're being called in to do interim work, I'll meet the CEO, and about 15 minutes into the conversation I'll look at the CEO as earnestly as I can and I'll say, "We need to talk about Jesus." And then the conversation usually stops.


Well, the person across the coffee table-- now across Zoom for me-- will look at me awkwardly, and I'll look as honest as I can and I'll say, "We need to talk about Jesus right now." And then there's usually a look of fear in the person's eyes and I'll go, "Oh no, no, no, you've got me wrong. I'm not an evangelical Christian. As a matter of fact, I'm not a Christian at all. I'm Jewish. But let me tell you why it's important we talk about Jesus." And then I'll tell them this true story of a guy that I worked with at my second job. He was an evangelical Christian. We became friends, and he was looking to get married. His name was Ken Johnson.


And I'll tell the CEO the story. I said to Ken, "Well, Ken, describe the kind of woman you want to get married to, and if I meet someone, I'll introduce you. I'll hook you up." And Ken looked at me and he said, "You don't understand the problem." And I looked at Ken and I said, "No, no, no. Tell me-- do you want someone tall or short, or do you not care?" He was a black guy. I said, "Do you only want black women, or do you not care? Describe this woman and if I meet someone I'll introduce you." And he looked at me and said, "You don't get the problem." And-- I was in my late '20s-- I looked at him and I said, "Ken, I've had some experiences in life. I think I've got this. So describe this woman and I'll introduce you." I was getting really frustrated. And he looked at me, and he put a glass in front of me, and he said, "David, this is Jesus." And he pointed at the glass with his right hand. He said, "I'm trying to move towards Jesus."


And he pointed at the glass with his left hand and he said, "If I find a woman who's also trying to move towards Jesus, we will tend to converge. You're asking me if I want a six foot tall black woman."


So then I'll pause the story, I'll look at the CEO, and I'll say, "Do your product managers, do your engineers, know who your Jesus is? Do they know what makes your Jesus happy? Do they know who your company is here to serve?" Because if so, they'll tend to converge. Otherwise, if your engineers think their job is to write code, if your product managers think their job is to write user stories, you've got a problem.


And so what I hold is that this whole communication between the CEO and the CTO, and all of the Engineers and all the Product Managers, has to be focused on who the Jesus is, what makes them happy, and how you would know. And it's not enough for the CTO to know-- that's important, it's necessary-- but engineers can write anything five ways, and without the context of "why are you doing this?" they won't consistently write it the right way. Different agile techniques like Scrum or XP actually have good methodologies for thinking about how to do some of this stuff. People just tend not to execute them well.


So I told you I was going to give you three tools for making this communication effective. Here's the first one: the epic statement. Nothing should go on your roadmap unless it has a specific statement of value. You don't have to use this exact one-- this is one that I've used quite a bit, that I find to be effective and easy to understand-- but it's pretty simple: we believe that by doing this feature, this specifically named feature, for this specific type of person, we're going to achieve this outcome for them, and we'll know it when we see this metric move. Every epic, every major thing that goes on the roadmap, should have some kind of statement like this: one that is clear, well understood, and measurable. If you can't state it like this, you might want to ask yourself why you're doing it. How will you know if you're doing the right thing? That's the first tool.
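
[Illustrative sketch, not from the talk: one lightweight way a team might capture epic statements as structured data, so every roadmap item carries its hypothesis and target metric. The Python below is hypothetical; all field names and numbers are invented.]

    from dataclasses import dataclass

    @dataclass
    class EpicStatement:
        """One falsifiable bet per roadmap epic."""
        feature: str           # the specifically named feature
        persona: str           # the specific type of person it serves
        expected_outcome: str  # the outcome we believe it creates for them
        target_metric: str     # the metric we expect to move
        target_delta: str      # how much, and by when, it should move

        def as_sentence(self) -> str:
            return (f"We believe that by building {self.feature} for {self.persona}, "
                    f"we will {self.expected_outcome}; we'll know it worked when "
                    f"{self.target_metric} moves {self.target_delta}.")

    # Hypothetical example:
    epic = EpicStatement(
        feature="one-click reorder",
        persona="returning weekly shoppers",
        expected_outcome="cut reorder time from minutes to seconds",
        target_metric="repeat-purchase conversion",
        target_delta="from 22% to 30% within two sprints",
    )
    print(epic.as_sentence())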


Here's the second tool: retro your epics. Let me talk about this just for a moment. All Product Managers are bad at their job. Every one of them. There are a lot of CEOs who think they're Steve Jobs, and I have a habit of regularly telling CEOs, "I know how many Steve Jobs currently exist in the universe. There was one; there is now none." The thing about this is, the reason Product Managers are bad at their job is because it's impossible to do it right-- to guess what the market wants. And so the question is not what you got right, but how quickly you are learning.


It's not enough to have an epic statement. Just like people retro sprints when they're done, you have to retro whether your bet on what the market wanted was right or not. Now, you can't do that the day of release. You need to wait some time-- call it two or three weeks; it might be different for you in particular-- to understand whether or not you created the value you expected to create. But if you don't retro your epics, once again, you're running quickly, but maybe not in the right direction.


So you've got to go back to when you said, "We believe by doing this feature for this type of user, we will have this effect, and we expect to see this metric move," and ask yourself: did we get it right or not? You should not expect to get them all right. As a matter of fact, if you get them all right, you're not betting hard enough.
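
[Continuing the illustrative sketch from the first tool, and not part of the talk itself: an epic retro can be as simple as recording, a few weeks after release, what the metric actually did and comparing it to the bet. Names and values below are hypothetical.]

    from dataclasses import dataclass

    @dataclass
    class EpicRetro:
        """Did the bet pay off? Captured two to three weeks after release."""
        epic_name: str
        expected: str   # the metric move we bet on
        observed: str   # what the metric actually did
        learned: str    # what this taught us about the customer
        next_step: str  # how the roadmap changes as a result

    retro = EpicRetro(
        epic_name="one-click reorder",
        expected="repeat-purchase conversion 22% -> 30%",
        observed="repeat-purchase conversion 22% -> 25%",
        learned="returning shoppers reorder, but only for small baskets",
        next_step="add saved carts for large baskets; re-measure in three weeks",
    )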


Years ago, I did a project for American Express with a bunch of other people. It was back when I was still doing AI work. We were building an AI system called Authorizer's Assistant. Every time you did a transaction with an American Express card, it determined whether it was really you-- was it fraud or not-- and whether you could afford to make the purchase. "American Express has no preset credit limit" is what their advertisements say. It doesn't mean they don't have credit limits; they make a decision every time. So we built the system that did the credit limit on the fly and made a fraud determination.


So I said to the executive at American Express we were doing the work for, "So our goal here is to have no bad credit, right?" And he said, "No, that's a terrible idea. I want some bad transactions." He said, "I can get to zero bad transactions by having zero transactions." And it's the same here. You've got to get some of them wrong, and that's okay. But the question is: how quickly do you find them? How quickly do you learn from them? And then how quickly can you iterate on your roadmap? So that's the second tool.


Here's the third tool: how do you communicate timing? First, we communicated what we're going to build and why; second, we evaluated how our bets turned out and what we're going to do about it. Sorry-- I forgot to say: after you do the evaluation of how good the bets were, you have to go back to the CEO and tell the CEO, "We did really well on this one. Here's what we learned. Here's what we're going to do. Here's what we're going to try to repeat for the things that we did well. We didn't do so well on this one. Here's what we learned. This is just a retro-- here's what we're going to do differently. Let's iterate the roadmap." So that was the second tool.


The third is when-- how do you communicate when? And I will tell you that this was a hard lesson for me. There are four ways of communicating timing. I did every one of them. My suggestion is: please don't do the same. The first way-- and the first way that I did timing earlier in my career-- I would ask the CEO, "Tell me what you want and I'll tell you when you're going to get it. And by the way, please don't bother the engineers until then; that's distracting for them and it's just going to take longer." Well, that was a fail, because it assumes A) you're going to get the timing right-- you might be close, but you won't get it right every time-- and B) it's a faith-based exercise, the second faith-based exercise in this presentation: they just have to trust you.


So the second way to do timing-- this one also doesn't work, but it's the one people fall back to-- is, "Tell me what you want, I'll tell you when you'll get it, and so you don't have to just trust me, I'll publish a schedule and put it on our intranet or somewhere online, and you can look at it anytime you want and know where we are." That doesn't work either, because nobody actually ever looks at that. So you're really back to version one.


The third way: "Tell me what you want. I'll tell you when you're going to get it. I'll do demos for you, so you'll see progress all the way." This one's actually not bad. You might survive this one. The problem with this one is that, once again, you're a cost center and not a profit center. You're the CIO, not the CTO. Your team isn't empowered. You might survive it, but you're not helping drive the company, and you can become useless or be perceived as useless.


The only way that I've found that works is, "Let's talk about what we're trying to get to as a company for our market. I'll tell you how we're going to go solve that. I'll approximate the time, and then I'll show the progress along the way." We will do demos-- if you're running Scrum, after every sprint; if you're doing Kanban or Scrumban, after every progress point. This is the only way that I find consistently works. It demands that they respect you and your position-- and I don't do it for that reason-- but it engages you in the conversation. So you know what the business metrics are. So you can write the epic statements. So you can then talk to your team about why you're doing these things. So you can go back with whoever's running Product, if it's not you, and think about how you redirect to create more value.


One of the conversations that you've certainly had with your boards, if you're in the board meetings, is about budgets. Budgets for them are a capital expense on which they're expecting a return. Only by doing these things can you talk about whether they're getting a return or not. If you're talking about velocity, it doesn't mean anything to them. These are the conversations you want to have: what value you were expecting, whether you achieved it or not, and when it's going to happen.


Now, at the beginning of the talk I denigrated velocity and code coverage, I hinted at spikes, and I talked broadly about how almost everything you do should be creating value. All that stuff I said is true, but those internal metrics are still important. They're just important internally. It's not the conversation to have with your CEO. With your engineering teams, you have to have the conversation about value creation and the direction you're going, but you also have to have the conversation about how fast you're going.


So don't give up on the typical agile metrics. Just know where to have the conversation. Nobody else cares besides you and your team, and maybe Product Management. Your team needs to be well-trained in some agile methodology-- although, not to be a total agile bigot, there are some situations where waterfall makes sense. I'll give you an example. I was doing interim CTO/Chief Product Officer work at a company that was making augmented reality headsets. We were building the hardware, the operating system, and the API and SDK. Turns out hardware really wants to be waterfall. Really, really wants to be waterfall. Some places, waterfall is right. But understand the metrics that you want to talk about internally and how those are going to work, and feel free to keep using those metrics.


Ennis:

Oh, David, I have a question.


David Subar:

Sure.


Ennis:

Hey, this is Ennis. You just mentioned Product Management might care about those metrics-- those internal metrics, right? The more technical, agile-based ones. So if Product Management doesn't care about those metrics, do you find that to be a problem?


David Subar:

Generally.


Ennis:

Okay.


David Subar:

Yeah, generally. And more specifically, here's why they should care and why we try to convince them to care. It goes back to something I said earlier on. I believe Product Management and Engineering are one team. They might be led by two different people, but they have the same goal, right? And if you get to a point where Product Management, looking at the CEO, points to Engineering and says, "We would get this done if Engineering weren't so slow," or Engineering says, "We're doing it as fast as we can, but Product Management writes bad specs," you've got a problem, right? So Product Management should care.


And there are decisions that get made, right? There are three ways of doing this, five ways of doing that. It matters. And the tech debt matters, right? By the way, my definition of tech debt is: tech debt is that which decreases velocity. And so Product Management should care. And if they don't care, you should find out why. You should ask why. That relationship is critical. Before I go on, are there other questions?


Nandu:

Yeah. I'll repeat a question from chat. Jim is asking: by your definition of a CTO versus a CIO, does a CTO even make sense in a non-Tech company?


David Subar:

A lot of times, no. Generally, no. I'm sure there are circumstances where it does, but if technology is not your product-- if technology is a utility, like electricity-- a CTO probably doesn't make sense. Now, there's no hard and fast Webster's Dictionary definition of CTO. My definition is the one who builds the product for a Technology company. So my answer to Jim's question-- does a CTO make sense for a non-Tech company?-- will generally be no. Now, I'm going to contradict myself. Is Amazon a Tech company? We would all, I assume, argue that it is. When it started out, it was an online retailer. So part of it is how the company thinks about itself. Bezos always thought of them as a Tech company. Once again, Trader Joe's-- I talked about my friend Brian at Trader Joe's-- they're also a retailer, but they are not, and have no desire to be, a Tech company. There's no reason for them to ever have a CTO. Is that helpful?


Ennis:

It's helpful to me.


David Subar:

Okay.


Jim:

It clarifies your position. That's what I was looking for. Thank you guys.


David Subar:

Perfect. Perfect. Other questions before I move on?


Eric:

This is Eric. I've got a question for you. You haven't talked a lot about cost or efficiency, which tends to be top of mind for me and my bosses over the years. That's both cost to implement, in terms of time and people, and cost to operate. I've never worked at a company that's had a CIO, so I can't make that someone else's job. So where do you see... There are lots of ways you can get to an outcome. If you have more money and more people and you can spend more on cloud resources, you can get there different ways. Where do you see cost and efficiency from a metrics perspective in terms of these conversations?


David Subar:

So, Eric, you're right. I probably didn't talk about that, and I think that's actually very important. There's a company I'm working with now in LA, and they think about cost and efficiency a lot, particularly around cloud resources and the operational cost of running their stuff in the cloud. They're thinking particularly about EC2 versus Lambda and the operational costs of each-- when you pick one versus the other, how you address cold starts, and a bunch of other things. That is a very real thing, and that is a place where, as CTO, you can actually get some wins.


And if you can, you should-- a lot of times those are wins that are unexpected. And so, and I hate saying it this way, from a political point of view you gain points that way. You should do it because it's good for the company, but it doesn't hurt that you get pats on the back for it. That is stuff you should definitely look at and address.


There are other kinds of costs as well-- ones that are harder to measure-- that, Eric, you were hinting at or saying directly. What's harder to know is: what are the investments I can make that will decrease the cost of building a particular feature? Those are often hard to determine. What's easier to determine is my general cost of release. What I mean by that is, from a DevOps/SRE standpoint, if it takes you a day to release something, that's not so good, right? If you can cut that in half, to half a business day or four business hours, that's better. If you get it to an hour, better still. If you can release on demand, even better.


But the way I've found to be the most impactful way to describe that, particularly if you're still a private company, is: we have twelve months of burn. That means we need to start raising in six, because we need to close around nine so we still have negotiating leverage. If we want to raise in six, that means-- and this is all stuff you guys know-- if we do a release once a month, we get six at-bats for showing progress towards the fundraise. If we can release every week, we have 24. If we can release every day, we have 120. You can do the math after that yourself, because my arithmetic in my head gets worse.
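
[A back-of-the-envelope sketch of the arithmetic above, for illustration only; it assumes roughly four release weeks and twenty business days per month.]

    # "At-bats" before the raise: how many releases (chances to show progress)
    # fit into the runway before fundraising starts.
    MONTHS_UNTIL_RAISE = 6  # twelve months of burn, start raising at six

    releases_per_month = {
        "monthly": 1,
        "weekly": 4,    # ~4 release weeks per month
        "daily": 20,    # ~20 business days per month
    }

    for cadence, per_month in releases_per_month.items():
        print(f"{cadence:>8}: {per_month * MONTHS_UNTIL_RAISE} at-bats")
    # monthly: 6, weekly: 24, daily: 120 -- the numbers from the talk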


And so, as CTO, I'm going to invest to automate releases, because our cost of release goes down. I'm going to automate test coverage, because our cost of release comes down; and I'm increasing, as a business, our ability to show significant changes for the next round, and our valuation is going to go up. And that's the way I talk about that stuff, which is subjective, but you know it helps.


Eric:

Excellent. Thank you.


David Subar:

No problem.

Nandu:

Hey. We got a question from Max, who asks: what actionable metrics do you suggest measuring around value to end consumers?


David Subar:

So Max, this is a B2C business?


Nandu:

Yeah, so let me help Max. Max is head of Product with me. I'm the CTO; he's head of Product. So, a B2C business.


David Subar:

I've got it. Okay, thanks. There's a variety of different metrics you can use-- things like penetration, decreased churn. There's a variety of them. There's a model, and I'm glad to send it to you both. If there are things I mention here that you guys want, my email address is on the next slide or the one after that. I'm glad to send--


But Max, there's a particular model about, and I'm going to screw this up. Let me just look it up so I don't screw up the model. I think this is it. Sorry. I'm referencing one of my notes. Oh, here it is.


There's a model: acquisition, activation, retention, and monetization. This is not the only good model, but it tends to be one of the good models for measuring how far you get a user down their curve. When did they come to your site? When do they do something with your product? That's activation. How long do they stay? That's retention. And are there funds that come from them as a side effect? That's monetization. So you can use it. And there are other models-- Max, email me later and we'll set up a time, and we'll talk. But you can get metrics down that funnel and prove whether or not you are generating more usage, and there are other ones too. So Max, hit me up later and we'll talk more. Other questions before I move on?
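
[Another illustrative sketch, not from the talk: one way to turn the acquisition/activation/retention/monetization model into concrete numbers is to compute step-to-step conversion down the funnel. The stage counts below are hypothetical.]

    # Hypothetical weekly funnel counts for a B2C product.
    funnel = [
        ("acquisition",  12000),  # new visitors who hit the site or app
        ("activation",    3600),  # did something meaningful with the product
        ("retention",     1800),  # came back within the window you care about
        ("monetization",   450),  # generated revenue as a side effect
    ]

    previous = None
    for stage, count in funnel:
        if previous is None:
            print(f"{stage:>13}: {count}")
        else:
            print(f"{stage:>13}: {count} ({count / previous:.0%} of previous stage)")
        previous = count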


Nandu:

I think we're all caught up.


David Subar:

Great. And by the way, I answered Max's question, but I started by asking whether it's B2C or B2B. The metrics for B2B can be quite different. B2C is actually the easier case, because in B2C the usage of the product tells you what's working and what's not. In B2B it's more obscured, because sometimes in B2B you have to put in a feature that won't necessarily increase direct usage, but it might increase lead generation-- just knowing that you have it and being able to advertise it. And so the B2B cases are a little more difficult, and I'm glad to talk about those too; they tend to be more industry-specific. I'll give you an example. I'm working right now with a company that provides a SaaS service for pharmaceuticals to help make drug testing faster. I'm working with big pharmaceutical companies like Eli Lilly and Amgen, and with clinicians who are actually doing the testing.


There's a larger latency between putting a feature in-- so that they know where a patient is in a clinical trial-- and overall penetration into the pharmaceutical market, right? So you have to have a more abstract measure there, but there are ways you can do that. I'll move on to the last couple of slides. So I gave you those three tools. They're really, really good. I recommend them, but they only work if you're in an environment where you can be successful.


And I hold that there are four things you need-- maybe more, but at least four. You're more likely to be successful in a company that is growing, right? My father used to say, "You want good luck to happen to you." And in a company that's growing, good luck can happen to you. That's A. B is, you have to be empowered to actually lead the team. C is, you have to have good people who can execute well. And the fourth is, you need to have a seat at the table. You need to be able to engage in that conversation I talked about: timing, what do we need to do.


If you don't have those four ingredients, the conversation you're having with the CEO-- the metrics I'm suggesting-- won't work, won't matter. Now, you can fix these things, or you can be part of fixing these things. But if you don't fix them, you should think about how you improve the company, or whether maybe you're not in a place where you can be successful. The environment has got to be right for the metrics, and for the conversation between you and the CEO, to work.


So that's what I'm here to tell you about today. If you have questions, let's talk about them now, obviously. But also, if you have other questions later-- I talk to people all the time-- feel free to email me. Alexander is on this call; he'll set up a time for us. I'm glad just to be helpful. I'm also on the board of the LA CTO Forum, so I do a bunch of this stuff through the LA CTO Forum as well. And on the Interna website, very occasionally we post articles that we find interesting for CTOs and Chief Product Officers, about how to think about building Product Management and Engineering teams. If you're interested, take a look-- they're very occasional. You will not be getting something from us every week. Other questions?


----


I got a question here privately asking about metrics geared towards customer growth versus revenue growth. That's an interesting question. You would like to think that those are necessarily tied to each other-- that customer growth drives revenue growth. That's not always true. As a matter of fact, you might make a decision to drive customer growth and not revenue growth. You might want to increase penetration into the market because you know there's some network effect, some natural monopoly, some data monopoly you have, and you don't care about the revenue impact of it. The writer of this doesn't say whether they're B2C or B2B, so I'll do both cases.


In the B2C case, certainly, I would go back to looking at acquisition, activation, and retention. Where do you attract users, and where does the funnel come from? Are you doing SEM or SEO? Look at, when they hit your app or your site, what are they actually doing? How do you get them using it? And then, where do they drop off? In the first two cases this is kind of classical-- A/B testing helps for looking at those things, drip campaigns, that kind of thing. For retention, you very much have to look at user patterns and what they do. And actually, I'm finding there's a bunch of really interesting ML stuff you can do there to draw out patterns and look at what leads to what.


Now, this is very different from revenue growth. They can be tied, but revenue growth can also be tied to upsell and to other products that you create, right? Looking at what you're putting on the roadmap. Without outing the writer of this: part of it also depends on the business case, so ping me offline and I'm glad to spend time with you on it. Sorry, I'm reading the chat. Oh, I see. Okay.


So what I'm understanding is it's B2B2C-- working through a channel, helping the companies in that channel sell to end consumers. Actually, to the writer of this: let's talk offline, because I think this is more complicated than I can easily answer based on this chat.


There was another one that came through: "What metrics do you measure for individuals?" I think there's a typo in this-- what metrics do you measure for individual ICs. Ashish says I can out him. Okay, thank you. Ashish asks, "What metrics do you measure for individual employees on Product and Engineering teams? Is it the ones you mentioned earlier: velocity, cycle time, PR time, commits per day, code coverage?"


Those are helpful, and I wish I did... There is a... I'm embarrassed, I can't remember the name of this product. There are definitely products-- and, Ashish, if you hit me up, I'll send you the name of one-- for looking at individual engineering efficiency, the efficiency of individual engineers. I think those are necessary but not sufficient. It's also just engagement with the problem. Some of it is objective, like the measures I just mentioned, but there's also subjective stuff that you want to look at. And to me, it goes back to the engagement with Product, the engagement with who the customer is, things of that nature. But I'll look up the product, and Ashish, if you hit me up on email, I'll send you one of them.


Nandu:

Thank you so much, David. This has been wonderful and, as I expected, it particularly resonated with me. I'll say the quiet part out loud: I'm in the middle of trying to explain to my CEO why a massive database refactoring-- a "changing the engine on a moving car"-type project-- is actually good for our user experience and scalability. And I'm going to revise the last thing I sent him using your exact phrasing, your epic statement phrasing.


David Subar:

Oh, great. Great. Look, I'm glad it helps, and hit me up if you want me to look at something-- I'm glad to look at something.


Nandu:

I may take you up on that.


David Subar:

Great.


Nandu:

Yeah. Thank you. I really wish-- maybe a feature request for Zoom at this point-- that there were an applause button. One of the great things about a virtual setting, as opposed to in-person, is that I've been having side conversations with folks-- a fairly vibrant discussion out-of-band-- and I think this has been enjoyable for everybody.


David Subar:

Great. Well, thank you everyone. Good to meet you all virtually and, if you guys get to LA and we can actually see people face-to-face, hit me up; and if I get to Atlanta, I'll let you guys know.


[END OF AUDIO]


