I'm David Subar,
Managing Partner of Interna.

 

We enable technology companies to ship better products faster, to achieve product-market fit more quickly, and to deploy capital more efficiently.

 

You might recognize some of our clients. They range in size from small, six-member startups to the Walt Disney Company. We've helped companies such as Pluto on their way to a $340MM sale to Viacom, and Lynda.com on their path to a $1.5B sale to LinkedIn.

A Fireside Chat with Ryan Barker, CTO of Zefr




In this fireside chat, I talk with Ryan Barker, the CTO of Zefr. Ryan faces interesting challenges. His company is on the leading edge of GenAI: they are simultaneously learning how to use it, how to compete with it, and how to do it all effectively and cheaply. If any company has a data monopoly, it is Zefr, but how to use it is not entirely clear.


Ryan discusses:

  • How to make C-suites understand this (or any) new technology and how to avoid over-hype

  • How to manage the development of GenAI and why it is different than other technologies

  • How GenAI makes releases faster, and why that is a blessing and a curse

  • Tooling purchasing decisions - what to build versus what to buy and why that is different in the GenAI world

  • And of course one of my favorites: Conway’s Law and how to deal with that now



Transcript:


[00:00:00] David Subar: Welcome to another Interna Fireside Chat. Today I talked with Ryan Barker. Ryan's the CTO at Zefr, a company that helps advertisers understand what videos exist on different social media platforms like YouTube or TikTok, so they can pair up what they want to advertise with the content they want to be next to. Ryan will talk more about that, but there are a few interesting things he talks about that I think you'll enjoy. We talk a lot about how GenAI changes the world of an engineering organization and the world of a CTO, both how to manage people in the org and how to talk with C-level execs. We talk about velocity, how GenAI requires increased velocity, how engineering organizations need to move faster, and we talk about some of the tooling that you might need to buy or develop in order to do that. It turns out that in the GenAI world, data becomes a monopoly: leaders in data become a natural monopoly, and we talk about what that effect is on engineering organizations. So stay tuned. I think you'll enjoy it.


[00:01:16] David Subar: Hello, Ryan, and thanks for being here.


[00:01:19] Ryan Barker: It's great to be here, David, always a pleasure to talk with you.


[00:01:24] David Subar: Yeah. So as I mentioned before, I've known Ryan for a long time through the CTO world, through the LA CTO Forum that we're both involved in. Ryan, before we jump in, talk a little about your role, talk about Zefr, talk about what you guys do. That's going to set context for everything.


[00:01:43] Ryan Barker: Great. I've been working at Zefr for seven years. I started as a software architect, helped really transform the system, and I've been CTO for two years. What we do is we analyze content on TikTok, Meta, and YouTube. This is the hardest content in the world to analyze: user-generated content. There are hundreds of millions of pieces of content to analyze, and we have hundreds of different labels we try to apply that are relevant to advertisers.


[00:02:16] Ryan Barker: So the main reason people use us is to make sure that when they run an ad, it is next to content that is acceptable for their brand, or suitable for the brand, as we say. So if you're an alcohol manufacturer, you're like, there is no way I can be around kids' content. I'm going to get in trouble with the law. I could lose my liquor license. It's a major deal, okay? While if you're a toy manufacturer, you're like, great, I want to be around kids' content. They're the people who buy my toys. I want them to be, like, going to their parents.


[00:02:51] Ryan Barker: It's Christmastime here, as you can see. Kids want to harass you: hey, Daddy, I need XYZ toy that I just saw on YouTube. Please buy it for me.


[00:03:00] Ryan Barker: Right.


[00:03:02] Ryan Barker: Which has literally happened to me this morning. So it's an interesting world.


[00:03:08] David Subar: So you have a giant many-to-many matching problem.


[00:03:10] Ryan Barker: It is a many-to-many matching problem. But prior to this, I worked at eHarmony, which is also a many-to-many matching problem, although in that case, the toy has to like the child as well, which is a little different. Maybe if GenAI goes a little bit farther, that may have to become true at some point. But, yeah, it's a very difficult problem. There are hundreds of millions of things to analyze, and we need to do it very quickly and accurately within very tight cost controls. And GenAI is incredibly powerful, and it's a very innovative technology, and how to integrate it into our system has been a topic for the entire year.


[00:03:55] David Subar: Okay, so you have a bunch of algorithms previously. They work to some degree of usefulness. I'm going to say, of everyone in the industry, yours are the best. I don't know if that's true, but I'm going to go with yes.


[00:04:12] Ryan Barker: We definitely say we're the best, and I am confident that everything in development is definitely better than what anyone else has, including us.


[00:04:21] David Subar: Right. So now there's this new set of algorithms that comes along that changes even how you develop code, right? Fundamentally, how does your world as a CTO change when the algorithmic set changes? When GenAI comes in?


[00:04:44] Ryan Barker: Yeah. So it's really a question about change management.


[00:04:48] Ryan Barker: Right.


[00:04:49] Ryan Barker: Change happens at the small scale. You personally have some issue to deal with. It could be your company has some issue, someone quits. It could be your industry; some new thing comes in. In this case, it is a global, massive new technology innovation, similar to the industrial revolution, or the automobile, or computers in general, or the phone. And it's just going faster and faster and faster.


[00:05:17] Ryan Barker: GenAI is going incredibly fast. And the answers I have today are not the same as they were literally last night, because there are new articles and the world's just going incredibly crazy and incredibly fast. But as a CTO, it's your job to keep up with this stuff and really figure out how to utilize this new technology and how to convince people to change. No one likes to change. I have a routine: I wake up at a certain time, I go to work at a certain time. And then COVID happens. I have to change.


[00:05:51] Ryan Barker: Right?


[00:05:52] Ryan Barker: This is GenAI. It's the exact same thing, but people don't feel they need to change in the same way. They don't necessarily feel the same sense of urgency.


[00:06:01] Ryan Barker: Right.


[00:06:02] David Subar: Why is that? It's interesting. Why do you think people don't feel a sense of urgency? To me, this has flipped my world over. This has flipped over the world of a lot of our clients.


[00:06:11] Ryan Barker: It has flipped your world over.


[00:06:13] Ryan Barker: Right.


[00:06:13] Ryan Barker: What about the people reporting to you? Well, a little bit less. The people reporting to them? A little bit less. You get down to the new grad from college and they're like, I'm just working on whatever. It doesn't matter to me if I use GenAI or not. And if the company doesn't do well, I'll go somewhere else. It's not like a huge thing; I'm used to doing this thing. And that's the engineering side. On the non-engineering side, people have their livelihoods and jobs at stake. Do you think your customer service people are like, oh, yeah, just take GenAI and put us entirely out of work? Like, I don't need a job?


[00:06:51] Ryan Barker: People are scared. And understandably: this is an incredibly powerful technology, and people will need to change the type of jobs they do, but they sometimes have to be convinced to do it. Right? Whether you have to be nice, or whether you have to give them examples like, hey, you're just going to increase the amount of work you're able to produce. We're not going to get rid of you. We're just going to have you do more using these new tools.


[00:07:20] Ryan Barker: Right.


[00:07:23] Ryan Barker: But, yeah, it's incredibly difficult. It's going so fast. And I can say, oh, yeah, we can do XYZ today, but we can't do this other thing. You know, there was a demo yesterday for Gemini, and it's really amazing. Of course, then we find out it was faked. That's a whole other thing. You don't necessarily want to give your salespeople and marketing people a prototype and have them have their way with it.


[00:07:59] David Subar: So you've got this looking-down problem: people in your organization that you have to get on the train, get excited about using some of this stuff. So I'm interested in hearing about how you get them on the train, but I'm also interested in how you talk with your C-level peers, how you talk with the CEO, how you talk with the board about the possibilities of what GenAI can do. And also, don't get too excited; it doesn't do everything.


[00:08:29] Ryan Barker: Yeah, well, it's especially difficult in our industry because they actually try and stay on top of all the new data science things. And they hear, like, oh, our competitor is using XYZ for everything. And I run the math. I'm like, they're not spending $10 million. I promise you they're not. So they're just lying, and it is what it is. So let's cut your question into a few parts.


[00:08:54] Ryan Barker: So, first, how to get people excited, how to get people to actually do it, right? We have a ton of engineers working for us who are super smart, super motivated. They follow the news for all the GenAI things. They're playing with it. They're posting cool images on Facebook or other places. So how do you get them to contribute and come up with their ideas, right? We talked about it a little, and we were just like, let's just do a hackathon. It wasn't scheduled till March, but screw it, it's September. We're just going to hackathon.


[00:09:31] Ryan Barker: No notice. And eight of the ten projects were about GenAI, and they're all super interesting. None of them will be exactly that way in the actual product, but they definitely influence what we're building this quarter and next quarter. And it really influences the roadmap. Get your smartest people, whom you're paying a huge amount of money, get them engaged and excited and contributing ideas, right? So I think that's really important. Now, the C-suite, it's interesting, right? I see the news articles and I pay attention to all this stuff. I saw it back when ChatGPT 4 was released. I'm like, hey, this is really interesting. I do a little playing.


[00:10:13] Ryan Barker: I'm like, hey, maybe we might be able to use something. I tell the C-suite, and it's pretty much crickets, right? They're not ready to hear it. Their friends haven't talked about it. They haven't seen anyone in the industry talk about it. In this case, it's very obvious to anyone in the tech industry that this will become a thing. So you have to figure out how to devote some of your time, get an intern, get some amount of development effort into trying these prototypes and finding something valuable to the company. So when they are ready to hear what you're trying to say, they're like, oh, hey, this GenAI thing, my friend said it was really useful.


[00:10:57] Ryan Barker: It's like, oh, well, we've been doing investigation. We're on top of it. Here's a bunch of different options we have: here's a quality here, here's a quality there. And you try and teach them about some of the limitations, right? Especially for our industry, it's very difficult because GenAI likes to make things up. I'll say it another way: it just literally makes things up. People call it hallucinating.


[00:11:31] Ryan Barker: My favorite example: there's a very popular kids' comic, I guess it's not kids anymore, I'm sure I'm aging myself, called Dragon Ball Z. It's been out 20 years, whatever. It's not kids anymore. There are these channels about it. It's a very interesting world: people are flying around and saving the world and throwing fireballs at each other. And you ask the GenAI, like, hey, is this animated? It's like, yes, 100% animated.


[00:12:00] Ryan Barker: You're like, is this cooking? It's like, no, 0% cooking. You're like, is this combat sports? It's like, 100% yes. You're like, wait, let's hold on. This is not wrestling or boxing. This is, like, throwing fireballs at each other. What are you talking about?


[00:12:17] Ryan Barker: Right?


[00:12:19] Ryan Barker: And so mistakes like these will happen. And how do you explain this to the C-suite, right? Because we've been using AI for years and it makes mistakes, but the mistakes this is making are of a totally different kind. They're not used to this kind of mistake. And it's very different from the way a human might make a mistake, right? A human might call two kids wrestling combat sports, and we're like, oh, no, we mean professional combat sports, and they correct themselves, right? No human would be like, oh, yeah, this comic book has superheroes fighting, so it's combat sports. So you really have to give them concrete examples.


[00:13:07] Ryan Barker: And if you're lucky, and I'm not necessarily that lucky, they're really good at statistics and math; you're at an actual full-blown tech company with a tech founder. You can talk about F1 scores and precision and recall and exact accuracy, and how it varies between concepts, and the difficulty of what you're trying to judge. But most of us have less technical founders, and you're trying to explain to them: hey, here are things it does really well, right? You want to summarize all this text? Feed in 100 pages and it summarizes it, right? That's great. And maybe I'll show you an example. I'll take your competitor's shareholder call, right? And I'm going to feed it into ChatGPT and ask it to summarize it. And now it's not 30 minutes, it's like three paragraphs.


[00:14:01] Ryan Barker: And you show it to them, and they're like, oh, that's really cool. I understand now that this system can summarize content really well for me, right? And then you're like, okay, well, it turns out there's this new thing called RAG, which I can't remember exactly what it stands for, because everyone just says RAG, RAG, RAG, RAG, RAG. And it basically means you're passing in your own data, or Wikipedia, or some other data. In our case, it's very important you pass in your own data. Okay? There's no value to passing in Wikipedia. That's just generic.


[00:14:35] Ryan Barker: Right.


[00:14:36] Ryan Barker: So you want to pass in your own data to get custom results out.


[00:14:40] Ryan Barker: Right.


[00:14:40] Ryan Barker: Whether you're passing in your own sales data or any other sort of, like, financial data, it now becomes yours.


[00:14:49] Ryan Barker: Right.


[00:14:50] Ryan Barker: This is your proprietary data.


[00:14:52] David Subar: You're learning on top of the foundation model.


[00:14:54] Ryan Barker: Yes. It's a custom model, which is a little different than distilling, which I'll talk about in a second. But basically, in our case, we pass in all the legacy, all the video-level scores that we have. We have all these labels from very cheap models that we've spent years developing. All our IP is in this, right? And we're feeding it into the system. We're asking it to make summaries off of that, and it will take into account all the old data, combine it with the text processing it's very good at, and come up with a really amazing textual summary of the data that can generate scores. All kinds of really cool new things that we could never have done before.


[00:15:41] Ryan Barker: Right.


[00:15:42] Ryan Barker: It allows us to directly compete with competitors. We never focused on XYZ, but now we can toss this technology at it and really catch up very quickly.
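[Editor's note: the RAG pattern Ryan describes, conventionally short for retrieval-augmented generation, can be sketched in a few lines: retrieve your own proprietary records and put them into the prompt, so the model answers from your data rather than generic knowledge. Everything below, the toy documents, the naive term-overlap retriever, and the `build_prompt` helper, is an illustrative sketch, not Zefr's actual system.]

```python
# Minimal RAG sketch: rank your own records against the query,
# then prepend the best matches to the prompt.

def retrieve(query, documents, k=2):
    """Rank internal documents by naive term overlap with the query."""
    q_terms = set(query.lower().split())
    scored = sorted(
        documents,
        key=lambda d: len(q_terms & set(d.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(query, documents):
    """Put the top-ranked internal records into the model prompt."""
    context = "\n".join(f"- {d}" for d in retrieve(query, documents))
    return (
        "Answer using only the context below.\n"
        f"Context:\n{context}\n"
        f"Question: {query}"
    )

# Hypothetical proprietary records standing in for Zefr-style labels.
internal_data = [
    "Channel X: 92% gaming content, low brand risk",
    "Channel Y: cooking tutorials, family friendly",
    "Channel Z: combat sports highlights",
]
prompt = build_prompt("Is Channel X safe for a toy brand?", internal_data)
```

A production retriever would use embeddings and a vector store rather than term overlap, but the shape of the pattern is the same: rank, select, prepend, then send the assembled prompt to the foundation model.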


[00:15:56] David Subar: The technical details, I would argue, are interesting if they're excited about the technology, but maybe not as interesting if they're not. But if they're not, and you believe that there's opportunity here, I would argue that maybe what you're meant to do is talk about possible solutions that you can now do that you couldn't do before. Do you think that's true? And do you think that's differentiated in a GenAI world from a non-GenAI world? Is this just more of the same about pitching C-level executives possible solutions and investments that you want to make, or is it fundamentally different?


[00:16:57] Ryan Barker: So I've been a CTO for two years. So I could talk two years ago and talk about it now.


[00:17:03] Ryan Barker: Right?


[00:17:03] Ryan Barker: Two years ago, it was much more about how we make incremental improvements. It was very clear. We had a path we were going to go down. We're going to make this improvement, this improvement, this improvement. We know what we're going to do for the next year. Okay, now it's entirely the same, except it's: what are we doing the next three months?


[00:17:26] Ryan Barker: Right.


[00:17:28] David Subar: So your window has shrunk. Your planning window has shrunk.


[00:17:32] Ryan Barker: It's not just planning. Your development window has shrunk. The amount of time you're willing to spend on quality, or whether you're willing to cut features. It is much more about time to market right now, because everyone is racing to get new products out and be the first to land a quality product. I literally had a discussion earlier this week about how we're going to bring this new improvement to market. And we said, okay, well, our current plan is we're going to do this video-level improvement and then the channel-level improvement, and it's going to make an amazing set of new products.


[00:18:10] Ryan Barker: Right.


[00:18:11] Ryan Barker: And the other option is we could do a smaller improvement on the channel side, and then the video, then the channels. It's like, well, let's just go as quickly as we can. Let's just get the GenAI channel stuff in place and allow us to start selling a new technology and increase the quality over time, which is standard agile development in some ways, except at a much faster pace.


[00:18:39] Ryan Barker: Right.


[00:18:40] Ryan Barker: The three-month turnaround is just not fast enough. They want it, like, next week. Well, they want it yesterday, but they want it very quick.


[00:18:50] David Subar: The question is, in the world after you've fielded the model the first time, you now have to keep them all up to date. How do you do that? What's the process? Is it fundamentally different from the software development process? I think it is; your mileage may vary. And if it is different, what does that look like in your engineering work?


[00:19:14] Ryan Barker: Right. We've worked for quite a few years now on trying to do this.


[00:19:18] Ryan Barker: Right.


[00:19:18] Ryan Barker: How do you apply software development methodologies to machine learning? And there's no term for this, to my knowledge. So, CI/CD, or CI/CM, continuous modeling, or something like that. We have constantly changing data, so really the data drifts, and there's technology drift, and there's new technology. And how do you continually update the models, right? And how do you keep track of things, and how do you prioritize what new technologies to apply where?


[00:19:53] Ryan Barker: Right.


[00:19:54] Ryan Barker: So really what we've done is we've applied standard engineering practices.


[00:20:02] Ryan Barker: Right.


[00:20:02] Ryan Barker: So we automate the data pipeline, we automate scoring new data sets, we automate the system that feeds in and prioritizes new data to be labeled.


[00:20:19] Ryan Barker: Right.


[00:20:21] Ryan Barker: The entire process for these AI systems is you have some idea, like, hey, what's a video game, right? And some new video game gets released. Diablo 4 comes out, or whatever. And does the machine know about this? Well, yeah, Diablo 4 sounds like Diablo 3. It gets it correct, right? Some other new game comes out, and it gets it wrong. And so you're like, okay, how do we teach it this new thing? And so we really want to distill down the model and make it smaller. So basically we have these very expensive, giant OpenAI models, or Google's models, whatever we're using, and we collect a bunch of training data about one particular question. And we're trying to turn this OpenAI model into, like, an idiot savant. It knows video games very well, and that's all it really knows, right? But it can do it very cheaply.


[00:21:18] Ryan Barker: So we basically have a completely automated system to continually find new sources of data, prioritize it, get it labeled, retrain the model. And then the critical point is to do the data validation, right? This is standard data science: you have a holdout set, maybe 20% of your data set that is just kept off to the side. You never look at it except when you're validating how good your model is. And at the end of the day you're like, okay, is my precision, recall, and F1 at least the same, or close enough? Then we're just going to deploy it to production. And you want to do this. You want to deploy the newer model. The default is to deploy, just like regular software development.


[00:22:10] Ryan Barker: And in this case you want to deploy because the world is constantly changing. And if you don't deploy, it's going to be worse at the new things, which are the most important things.
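[Editor's note: the holdout-set gate Ryan outlines can be sketched roughly like this: keep a slice of labeled data aside, score the retrained model on it, and default to deploying unless F1 regressed. The metric implementation, the tolerance, and the toy data below are illustrative assumptions, not Zefr's actual pipeline.]

```python
# Holdout-validation gate: deploy the retrained model only if its F1 on
# the never-touched holdout set is at least as good as production's,
# within a small tolerance.

def f1_score(y_true, y_pred):
    """F1 for a binary label, computed by hand from TP/FP/FN."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

def should_deploy(holdout_labels, new_preds, production_f1, tolerance=0.02):
    """Default to deploying, but block a model that regressed on the holdout."""
    return f1_score(holdout_labels, new_preds) >= production_f1 - tolerance

holdout = [1, 1, 0, 1, 0, 0, 1, 0]          # the ~20% kept off to the side
candidate = [1, 1, 0, 1, 0, 1, 1, 0]        # retrained model: one false positive
print(should_deploy(holdout, candidate, production_f1=0.85))  # prints True
```

The point of the gate is that "deploy" stays the default path; the holdout check only exists to stop the rare retrain that made things worse.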


[00:22:22] David Subar: So drift will happen.


[00:22:25] Ryan Barker: Drift happens in the data set. And you want to get the Tide Pod Challenge correct, but it doesn't matter nearly as much as getting the Gaza war, or whatever happened last week, right? So keeping your model up to date with the world is really a competitive advantage. These OpenAI models are six months old, a year old; they're trained very infrequently, because it costs $100 million to make one of these models. And we can train a new model for, I don't know, $100, something like that.
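[Editor's note: one way to picture the drift side of this always-training loop is a simple trigger: compare the label distribution of the newest window of content against a reference window, and kick off retraining when they diverge. The total-variation distance and the threshold below are illustrative choices, not Zefr's actual system.]

```python
# Drift trigger sketch: retrain when the label mix of recent content
# diverges from a reference window.
from collections import Counter

def label_distribution(labels):
    """Normalize label counts into a probability distribution."""
    counts = Counter(labels)
    total = len(labels)
    return {label: n / total for label, n in counts.items()}

def drift(reference, recent):
    """Total variation distance between two label distributions (0 to 1)."""
    p, q = label_distribution(reference), label_distribution(recent)
    return 0.5 * sum(abs(p.get(k, 0) - q.get(k, 0)) for k in set(p) | set(q))

# Hypothetical label mixes: news content spikes in the recent window.
reference = ["gaming"] * 50 + ["cooking"] * 30 + ["news"] * 20
recent = ["gaming"] * 30 + ["cooking"] * 20 + ["news"] * 50

RETRAIN_THRESHOLD = 0.15  # illustrative cutoff
print(drift(reference, recent) > RETRAIN_THRESHOLD)  # prints True: retrain
```

In a real pipeline the trigger would feed the labeling-prioritization system Ryan mentions, so the freshest, most-drifted content gets labeled and retrained on first.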


[00:23:05] David Subar: You're always training, you're always deploying, because topicality happens, which means drift happens. And so you have a tool set to help you run through this, to make it so that you can do continuous, CICMCD, I'll call it that, right? Part of that tool set, I assume, you developed; part of that tool set you've purchased. The tools that are available in the market are changing at least as frequently as some of the models are. How do you as a CTO say, okay, here's a bunch of new tools that came out; I'm going to invest in these tools, I'm going to skip these tools because they're not valuable enough, or I think they're going to be obsolete in three days? So you've got tooling purchasing decisions in real time, all the time.


[00:24:00] Ryan Barker: This is my life, David. You're just constantly evaluating things and looking at various risks. You're like, okay, do I want to take some off-the-shelf total system, right? Microsoft has one, AWS has one, Google has one. Get vendor lock-in, get charged ten times as much, but my time to market is going to be much faster. Or do I want to take some open source things and wire them together myself, or use some small new startup, right? Say those are the options: small new startup, big company, build it yourself. You're like, oh, the big company must be fine, right? Google abandons products. OpenAI fired their CEO, then hired their CEO back, and almost lost their entire dev team. It's a crazy world, right? You're like, there's no way that could happen. Oh wait, it actually did, right? And then the alternative is you build it yourself.


[00:24:56] Ryan Barker: And that has a huge cost, right? We have chosen to build it ourselves. We consider our data science models, and the fact that we can do things so cheaply, a business differentiator, right? We can choose to undercut; we can choose to spend more money on higher quality; whatever we need from a business perspective at the time, for whatever product we're doing. We have better data science for the dollar than you can buy off the shelf. Much better, right? We'll say ten times better than off the shelf on the cost/quality trade-off, right? But really, I think it's up to each CTO to do the investigation. And if you want to take the risk with a small company, maybe it'll work, maybe they'll get bought out. But you saved a bunch of money in the meantime, and then you have to migrate a year from now, if that's all you can afford, right? If you have an existing data science team, it might be worth it, as it was in our case, to try and convert them and just use this as another new tool. It's just another new tool.


[00:26:07] Ryan Barker: Just keep up to date with the latest and greatest and use it to keep our competitive advantage and allow us to continue to use that as a product differentiator.


[00:26:19] David Subar: Sorry, but one could make the argument that, like a lot of other things, if it's strategic, you want to develop it yourself. If it's not, you want to outsource it; you want to license it.


[00:26:35] David Subar: For you, data and the data manipulation are by definition strategic. But is that the reason you're doing it, developing your own tools? Or do the tools that you need not exist on the market yet? Or is it some combination, or a third thing?


[00:26:53] Ryan Barker: You can't buy what we do. This is part of the problem. You literally can't, at the cost that we're able to do it. There's just no one on the market who can do it at that cost. A little bit, yes. Now, the thing is, if you were starting a new company today, right, would I go hire up a brand new data science team? Well, you haven't proven product-market fit; you could take off-the-shelf things. And as I mentioned, the important thing for these new data science models is that you pass in your own data. That is what is worth the money.


[00:27:31] Ryan Barker: The data is worth the money, right. It's similar in many ways to databases.


[00:27:41] Ryan Barker: Right.


[00:27:42] Ryan Barker: Before good databases existed, maybe the fact that you could query your data quickly was worth something. But now there's Elasticsearch, and you have your Postgres, and you can basically take off-the-shelf technology and just use it. It's very cost effective. And the gap between what you can do with off the shelf and what you can build yourself is constantly shrinking. But again, it depends on your focus. In our case, our data science models are the differentiator. We invest in them, and that is why we have the highest quality for the cost of any vendor I've looked at.


[00:28:26] David Subar: Got it. This is interesting, because it reminds me of something I read and posted on LinkedIn about the Federal Trade Commission and the monopolies they think might happen in the AI world, and one of them was a data monopoly. And what that argues is that if the benefit of the model is demonstrable to the user, then they get more customers, they get more data. That becomes a natural monopoly over time in a particular vertical.


[00:28:57] Ryan Barker: No, it's definitely true. In our industry, basically, we're the smallest of the three companies that can do what we do, and the two larger companies have a ton of data. They haven't had a data science team, they haven't had an engineering team for a long time, and they're building one up, and they have data. And if they know how to use it, and they've collected it long enough, it's allowing them to catch up much faster than they would be able to otherwise.


[00:29:29] Ryan Barker: Right.


[00:29:29] Ryan Barker: These new GenAI systems let you catch up.


[00:29:33] Ryan Barker: Right.


[00:29:34] Ryan Barker: This is one of the things. But we're using it to catch up, too.


[00:29:37] Ryan Barker: Right.


[00:29:38] Ryan Barker: Is this a double-edged sword?


[00:29:40] Ryan Barker: Right.


[00:29:41] Ryan Barker: They've invested all this time and energy in doing this and that, things that we haven't focused on, and we're like, oh, now we can do that too, in a couple of weeks, right? With GenAI. And it doesn't require us doing a bunch of investigation or anything like that. We can just copy their feature set with off-the-shelf software.


[00:29:59] David Subar: Got it. So that goes back to your previous point about velocity. Velocity is faster; windows are shorter.


[00:30:07] Ryan Barker: Right now, velocity is king. You want to build the best product you can as fast as you can, and your engineering team may kill you if things are breaking all the time, but there's not much of another option.


[00:30:28] Ryan Barker: Right.


[00:30:29] Ryan Barker: It's, like, the fast, cheap, good triangle. Money is tight, so you have to go cheap. And technology is moving so fast, you have to go fast. So something else has to give.


[00:30:41] Ryan Barker: Right?


[00:30:41] Ryan Barker: But we do what we can, right? In addition to the modeling, you want to make sure you're focusing resources on automated testing and automated monitoring and alerting, so that when things break, and they will, people can fix them quickly and it doesn't impact all the things you're trying to do. You know, the standard software development methodology still holds. It's just about moving a lot faster.


[00:31:09] David Subar: Got it. Okay. So Conway tells us people ship their org chart, right? Our friends that wrote Team Topologies talk about the reverse Conway maneuver: building your org design so that it ships the kind of architecture and product you want. Do you have a fundamentally different design in the GenAI world than you did before? Or is it the same? Or is it a similar design, but the numbers in one group are different than they were before, the numbers in a different group are different, with different tools?


[00:31:50] Ryan Barker: It's definitely changed, particularly recently. So, as I mentioned, a lot of this is, like, how to move faster, right? How do you take a team and make them move faster when, in our case, we have giant enterprise clients who expect quality, right? So this is a dichotomy. How do we do this, right? And so what we've done recently, and it seems to be working well, is we've actually split off, I don't know exactly what to call it, a prototyping team. It's not, like, permanent hackathon, although you could kind of imagine it like that. Their goal is to crank out a ton of prototypes, have sales pitch them to clients, have product people look at them, have the C-suite look at them, and just constantly iterate, constantly making new little demos and seeing what lands, right? And then the things that do land, the product people actually go and build out in a standard, like, agile process. And prototyping is in the agile process, but this is a squad, in our case, of the developers who really like to work in this particular environment.


[00:33:23] Ryan Barker: There are some who don't. Do not put them on this team. It's very important.


[00:33:31] Ryan Barker: Some of these engineers, when put into more rigid environments, fail, right? This is why a lot of people, when they go to Google, fail. They work at these startups, they like the fast pace, they're building all this stuff, and then they go to Google, where things take six months, and they just fall apart. So some of your fastest movers, probably the ones who are least likely to get the last 10% of the ticket done, they're the ones you want over here, right? Which is actually nice, because the people who like to complete everything no longer have to manage the people who are trying to go very quickly. So now, in addition to our standard squad setup, where we have different squads working on what we're doing for the quarter, we also have a prototype squad. And then I have an intern who works for me and does whatever crazy thing: I read some article, I have an idea, and I say, here's an idea, go try and make something with it.


[00:34:46] Ryan Barker: The teams sort of feed into each other, and sometimes a prototype will come out and we'll interrupt the quarter and just add it in. But often you're just doing prototypes for what you're going to be focusing on next quarter.


[00:35:04] David Subar: So, putting words in your mouth: you have a scruffy team of people who like to start but don't like to finish. You've got someone who just does arbitrary things that come out of your head, like Athena from Zeus's head. And then you've got the traditional we-ship-stuff team, but the weight of the scruffy team is bigger, and the interruption risk is higher for the we-ship-stuff team.


[00:35:37] Ryan Barker: Yes, that's basically true.


[00:35:40] Ryan Barker: Right.


[00:35:41] Ryan Barker: And the percentage of product ideas we ship in a quarter that match what was planned at quarter kickoff is going down slightly. Not a lot, but 5-10% lower. Instead of shipping 65-70% of what they thought they were going to do, it's down to 55-60%.


[00:36:09] Ryan Barker: Right.


[00:36:09] Ryan Barker: And that's okay.


[00:36:10] Ryan Barker: Right.


[00:36:12] Ryan Barker: It's very important to be able to react quickly in this environment, whether it's your own idea or your competitor came up with something and you think, oh shit, I need to copy that. What do I do? Oh, they must be using GenAI. How can I use GenAI? Okay, I can copy that in four weeks.


[00:36:28] Ryan Barker: Right.


[00:36:28] Ryan Barker: Where before, without GenAI, you'd think, oh no, they spent nine months on that. Now I'm screwed.


[00:36:33] Ryan Barker: Right?


[00:36:33] Ryan Barker: So it's really interesting.


[00:36:37] David Subar: So your say-do ratio is going down. That must have an effect on the people who just like to ship. That must have a morale effect on them, because everyone is shifting left.


[00:36:48] Ryan Barker: Yeah, it's tough. Luckily, at Zefr we've always had teams that like to move pretty quickly. We've always run two-week agile sprints, and they're kind of used to my whims, shall we say. A change of a few percent here and there is very obvious on a graph; it's not so obvious in practice, and it happens over time, so people just get used to it. And I feel like our work-from-home setup, we're hybrid and some people are fully remote, really helps the people who like to finish things. They're able to focus, and they feel better that they're able to focus and get something done.


[00:37:42] Ryan Barker: So even though they might not know as well what they're doing next month, they're at least able to focus on what they're trying to build right now. And with a sprint process, we don't change mid-sprint.


[00:37:56] Ryan Barker: Right.


[00:37:56] Ryan Barker: So it's not as disruptive as it might sound. It's very disruptive for the product managers, though. They may not like that, but that's not the developers' problem, at least.


[00:38:11] David Subar: So I'm interested in the product development and product management side of this. Let's hold that for our next conversation. Thank you very much for educating us, for stepping us through all this. We have a whole product management conversation we can have, and we'll do that next time.


[00:38:31] Ryan Barker: Thank you so much for having me, David. I really appreciate it. I hope people found parts of this interesting or at least entertaining, and look forward to talking to you again.


[00:38:42] David Subar: I look forward to talking to you. Thank you.




