Podcast | How to Thrive, Belong, and Lead in the Tech World

In this podcast, Dr. Cat Hicks shares cutting-edge research on the most innovative software teams.

Apr 28, 2023 • 3 Minute Read

  • Business & Leadership

Do we really understand engineering productivity, and what does it mean for our software teams to “thrive”? In this podcast, Dr. Cat Hicks shares cutting-edge research on the most innovative software teams. She presents a framework based on her team’s research on over 1,200 developers called Developer Thriving, which engineering leaders can use to create high-performing development teams that are happier and more collaborative.

Transcript

Speaker 1:

Today we're talking to Cat Hicks from Pluralsight Flow about her research insights that are helping developers to thrive. You're listening to Joel Beasley, Modern CTO.

Joel Beasley:

I am genuinely curious about what you do. I have been following Pluralsight for probably about six years now, and one of my past guests, I can't remember his name, but I remember he lived in Colorado and he had moose on his property. That's what I remember about him.

Cat Hicks:

If it's Colorado, it could be Flow where I work. Yeah. What was formerly GitPrime.

Joel Beasley:

Yep. GitPrime, that's it.

Cat Hicks:

That's right. That's where I do research. Absolutely.

Joel Beasley:

How long have you been with Flow now?

Cat Hicks:

About a year. Yeah. So I direct this research lab there and we're a pretty new team, a really exciting investment from Flow, and so we've been cooking on that for about a year now, which I can't believe actually. Time flies.

Joel Beasley:

Yes, small world.

Cat Hicks:

Yeah.

Joel Beasley:

So how did you get involved with that?

Cat Hicks:

Oh gosh. I have a background in social science, which you may have seen, and I have always been really looking for this kind of team. I have to say, I had the chance to build this lab. Greg, a GM at Flow, reached out to me to talk about it, and the vision for it was just so much what I wanted to do with my life, what I wanted to do every day, which was collect original data, turn it into evidence for software teams, and share it with the world. So that's what we're doing, bringing some diverse perspectives together in the lab, which is also a huge passion of mine. Getting data scientists to talk with social scientists and getting design researchers and getting people with all kinds of different backgrounds to sort of weigh in on helping software teams.

Joel Beasley:

So because Flow helps them be more efficient and track their work, is that the basis of the program?

Cat Hicks:

Yeah, so that's certainly a part of it. We don't actually do product research with Flow, but we are a research team that does foundational research. So we study big topics of what it is that makes software teams really thrive, and that's something I love to do. I was in academia, I was also a consultant, so I ran an evidence science consultancy working with organizations on how they put research into practice. So this has always been my passion, almost like the open science side of doing this kind of work. So we do large foundational research projects, not just with Flow users, but with large-scale studies of engineers out there in the tech industry. And then we share those findings with the field.

Joel Beasley:

Has the question been answered? Amazon has two pizza teams. We just say "We don't need research, just two pizza teams. Everything's good to go."

Cat Hicks:

Right. Yes. It's so easy. If it were so easy, why would we be so worried about it all the time, Joel? We do have a large study out that we just launched on developer thriving. It covers the pieces that we think are really, really important for technology leaders to understand about what I would call the good problem-solving cultures that we create around software engineers. So it's not easy, and it's not solved necessarily, but I do think we have found some pretty exciting and helpful things for leaders to understand.

Joel Beasley:

Well, now you got to tell me, what did you find out?

Cat Hicks:

So in the developer thriving framework, what we did was we took a look at where we could pull from empirical research across our different backgrounds. So I have a background in psychology. One of my team members has a background in clinical research. We have some data scientists and design researchers, and we took a look at all of this empirical research from academia as well as applied research on just the things that really unlock innovation and problem solving for teams working together. So there are four factors that we came up with, and we called them agency, support and belonging, motivation and self-efficacy, and learning culture.

And we developed original measures of each one of these things. So on a developer team, for instance, for agency: do developers feel like they have a voice in how they're being measured, and if they disagree with something or they notice something is broken, do they have a way to speak up about it? And so when we went out into the industry and we did research on this with about 1,200 different software developers, we actually found that these factors significantly predicted their productivity. So that was super exciting to find as a social scientist.

Joel Beasley:

So I am curious, my background is software engineering, and I know you just said that you surveyed a lot of developers to get some of this information, but when you talk about unlocking innovation and problem solving, how do you determine that? How do you determine that the information I'm getting from these people means they've solved problem solving, or that they are good at unlocking innovation? Or am I thinking about it wrong?

Cat Hicks:

Can you tell me a little bit more? Maybe give me an example of this question? Yeah.

Joel Beasley:

Yeah. So you had said, I made a note. You said you were looking for what unlocks innovation and problem solving when you were doing your research, and how do you do that?

Cat Hicks:

Yeah, okay, sure. So let me tell you a little bit about how we try to measure these big human experiences, and one large way is surveys. And so you can build a really good survey out of measures that we've created in the social sciences that people tend to answer very accurately. Are you familiar with the DORA metrics and the DORA research in software engineering at all?

Joel Beasley:

No.

Cat Hicks:

Okay. So this is a large-scale research project that looked at elite and high-functioning organizations. And one of the things they did was actually run a lot of surveys asking people what was working inside of their environments. So we took a similar approach here. We developed some survey measures of how developers felt like they were experiencing these things on their teams. So for instance, you go into your organization and you'd ask your software developers, "How likely would you be to speak up if you felt like your team was being measured wrong?" This kind of question. And people are actually very good at answering this kind of question, especially when you do it at scale and over time and ask them kind of a battery of questions. So that's one of the things we've been working on in our research, actually creating some of these measures for software developer teams. Yeah.

Joel Beasley:

Well, that's pretty cool. So this DORA, it's been around for a while, it's an independent organization. How are they set up?

Cat Hicks:

DORA? Yeah, I don't want to misspeak because this is not my team, but the DORA metrics, the DORA four, is a series of metrics that was put out in this ongoing research project, and this team actually is with Google now, I think. And so they put out reports every year and they look at things like-

Joel Beasley:

[inaudible 00:07:13].

Cat Hicks:

Yeah, they look at things like deployment frequency, mean time to resolve changes, and other kinds of ways of measuring software work and how we deliver software work, which, as you've kind of alluded to, is really hard to measure, right? Along with successes in this field. So that's one kind of guiding light that we have for how to measure it.

Joel Beasley:

Yeah. I had read an article Google put out about research, and it might have been DORA related, but the takeaway that I have I think two or three years later is that psychological safety on a team had a huge impact.

Cat Hicks:

Yeah, yeah, absolutely.

Joel Beasley:

So you went around and you designed your own research and did your own surveying and you figured out the four things, and you said it was agency, motivation, self-efficacy, learning culture, and what was the fourth one?

Cat Hicks:

Support and belonging. So developers feeling like their team not only appreciated and cared about their technical work, but also supported them as a whole person, even if they were going to try to change, try to develop. That gets right to the psychological safety piece, I think. So we had that represented in this research too.

Joel Beasley:

Help me understand that better: support as a whole person. What does that mean?

Cat Hicks:

Yeah, so there's a really, really cool concept in social science called sense of belonging. Have you ever heard of this idea?

Joel Beasley:

No.

Cat Hicks:

Yeah. It's actually very tightly related to psychological safety. Psychological safety includes sense of belonging, and you can measure it. It's basically the belief that someone has that "A person like me belongs here, a person like me can succeed here. And even if I'm different from people around me or have a different perspective, I still belong here." It's kind of your sense of community, and this is a very, very powerful engine for people. It kind of helps you navigate rough patches, it helps you when you make a mistake. For instance, you might say to yourself, "That's okay, I can make a mistake and still belong here." So it's kind of this very important way that we find comfort when things are hard, and it matters a lot in STEM fields actually.

So there's some really cool and interesting research on how good we are at creating sense of belonging when somebody takes their very first engineering class in college, for instance. And unfortunately the answer is we're not very good at creating this. Students come in with a ton of stereotypes about who belongs in a certain field, who can do what kind of work. So sense of belonging, when you can create it, especially in these very important early moments for people, the first time you join an engineering team or the first time you take a coding class, something like that, has this long-term effect on people.

Joel Beasley:

Yeah. You want a positive experience with it. How does a college, let's say that, and I know this probably isn't your key job, but if you were to hypothetically put out to colleges some sort of standard for their engineering programs: you're doing basically nothing now, but if you do these three things, that will help you create a better sense of belonging.

Cat Hicks:

You're speaking my language now, because I used to do intervention science in education and have done it for tech organizations. This is the kind of question I love. And so there we have seen interventions that have really helped on sense of belonging. One thing that I'm thinking about is teachers in an introductory class, say you have the one computer science class that everybody has to take. That is one of your most high-value opportunities to change the minds of the largest number of undergrads who might ever encounter computer science. This is maybe the only time for some of them to make this decision about whether or not they can code.

So you can say, what we want to do in this introductory CS class is make sure we really set the tone in the beginning. First week of class, we show examples of people who have been really successful in coding careers, for instance. And we have those people talk about their early learning, their early times they struggled. And we make that a really explicit part of the conversation, because one thing that happens with all of this stuff is that students hold a lot of beliefs in their heads that are like, "Okay, maybe I belong in this field if only I never make a mistake ever." And that's very counterproductive. So making it something they see, that all these very successful people have also had this journey that I'm on, helps them draw that connection.

Joel Beasley:

So we call those interventions. That would be intervention science.

Cat Hicks:

Yeah. You can call it an intervention.

Joel Beasley:

Okay, what could I do on my team? Let's say I'm a leader. Engineering leaders listen to this all the time, everyone's growing. People are growing, they're bringing on new people to their team. What's something that they can do, something really small and easy? You don't want to give them too much homework, just something small, easy. Maybe it's just a perspective that they can walk around with, one way to create this sense of belonging on their team.

Cat Hicks:

Yes. Okay. I have one for you. Have you as a manager ever sat down with your direct reports and asked, "What does success mean to you?" Have you defined success? Do we share a definition of success? Are there maybe different things sometimes in our definitions of success? We've actually seen in research, in software engineering research with real teams, Margaret-Anne Storey, a researcher who's worked with Microsoft, has found that managers and individual contributors on software teams often have very different definitions of success. And something that's interesting is that they also don't talk about it very much. So I was at a conference last fall with one of my researchers, and she sat down at a table and was introducing herself to a manager who happened to be there with his individual contributor. And they said, "What are you doing here at an engineering conference? You're a psychologist."

And she said, "Hey, we do this research on how do people think about success and how do they think about staying motivated?" And the manager and the IC turned and looked at each other at the table and were like, "How do you think about success? Have we ever talked about this?" So she got to sit there and listen to them have this whole beautiful conversation. And so that's such a tiny but important thing you can do.

Joel Beasley:

100%. So I'm a founder at this company. I was an engineer, I was a co-founder, but this is the first time in the past six, seven years that I was the CEO and I didn't have a co-founder. So I got to learn the whole sales side of things, I got to manage people who weren't engineers. And that was a big learning experience for me. And luckily I kind of fell backwards into some of that, what you were talking about, because early on when I first started running engineering teams, we had this conversation: "It's done." "Well, is it? What do you mean? Did you finish writing it and it's in development? Has it passed staging? Is it in production? Are people able to use it?" And so we ended up having to come up with this definition of done, so that whenever we say the word done, like, "Yeah, that's done."

And there are six or seven people on the team involved in the project, and everyone has a different version of done in their head. And so we had to figure that out. And I did carry that over, because we're a fully remote company, so we lean harder on our KPIs for knowing that stuff is happening. Something that was fun to watch the bigger companies figure out in the pandemic, but we were able to do that. Defining success on your team is something a manager could do with their direct reports, their individual contributors, that would give them a slight bit of advantage.

Cat Hicks:

Absolutely. I think that that's a beautiful start to the conversation, because one of the things that I think about as a leader who works in data, as someone who's always worked on how we use evidence in organizations, I think about where we already have the answers inside of our organization, from our people maybe. And we hear in our research with software teams so much stress and tension and responsibility from engineering managers. So that was another piece that came out of this project we did on developer thriving: a lot of managers feel like they're responsible for everything and for having all the information, and they just don't have it. Your one human brain is not big enough to have it. So a lot of the recommendations that we put out in our research have to do with where you can actually find that information, and, if you're a manager, seeing yourself as someone who translates information and elevates it and amplifies the voice of your developers, rather than feeling that pressure to generate all of the answers yourself.

Joel Beasley:

And do you have direct reports?

Cat Hicks:

I do, yes. I lead this research team, so it's a very interesting crew.

Joel Beasley:

So you get to learn about it and then you get to use it within your team?

Cat Hicks:

Always. It's always very meta when you're a psychologist.

Joel Beasley:

Right? They'll call you out so fast.

Cat Hicks:

Oh, 100%. 100%, yes.

Joel Beasley:

Motivation, self-efficacy. I'm a very independent person. And my personality, I'm big on extreme ownership and figuring things out. I believe that once you take ownership of whether you're the problem or whatnot, it gives you the power to then change the variables so that you can improve the outcome. But when you say motivation and connect that with self-efficacy in this context, what is that?

Cat Hicks:

Yeah, so self-efficacy is also a concept that comes out of social science, and we've measured it in classrooms, and we've measured it when we study what it is that keeps people achieving over time, achieving long term. And there's a thing you have to do when you're trying to achieve a really big goal, which is basically get knocked down and pick yourself back up again. And so when I give people advice about something like keeping your New Year's resolution, which is the thing that people always want to know, you're a psychologist, how do I keep my New Year's resolution? I say it's not about never failing, but again, it's about that pickup after the failure. Self-efficacy is a big driver of that.

It's kind of like this self-talk that we can do to say, "I'm not sure what the solution is yet, but I know that I have the ability, the capability to solve it." So you can be high or low in your general self-efficacy. And we can measure that with developers. So we can say, "Hey, you carry around a lot of doubt. You're undermining yourself, probably not even aware of it, but you're going to give up sooner and you're not always going to see the creative solution, because you're not understanding that you're on this productive journey." So if you've ever heard of growth mindset or those kinds of things, self-efficacy is under that same category. It's kind of a version of growth mindset.

Joel Beasley:

What was the popular one that came up maybe about six, seven years ago with the, they felt like they were... Oh, imposter syndrome.

Cat Hicks:

Yeah. Imposter syndrome.

Joel Beasley:

So when I heard that, I was like, "Why would you feel like you're an imposter?" Because I googled it, I'm a nerd, so I looked up the definition in the dictionary and I'm like, "I don't understand this." After enough conversations, I figured, from my understanding of it, and you can correct me because you're the expert, but I just was like, "That's self-doubt." The way people are using the word, it means that they're doubting themselves. And that's been around since the beginning of time. So it was just the kids these days, I think, coming up with imposter syndrome.

Cat Hicks:

No, I understand. Sometimes on our team, we joke about the buzzwords of the day or the hot topic word of the day, but I always try to pay attention: if something really resonates with a lot of people, there must be something going on. People are trying to express something about their experience to you. There was a great HBR article, I think, about how we should stop telling people that they have imposter syndrome when actually their environment is just being terrible for them. Everyone would doubt themselves if they're in a place that's treating them badly. So that to me is the current version of this conversation about imposter syndrome: have we actually just come up with this word because it lets us avoid talking about whether our workplaces are really good for us? And I have a lot of empathy with that, because I see a lot of people, when you start to talk to them about their work, they might say, "Well, I'm not sure if I can do this. I'm not sure if I'm good enough. I'm not a 10x engineer."

All those things. And then the key question to ask, again, tip for managers, key question to ask when you hear that stuff might be, "So can you tell me why you think that? Where did that come from? Can you give me an example, right, of a time that you decided this is true about me?" And you'll often hear, somebody will tell you, "Well, I did my best to write this piece of code and then I had a really toxic code review and it was terrible, and I just decided this whole language isn't for me." You hear about these important turning points for people. And then again, intervention thinking, you can intervene on it. You can say, "Okay, well let's change the belief, because I think that maybe that feedback you got was wrong."

Joel Beasley:

You're brilliant. Is there a word for when words become really charged with emotion? For example, do you guys refer to specific words in culture that are currently really charged with emotion, where if you bring those up, you're like, "You don't bring that word up at dinner with grandma," or something?

Cat Hicks:

I tend to call words that loaded-

Joel Beasley:

Loaded.

Cat Hicks:

Loaded, loaded with emotion, loaded with energy, or it's like I've put something really heavy down on the table. There's also a word, this is a vocab word, I like to call some things a suitcase word. Like, "Let's unpack that, because you just set a huge suitcase down in front of me." The word productivity: what do you mean by productivity? Right? The word done, maybe, right? We have to sometimes use these huge words, but they hold a lot of different concepts inside of them.

Joel Beasley:

Yeah. That's something I've been working on more recently. And I'll share, I'm a very transparent person. So more recently we're working on our marketing automation, and we are visualizing it with this Lucidchart software thing. And we were going through it, and I was working with the marketing automation person, and then they went off and did some research and came back. And when I saw it, it was so far from what I expected it to be, in a not good way. And my immature early-leader instinct was to be like, "Nope, sucks."

But then I'm like, "That's just going to make my job harder because first I'm going to burn some credibility and some emotional capital with this guy by just saying it sucks." And secondly, "It's not going to actually get us anywhere." And no matter what, I'm still going to have to figure out the things that are "wrong with it" or that I need changed or that we need to discuss. So that's just something that I'm always working on is not giving such a quick no response. And I promise you, Cat Hicks, I am not perfect and I usually apologize or I usually catch myself. I was like, "Nope." And I'm like, oh wait, hold on a second. I shouldn't just say that. So I'm always working on becoming a better leader.

Cat Hicks:

Well, here's the great thing too, though: I think I face that fear too. We all do, of course. You make your career on trying to be really good at things and not make mistakes. And then you get into leadership, I think, and you have to be humble in this really new way. And it's difficult. I always have people say, what got you here is not what gets you success now, and you have to unlearn a lot of stuff. Something that I find really beautiful is when you move from the defensive stuff into the real talk. Like, hey, this really surprised me. I was wrong about this. I thought it was this one way and the data told me I was wrong. That is a moment where you have the opportunity to connect to people and you have the opportunity to build a lot of trust with people and have a very authentic conversation. So I think it can be really surprising, but I try to pass that on, from everything in our research to how I lead my team. I find that dynamic very true.

Joel Beasley:

So I like to listen to all the billionaires and read their life stories and understand how they think. That's part of my education in the business world. And one of the things that I saw a couple times, I think it was Bezos and maybe Musk or a couple of them, they said something along the lines of, the most important decisions that I've ever made were not made with data, they were made with my gut. So I put a little asterisk, I put a little pin in that, because what I think they mean is, I reviewed multiple sets of data and then I made a gut decision on what I felt after seeing the picture as a whole. So it's easy if you go to one end of the spectrum, extreme scientist, like data, data, data. It's really easy in hard sciences; it gets a little more murky in business. But how do you handle that? How much data should I be reviewing, or how do you answer these big questions?

Cat Hicks:

Wow. So that's a really easy question you dropped in front of me, Joel. Thanks for that. So I have a couple thoughts here. And I love this question, though. And here's one thing I would say. There's a story that I love. It was a letter from the president of the American Statistical Association. He was stepping down from this job and he wrote this final letter, and because I'm a huge nerd, I read the letters of the American Statistical Association. And he started this letter saying, if you're standing on a beach and you notice the waves are going out really far really fast, you do not need to do a 10,000-person survey to figure out that you should run. All you need is an N of one in that situation. That is the evidence that you need, that you need to get out of there. And so I think sometimes it's not really about data, it's about do we have evidence that's fit for purpose?

And sometimes the evidence that's fit for the decision we need to make is a huge survey, and sometimes it is the art form and the best decision we can make in that moment because we need to do it fast. Or we know that yes, we have quant data, but it doesn't measure something super important, and we're still working towards figuring out how to measure that thing. All of that stuff we bring together, I think, in our decision making. And rather than being afraid of that, I try to lean into it and say, "Where can it all inform the other parts of it?" So in our lab, we do qualitative research and we do quantitative research. That was one of the things that I was very excited to do with this team, because I think that human experience, we can measure it on a one-to-five scale at scale with a thousand people, and we can also have a long in-depth interview with somebody, and that can teach us something super profound. So that's kind of my version of gut instinct, I guess, is the call of research.

Joel Beasley:

I love it. And so you finished this big report, you studied these concepts, but what do you do now? What's next?

Cat Hicks:

Yeah, we have a roadmap we're really excited about. One of the things that we're actually diving into is that big suitcase word of productivity and how everyone in tech right now is saying we need to do more with less. It's a moment of a lot of fear and tension for people. And something that I care very deeply about is trying to shift our technology organizations away from thinking about just production for the sake of production, produce, produce, produce, that's how we're going to get results, and move them instead to what we would call performance thinking. What does it mean to build quality products? Just like how I think about data, it's really about the right quality data over big data. We think about what a sustainable productivity cycle is. So we're building some research on that right now and it's very, very interesting. A huge piece of it is looking at things like how engineering leaders actually define performance and whether that's different from how developers themselves see it. So again, those alignment conflicts are really there.

Joel Beasley:

Yeah, it's tough because there are a lot of competing things to consider: the value it's bringing to both the market and your user base, and also the vision the leadership team has. And in a good organization, those are often very well connected, but sometimes they're not. And then you have to figure out, you go through the translation. Is this what Flow does? I know I do want to talk a little bit about Flow.

Cat Hicks:

Oh, for sure.

Joel Beasley:

Is this what it helps with? Obviously you're doing research, you work with Flow, and I'm assuming this product, because I remember I did the interview with GitPrime with one of the founders and we talked about the moose and the meese and whatnot, and he had told me, but that was three to four years ago. Can you just remind me exactly what Flow is and how you interact with them?

Cat Hicks:

Absolutely, yes. So Flow is a tool that engineering teams can use to reflect on and measure their work. So it takes in all kinds of data from your software processes, and it's probably changed a lot in the last three or four years. We have a lot of cool features, and just last year launched things like the DORA metrics that I mentioned before, actually. And we're going to have some Flow data in our upcoming research, so you can watch for that. But what we see happening in our research on my team is, when software teams use measurement together as a reflective kind of process, and they use it in a way that developers agree with and appreciate, even though it's imperfect and no single metric ever captures everything important about software engineering, we see that this drives greater productivity. It also helps teams communicate about their work.

And so one of the things that we found really, really interesting in our research is, I think it was less than one in four of the software developers that we talked to were on a team that consistently used software metrics. And it was very surprising, especially to those of us who love to be data nerds and are very evidence based. But we also see things like more than 60% of developers in our research work with teams that don't share their same manager. And those other teams have very different measurement practices. So imagine being a developer and you're working really closely, you're coding with somebody back and forth, and they're being measured in a completely different way than you are.

That is a huge equity problem, I think, in our organizations. And so Flow can provide that conversational board and that reflection point for teams. It doesn't tell you everything that you might want to capture about your environment, but it really helps you start the process. And I think that it also replaces a ton of manual work that managers and even tech leads and senior developers might be doing themselves, because we're all living inside of organizations that are saying, data, data, show evidence of your impact. And so they're going maybe into their own work logs, into their own calendars, and creating their own measures. And really, we should be building tools for this to help our teams do this.

Joel Beasley:

Well, it's an incredibly hard problem that's been around since the beginning of software. When you said that, first of all, I fully agree: if you're working with other teams within the organization and you have different measurements of success and productivity, that can create a huge problem. You used equity as a word, and I don't know the definition or how you meant it. Can you help me understand what you meant by it being an equity problem when teams don't have the same measurement, and then are there other equity problems in the organization?

Cat Hicks:

For sure. So equity is a word that, again, is kind of a suitcase word, but equity has to do with: are people being met with the same resources, the same credit for the amount of work that they're doing? Do they have access to the same opportunities? So if you think about diversity, equity, and inclusion, if you've ever heard the acronym DEI, equity is a piece of that, and it really has to do, I think very basically, with fairness inside of an organization. And so again, sense of belonging, right back to the start of our conversation. If you look out into your team and you feel like, "Well, okay, I did the same work and it didn't get the same amount of credit," that's fundamentally a huge source of tension for people, and it decreases psychological safety. So you want to increase equity in your organization.

And I think that there are many kinds of equity. There's equity in how people get treated for their identities, of course, which is a huge thing we talk about. But there's actually a lot of equity in what kinds of engineering work get valued and get seen and get rewarded. So we see that measurement, actually thoughtful measurement inside of teams, has this effect.

In fact, in our developer thriving research, this was fascinating: developers were more positive about the benefits of using metrics. And so we had some beautiful quotes from a few people, because we coupled our quant research with qual research, again, the value of doing qual research. And we had some interviews with folks, and they mentioned things like, there's a great quote where someone said metrics can be an equalizer. It's a way that someone, maybe I didn't get seen before, or I wasn't in the right room at the right time because of all these factors, or where I worked or what team I was on, but then the metrics were there. They were a source of testimony or a witness to my work. So I'm very excited to follow up on that line of the research.

Joel Beasley:

Yeah, so you were using it in the context of people not being met with the same resources, not having the same measurement system. Is that correct?

Cat Hicks:

I think so, yeah. In this case. Or maybe they're just not being seen, right, in the ultimately most fair way. Mm-hmm.

Joel Beasley:

Yeah. Well, it's super important to figure out how to do that within an organization, largely because people do move between teams. Often people will move within the same organization. Is there an argument for having these different productivity metrics across different teams? Why is it that way to begin with?

Cat Hicks:

Yeah, I think it's kind of a question about the state we're in now, and I find it difficult to answer questions about the state of the tech industry, or maybe the state of software teams. There are software teams well outside of the tech industry, so it's not one state. And that's something that we see, because we see people who work in technology, but also people who work in financial services or retail, or maybe you're the one software team at a hospital. So I think that there's not one experience that people have, but some large trends that I see, kind of prototypical trends here, might be: we are really, really relying on individual software teams, whatever you say a team is, because that's also complicated. But individual software teams are holding a lot of information. And then I think we haven't had the conversation in this industry about how we move the right pieces of that information up to the organization, up to the tech function in general.

What do we share between teams without bogging ourselves down? Because we do need flexibility. We do need these teams to be able to say, "This is what we're dealing with. This is our situation. We understand it, we bring this context to it." So Flow does this, for instance, in that we have different ways that you can aggregate metrics up. And we think that that's a really important piece of this. You don't just say everybody is going to be able to bring the right context to an individual developer's metrics, for instance. Instead, you want to think about team-level metrics and velocity over time in a tech organization in a way that really helps you understand the organization as a whole.

Joel Beasley:

I think you would like this guy named Adam Barett. I've had him on the show three or four times, but I first met him under the discussion point of reliability engineering, and he talked a little bit about how he took the principles from physical engineering and manufacturing products over to software, and how he helped structure teams to have the least resistance on product delivery. And he did a lot of stuff. He's one of the smartest people. I feel like a monkey when he's in the room. I'm like, all right, I'm just clearly unevolved. This is humans version 2.0.

Cat Hicks:

Well, you're doing something right if you're always feeling like you're not the smartest person in the room. That's what I think, because I believe intelligence is kind of collective. When you talk about manufacturing, my grandfather was the foreman in a bag factory in Missouri, and I sometimes think I have gone so far from my family history in the world, it's so different, what I work on now. But sometimes I think, you know what, he's just the same. He was there in this big factory trying to keep people safe.

And one of the things that was amazing about my grandfather, he never went to college, he actually went and boarded with a farm family so that he could go past eighth grade in school, and he had developed this ear for how the machines sounded. So he could walk out into the factory and he could hear if something was going to break. And these were, back in the day, really dangerous machines, big and dangerous. And so he always told me it was the rhythm. He could hear the rhythm, and he couldn't translate it to anybody else, but there was that beautiful art in his ability to keep people safe from the factory. So I think sometimes people have that sense in manufacturing. We should learn from that.

Joel Beasley:

Yeah, I love that. Hard work he did too, building his family, building that foundation so that you can go to the next layer. Yeah, that's the one thing my dad had said. I'm 35 now, so when I started having kids around 27, I started asking dad questions of my dad, and I said, well, how do you think about being a dad and all of these different roles you have to play? And he said that his whole thought process was, "If we can just build a solid foundation for the next generation of the family, then that's a win, because eventually the foundation will be so solid and it'll be built up so much that they'll just take off like a rocket ship."

And he goes, "So that's what I focus on." He goes, "If I can raise you kids better than my parents raised me, if you guys can have slightly better finances, that's a step forward." And then it's like-

Cat Hicks:

That's huge.

Joel Beasley:

... now what am I going to do with my kids? It's just these small incremental steps. It's almost like thinking about your DNA, a strand over the course of millennia. It's like, what am I contributing here?

Cat Hicks:

I love that. That's so beautiful. And a huge value of mine is just to always think you can't see all the impact that you're going to have. You just can't know. People are going to do stuff more than you could have imagined, and you're not on this planet for as long as you might want to be. Right? And so long-term thinking to me is really about exactly that. That's so beautiful that your dad had that insight. It's like something is going to be better than my generation and it's going to be worth it to be one little piece in the chain that gets us there. Yeah. I think that's how our organizations have to think too.

Joel Beasley:

Yeah. Well, I do want to talk about your research and the Developer Success Lab, because people are listening. They're hearing all this brilliant insight from you about how to become better leaders and work with their teams, but they don't want to just be cut off from the Cat supply at the end of this interview. How can they hook in? Do you have a newsletter that the Developer Success Lab puts out? How do they get connected?

Cat Hicks:

We have better. We have better, we have a website, so definitely check it out. It's full of resources, and we're building it out really daily. So we've launched a website, it's at devsuccesslab.com. We also have a white paper series. Something that really, really matters to us is to do this rigorous research, but also to share it in ways that you can take back with you to your organization and find accessible. And so we have the full deep research report, but we also have a beautiful white paper series. It has illustrations.

If you think that way, I urge you, the people who are listening, to take it, use it, put it in a memo, bring it to your leadership team. We want to empower developers to use our research to argue for the things that are important to them. So our website is a great resource. We also run a freely available webinar series called the Developer Success Summit. We'll have another one coming up in a couple months, and our next white paper launch is actually next week, on visibility inside of engineering organizations. So you can find that on our website too.

Joel Beasley:

Yeah, I pulled it up as you were talking, and I love that it looks pretty. There are lots of graphics; for me that's important. I know myself, I am heavily a visual person. If you can explain it to me, that's great. You can write it down in text, that's great. But if you show me an image of it or show it to me in person, I get it super fast.

Cat Hicks:

I'll tell you, this is something that my team is really great at too. So my team, Dr. Cara Lee and Morgan Ramsey, two of our research scientists at the Developer Success Lab, they are both very visual thinkers. And I'll tell you, I am a word person. I love to write things. I write very long things, and this was a challenge for me. We're talking about leadership stuff. They came into our research and they were like, "Cat Hicks, we need a diagram. We need a flowchart. We need to illustrate this." And especially out of the qualitative research, which Morgan Ramsey helps to lead for us, there'll be these paths that people go on and important concepts, and she's a beautiful visual thinker. So we are excited to push in that direction, because I think that sells the story for people.

Joel Beasley:

What would be cool is for you to play around with running your research papers through, like, GPT visualization systems.

Cat Hicks:

Of course, yes.

Joel Beasley:

Be like "Make a comic out of this," and you just send them a 200 page research paper and then it comes out some hilarious Dilbert like comment or cartoon, and then it's really easy to understand.

Cat Hicks:

I'll tell you, we have been using it as a learning tool in our lab. We have certain ethical restrictions on what we can send it, because people agree to give us their research data. We take that very seriously. We don't put it on any other platform, but for our own learning, it's been really fun to go to ChatGPT and say, "Hey, how would you make a plot for this? And how would you make this plot more visually interesting? Or how would you make it more creative?" Some of those questions. And that's been a great learning exercise actually for our team.

Joel Beasley:

So I just figured this out the other day, so I figured I'd let you know too. I just found out that there is a way you can get your own self-hosted, isolated instance of ChatGPT that doesn't phone home or go back anywhere.

Cat Hicks:

Yeah, create your own space.

Joel Beasley:

Yeah, you can also, if you boot it up with this one specific service, I think Microsoft has guardrails on it, but with this other one, you can just launch the raw model that's out there. It doesn't have all the guardrails on it, so you can teach it whatever you want. You can give it whatever type of data you want and have it processed. Because I was curious to run all my company financials through it and start asking it questions and say, act like Dave Ramsey and help me with this. But you don't want to do that when it's on ChatGPT. You don't want to put your private research information in and have OpenAI's models, even though they're abstracted, I'm sure, learning from it.

Cat Hicks:

No, I think that's very smart. I think the future of this stuff is going to be where we make these models specific to our context, how we train them up in the ways that matter to us, and where we have the right guardrails, of course, for what's getting sent where. Yeah. So a really interesting time for this stuff.

Joel Beasley:

Yeah. I want guardrails on public services, but I want freedom on private services. So it would cost you roughly $1,000 a month on Amazon Web Services-

Cat Hicks:

Oh my gosh.

Joel Beasley:

...to just boot up the raw model, and you could optimize it and so on and so forth. But then there are now services coming out, so I'm watching it every day, because that's the problem with this stuff. I'm sure it's the same with research; this stuff is just coming out at such a rapid pace. How do you keep up with it when you're doing your research and all of that? How do you keep up with all of these new advancements and other researchers putting stuff out?

Cat Hicks:

Yeah, I know. I think, no matter what you're doing or working on, we're all living in this world where there's so much content and there's so much noise, and some of it's incredibly exciting and you don't want to miss the boat. I always try to challenge myself to say, picking one thing that's really good for your work is going to be much better than trying to do 20 new things a week. And I just sort of try to trust that process of doing that one really good thing and making that one adaptation. I also think there's something very interesting about what it is that we think is real work versus not. And I have a colleague, Philip Guo at UCSD, a computer science researcher there, and he did this work on conversational programming. So people who don't necessarily write code for a living all day long, but they actually need to be conversant in it.

They need to maybe do some diagnosis or be able to read code, be able to have tech conversations, and there are a lot of conversational programmers in tech organizations, but we don't make computer science classes for them. We don't really hire for that. We don't really know what to call it. And I think about that with things like AI models, LLMs, these models that are going to make certain types of work suddenly accessible to new groups of people. And it's very beautiful to me as a psychologist, because I'm like, "Great, we could have more time to tell stories, to do whatever, if we can replace the grunt work." But then of course, there's always the side of it where you're like, "Okay, this is scary. I have these technical skills, and suddenly maybe it doesn't matter that I have them. So what do I do now?"

Joel Beasley:

Yeah, Copilot's coming for your job, people.

Cat Hicks:

Yeah. I think Copilot, those things, they're never going to be able to come for code quality, for architecture, for strategy, for the people skills of code. People would like to spend more time doing that stuff.

Joel Beasley:

Well, I listened to the great theologian, Justin Bieber, and he says, "Never say never."

Cat Hicks:

Yeah.

Joel Beasley:

Oh man, this is great. Are there any calls to action? And by the way, I love that you brought up conversational programming. I know it instinctively, but you just put a label on it for me, so thank you for that.

Cat Hicks:

Awesome.

Joel Beasley:

Are there any other calls to action? Go to the website, sign up for newsletters. Go buy Flow, right? We should tell people to go buy Flow. Everyone knows Pluralsight.

Cat Hicks:

Yeah, we see Flow helping people. I think one of the big recommendations we have in our research is take a look at whether you're doing measurement inside of your engineering organization at all, and you might be surprised to find how little you're doing it. We find individual developers actually don't always even give themselves credit for how much they're working. It could be a really positive tool for developers who can be very hard on themselves. So that's maybe like a cultural call to action. A really concrete call to action is go download our white paper. The first one's available. You can read about the developer thriving framework. You can get a really beautiful illustration of it. And we also have these recommendations, many concrete recommendations. They come straight not just from us, but straight from the managers and the developers in our research. So we have tables of that, different starting places depending on your context. And you can look out for the next white paper next week.

Joel Beasley:

I just want to say thank you on behalf of all the engineering people for being out there researching this and coming up with insights to help us grow. It's very useful and we appreciate it.

Cat Hicks:

Oh, I love hearing that. That's the best thing we can hear. We're here to help.

Joel Beasley:

Thank you so much for listening, and if you found this episode useful, please share it with a friend or colleague who you think would get value from it. And if you have topics that you would like to hear discussed on the podcast, either add me on LinkedIn or send me an email, Joel@moderncto.io. Every time I get an email or LinkedIn message, it absolutely makes my day and inspires me to keep going.

Pluralsight Content Team

The Pluralsight Content Team delivers the latest industry insights, technical knowledge, and business advice. As tech enthusiasts, we live and breathe the industry and are passionate about sharing our expertise. From programming and cloud computing to cybersecurity and AI, we cover a wide range of topics to keep you up to date and ahead of the curve.
