
3 steps for engineering leaders to evaluate their onboarding strategy

Onboarding strategies are critical to the success of engineering leaders. This guide offers tangible ways to evaluate yours.

Jun 8, 2020 • 3 Minute Read

  • Software Development

This is part two of our engineering leader’s guide to an onboarding strategy. If you want to start at the beginning, check out the first article: Why quality onboarding matters for engineering leaders. 

For engineers to perform well, individually and as members of a team and an organization, they require clarity, purpose and opportunity. How well they receive those necessities depends largely on how effectively they are onboarded. With so much on the line, it only makes sense to evaluate an organization’s current onboarding procedures.

Fortunately, most organizations already have data on their onboarding program’s results; they may just need to gather it. After all, if the proof is in the pudding, then the pudding is every engineer who has completed onboarding, along with everyone who left before finishing it.

The Kirkpatrick Model

One of the standard evaluation models for determining the efficacy of training programs is the Kirkpatrick Model. Most methods of evaluating the effectiveness of training programs, such as onboarding, use some or all of the four levels in this model. 

New hires will typically pass through each of these levels, and each one offers an opportunity to assess an onboarding program:

  • Reaction. This level of evaluation gathers new engineers’ responses immediately following their first day or week of training, as well as any other milestone when they can offer insights into how things are going (say, for example, at 30, 60 and 90 days). These responses give a clearer sense of the onboarding’s delivery—how well it’s received by the new hires.

  • Learning. How much of the content of the onboarding program, such as expectations set and the definition of their roles, do new hires retain? Some trainings use tests or assessments to determine how much trainees learned before putting them on the job; engineering managers may well give their newest developers small projects to see if they can create wins early on in their tenure.

  • Behavior. This level of evaluation looks at how engineers apply what they’ve learned. In a sense, this is the culmination of onboarding: how well do engineers integrate with the organizational culture, how productive are they and how quickly do they make meaningful contributions to the team and organization?

  • Results. This is the why behind onboarding. After all, an engineering team’s mission is to support and advance the organization’s goals. So whether it’s developing innovative products, improving customer satisfaction or increasing return on investment, evaluate whether the engineers are contributing meaningfully to that outcome. 

The Kirkpatrick Model is a general framework, and its levels aren’t necessarily chronological. After all, engineers may produce meaningful results in the very midst of the onboarding process. It’s nonetheless a helpful aid for thinking about how to break down the data available to collect.

The key, naturally, is determining what data to collect and how to apply it to evaluating an onboarding program.  

Diagnose the onboarding process

Three of the most widely used and consistently accessible metrics for evaluating onboarding processes are speed to competency, employee engagement and employee retention rate.

Speed to competency

Speed to competency is a high-leverage point for evaluating and improving an onboarding process.

The time it takes for engineers to become proficient, interactive contributors varies across job settings and onboarding styles. But speed to competency reveals how effective onboarding practices are as they are happening, which then enables leaders to implement changes for immediate impact. 

That said, measuring competency presents certain challenges because engineers do various things in their roles—exploring solutions, writing code, reviewing others’ work, discussing specs and architecture. Yet considering all that, most organizations share two desired outcomes of the onboarding process: they want new team members to create solutions and contribute to the codebase, and they want them to collaborate with their colleagues to produce even greater work together. 

The right insights into a team’s process allow leaders to better understand the trends and set benchmarks for new engineers. These helpful metrics include: 

  • Active days per week. The intuition behind reviewing active days per week is simple: are new hires getting the resources they need to start contributing to projects? And then, with time, are they getting comfortable with taking on more significant tasks? 
  • Involvement. This metric is the percentage of the team’s pull requests that an engineer participated in as a reviewer. We take involvement within the context of a team or organization’s average: if a team averages 20% involvement, onboarding may aim to ramp new engineers up to 10% within three months of starting.
  • Unreviewed PRs. This is the percentage of pull requests that didn’t get any reviews before being merged. Teams generally want to ensure their new hires are getting plenty of review on their PRs. They want to ensure that the code the new hires are writing is safe and meets their standards—but they also want to make sure that new employees are getting feedback from their peers and learning about the code.

Leaders can visualize trends across these metrics for each engineer hired. With time, they can use that information to understand how long it generally takes for a new hire to become a fully contributing member of the team. These metrics also offer a clearer sense of how a team aids in the onboarding process with its newest members.
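
As a rough illustration, here is a minimal sketch of how these three metrics could be computed from basic pull-request and commit records. The data shapes, field names and sample values are hypothetical and not tied to any particular tool; a real version would pull this information from source control or an engineering analytics platform.

```python
# Hypothetical sketch: computing the three speed-to-competency metrics from
# simple pull-request and commit records. Field names are illustrative only.
from datetime import date

# Each PR: its author, the reviewers who participated, and whether it merged.
team_prs = [
    {"author": "new_hire", "reviewers": ["alice"], "merged": True},
    {"author": "alice", "reviewers": ["new_hire", "bob"], "merged": True},
    {"author": "bob", "reviewers": [], "merged": True},
    {"author": "alice", "reviewers": ["bob"], "merged": True},
]
# Days on which the new hire committed code during the observation window.
commit_days = [date(2020, 6, 1), date(2020, 6, 2), date(2020, 6, 4)]

def active_days_per_week(days, weeks_observed):
    """Distinct days with activity, averaged over the observation window."""
    return len(set(days)) / weeks_observed

def involvement(prs, engineer):
    """Share of the team's pull requests the engineer reviewed."""
    return sum(1 for pr in prs if engineer in pr["reviewers"]) / len(prs)

def unreviewed_pr_rate(prs, engineer):
    """Share of the engineer's merged pull requests that got no review."""
    own = [pr for pr in prs if pr["author"] == engineer and pr["merged"]]
    unreviewed = sum(1 for pr in own if not pr["reviewers"])
    return unreviewed / len(own) if own else 0.0

print(active_days_per_week(commit_days, weeks_observed=1))  # 3.0
print(involvement(team_prs, "new_hire"))                    # 0.25
print(unreviewed_pr_rate(team_prs, "new_hire"))             # 0.0
```

Tracked per new hire and plotted over their first few months, the same calculations make it easy to compare each cohort against the team’s established baseline.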

Employee engagement

Here, the data gets less quantitative and more qualitative—but it’s no less valuable or insightful. First of all, merely asking teammates (both new hires and established engineers alike) for their input in a meaningful way can help them feel invested in the organization and its evolution. 

Second, these insights will help determine how people feel about the onboarding process at different stages of their integration. After all, how people feel in their job is a significant factor in determining their likelihood to stay. 

Onboarding programs can be gauged by how engineers work together

This phase of data collection can assess employee engagement through both the engineers and their managers. Information can remain anonymous by using a third party or a centralized means of collecting answers. Anonymity may help collect more direct and honest feedback. Again, the data is less concerned with individual cases; in evaluating onboarding processes, overall trends in the data are more meaningful. 

The data may yield more insight when categorized by how long both managers and engineers have been with the organization. And the feedback collected can extend beyond how engaged the engineers seem with the company’s goals, or how productive they are, to questions that reflect how healthy the environment is on engineering teams, such as how engaged developers are with their teammates.

After all, an onboarding program’s effectiveness is reflected in how engineers engage with one another both personally and professionally. Do they understand how their roles impact one another? Do they make an effort to socialize at work or outside of work? Seventy percent of employees report that having friends at work is the most crucial element of a fulfilling work life. Half of employees with a best friend at work report feeling a strong connection with the organization.  

In other words, paying attention to the personal side of engineer engagement will help determine how likely engineers are to stick around. Onboarding can establish and delineate roles within a team and how they rely on each other—which will aid engineers in interacting more smoothly. 

Clearly, this kind of data is entirely self-reported. It requires reaching out to managers and engineers to solicit their input, whether that’s in 1:1s or through an anonymous survey. We can group these conversations into three categories: feedback from new hires, exit interviews and stay interviews. 

Seeking feedback from new hires is likely the most direct line into how participants receive and experience an onboarding process. The more open leaders and managers make themselves to receiving feedback—and the more they demonstrate their willingness to act upon it, even if only to acknowledge the input they receive—the more likely they are to get frank and constructive responses from the new hires themselves. 

Here are a few ideas for questions to ask new hires: 

  • What have been some highlights of your experience? What are some challenges you’ve faced? 

  • How does your experience so far compare to how the company and job were presented to you during the application and interview process? 

  • What is still unclear about our company and your role? 

  • What would help you do your job better? 

Quantitative surveys—like the “evaluate this statement from 1-10” model—are another approach, which can work effectively as a supplemental source of data. These surveys are both easier to complete and quicker to compile. This data can also give a quick snapshot into how people are feeling at different points in the process. However, these surveys are less likely to give insight into the reasons why respondents answer the way they do. 

Consider offering any of these surveys, or different versions of them, at different benchmark points throughout the onboarding process. Leaders can tailor the timeline to fit their onboarding structure, but a common approach would be to fill out the surveys after one week, and then at 30/60/90 day milestones. 
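
For the quantitative version, a minimal sketch of rolling up 1-10 scores by milestone might look like the following; the milestone labels and scores here are invented placeholders.

```python
# Hypothetical sketch: averaging 1-10 onboarding survey scores per milestone
# so trends across the week-one and 30/60/90-day checkpoints are easy to spot.
from collections import defaultdict
from statistics import mean

# Each response: (milestone, score on a 1-10 scale). Sample values only.
responses = [
    ("week_1", 8), ("week_1", 6), ("week_1", 7),
    ("day_30", 7), ("day_30", 5),
    ("day_60", 6), ("day_60", 8),
    ("day_90", 9),
]

scores_by_milestone = defaultdict(list)
for milestone, score in responses:
    scores_by_milestone[milestone].append(score)

for milestone in ("week_1", "day_30", "day_60", "day_90"):
    scores = scores_by_milestone[milestone]
    print(f"{milestone}: average {mean(scores):.1f} ({len(scores)} responses)")
```

A dip at a particular checkpoint is a prompt to go back to the open-ended questions above and ask why.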

Exit interviews are perhaps self-explanatory. A team member is leaving the organization, and this conversation offers the chance to explore why: what influenced their decision to leave, and how is their new opportunity better than the ones their current job provided?

But exit interviews are also an excellent opportunity to discuss why they stayed as long as they did. This is a risk-free chance for developers to talk about what went well and what helped them grow as engineers and human beings, in addition to what they lacked or what could have served them better.  

Another opportunity to get this type of information is by conducting stay interviews. Stay interviews are different from all the other meetings engineers and managers have, from 1:1s to all-hands. They aren’t about the work being done together; rather, these conversations are about the engineers’ relationship with the company and why they are still choosing to stay with the organization.

These conversations must not disappear into the ether until the next year’s stay interview. Each engineer’s comments warrant a response, whether that means implementing changes, bringing ideas to leadership for discussion or simply acknowledging that an idea is being filed for the future because the organization can’t implement or explore it just yet.

Not only do leaders learn about the org’s processes from the people most involved with them, but leadership also demonstrates that the engineers’ experiences are valid and valuable. That step alone can contribute to their engagement with the organization and ensure continued insightful feedback for the duration of their time on the team.

Employee retention rate

Employee retention rate is a lagging indicator of an onboarding program’s success, and it can also reflect any number of factors beyond onboarding itself. The handy thing about tracking employee retention rate is that any organization already has 100% of this data, so long as it has kept records of every employee ever hired.

Orgs can use this data in several ways by looking at different benchmarks (a minimal sketch of a few of these calculations follows the list):

  • How many engineers ever hired are still with the organization, even if their role has changed?  

  • What is the average length of time that an engineer stays with the organization? (Count the durations of the ones still there, as well as the ones who have left.)

  • At what rate do engineers leave within one month of starting with the organization? Within three months? Six months? A year? Three years? 

  • How do these numbers look by team, function, manager? Look for trends. 

  • How have these numbers changed with time? Does that coincide with changes in onboarding? 

  • Are there any extenuating circumstances? If there are spikes or dips in the data trends, see whether they can be pinpointed to specific factors, like a new CEO or the departure of a beloved manager, and what insights those instances offer.
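
Here is the minimal sketch referenced above: a few of these benchmarks computed from nothing more than hire and departure dates. The dates are invented, and a real version would read them from HR records.

```python
# Hypothetical sketch: basic retention benchmarks from hire/departure dates.
from datetime import date
from statistics import mean

today = date(2020, 6, 8)

# (hire_date, departure_date) -- None means still with the organization.
engineers = [
    (date(2017, 3, 1), None),
    (date(2018, 9, 15), date(2019, 1, 10)),
    (date(2019, 6, 1), date(2019, 8, 20)),
    (date(2019, 11, 1), None),
]

def tenure_days(hired, left):
    """Days with the organization, using today for current employees."""
    return ((left or today) - hired).days

def left_within(records, days):
    """Share of all hires who departed within the given number of days."""
    early = sum(1 for hired, left in records if left and (left - hired).days <= days)
    return early / len(records)

retention_rate = sum(1 for _, left in engineers if left is None) / len(engineers)
avg_tenure_months = mean(tenure_days(h, l) for h, l in engineers) / 30.44

print(f"Still with the org: {retention_rate:.0%}")
print(f"Average tenure: {avg_tenure_months:.1f} months")
print(f"Left within 90 days: {left_within(engineers, 90):.0%}")
print(f"Left within 365 days: {left_within(engineers, 365):.0%}")
```

Grouping the same calculations by team, manager or hire date makes the trend questions above straightforward to answer.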

Evaluating the tenure of employees will help identify the key points in time when their flight risk is greatest. Think of this as a stress test on a length of chain: it identifies the weakest points by how the team’s engineers behave at those points. 

This gives leaders an obvious opening for early intervention. At the very least, managers can start talking to engineers before they reach that weak point in the onboarding process. If, say, the data shows a regular exodus after three months, managers can have more meaningful conversations with new hires in the first two months about how they’re doing, where they’re struggling and what would help them succeed. They may be able to stanch the bleeding, or at least discover why the bleeding is so bad.

Remember, too, that these evaluations look primarily at overall trends (as much as possible, depending on the size of the organization and how long it’s been around). An engineer leaving the company isn’t always and automatically a bad thing. Sometimes switching jobs is necessary for an individual engineer’s personal and professional growth. These transitions are to be celebrated. However, leaders can still evaluate the reasons the organization could not keep those engineers. Is the problem rooted far back in onboarding? Or is it more forward-facing—they ran out of advancement opportunities, or needed a new challenge that an established company couldn’t offer? 

There isn’t any magic number for a “successful” retention rate. But an existing history can reveal upward or downward trends, and an organization can also determine what a successful retention rate looks like for itself. 

Evaluate the resources going into an onboarding process

Maximizing the return on investment in the onboarding process also means evaluating the resources that go into onboarding. How much time and labor goes into bringing new hires up to speed? How much of that is integral and meaningful, such as mentorship and 1:1 meetings? And how much of it is superfluous, repetitive or otherwise less than beneficial? 

Evaluating the effort and resources that are being allocated toward onboarding new team members can help identify where work can be streamlined, automated or used more beneficially.

For example, one of the common culprits for misapplied resources is the infamous Day One hiring paperwork. Are new hires spending hours on that first day signing forms instead of getting to know their teammates? Is a staff member having to manually produce and review those forms when an automated program might suffice?  

A couple of hours of work per new hire may not seem like much, but those totals add up, especially in a scaling company: two hours across 100 hires a year is 200 hours, or five weeks of full-time work. And anything that makes the process go more smoothly for the newest engineers on the team sets the tone, right from the start, for how quickly the organization moves and how it values the engineers’ time and expertise.

Next up in our three-part guide to an engineering leader’s onboarding strategy, we’ll give you the tools to lay the groundwork for implementing your own onboarding plan.

Pluralsight Content Team

The Pluralsight Content Team delivers the latest industry insights, technical knowledge, and business advice. As tech enthusiasts, we live and breathe the industry and are passionate about sharing our expertise. From programming and cloud computing to cybersecurity and AI, we cover a wide range of topics to keep you up to date and ahead of the curve.