Growing Data Teams from Reactive to Influential

Source: Emily Thompson

Measuring progress towards company-wide data maturity

I’ve been on the job hunt recently, and while interviewing can be an emotionally draining exercise, it’s been really fun to make new connections with data teams of all shapes and sizes and to learn about their current challenges.

When it’s my turn for questions in an interview, I often ask potential direct reports what kind of support they need from their manager to help set them up for success. A common theme I’m hearing (and it’s not a new one) is that they feel like they are underwater; lots of requests are coming in to the data team, they’re not sure what to prioritize week-to-week, and they feel like they’re spinning their wheels more than making an impact on the business.

With the exception of a few companies that were founded with data as core to their culture, most places are somewhere on the road to becoming more “data-mature”. This manifests in a structure where the company’s data expertise is heavily centralized on the data team, while decisions about business priorities are not. As the experts on how to best leverage data, it falls to the data team to change the culture, even though the team starts from a position with very little agency to make that change. The data practitioners continue to try to provide their stakeholders with what they ask for in the hopes that company-wide data literacy will someday come.

So the requests pile up, and the team works reactively on data questions that may or may not be best framed for real business impact. The impulse to push back with prioritization processes is itself a symptom of being embedded in the very culture the team is trying to change, with all of the negative traits that come with it.


As a manager, I’ve used a framework that defines reactive, proactive, and influential stages to describe a data team’s positioning during a broader company-wide march towards data maturity. Using stages to describe data maturity isn’t new, but the frameworks I’ve seen are mostly applied at the company level rather than at the data team level, and many of them miss giving tactical advice for how to move from one stage to the next. In my experience, it takes more than a new prioritization process to get out of that reactive downward spiral.

Culture change doesn’t happen overnight, and while you’re in the middle of it, it’s hard to feel like you’re making progress. Just as with measurable goals in product development, setting explicit goals for data maturity forces us to quantify how much progress we’re making towards it. I’m going to talk about how to help the team make that progress, and the timescales on which I’ve seen that evolution happen.

Reactive stage: It starts by measuring a baseline

The first thing to do is take the data team’s feeling of being underwater and turn it into something quantitative. Figuring out where requests are coming from and centralizing them into a project tracking system adds some extra process upfront, but it’s invaluable to get that visibility. Some examples include encouraging the team to route requests into a central chat channel rather than private DMs, and to use a ticketing system to manage a transparent backlog, making sure it stays updated when requests are finished. A backlog tool can be useful to show leaders of stakeholder groups what kinds of requests are coming in from their teams, as they might not necessarily have that level of visibility themselves.
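As a concrete illustration of the visibility this gives you, here is a minimal sketch of summarizing a ticketing-system export by requesting team. The CSV file name and the column names (requesting_team, status) are assumptions for illustration, not the schema of any particular tool; adapt them to whatever your ticketing system actually exports.

```python
# Sketch: summarize a ticket-system CSV export by requesting team.
# Assumed columns: "requesting_team" and "status" -- adjust to your export.
import csv
from collections import Counter

def summarize_backlog(path="tickets_export.csv"):
    total_by_team = Counter()
    open_by_team = Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            team = row["requesting_team"]
            total_by_team[team] += 1
            if row["status"].lower() not in ("done", "closed"):
                open_by_team[team] += 1
    # Print teams by request volume, with how many requests are still open.
    for team, total in total_by_team.most_common():
        print(f"{team}: {total} requests ({open_by_team[team]} still open)")

if __name__ == "__main__":
    summarize_backlog()
```

Even a rough tally like this is often enough to show a stakeholder leader how much of the backlog originates from their org.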

Seeing the landscape of requests is a good start, but in order to drive real change in the way things operate, it’s important to know how those analysis results, experiments, dashboards, and other data artifacts are being used to make decisions. It should never be the data team’s job (or the data team manager’s job) to prioritize all of the incoming requests by themselves. In an ROI equation, the data team only knows the “I” with certainty, and can only speculate what the “R” might be if they don’t know what the higher-level goals of their business partners are.

When I started at Mozilla in early 2019, the data team was manually designing and analyzing bespoke A/B tests. I ran a survey on roughly 40 experiments completed in the first half of the year and asked the data scientist assigned to each of them what the outcome was. Not all experiments needed to have a significant result to be deemed useful, so examples of “good experiment outcomes” included “a decision was made to ship or not ship a change”, “we learned what was intended”, or even “we found technical issues with the experiment deployment software”, which would enable us to build a better system. The bar for an experiment having been worth doing was fairly low. 

When I looked at the survey results, almost half of the experiments had net negative outcomes, including some experiments that were shipped after the business decision had already been made (the “I’ve checked the data box” fallacy). The majority of negative outcomes, however, were cases where the data scientists weren’t sure what the results were used for after the analyses were completed, indicating that they were accepting the work without question when an experiment request came in. Seeing that the bulk of our work had no clear connection to its impact helped us understand that we weren’t having the right conversations with our business partners. After a few months of coaching the team, the business motivation behind the ask was known for almost every analysis.
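To make the survey exercise concrete, here is a minimal sketch of how such outcome responses could be tallied. The outcome labels loosely mirror the examples above, but the response data and the category groupings are hypothetical, not the actual survey from the article.

```python
# Sketch: tally experiment-outcome survey responses and compute the share
# of experiments with negative outcomes. All data here is hypothetical.
from collections import Counter

responses = [
    "decision made to ship or not ship",
    "learned what was intended",
    "found issues with experiment tooling",
    "shipped before results were read",
    "unsure how results were used",
    "unsure how results were used",
]

# Outcomes we would count as "worth doing" (a deliberately low bar).
POSITIVE = {
    "decision made to ship or not ship",
    "learned what was intended",
    "found issues with experiment tooling",
}

counts = Counter(responses)
negative = sum(n for outcome, n in counts.items() if outcome not in POSITIVE)
print(counts)
print(f"Negative outcomes: {negative}/{len(responses)} ({negative / len(responses):.0%})")
```

The exact categories matter less than forcing an explicit answer for every experiment, which is what surfaced the “unsure how results were used” pattern in the first place.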

Proactive stage: Focus on near-term wins for cultural change

Once you have a clear view of where requests are coming from and what they are attempting to solve for the business, you might instinctively start prioritizing based solely on ROI for business impact. But this doesn’t immediately gain you a seat at the table with the decision-makers, and in fact, might have the opposite effect. Stakeholders who feel that they are constantly being told “no”, even when their data partner says it’s in the best interest of the company, will start to feel like the data team isn’t on their side. Making cultural change a goal itself means that there…

Read the full article here.

Chelsea McCullough