How Leaders Should Use Data with Benn Stancil

Brian Sierakowski on February 03, 2022

Founder Chats is brought to you by Baremetrics: zero-setup subscription analytics & insights for Stripe, Recurly, Braintree and any other subscription company!

Like this episode? A rating and a review on iTunes would go a long way!

About Benn Stancil: Benn Stancil is co-founder and Chief Analyst at Mode Analytics, a company building collaborative tools for data scientists and analysts. Benn is responsible for overseeing Mode’s internal analytics efforts, and is also an active contributor to the data science community. In addition, Benn provides strategic oversight and guidance to Mode’s product direction as a member of the product leadership team.

To stay up to date with Benn’s writing, check out his substack.


About Mode: Mode helps businesses grow faster by speeding up answers to complex business problems and making the process more collaborative, so that everyone can build on the work of data analysts. The Mode platform combines the best elements of Business Intelligence (BI), Data Science (DS) and Machine Learning (ML) to empower data teams to answer impactful questions and collaborate on analysis across a range of business functions. In 2021, Mode added a number of premier enterprise companies to its customer base, which now includes Anheuser Busch, Bloomberg, Capital One, Conde Nast, Doordash, Lyft, Meredith Publishing, VMWare, and Zillow, among others. At this point, 52% of the Forbes 500 have turned to Mode for help with data-driven decision making.


Episode Transcript:

Brian Sierakowski: Benn, welcome to the podcast. How are you today?

Benn Stancil: Good, thanks for having me. 

Brian Sierakowski: Of course, it’s my pleasure. Well, let’s get started where we usually do, tell me about where your entrepreneurial journey got started?

Benn Stancil: I’m one of the founders of Mode; we build a product for analysts and data scientists. If you’re familiar with BI tools, or tools like Tableau, it’s broadly similar to those sorts of things. We started it about eight years ago, and that was and is my first and so far only foray into being a startup founder or entrepreneur or anything like that.

So I got started after meeting folks at a previous job. I worked at a company called Yammer, which was a SaaS company that built sort of a Facebook for work before Facebook for work was a thing. They got acquired by Microsoft in 2012. And so I met some folks there, we ended up having the idea for the product that we wanted to build based on some internal tools that we had built at Yammer, and then started the company off of that. I found myself in a position where it was an interesting idea with really good people I wanted to work with, and so it was kind of like, let’s take the leap. I’m not someone who went into it thinking my lifelong dream is to be an entrepreneur or a founder or anything like that; I did it out of interest in the idea and the team and then followed that path.

Brian Sierakowski: Cool. What was it like working at Yammer? 

Benn Stancil:  It was my first job in the tech world. So prior to Yammer, I was actually working in DC for a few years doing a very DC job. I worked at a think tank doing policy research. Think tanks operate as this bridge between academia and the policy world where they basically are a bunch of people who are experts on different issues, a lot of times are PhDs in these things economists, political scientists, experts on foreign policy, that sort of thing. 

And so what they do is write essentially research papers that are designed for policymakers, making policy suggestions like “we should do this about US-China relations.” I was there from 2009 to 2012, which was the midst of the financial crisis from 2008, so we were telling policymakers they should respond to the crisis in these various ways.

Our job was to make these policy recommendations, but we were very much doing it from a distance. We were one of tons of think tanks that do this. 

We write a bunch of papers, they basically get shipped off to congressional aides, those congressional aides may or may not read them, they may or may not pass them up the chain in the congressional offices or various policymaking offices. The people who are the actual policymakers probably never see them; maybe some of the ideas enter into the way that they think about policy.

But ultimately those folks are making decisions on a lot of other motivations, and the kind of pure academic research that we were doing was very rarely one of them. It didn’t really matter.

So I eventually left in part because of that, because you’re so far removed from ever actually seeing anything get done that the work is interesting, but you’re kind of yelling into the void. It’s perfunctory in some ways where it’s not actually making a difference. 

I left to eventually join Yammer and the role there at Yammer was actually structurally kind of similar, like the job is to find problems. I was an analyst so it was like, solve them with data, make recommendations to people so they can make better decisions. 

The difference was, rather than me sitting in an office in DC, writing things indirectly for hundreds of policymakers who probably didn’t care about it, I was doing this analysis for the PM that sat next to me, who was trying to make a decision that they needed to make tomorrow.

It was obviously much faster paced, and it was work where you could immediately see whether the things you were doing were right or wrong. It was still academically interesting to me, thinking about problems in the same way, but in a way where you could actually see the impact, you could actually see whether or not people would follow through on those recommendations, and people actually cared about what you were doing.

Brian Sierakowski: That’s really cool. Were you actively looking for a switch into the tech world? Was it opportunistic, or how did it cross your mind that all this hard work you were doing for basically no audience meant maybe you should find somebody who cares about what it is that you’re doing?

Benn Stancil: Tech was one of several things I think I was looking for at the time. I wasn’t sure what I wanted to do. I wasn’t then and I don’t know that I am now, but at the time I knew that I liked the kind of data work.

I knew that in that general galaxy of types of problems, I enjoyed it. I liked the econ way of thinking, too. Econ is kind of math, but not hard math; it’s more like applied math, where you have data and you want to figure out what it means, rather than being in the weeds solving hard technical problems all day. At least at that level, anyway; academic econ is becoming more of a technical thing.

And so I was looking for a numbers-y thing. I explored the possibility and came very close to actually doing an econ PhD. I was looking at jobs in finance, which have some similarities there, though I think that very much depends on the role, and I never actually did it, so I don’t actually know what those would look like.

I had some friends who I had worked with who had left to go to San Francisco. One of them went to Google had good things to say about it. So it ended up being kind of like, yeah, I was just looking across a bunch of different things that all kind of had basically, like data in the title and was kind of open to different opportunities. 

I ended up landing the job at Yammer on sort of good old fashioned nepotism. I had a friend who I worked with in DC, whose sister worked at Yammer, which doesn’t, like get you the job, but it basically gets you the interview.

I think that especially a lot of these startups, especially the Googles and Facebooks of the world, which again Yammer was not, but the very big companies that just get thousands and tens of thousands of applicants a day, it’s difficult to sort of crack into that process without having some way to sort of get your resume to the top of the pile. 

Basically, that’s what happened. I was looking at a bunch of jobs. I had a friend who was like, you should check this one out. I said okay, and that again put my resume at the top of the pile for the interview slates. I went through the interview process, I liked the job, and I was kind of like, I don’t have anything better to do, why not, and so took the leap.

Brian Sierakowski: How was it going through that interview process?

Benn Stancil: It was interesting, the way that interview worked. So this was in 2011, 2012. I don’t even remember if I interviewed in 2011 or 2012. It was at the height of the brain teaser phase of interviews that had gotten popular, especially in data stuff.

It was like, here’s a math problem, how do you figure it out? How would you solve this? So it was a lot of that. The phone interview was a couple problems like that. The on-site was a series of: here’s a circle, if you draw a line here, what happens? How do you think about these things qualitatively?

I remember that part being a little strange. Part of it, though, was, and I suspect this is true for a lot of folks who don’t work in tech and then get their first view of it, you forget how different it is from so much of the rest of the world, what those interviews are like, what those offices are like. The way tech operates is just very different, or was. I think this is becoming a little bit more normal.

But the finance interviews were very much kind of stuffy old school interviews where you wear a suit, you go into an office, and people grill you on various things. 

The tech interview was much more friendly, and so I think I was somewhat taken aback by how different that was, how different the offices were, all those sorts of things. I think for people who work in tech it becomes a thing you expect, like you take it for granted that’s how the working world is, and having come from a world that wasn’t that, it was a bit more of a culture shock, mostly in positive ways, at least through that process.

Brian Sierakowski: I don’t want to put you on the spot from a memory perspective but just curious if you remember, any of those, like brain teaser type questions that you were asked those are always so interesting to me.

Benn Stancil: I remember three or four of them, I think. There was one that was basically a pre-designated probability question along the lines of: one out of every X cars that drives by is a Prius. Suppose you sit and watch cars drive by for half an hour, how many Priuses are you likely to see?

Suppose you sit and watch cars for an hour, how many Priuses are you likely to see? What are the odds of seeing no Priuses in half an hour? What are the odds of seeing no Priuses in an hour? Questions like that.
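A quick aside on the arithmetic behind that Prius question: if one out of every X cars is a Prius and n cars pass during the window, the expected count is n/X and the chance of seeing none is (1 - 1/X)^n. A minimal Python sketch, with made-up numbers (the interview never specified X or the traffic rate):

```python
def expected_priuses(n_cars: int, x: int) -> float:
    """Expected number of Priuses among n_cars if 1 in x cars is a Prius."""
    return n_cars / x

def prob_no_prius(n_cars: int, x: int) -> float:
    """Probability of seeing zero Priuses among n_cars (cars independent)."""
    return (1 - 1 / x) ** n_cars

# Say 1 in 10 cars is a Prius and 20 cars pass in half an hour:
print(expected_priuses(20, 10))  # 2.0
print(prob_no_prius(20, 10))     # ~0.12
# Doubling the window (40 cars) squares the no-Prius probability:
print(prob_no_prius(40, 10))     # ~0.015
```

Note the doubling behavior: watching twice as long squares the no-Prius probability, which is presumably the kind of structure the interviewer wanted candidates to notice.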

There was a geometry question: draw a circle and put two points on the edge of it at random; those two points define an angle. Now put a third random point on the circle. What are the odds that the third point falls inside that angle versus outside of it? Not an acute angle exactly, an angle that’s less than 180 degrees, that part of it.
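One way to read that circle question (this is my interpretation of the setup, not necessarily the interviewer’s exact wording): two random points on a circle define a minor arc, and you ask how likely a third random point is to land on it. A quick Monte Carlo sketch; in expectation the minor arc covers 1/4 of the circle:

```python
import random

def third_point_on_minor_arc(trials: int = 100_000, seed: int = 0) -> float:
    """Estimate the probability that a third random point on a circle
    lands on the minor arc between two other random points."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        # Points as fractions of the circumference in [0, 1)
        a, b, c = rng.random(), rng.random(), rng.random()
        lo, hi = min(a, b), max(a, b)
        on_inner_arc = lo < c < hi         # c is on the arc that doesn't wrap past 0
        inner_is_minor = (hi - lo) <= 0.5  # that arc is the shorter of the two
        if on_inner_arc == inner_is_minor:
            hits += 1
    return hits / trials

print(third_point_on_minor_arc())  # ≈ 0.25
```

The simulation side-steps the geometry by working with arc lengths directly: the average length of the minor arc between two uniform random points is a quarter of the circumference.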

There was some sort of card shuffling question, like “you have to shuffle a deck of cards,” some sort of probability deal on that. And for some reason I have a vague recollection of a question that involved clothes, and some sort of probability deal about that one.

Most of it was probability, most of it was these questions about how you would try to solve some probability thing. I think a lot of what they were looking for, and obviously I’ve sat on the other side of the table for a while now, is that it’s not necessarily how quickly you get through the question, it’s more about how comfortable you are talking about this stuff. How comfortable are you with this kind of quantitative gymnastics? When you get a hint, do you respond to it well? Do you see the things they’re trying to get you to see? And so a lot of it was, I think, more about comfort with math and dexterity in it rather than, you know, can you win at Mensa.

Brian Sierakowski: Well, that’s actually much more reasonable, at least in some of the experiences, or different questions that I’ve heard people go through. And I think those questions would have been excellent at weeding me out of the candidate pool if I were in there, I don’t think I could answer any of those questions really well. 

It’s certainly effective from that standpoint of getting any non-math person out of the way.

Benn Stancil: The finance questions for those interviews, to the extent that they give you sorts of things, tended to be more of straight up puzzles of either something of like “you’ve got a fox and a chicken and a boat, and you have to cross and they can’t do this or you can do that” or the kind of classic “how many ping pong balls you can fit in an airplane?”, “how many manholes are in New York?” all that kind of stuff which is just like, seeing how well you kind of reason through these unreasonable questions.

Benn Stancil: The questions themselves, I don’t know that they’re a bad thing. I think there are ways those things certainly can be valuable for assessing folks. It depends on how you judge them, essentially. If you’re judging it as just, can you be a calculator and crush this thing?

I think that can go pretty badly. If you’re judging how people respond to the situation, do they work on it well, can they talk through it, that kind of stuff, I think there’s maybe something a little bit more reasonable there.

Brian Sierakowski: I’ve kind of already been guilty of this a little bit of like, there’s a certain inclination to make fun of these sorts of questions, because they are obviously silly on the surface level. 

But it is kind of a difficult challenge as a business owner, especially if you’re in tech, and you are hopefully doing things that have never been done before. 

How else are you going to be able to say, okay, this person’s going to be put in an environment where there’s no prior knowledge, and they need to figure it out? How good of a job are they going to do at figuring out some future state that we can’t even test them on directly, because we don’t know what it’s going to be?

So I think it’s really a good point that you make, of let’s put somebody in this unreasonable situation and see how they do. And I certainly agree that maybe one of the reasons why, at least for me, I don’t want to speak broadly, the “how many manhole covers are there in New York City” question is something you can make fun of is that if somebody’s judging you on how close you get to the answer, like we only hire people that get within 5% of that number, I think you’re using the tool a little bit incorrectly in trying to assess somebody’s ability.

Benn Stancil: Yeah, and I think that’s true. I mean, there are things that data folks tend to do, and people across the board who ask these questions. I think they can be useful tools, if used appropriately.

I think these questions do kind of lend themselves to being misused, partly because they become kind of tests for horsepower and people want to use them that way. 

There’s a lot of times where it seems like the interviewer wants to basically prove how smart they are. It’s not obviously what the interview should be, but interviewers will ask questions with the intent of seeing someone struggle, and be like, I know how to do this and I feel clever about having asked it, which I think is a pretty problematic way of doing it.

The way that this has evolved, and this is how we interview, and how my interviews have evolved, and I think it’s probably becoming a bit more of an industry-wide practice, is that instead of asking the unreasonable question that is a brain teaser, like how many ping pong balls can fit in a plane, you start to ask questions that are kind of similar in their unreasonableness, but are much more open ended and much more about the business itself.

So an example of one that I would ask somebody is “how much should we pay for a billboard?” It’s really hard to actually figure that out, but it’s more about how you would try to solve a problem that we may actually encounter on the job, one that’s difficult to measure, where you have to balance quantitative problems with figuring out how you actually assess the value of a billboard. What does that look like? How do you decide people are buying your product because of it? How do you actually connect that back to the billboard?

All those sorts of things, where it’s a mix of quantitative and qualitative problems, in a way where there’s no clear answer, where I can’t sit there as the interviewer knowing the answer, and grade you on whether you get exactly there.

It’s more of like, let’s work on this thing together and feel like we get to a point where we actually came to a place that we feel like is good and if we can get there, then great, we should work together. 

And if it’s a real struggle, that’s a different thing; then it’s like, maybe this isn’t a fit. So I think those tend to be better questions, where they’re structurally similar, but in a way that protects against “how smart are you, how fast can you think about stupid problems,” and is more about how we can work together on something that’s ambiguous and vague but actually reflects the way the job will actually be done.

Brian Sierakowski: Yeah, that’s really great, too, because it also means that you are being thoughtful about what the role is and what their responsibilities are going to be, which I’ve seen missed a lot of times. I’ve been guilty of this: hiring somebody without really having a crystal clear picture of what value I want this person to bring, or even what their job is.

How are we evaluating them? What are their goals? So if you know what those goals are, say you’re hiring a marketing person and you want them to work on distribution strategies, these are open questions, and you almost get to the point where, if you can convince me in the interview, if you say something and I’m nodding along like, yeah, that sounds great, that sounds good enough.

You got into the ballpark, and from there, if I were to hire you and you came to me with these proposals, I’d be like, that follows from as much data as we have, I don’t think you’re crazy, and I think you did your research, so let’s go for it. What more could you want than that?

Benn Stancil: Yeah, and data roles, I think, in particular, are an interesting kind of case study in that because as an analyst or data scientist, the thing you’re saying of like what’s the success in the job look like, it’s actually kind of hard to define. 

This is sort of a recent rant that I’ve been on and have written about some, but it’s actually hard to figure out what a good analyst or data scientist looks like. Not in the sense of what attributes they have, but say you have 10 of them in a job, how do you know which ones are better? Even seeing their performance, how do you measure that performance?

Because ultimately, their job is to kind of be an advisor, to recommend things and stuff like that. You can kind of look at the quality of the recommendations they make, but because it’s in effect, not formally, but in effect, probabilistic, where we should do this and it probably works out but sometimes it doesn’t, we can’t really use those judgments as a direct measure of how good folks are either.

And so a lot of it comes down to, I think, what you’re saying, where it’s basically how convincing is it? If you do some analysis, and I see it, and I’m like, sure, seems right to me, then it’s probably good. And that’s probably about as good of a judgment as you have. So in some ways that makes the interview process easier: give people a problem, and if you leave it feeling convinced about their answer, then they’re probably pretty good at the job, because that’s basically the job.

That’s not necessarily the way a lot of people approach it, but to me, that’s the ultimate job of the role anyway, and so if people are good at that, they’ll probably do what you ask them to do.

Brian Sierakowski: Yeah, it feels like a really hard job to say who’s the best analyst, because you’re right, if you just say, well, of all the analysts, which of their recommendations made us the most money, or moved the target metric the most?

And the answer to that would be, well, the analyst who did that was the analyst who got the best questions, the questions that were most likely to yield a lot of money.

So there’s that whole upstream effect: the person who was asked the question that leads to the thing that makes the most money is likely, you know, if everybody is within plus or minus 20% in skill, the inputs are what’s going to determine what their performance looks like.

Benn Stancil: Right, and there’s no counterfactual either. It’s like, well, they recommended this thing. We were weighing options, we chose option A, we didn’t choose option B, C, D, or E. We don’t actually know how well those would have done. If A did pretty well, was B great, so that we actually missed a lot of opportunity?

Did we dodge a bunch of bullets where B through E are all terrible? Those are things you don’t really know so it’s all kind of a subjective thing to me and a lot of that subjectivity is around how convincing they are and things that they put forward to the decision making.

Brian Sierakowski: Wow, it’s really interesting. One thing I’m thinking of too, thinking of your journey from the think tank to Yammer, I’m curious, kind of rewinding back there: how did your skill set need to change?

It seems like you’re very thoughtful about what this role is and what the requirements are. How did you feel like you needed to change, if you needed to change, to fit in at this new, faster-paced organization?

Benn Stancil: So going from the think tank into the startup as an analyst, there’s obviously a lot of change that has to happen, and there’s even more from being an analyst at a startup to founding one.

But for the first change, going from the think tank to the analyst role, there were probably two big things that were different. It was still trying to use math to solve problems, and in that sense the skills weren’t that different.

The biggest challenge was, one, there was some technical learning that I didn’t have, things like SQL. SQL is basically the technical skill that you really need. You’re not having to become a rocket scientist by any means, but I didn’t know any SQL before I started the job, and after a month of using it, I felt fine with it, and so it was like, okay, you can learn that.

But you did have to adapt to an environment that wasn’t downloading Excel files from a bunch of World Bank and IMF websites, and instead use something that’s a little bit more technologically scalable.

So I had to learn those pieces. I think that was somewhat of a change, where it wasn’t just that everything was thrown together for a particular paper and once you’re done with it, who cares, you just have a bunch of Excel files on your computer you’re never going to use again. Instead it was like, okay, we need to repeat this over and over again, we need to have dashboards and things like that, so there was some change to that mindset.

The other thing, which I don’t think was a change for me because it was the thing I was looking for, but it probably would be for some people, depending on how recently you’d come out of the academic world, is the pace, and the things associated with pace.

It’s not just, hey, we have to go faster, though that’s there. It’s that in order to go faster, we are more accepting of directional answers; you don’t need to cross all the Ts and dot all the Is, because the job here isn’t to write a paper.

The job here isn’t to do any kind of like formal math. The job is to do enough work, so that we can decide what we need to do and that’s good enough. 

There was some change around, for instance, and again this wasn’t my case because I wasn’t in it for that long, but if you come out of academia, when you ask a question, the thing you’re ultimately moving towards is writing a paper. So you need to have not just ideas, but more formal expressions of those ideas; you need to have stuff that’s a little bit more buttoned up.

Whereas in this case, it’s kind of like: get to the point where you have the idea, do enough work to feel like your idea is probably right, and then move on. And so there was a change in approach there, and I think people from different backgrounds had to adapt to different degrees.

In my case, it was relatively mild, in some people’s case, who, you know, were academics or PhDs, or postdocs, and had been doing that for 10 years, I think it’s a harsher change.

Brian Sierakowski: Interesting. That’s actually something I’ve been thinking a lot about and I’m wondering if you have any lessons learned that you can share there.

As we become increasingly more data driven, and data informed, on our side, I’ve noticed the same thing, and I’m very much a feelings guy, which is awful. I’ll go into a meeting and I’ll present the data and I’ll go, yeah, but it doesn’t feel like this.

I see what the data is saying, but as the person that’s the operator, it doesn’t feel like this. Sometimes I can dig deeper and actually find the slice that shows it, or perhaps I just didn’t do a good job the first time around presenting the data.

But I’m trying to find that balance between letting experiments run to statistical significance versus just getting direction. My first approach would be, we’ll just try it: if you have an idea, just try it and see what happens, which I think is too cowboy to operate at scale.

But I’m kind of feeling like maybe the second step is to swing to the other side and say we’re going to run experiments, and we’re going to run them until statistical significance, and then we’re going to feel pretty good that it’s going to work, whereas maybe we need to be somewhere in the middle. So I’m curious; you might be uniquely qualified to speak to this.
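For what “run until statistical significance” typically cashes out to in practice: comparing a control and a variant conversion rate with a two-proportion z-test. A stdlib-only Python sketch, with invented conversion numbers purely for illustration:

```python
import math

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int) -> tuple[float, float]:
    """Two-sided two-proportion z-test; returns (z statistic, p-value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via the error function)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical experiment: 120/1000 control conversions vs. 150/1000 variant
z, p = two_proportion_z_test(120, 1000, 150, 1000)
print(round(z, 2), round(p, 3))  # z is about 1.96, p is about 0.05
```

With these made-up numbers the result lands almost exactly on the conventional 0.05 cutoff, which illustrates the point of the discussion: clearing (or barely missing) the significance bar still leaves a judgment call.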

Benn Stancil: Yeah, I have a bunch of thoughts on that. So one, and this is the boring answer, both extremes aren’t really workable. You can’t be fully data driven in everything you do and just make decisions because data tells you to, but if you do everything by gut, you’ll have some misses, too.

I think there’s a nuanced version of that. To me, there is a tendency among data people, and anybody who’s data inclined, so it doesn’t have to be a data analyst or data scientist, it can also be the executives that put some faith in data, to use it as a crutch: to say, we want to run this test, and we’re not going to make a decision until we get the results on it. This is another thing that I’ve sort of been on a rant about.

To me, the effect of that is that people are in essence trying to offload the responsibility of making the decision, saying, well, I will just do what the data tells me. And the data is not actually going to get you an answer. Sometimes the best thing it’s going to tell you is, we don’t know what to do.

Or, it seems like it’s pointing in this direction, but I don’t know if it’s better. And we can’t run this experiment longer to figure that out, we can’t just do more analysis to figure it out; we’re just not going to know the thing we’re trying to predict. In effect, what a test is doing is trying to predict the future, by saying here is the effect now and we think it will continue, and there are times when you just cannot predict the future; we don’t know.

And I think people will sometimes use data too far in that way, where they’re uncomfortable making a decision, or just taking a leap and saying this is the thing we’re going to make a bet on, without being able to say, well, the data says this, therefore it’s the thing we have to do. So in those cases, yes, do the analysis to get a sense of the direction, do the analysis to keep yourself honest.

But that analysis isn’t ground truth, in the sense that there’s a lot that goes into the way you run the test, and a lot that goes into the ways you measure things. At some point, you have to make decisions under some amount of uncertainty, and I think we need to be more comfortable with that.

The other side of it is, when you’re looking at data and it doesn’t feel right, I think that’s an important thing to pay attention to. There’s this Jeff Bezos bit where, when he looks at data and it goes against his gut, the data is wrong, or incomplete, about as often as his gut is. It’s like, I’m looking at this thing, and it seems like it’s not reporting everything, or there’s some element it’s not capturing.

I think that is also a reasonable stance to take. I don’t think you can just say, well, I’m going to trust my gut over what the data says every single time. But to me, here’s what a gut feeling is, particularly from an executive or someone who’s in the business.

For instance, we make a recommendation to the sales team and they say, that doesn’t feel right; I can’t quite identify what it is, but it doesn’t feel right. What that’s really saying is, there is some experience they’ve had that they don’t quite know how to quantify or describe, and the recommendation doesn’t jibe with it. It’s not some nonsense feeling; it’s just hard for them to articulate what’s off, and that doesn’t mean it’s not real. It just means it’s hard to articulate.

We shouldn’t basically only take evidence that we can present in charts. If it’s something that’s not easy to chart, that doesn’t mean it’s any less important. It just means it doesn’t fit into a chart very well. 

And so to me, there has to be a balance here. We have to be careful about erring too far on the side of overusing data for everything, but obviously it has to be used as a check against what’s actually happening in the world. In some ways, data is like the senses of an organization, telling you what the business is doing. But that doesn’t mean the senses are perfect, and it doesn’t mean those senses are always to be trusted over any other instincts that you have.

Brian Sierakowski: That’s really interesting. I feel like Jeff Bezos is a pretty safe example of someone who seems to be doing a relatively good job of running their business.

It is interesting, and as you went through that description, it does feel like in those scenarios, when the feel is off, the vibe is off, you’re missing some sort of data. When you brought up the sales team, I thought that was really interesting, because sometimes when we’re going through these experiments, I’ve had that weird vibe about the data. Then it’s a week and a half later, I’m in the shower, and I realize we forgot about a very important piece. We talked about how many more trials we’re going to get,

but we didn’t think about the amount of time it’s going to take those trials to convert, or something like that. It’s a piece of data where, if you double your trials but you quadruple your trial length, that’s not actually really helping you. It’ll help you eventually, and that puts you in a position where you can decide whether it’s actually better or worse.
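That trade-off can be sketched with a few made-up numbers (every figure below is hypothetical, purely to illustrate the point): doubling trial signups does raise steady-state conversions, but quadrupling the trial length pushes all of the new revenue further into the future.

```python
# Hypothetical illustration of the trade-off described above:
# doubling trials while quadrupling trial length helps eventually,
# but it delays when any of those conversions actually land.

def steady_state_conversions(trials_per_month, conversion_rate):
    """Paid conversions per month once trials have had time to mature."""
    return trials_per_month * conversion_rate

def conversion_lag_months(trial_length_months):
    """A trial can't convert until it ends, so revenue lags by its length."""
    return trial_length_months

# Before: 100 trials/month on a half-month (two-week) trial.
before_rate = steady_state_conversions(100, 0.10)  # 10.0 conversions/month
before_lag = conversion_lag_months(0.5)            # revenue arrives in ~2 weeks

# After: trials double, but the trial length quadruples to 2 months.
after_rate = steady_state_conversions(200, 0.10)   # 20.0 conversions/month
after_lag = conversion_lag_months(2.0)             # revenue arrives in ~2 months

print(before_rate, after_rate)  # 10.0 20.0 (better eventually)
print(before_lag, after_lag)    # 0.5 2.0 (but four times the wait)
```

The conversion rate and trial counts are invented; the only point is that the "eventually" in the transcript is exactly the gap between the two lag values.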

But I also think, too, sometimes there’s an experiential thing as well. The reason it doesn’t feel good to the sales team might be that it leads to higher conflict, or that it’s going to be a more difficult, more challenging conversation to have with somebody.

I notice that in myself as well. The thing that feels off is that, especially if the data we’re looking at here is correct, then for the good of the team and the business, we’re actually going to need to be a little more confrontational with whatever group of people we’re talking about.

So I think it’s certainly a challenge to tell: are you just missing data, or is the data actually correct and you just don’t like the fact that you’re going to have to tell this customer they’re not paying enough money, or tell that customer they’re actually not a good fit for the product and we’re breaking up with them, or something like that.

Benn Stancil: Yeah, and if the argument for listening to feel is that it represents something real, and that data is never a perfect measure of what’s happening, I think the argument against it is the two things you alluded to.

One is, people will make decisions based on emotion. Not emotion in the sense of irrational; emotion in the sense of, I don’t want to do that thing, or this is the path of least resistance for me, or whatever.

And so I think people will find ways to back into those arguments. We decide we need to focus on this new market, and there may be people who just don’t want to do it, and okay, they’ll find ways to make that argument.

Data gives a way to make that argument, or experiences give a way to make it, in a way that doesn’t seem like an opinion. It seems like, look, this is my reasonable and rational argument for this thing, when obviously the end goal came first and everything else is just trying to get to that point.

The other thing, on the gut part, and this is particularly prominent: the higher you move up an organization, the worse people’s guts get. For instance, say we’re trying to figure out the biggest pain points our customers have with our product and the things we need to solve most urgently.

A salesperson’s sense of that, their gut on that, is probably pretty good, because they talk to tons of people. They’re assessing across a whole bunch of different inputs. Yes, it’s anecdotal, but it’s a wide range of anecdotes, and they pick up on the trends there.

For a sales manager, they hear less, so their gut is probably worse. A CEO basically hears nothing. A CEO gets pulled into a couple of conversations, and people latch on to those anecdotes: I have this one story, I tell it all the time, it’s the proof point I have for this thing. And in the CEO’s case, it may be that they got pulled in by the biggest customer, because that’s the customer the CEO gets involved with.

Those people were upset about something; they said the product is too slow, or it doesn’t have this one feature. That becomes the thing the CEO latches on to: this is the only thing that matters, everybody’s saying this, I’ve heard this. And that’s a gut that’s based on a lot less.

And so I think data is a better counterbalance in that case, because the gut feeling there is based on something much more limited.

It depends on where those things come from, but there’s certainly an executive tendency, when you’re more removed from the things you’re trying to get a sense of, to attach yourself to a couple of anecdotes and get emotionally invested in them.

And to see that as reality, whereas the folks on the ground who hear this stuff day in and day out are less inclined to get attached to one anecdote in an emotional way.

Brian Sierakowski: That’s really interesting, and that totally makes sense. The volume of exposure goes down the higher up you get. If you’re a frontline support rep, you’re talking to the pissed-off customers all day, every day, and you’re probably speaking to a large volume of them. As a salesperson, you’re talking to all the fresh faces, people who aren’t using the product or are very new to it, but you have a ton of touch points there, so you can certainly get a feel.

It’s like the classic story where you get the head of sales and the head of support into roadmap planning, and the head of sales says, we need these new features, we need to compete with our competitor, they have this thing, it’s so obvious to me that we need to build towards that.

And then the head of support says, we have these bugs that customers are frustrated with, or performance issues, or whatever the case is, and it’s so obvious to me that we need to go in that direction.

So it’s interesting, and it almost feels like, as it rolls up to you as the CEO, you’re hearing these two separate arguments, and those are basically the two data points you have. It’s almost about understanding the motivation behind the data and the anecdotes, if we’re falling back on that gut feel.

What’s the motivation behind it? Which feels like an impossible problem to solve for. The CEO might make a recommendation, and there might be pushback from the sales team because the CEO is effectively saying, we need to sell more, you need to do more, you were comfortable hitting your quota before and now we need to increase it. We have a bigger team, we have a bigger appetite, whatever.

And so the team might say, no, I don’t want to do that; I just finally got my feet under me at the numbers we set before. Or it might be that they’re already hitting their limit. There might actually be something valid there, like you were saying about moving into another market: the team might not want to do it just because they’re comfortable.

Or they might have some experience indicating it won’t work. Sales is really easy to pick on here, because they’re compensated directly on how well they do. So they might say, no, I don’t want to go sell to the education market, because that’s going to take forever, I want to get paid, it’s going to be really hard, the feature sets don’t line up, and it’s going to be months before I see any result from it.

And that’s actually what the company cares about, too. So it’s interesting to try to figure out what that switch on the motivation is, and maybe this is where the data can actually come in and help. Maybe that’s the true application of the data you have: helping differentiate the different feelings people might be having.

Benn Stancil: Yeah, I think that’s true. And this, I think, gets into what leadership ultimately is. You’re not going to have an answer to those questions. There will be salespeople who say, this is why this won’t work, and there will be data that says, this is probably why it will.

There will be people on the product team who say, this is what I want, and there isn’t an out. There is no solution that will tell you what to do.

It’s not, well, great, we’ll return to the data and the data will eventually solve this problem. It won’t. The salespeople will just not agree. They’ll have reasons that may or may not be articulated in the data, but the point is, you’ll have a bunch of different things telling you different things, and you’ll have to make a decision despite that.

And so I think that’s one of the things that people who come from data backgrounds and are growing into leadership positions have to let go of: this sense that if only we go further with what we look at, the answer will emerge. It won’t, in those cases. You have to make a call, you have to make it under uncertainty, and your job as the person in charge is to do that.

Your job is not to be the clever calculator that finally figures this out. Your job is to say, we can’t figure it out. Somebody still has to make a call, somebody still has to own that decision, somebody still has to tell the salespeople that we’re doing it the way they don’t want to do it, and I’m the one responsible for that, and they still have to be motivated to do it.

And there’s not going to be data that gets you out of that.

Brian Sierakowski: I’ve had almost that exact scenario before, where I had this feeling. I call it caveman mode. We were working really hard to get the data we needed to answer a question appropriately, and it just wasn’t coming together. It was challenging.

It was one of those cases where the deeper we went and the more data we had, the question wasn’t clearing up; it was just getting more complex.

And eventually I got to the mindset of: if it’s this hard to answer whether this thing is working or not, then it’s not working. And additionally, number go down bad, number go up good.

You get to a certain point where you have so much data and have created such a complex argument that, even repeating it now, I don’t necessarily know this is true in all scenarios, but it’s kind of: if this were working, we would know, and if it were really not working, we would know.

So that means it’s probably somewhere in the middle, and I don’t think we want to commit to a big change that puts us basically back where we are now, or worse.

So it’s interesting. Maybe I’m reading too far into what you’re saying to conclude, yes, that is correct, you must do that eventually. But that’s how I’m hearing it: once you go back to the data well two or three or four times, eventually you have to accept there’s no more water in the data well. You’re just going to have to make a decision, see what happens, and move on from there or try something else.

Benn Stancil: Exactly. There’s this sense of, well, if we keep going back to it, if we return to this enough, we’ll find a solution. To me, in some ways, it’s the same as consensus.

We have to make a decision, and I think some leaders view consensus as their job: there’s a sales leader who doesn’t agree with the decision, the product leader thinks we should probably do A, the sales leader thinks we should do B.

And they think the executive’s job, then, is to say, how do we get people on the same page? I think that’s wrong.

And I think that’s similar to continuing to dig through the data well, or fish through it, to get to a point where there’s an answer everybody finally agrees on. They’re not going to agree. Your job is not to get them to agree; your job is to figure out what to do despite them not agreeing. And so you have to give it up at some point and just say, this is what we’re doing.

I’m sorry, product leader, we’re not making the decision you want to make. Tough. That’s the job. We have to move on and make sure they’re committed to that thing, not keep looking for a way for them to ultimately look at the decision and say, I would do the same thing. Sometimes they’ll look at a decision you made and say, I would do something different.

That’s going to happen, and you have to get over it. You can’t make your job getting everybody to the point where, asked what they would do, they’d all say the same thing.

Brian Sierakowski: Absolutely. I’ve experienced that as well, and I’ve at least found a little value in going to the person who’s not on board and saying, yes, this is the decision, this is the direction we’re going. I’m very well aware of your view, you’ve been heard, and I understand you actually want to go a different direction, but here’s why we’re going this way.

Certainly, no decision we make is made in the realm of infallibility. This is what we’re doing, this is what we’re expecting to see, and this is what we’re expecting to happen. Here’s our out: if we try this for three months and it doesn’t work, here’s what our next steps are going to be. I don’t even need you to agree with me; I just need you to get on board and try it together, and hopefully you’ll actually agree with what we’re hoping to have happen.

But if it doesn’t happen, I’m not asking you to commit to this for life. I don’t want us to get married to the idea. We’re just going to try it, and we’re doing it for the greater good, which is probably a weird and potentially dangerous phrase to use.

It’s: you’ve been heard, and I understand where you’re coming from. Hopefully they really have been heard, but maybe they haven’t; either way, you can say, you’ve materially affected the direction we’re going, but this is where we’re going. We’re in a trial, let’s give it a shot, and I’m going to watch it. If it’s not working, I’ll be the first to say it, and I’ll take responsibility for that. Usually that’s enough to get people going in the same direction.

I was about to say, hopefully the team didn’t start off that far apart to begin with, but it depends on your team, what sorts of personalities you have, and how strongly people hold different opinions. That might actually be a really valuable thing for an organization, too: having a bunch of people who view the world strongly and differently, so you’re always getting a melting pot of ideas. But it’s probably wartime when it comes to actually making the decision and going in a specific direction.

Benn Stancil: Yeah, there’s no one path through it for sure.

Brian Sierakowski: Yeah. I want to be thoughtful of your time, but I do want to talk about your transition out of Yammer. I’m curious if you’d share a little more on what it was like starting your business and rotating out of the company. You also mentioned that your life goal wasn’t to become an entrepreneur; it happened almost accidentally.

So I’m just curious, how did that happen, and what was the process you went through?

Benn Stancil: I worked at Yammer on the data team, and after the acquisition by Microsoft, a few things started happening for me and a couple of other folks on our team. We started talking to more people around Silicon Valley about the data tools we had built, because once you get acquired, people like what you do and want to learn more from you. So we showed them the internal tools, and they said, these things are really cool; either we want to buy them, or we’ve built some internal version ourselves.

So basically, we started to see that the things we had built for ourselves, which we originally thought were a special tool for a special team unlike anything else, were actually a growing norm in Silicon Valley, and potentially a growing norm beyond that. And so for us it was, if people are building this, and a bunch of people want to buy it, why not just build it as a product?

To me, the decision came down to: what’s the downside? I would be walking away from a job at Microsoft, but I wasn’t super thrilled about that anyway, and it was an opportunity to start a company as an analyst, which frankly you don’t usually get. You don’t usually get to join early startups as a data person, because they don’t have data for you to do anything with.

You wouldn’t hire an analyst as your fifth hire, because there’s nothing for them to analyze. You hire engineers and designers and eventually salespeople and marketers, but the data people come later. And so for me, this was an interesting opportunity to do something I wouldn’t have a chance to do otherwise.

It was a product I believed in that could be successful, with people I enjoyed working with. Why not? One of the benefits of the way it happened was that we were coming out of an acquisition that was for a bunch of money at the time; Yammer got bought for about a billion dollars. That made a bunch of people some money, and in Silicon Valley, when people make money, the first thing they do is turn around and invest it. So we had easy access to seed money, essentially, to start it.

We didn’t have to bootstrap it for a long period of time; basically from day one we could go out and raise money, for our salaries essentially. There are good and bad things about that. If I were to do it again, I don’t know that I’d do it that way; I think there are reasons it’s good to bootstrap for a while. But it made the road easy.

It made the decision one where there wasn’t a lot of downside. There’s a lot of talk in the political world, and Silicon Valley promotes this too, about the boldness of entrepreneurs: these are the people taking risks, going out there to reinvent things.

In some cases, yes; in other cases, it’s not that bold. It’s a job you can immediately get a paycheck for, and you’re not risking your own money. That’s the position we were in: what’s the downside here? So my approach was, I was excited about the product, I was excited about the opportunity, but a big part of it was that it was something I wouldn’t have a chance to do otherwise, so I figured, why not.

There are obviously things I’ve learned after having done it for a while, things that were right and things that were wrong. But making the leap didn’t feel like a particularly bold leap; it felt like, this will be kind of fun.

Brian Sierakowski: That’s cool. It’s like you did the roadshow for what the product would be already. Did you find it easy to get your first couple of customers, or did that wind up being more challenging than expected?

Benn Stancil: It was easier than expected, probably. We knew people in the space, and a lot of people’s first customers are friends. They’re the type of people who know you’re starting a thing and want to try it out; somebody likes it, and you end up building it a little bit for them.

Breaking out of that, getting to the point where you’re getting customers from people you’ve never met, is harder. The first ten customers or so were all friends-of-friends types, where you get introduced because your friend knows somebody at a company that’s interested in the thing you’re trying to solve, and they’re willing to try it out, and you can sit next to them for a day and work through it with them, and they’ll like it, and all that.

Getting past that point is always a little hard, and for us in particular, the thing we struggled with was honestly the marketing side. It was a product that in some ways was a little early for its time, so the problem it was solving was not a problem people immediately understood.

It’s much easier, not easy, but much easier, to sell products where you can give an elevator pitch and people immediately understand what it does. Mode was never one of those products. It seemed like it was in between a couple of things, and when people asked, what exactly is it, we didn’t have great answers.

It was something we knew was valuable, and once people started using it, they’d say, oh, I get this. But we didn’t have a 30-second elevator pitch for it. That was always the struggle in finding customers: it worked when people tried it, but we couldn’t pitch it well to somebody who didn’t understand it at all.

Over time, you figure out what works and what doesn’t, and the market evolves. But in the initial phases, that was the biggest struggle, and it wasn’t the marketing per se. It’s not that our marketing was bad; I think the marketing was very good for what it was.

It was more that we were trying to sell something into a market that wasn’t quite there yet. It was more of a mismatch between the product, the market, and the timing of that market than a problem with our messaging.

Brian Sierakowski: It kind of sounds like positioning. Once people used it, they’d say, cool, I get it, but when you tried to explain it in words, they’d ask, how exactly does this plug in? I have this thing over here and this thing over there; how does this fit in the middle, and why is it better? Is that kind of the experience you were having?

Benn Stancil: Kind of. I mean, that’s how those conversations went. But I think it was less about positioning and more that the problem we were solving wasn’t a problem people had quite experienced yet, so we could have positioned it in all sorts of ways and they wouldn’t have quite gotten it.

We were super early to when that problem came around. Part of this was because we were solving a problem we’d had at Yammer, which was a very data-heavy startup thinking about these things at the bleeding edge of the way companies thought about data. So we were right in the bet that the world would eventually follow that path.

We were just early to it, trying to sell people a product when they’d say, this isn’t our problem. So to me it was less about positioning and more like any product that arrives a little early, where people ask, why would I need this, and three years later everybody says, oh, I understand why you’d do this.

If you were selling cost management software for AWS in 2010, you weren’t going to be very successful, because not many people were using AWS and people didn’t understand why it would be useful. But if you’re selling cost management software to AWS users now, they say, please, I need this. We were, in some ways, on the early end of that: trying to sell something where there wasn’t that big a market for it yet. But again, I think the bet on where the market was headed was right. The timing was just a bit off.

Brian Sierakowski: Do you think that timing was a benefit, from the perspective of being a little earlier than competitors who would have shown up otherwise?

Benn Stancil: That’s a hard question to answer. You get good things and you get bad things. It’s good in the sense that we learned a lot early; we made mistakes that, now that the market is where we wanted it to be, we’re not going to make again.

When I talk to early-stage founders in this market, they often say, here’s the way we’re thinking about it, and I can tell them, we tried that six years ago, let me tell you why that was a bad idea then. It may work now, but it was a bad idea then.

We’ve at least gone through some of those things, so we can hopefully avoid repeating those mistakes. You also have a head start on the things you’ve built, which matters.

The downside, I think, is that so much of Silicon Valley, and this is sort of a Silicon Valley problem, is momentum and pace. If we were to start Mode last year and say, hey, here’s this new product capturing this new market, the company would be much smaller, but it would be growing more quickly.

And a lot of Silicon Valley is just that hype, and there are benefits to it. You can mess it up, and it’s by no means a guarantee of success; there are all sorts of stories of things that were super hyped and then fell apart. But you kind of want to be on that hype train for a number of reasons: it makes it easier to raise money, easier to hire, and customers are more willing to try the product.

You want to be the thing people are talking about, and for that, trajectory matters more than position.

Oracle makes more money than all of us, but people don’t talk about Oracle, because Oracle has made more money than all of us forever and it’s not growing that fast.

The startup that went from $0 to $10 million, which is a rounding error on Oracle’s balance sheet, seems very exciting because it’s growing really quickly, and Silicon Valley is enamored of the shiny new thing. Being early basically means, obviously we’re not Oracle, but it means you’re not in that conversation about the shiny new thing, even though if you were to launch today, you potentially would be.

Brian Sierakowski: That’s funny. It almost feels like that’s something that might drive you a little crazy as somebody who’s super skilled with data.

For people to look at this insufficient data, see that a company is growing quickly, and just extrapolate that line: okay, they went from 0 to 10 million in a year, so that means they’ll be at 100 million next year, then a billion the year after, then 10 billion. And you’re like, yeah, no, the line doesn’t just extend out into eternity on the same trajectory.

Benn Stancil: Yeah, for sure, though I think people aren’t quite so crude about it. But there definitely is a lot of hype in Silicon Valley. Is it frustrating? Is it not? I don’t know. It is what it is.

Sometimes it’s great for you, sometimes it’s not. It’s just the way it works, and you have to learn how to play it and not get too caught up.

Ultimately, the job is to build a business where you have a product for people to buy, and hype can certainly help you on that path. But at the end of the day, you have to deliver on the product, deliver on service, deliver on building the machine that tells a good story about it and can sell it, and that’s hard work no matter the hype.

Brian Sierakowski: Yeah, hype is a tool, and if you can get it, use it. Just remember that it’s only a tool, and it can’t be everything.

I’m also curious, with our timeline, I feel like we’re pretty close to the current day: what’s going on today? What are you paying attention to? What are your plans over the next couple of, well, whatever length of period you plan over?

Benn Stancil: In the industry we’re in, like I said, when we started, we were a little early for it. There was a settling, probably three or four years ago, around new standards for the data tools people use; a settling in the sense that the foundations are now kind of agreed upon. This is what these things should roughly look like; this is how people and organizations should think about data.

The early debates people had, about the role of data, or whether you use this type of database or that type, feel like the battles of the past.

And so there’s a new explosion of different tools in the space, everyone trying to figure out where data goes, but it’s around a more settled foundation. And that settled foundation fits us very well on its own; Mode is well positioned within that new landscape.

And so our focus is basically, how do we take advantage of it? I think it’s a really good opportunity, a very big one, and we’re in a good spot to capitalize on it.

But you have to execute. You have to make sure you’re not getting too far ahead of yourself, that you’re still listening to customers, and that even if you see the path forward, you continue to validate that it’s actually the right path forward.

So for us, it’s about saying, all right, now is the time to really make this the thing it could be. A lot of that is, one, continuing to confirm that the ideas you have are right, continuing to get validation from the market, and, if you don’t get it, being humble about the possibility that you’re wrong about what this is and being willing to adjust.

And a lot of it is just execution. Good companies are not built on good ideas alone; they’re built on people building good products, on the day-to-day operations of how the sales team runs, and on whether you can actually get your message out there. It’s not a Mad Men speech from Don Draper that makes that good; it’s hard work.

It’s the daily process of, you know, executing well on marketing operations. It’s a daily process of having a good support team that responds to hard questions and does a good job. It’s a daily process of doing the diligence of reaching out to customers and making sure they’re happy, and if you’ve got customers that aren’t, engaging with them and talking to them.

There’s just a lot of stuff like that that has to happen to make it grow and to make it successful.

And so I think we’re in a position where we feel good about the market. We want to make sure that belief is right, so we’re going to continue to follow it, but a lot of it is, all right, how do we build the machine that needs to power all of this? That’s really the much bigger part of the job than the idea of what the shape of the machine should be.

Brian Sierakowski: Awesome. Well, Benn, I think that’s perfect advice to end on. I think we miss the operational excellence point pretty easily because it’s not very exciting, and it’s sometimes not very fun either.

Totally right, and I totally agree. Thanks so much for joining us. We’ll obviously have all of your links and everything connected to this episode, but is there anywhere in particular you would recommend sending people if they want to learn more about you or about the business?

Benn Stancil: Sure. So if you want to learn more about Mode, you can check it out there. You can sign up for a free trial of the product if you’re a data person and want to try it out, or check out any of the various resources and things like that there.

For me personally, most of the stuff that I do is basically writing something once a week. I have a blog, and it’s on a Substack because this is 2021 and you’re supposed to have Substacks, not blogs, now, I guess.

So the Substack, and then there’s the usual stuff on Twitter and things like that, though that’s probably much less interesting than the Substack.

Brian Sierakowski: Awesome, Benn. Well, thanks so much for joining us. I really appreciate it.

Benn Stancil: Thanks for having me.

That was our conversation with Benn Stancil, co-founder and Chief Analyst of Mode Analytics. If you need a collaborative way to visualize and use your data, you know where to go. If it’s business analytics and growth tools you’re looking for, check out Baremetrics. We hope you enjoyed this episode and invite you to check out our other Founder Chats, and if you’re able to share it with a friend or leave a review, it goes a long way. Thanks for listening.

Brian Sierakowski

Brian Sierakowski is the former General Manager of Baremetrics, an analytics and engagement tool for SaaS and subscription businesses. Before leading Baremetrics, Brian built TeamPassword, a password-sharing app that was acquired by Jungle Disk in 2018.