Colleen Ammerman (Gender Initiative director): Today, we’re talking with Mike Luca. Mike is the Lee J. Styslinger III Associate Professor of Business Administration at Harvard Business School, where he teaches a course called “From Data to Decisions: The Role of Experiments,” which looks at the rise of experiments in organizations. His research focuses on online platforms and how data from those platforms can inform managerial and policy decision-making. Thank you so much for joining us today, Mike.
Michael Luca (Lee J. Styslinger III Associate Professor of Business Administration, Harvard Business School): Thanks for having me.
David Homa (Digital Initiative director): Good to see you again, Mike. From your work, what are the biggest things that stand out for you about where technology treats people differently?
ML: It's a great question. It's a broad topic, and one I've been working on for a number of years now. Maybe one thing we could do is take a step back and think about the history of how the internet has evolved, and how people think about how it has evolved. When I started at HBS in 2011, the very first case study I wrote was on Airbnb. I wasn't teaching my own course that first semester, but I was invited to give a guest lecture on the case. At the time, people were still asking what platforms like this were doing and what their scope in the economy was going to be. And people hadn't thought at all about the potential for inequality on platforms and how that might trickle through the economy in what turned out to be important ways.
So, I taught the class, and I remember asking at the time, "If you were a landlord on Airbnb, should you have the right to reject guests?" A lot of people thought, yes, it seems obvious that you should have that discretion; you run whatever it is, renting out your apartment, a house, a handful of listings. But then I flipped the question and asked, "What if you're Marriott? Should you have the right to reject guests who are coming in and looking for a place to stay?" And pretty quickly, it became clear that the system people were used to making bookings in differed, in an important but subtle way, from new platforms like Airbnb and others that had made similar design choices at the time.
I had gone into that case study thinking about the design of reputation systems and how you build trust on a platform like Airbnb. But through the writing of it, it became clear to me that a platform like Airbnb wasn't giving a host much information with which to decide whether or not to accept a guest. You basically saw the person's picture and name, and not much else. And I started thinking: what is a host going to use when deciding whether to accept or reject a guest? It struck me that this could give rise to discrimination in an industry that had worked for decades to try to get rid of discrimination in hotels, in housing, in apartment rentals.
DH: Have you done any work to look at the patterns of discrimination? Do they match what we see in a traditional brick-and-mortar business? Is it similar, or are there qualities that are unique to online platforms?
ML: One area where I've tried to push in my research is thinking about the role of design choices in platforms, and how they could lead to more or less discrimination in an ecosystem. To put that into context: there had been earlier work on discrimination on the internet, from the late 1990s and early 2000s, positing that the rise of online platforms might actually lead to less discrimination in society. The argument was that instead of making purchases face-to-face, where you see all these markers of race, gender, and so on that give rise to discrimination in real-life transactions, you take some of that away and create more arm's-length transactions: you essentially replace a human-to-human purchase with one where you just look at a reputation score or rating, and then decide whether or not to buy.
There was some early evidence for that. And at the time, people were treating this as simply how things would evolve in the online ecosystem. Where we stepped in was to say two things. One, it's not necessarily true that that's how we should expect the internet to evolve. The first generation of online platforms did look like things were going to be more anonymous. But later platforms, like the Airbnbs of the world, started to get rid of a lot of that anonymity and prided themselves on reducing social distance by giving you lots of information, not only about the product, but about the person you're transacting with. So the first point we raised was that we should expect the possibility of discrimination, even though early work on more anonymous online transactions suggested there might be less.
And the second point we made, which is perhaps the more important one for platforms to think about, is that there are a lot of design choices they are making that are going to lead to either more or less discrimination. Take Airbnb. After that case study, we set out and ran a pair of academic studies to look at the potential for discrimination on the platform. To give a sense of the type of work we were doing: we ran a large-scale audit study, essentially the online analogue of what researchers had done with offline rentals in past decades. We sent out 6,400 requests to stay with different hosts, and the only thing we varied in each request was the name. For the names, we chose ones statistically more likely to belong to African American guests and ones statistically more likely to belong to white guests, following the methodology of earlier labor-market audit studies. What we found is that African American guests were about 16% less likely to be accepted than white guests on Airbnb.
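To make the comparison behind that headline number concrete, here is a minimal Python sketch of how an audit study's acceptance gap might be computed and tested. The counts below are hypothetical, chosen only to illustrate a roughly 16% relative gap; this is not the study's actual data or analysis code.

```python
# Minimal sketch of an audit study's headline comparison.
# The counts below are hypothetical, not the actual study data.
import math

def two_prop_ztest(x1, n1, x2, n2):
    """Two-proportion z-test for a difference in acceptance rates."""
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)  # pooled acceptance rate under the null
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return p1 - p2, z, p_value

# Hypothetical counts: 3,200 requests sent under each set of names.
white_accepted, white_n = 1600, 3200  # 50% acceptance
black_accepted, black_n = 1344, 3200  # 42% acceptance, ~16% lower in relative terms

gap, z, p = two_prop_ztest(white_accepted, white_n, black_accepted, black_n)
print(f"acceptance gap: {gap:.3f} (z = {z:.2f}, p = {p:.4f})")
```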
Now, we can take that result and ask: is that more or less discrimination than elsewhere? One comparison you can make is with going on Expedia or Orbitz, where there's not really scope for this type of discrimination. If I go and try to book a place through Orbitz, there's not going to be discrimination based on my name; maybe there will be discrimination when I show up at the hotel, but you've taken out a large chunk through the design choices Orbitz has made. You don't have a manager deciding whether or not a guest seems trustworthy. In contrast, Airbnb both surfaced information about your name and identity and gave hosts discretion to accept or reject based on that, with little other information. So there, you're essentially unraveling some of the changes that had led to less discrimination previously.
DH: I want to talk a little bit about design, because you looked at this issue quite early on. You mentioned the class you were teaching in 2011. Where were companies in their thinking about design back then, and have you seen that evolve? Is there a huge emphasis now on design and the impact it can have? Or is that thinking not evolving, with advances in technology still dominating? Do you have a sense of that?
ML: Ray Fisman and I put together an HBR article, and I had a series of conversations with Airbnb, other platforms, and policymakers about what kinds of design choices you might make. With Airbnb, an example of the type of thing you could do is decide how hard or easy it is to reject a guest. Do you penalize a host who rejects a lot of guests, and especially a host who systematically rejects African American guests but not white guests? That's one type of design choice. And you might ask, "Does Airbnb really dictate that?" Early on, one of the big questions people had was: is this just reflecting society, or is it reflecting Airbnb's design choices? There's a pretty interesting history of Airbnb's design choices on that.
Much earlier, around 2011, when we wrote that first case study, Airbnb was thinking about how much of a penalty to put on hosts for rejecting guests. Initially, they had strong penalties, because they wanted to make sure hosts were committing to let people stay with them, reducing what a market designer would think of as congestion on the platform: essentially, make it easy for me as a guest to go on and find a booking. But then some hosts had negative incidents. In particular, you may remember a post by a blogger nicknamed EJ, who had rented out her place, had it trashed, and then had trouble navigating Airbnb's system. Following that, Airbnb made it a lot easier for hosts to reject guests, essentially saying, you should feel free to reject guests for any reason, just so you can feel comfortable. Over time, that penalty has gone back and forth. And the point we made is that you shouldn't just be thinking about market thickness and congestion there. You should also be thinking about the implications for discrimination. Without measuring that, and thinking explicitly about it, you're not going to know what the impact on final levels of discrimination is. So that's one example of a design choice.
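As a concrete illustration of the kind of monitoring such a design choice implies, here is a small Python sketch that flags hosts whose acceptance rates differ sharply across guest groups. The data layout, field names, and thresholds are all hypothetical; this is not Airbnb's actual system, and a real implementation would want proper statistical tests rather than a raw gap threshold, since small samples make rates noisy.

```python
# Illustrative monitoring sketch: flag hosts whose acceptance rates differ
# sharply across guest groups. Field names and thresholds are hypothetical.
from collections import defaultdict

def flag_disparate_hosts(requests, min_requests=20, gap_threshold=0.25):
    """requests: iterable of (host_id, guest_group, accepted) records."""
    # host -> group -> [accepted_count, total_count]
    counts = defaultdict(lambda: defaultdict(lambda: [0, 0]))
    for host_id, group, accepted in requests:
        counts[host_id][group][0] += int(accepted)
        counts[host_id][group][1] += 1

    flagged = []
    for host_id, groups in counts.items():
        # Only compare groups with enough requests to give a stable rate.
        rates = {g: a / n for g, (a, n) in groups.items() if n >= min_requests}
        if len(rates) >= 2 and max(rates.values()) - min(rates.values()) >= gap_threshold:
            flagged.append((host_id, rates))
    return flagged
```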
DH: Yeah, this has been a theme across a number of our conversations: technology companies are technology companies first, and they think, "What are we doing to enable the platform, to run the platform? What technology choices do we make to build this system and make it run?" In the past, and even in the present, there's less of, "What's the impact on people, our users, society?" A lot of companies, especially early-stage startups, are dominated by engineers building, not by social scientists or product managers focused on the impact on customers beyond just using the product. So this is a theme that comes up over and over again in our conversations. And I imagine it's something companies have evolved to think about a lot more now.
ML: And the problem wasn't that there was some malicious player at Airbnb thinking, let's facilitate widespread discrimination. It was more that they didn't think about it at all. It was a blind spot they had when designing the product. In fact, when addressing our research afterwards, the CEO made tackling discrimination a priority for the company, and the way he put it was that they were three white men who had founded the company, and discrimination wasn't something that entered their consciousness on a day-to-day basis.
But, actually that’s a problem. And, it’s a problem that they didn’t have other stakeholders thinking about this. It’s a problem that they made design choices that were allowing discrimination on the platform. And, it’s a problem that they weren’t tracking the potential for it and then proactively trying to address it. So, then in the wake of this second paper, they started acknowledging some of these problems and thinking, all right, now what do we do? How do we move forward?
CA: I think we've all heard a lot at this point about how, when organizations try to prevent or mitigate discrimination or reduce bias, they sometimes end up either missing the mark, so the bias continues to be embedded, or actually exacerbating it. As you said, you have a take on how to think about those choices. I'm curious if you could talk a little more about how companies can avoid inadvertently perpetuating or exacerbating these problems when they have the good intention of working to mitigate them?
ML: We can separate this into a couple of issues. First, do companies have the right incentives to get rid of bias? That's an important policy question and an important managerial question, and you can see it in the position of Airbnb and other platforms. After our research, they had to grapple with, "Are we going to reduce discrimination?" And if so, is that going to change the profitability of the company? In cases where they could find something that both increased profitability and reduced discrimination, those decisions were clear. In situations where they would say, "Okay, we could cut discrimination in half, but this is also going to nibble into our profits at the margin," they needed to decide, "Are we going to take those hard steps and hope that, in the long run, it pays off in a more inclusive product that appeals to a broader base of users?" So when we see companies making decisions that aren't that effective, we should at least ask: are they doing that because they didn't know what would work? Or because the things that were working were viewed as too costly for them in some sense, at least in the short run?
I'll give a concrete example in the case of Airbnb. They did adopt a lot of the things we had suggested. One of the more interesting changes, I think, is that they created a team of people to look at discrimination on an ongoing basis, understanding that one set of changes in one area isn't going to solve the whole problem, and that it's something they need to be thinking about on an ongoing basis. Once you have that infrastructure, there are a lot of things you can think about doing. Now, when you look at those changes, was that enough? Over time, they've taken more and more steps. And from the people I've spoken with at Airbnb, there are very well-intentioned people working hard to try to figure out how to get rid of discrimination on the platform.
Now, think about how you know whether something worked. You make a change, you think it worked, maybe you've read a paper suggesting this type of thing might be effective. This is where I think tracking and evaluating via experiments or other empirical methods is super important for companies.
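For a sense of what "evaluating via experiments" can look like here, below is a minimal Python sketch, with entirely hypothetical numbers, of a two-arm experiment that asks whether a design change shrank the acceptance gap between guest groups. The difference-in-differences framing is a standard way to read such an experiment, not a description of any specific company's pipeline.

```python
# Hypothetical two-arm experiment: did a design change shrink the
# acceptance gap between guest groups? All numbers are made up.

def gap(accept_a, n_a, accept_b, n_b):
    """Acceptance-rate gap between guest group A and guest group B."""
    return accept_a / n_a - accept_b / n_b

control_gap = gap(800, 1600, 672, 1600)    # old design: 0.50 vs 0.42 -> 0.080
treatment_gap = gap(790, 1600, 752, 1600)  # new design: ~0.49 vs 0.47 -> ~0.024

# Difference-in-differences estimate of the design change's effect on the gap.
effect = treatment_gap - control_gap
print(f"control gap: {control_gap:.3f}, treatment gap: {treatment_gap:.3f}, "
      f"estimated effect on gap: {effect:+.3f}")
```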
CA: I would imagine there's probably always some portion of users who are quite committed to their biases, right? Who will pay the cost to, say, not rent to an African American guest, even if they're penalized for it. So I'm curious what you think: do companies have a responsibility to keep those people off the platform entirely? Or is it okay to treat them as just a cost of doing business, a discrimination cost, and say, "Well, we know design choices can't fully eliminate those kinds of actions, so we're going to let those people in and accept some baseline amount of discrimination"? Or do companies have a responsibility to ask, "How can we keep those users off the platform entirely?" I'm just curious what you think about that.
ML: I have my own personal beliefs about discrimination and platforms. But when I step back from that and think about the frameworks managers can adopt, the first question is what the legal rules are around whether you can allow discrimination on the platform. The second is what ethical guidelines you'll follow: how do you personally feel about running a platform that's facilitating discrimination? And the third, which sometimes gets overlooked, is how customers and employees are going to feel about a platform that's knowingly facilitating discrimination and allowing people to discriminate when it could just take them off the platform. Now, it's going to be hard to get rid of all the discrimination on a platform.
But when you get to a point where you know there are discriminators on there, and you just say, "Look, we know they discriminate, but it's profitable for us," that's pretty tough medicine to ask all of your other customers and all of your employees to take. So after we published this paper, and once everybody more or less agreed that there was discrimination on Airbnb, it got to a point where a lot of employees wanted the company to do something about it. And a lot of customers wanted the company to do something about it. That's part of the pressure that can help induce change on these platforms.
CA: We did want to get into this a little bit, and you've talked about it already with some specific examples, but you actually wrote a book with a colleague at Harvard Business School about the role of experiments in organizations today. So I just wanted to hear you talk a little bit about what exactly experiments can do to help us identify and address discrimination.
ML: It turns out experiments play an important role in helping to address discrimination. I can talk a little about the motivation for the book and how I started thinking about the issue more broadly. I've been working for years with my colleague Max Bazerman. He's a psychologist; I'm an economist. He came at this from the lab, thinking about how to take those findings out into policy. I came at it thinking about the design of platforms and the role experiments could play there. We had co-taught a course called "Behavioral Insights," where we took students to work on behavioral projects in government, thinking about how you can use behavioral insights to improve the social good.
We started thinking about how important experiments were there, in terms of knowing whether or not something would actually work, both in the government sector and in the tech sector. And we came together and thought, look, there's a lot of value experiments can create in organizations, and we wanted to help managers see that they, too, have to be thinking about experiments.
In the tech sector, experiments have been common for a long time now. But one of the things we noticed over the years is that, similar to design choices, experiments would often be treated as a decision made in the course of a project: of course we have an A/B testing system, but the thinking on it gets outsourced. And one of the things we realized is that this leads to a lot of challenges for managers in the tech sector.
One of the things we were thinking about is that managers should think about ways to create longer-term metrics for their companies. In fact, I can think of a couple of very large companies that have now created efforts to ask: can we think in terms of concepts more analogous to the true net present value of a change we're making, and bake that into an experiment?
The second thing we talk about a lot in the book, and that motivated us to write it, is that, similar to having short-run outcomes, companies sometimes track a very narrow set of outcomes: you're thinking about clicks, but you're not thinking enough about discrimination, customer sentiment, or the impact on other products. So you want a broader set of metrics. When we started thinking about this, we came to the conclusion that there are a lot of managerial questions we wanted MBA students, and companies, to be thinking about when they run experiments.
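As a toy illustration of what a "broader set of metrics" might mean in an experiment readout, here is a short Python sketch that scores one experiment arm on engagement, conversion, and an equity metric side by side. The event fields and metric definitions are hypothetical, invented for this example.

```python
# Toy experiment readout scoring one arm on a broader set of metrics
# than clicks alone. Event fields and metric definitions are hypothetical.

def summarize_arm(events):
    """events: list of dicts with 'clicked', 'booked', 'guest_group', 'accepted'."""
    n = len(events)
    by_group = {}
    for e in events:
        by_group.setdefault(e["guest_group"], []).append(e["accepted"])
    rates = {g: sum(v) / len(v) for g, v in by_group.items()}
    return {
        "click_rate": sum(e["clicked"] for e in events) / n,         # short-run engagement
        "booking_rate": sum(e["booked"] for e in events) / n,        # conversion
        "acceptance_gap": max(rates.values()) - min(rates.values()), # equity metric
    }

arm = [
    {"clicked": True, "booked": True, "guest_group": "a", "accepted": True},
    {"clicked": True, "booked": False, "guest_group": "b", "accepted": False},
]
print(summarize_arm(arm))  # {'click_rate': 1.0, 'booking_rate': 0.5, 'acceptance_gap': 1.0}
```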
So the book had multiple goals. The first was to help people gain an appreciation of the value of experiments; that's also one of the core goals I had in creating the course "From Data to Decisions": to help people understand how and where experiments can be used. The second is: what are the key design choices that should require managerial input? What are the outcomes you care about? Are you trying to understand the mechanism, or a specific effect size? Are you thinking about generalizing the insight to somewhere else, and if so, what challenges does that involve? So, the design choices, and then how you interpret the results and bake them into a managerial decision at the end. While there have been lots of technical guides on experimentation, we thought there was a real hole around how managers should think about going from an experiment to a managerial decision. That's what motivated the book. In it, we walk through, essentially, one-third behavioral experiments in government, one-third tech experiments, and one-third putting it all together to get to bigger experiments throughout society.
DH: Was there something for you personally that drove this, or was it just curiosity, like you said, once you started looking at platforms? What was the thing that made it click for you: wait, there could be discrimination here?
ML: I went to BU (Boston University) for grad school. I grew up in New York, so I'm a Yankees fan; there's a little bit of a tension there. But outside of that, I loved it and had a great experience. At the time, I had started off interested in economic theory and also health care. Within economic theory, the thing that drew me toward tech more broadly was information, and I started thinking about rating systems. The first paper I wrote was with my grad school friend Jon Smith, on the impact of the U.S. News college rankings, and there we were thinking about the behavioral component of it.
People respond to rankings. But what we found is that a lot of the response was based simply on how salient you make the information. To give a concrete example of the type of thing we looked at: over time, U.S. News [and World Report] has expanded the number of schools it ranks. We looked at a change in the mid-'90s. They used to rank schools 1 to 25, with 26 to 50 listed alphabetically, but the methodology was printed next to the rankings, so if anybody really wanted to do the math, and it's a pretty big decision, they could just do the math. What we found is that the rankings started to matter a lot more after 26 to 50 switched from being alphabetical to being rank-ordered. Once we saw that, we thought, well, it doesn't seem that people are responding just to information. It's information, but also the way you design and present that information back to people. That paper got me thinking that there are probably a lot of design choices platforms and media outlets make about how to aggregate information and get it back to people.
It was at HBS, my first year here, that I started thinking about Airbnb and the big role these platforms had the potential to play in society, even though they were only starting to play a big role at the time. And there I started thinking about some of the questions that people really seemed to be missing in practice, and also in research. While there had been research documenting discrimination, the question of how these design choices were going to affect the level of discrimination hadn't been a big topic at the time. And I thought it was something economics had the potential to shed light on, both by documenting the issue and by helping to provide frameworks for managers and for researchers.
DH: I like how, throughout your work, you obviously have all the skills and expertise to understand the math, but you're thinking beyond it to the implications. What are the implications for the people whose data feeds the math, and what are the implications for the people interpreting it? It's what we talk about a lot in the Digital Initiative: having a multidisciplinary approach to the problems we face.
ML: Yeah, the thing that really gets me excited about these questions is the possibility of helping to effect change at the end of everything.
DH: Yeah, and people often ask this question about “what does academia do?”
ML: It's an interesting question, what the role of research is in organizations and in helping to shape practice. There are different views on this, and different emphases, across disciplines and across areas. When I was a grad student, I was in an economics department, and at the time I had been thinking about economics departments and hadn't really been thinking about business schools as much. It was when I was looking for a job that I really started exploring business schools. One thing that excites me about business schools is the opportunity both to think about how research affects future research, and what the academic insights are, and also to think about the possibility of shaping practice through research, through translational materials, and through developing pedagogy and teaching in the MBA classroom.
CA: So, before we started this interview, we were talking about your family. You recently had a third child. And, I am curious — we were talking about this internally — is there a way in which being a parent has influenced the way you think about your work, especially with regard to this question around its impact on practice and impact on society?
ML: I do think about wanting to address questions. When I see a problem in the world where I think economics and data could have an impact, I want to be able to look back and say that, when I thought I could do something, I tried. We may not have achieved the big change, and there may be a lot left to do. But I can reflect back and say: we saw this problem, we tried to study it, we tried to help develop frameworks for others who want to solve it. That is something that excites me, and something I do think about sometimes.
CA: So, I would love to hear you talk a little bit about what you've observed over your time at HBS, as you've been talking with these companies over the past decade or so. I think we've seen the conversation in the tech sector evolve around equality, equity, and discrimination; that's one of the reasons we're doing this project. So I would love your reflections on where we are today.
ML: I’ve been at HBS since 2011, and one thing that struck me is that the way that the classroom talks about race, and the way that businesses are thinking about race, has evolved. You can think about it even in the context of the Airbnb work, but I think that this is a broader phenomenon.
When I first started teaching class days on Airbnb and on discrimination on platforms, you would spend a lot of time on "Is this something companies should be thinking about?" I would say there's been a shift toward people being more proactive, asking "What should we be doing about this?" rather than "Should we be doing something about this?" And I think that's true in a lot of areas of the business sector. I don't want to say it's universally true, either across companies or across problems, because I don't know, and it's hard to say; there are a lot of problems companies are grappling with. But I will say that, in the context of platform design, companies have become a lot more receptive to saying: we should get out in front of this, we should try to make changes, we should try to create an inclusive ecosystem.
So, when I think about the challenges that society is facing in a bunch of domains, and especially around race, around discrimination, it does seem, at least within the narrow corner of the world that I’ve been involved in — kind of like the tech sector and platform design — that there has been a shift. And, one bright spot is that there’s more attention that companies are paying to these issues, and more efforts to try to create platforms and create workplaces that are going to be productive, positive, and inclusive.
CA: I think you're absolutely right that in business generally there's been a real evolution, and I've definitely seen it outside the tech sector as well. There's variation, of course, but that's one reason I think work like yours is so important: it's helping shift the conversation away from business leaders asking, "Okay, what do I do? How do I just not discriminate? How do I be more gender-inclusive or racially inclusive?" and toward, "How do I develop a way of thinking about these issues? How do I build experimentation and looking at our data into the processes we engage in every day, and incorporate this into how we think about our work?" Which is quite different from "What do I need to do to comply with regulation?" or "What are the things I need to not do? What's the list of no-nos?" For so long, I think, that has been the way businesses approached these issues, and we've seen the limitations of that kind of compliance-oriented approach.
And so I think this work helps people understand, (a), some of the mechanisms, right? Experiments can help you understand why an outcome is happening, which gives you a lot more insight. And, (b), the shift to: how do we bake the right questions to ask, and the right kinds of analyses to do, into the way we think about things, the way we set up hiring, the way we actually design products? I'm optimistic about that shift in how companies think about it as well.
DH: Mike, do you have any suggested resources that the people watching this might dig into to learn more about this topic?
ML: There's a lot of great work in the area. One book that predates the platform-design literature is Nudge, by Richard Thaler and Cass Sunstein. In that book, they think about the role that behavioral economics and the framing of decisions play in outcomes in a lot of different areas; you could think about 401(k) decisions, for example. That book influenced the way I've thought about both platform design and the role experiments can play in organizations, even though those weren't central topics in the book. I think it's a useful resource for anybody looking to get into the area.
For people thinking about discrimination in hiring processes, I think What Works, by Iris Bohnet, is a great resource for learning where that literature is. And for people interested in algorithms and the potential for algorithmic bias, a recent book called The Ethical Algorithm provides a great overview of where the literature is, and what we know and don't know so far.
DH: Mike, thanks for joining us today.
CA: Yes, thank you for joining us, Mike. It’s been a really fascinating conversation.
ML: Thanks, Colleen. Thanks, Dave.
DH: That’s a wrap on the interview, but the conversation continues.
CA: And, we want to hear from you. Send your questions, comments, and ideas to justdigital@hbs.edu.