February 27, 2017 by quiqcleanpro
The sharing economy has a discrimination problem. Studies have shown that the sharing economy isn’t as open as we think: People of color are discriminated against on platforms such as Airbnb, Uber, and Lyft. A study of ride-hailing platforms found that black passengers were subjected to longer wait times and higher cancellation rates than white passengers. A study of Airbnb found that guests with African-American-sounding names were 16% less likely to be accepted by hosts than guests with white-sounding names.
Documenting and proving discrimination in the sharing economy has been a crucial first step, but the more difficult question is how to prevent it from happening in the first place. We believe we have found an answer.
On Airbnb’s website, guests and hosts can rate each other, which turns out to be pivotal in combating discrimination. In a recent study based on a field experiment involving more than 1,000 Airbnb hosts, we found that having even one positive review on a guest’s profile statistically eliminates racial discrimination against that guest. In this study we created guest accounts with either white- or African-American-sounding names and generic scenery profile pictures, and then sent reservation requests to randomly selected prospective hosts. When a guest did not have any review information on their profile page, white guests had a much higher acceptance rate (48%) than black guests (29%). However, once each guest had at least one positive review, the acceptance rates became almost identical: 56% and 58%, respectively. When we manipulated the content of the review to include more negative information, we also saw reduced discrimination: African-American and white guests both struggled comparably to find a host.
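The effect is easiest to see as a gap in percentage points. A minimal sketch, using only the acceptance rates reported above:

```python
# Acceptance rates reported in the study (percent).
no_review  = {"white": 48, "black": 29}
one_review = {"white": 56, "black": 58}

# Racial gap in acceptance, in percentage points (white minus black).
gap_no_review  = no_review["white"]  - no_review["black"]
gap_one_review = one_review["white"] - one_review["black"]

print(f"gap without reviews: {gap_no_review} percentage points")        # 19
print(f"gap with one positive review: {gap_one_review} percentage points")  # -2
```

A 19-point gap with no reviews collapses to a statistically negligible 2 points in the other direction once a single positive review is present.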
To understand how a single review could have such a powerful effect, we need to go back to the literature on what causes racial discrimination in the first place. There are two major theories. One is that differential treatment of people of color is driven by imperfect information. This theory, which dates back to the 1970s and is often referred to as statistical discrimination theory, posits that rational decision makers may use group averages to infer individual characteristics when relevant information about a specific individual is incomplete. For instance, if a host believes that African-Americans as a demographic group have higher incarceration and crime rates, the host will associate a prospective African-American guest with lower quality. In the case of Airbnb, when prospective hosts receive a request from a guest, they infer the guest’s quality using all available information. If the available information about guest quality is insufficient, hosts may use race to infer quality. However, when enough information is shared, hosts rely less on race to make a decision, reducing discrimination.
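The mechanism behind statistical discrimination can be made concrete with a stylized precision-weighted model (our own illustration, not the study’s method; all numbers are hypothetical): a host’s estimate of guest quality is a weighted average of the perceived group mean and an individual signal such as a review, with the weight on the group mean shrinking as the signal gets more informative.

```python
def posterior_estimate(group_mean, group_var, signal, signal_var):
    """Precision-weighted estimate of guest quality: a noisier individual
    signal pushes more weight onto the group average, and vice versa."""
    w_prior = signal_var / (signal_var + group_var)  # weight on group average
    return w_prior * group_mean + (1 - w_prior) * signal, w_prior

# Hypothetical perceived group averages; same true individual quality.
mu_a, mu_b = 0.6, 0.8
group_var = 0.05
true_quality = 0.75  # what a positive review signals about this guest

# A vague signal (high variance): estimates hug the group averages.
est_a_noisy, w_noisy = posterior_estimate(mu_a, group_var, true_quality, 0.50)
est_b_noisy, _       = posterior_estimate(mu_b, group_var, true_quality, 0.50)

# An informative review (low variance): estimates converge on the signal,
# so the gap attributable to group membership shrinks.
est_a, w_info = posterior_estimate(mu_a, group_var, true_quality, 0.01)
est_b, _      = posterior_estimate(mu_b, group_var, true_quality, 0.01)

print(f"gap with vague signal:       {abs(est_a_noisy - est_b_noisy):.3f}")
print(f"gap with informative review: {abs(est_a - est_b):.3f}")
```

In this toy model the group-based gap falls by more than 80% when the individual signal becomes precise, which is the logic by which a single credible review can crowd out race as a proxy for quality.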
The other major theory suggests that discrimination is taste-driven. This theory assumes that a decision maker, regardless of what information they have, simply has a preference against agents of certain demographics. In this case, hosts may dislike a guest of a certain group even when the guest is equally trustworthy and ample information attests to it.
Depending on what drives discrimination, the approaches to combating discrimination can be drastically different. On one hand, if discrimination is largely a result of incomplete information, providing more relevant information will reduce people’s reliance on race as a signal. On the other hand, if discrimination is driven by intrinsic aversion to people of color, simply providing information is not sufficient, and more systematic interventions will be needed.
The fact that we no longer observe discrimination on Airbnb with guests who have as little as one review suggests that discrimination in the sharing economy is of the former variety — a heartening sign. Designing a platform that encourages sharing of truthful and relevant information can instill trust and reduce discrimination.
But withholding information from platform users, the approach taken by some companies, will not help reduce discrimination. For example, “reducing the prominence of profile pictures in the booking process” has been proposed by Airbnb as a means to address disparate treatment of black guests on its site. Our research suggests this will not solve the issue. Even using a profile image that doesn’t show the user, as we did in our study, won’t overcome the problem. It will likely just put more emphasis on other information that might serve as a racial cue, such as a guest’s name.
Our recommendation is for the platform companies to build a credible, easy-to-use online reputation and communication system. Bringing information to light, rather than trying to hide it from users, is more likely to be a successful approach to tackling discrimination in the sharing economy.