Faculty Publications (158 results)
- October 2024
- Article
Sampling Bias in Entrepreneurial Experiments
By: Ruiqing Cao, Rembrand Koning and Ramana Nanda
Using data from a prominent online platform for launching new digital products, we document that ‘sampling bias’—defined as the difference between a startup’s target customer base and the actual sample on which early ‘beta tests’ are conducted—has a systematic and... (An illustrative sketch follows this entry.)
Cao, Ruiqing, Rembrand Koning, and Ramana Nanda. "Sampling Bias in Entrepreneurial Experiments." Management Science 70, no. 10 (October 2024): 7283–7307.
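A minimal sketch of the general idea in the definition above (not the paper's method or data): quantify "sampling bias" as the distance between the composition of a startup's target customer base and the composition of its beta-test sample. All names and numbers below are hypothetical.
```python
from collections import Counter

def composition(labels):
    """Share of each category among the given labels."""
    counts = Counter(labels)
    total = sum(counts.values())
    return {k: v / total for k, v in counts.items()}

def sampling_gap(target_labels, sample_labels):
    """Total variation distance between the target and beta-sample compositions."""
    target, sample = composition(target_labels), composition(sample_labels)
    keys = set(target) | set(sample)
    return 0.5 * sum(abs(target.get(k, 0.0) - sample.get(k, 0.0)) for k in keys)

# Hypothetical example: target users skew mobile, but beta testers skew desktop.
target = ["mobile"] * 70 + ["desktop"] * 30
beta = ["mobile"] * 30 + ["desktop"] * 70
print(sampling_gap(target, beta))  # 0.4 -> the beta sample is far from the target mix
```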
- July 2024
- Article
Acceptance of Automated Vehicles Is Lower for Self than Others
By: Stuti Agarwal, Julian De Freitas, Anya Ragnhildstveit and Carey K. Morewedge
Road traffic accidents are the leading cause of death worldwide for people aged 5–29. Nearly all deaths are due to human error. Automated vehicles could reduce the mortality risks, traffic congestion, and air pollution associated with human-driven vehicles. However, their adoption...
Agarwal, Stuti, Julian De Freitas, Anya Ragnhildstveit, and Carey K. Morewedge. "Acceptance of Automated Vehicles Is Lower for Self than Others." Journal of the Association for Consumer Research 9, no. 3 (July 2024): 269–281.
- 2024
- Working Paper
Demographically Biased Technological Change
By: Victor Manuel Bennett, John-Paul Ferguson, Masoomeh Kalantari and Rembrand Koning
Who gets the jobs that automation creates? A consensus has begun to emerge that said technologies complement rather than substitute for labor. However, they also shift the demand for specific types of skills and other worker competencies. Such shifts imply unequal...
Bennett, Victor Manuel, John-Paul Ferguson, Masoomeh Kalantari, and Rembrand Koning. "Demographically Biased Technological Change." Working Paper, June 2024.
- May–June 2024
- Article
Setting Gendered Expectations? Recruiter Outreach Bias in Online Tech Training Programs
By: Jacqueline N. Lane, Karim R. Lakhani and Roberto Fernandez
Competence development in digital technologies, analytics, and artificial intelligence is increasingly important to all types of organizations and their workforce. Universities and corporations are investing heavily in developing training programs, at all tenure...
Lane, Jacqueline N., Karim R. Lakhani, and Roberto Fernandez. "Setting Gendered Expectations? Recruiter Outreach Bias in Online Tech Training Programs." Organization Science 35, no. 3 (May–June 2024): 911–927.
- April 3, 2024
- Article
How Automakers Can Address Resistance to Self-Driving Cars
By: Stuti Agarwal, Julian De Freitas and Carey K. Morewedge
Research involving multiple experiments found that consumers have biased views of their driving abilities relative to those of other drivers and automated vehicles. These findings have implications for the adoption of partly or fully automated vehicles, which one day...
Keywords: Technology Adoption; Consumer Behavior; Government Legislation; Prejudice and Bias; Auto Industry; Technology Industry
Agarwal, Stuti, Julian De Freitas, and Carey K. Morewedge. "How Automakers Can Address Resistance to Self-Driving Cars." Harvard Business Review (website) (April 3, 2024).
- March 2024
- Case
Unintended Consequences of Algorithmic Personalization
By: Eva Ascarza and Ayelet Israeli
“Unintended Consequences of Algorithmic Personalization” (HBS No. 524-052) investigates algorithmic bias in marketing through four case studies featuring Apple, Uber, Facebook, and Amazon. Each study presents scenarios where these companies faced public criticism for...
Keywords: Race; Gender; Marketing; Diversity; Customer Relationship Management; Prejudice and Bias; Customization and Personalization; Technology Industry; Retail Industry; United States
Ascarza, Eva, and Ayelet Israeli. "Unintended Consequences of Algorithmic Personalization." Harvard Business School Case 524-052, March 2024.
- 2024
- Working Paper
Warnings and Endorsements: Improving Human-AI Collaboration Under Covariate Shift
By: Matthew DosSantos DiSorbo and Kris Ferreira
Problem definition: While artificial intelligence (AI) algorithms may perform well on data that are representative of the training set (inliers), they may err when extrapolating on non-representative data (outliers). These outliers often originate from covariate shift,... (An illustrative sketch follows this entry.)
DosSantos DiSorbo, Matthew, and Kris Ferreira. "Warnings and Endorsements: Improving Human-AI Collaboration Under Covariate Shift." Working Paper, February 2024.
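As an illustration only (not the authors' algorithm), one simple way to "warn" a human collaborator about covariate-shifted inputs is to flag observations whose features fall outside the range seen in training and to "endorse" the rest. Everything below is a hypothetical sketch.
```python
import numpy as np

def fit_ranges(X_train):
    """Record per-feature min and max from the training data."""
    return X_train.min(axis=0), X_train.max(axis=0)

def warn_or_endorse(x, lo, hi):
    """'warn' if any feature of x lies outside the training range, else 'endorse'."""
    return "warn" if np.any(x < lo) or np.any(x > hi) else "endorse"

# Hypothetical usage with two features.
X_train = np.array([[1.0, 10.0], [2.0, 12.0], [1.5, 11.0]])
lo, hi = fit_ranges(X_train)
print(warn_or_endorse(np.array([1.2, 11.5]), lo, hi))  # endorse: inside the training range
print(warn_or_endorse(np.array([5.0, 11.0]), lo, hi))  # warn: first feature is an outlier
```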
- December 2023
- Article
When Should the Off-Grid Sun Shine at Night? Optimum Renewable Generation and Energy Storage Investments
By: Christian Kaps, Simone Marinesi and Serguei Netessine
Globally, 1.5 billion people live off the grid, their only access to electricity often limited to operationally expensive fossil fuel generators. Solar power has risen as a sustainable and less costly option, but its generation is variable during the day and...
Kaps, Christian, Simone Marinesi, and Serguei Netessine. "When Should the Off-Grid Sun Shine at Night? Optimum Renewable Generation and Energy Storage Investments." Management Science 69, no. 12 (December 2023): 7633–7650.
- 2023
- Working Paper
Words Can Hurt: How Political Communication Can Change the Pace of an Epidemic
By: Jessica Gagete-Miranda, Lucas Argentieri Mariani and Paula Rettl
While elite-cue effects on public opinion are well-documented, questions remain as to when and why voters use elite cues to inform their opinions and behaviors. Using experimental and observational data from Brazil during the COVID-19 pandemic, we study how leader...
Keywords: Elites; Public Engagement; Politics; Political Affiliation; Political Campaigns; Political Influence; Political Leadership; Political Economy; Survey Research; COVID-19; COVID-19 Pandemic; COVID; Cognitive Psychology; Cognitive Biases; Political Elections; Voting; Power and Influence; Identity; Behavior; Latin America; Brazil
Gagete-Miranda, Jessica, Lucas Argentieri Mariani, and Paula Rettl. "Words Can Hurt: How Political Communication Can Change the Pace of an Epidemic." Harvard Business School Working Paper, No. 24-022, October 2023.
- September 2023
- Exercise
Irrationality in Action: Decision-Making Exercise
By: Alison Wood Brooks, Michael I. Norton and Oliver Hauser
This teaching exercise highlights the obstacle of biases in decision-making, allowing students to generate examples of potentially poor decision-making rooted in abundant and unwanted bias. This exercise has two parts: a pre-class, online survey in which students...
Brooks, Alison Wood, Michael I. Norton, and Oliver Hauser. "Irrationality in Action: Decision-Making Exercise." Harvard Business School Exercise 924-007, September 2023.
- August 2023
- Article
Can Security Design Foster Household Risk-Taking?
By: Laurent Calvet, Claire Célérier, Paolo Sodini and Boris Vallée
This paper shows that securities with a non-linear payoff design can foster household risk-taking. We demonstrate this effect empirically by exploiting the introduction of capital guarantee products in Sweden from 2002 to 2007. The fast and broad adoption of these...
Keywords: Financial Innovation; Household Finance; Structured Products; Stock Market Participation; Finance; Innovation and Invention; Household; Personal Finance; Risk and Uncertainty; Behavior; Market Participation
Calvet, Laurent, Claire Célérier, Paolo Sodini, and Boris Vallée. "Can Security Design Foster Household Risk-Taking?" Journal of Finance 78, no. 4 (August 2023): 1917–1966.
- 2023
- Working Paper
How People Use Statistics
By: Pedro Bordalo, John J. Conlon, Nicola Gennaioli, Spencer Yongwook Kwon and Andrei Shleifer
We document two new facts about the distributions of answers in famous statistical problems: they are i) multi-modal and ii) unstable with respect to irrelevant changes in the problem. We offer a model in which, when solving a problem, people represent each hypothesis...
Bordalo, Pedro, John J. Conlon, Nicola Gennaioli, Spencer Yongwook Kwon, and Andrei Shleifer. "How People Use Statistics." NBER Working Paper Series, No. 31631, August 2023.
- June 2023
- Article
Amplification of Emotion on Social Media
By: Amit Goldenberg and Robb Willer
Why do expressions of emotion seem so heightened on social media? Brady et al. argue that extreme moral outrage on social media is not only driven by the producers and sharers of emotional expressions, but also by systematic biases in the way that people perceive moral...
Goldenberg, Amit, and Robb Willer. "Amplification of Emotion on Social Media." Nature Human Behaviour 7, no. 6 (June 2023): 845–846.
- 2023
- Working Paper
Auditing Predictive Models for Intersectional Biases
By: Kate S. Boxer, Edward McFowland III and Daniel B. Neill
Predictive models that satisfy group fairness criteria in aggregate for members of a protected class, but do not guarantee subgroup fairness, could produce biased predictions for individuals at the intersection of two or more protected classes. To address this risk, we... (An illustrative sketch follows this entry.)
Boxer, Kate S., Edward McFowland III, and Daniel B. Neill. "Auditing Predictive Models for Intersectional Biases." Working Paper, June 2023.
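Purely as an illustration of the risk described above (not the paper's audit method), the hypothetical sketch below compares a model's error rate within each intersection of two protected attributes against its overall error rate, which can surface a subgroup that aggregate fairness metrics hide.
```python
import pandas as pd

# Hypothetical predictions for a small audit set.
df = pd.DataFrame({
    "race":   ["A", "A", "B", "B", "A", "B", "A", "B"],
    "gender": ["F", "M", "F", "M", "F", "M", "M", "F"],
    "y_true": [1, 0, 1, 0, 1, 1, 0, 0],
    "y_pred": [1, 0, 0, 0, 1, 1, 0, 1],
})

df["error"] = (df["y_true"] != df["y_pred"]).astype(int)
overall = df["error"].mean()                              # aggregate error rate
by_intersection = df.groupby(["race", "gender"])["error"].mean()
print(by_intersection - overall)                          # large positive gaps flag biased subgroups
```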
- 2023
- Article
Provable Detection of Propagating Sampling Bias in Prediction Models
By: Pavan Ravishankar, Qingyu Mo, Edward McFowland III and Daniel B. Neill
With an increased focus on incorporating fairness in machine learning models, it becomes imperative not only to assess and mitigate bias at each stage of the machine learning pipeline but also to understand the downstream impacts of bias across stages. Here we consider...
Ravishankar, Pavan, Qingyu Mo, Edward McFowland III, and Daniel B. Neill. "Provable Detection of Propagating Sampling Bias in Prediction Models." Proceedings of the AAAI Conference on Artificial Intelligence 37, no. 8 (2023): 9562–9569. (Presented at the 37th AAAI Conference on Artificial Intelligence, Washington, DC, February 7–14, 2023.)
- 2023
- Working Paper
Setting Gendered Expectations? Recruiter Outreach Bias in Online Tech Training Programs
By: Jacqueline N. Lane, Karim R. Lakhani and Roberto Fernandez
Competence development in digital technologies, analytics, and artificial intelligence is increasingly important to all types of organizations and their workforce. Universities and corporations are investing heavily in developing training programs, at all tenure...
Keywords: STEM; Selection and Staffing; Gender; Prejudice and Bias; Training; Equality and Inequality; Competency and Skills
Lane, Jacqueline N., Karim R. Lakhani, and Roberto Fernandez. "Setting Gendered Expectations? Recruiter Outreach Bias in Online Tech Training Programs." Harvard Business School Working Paper, No. 23-066, April 2023. (Accepted by Organization Science.)
- 2023
- Working Paper
Applications or Approvals: What Drives Racial Disparities in the Paycheck Protection Program?
By: Sergey Chernenko, Nathan Kaplan, Asani Sarkar and David S. Scharfstein
We use the 2020 Small Business Credit Survey to study the sources of racial disparities in use of the Paycheck Protection Program (PPP). Black-owned firms are 8.9 percentage points less likely than observably similar white-owned firms to receive PPP loans. About 55% of...
Chernenko, Sergey, Nathan Kaplan, Asani Sarkar, and David S. Scharfstein. "Applications or Approvals: What Drives Racial Disparities in the Paycheck Protection Program?" NBER Working Paper Series, No. 31172, April 2023.
- 2023
- Working Paper
Feature Importance Disparities for Data Bias Investigations
By: Peter W. Chang, Leor Fishman and Seth Neel
It is widely held that one cause of downstream bias in classifiers is bias present in the training data. Rectifying such biases may involve context-dependent interventions such as training separate models on subgroups, removing features with bias in the collection...
Chang, Peter W., Leor Fishman, and Seth Neel. "Feature Importance Disparities for Data Bias Investigations." Working Paper, March 2023.
- 2023
- Working Paper
The Limits of Algorithmic Measures of Race in Studies of Outcome Disparities
By: David S. Scharfstein and Sergey Chernenko
We show that the use of algorithms to predict race has significant limitations in measuring and understanding the sources of racial disparities in finance, economics, and other contexts. First, we derive theoretically the direction and magnitude of measurement bias in...
Keywords: Racial Disparity; Paycheck Protection Program; Measurement Error; AI and Machine Learning; Race; Measurement and Metrics; Equality and Inequality; Prejudice and Bias; Forecasting and Prediction; Outcome or Result
Scharfstein, David S., and Sergey Chernenko. "The Limits of Algorithmic Measures of Race in Studies of Outcome Disparities." Working Paper, April 2023.
- 2024
- Working Paper
Everyone Steps Back? The Widespread Retraction of Crowd-Funding Support for Minority Creators When Migration Fear Is High
By: John (Jianqui) Bai, William R. Kerr, Chi Wan and Alptug Yorulmaz
We study racial biases on Kickstarter across multiple ethnic groups from 2009 to 2021. Scaling the concept of racially salient events, we quantify the close co-movement of minority funding gaps with inflamed political rhetoric surrounding migration. The racial funding gap...
Bai, John (Jianqui), William R. Kerr, Chi Wan, and Alptug Yorulmaz. "Everyone Steps Back? The Widespread Retraction of Crowd-Funding Support for Minority Creators When Migration Fear Is High." Harvard Business School Working Paper, No. 23-046, January 2023. (Revised February 2024.)