Show Results For
- All HBS Web (102)
- Faculty Publications (26)
Page 1 of 26 Results
- March 2024
- Case
Unintended Consequences of Algorithmic Personalization
By: Eva Ascarza and Ayelet Israeli
“Unintended Consequences of Algorithmic Personalization” (HBS No. 524-052) investigates algorithmic bias in marketing through four case studies featuring Apple, Uber, Facebook, and Amazon. Each study presents scenarios where these companies faced public criticism for...
Keywords: Race; Gender; Marketing; Diversity; Customer Relationship Management; Prejudice and Bias; Customization and Personalization; Technology Industry; Retail Industry; United States
Ascarza, Eva, and Ayelet Israeli. "Unintended Consequences of Algorithmic Personalization." Harvard Business School Case 524-052, March 2024.
- February 2024
- Module Note
Data-Driven Marketing in Retail Markets
By: Ayelet Israeli
This note describes an eight-session module on data-driven marketing in retail markets. The module aims to familiarize students with core concepts of data-driven marketing in retail, including exploring the opportunities and challenges, adopting best practices,...
Keywords: Data; Data Analytics; Retail; Retail Analytics; Data Science; Business Analytics; "Marketing Analytics"; Omnichannel; Omnichannel Retailing; Omnichannel Retail; DTC; Direct To Consumer Marketing; Ethical Decision Making; Algorithmic Bias; Privacy; A/B Testing; Descriptive Analytics; Prescriptive Analytics; Predictive Analytics; Analytics and Data Science; E-commerce; Marketing Channels; Demand and Consumers; Marketing Strategy; Retail Industry
Israeli, Ayelet. "Data-Driven Marketing in Retail Markets." Harvard Business School Module Note 524-062, February 2024.
- 2024
- Working Paper
Warnings and Endorsements: Improving Human-AI Collaboration Under Covariate Shift
By: Matthew DosSantos DiSorbo and Kris Ferreira
Problem definition: While artificial intelligence (AI) algorithms may perform well on data that are representative of the training set (inliers), they may err when extrapolating on non-representative data (outliers). These outliers often originate from covariate shift,...
DosSantos DiSorbo, Matthew, and Kris Ferreira. "Warnings and Endorsements: Improving Human-AI Collaboration Under Covariate Shift." Working Paper, February 2024.
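To make the covariate-shift setting concrete, here is a minimal sketch, not the authors' method, of one simple way a decision-support tool might separate "endorse" from "warn" cases: calibrate an outlier score on the training data and flag test inputs whose score exceeds a threshold. The Mahalanobis-distance rule, the 99th-percentile cutoff, and the synthetic data are illustrative assumptions.

```python
# Illustrative only: flag test-time inputs that look unlike the training data,
# a crude stand-in for the covariate-shift setting the working paper studies.
import numpy as np

rng = np.random.default_rng(0)
X_train = rng.normal(0, 1, size=(1000, 3))            # inliers the model was fit on
X_test = np.vstack([rng.normal(0, 1, size=(95, 3)),   # representative points
                    rng.normal(4, 1, size=(5, 3))])   # shifted points (outliers)

mu = X_train.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(X_train, rowvar=False))

def mahalanobis(X):
    d = X - mu
    return np.sqrt(np.einsum("ij,jk,ik->i", d, cov_inv, d))

threshold = np.quantile(mahalanobis(X_train), 0.99)   # calibrated on training data
labels = np.where(mahalanobis(X_test) > threshold, "warn", "endorse")
print(dict(zip(*np.unique(labels, return_counts=True))))
```

Any off-the-shelf outlier detector could play the same role; the point is only that "warn" labels are driven by how far a test input sits from the training distribution.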
- September 29, 2023
- Article
Eliminating Algorithmic Bias Is Just the Beginning of Equitable AI
By: Simon Friis and James Riley
When it comes to artificial intelligence and inequality, algorithmic bias rightly receives a lot of attention. But it’s just one way that AI can lead to inequitable outcomes. To truly create equitable AI, we need to consider three forces through which it might make...
Friis, Simon, and James Riley. "Eliminating Algorithmic Bias Is Just the Beginning of Equitable AI." Harvard Business Review (website) (September 29, 2023).
- June 2023
- Simulation
Artea Dashboard and Targeting Policy Evaluation
By: Ayelet Israeli and Eva Ascarza
Companies deploy A/B experiments to gain valuable insights about their customers in order to answer strategic business questions. In marketing, A/B tests are often used to evaluate marketing interventions intended to generate incremental outcomes for the firm. The Artea...
Keywords: Algorithm Bias; Algorithmic Data; Race And Ethnicity; Experimentation; Promotion; Marketing And Society; Big Data; Privacy; Data-driven Management; Data Analysis; Data Analytics; E-Commerce Strategy; Discrimination; Targeted Advertising; Targeted Policies; Pricing Algorithms; A/B Testing; Ethical Decision Making; Customer Base Analysis; Customer Heterogeneity; Coupons; Marketing; Race; Gender; Diversity; Customer Relationship Management; Marketing Communications; Advertising; Decision Making; Ethics; E-commerce; Analytics and Data Science; Retail Industry; Apparel and Accessories Industry; United States
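As a rough illustration of the kind of A/B evaluation the simulation builds on, the sketch below computes the incremental lift of a marketing treatment and a two-proportion z-test from aggregate test counts. The helper name and all counts are invented; the simulation's own dashboard and data are not reproduced here.

```python
# Minimal sketch: estimating the incremental effect of a marketing treatment
# from A/B test counts (numbers below are made up for illustration).
from math import sqrt
from statistics import NormalDist

def ab_lift(conv_t, n_t, conv_c, n_c):
    """Return (incremental lift, z statistic, two-sided p-value)."""
    p_t, p_c = conv_t / n_t, conv_c / n_c
    lift = p_t - p_c
    pooled = (conv_t + conv_c) / (n_t + n_c)
    se = sqrt(pooled * (1 - pooled) * (1 / n_t + 1 / n_c))
    z = lift / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return lift, z, p_value

print(ab_lift(conv_t=230, n_t=5000, conv_c=180, n_c=5000))
```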
- 2023
- Working Paper
Insufficiently Justified Disparate Impact: A New Criterion for Subgroup Fairness
By: Neil Menghani, Edward McFowland III and Daniel B. Neill
In this paper, we develop a new criterion, "insufficiently justified disparate impact" (IJDI), for assessing whether recommendations (binarized predictions) made by an algorithmic decision support tool are fair. Our novel, utility-based IJDI criterion evaluates false...
Menghani, Neil, Edward McFowland III, and Daniel B. Neill. "Insufficiently Justified Disparate Impact: A New Criterion for Subgroup Fairness." Working Paper, June 2023.
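The working paper's utility-based IJDI criterion is not reproduced here. As a loose illustration of the underlying subgroup audit, the sketch below compares false-positive rates of binarized recommendations across groups; the function name, group labels, and data are synthetic assumptions.

```python
# Illustration only: a per-subgroup error audit of binarized recommendations.
# The paper's actual IJDI criterion adds a utility-based justification test,
# which is not reproduced here.
import numpy as np

def subgroup_false_positive_rates(y_true, y_pred, group):
    """False-positive rate of binary recommendations within each subgroup."""
    rates = {}
    for g in np.unique(group):
        mask = (group == g) & (y_true == 0)          # true negatives in subgroup g
        rates[g] = float(y_pred[mask].mean()) if mask.any() else float("nan")
    return rates

rng = np.random.default_rng(1)
y_true = rng.integers(0, 2, 500)
y_pred = rng.integers(0, 2, 500)
group = rng.choice(["A", "B"], 500)
print(subgroup_false_positive_rates(y_true, y_pred, group))
```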
- 2023
- Working Paper
The Limits of Algorithmic Measures of Race in Studies of Outcome Disparities
By: David S. Scharfstein and Sergey Chernenko
We show that the use of algorithms to predict race has significant limitations in measuring and understanding the sources of racial disparities in finance, economics, and other contexts. First, we derive theoretically the direction and magnitude of measurement bias in...
Keywords: Racial Disparity; Paycheck Protection Program; Measurement Error; AI and Machine Learning; Race; Measurement and Metrics; Equality and Inequality; Prejudice and Bias; Forecasting and Prediction; Outcome or Result
Scharfstein, David S., and Sergey Chernenko. "The Limits of Algorithmic Measures of Race in Studies of Outcome Disparities." Working Paper, April 2023.
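A toy simulation, not the paper's derivation, of the measurement problem the authors study: when group membership comes from an imperfect prediction algorithm rather than true labels, the estimated outcome gap between groups is distorted. All quantities below (gap size, accuracy, sample size) are synthetic assumptions.

```python
# Illustration only (synthetic numbers, not the paper's derivation): when group
# membership is proxied by a noisy classifier, the estimated outcome gap between
# groups is biased relative to the gap computed with true membership.
import numpy as np

rng = np.random.default_rng(2)
n = 100_000
true_group = rng.integers(0, 2, n)                        # true group indicator
outcome = 10 + 5 * true_group + rng.normal(0, 2, n)       # true gap = 5

accuracy = 0.85                                           # assumed proxy accuracy
flip = rng.random(n) > accuracy                           # misclassified observations
proxy_group = np.where(flip, 1 - true_group, true_group)

true_gap = outcome[true_group == 1].mean() - outcome[true_group == 0].mean()
proxy_gap = outcome[proxy_group == 1].mean() - outcome[proxy_group == 0].mean()
print(f"gap with true labels: {true_gap:.2f}, gap with proxy labels: {proxy_gap:.2f}")
```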
- October–December 2022
- Article
Achieving Reliable Causal Inference with Data-Mined Variables: A Random Forest Approach to the Measurement Error Problem
By: Mochen Yang, Edward McFowland III, Gordon Burtch and Gediminas Adomavicius
Combining machine learning with econometric analysis is becoming increasingly prevalent in both research and practice. A common empirical strategy involves the application of predictive modeling techniques to "mine" variables of interest from available data, followed...
Keywords: Machine Learning; Econometric Analysis; Instrumental Variable; Random Forest; Causal Inference; AI and Machine Learning; Forecasting and Prediction
Yang, Mochen, Edward McFowland III, Gordon Burtch, and Gediminas Adomavicius. "Achieving Reliable Causal Inference with Data-Mined Variables: A Random Forest Approach to the Measurement Error Problem." INFORMS Journal on Data Science 1, no. 2 (October–December 2022): 138–155.
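The paper's random-forest correction is not shown here; the sketch below only illustrates the measurement-error problem that motivates it: regressing an outcome on a noisily "mined" variable attenuates the estimated coefficient relative to using the true variable. Coefficients, noise levels, and the helper are assumptions for illustration.

```python
# Illustration only: plugging a noisily "mined" variable into a downstream
# regression attenuates the estimated coefficient (the measurement-error
# problem the paper addresses; its random-forest correction is not shown).
import numpy as np

rng = np.random.default_rng(3)
n = 50_000
x_true = rng.normal(0, 1, n)
y = 2.0 * x_true + rng.normal(0, 1, n)                 # true coefficient = 2.0
x_mined = x_true + rng.normal(0, 0.7, n)               # imperfect ML prediction of x

def ols_slope(x, y):
    return np.cov(x, y, bias=True)[0, 1] / np.var(x)

print(f"slope with true x:  {ols_slope(x_true, y):.2f}")
print(f"slope with mined x: {ols_slope(x_mined, y):.2f}")   # biased toward zero
```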
- May 2022 (Revised June 2024)
- Case
LOOP: Driving Change in Auto Insurance Pricing
By: Elie Ofek and Alicia Dadlani
John Henry and Carey Anne Nadeau, co-founders and co-CEOs of LOOP, an insurtech startup based in Austin, Texas, were on a mission to modernize the archaic $250 billion automobile insurance market. They sought to create equitably priced insurance by eliminating pricing...
Keywords: AI and Machine Learning; Technological Innovation; Equality and Inequality; Prejudice and Bias; Growth and Development Strategy; Customer Relationship Management; Price; Insurance Industry; Financial Services Industry
Ofek, Elie, and Alicia Dadlani. "LOOP: Driving Change in Auto Insurance Pricing." Harvard Business School Case 522-073, May 2022. (Revised June 2024.)
- March 8, 2022
- Article
Eliminating Unintended Bias in Personalized Policies Using Bias-Eliminating Adapted Trees (BEAT)
By: Eva Ascarza and Ayelet Israeli
An inherent risk of algorithmic personalization is disproportionate targeting of individuals from certain groups (or demographic characteristics such as gender or race), even when the decision maker does not intend to discriminate based on those “protected”...
Keywords: Algorithm Bias; Personalization; Targeting; Generalized Random Forests (GRF); Discrimination; Customization and Personalization; Decision Making; Fairness; Mathematical Methods
Ascarza, Eva, and Ayelet Israeli. "Eliminating Unintended Bias in Personalized Policies Using Bias-Eliminating Adapted Trees (BEAT)." Proceedings of the National Academy of Sciences 119, no. 11 (March 8, 2022): e2115126119.
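BEAT itself, which builds on generalized random forests, is not implemented here. The sketch below only illustrates the disparity it is designed to prevent, by auditing how often a score-based targeting policy selects members of each group; the groups, scores, and threshold are synthetic assumptions.

```python
# Illustration only: auditing a personalized targeting policy for the kind of
# group-level disparity BEAT is designed to remove (BEAT itself, built on
# generalized random forests, is not implemented here).
import numpy as np

def targeting_rate_by_group(targeted, group):
    """Share of each group selected by the policy."""
    return {g: float(targeted[group == g].mean()) for g in np.unique(group)}

rng = np.random.default_rng(4)
group = rng.choice(["women", "men"], size=2000)
score = rng.normal(0, 1, 2000) + 0.4 * (group == "women")   # score correlates with group
targeted = score > np.quantile(score, 0.8)                  # target top 20% by score
print(targeting_rate_by_group(targeted, group))
```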
- September–October 2021
- Article
Frontiers: Can an AI Algorithm Mitigate Racial Economic Inequality? An Analysis in the Context of Airbnb
By: Shunyuan Zhang, Nitin Mehta, Param Singh and Kannan Srinivasan
We study the effect of Airbnb’s smart-pricing algorithm on the racial disparity in the daily revenue earned by Airbnb hosts. Our empirical strategy exploits Airbnb’s introduction of the algorithm and its voluntary adoption by hosts as a quasi-natural experiment. Among...
Keywords: Smart Pricing; Pricing Algorithm; Machine Bias; Discrimination; Racial Disparity; Social Inequality; Airbnb Revenue; Revenue; Race; Equality and Inequality; Prejudice and Bias; Price; Mathematical Methods; Accommodations Industry
Zhang, Shunyuan, Nitin Mehta, Param Singh, and Kannan Srinivasan. "Frontiers: Can an AI Algorithm Mitigate Racial Economic Inequality? An Analysis in the Context of Airbnb." Marketing Science 40, no. 5 (September–October 2021): 813–820.
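As a stylized illustration of the quasi-experimental comparison described above, the sketch below computes a basic difference-in-differences contrast (adopters vs. non-adopters, before vs. after) on synthetic data; it ignores the voluntary-adoption selection issues the paper actually addresses, and every number is an assumption.

```python
# Illustration only (synthetic data): the basic difference-in-differences
# contrast behind an adoption study (adopters vs. non-adopters, before vs.
# after), ignoring the selection issues the paper actually addresses.
import numpy as np

rng = np.random.default_rng(5)
n = 4000
adopter = rng.integers(0, 2, n)            # 1 = host adopted the pricing algorithm
post = rng.integers(0, 2, n)               # 1 = period after introduction
revenue = 100 + 10 * post + 5 * adopter + 8 * adopter * post + rng.normal(0, 5, n)

def cell_mean(a, p):
    return revenue[(adopter == a) & (post == p)].mean()

did = (cell_mean(1, 1) - cell_mean(1, 0)) - (cell_mean(0, 1) - cell_mean(0, 0))
print(f"difference-in-differences estimate: {did:.2f}")   # close to the assumed 8
```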
- September 17, 2021
- Article
AI Can Help Address Inequity—If Companies Earn Users' Trust
By: Shunyuan Zhang, Kannan Srinivasan, Param Singh and Nitin Mehta
While companies may spend a lot of time testing models before launch, many spend too little time considering how they will work in the wild. In particular, they fail to fully consider how rates of adoption can warp developers’ intent. For instance, Airbnb launched a...
Keywords: Artificial Intelligence; Algorithmic Bias; Technological Innovation; Perception; Diversity; Equality and Inequality; Trust; AI and Machine Learning
Zhang, Shunyuan, Kannan Srinivasan, Param Singh, and Nitin Mehta. "AI Can Help Address Inequity—If Companies Earn Users' Trust." Harvard Business Review Digital Articles (September 17, 2021).
- March 2021
- Supplement
Artea (A), (B), (C), and (D): Designing Targeting Strategies
By: Eva Ascarza and Ayelet Israeli
PowerPoint supplement to the Teaching Note for HBS Nos. 521-021, 521-022, 521-037, and 521-043. This collection of exercises aims to teach students about (1) targeting policies and (2) algorithmic bias in marketing, including its implications, causes, and possible solutions. Part (A) focuses on...
Keywords: Targeted Advertising; Targeting; Algorithmic Data; Bias; A/B Testing; Experiment; Advertising; Gender; Race; Diversity; Marketing; Customer Relationship Management; Prejudice and Bias; Analytics and Data Science; Retail Industry; Apparel and Accessories Industry; Technology Industry; United States
- September 2020 (Revised July 2022)
- Teaching Note
Algorithmic Bias in Marketing
By: Ayelet Israeli and Eva Ascarza
Teaching Note for HBS No. 521-020. This note focuses on algorithmic bias in marketing. First, it presents a variety of marketing examples in which algorithmic bias may occur. The examples are organized around the 4 P’s of marketing: promotion, price, place, and...
- September 2020 (Revised July 2022)
- Technical Note
Algorithmic Bias in Marketing
By: Ayelet Israeli and Eva Ascarza
This note focuses on algorithmic bias in marketing. First, it presents a variety of marketing examples in which algorithmic bias may occur. The examples are organized around the 4 P’s of marketing: promotion, price, place, and product, characterizing the marketing...
Keywords: Algorithmic Data; Race And Ethnicity; Promotion; "Marketing Analytics"; Marketing And Society; Big Data; Privacy; Data-driven Management; Data Analysis; Data Analytics; E-Commerce Strategy; Discrimination; Targeting; Targeted Advertising; Pricing Algorithms; Ethical Decision Making; Customer Heterogeneity; Marketing; Race; Ethnicity; Gender; Diversity; Prejudice and Bias; Marketing Communications; Analytics and Data Science; Analysis; Decision Making; Ethics; Customer Relationship Management; E-commerce; Retail Industry; Apparel and Accessories Industry; United States
Israeli, Ayelet, and Eva Ascarza. "Algorithmic Bias in Marketing." Harvard Business School Technical Note 521-020, September 2020. (Revised July 2022.)
- September 2020 (Revised February 2024)
- Teaching Note
Artea (A), (B), (C), and (D): Designing Targeting Strategies
By: Eva Ascarza and Ayelet Israeli
Teaching Note for HBS Nos. 521-021, 521-022, 521-037, and 521-043. This collection of exercises aims to teach students about (1) targeting policies and (2) algorithmic bias in marketing, including its implications, causes, and possible solutions. Part (A) focuses on A/B testing analysis and...
- September 2020 (Revised July 2022)
- Exercise
Artea (B): Including Customer-Level Demographic Data
By: Eva Ascarza and Ayelet Israeli
This collection of exercises aims to teach students about (1) targeting policies and (2) algorithmic bias in marketing, including its implications, causes, and possible solutions. Part (A) focuses on A/B testing analysis and targeting. Parts (B), (C), and (D) introduce algorithmic bias. The...
Keywords: Targeting; Algorithmic Bias; Race; Gender; Marketing; Diversity; Customer Relationship Management; Demographics; Prejudice and Bias; Retail Industry; Apparel and Accessories Industry; Technology Industry; United States
Ascarza, Eva, and Ayelet Israeli. "Artea (B): Including Customer-Level Demographic Data." Harvard Business School Exercise 521-022, September 2020. (Revised July 2022.)
- September 2020 (Revised July 2022)
- Exercise
Artea (C): Potential Discrimination through Algorithmic Targeting
By: Eva Ascarza and Ayelet Israeli
This collection of exercises aims to teach students about (1) targeting policies and (2) algorithmic bias in marketing, including its implications, causes, and possible solutions. Part (A) focuses on A/B testing analysis and targeting. Parts (B), (C), and (D) introduce algorithmic bias. The...
Keywords: Targeting; Algorithmic Bias; Race; Gender; Marketing; Diversity; Customer Relationship Management; Prejudice and Bias; Retail Industry; Apparel and Accessories Industry; Technology Industry; United States
Ascarza, Eva, and Ayelet Israeli. "Artea (C): Potential Discrimination through Algorithmic Targeting." Harvard Business School Exercise 521-037, September 2020. (Revised July 2022.)
- September 2020 (Revised July 2022)
- Exercise
Artea (D): Discrimination through Algorithmic Bias in Targeting
By: Eva Ascarza and Ayelet Israeli
This collection of exercises aims to teach students about (1) targeting policies and (2) algorithmic bias in marketing, including its implications, causes, and possible solutions. Part (A) focuses on A/B testing analysis and targeting. Parts (B), (C), and (D) introduce algorithmic bias. The...
Keywords: Targeted Advertising; Discrimination; Algorithmic Data; Bias; Advertising; Race; Gender; Marketing; Diversity; Customer Relationship Management; Prejudice and Bias; Analytics and Data Science; Retail Industry; Apparel and Accessories Industry; Technology Industry; United States
Ascarza, Eva, and Ayelet Israeli. "Artea (D): Discrimination through Algorithmic Bias in Targeting." Harvard Business School Exercise 521-043, September 2020. (Revised July 2022.)
- September 2020 (Revised June 2023)
- Exercise
Artea: Designing Targeting Strategies
By: Eva Ascarza and Ayelet Israeli
This collection of exercises aims to teach students about (1) targeting policies and (2) algorithmic bias in marketing, including its implications, causes, and possible solutions. Part (A) focuses on A/B testing analysis and targeting. Parts (B), (C), and (D) introduce algorithmic bias. The...
Keywords: Algorithmic Data; Race And Ethnicity; Experimentation; Promotion; "Marketing Analytics"; Marketing And Society; Big Data; Privacy; Data-driven Management; Data Analytics; Data Analysis; E-Commerce Strategy; Discrimination; Targeted Advertising; Targeted Policies; Targeting; Pricing Algorithms; A/B Testing; Ethical Decision Making; Customer Base Analysis; Customer Heterogeneity; Coupons; Algorithmic Bias; Marketing; Race; Gender; Diversity; Customer Relationship Management; Marketing Communications; Advertising; Decision Making; Ethics; E-commerce; Analytics and Data Science; Retail Industry; Apparel and Accessories Industry; United States
Ascarza, Eva, and Ayelet Israeli. "Artea: Designing Targeting Strategies." Harvard Business School Exercise 521-021, September 2020. (Revised June 2023.)