Spilling the Recommended Beans: Why Companies Should Have to Disclose the Ingredients of Their Recommendation Systems

[This is the first post in a three-part blog series about transparency and recommendation systems. Here are Part 2 and Part 3.]

In the burgeoning digital economy, our scarce time and attention are immensely valuable. To capitalize, online actors compete with increasingly sophisticated technologies for grabbing and directing attention. Anyone using online services will be familiar with the phrases “you might also like…”, “because you watched…”, and “other people also bought…” Such suggestions are produced by so-called recommender systems: machine learning algorithms that, based on your previous online behavior, tailor personalized suggestions intended to be maximally relevant to you. Studies have shown that recommender systems work remarkably well: it has been reported that more than 70% of the time spent on YouTube and 35% of sales on Amazon are driven by recommendations.
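To make the basic mechanics concrete, here is a deliberately simplified sketch of the idea behind such suggestions: recommend items favored by users whose past behavior resembles yours (so-called collaborative filtering). The data and numbers are invented, and real production systems are vastly more elaborate.

```python
# Toy sketch of the core idea behind "you might also like": recommend items
# that users with similar histories have engaged with (collaborative filtering).
# Real systems are vastly more sophisticated; all data here is invented.
import numpy as np

# Rows: users, columns: items; 1 = the user engaged with the item.
interactions = np.array([
    [1, 1, 0, 0],
    [1, 1, 1, 0],
    [0, 0, 1, 1],
])

def recommend(user: int, k: int = 2) -> list[int]:
    # Cosine similarity between the target user and every other user.
    norms = np.linalg.norm(interactions, axis=1)
    sims = interactions @ interactions[user] / (norms * norms[user] + 1e-9)
    sims[user] = 0.0  # ignore self-similarity
    # Score items by similarity-weighted engagement; hide items already seen.
    scores = sims @ interactions.astype(float)
    scores[interactions[user] > 0] = -np.inf
    return list(np.argsort(scores)[::-1][:k])

print(recommend(user=0))  # item 2 ranks first: the most similar user liked it
```

The crucial point for what follows is that this same machinery can rank candidates by any score the operator chooses, not only by predicted user interest.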

Through services like Facebook, Spotify, Netflix, and Amazon, consumers are exposed to recommender systems on a daily basis. Yet it is very difficult for consumers to obtain any information about why they receive the recommendations they do. Over the course of three blog posts, we will argue for regulation requiring more transparency in how these recommender systems are used. In this first installment, we make the case for the need for regulation by laying out the potential impact of recommender systems and the costs of leaving the underlying machine learning algorithms hidden. In the second and third posts, we will make a concrete policy recommendation, discuss potential drawbacks, and elaborate on how it may be implemented.

Recommender systems can undoubtedly benefit users of digital platforms. They help users navigate the vastness of the digital space, guiding them towards things they enjoy and desire, or filtering a near-infinite number of options down to a manageable set of alternatives. In such ways, recommendations can encourage discovery and elevate users’ experiences. And the large amount of data collected on the behavior of each individual user gives recommender systems an unprecedented ability to identify content that users will find engaging.

“…the interests of businesses using recommender systems and the consumers exposed to them do not always align. In pursuit of their interests, businesses can program recommender systems to optimize metrics such as revenue, profit margins, and time spent watching videos … So, while users may think a song or a movie is recommended to them because it is something they are likely to be interested in, the cost of streaming may actually be a deciding factor.”

However, crucially, the interests of businesses using recommender systems and the consumers exposed to them do not always align. In pursuit of their interests, businesses can program recommender systems to optimize metrics such as revenue, profit margins, and time spent watching videos. For instance, a streaming platform might allow recommendations to deviate from user preferences by weighting them towards content with lower royalty fees [1]. So, while users may think a song or a movie is recommended to them because it is something they are likely to be interested in, the cost of streaming may actually be a deciding factor.
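To illustrate how easily such a business objective can be blended in, consider the following hypothetical ranking step, in which a platform trades off predicted user relevance against royalty cost. All names and numbers are invented for illustration; we make no claim that any particular platform works exactly this way.

```python
# Hypothetical ranking step blending predicted user relevance with a business
# metric (royalty cost). Purely illustrative; all names and values are invented.
from dataclasses import dataclass

@dataclass
class Track:
    title: str
    predicted_relevance: float  # model's estimate of user interest (0-1)
    royalty_cost: float         # normalized per-stream cost to the platform (0-1)

def score(track: Track, cost_weight: float = 0.3) -> float:
    """Higher is better. With cost_weight > 0, cheaper tracks are favored
    even when the user would enjoy another track slightly more."""
    return track.predicted_relevance - cost_weight * track.royalty_cost

catalog = [
    Track("Song A", predicted_relevance=0.90, royalty_cost=0.80),
    Track("Song B", predicted_relevance=0.85, royalty_cost=0.20),
]

# Ranking by pure relevance would put Song A first; the blended score
# puts the cheaper Song B first (0.79 vs. 0.66).
for track in sorted(catalog, key=score, reverse=True):
    print(track.title, round(score(track), 3))
```

Note how a single parameter such as cost_weight silently shifts recommendations away from pure user relevance; it is exactly this kind of choice that currently remains invisible to consumers.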

As companies have no interest in making such strategies public, we have no way of knowing the extent to which business metrics are currently being incorporated into recommendations. But there are good reasons to believe it is common practice. For one thing, the technology is quite accessible and has been shown to have the potential to increase profits [2]. In short, there is a clear financial incentive. Further, in the rare instances when openness actually has been in their interest, companies have made revelations suggesting that their recommender systems pursue objectives beyond simply improving the consumer experience. Amazon Personalize, a service through which application developers get access to Amazon’s own machine learning technology for real-time personalized recommendations, is explicitly advertised as allowing optimization of business metrics of choice. Likewise, Spotify introduced a promotion service in which tracks are favored by its recommender system in exchange for lower royalty fees (see here and here).

Aside from direct conflicts of interest, recommender systems can also have unintended negative effects. Leaked internal research showed that Instagram had a substantial and harmful impact on teenagers’ body image. There is no reason to believe this was by design, but the fact that Instagram’s knowledge of the harm did not immediately spur attempts to mitigate it exemplifies the conflict of interest between consumers and providers: Instagram could have adjusted what its algorithms optimize, changing what teenagers see and promoting healthier content. Beyond effects on individuals, recommender systems may also have larger societal impacts, and even recommendations that cater to the preferences of their users may have negative side effects. For instance, users’ enjoyment of social media might be best served by recommendation-bolstered virtual safe spaces that insulate and affirm their worldviews. In the aggregate, however, highly personalized information consumption has likely been one of the main causes of the fragmented political discourse and increased polarization witnessed in recent years [3]. The spread of misinformation may also be exacerbated by the way recommender systems propagate engaging and emotional content in an effort to maximize user engagement.

In view of the prevalence of recommender systems online and their large behavioral impact, the lack of transparency about what they are optimizing is particularly troubling. It is also difficult to opt out of receiving recommendations: machine-learning-based recommender systems are typically infused into already existing platforms [4], such that users may not even be aware of them, and they are often hard or even impossible to disable. Without access to relevant information, consumers cannot make informed choices about how they interact with these technologies. The resulting lack of consumer pressure gives companies no incentive to align recommendations with consumer interests. Further, assessing the societal consequences of different recommender systems, and investigating the need for regulation, is impossible without more openness about the underlying algorithms.

With regard to recommendations produced by such systems, Professor of Law Frank Pasquale has stated [5]:

“Defenders of the status quo say that results like these reflect a company’s good-faith judgment about the quality of a website, an investment, or a customer. Detractors contend that they cloak self-serving appraisals and conflicts of interest in a veil of technological wizardry. Who is right? It’s anyone’s guess, as long as the algorithms involved are kept secret.”

This quote highlights how increased insight is a desperately needed first step towards a proper understanding of the impact of recommender systems. We therefore propose that companies should be required to divulge what their recommender systems are programmed to optimize. In the second and third blog posts of this series, we will develop this proposal into a more detailed policy recommendation.

[1] Bourreau, M., & Gaudin, G. (2018). Streaming platform and strategic recommendation bias.

[2] Chen, L. S., Hsu, F. H., Chen, M. C., & Hsu, Y. C. (2008). Developing recommender systems with the consideration of product profitability for sellers. Information Sciences, 178(4), 1032-1048.

[3] See, e.g., Hong, S., & Kim, S. H. (2016). Political polarization on Twitter: Implications for the use of social media in digital governments. Government Information Quarterly, 33(4), 777-782.

[4] Engström, E., & Strimling, P. (2020). Deep learning diffusion by infusion into preexisting technologies–Implications for users and society at large. Technology in Society, 63, 101396.

[5] Pasquale, F. (2015). The black box society. Harvard University Press, p. 9.

Authors appear in alphabetical order.

2021-12-12