algorithmic systems in their decision making. For instance, Netflix’s recommendation system is estimated to influence choice for ‘about 80% of hours streamed’, with ‘the remaining 20%’ coming ‘from search, which requires its own set of algorithms’ (Gomez-Uribe and Hunt 2015: 5). Similar figures can be found for other platforms, including Amazon, and they explain the spectacular marketing success of recommender algorithms (Celma 2010; Konstan and Riedl 2012; Ansari, Essegaier and Kohli 2000). Myriad feedback loops like the one sketched above constellate our digitally mediated existence, eventually producing self-reinforcement effects that translate into a reduced or increased exposure to specific types of content, selected based on past user behaviour (Bucher 2012a). By modulating the visibility of social media posts, micro-targeted ads or search results, autonomous systems not only mediate digital experiences, but ‘constitute’ them (Beer 2009), often by ‘nudging’ individual behaviours and opinions (Christin 2020; Darmody and Zwick 2020). What happens is that ‘the models analyze the world and the world responds to the models’ (Kitchin and Dodge 2011: 30). As a result, human cultures end up becoming algorithmic cultures (Striphas 2015).
Critical research has mainly dealt with this ‘social power’ of algorithms as a one-way effect (Beer 2017), with the risk of putting forward forms of technological determinism – implicit, for instance, in most of the literature about filter bubbles (Bruns 2019: 24). Yet, recent studies show that the outputs of autonomous machines are actively negotiated and problematized by individuals (Velkova and Kaun 2019). Automated music recommendations or micro-targeted ads are not always effective at orienting the taste and consumption of platform users (Siles et al. 2020; Ruckenstein and Granroth 2020; Bucher 2017). Algorithms do not unidirectionally shape our datafied society. Rather, they intervene within it, taking part in situated socio-material interactions involving both human and non-human agents (Law 1990; D. Mackenzie 2019; Burr, Cristianini and Ladyman 2018; Orlikowski 2007; Rose and Jones 2005). Hence, the content of ‘algorithmic culture’ (Striphas 2015) is the emergent outcome of techno-social interactional dynamics. From this point of view, my deciding whether or not to click on a recommended book on Amazon represents an instance of human–machine interaction – which is, of course, heavily engineered to serve the commercial goals of platforms. Nonetheless, in this digitally mediated exchange, both the machine learning algorithm and I maintain relative margins of freedom. My reaction to recommendations will be immediately measured by the system, which will behave differently in our next encounter, also based on that feedback. On my end, I will perhaps discover new authors and titles thanks to this particular algorithm, or – as often happens – ignore its automated suggestions.
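The self-reinforcing loop described above can be made concrete with a toy model. The following Python sketch is purely illustrative and assumes nothing about how Amazon's or Netflix's recommender systems actually work: it implements a simple epsilon-greedy recommender that proposes an item, measures whether a (simulated) user clicks, and folds that feedback back into its estimates, so that its behaviour in the next encounter depends on the previous one. The item names and click probabilities are invented for the example.

```python
import random


class FeedbackLoopRecommender:
    """Minimal epsilon-greedy recommender: it estimates each item's click
    rate from past feedback and mostly recommends its current best guess,
    occasionally exploring other items."""

    def __init__(self, items, epsilon=0.1):
        self.items = list(items)
        self.epsilon = epsilon            # probability of exploring at random
        self.clicks = {i: 0 for i in self.items}
        self.shows = {i: 0 for i in self.items}

    def _click_rate(self, item):
        return self.clicks[item] / self.shows[item] if self.shows[item] else 0.0

    def recommend(self):
        # Explore with a small probability; otherwise exploit the item
        # with the highest observed click-through rate so far.
        if random.random() < self.epsilon:
            return random.choice(self.items)
        return max(self.items, key=self._click_rate)

    def register_feedback(self, item, clicked):
        # The user's reaction (click or ignore) is measured and folded
        # back into the model, changing future recommendations.
        self.shows[item] += 1
        if clicked:
            self.clicks[item] += 1


# Toy simulation: a user with a fixed, unobserved preference for each item.
# Names and probabilities are hypothetical.
user_preference = {"book_a": 0.6, "book_b": 0.2, "book_c": 0.05}
rec = FeedbackLoopRecommender(user_preference.keys())

for _ in range(500):
    item = rec.recommend()
    clicked = random.random() < user_preference[item]
    rec.register_feedback(item, clicked)

print({i: rec.shows[i] for i in rec.items})  # exposure concentrates on "book_a"
```

Over repeated rounds, exposure concentrates on whatever the measured feedback has rewarded, while rarely clicked items gradually disappear from view: a miniature version of the self-reinforcement effects and mutual 'margins of freedom' discussed above, stripped of all their social complexity.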
Paradoxically, since machine learning systems adapt their behaviour probabilistically based on input data, the social outcomes of their multiple interactions with users are difficult to predict a priori (Rahwan et al. 2019; Burr, Cristianini and Ladyman 2018; Mackenzie 2015). They will depend on individual actions and reactions, on the specific code of the algorithm, and on the particular data at the root of its ‘intelligence’. In order to study how algorithms shape culture and society, Neyland (2019) suggests we leave aside the abstract notion of algorithmic power and try instead to get to know autonomous machines more closely, by looking at their ‘everyday life’. Like ‘regular’ social agents, the machine learning systems embedded in digital platforms and devices take part in the social world (Esposito 2017) and, as with the usual subjects of sociological investigation, the social world inhabits them in turn. A second open question for a sociology of algorithms is therefore: how do socialized machines participate in society – and, by doing so, reproduce it?
These open questions about the culture in the code and the code in the culture are closely related. A second-order feedback loop is implicit here, one that overarches all the countless interactions between algorithms and their users. It consists in the recursive mechanism through which ‘the social’ – with its varying cultural norms, institutions and social structures – is reproduced by the actions of its members, who collectively make society while simultaneously being made by it. If you forget about algorithms for a second, you will probably recognize here one of the foundational dilemmas of the social sciences, traditionally torn by the complexities of micro–macro dynamics and cultural change (Coleman 1994; Giddens 1984; Bourdieu 1989a; Strand and Lizardo 2017). In fact, while it can be argued that social structures like class, gender or ethnicity ‘exercise a frequently “despotic” effect on the behaviour of social actors’ – producing statistically observable regularities in all social domains, from political preferences to musical taste – these very same structures ‘are the product of human action’ (Boudon and Bourricaud 2003: 10). Since the times of Weber and Durkheim, sociologists have attempted to explain this paradox, largely by prioritizing one of two main opposing views, which can be summarized as follows: on the one side, the idea that social structures powerfully condition and determine individual lives; on the other, the individualistic view of a free and agentic subject that makes society from below.
In an attempt to overcome the dualism between the ‘objective’ structuring of individuals and the ‘subjective’ character of social action, a French sociologist with a background in philosophy developed an original theoretical framework, whose cornerstone is the notion of ‘habitus’. He was Pierre Bourdieu (1930–2002), widely considered one of the most influential social thinkers of the twentieth century. Aiming to deal with a different (but related) dualism – that is, between ‘the technical’ and ‘the social’ in sociological research – this book seeks to treat machine learning algorithms ‘with the same analytical machinery as people’ (Law 1990: 8). I will build on the analytical machinery originally developed by Bourdieu, and argue that the particular ways in which these artificial social agents act in society should be traced back to the cultural dispositions inscribed in their code.
Seeing algorithms with the eyes of Pierre Bourdieu
Why do individuals born and raised under similar social conditions come to have almost identical lifestyles, ways of walking and speaking, modes of thinking about and acting within the world? Why do unskilled workers and highly educated bourgeois have such different ideas about what makes a song ‘bad’, a piece of furniture ‘nice’, a TV show ‘disgusting’, a behaviour ‘inappropriate’, a person ‘valuable’? How is it that the everyday practices of women and men, Algerian farmers and French colonialists, dominated and dominators, end up jointly reproducing material and symbolic inequalities? These are some of the crucial questions Bourdieu asked in his research. All point to a general sociological dilemma, and have a common theoretical solution: ‘So why is social life so regular and so predictable? If external structures do not mechanically constrain action, what then gives it its pattern? The concept of habitus provides part of the answer’ (Bourdieu and Wacquant 1992: 18).
The habitus is defined as a system of ‘durable, transposable dispositions’ which derives from the ‘conditions of existence’ characteristic of a particular social environment (Bourdieu 1977: 72). Such embodied dispositions are formed at a young age and tend to orient one’s entire life, social exchanges, practices and even perceptions (Bourdieu 1981). Bourdieu’s key intuition was to resort to a classic Aristotelian concept6 in order to overcome the aforementioned dualism between autonomous subjects and conditioning structures, and thus ‘account for the social or external bases of thought’ (Lizardo 2013). According to the French sociologist, the ‘conductorless orchestration’ (Bourdieu 1981: 308) of individual practices derives from embodied cultural scripts which simultaneously enable and constrain action, without any need for fixed rules or rational deliberations. If we think with the idea of habitus, socialization precedes consciousness and works in a ‘practical way’: ‘social structure is internalized by each of us because we have learned from the experience of previous actions a practical mastery of how to do things that takes objective constraints into account’ (Calhoun et al. 2002: 260). Class, gender or race inequalities are not merely external to the individual; rather, they exist inside individuals and their bodies, incorporated as a ‘practical reason’ made of spontaneous inclinations and tacit cultural understandings (Mukerji 2014). For Bourdieu, the habitus is the site of the interplay between social structure and individual practice, culture and cognition. With their instinctive gestures, sedimented classification