A single parent is working from home as usual in Melbourne, Australia, connecting to the internet using one of the Google Wi-Fi points that are distributed strategically throughout her terrace house. She's prepared to trade off the privacy risks associated with the data these devices gather for the comfort she derives from their sleek, aesthetically pleasing design, and for the convenience they provide compared to the complication and hassle of other home Wi-Fi set-ups, especially when renting. Meanwhile, two doors down, a man thumbs through search results and targeted ads on his phone as he researches the planned purchase of a new smart TV. He is struck by the low price of certain models, but feels uneasy, vaguely recalling reading somewhere that some models are only priced so competitively because the TV makers collect user data and sell it on to third parties (Gilbert, 2019).
In Madrid, a teenager opens Snapchat and scrolls through newly received snaps. Because his virtual presence on the Snap Map (his Actionmoji) is moving in a constant direction at speed, the Snapchat app correctly assumes that he is seated on a train. Beside him, his friend, who has been scrolling TikTok continuously, sees a video pop up in her For You Page. It seems out of place compared to the videos the algorithm usually selects for her, and features a popular influencer humorously suggesting she take a break, have a drink of water and perhaps go for a walk. Barely pausing to roll her eyes at the clumsy intervention, she scrolls on to the next video in her feed (Burke, 2020). In a Brazilian megacity, a young woman is about to head out on her bicycle, and needs to work out the safest and most terrain-friendly route to take. Because she doesn't entirely trust app-generated directional recommendations, she selects her preferred route based on a combination of saved Strava and pre-downloaded Google Maps data and her own experiential knowledge of the city (Pink et al., 2019, p. 179).
Around the same time, a Japanese TV show reveals that the person behind social media star @azusagakuyuki, an attractive young woman who poses beside her motorcycle, is Soya, a fifty-year-old man. Soya created his highly successful female alter ego using AI-enabled face-editing apps marketed as 'fun' apps, but licensed to users on terms that enable companies to collect large amounts of personally identifying data. When asked about it, Soya explains that, while he began by just playing around with the app, it 'happened to turn out to be fairly pretty' and so @azusagakuyuki was born. Encouraged by the likes he got after posting the results, he said, 'I got carried away gradually as I tried to make it cuter' (BBC News, 2021). Nearby in South Korea, Seo-Hyun, a young woman interested in fashion and beauty, gives careful consideration to the forms of 'zero party data' (personal data a consumer intentionally and proactively shares with brands and marketers; Mitchell, 2019) that she is prepared to share with clothing and cosmetics brands, tailoring this information in such a way as to maximise her chances of being rewarded with promotional give-aways of her favourite jeans and make-up.
Outside a courthouse in Oakland, California, an activist taking part in a demonstration against police violence is confronted by a local officer, and so the protestor begins recording video of their interaction on his phone. The police officer retaliates by pulling his own phone out of his pocket. To the bemusement of the watching crowd, he opens Spotify and starts playing a track by mainstream pop artist Taylor Swift, assuming that YouTube's copyright enforcement algorithms (YouTube, 2021) will use audio data-matching to automatically detect and remove the protestor's video, preventing it from reaching a large audience (Cabanatuan, 2021; Schiffer and Robertson, 2021). Nevertheless, the video goes viral on Reddit and Twitter, provoking discussions and knowledge-sharing about audio editing techniques that can be used to work around the data logics of the major platforms' automated content-moderation techniques.
These composite vignettes are a mix of factual, semi-fictionalised and fictionalised accounts, designed to give the reader (you) a way into the book's themes and ideas. They have also fulfilled an important function for us as authors. In preparing this book, the creation of vignettes aided us in compiling and distilling what we have observed over the last decade or so of our own and colleagues' qualitative research in this area, and in isolating what we see as important about everyday data cultures. In this way, the above vignettes serve as presentation devices (Ely et al., 1997, p. 74): they introduce and synthesise themes that are central to the book and to which we return, in different ways and to varying degrees, in later chapters. Not only do these vignettes reinforce how data is often both collected from and targeted to us as we go about our day-to-day lives (the datafication of everyday life), they also provide glimpses of how we form intimate relationships with and through data (everyday data intimacies), how we develop skills and capacities to do things with data (everyday data literacies), and how everyday data practices play out in communities and in public (everyday data publics).