Cognitive control in times of digital distraction

Dr Mira Vogel, Senior Lecturer in Education, King’s Academy

Back in 2014, when media studies professor Clay Shirky decided to stop students from using digital devices during his teaching, it was in response to research that had sent ripples through the academic community.

Faria Sana and colleagues had set up two experiments: one investigated whether multitasking on a laptop impeded learning, while the other investigated the influence of being in direct view of another multitasking student. As well as finding a strong negative effect on multitaskers’ learning (operationalised as the ability to recall knowledge and apply it to novel problems), the researchers discovered even stronger negative effects on students taking paper notes within view of a peer who was multitasking – those students scored a full 17% lower in a post-lecture test. In 2020, Amanda Hall and colleagues extended this work and found that the interference with learning was even stronger when the multitasking was unrelated to the topic of the lecture. Students did not necessarily recognise the disruption to their own learning, yet they scored better in the post-lecture knowledge test on material covered while the students within their view were using laptops for on-task note-taking.

Although lab conditions are rarely authentic learning environments, many educators recognised this phenomenon from their own experience and found these findings compelling. Likening the effects of multitasking to second-hand smoke, Shirky enacted a default ban on laptop use in his class and waded into the already vigorous global conversation about how to respond to digital distraction. At the time, he worried that his students were powerless to resist the designs of technology companies, which exploit users’ most primal instincts in order to keep their attention. Social media companies have since openly admitted to this business model, which former industry insiders have called ‘the race to the bottom of the brainstem’.

Several considerations weigh against the impetus to ban digital networked devices, the most compelling of which is accessibility. Educators cannot know, predict or assume who will need a digital device to interact with and learn from a lecture, and restricting use to students with documented disabilities is likely to make those students stand out even more. There are further reasons not to ban. For heavy social media users, out of sight is not out of mind: they may experience more distraction when deprived of their device, since removing it cuts them off from interactions they rely on for peace of mind, and the consequent worry is liable to destroy any prospect of concentrated work. More prosaically, there are expectations around being reachable in an emergency, and in any case it is doubtful that a ban could be enforced, especially in large groups. Then there are students’ expectations around participatory culture, and the questions a ban might raise about a lecturer’s tech cred. Some educators seek to engage students through their devices – in opinion polls, voting and knowledge tests, for example. Finally, banning does not help students to develop metacognitive awareness of the consequences of distraction for their learning, nor ways to overcome it in their lives beyond the timetabled class.

Between the extremes of banning and ignoring digital devices lies the possibility of supporting students to control their attention. In their 2016 book ‘The Distracted Mind’, psychologists Adam Gazzaley and Larry Rosen set out four causes of distraction along with strategies for developing cognitive control. These are not particular to higher education, but they look amenable to being adapted as follows.

Supporting students to understand the costs of multitasking and task-switching. We should expect a knowledge-action gap here; in other words, being told about research evidence is unlikely to lead to behaviour change on its own. Instead, the principles need to be grasped through experience and reflection. In a conscious multitasking exercise, David Levy sets his students the task of noting any temptation to switch attention away from the task at hand, along with their decision in response – maintain or switch. Participants in my workshops on responding to distraction tell me this approach has helped them become aware of triggers and exercise control.

Limiting access. If students decide to put their devices out of reach, there are apps which allow them to block selected software for a set amount of time. This may be particularly powerful in independent study, but it can also eliminate the device notifications that demand attention during class.

Decreasing boredom-related self-interruption. Students can introduce variety by interleaving what they study, using a range of study strategies, changing physical position, and rewarding themselves with breaks in which they catch up on social media or do something enjoyable. Educators can also improve engagement by introducing variety into their teaching: real-world examples, posing questions, use of audience response systems and follow-up discussion have all been associated with improved engagement.

Supporting students to reduce their fear of missing out. FoMO is a social anxiety about being absent from valuable experiences happening elsewhere. It tends to affect younger adults, who are consequently likely to want to check social media more often. To the extent that FoMO is related to adjustment to university life, creating a sense of belonging and mattering within a module or degree will help students to perceive their university studies as among those valuable experiences, rather than a dislocation. This is likely to lessen students’ dependence on being involved minute by minute in what is happening outside the learning context.

This post has primarily been concerned with building defences against distraction on an individual basis. Ultimately, though, in a world of social media with advertising revenue to make, distraction is our problem but not our fault. So, while the strategies above are beneficial and likely to empower us, the problem and the harm will persist until the technology companies competing for attention adopt more responsible business models.

Sources

Gazzaley, A., & Rosen, L. D. (2016). The distracted mind: Ancient brains in a high-tech world. MIT Press.

Hall, A. C. G., Lineweaver, T. T., Hogan, E. E., & O’Brien, S. W. (2020). On or off task: The negative influence of laptops on neighboring students’ learning depends on how they are used. Computers & Education, 153, 103901. https://doi.org/10.1016/j.compedu.2020.103901

Levy, D. M. (2016). Mindful tech: How to bring balance to our digital lives. Yale University Press, pp. 115–117.

Sana, F., Weston, T., & Cepeda, N. J. (2013). Laptop multitasking hinders classroom learning for both users and nearby peers. Computers & Education, 62, 24–31. https://doi.org/10.1016/j.compedu.2012.10.003

Shirky, C. (2014). Why a leading professor of new media just banned technology use in his class. Washington Post, 25th September 2014. Accessed at: https://www.washingtonpost.com/news/answer-sheet/wp/2014/09/25/why-a-leading-professor-of-new-media-just-banned-technology-use-in-class/

Cultures of vigilance. A new approach at the University of Munich (LMU)

Arndt Brendecke, Cultures of Vigilance, Ludwig-Maximilians-Universität München.

There are more than 80 CCTV cameras in Times Square, and police officers are permanently stationed there. And yet the only verifiable terror attack attempted in New York after 9/11 was not foiled by cameras or police officers. Instead, it was two “normal citizens”, street vendors Lance Orton and Duane Jackson, who noticed an oddly parked Nissan Pathfinder on 1st May 2010. They immediately alerted a mounted police officer, who checked the car, realised that there were bombs in the boot, and evacuated and cordoned off the entire square. Barack Obama personally phoned Orton and Jackson three days later. According to a speech Obama gave on 4th May 2010, the attack failed “because ordinary citizens were vigilant and reported suspicious activity to the authorities”.[1]

New York’s security measures had successfully relied upon something that already exists in all societies: human vigilance. But how does that work, exactly? How is the attention of many focussed on a concrete goal – in this case, the threat of terrorism? An advertising agency asked to achieve exactly this created the now-famous slogan “see something, say something”. However, to see the right “something” and then alert others relies upon prior knowledge, behavioural expectations, roles and ideals, all of which have undergone centuries of complex cultural formation. We know little about this history and the cultural means it has passed on to us. It is the history of a willingness, and of opportunities, to combine individual acts of vigilance in the service of a supra-individual goal.

A group of researchers in Munich have joined forces (as the German Research Foundation’s Collaborative Research Centre 1369) to study the history and diversity of processes used to activate and bundle the attention of individuals and use this attention to serve particular goals – to avert danger, for legal or religious purposes, but also to assist the implementation of, and adherence to, specific societal or individual objectives. Our interest here goes far beyond simple societal functions, such as the early detection of fires so that they can be quickly extinguished. It encompasses deeper, more profound mechanisms that shape societies and individuals beyond a given here and now. For if attention is successfully and permanently paired with specific goals, say fending off the Devil or the prevention of sin, if individuals train their perception accordingly and identify with such obligations, then vigilance becomes something profoundly “political”. A part of the task therefore lies in finding out what effect the direction and employment of attention can have on societies and individuals.

We use the term vigilance to differentiate these phenomena from more general forms of attention, and define vigilance as the linking of individual cognition with supra-individually defined goals. We turn to cultures of vigilance not only to complement the extensive body of research on the biological and psychological basis of attention, but also because culture is essential to understanding how attention is usually oriented, attuned, and linked to behaviour as well as to identities. Culture is by no means just an instrument for, say, communicating social expectations; it is creative in its own right. The full creative potential of a culture of vigilance can be gauged by looking at a problem that psychology has been dealing with since Norman Mackworth’s experiments on sustained attention. Prompted by the observation that radar operators’ ability to focus on critical signals declined rapidly during World War II, the Royal Air Force commissioned Mackworth to research how sustained attention could be achieved. The cultures of vigilance in which we are interested offer a simple answer to the question of how to sustain attention for as long as possible: variety. Cultures of vigilance must be creative to be effective in the long term: danger must be made tangible – a pulsating force; events must be created, stories told and interests served.

However, the cultural side of vigilance reaches far beyond the issue of attentional focus or sustained attention. It also creates opportunities for critical reflection by enabling second-order observation: literature, theatre and art reflect how attention varies over time, show how people fail to notice what later turns out to be crucial, or notice but fail in their response. They thereby display what is otherwise so difficult to observe: the inner drama of decision-making, of doubt, of wavering judgement. They let us speculate on whether the protagonists will or will not take up responsibility, and thereby raise questions of morality and identity, of what is right and what is owed. And they sometimes teach us to doubt the true motives of those involved.

The latter highlights an important aspect: vigilance provides a variety of opportunities for those involved, including the opportunity to participate, to identify with something, to make decisions and, not least, to make distinctions. Vigilance is therefore always political and structurally ambivalent. Orton and Jackson’s act was undoubtedly a good one, but their behaviour was structurally identical to certain forms of denunciation: here, too, those reporting on others usually maintain that they serve a common goal, such as preserving the law. Right-wing militia groups such as the United Constitutional Patriots, for instance, claim to help secure the US border with Mexico and to work hand in hand with the official Border Patrol. They claim legitimacy on the grounds of their alleged vigilance, whilst themselves breaking the law, for example by persecuting and kidnapping migrants. Their activity is also motivated by campaigns, and here, too, we cannot rule out the possibility that a president (or former president) may praise their vigilance.

Perhaps the biggest methodological challenge of our approach is therefore dealing with this ambivalence, and examining it with a conceptual framework that is not itself already biased. It may be necessary to evaluate each case in political terms, and the available vocabulary seems to facilitate this smoothly by ascribing virtue to one behaviour and evil intentions to another – distinguishing, for instance, between care and intrusion in a neighbour’s gaze and attitude. Yet it is analytically far more interesting, and more appropriate to the phenomenon, to base any study on the assumption that a fundamental and enduring ambivalence exists, that this ambivalence lies deeper than one would first presume, and that the parties involved in vigilance practices often have diverse and fluctuating motives. This ambiguity and inherent contradictoriness are the substance of a second type of drama, one that again features heavily in theatre, literature and cinema. An archetypal example is Hitchcock’s Rear Window (1954), which is why we have used the film’s iconography on our website and in our publications.

In Rear Window, Hitchcock demonstrates how the protagonists’ motives remain unclear, even to themselves. A photographer named Jeff, played by James Stewart, is temporarily confined to a wheelchair and passes the time spying on his neighbours. It does not take long, however, before his idle voyeurism takes on a darker twist. As a potential witness to a murder, he believes it is his duty to help convict the murderer. Jeff’s wandering gaze becomes a searching one; his private pastime becomes coupled with a supra-individual, societal goal. Suspicion triggers a change: it suddenly seems appropriate to observe one’s neighbours systematically. And yet doubt remains, highlighted when Grace Kelly, playing Lisa, exclaims, “Sitting around, looking out a window to kill time is one thing, but doing it the way you are – with, with binoculars, and with wild opinions about every little movement you see is – is, is diseased!”, to which Jeff evasively answers, “What do you think I consider it – recreation?”.

Later, Jeff himself states, “[…] that was pretty private stuff going on out there. I wonder if it is ethical to watch a man with an ocular and a long focus lens”. Even the protagonists cannot fully comprehend their own motives or the boundaries of what is permissible. This example demonstrates once again that there are no underlying, biologically fixed mechanisms that can be measured medically or psychologically. Vigilance is adjusted and implemented according to the society and culture within which it is practised. Depending on the situation, it can be assessed at one point in time as legitimate and necessary, and at another as excessive and threatening. Literature, theatre, the media and, in this case, film are heavily involved, which is why, to reiterate, competencies gained, developed and honed in the humanities, cultural studies and the social sciences are indispensable when exploring vigilant behaviours.

It has already become evident that our collaborative research centre does not take cases of institutional surveillance as its starting point. We work on the assumption that mutual observation (and, indeed, observation of oneself) happens more frequently and is of equal importance, for the simple reason that seeing, hearing, smelling and sometimes touching each other is ingrained in our basic biological equipment. It is free of charge, requires little or no technology, and can never be effectively turned off. It involves all the senses and is deeply ambivalent, because the boundaries between caring and controlling are fluid and reversible, motivations are often contradictory, and functions manifold. It is precisely for this reason, and because the corresponding attitudes and processes – such as notions of duty, the swearing of oaths and ideals of civilian behaviour – have a long history, that our collaborative research centre encompasses so many different disciplines. Along with History and Literature Studies, these include Drama Studies, Ethnology and Legal Studies. In our individual subprojects and working groups we deal with, among other things: the evaluation of whistleblowing; stories of fighting off the devil; the denunciation of luxury in late-medieval cities; the use of the senses in times of plague; and the investigation of prostitution in countries in which this type of bourgeois capitalist behaviour was not supposed to exist, such as socialist Czechoslovakia.


[1] Obama, Barack: Remarks to the Business Council, Attempted Terrorist Attack in New York City, 4th May 2010, https://www.presidency.ucsb.edu/documents/remarks-the-business-council-2 [last accessed: 13th May 2020].

I need to go to the pub

How science can help refocus our attention to control impulsive consumption in times of a pandemic

Stefan Bernritter, Ilias Danatzis, Elisa Schweiger and Ko de Ruyter

The COVID-19 pandemic has significantly changed how people consume in the UK. One of the most concerning developments has been a dramatic surge in various forms of impulsive consumption. For example, during the first wave of the pandemic in March 2020, pictures of empty supermarket shelves went viral as people engaged in stockpiling. As soon as lockdown rules eased in summer 2020, people started to flout restrictions in their desire to socialise: existing social distancing measures were ignored, leading to overcrowded restaurants and pubs and packed beaches. Despite the government’s clear message that the second wave was in full motion and that stronger measures were needed to protect the NHS and save lives, pubs were flooded the night before the second lockdown by people having “one last pint”. Indeed, some investigators have argued that this “last pint” led to a measurable spike in COVID-19 infections in the following weeks.

These recurring patterns of impulsive consumption in times of crisis demonstrate how conventional interventions (e.g., information campaigns, fines) fail to achieve their goals. Given the dire consequences for society, it is of the utmost importance to understand why consumers engage in impulsive consumption behaviours and how to mitigate them.

The relevant scientific literature provides some ideas. Research suggests that impulsive consumption is, simply, a powerful urge to consume. This urge stems from a lack of self-control: a failure to contain the impulse and delay gratification. Experiencing such an urge is more common than we realise. Try putting a freshly baked cake in front of you and not eating it for a couple of hours; you will notice that the cake draws your attention more often than you would like it to. Self-control is not a stable individual trait, but rather the outcome of two opposing psychological forces: the desire that drives impulsive consumption, and the willpower to deflect this urge by using various strategies to divert attention from tempting objects in our environment. In essence, when willpower is insufficient, we fail to resist the urge to consume, lose self-control and fall prey to impulsive consumption behaviours.

So, how do these insights help us design more effective interventions to curb excessive consumption? Importantly, our self-control can be augmented by situational factors. More precisely, whether we are able to control ourselves is essentially driven by several cognitive biases that make us believe (a) that it is necessary to stockpile, go out or socialise, and (b) that everyone else is stockpiling or going to the pub too. In effect, those biases are self-imposed. Successful interventions might consequently tackle exactly those biases by drawing our attention away from them. Indeed, clinical psychologists use a similar technique, called attentional bias modification (ABM), to treat anxiety and depression.

ABM aims to modify automatic attentional processes. It is an effective treatment for reducing negative emotions and harmful behaviours such as anxiety and addictions, including compulsive buying – an extreme and pathological form of impulsive consumption. Different cognitive tasks are used to draw our attention towards neutral or positive stimuli and away from threatening or negative information. In a consumer context, ABM has been used to make salespeople more empathetic towards customer complaints: finding empathetic words in a puzzle led salespeople to take customers’ perspectives and respond better to their complaints. Similarly, ABM could allow us to refocus our attention before or during a shopping trip or a pub visit so that we do not fall prey to these cognitive biases. This would boost our self-control and help us overcome the urge to consume excessively.

Potential ABM-based interventions could draw on a wide range of online technologies and on-site measures, such as short training sessions, messages, displays, staff communications, and simple design changes to facilities. For example, augmented reality (AR) and short app-based training sessions, such as app-based puzzles, could draw attention to the fact that plenty of food is available and stockpiling is unnecessary. Such interventions could be shown to people while they wait in line to shop, book a holiday or reserve a restaurant table, helping to dissipate our self-imposed cognitive biases and thus reduce the urge to consume excessively.

While existing (and failing) approaches to tackling impulsive consumption focus on information (campaigns) or punishment (fines), ABM-based behavioural interventions would tackle its root causes: the attentional, self-imposed cognitive biases people hold. ABM has shown very promising results in changing negative behaviours in clinical settings[1][2], and its potential in campaigns to mitigate harmful impulsive consumption during the COVID-19 pandemic is well worth exploring.


[1] Hakamata, Y., Lissek, S., Bar-Haim, Y., Britton, J. C., Fox, N. A., Leibenluft, E., … & Pine, D. S. (2010). Attention bias modification treatment: a meta-analysis toward the establishment of novel treatment for anxiety. Biological Psychiatry, 68(11), 982-990.

[2] Kim, H. S., & Hodgins, D. C. (2018). Component model of addiction treatment: A pragmatic transdiagnostic treatment model of behavioral and substance addictions. Frontiers in Psychiatry, 9, 1–17.

Attention Studies: a New Interdisciplinary ‘Discipline’?

Marion Thain

This website is a hub for our project, which aims to build the interdisciplinary field of ‘Attention Studies’. Based at King’s College London, the project is bringing together a network of partners from across the globe. In this first blog post, I ask what it would mean to think of Attention Studies not just as a multi- and interdisciplinary field, but as a new interdisciplinary discipline. This is in part a thought-experiment, but it also signals something of the ambition of the project.

Many of our academic disciplines have long and eminent histories. Some have shorter but equally esteemed lineages. My own Faculty of Arts and Humanities, for example, contains both the centuries-old discipline of philosophy and newcomers from the past couple of decades, such as Digital Humanities. In both cases, these disciplines have a distinctive content focus and methodologies that are recognisable and more or less established.

What does it mean then to think about what we are calling Attention Studies as, potentially, a discipline? The knowledge project we are undertaking here has four main axes: concept, content, method, and infrastructure. It is by mapping our endeavours along all four of these axes that we are attempting to effect a small but perceptible shift in the tectonic plates of knowledge-formation.

The first axis refers to the work needed to carve out a common conceptual model and language for attention. How can we articulate a model of attention that will provide the basis for a common conversation across the disciplines? How can we ensure we are using terminology in ways that translate across our fields, with shared understanding? This foundational work requires the project to draw from across the disciplines, to synthesise, and also to create.

The second axis refers to the various questions and issues that fall within the parameters of the field. This might commonly be a thematic taxonomy, but we are trying a different approach by drawing our taxonomy from the real-world problems we need the field to address. This approach is designed to reverse-engineer the creation of a discipline: rather than identifying a field, studying it and then thinking about what use that research can be in the world, we are taking the problems that need to be solved and allowing them to help determine the shape of the field.

The third axis is one of methodology: the multi-disciplinary methods we are bringing together around the focal content. This is what we are looking to represent through the key methods and concepts page of this website, where we will be collecting indicative extracts from key texts from the various disciplines that we are bringing together around this field. In addition to that mapping of the methodological field, we are running various projects that actively attempt to bring those methods into dialogue and to test our ability to work across them in collaboration.

The fourth axis considers institutional and disciplinary infrastructures. The ways in which we organise knowledge are not unchanging, but they can feel slow to change, in large part because they are enshrined in the organisation of our institutional knowledge-spaces. Universities extol the virtues of interdisciplinarity to our graduate students, but when those students come to apply for an academic position they usually need not just to fit squarely within a discipline, but to sit centrally within an established subfield of it. How can we encourage graduate students to undertake truly innovative work that crosses boundaries when we know this risks marginalising them on the job market? Our monograph publishers and journal editors also tend to structure their lists along disciplinary lines, and even when they welcome interdisciplinary work, that interdisciplinarity is often between neighbouring disciplines rather than more broadly construed. Our funding councils, too, while encouraging interdisciplinary work, still sit broadly within disciplinary areas.

How might building the field of Attention Studies act as an experiment in creating a space of intellectual inquiry that, while building on disciplinary expertise, genuinely sits outside the usual disciplinary boundaries? Working out how to configure such knowledge-spaces institutionally is crucial both to ensuring that all disciplinary partners can meet as equals and to supplementing ‘centre and periphery’ and ‘two (or three) partners’ models of interdisciplinarity with modes of enquiry that can draw a wide variety of partners together on fresh ground. Such a vision aims not just to develop but to transform, and it requires a change in how we articulate the relationship between the frames within which we operate and our intellectual ambitions.

Crucially, by opening up properly interdisciplinary spaces that become structurally rooted, might we have the potential to rethink the power dynamics and historical legacies baked into our current knowledge infrastructures? How might the experiment of building a new ‘discipline’ offer opportunities for shifting bias that is structurally encoded within our institutional frames? Embedding collaboration across greater disciplinary breadth (a foundational principle of this project) is already demanding fresh structures to enable that conversation, and is already opening up a receptivity to how things could be otherwise. The project’s radical interdisciplinarity is fostering a self-awareness of disciplinary histories and paradigms that I hope will send out broader ripples. The ambitions of this project centre, of course, on the study of attention, but I hope it will also help establish new paradigms for cross-disciplinary working, and for understanding and reframing the dynamics of history, power, and culture within and across institutional knowledge structures.