In this interview, Dr David Young, a lecturer at King’s College London, speaks to Dr Iain Overton about the complexities of OSINT in conflict situations. Young traces the field’s military origins and emphasises its transformation by digital technology and social media. He highlights OSINT’s role in shaping public understanding of conflicts such as those in Ukraine and Gaza, and discusses the risks of misinformation and manipulation. He points to the increasing reliance of traditional media on OSINT, cautioning against letting forensic techniques overshadow the moral and human aspects of war. Addressing AI’s impact, Young stresses the need for critical analysis and verification in the face of increasingly sophisticated AI-generated misinformation.
——————————–
Iain Overton: Thank you for joining us, David. Could you start by introducing yourself?
David Young: I’m David Young, a Lecturer in Digital Media and Culture at King’s College London. My research and teaching cover a range of topics including information design, critical data visualization, and the military history of computing. My current research focuses on the politics and aesthetics of information, specifically how truth and evidence are represented digitally in reporting on war.
Iain Overton: Today’s conversation centres on OSINT and its potential weaponization. To start, could you provide some background on your military research and how data has been institutionalized by militaries?
David Young: My journey to this field has been quite circuitous, originating in visual arts and design, where I became interested in the history of cybernetics and computing. In the US after World War II, there was a marked interest in the concept of information, and we can see a peculiar relationship between the military and avant-garde art at this time, with cybernetics and the digital computer serving as something of a common ground between two very different cultural spheres. I then began to focus more on the military context, examining how war came to be increasingly and explicitly understood as the management and representation of information. There’s an intensification of this idea following the US invasion of Vietnam, where data takes on an unprecedented role in American military strategy, ultimately intensifying the violence, both in aerial bombing campaigns and in the counterinsurgency on the ground.
Iain Overton: In the context of current conflicts like those in Ukraine and Gaza, there’s a noticeable parallel informational war. Groups like Bellingcat and Airwars are at the forefront of what’s known today as OSINT. Could you shed more light on OSINT and its evolution?
David Young: OSINT, or open-source intelligence, involves compiling information from diverse decentralised sources to identify correlations or coherencies. These sources are fragmentary: they might include eyewitness accounts, photos or videos posted on various social media platforms, reports on local news websites, satellite imagery from Google Earth, and so on. This task of compiling is obviously quite challenging due to the distributed nature of these fragments. Especially in conflicts like those currently ongoing in Ukraine or Gaza, the intensity of the informational war makes the task of sifting through sources and validating claims particularly difficult.
Iain Overton: With the rise of OSINT, do you have concerns about how it might influence the understanding of conflict or its potential for manipulation?
David Young: OSINT is essentially a methodology that demands a high level of expertise and technical literacy. Recent conflicts have illustrated how OSINT’s visual rhetoric can be utilized to misinform and disinform: the aesthetic of ‘forensics’ is a vernacular that’s become increasingly recognizable and expected by the public, gaining almost an aesthetic ‘truth value’. However, there’s a significant risk that this visual rhetoric comes to stand, by itself, for certainty. For the viewer, the meticulous work that goes into compiling and analyzing these fragments of information can fall into the background, and the seductive design aesthetic itself becomes the message, in a sense.
Iain Overton: How has OSINT influenced traditional media outlets like the BBC or the New York Times?
David Young: Well, we’ve seen major media outlets adopting OSINT methodologies and aesthetics in recent years, often explicitly framed as efforts to ‘debunk’ or ‘verify’ suspect claims or narratives that have gone viral online. We see them using advanced tools like satellite imaging and 3D modelling software to, quite literally, recreate the perspectives of multiple eyewitnesses to a particular event. These perspectives are important and valuable; they allow an event to be seen from the multiple angles gathered from whatever open sources and media fragments have been used in the investigation.
Iain Overton: However, is there a concern that such a focus on forensic analysis might overshadow the moral and human aspects of conflict, potentially sanitising the horrors of war?
David Young: The real challenge in OSINT is not to let the power of its visual rhetoric substitute for the question of what counts as evidence and how it is compiled. Meticulous and slow-paced work is essential in assembling media fragments to create a coherent narrative, or indeed to point out where incoherencies exist. This approach is exemplified by organisations like Forensic Architecture, Bellingcat and Airwars, which have gained authority in the field through their thorough and methodical analysis. But I think working at this pace has only become more important in the current context: as we’ve seen in recent years, there has been a change, I’d say an intensification and acceleration as well as an individualisation, in the role social media plays in framing and targeting the narratives in these informational wars.
Iain Overton: With the widespread use of OSINT reports on social media platforms like Twitter, how is OSINT’s visual rhetoric being co-opted, and what are the wider implications?
David Young: The familiar visual rhetoric of OSINT, established by recognised organisations and activists, is now being more widely used by various actors, sometimes incorrectly or spuriously, thereby contributing to the ‘fog’ of misinformation. This phenomenon is particularly prevalent on social media platforms, where the aesthetic and discourse of OSINT have become a trend in themselves; a quick Twitter search for “OSINT” makes this clear.
Iain Overton: Do you think the spread of misinformation in OSINT is orchestrated, or is it more of a reactionary phenomenon?
David Young: It’s a crucial question. I’d suggest that the proliferation of misinformation we see on Twitter, TikTok, Telegram and so on stems from a mix of orchestrated campaigns and distributed phenomena. It’s important to recognise that, in some respects, there’s actually no need to actively orchestrate it: the design of these platforms fosters the emergent conditions in which misinformation can be produced and circulated by users themselves, without any central governing strategy. It’s an affordance, or even a feature, of the platforms. However, orchestrated campaigns bearing the hallmarks of state actors or other organised groups are also a significant concern, as has been reported recently in relation to Gaza and Ukraine. There’s nothing new about state-led psychological operations, of course, but the design of the platforms makes possible a new intensification and individualisation of these operations.
Iain Overton: For consumers of social media, what advice would you give to verify the credibility of OSINT reports?
David Young: It’s crucial for users of social media to ask critically what counts as ‘evidence’ and ‘truth’ when examining OSINT reports, and to go beyond the often very compelling and seductive design aesthetic. There are simple questions to ask: who has conducted the investigation, and are they who they say they are? Then look at the evidence used in the investigation: evaluate cited usernames and double-check the authenticity of sources. The range of tools commonly used in OSINT, such as reverse image searches, can also be employed to check whether imagery is attributed correctly. Awareness of images and narratives reused from other conflicts is also important in recognising misinformation; again, this is an ongoing issue we can see in reports on Gaza and Ukraine.
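One technique that sits behind the reused-imagery checks David describes is perceptual hashing, which flags visually near-identical images even after resizing or recompression. The following is a minimal sketch, not part of the interview itself, assuming Python with the Pillow and imagehash libraries; the function name and file paths are hypothetical.

```python
# Minimal sketch of a reused-imagery check using perceptual hashing.
# Assumes the third-party libraries Pillow (PIL) and imagehash are installed.
from PIL import Image
import imagehash

def looks_recycled(candidate_path: str, known_paths: list[str], threshold: int = 8) -> bool:
    """Return True if the candidate image is perceptually close to any known image."""
    candidate_hash = imagehash.phash(Image.open(candidate_path))
    for path in known_paths:
        known_hash = imagehash.phash(Image.open(path))
        # Subtracting two hashes gives their Hamming distance; a small
        # distance suggests a near-duplicate (e.g. recompressed or resized).
        if candidate_hash - known_hash <= threshold:
            return True
    return False

# Hypothetical usage: compare a viral image against an archive of imagery
# from earlier conflicts.
# looks_recycled("viral_post.jpg", ["archive/2014_strike.jpg", "archive/2020_raid.jpg"])
```

This is only one signal among many; it complements, rather than replaces, checking who posted the material and whether the claimed time and place hold up.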
Iain Overton: Considering the advancements in AI, how do you foresee its impact on OSINT and the spread of disinformation?
David Young: The rapid evolution of AI, particularly in image generation, will certainly pose new challenges. The paradigms of generative AI are changing very quickly, so it’s hard to know where we’ll be in even a year’s time. Previously, AI-generated images had identifiable glitches, but newer models are becoming more sophisticated and harder to distinguish from real images. This progression of course complicates our ability to verify open sources, but for investigators, at least, it should make the chain of trust from eyewitness to reporter all the more important.
Iain Overton: In the face of these AI challenges, how can we maintain a positive outlook on the future of OSINT?
David Young: There’s a natural inclination toward accelerating the speed of reporting, but for me, optimism in the realm of OSINT lies in resisting this, instead adhering to slower and more methodical approaches that are transparent about ambiguity and explain their processes. This makes possible a more deliberative approach to information consumption and analysis—it’s a vital temporal counterpoint, especially in the current context of dominant rapid, real-time social media streams.
Iain Overton: Thank you, David, for your valuable insights and time.