Interview by Emma Briant
Emma Briant: Cambridge Analytica often pitched their methods as a contemporary digital marketing firm – if you were sat next to Alexander Nix at a dinner party, how would you convince him your book explains what’s really going on?
Robert W. Gehl: First of all, should we even trust that the dinner party is actually not a cover for some nefarious Nix scheme? Wasn’t Nix famous for such tricks? I’d be worried about what Nix is trying to convince me of! And given his duplicity – such as his claim to a German audience in 2017 that Cambridge Analytica got their data through legitimate means – is it even possible to convince him of anything?
Sean T. Lawson: I think I’d start by agreeing with the general premise: Cambridge Analytica was a contemporary digital marketing firm, not unlike many others. And therein lies the issue. The unique combination of big data, machine learning, and microtargeted messages that CA and so many others are adopting is an innovation, but one with potentially negative impacts.
Robert W. Gehl: Agreed. I think I would point to your work, Emma, among others – whistleblowers such as Kaiser and Wylie, journalists such as Cadwalladr. Based on that work, we can say we already know what’s going on. What we need to do is further theorize it and look for points of pressure to fix the problems of manipulative communication. I think that’s the approach that all of us – you, us, people like Joan Donovan – are taking now.
Sean T. Lawson: I also don’t know that we’d claim that Social Engineering accounts for the entirety of what’s “really going on.” What we’re doing is offering another way of thinking about what’s going on, one that we hope helps to connect the disparate contemporary pieces – like CA, but also Russian interference operations – with one another and with a broader, historical context.
Emma Briant: You pose the question of how “crowdmasters, phreaks, hackers and trolls created” a new form of manipulative communication. But how new is it really? You trace a long history, so what is new and distinct in what you see today?
Sean T. Lawson: Trying to sort out what is new and unique in the contemporary moment was one of our main goals for the book. As you note, many of the practices we see today have been part of public relations, propaganda, marketing, or hacking for a long time. Part of our frustration was that so much of the current discourse treats what’s happening with Cambridge Analytica or the Russians as unprecedented. It’s not. But it’s also not just “the same old thing,” either. So, we think what’s new here is, first, the unique combination of new methods of data gathering, analysis, and targeting that have the goal of allowing social engineers to have societal-level impacts by engaging people at the individual or small-group level. Second, we think the ability to shift fluidly and quickly between the interpersonal and mass forms of social engineering in a given campaign is also an innovation.
Robert W. Gehl: Right – that fusion of interpersonal con artistry techniques with mass societal engineering desires is what’s new. That’s what we mean by “masspersonal social engineering.” This concept is a subset of recent “masspersonal communication” theory, which suggests that the old mass/interpersonal divide means far less in the digital age. When a tweet @-ing someone can also be a public performance in front of thousands, and when an interpersonal interaction can be recorded and posted online, it’s harder and harder to draw the line between interpersonal and mass communication. Likewise, it’s hard to draw the line between interpersonal con artistry and mass propaganda.
Emma Briant: Scholars use a lot of different terms to describe the practices your book aims to help us understand. You avoid using the term contemporary propaganda or influence and instead discuss manipulative communication. Can you explain your choices?
Robert W. Gehl: We’re reacting somewhat to conceptual confusion happening right now. For example, Benkler, Faris, and Roberts’s excellent book Network Propaganda does a fine job tracing how disinformation flows through right-wing media, but one thing they do is specifically bracket off interpersonal con artistry as not-propaganda. This makes sense given the history of propaganda, but less sense when we start to think about how manipulation might be highly targeted at individuals at one moment, and then scaled up to a population level the next.
Sean T. Lawson: We chose “social engineering” over those other terms for a couple of reasons. First, we wanted one term to cover this combination of both the interpersonal and mass forms of manipulation that we were seeing come together. A few other scholars and security researchers had floated the term “social engineering,” which we thought was clever. But the more we talked about it, the more we became convinced that it wasn’t just a clever one-off but that there was really something to the use of that term that not only described both aspects of what we were seeing but did so in a new and interesting way. Second, we also felt like propaganda or influence did not fully account for the active and malicious manipulation of what we were trying to describe. Propaganda, to us, implies spreading biased information promoting one’s own side. Influence, though it can be malicious, is ubiquitous and mostly innocuous. We think that social engineering better captures the active attempt at using communication to manipulate a target in a malicious way.
Robert W. Gehl: As for other terms, etymologically, “influence” comes from astrology, meaning a fluid coming from the stars that shapes our lives. I personally prefer “manipulation” – coming from the late Latin manipulare, leading by the hand. I think of it as a more human-centric and less metaphysical capacity to shape environments. This links up well with “social engineering” – engineering comes from gin, a trap, net, or snare (including verbal traps and snares) and reflects the desire to practically apply knowledge to shape a situation.
Emma Briant: Your approach bringing together histories of hacking and deception is truly original. Was there a Eureka moment when you felt this idea coming together? Can you explain how the idea emerged from each of your work and unique perspectives coming together?
Sean T. Lawson: At the time, both of us were working at the University of Utah and talked about our respective projects regularly. Rob had just published a book about the dark web and was well into a new project originally focused just on the history of hacker social engineering. I was just finishing up a book on cybersecurity discourse in the United States. At the end of that book, I argued that political warfare, propaganda, and disinformation were more the reality of cyber conflict than the as-yet hypothetical “cyber Pearl Harbor” infrastructure attacks that get so much attention. But the attempt to target both precisely and en masse seemed new, and my initial efforts to explain it using analogies to the so-called “precision revolution” and airpower in the U.S. military felt unsatisfactory.
Robert W. Gehl: For me, I have long been intrigued by the concept of engineering – as I mentioned above, it contains the term gin (snare, trap). I was initially thinking of looking at the genealogy of software engineering – I had been collecting material on that since my first book. But in the course of writing Weaving the Dark Web, I came across people talking in hacker forums about “social engineering” and was hooked. I had thought of social engineering as a pejorative term for government programs, not as a fancy way of saying con artistry. So I dug into hacker social engineering. My initial plan was to write about hacker social engineering only, but in conversations with Sean, he convinced me that the more important move would be to connect the older social engineering of the early 20th century with the hacker conception. In turn, I convinced him to join the project and bring his cyberwar discourse expertise to bear on it.
Sean T. Lawson: So, really through ongoing discussion of our respective projects at the time, we came to realize that there was overlap between hacking and propaganda techniques. As we looked around, we didn’t see anyone else taking that approach, so we decided to see what would happen if we did. We think that the result is a valuable new way of thinking about the relationships between these practices.
Emma Briant: How optimistic are you about our future in this age of masspersonal social engineering? Can we escape its grasp long enough to hack the system and build something better?
Robert W. Gehl: I love this question, because my current research addresses it head-on. I’ve been a longstanding advocate of quitting corporate social media, since its whole purpose is to have us produce ourselves as consumers through profiling and then deliver our profiled selves to advertisers. Facebook/Meta makes for a fine vehicle for masspersonal social engineering! But instead of advocating quitting social media – since it does give people a great deal of pleasure – I’ve been studying ethical alternatives, like Mastodon. If we’re looking for people who are hacking the system, look to the people coding Mastodon and the rest of the fediverse. You’ll note rather quickly that targeted advertising is not part of that system! And that helps make it less attractive to the sort of manipulations we’re talking about.
Sean T. Lawson: We can hack the system to make something better. We talk about options for that at the end of the book. However, I’m not optimistic that we will. That is what is so frustrating about this situation. Sensible privacy and data collection regulations would go a long way toward thwarting the use of big data in masspersonal social engineering. Addressing the problem of dark money and front groups in politics is also essential. Doing more to shore up our cybersecurity – making it harder to penetrate systems and steal information that can subsequently be weaponized – is also possible and essential. Unfortunately, at the moment, we’re not seeing nearly enough progress in any of these areas. Too many actors, from corporations to social media platforms to marketing firms to the politicians who rely on them, have an interest in allowing masspersonal social engineering to continue.
Robert W. Gehl: We joke that Sean is Eeyore. But I am afraid he’s right. I’m looking at home-grown, ethical FOSS solutions, but they are not going to do the job on their own. Global regulation of what you, Emma, aptly call the “influence industry” has to happen.