
How bots amplify hoaxes and propaganda on social media

Zignal Labs CEO Josh Ginsberg says consumers and companies need to know what’s bot-generated and what’s not so they can make informed decisions about things like elections.

“We are robots. We come in peace. Or not.”
Matt Cardy / Getty Images

On this episode of Recode Decode, hosted by Kara Swisher, Zignal Labs CEO Josh Ginsberg says his media intelligence company has started detecting “massive amounts of bot activity” on social media. In everything from political elections to the debates over Roseanne Barr’s and Samantha Bee’s controversial statements, bots are insinuating themselves into the discourse and provoking humans into being more outraged. He talks about what businesses and regular people can do to better gird themselves against these bot attacks.

You can listen to Recode Decode on Apple Podcasts, Spotify, Pocket Casts, Overcast or wherever you listen to podcasts. Below is a lightly edited transcript of the full conversation.

Kara Swisher: Hi, I’m Kara Swisher, editor at large of Recode. You may know me as the lifelike robot that replaced the real Kara Swisher three years ago — oops, I shouldn’t have said that. But in my spare time, I talk tech and you’re listening to Recode Decode from the Vox Media podcast network.

Today in the red chair is Josh Ginsberg, the CEO and co-founder of Zignal Labs. It’s a media intelligence company that helps its customers see what people are saying, and what stories are emerging from across the internet. He previously worked in politics and public affairs, directing campaigns for Arnold Schwarzenegger and Mitt Romney and working in the strategy department of the Republican National Committee. Josh, welcome to Recode Decode. There’s so much to talk about.

Josh Ginsberg: Thank you so much for having me.

Okay. We also have some information about some new products you’re doing. Now, let’s talk about your background a little bit. I would like people to know how you got ... You were just a political guy, right? You’re just a ...

Yeah. I have a little bit of ...

In the old Republican party, I guess.

Exactly. The old Republican Party. I have a little bit of probably a nontraditional background to becoming a CEO of a technology company. So yeah, my background is in politics and public affairs. I’ve served on three presidential campaigns. My first one was in 2000, on the 2000 Bush campaign. I actually started as an intern, and it was a summer job, which was great, down in Austin, Texas. It was in the strategy department, which was fascinating for me because one of the biggest things that we were doing was seeing how can we get then-Governor Bush’s message out there — which we’ll talk about how this is relevant to what we’re doing now. It was such a great experience. I actually managed to convince my high school to let me take my first semester of senior year off to stay down in Austin, Texas to work on the campaign. Then I guess you could say I was bitten by the political bug from there.

And, so you did those campaigns, and they were just sort of traditional. What did you do? The things you did, just your basic ...

Yeah, I mean at that point it was really ... I mean, by the way, we had a website, and we actually put it on our signs. The fact that we put the website out there was like a really big deal.

At the time, yeah.

So, if you think about it from just how media has evolved in less than 20 years, that’s really significant. I’d say that’s a pretty big theme that I saw throughout my career at each step. After working on the 2000 Bush campaign, I worked on the 2004 Bush campaign, and eventually made my way up here to California to be Governor Schwarzenegger’s political director, which was sort of another great lesson in terms of how to use the media. He’s probably one of the best communicators I’ve ever seen.

So, consistently throughout my career, that’s always been a really important part of what we’re trying to do. How do you get your message out? How does it move throughout that media spectrum? And eventually reaching the public.

Right, and how it gets there. Like one message gets iterated, and how it gets iterated. And there used to be traditional methods of doing that. Then you worked for Mitt Romney?

I worked for Mitt Romney. I was a national field director in 2008. So, the one that he didn’t quite win.

No, didn’t quite.

Quite at all.

You can blame me.

His was the first campaign where you really did see memes happening. Remember “binders full of women,” and all kinds of things.

Totally. It was actually really interesting. I was talking to someone about this the other day. We were at a point in that campaign where we were still having arguments of, “Hey, what’s a better platform for us to get our message out? Is it Myspace or is it Facebook?” So, just to give you a sense, I mean, that was a decade ago and now ...

Sure. It wasn’t Myspace, as it turns out.

Yeah, turns out, we bet on Facebook. So that was smart.

Yeah. Good. But also, I remember there were a couple of different points when he had that recording [in 2012], when he talked about people — like there were a couple of times where he really got bitten by internet memes and virality. And Obama benefited from it on the flip side.

I don’t think there was any error by the Obama people at all in that, from a digital view.

No. Look, they did a great job using that media landscape to their advantage. And, if you think about how things have evolved, I mean, the difference between ’08 and 2012 was huge, too. I mean, thankfully, I was out of politics by that point, but that’s something that really consistently, consistently occurred.

Right. So, and that [2008] was the Palin-McCain.

Yeah. That was Palin-McCain.

Right, who was an internet phenomenon, then was not.

Same thing, like that “SNL” clip that went around had a lot of impact, from what I can tell on the campaign.

A huge amount of impact.

Then if you think about it, there were new ways of getting out your message at that point. I mean, how do you use the internet to your advantage? How do you make sure that that coalesces in the right way with more traditional types of media, like newspapers? I mean, that’s when that landscape really started to shift even more.

Even more. And we’re going to talk about that more. So, you went off and did this company. You got sick of politics? Sick of talking to politicians.

That’s the nice way of saying it, yeah. I mean, after politics, I went into public affairs. So, I worked at a large public affairs firm. It was owned by one of the massive holding companies. I started two public affairs firms of my own, where our clients ranged from boutique nonprofits to Fortune 10 companies, and really everything in between.

The bottom line, and I think where our company, Zignal Labs, really comes into play is every single time in every single step along the way, we faced the exact same issue. Now the landscape looks different, but the issue is the same, which is: How can I hear what is being said about myself out there, but most importantly take action on those things?

So, when we were doing everything “the right way,” in order to hear what was being said about us, we’d have 10 20-year-olds in a back room constantly doing Google News searches with a wall of TVs in front of them, and later on with Tweetdeck. Every eight minutes or so, I’d get a new email saying, “Hey, here’s a new story that mentions your CEO, or your company, or your candidate.” My question was always the same, which is, I mean, “Hey, that’s great, but is it moving? Is it trending? Is it positive? Is it negative? Most importantly, what type of action should I take?”

Right. Not just, “It’s there.” At first, it was just, “It’s on there.”

Right, exactly. So, when we started Zignal Labs, I had two great founders.

Did you have to use the word “Labs?” That seems science-like.

Yeah, it was a very Silicon Valley story where you ...

Yeah, you need a lab.

It was either that or taking out a vowel, and we liked our vowels.

All right. Okay. ZGNL.

That would have driven me crazy.

Us, too.

Yeah, but you would use the Z, but go ahead.

Although we might own the URL, I don’t know.

Okay. All right. So, you created this to make it more usable for people, in a dashboard format.

Exactly. And we really targeted three things that we had to get really good at, in this new type of media environment, in order to help people make better decisions and execute on those actions. The first one was, we said, “Okay, look, we have to look across the entire media spectrum. So, we have to bring in everything from social media, traditional media and television.” One of the things that always fascinated me and was really important for us as a company is to be able to see how stories traveled.

So, for example, a couple of years ago, we did a study where we found that some of the most powerful news cycles were stories that started in social media, bounced up into local television, bounced up into national newspapers, and bounced up into national television. And, by the time it gets to national television, I mean, that story is baked, there’s not a lot you can do about it. So, that’s a really important element. Now, that’s changed just in the past couple of years and we’ll talk about that.

The second thing is we said, okay, look, we’ve got to see all this happening in real time because everything’s moving that fast. We have to build an infrastructure for our platform where we can feed all of those data streams down a real-time processing pipeline to analyze it in all of the relevant ways.

The third thing we said, all right, then we’re talking about millions of data points coming in really fast. We’ve got to really nail how this is visualized. So, in a split second, even if there are millions of data points, the end user can say, “Okay, this is what’s important and this is what’s not.”

So we went and we hired a guy from the [NASA] Jet Propulsion Laboratory who was doing data visualization for the Mars rover. His job was taking data off of Mars and visualizing it for NASA. We said, “All right, let’s take that same concept and now do that for media data points.” So, that was our approach.

So you got the rocket guy.

So we had to get a rocket scientist.

What you have is like all kinds of points of information and data to make a cogent argument about whatever’s happening.

Or a cogent snapshot of what’s happening. But, there’s lots of dashboards. We have a social media dashboard, we have this, this is working, this is not working, these headlines are working, this is not. But, this is more sentiment and everything else, correct?

Yeah, it’s everything. I mean, there are dozens of ways of slicing and dicing. There’s probably hundreds now. You can really see what does the environment look like, and how do you take action on that? So, we work a lot with large corporations, large enterprises, to be able to see that. We actually started off in politics, and then sort of transitioned.

Which has a twitchy effect, which ... but everything’s twitchy now, right?

Yeah. I would say that is the world that we live in now.

That is the world we live in.

For sure.

So, you do this for corporations, and it’s media intel. Like, this is what’s trending. But it’s more than that, right? It’s not just this is what’s trending. It’s this is the impact of this, this is who’s doing it, this is where it’s coming from, this is the sentiment about whether it’s negative, positive or neutral.

Exactly. So, it’s what’s accelerating, but then it’s also, “Okay, well, are my key messages getting across? What’s my brand health, how’s my CEO being perceived? Or are those messages resonating? Who are the best people to go out there and be influencers? Who are the other influencers I need to be aware of?” So it’s actually very broad in terms of all the different types of use cases you can have, and those get customized for those corporations.

For those corporations, what are they supposed to do with it? Because they’re aware of these things but haven’t been paying attention as much as they should have.

I would say one thing that I learned in my political days is if you can’t measure it, it’s almost impossible to manage it. So, if you take that same concept for media news cycles, and corporate communications and marketing teams, it’s the same thing. What’s really interesting is it’s actually expanded beyond communications teams and marketing teams. Now it’s much more part of the C-suite, the executive teams as well.

Right, but they have to, because it affects employees, it affects the stock market, it affects investors, it affects everything.

Yeah. I mean, I was at a large company back in New York a few weeks ago, and the communications person brought to this meeting the chief risk officer, the general counsel, the person from marketing. If you think about it, a few years ago, if the head of comms went to someone from risk and said, “Hey, come to this meeting,” they’d say, “What are you doing? Go write me a press release.” But now it’s so vital to a company to have those people on.

Right, and so that’s why we’re going to talk about this new product that you have, because it’s not just managing your group. There’s lots of different negative news articles, or something happens that’s bad for a company, or a bad quarter, and they manage that. They try to get the best messages out and stuff like that, but you started to look into the area of misinformation. Not misinformation, disinformation.

Yeah, it’s a dis.

A dis, not a mis, they’re not just saying something stupid. Talk about this new product that you’re doing because I think that’s one thing. People have to have those capabilities going forward, no matter what. The first ones you talked about.

For sure. I mean, that’s sort of table stakes nowadays. One thing that we saw ... So in the beginning of this year, we started seeing some really interesting anomalies in our data. By anomalies, I mean spikes in volume that didn’t necessarily make sense. Or, “influencers” — we said, “Okay, why is that person considered an influencer?” We gave this data to our data science team, and they came back and they said, “Look, these are actually synthetic mentions, these are artificial, these are not real people.”

So, we took a step back and we said, “Okay. Well, what’s going on here?” What we saw was that there were massive amounts of bot activity impacting corporate brands and corporate reputations; it had an impact on market cap, all these other things. When we dug in even deeper, we saw these were all coordinated. To give you a sense of the magnitude, we literally have not found a company in the past six months that has not gotten hit with a major bot attack. So, those same forces ...

This is separate. Let’s make this clear. People have been concerned with hack attacks, data attacks, all kinds of things like that. This is different. Explain what that is.

This is for sure different, although in some ways related because they’re coming from similar types of nefarious actors out there. But what this is, people are spinning up fake accounts or fake sites, and they’re using that to amplify false news stories or negative news stories about a corporation. They’re using that in order to really impact the bottom line of the company, the company’s corporate reputation.

So, we found those same forces that were impacting our politics. It’s been pretty well publicized. It’s now happening in the corporate world in a really major way. Time and time again, as we go to companies, they say, “Well, I guess this actually makes a lot of sense. We had no idea it’s happening.” It’s something that everyone really needs to be made aware of right now.

All right, and so explain what happens. Give me the scenario, and then in the next section we’ll talk about what that means, and we’ll talk more deeply about it, but very briefly, what does that mean? That something happens ...

Yeah, so let me give you an example from one of our customers here. We had a customer where a false news story, this was a couple of months ago, was put out about them saying that a bunch of their customers were leaving and going to a competitor. What we saw was a bot network spin up, so a bunch of fake accounts. They spun it up and they started amplifying it until that story started trending. So, it was a false news story from a random blog. Got it to start trending. The mainstream media then saw it. They wrote traditional news articles on it, blogs, things like that. Then Wall Street saw it. Wall Street said, “Okay, something’s going on here.” And they actually saw their market cap drop several billion dollars.

So, there was an opportunity to make money writing that.

There’s an opportunity. We’re seeing [things] like that happen time and time again now.

So this was a fake news story that was started and then amplified. The way a bot network works is they start attacking it, and then others start to attack it.

And then others start to attack it as well.

But the original bots are attacking and then they have support bots, right? Is that correct?

Yeah. A lot of times — and it depends on the type of bot attack. If you look at the motives, that’s always a good place to start. Sometimes it’s about corporate reputation. How can you hit that corporation’s reputation, that can impact all the things we’ve been talking about.

The other one is more around cultural weaponization. How can you take very specific points in our culture? So, for example, when Roseanne made that terrible tweet ...

Right, about Valerie Jarrett.

Yeah. More than 60 percent of that conversation was being amplified by bots.

So it seemed noisier than it was.

It seemed noisier than it was. It spun people up. The goal of a lot of these bot networks is not for it to just be bots. It is to bring humans into the conversation too, so they can start to see that, it’s to sow discord within the media landscape.

Then you also see examples like the stock price issues that we talked about, and then you see it in our politics, too. So, once you start looking at those four motives, it starts to make sense what these malicious actors are doing.

We’re here with Josh Ginsberg. He is the CEO and co-founder of Zignal Labs. It’s here in San Francisco. He does media intelligence, which helps companies see how they’re being perceived on social media. A lot of people do that. A lot of companies do that. You’re focused now more on nefarious players. It’s obviously a growing problem in politics. The Russians obviously used the tools of social networks really well.

In fact, people always think they hacked them. They didn’t hack them. They used them properly. Now these bot networks are doing that.

We talk a lot about bots in politics and the ability to swarm and to create a lot of faux-outrage and to create news that doesn’t exist and things like that, but it’s moved into the corporate space. Do you think companies have a sense of this, that they are being attacked?

For the most part, no. Because it’s really impossible to see if you don’t have technology to see it. It just looks like natural conversations. Time and time again, when we go to companies and we show them this data and these analytics, they’re pretty surprised. Once they have that awareness, it really starts to make sense. If not, you’re responding to things and you’re using resources on elements in the media that just really are not that important. It can have a real impact on a company.

That you get reactive.

Very reactive.

Or you do something or you make decisions and stuff like that. Let’s talk first about the two culture ones, Roseanne Barr and Samantha Bee. Those are two opposite sides. Same thing.

Same week.

Same week. Same thing.

Explain what happened in each of those cases.

In the first instance, you had Roseanne Barr. Roseanne put out a terrible tweet about Valerie Jarrett. What you saw almost instantly were bot networks wake up, and they started amplifying what Roseanne said. What’s interesting was, it was kind of on both sides of the issue. There was one bot network, or one side of the bot networks, that was saying, “This is terrible of Roseanne,” etc., etc. There was another side that was saying, “Hey, you’re being too hard on Roseanne. The liberals are getting too upset.” You have both these things. What ended up happening was, real people then joined the conversation because they were outraged.

On either side.

On either side. Really the goal there was to sow discord within Western culture.

Who’s that? Who are those bots?

Well, sometimes we’re able to track those back. It depends on how sophisticated the bot network is. In some cases, you can’t. It’s more difficult.

So, they wake up. Who wakes them up?

Usually there’s some sort of nefarious actor who’s controlling these networks behind the scenes. Sometimes it’s a group. In politics, you hear about this a bunch up on the Hill, that they’re looking into various agencies overseas that do this. It’s coming from a lot of different angles. Again, some of this is economically motivated. Some of this is culturally motivated. In this case, it was cultural.

The attacks on Roseanne and then Samantha Bee when she said the terrible thing about Ivanka Trump, which is also tasteless — not nearly, not the same level of ...

Both not advisable.

Not advisable, but not as bad as what Roseanne did. The reason for that, then, is that humans were justifiably upset on both sides of that, but it creates a ginning-up effect.

It’s a ginning-up. In fact, it forms this poisonous environment within our public sphere. If you think about it, these people who are doing this are not necessarily looking at it in that moment. They’re playing the long game. “Okay. How can we poison this? How can we make it cloudy, these issues murky?” When they do go out there and do something that’s more economically motivated to impact our democracy, our politics, it sort of has the waters all muddied. That discord that they’re sowing, to use that term, it really has a major impact on how we digest the news.

How we trust. And what we trust.

How we trust. Yeah. There’s some really interesting statistics in terms of trust.

Your business is getting companies to pay attention and to use your technology to do that. Give me an example of ... You were talking when we met yesterday about Harley-Davidson, for example.

Yeah. There was a really interesting situation that happened last week.

Harley-Davidson, background, said they’re not going to make as many motorcycles in the United States. Donald Trump was their best friend and now he’s angry at them and has been tweeting, not a bot tweet, about them, attacking them quite heavily.

Exactly. What we saw was a tweet came out that “quoted” — and I’m putting the quote in quotes — that Harley-Davidson’s CEO basically called the president a moron. All of a sudden what you saw was that just go crazy. It went viral. It started trending. We saw at one point more than 70 percent of that conversation was bots. When you traced it back — and actually, to Twitter’s credit, they’ve removed the initial tweeter from the platform — it wasn’t a real person. Someone created a fake account. They made up this fake quote and then you could see that they created these bot networks that all of a sudden started amplifying this.

If you think about it, I thought that was a real quote when I first saw it. I think almost everybody thought it was a real quote when they first saw it. You’re seeing it over and over again and it starts trending. Then the president starts to respond to those things and all of a sudden this spins out of control. That impacts Harley-Davidson. That impacts international trade policy. It impacts everything around President Trump. There’s a lot of reasons why these actors really want to start playing in this space. If you’re a corporation and you have no idea that that’s what’s happening, you’re at a major disadvantage.

Right. Right. You have Harley-Davidson. Give me another example of another ...

Here’s another example. AMD back in March. The chipmaker. A small security outfit out of the Middle East came out with a blog post basically saying, “Hey, there’s flaws in AMD’s chips,” which, by the way, is not how you expose flaws. There’s a whole process. You go to the company. None of that happened. They did that. What we saw was a bot network start to amplify that to a point where then ...

What does that look like? What does an amplification look like? “Look at here. Look at here.”

Picture it this way. This blog post is basically sitting in a corner of the internet. All of a sudden, a bot network gets spun up and it starts retweeting it and it starts pushing it around to all these different platforms until more people start to see it. It gets more exposure among mainstream news reporters. More stories get put out.

Now, to AMD’s credit, they actually — and folks around them — called it stock manipulation. They said, “Hey, this is manipulating the company, and that’s false,” which is the right way to handle that, but they still take a hit. People did hear, “Hey, there is a flaw with AMD.”

Now the really interesting thing is, we started to create a database of all these bots. Every time we see a bot spin up ...

Where they spin into action.

We actually already had this bot network in our database and we got an alert the split second that it started to occur.

This group of networks, bot networks.

Correct. This one has been going from technology company to technology company taking negative stories and amplifying them to really start weakening the industry, weakening those companies, really hitting the reputations of those brands.

Then you track them and then you can tell the companies this is bullshit.

We tell the companies, there’s situational awareness. There’s offensive measures you can take. There’s defensive measures. Calling it out is always a good thing. You can make decisions in terms of how much you want to expose it.

Sometimes it makes more sense to not say anything. There’s also situations where companies start seeing [something] over and over again, and they freak out and they make this big announcement, “No, this is not true.” Well, sometimes the only reason people heard about that is because of the company’s announcement.

Company made the announcement. Right. Right. So talk about this concept, because here you are, you want to give people this intelligence about what they’re doing, but at the same time you’re saying, it’s pernicious, essentially. It’s like getting a virus. You just can’t get rid of it.

In the old days, there were whisper networks. There’d be whispers about a person, a politician, a company, an issue. Then sometimes it would make it into the mainstream by legitimate news organizations doing investigations or whatever of any of these things.

What’s the difference now, when you’re doing media intelligence, between now and then? Because now you don’t need ... It moves much faster and has a similar impact.

Well, you kind of hit the biggest thing, which is things move so fast now that you don’t necessarily have time for that fact-checking. It’s happening across every platform that exists, basically instantaneously. A coordinated attack can move very quickly. If you’re a company, your job is to react. Your job is to clamp things down. If you don’t know where it’s coming from, you’re at a major disadvantage.

I think when we’re talking about this, sort of in the public now, we’ve been talking about a lot in relation to politics and elections and things like that. We really haven’t been talking about it in the context of, how does this impact corporate America? And that’s what’s really important to do.

Talk a little bit about that.

I think this is one of the biggest things that corporate America needs to be aware of now. Those same forces are really impacting the markets. They are impacting corporations. What’s going to happen when they start making major company decisions based off that? They need to know where it’s coming from. They need to know what the motives are behind it. They need to make sure that they have a strategy to deal with those issues as they start to come in. It’s a completely new landscape.

Let me give you an example like Roseanne Barr. Bob Iger fired her, essentially, after the comment, which most people would assume he would do just for the comment. The question is, was he more affected by the reaction or the right thing to do? Those are the kind of decisions you can make quickly. I’m assuming he would have fired her.

That was a comment that ...

You pretty much get fired for.

You’re probably getting fired.

But it has impact. It has economic impact on the company. It has economic impact on people there. Lesser issues that aren’t so clear cut, like I noticed TBS didn’t fire Samantha Bee. It felt like there was a huge hubbub and maybe there wasn’t. Maybe there was a little one and it was a stupid comment, but they didn’t ...

It gets bigger and bigger.

Right. Right. How do you then explain to corporations that they’re going to have to calm themselves down just like people, right?

Well, one of the things that we really try and do — and, again, you need technology to do this, it’s impossible to do with the naked eye — is really remove the bot networks from the analytics. You say, “Okay, this is what’s bot-generated. This is what isn’t.” Then you also take it a step further. “This is the part of the conversation that was influenced by those bots, too.”

Right, because then humans come in, too.

Right. Once you remove that, then you can start really seeing, “Okay, what is the real problem?” A lot of times they are based off of real problems, but a lot of times they’re not. You need to be able to see all of those things.

Is there less conversation actually going on than we think? Or not? It feels like there isn’t. Like whenever Donald Trump tweets something, it breaks into an enormous roundelay, every issue like that.

In some ways, it’d be great if it was that cut and dried. Is there less conversation or not? Because often what happens is the bots come in, the bot networks, those nefarious actors, and then they cause more conversation. They’re kind of lighting these fires throughout the internet to create more and more conversation. It really just feeds off of itself. If you think about how they’re impacting how I talk about things, how you talk about them, how we all talk about things, how we’re all digesting news and information, it’s really widespread. The fact that so much of this is being manipulated by individuals or groups or foreign states, whatever it is, it’s kind of out of a spy novel.

Right. Right. Right. It was actually a plot on “Homeland.”

Right. It was a plot on “Homeland.”

It was. He took a picture. There was a couple.

Getting amplified. Yes.

They amplified a lot of stuff, which was really interesting, which created decisions that were then bad and then created more mess.

That’s now happening every single day. The first thing that I always urge people to do is just be aware this is happening. You need to be aware of that.

That amplification. Then from a corporate point of view, to be aware that you can’t react, right? That you shouldn’t react until you know where it’s coming from.

Or you should react differently depending on where it’s coming from, how it’s getting to you. If it’s impacting something like your stock price, then you need to bring in other players. Bring in the investor relations team, the risk team, the cyber security team. It really changes the communications landscape for a corporation.

We’re here with Josh Ginsberg. He has a company in San Francisco called Zignal Labs. He’s got a longtime political history when things were easier. Were things easier then?

It didn’t feel like it, but looking back, man, things have changed a lot.

I know, I know. It was easy, right? It was just some annoying local news reporter that got some good tip. Let’s talk about what companies can do, because I do want to ... then I want to get back to politics. These midterm elections coming up. It’s disinformation. It’s manipulation of these networks. It’s just using the networks well for disinformation. Let’s talk about what people, what companies can do. I mean, obviously every company can’t just have a social media presence now. They’ve got to understand the impact of social media on them in order not to be as twitchy, presumably.

There’s a whole strategy. The other thing that we’re seeing, too, is there’s a converging. Before, the social media team worked differently than the communications team, which worked differently than the marketing team. We’re seeing all of those things really start to converge. Whereas they used to be really siloed, they’re all turning into one.

That’s step one, is making sure that you have the internal communication within those different groups and teams. Then number two, when you start to see this start to happen, you need to be able to see, okay, where it’s coming from and what are the motives? Once that gets established, then you’re able to start figuring out what your strategy is. Are you going to say, “Hey, this is manipulation. Hey, this is a bot network that’s coming out.”

There are certain offensive strategies that you can do, too, in terms of how are you pushing out your message? You can map sort of that new landscape of where are the bots, where are the influencers, how are you going to reach them?

You don’t stop them. They just go away. They just fade back. They make a mess and then fade back.

Yeah, but the other thing is, if you tell your message in the right way, or you give your message to the right influencers, you need to know, how’s that going to be amplified by real people, by synthetic ...

You want to amplify it yourself.

How do you use that to your ability, too? There’s a lot of different things that you need to look at from a landscape perspective, as well as just sort of a general strategy perspective as you’re reacting to these things.

When you bring it to a corporation, “Look, this new thing just was bot-related.” What’s the first thing they do? It depends on the situation.

Depends on the situation. Although if it’s an economically motivated situation, that’s pretty relevant. That means that you need to be aware of how this is going to impact stuff like your stock price. If it’s more of a corporate reputation situation ...

Right, which it’s often.

... which often it is or if it’s about your CEO, that’s really important too. So, to be able to go out there and correct the record, to be able to go out there and really make sure your message is being amplified.

Here’s another interesting thing that we see time and time again. And I’ll give you a real-world use case to illustrate this. We found bot networks that get, I’ll use the term “woken up,” based off of certain keywords. So, for example, we found a bot network where every time a major technology company’s name is used in the same headline as the word “vulnerability” —

Ah. Which one?

It’ll wake up. It’ll start to amplify it. And so, if you’re a technology company ...

So, whenever “Facebook” and “fuck-up” happens, it ... go ahead. Whatever. Okay.

Yeah, I mean, hey, we can check that.

“Privacy fuck up.” Sorry, Mark, but it’s true.

I’m just glad to know I can curse on this podcast.

Yes, you may. Please feel free.

So if you know that, if you know that’s part of the landscape, how do you then, No. 1, make sure from a defensive perspective that those headlines aren’t coming out?

Right. Which you can’t.

Which you can’t, but you might be able to sort of figure out the best way to frame a story based off that.
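To make the keyword “wake-up” pattern Ginsberg describes concrete, here’s a minimal sketch of how one might flag it. This is not Zignal’s actual system; the watchlist pair, function names, and thresholds are all invented for illustration. The idea is simply: a burst of posts matching a company-plus-trigger-word pair, with suspiciously low textual diversity, looks like coordinated amplification.

```python
# Illustrative only -- a crude detector for the "wake-up" pattern:
# a bot network activates when a company name and a trigger word
# ("vulnerability") co-occur, then floods near-identical posts.

TRIGGER_PAIRS = [("acme corp", "vulnerability")]  # hypothetical watchlist

def matches_trigger(headline: str) -> bool:
    """True if any (company, keyword) pair co-occurs in the headline."""
    text = headline.lower()
    return any(company in text and keyword in text
               for company, keyword in TRIGGER_PAIRS)

def wake_up_signal(posts: list[dict], min_burst: int = 50,
                   max_unique_ratio: float = 0.2) -> bool:
    """Flag a burst of trigger-matching posts with low textual diversity,
    a rough proxy for copy-paste bot amplification."""
    hits = [p["text"] for p in posts if matches_trigger(p["text"])]
    if len(hits) < min_burst:
        return False
    unique_ratio = len(set(hits)) / len(hits)  # bots tend to repeat text
    return unique_ratio <= max_unique_ratio
```

A real system would of course look at account-level signals (posting cadence, account age, follower graphs) rather than text repetition alone.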


Or from an offensive perspective, how do you use that in a positive way? You know, that’s a less sophisticated bot network. Now, what’s scary are the more sophisticated bot networks. So, let me give you an example of one of those.

We found bot networks using what we call a three-wave strategy.

Okay. Three waves? Geez. It’s like the ocean. I like it ... you’re getting paid the big bucks, you should have theories.

We came up with that. It’s a new world. We can come up with new frames.

I am bringing you the third wave, Harley Davidson CEO.

We’ll call one a Kara Swisher strategy. We’ll figure one out.

Okay, all right. Okay, yeah.

So, in a three-wave strategy, there are basically three bot attacks that happen over usually a two- to four-week period. In the first two waves, the bot networks are testing messages: What’s resonating with the different groups that they wanna get that out to? And so they’re just doing little tests, and you can kinda see how that happens through our technology.

The third wave is where they’re using what they learned to really make an impact, to pack a really big punch. That’s sort of the scary time. So, if we can show a company, “Hey ...”

“They’re preparing.”

Now we can prepare for that. We can see what’s starting to happen. Also, how can you participate in those waves, too? There’s all sorts of offensive and defensive ...

What does participate mean? Ride the wave, essentially?

Well, you can start to see ... okay, well, people are reacting when they push out this type of message to you. How are you gonna respond to that in the moment? How can you blunt that at that time so you’re saving yourself when it gets to that third wave? So there’s a number of patterns like that that we start to see.

That you can see because the third wave is to sink you, essentially.

Yeah, and you have an alerting function ... we have alerting functionality that can tell you, “Hey, this is ... you’ve got a three-wave coming at you. Be prepared.”
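As a rough illustration of the kind of alerting described here, the sketch below detects distinct spikes (“waves”) in daily bot-driven mention counts and raises an alert once two waves land inside the two-to-four-week window, the cue that a bigger third wave may be coming. The spike factor, smoothing constants, and window are invented assumptions, not Zignal’s real parameters.

```python
# Hypothetical three-wave alerting sketch: spot spikes in daily
# bot-mention counts and warn once two waves fall inside the window.

def detect_waves(daily_bot_mentions: list[int],
                 spike_factor: float = 3.0) -> list[int]:
    """Return day indices where activity spikes above a running baseline."""
    waves, baseline = [], None
    for day, count in enumerate(daily_bot_mentions):
        if baseline is None:
            baseline = max(count, 1)  # seed baseline from the first day
            continue
        if count >= spike_factor * baseline:
            waves.append(day)
        # exponential moving average keeps the baseline adaptive
        baseline = 0.8 * baseline + 0.2 * max(count, 1)
    return waves

def third_wave_alert(daily_bot_mentions: list[int],
                     window_days: int = 28) -> bool:
    """Alert when two waves fall within the window: brace for the third."""
    waves = detect_waves(daily_bot_mentions)
    return any(b - a <= window_days for a, b in zip(waves, waves[1:]))
```

For example, a quiet baseline of ~10 mentions a day punctuated by spikes of 100-plus on two days a week apart would trip the alert.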

Yeah, and it’s gonna be bad. So, how do corporations react now when you show them? Like, a lot of technology companies get hit here. You’ve got a lot of others. And you can see it. It was interesting. I think you were looking at the Microsoft one, the ICE issue. The employees were concerned. It was a real thing.

They were concerned. It was a legitimate issue that they had, but what we saw was — there was a concern, it came from sort of a real person that tweeted out, “Hey look. This is what’s happening. Microsoft is working with ICE.” But then, during that conversation, 52 percent of all conversations within that were bots. They were being amplified by bots. So if you’re Microsoft, does that change the way you respond to things, knowing that the level of intensity probably felt a little bit more because the bots were amplifying that?
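The 52 percent figure is just a share-of-conversation measurement. Assuming you already have a set of accounts flagged as bots (by whatever classifier), computing that share is simple; the sketch below is illustrative, with invented field and account names, not Zignal’s actual pipeline.

```python
# Minimal bot-share calculation: what percentage of posts in a
# conversation came from accounts already flagged as bots.

def bot_share(posts: list[dict], bot_accounts: set[str]) -> float:
    """Percentage of posts in the conversation authored by flagged bots."""
    if not posts:
        return 0.0
    bot_posts = sum(1 for p in posts if p["author"] in bot_accounts)
    return 100.0 * bot_posts / len(posts)
```

The hard part, of course, is building `bot_accounts` in the first place; the arithmetic on top of it is trivial.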

Right, right.

And so, you change your strategy in terms of ...

Right. So you don’t get quite as intense about the thing as the bots.

But then real people see them. I think that’s the fascinating part.

All right, just finishing up, I wanna talk about what happens in the next elections and everything. You know, you’re a political guy, so do you still do political clients or not?

We have some political customers. It’s not too many.

Right. Corporations are just waking up to what politics has been under siege by, but it seems like, just from reading recent reports, the government’s not ready for the next thing, and the companies just met. The election is in a few months. What do you imagine the next ... what has to happen next? You’re not gonna stop these bot networks, presumably. Is there a way to stop them?

It is a challenge.

You’re chronicling them, but they could move away and create new ones all the time.

Well, that’s the thing. This is constantly evolving. I mean, it’s on us to make sure we’re staying one step ahead of all those so that we can say, “Okay, what’s the next iteration of the three-wave strategy?” You know, that’s really important ...

Well, four waves, but go ahead.

Yeah, four. Get ready for five.

A tsunami. A tsunami.

A tsunami strategy. So, but, I think people do need to be made aware, in this election cycle, this is happening. We’re seeing it right now. Every single primary, this is happening. All of these different messages are getting pushed around, and again, it’s happening on both sides of the aisle. It’s not like, “Hey, the Republicans are doing this. The Democrats are doing that.” Oftentimes, it’s probably outsiders who are also jumping in there to sow discord. When we see major issues spike up ... you know, immigration was a really good example of this.

A good example, yeah.

We saw that a huge percentage of that conversation was malicious actors just trying to spin people up. And again, across all platforms. And the way that the media writes about that is also impacted. Of course it is.

Right, because they see it and they think it’s a bigger deal. I was just dealing with someone ... we were doing a Tim Cook interview and they wanted to tweet something. And I said, “Don’t tweet it. You’re gonna get a lot of bullshit tweets back at you, and then you’ll think that’s the issue.” And there were dozens and dozens on one issue, and I was like, “This is bot-related. It has to be. There were just too many people responding to the same thing.”

And it was really ... but they were like, “Well, we gotta ask about that.” I’m like, “No we don’t. Like, it’s not an issue. You’re now being pushed around by I’m sure a Russian somewhere in St. Petersburg or something like that.”

Yeah. No, let me give you an example of that. So, Bloomberg did a story about Nestlé last September, and it was about how they were fined ... it was a water rights issue, effectively. Maybe not necessarily the sexiest issue, but they were going to places where they could get water very inexpensively or for free and bottling it, and they were selling it for billions. That’s what the story was about. And I think the folks at Bloomberg probably thought they had a hell of a story on their hands because it kept getting talked about.

And when we looked into this, time and time again ... this is an example where this story comes up at least monthly, and it just spikes in huge amounts. It just starts trending. It’s all bot networks. It’s all these nefarious actors that are trying to push across an environmental message. So that can have those types of impacts.

Right, but they keep going on it. That it keeps living. Then companies think it’s still a thing and so does the media, so we’ll just react to it.

So the same thing is happening in our politics, too. For sure.

Right, right. So are you worried about these elections? These next elections?

I think everybody should be worried about this. I mean, this is ... and I think the first thing is just people need to be aware that this is occurring. Now, it’s not too late. There’s things that we can do about it. I think it’s important to have these conversations out in the open. It’s important for media outlets to be aware this is happening. It’s important for politicians, for elected officials. And that’s what’s going to get us to a point where we can start blunting some of these issues.

What about the responsibilities of platforms in this? Which are the most manipulable? You use Twitter a lot. Obviously, Twitter is the cesspool that has become a media organization, whatever it is. I call it ... you don’t have to call it a cesspool.

I would say one of the major issues is that this is not a platform-specific issue because if it was, then you could just say, “Well, this platform needs to fix it.”

“Clean it up.”

And a lot of these platforms are genuinely taking legitimate steps to try and clean it up.

Such as?

Well, you’ve seen both Facebook and Twitter ...

Knock people off.

Yeah, they’re removing accounts that are fake and things like that. But when it’s happening across networks ... you know, the Senate Intelligence Committee just went to Tumblr and said, “Hey, you were manipulated in the 2016 election. What are you doing for the 2018 election?” And, by the way, once people saw it on Tumblr, then it went to Reddit, then it went to Facebook and Twitter, then it went into mainstream news. So, it’s across the board.

Across the board on all the platforms.

So I think it’s important that all the platforms are working to clean it up. And I think they are taking ...

And share information.

Yeah, and I think they are taking genuine steps in order to do so, but those activities of course need to continue. And I think they are and I think us, as consumers, us as readers of this media just need to have that level of awareness.

It’s interesting that people don’t get quite as bothered by these bots when it’s, say, a Samantha Bee or Roseanne Barr, or politics. But when it comes to corporations, wait a second, this could affect us! So, what would the head of Harley Davidson do in that instance? What did he do?

Harley Davidson, actually, I think did a good job. No. 1, I think they contacted the platforms. They had those users removed that were talking about this. And then they issued a statement saying, “Hey, this is not real.” Now, that’s maybe a little bit of an easier example because so many people were paying attention to it. But what happens if it’s a little bit smaller? What happens if a little bit happens every day and it’s kinda this low-grade fever, not necessarily enough for it really to get on the company’s radar.

“Did you hear someone got sick at a Disney park?” Or the stuff around ... there was a story ... the conspiracy theories around phones recording you and then sending you ads, which they proved today is not so. But people persist. I think that’s actually just crazy people. But that could be just conspiracy theorists.

But then they get amplified, and then real people see it.

But then it would affect these businesses. And it’s like, look, I’d like to ding Facebook and Google all day long but not for things they don’t do.

I agree.

Do you know what I mean? Like, let’s focus on the shitty things they do versus the shitty things they don’t do.

But I think that is sort of the situation that we’re in nowadays. What’s true? What’s not true? It’s really difficult to tell. And you made a point earlier that I think was really good. One of the largest things that we’ve seen over the past couple years is trust in institutions has plummeted. Study after study shows that. The two places where trust in institutions has sunk the most is, No. 1, media; No. 2, corporations. So, what’s true and what’s not? That’s really difficult to discern in this media environment.

And what does the impact of Trump behaving this way have on it?

I mean, it’s probably not helping.

No, it’s not helping.

And the other thing is, people are using these political issues as vehicles to hit corporations, because what they look at is, all right, let’s look at the culture. Let’s look at corporations. What can we tie together that can get people spun up the most? And, in this political environment, it’s political issues.

Yeah. That’s a really good one. Then using these tools, these technology tools that ... do you see any upcoming ones that are gonna be dangerous? VR, suddenly?

Yeah, I’ll tell you what I’m probably the most nervous about is this issue with deep fakes. So what happens with ...

Oh, what’s a deep fake?

So, that’s when you — and I’m sure your listeners are gonna hear my explanation and be like, “Man, you got like 15 percent of that.” But I’ll do my best.

So that’s an example where you can take any individual, and you get enough video, you get enough audio, you get enough visual ...

Ah, you deep fake.

So then all of a sudden, I can show a CEO — and I’ll do a PG version of a deep fake — I can make it look like a CEO went and robbed a bank. And that video can look extremely genuine to the naked eye. You start amplifying that, and all of a sudden you get stories about a CEO robbing a bank.

Now, let’s take that a step further. What if it’s a foreign leader, and you have video showing them saying that they’re gonna do something to the United States? What happens if it’s ... you know, you can sort of go down the line to the politicians and everything else. So you combine ...

This is just like an episode of “Homeland,” but go ahead. Yeah.

There you go. Yeah. Maybe I should just see how fake ...

Watch “Homeland.” I have a lot of things. There’s actually a movie you should watch called “Minority Report” that has a lot of ideas like this in it. I try to avoid Tom Cruise movies but, nonetheless, this one is full of ideas like that.

That’s a great movie.

Conceptual ideas. All right, so, if you’re a corporation, you need to be paying attention to this. You need to understand. And if you’re a user, what? Just turn off Twitter or what?

No, I mean, look, I think Twitter, Facebook, all these platforms serve a really great purpose in getting news and information. But as you’re digesting that news and information, you just need to be aware, you need to be extra vigilant — what’s real, what’s not, do research. Make sure you’re reading from trusted publications. You know, everyone’s got that crazy uncle that likes to throw out things and memes.

I know, it’s always the crazy over there. Like, the whole world is crazy. Let me just end by saying we looked at my name, and apparently I don’t have many bots but lots of people are pissed at me.

So, Kara Swisher is just being attacked by regular people.

Kara is very clean on the bots. Regular people might be a little bit pissed off.

Yeah, so I create all kinds of problems, but just ... it’s all real people.

It’s a badge of honor.

Thank you so much. But bots, please don’t come after me because I would ignore you, too.

Yeah, forget this. This is not a trigger.

This is a not a trigger. But thank you so much, Josh. This has been really fascinating. Again, it’s Josh Ginsberg from Zignal Labs. Their product is called ... what is it? Bot intelligence or?

Yeah, our Bot Intelligence Platform.

Right, which I think a lot of companies will be availing themselves of. There’s gonna be lots of business in this area.

I think so.

As we move forward, because people have to know what’s real and what’s not. Or maybe... Maybe not. We’ll never know. Anyway, by the way, it’s all a simulation anyway, Josh, according to most internet executives, so it doesn’t matter.

Yeah, we’re in an episode of “Westworld” right now.

Yeah, exactly. All right, it was great talking to you. Thanks for coming on the show.
