We Are At War: Disinformation, CBRN, and Emergency Management

(Photo Source: Morton devonshire, Wikimedia Commons, 2007).

Yes, yes, I know I have failed to post anything new in a while – from podcasts to the well-received series on detection instruments, I am far behind in my work. I have lots of excuses – from moving halfway around the world in the middle of a pandemic to trying to get three (yes, three) different books completed – two for my “day job” and one for y’all – the second of Rexmond Cochrane’s studies. But I’m going to go with the best excuse of all – 2020, y’all. Which is the topic of this post.

I promise this next year will see more activity here and in other endeavors. I just wanted to put up a brief post today to let everyone know, first, that yes, I am still alive, and second, to talk about something in my “day job” that is an essential topic for the CBRN and HazMat communities alike. It came up in writing my latest books dealing with emergency management of hazardous materials transportation accidents, but it has a broader application. I’m talking about weaponized disinformation, and before you jump to conclusions about what I’m going to say on that fraught topic – bear with me for a minute. I promise you three things: first, politics will be discussed, though as impartially as I can manage; second, I’ll tie the topic of disinformation to CBRN/HazMat; and third, I’ll offer potential solutions.

2020 sucks. We’ve seen it across numerous recent events – back to back and simultaneous hurricanes, epic wildfires, Covid-19, protests, counter-protests, unimaginable (a few years ago) political dysfunction, freaking murder hornets, trade wars, real wars, you name it, this year has offered a smorgasbord of horror for emergency management and first responders across the United States and around the world. We’ve also seen, through all of these events and dysfunctions, the influence social media plays in disinforming the public and aiding in the spread of “disaster rumors.” That is what I am going to talk about.

Disaster rumors (a form of misinformation) are an ancient problem. After Pompeii and Herculaneum were destroyed by Mt. Vesuvius, I am sure all sorts of rumors floated around the Roman Empire as to the reasons and causes, from the gods’ capriciousness to “Do you know what the homosexuals are doing to the soil?” (sorry, couldn’t resist a Dead Milkmen reference). Hell, go back a little further and read the Iliad. Homer was coming up with all sorts of wild ideas about the gods to explain the outcome of a military campaign. Conspiracy theories have a very, very long history, and many of the ones we see today are just updated versions of the old ones.

We’ve seen misinformation/disinformation more recently, pre-social media. In 2003 and 2005, disaster rumors pervaded the Space Shuttle Columbia disaster and Hurricane Katrina, respectively. In East Texas, rumors spread by word of mouth about space monkeys running loose in the Sam Houston National Forest, divers finding skeletons with cement overshoes in the Toledo Bend reservoir, and debris searchers finding vast fields of pot in the woods and some meth lab cook sites (okay, the last one was partially true). In Katrina, the infamous “dead bodies in the Superdome” rumor continues to live on, in part because of Mayor Ray Nagin’s habit of repeating disaster rumors in front of cameras without any verification. That disaster saw numerous events ascribed to conspiracy better explained by incompetence, which was on public display for anyone paying attention.

I cannot imagine either event today with social media. Still, we can see what is happening with the dueling narratives about who “set the wildfires” in the western US and the mysterious arrival of heavily armed white nationalists and racists expecting the ever nebulous “Antifa” to arrive at any minute and begin tearing down a statue only to find…nothing but a few shocked and outraged tourists. We see worse in the interplay of armed protestors and counter-protestors, resulting in gun violence and death in Portland, Austin, and Kenosha. The expansion of the vaccine denial movement into COVID-19 denial and anti-mask protests and all of the Covidiot nonsense anyone can stomach is the most evident symptom of the national disease in the US, except it isn’t just national – this is an international issue, being fought on a local level. Much of the current problem isn’t misinformation or rumor as in the past, either. Instead, it is targeted, deliberate disinformation churned out to significant effect for a purpose. It is not accidental that many of the anti-mask protestors include far-right groups.

This problem takes different forms and uses different means, but it is happening worldwide. Disinformation is causing violence – from murders in India caused by the spread of rumors on WhatsApp to the shootings in Kenosha and Austin. We see a similar trend, a morphing of the self-radicalized “lone wolf” terrorist into mob-violence fueled by disinformation. Worse, it is equal opportunity disinformation. There is as much disinformation among the far left as the far right. In some places, disinformation is racially motivated. In other places, it is political. In some countries, it is about long-simmering conflicts – Pakistan and India, for example, regularly weaponize social media against one another.[i] Pakistan, of course, is also home to the most violent anti-vaccination campaign in the world, in part because of unneeded revelations about how the CIA identified Osama Bin Laden, revelations that cost one man his freedom and countless others their lives. That particular revelation now forms the basis for an entire online conspiracy industry.[ii]

From anti-vaxxers to space monkeys and QAnon, emergency management faces a growing, tremendous challenge in dealing with the flood of disinformation on social media that seems to crop up with each new emergency. As warnings issued during the recent wildfires attest, disinformation is severely complicating disaster and emergency response.

There are several reasons for the sharp increase in misinformation and disinformation (fear not, however; I will suggest solutions to the problem, so read on):

First, increased social media use includes a broader segment of the world population, including portions of populations that were less likely to consume “hard news” before coming to social media and among whom media literacy is minimal at best. Hell, in many places and parts of this population, literacy itself is minimal at best. For many of these newest arrivals to social media, the differences between the Duffel Blog, the Onion, or the MidEast Beast and the New York Times are not readily apparent. A quick browse through the outraged comments beneath articles from any of the satirical news sources and those beneath articles criticizing the New York Times reveals surprising overlap.

But a lack of media literacy is only the low-hanging fruit in this equation. This population segment, upon which disinformers prey, is the same one long targeted by scam artists and Nigerian princes looking to transfer millions to your bank account. These are vulnerable people. They represent only a small segment of the problem.

The second factor driving the uptick in misinformation/disinformation is increased social media use due to the COVID-19 pandemic, coinciding with the death of local media. Lockdowns and work-from-home schemes have dramatically increased the entire world’s consumption of social media. People are looking at their phones more now than ever (hard to believe it is possible, but, well…2020 is like that). Exposure alone accounts for an uptick in disinformation – more eyeballs and more consumers mean people are more likely to see disinformation and share it online, especially if it confirms preconceived biases or judgments and targets an emotional response. That worked just as well in 2019 as it did in 2020. It is just reaching more people likely to fall for it than in previous years.

Further, the death of local media, especially newspapers, means that the primary source of local news in most communities is now social media, which makes rumor control more difficult and rumors more likely to spread. There are few trusted or official “local sources” of information left, and the majority of news consumed online is national news. It is likely some fringe Facebook group in your community has more followers than your Office of Emergency Management Twitter account.

The third factor is a manifestation of a problem I’ve long written about here on CBRNPro.net – worried well. From the Tokyo Subway Sarin Attack to Amerithrax to COVID-19, there are large numbers of people likely to seek reassurance and comfort in an emergency that they have trouble understanding and which generates significant fear (CBRN, pandemic, etc.). The way they go about it morphs and changes, but it manifests the same underlying fears. Not panic, incidentally (that seldom happens in disaster). To the sufferer, this fear is entirely rational, so they seek expert advice. In Tokyo, people flooded hospitals. In Goiânia, Brazil, they filled a soccer stadium waiting to be scanned by radiation detectors. During Amerithrax, they called their local fire department to “screen their mail.” These are not panicked responses. They are a rational exercise in response to irrational fears brought on by a lack of knowledge/information or distrust of official sources of information.

It is that last part (distrust of official sources of information) that poses one of the most significant challenges to emergency management in any emergency, worried well or not. I’ve written about that before regarding the Amerithrax event. Still, the basic premise is always the same – speak with one voice – the greatest counter to disinformation is unity of message. Given the current disconnects at the federal, state, and local levels regarding the COVID-19 response, worries about the safety of an “October surprise” vaccine, and increased political polarization, the perfect storm exists right now. Everyone in the world is worried well because of COVID-19, fewer people have access to local media, and more distrust official sources unless those sources confirm previously held biases.

We can see this in the way different pandemic responses played out around the world. The most effective responses to COVID-19 came, first, from countries with few, if any, land borders (Korea, Japan, Australia, New Zealand, Taiwan) and, second, from smaller countries where the populace generally complied with government guidance regarding lockdowns, mask-wearing, and social distancing, and where that guidance was trusted and delivered consistently from the beginning. The worst responses generally occurred in larger, more populous countries (China, India, US), in countries with the worst and most chaotic messaging (Brazil, US, Mexico), or where leaders echoed disinformation (haunting us like Ray Nagin’s ghost, except he’s still alive, though under house arrest).

The fourth and final factor driving increased disinformation in disaster and emergency response is weaponization. This factor, while the most nebulous to quantify, accounts for the most worrying part of disinformation in disaster and emergency management. Way back in 2015 (it seems a century ago), there were already signs of this, driven by Russian troll farms. December 2016 brought the first known Q-related attack: a lone wolf, self-radicalized on conspiracy theories, attacked the Comet Ping Pong pizzeria in Washington, DC, believing it to house a vast child abuse network run by Hillary Clinton. The QAnon movement continues to grow worldwide, even as its claims become increasingly absurd, leading the FBI to label it a threat.

But the problem goes back further – in the early 2010s, disinformation first emerged around fake weather posts related to Hurricane Sandy. It is unclear who was behind that, though lone actors likely engaged in most such activity – similar to the rise of “swatting,” which arose around the same time, proved briefly popular among rival gamers, and ultimately led to a murder. It is only more recently that we’ve seen such lone-wolf “troll” tactics weaponized by state actors and non-state extremists like neo-Nazi organizations in Europe and the United States. These take the form of sophisticated online and real-world efforts to cause violence, create division, and increase distrust in government.

It was in 2014 that the Russians first weaponized social media to attack the United States, not by targeting elections (though they did that too, in the US and elsewhere, through both social media and traditional espionage and bribery). They did it with a fake Ebola outbreak in Atlanta and a fake chemical plant explosion in Louisiana. These two events can now be seen as trial runs for everything we see in 2020. Both were traced back to the same St. Petersburg, Russia front organization, made infamous during the impeachment hearings of Donald Trump – the Internet Research Agency.

Analyzing those first two attacks on the United States demonstrates two critical factors for emergency management going forward – they both leveraged fears about CBRN related events playing off the worried well problem. Second, they were highly sophisticated to the point of setting up fake news sites and creating fake videos. Those trials went mostly unnoticed (they were both relatively localized) and were quickly stamped out by local emergency management, though not without achieving some results. Fast forward to 2020, and it isn’t hard to see how similar methods are part and parcel of nearly every disinformation campaign currently afflicting the United States.

I want to be clear - the Russians are not behind every piece of disinformation on the internet, but they are more than willing to create it and amplify what may already be out there if it creates division, erodes trust in government, and causes trouble in any Western nation or wherever else they perceive their “enemies” to be – which is pretty much everywhere. The Chinese aren’t averse to playing this game either, though evidence suggests it isn’t on the same level. Traditionally, the Chinese valued stability – internal and international - and preferred to steal information rather than spread it. That changed in the last year. But China and Russia aren’t the only ones doing battle online. More than a few countries, like India and Pakistan, have their own campaigns of disinformation, both state-sponsored and independent. Non-state actors have played this game for a long time. Al-Qaeda was spreading disinformation in the Middle East in chat rooms and internet cafes when everyone’s first friend online was Tom at MySpace.

The newest and most worrisome manifestation of this problem involves using social media to direct armed groups of radical individuals to locations likely to be the scene of conflict or protest. Armed people in large gatherings of angry persons are always a recipe for disaster – something, no doubt, the Russians are banking on. Of course, that sort of thing feeds on efforts by lesser trolls of a more radical variety. Neo-Nazi groups and far-right anti-government extremists have long sought to spark a racial or political civil war, going back to at least the 1950s (forerunners of anti-government and neo-Nazi groups, like the KKK and the America First crowd of the 1930s, trace their origins to the 19th century). Timothy McVeigh’s deadly cosplay of The Turner Diaries in Oklahoma City drew on a deep reservoir of such ideas. There are, no doubt, many a tattooed skinhead tweeting anything that fits their notion of achieving racial division – from “Antifa” memes to fake videos of BLM protestors. Some of these are self-generated; others flow in from St. Petersburg – either way, they spread.

Further, there is ample evidence of Russian support and funding for far-right movements throughout Europe and the United States dating back some years. This includes the FPO in Austria, the AfD in Germany, UKIP in the UK, Le Pen’s National Rally, and…well…whatever you want to believe about what happened in 2016.[iii] The common thread among all of them: they benefit from and amplify the same anti-immigrant/racist messages, and they love Russia and Putin (and never say a bad word about him). There is mounting evidence of Russian infiltration and clandestine action within such groups, and of deep financial dependence on Russian backers as well.

That all said, it isn’t just racists gaining Russian support. There is strong evidence that those on the far left get fed an equivalent dose of Russian disinfo, with Bernie Sanders and his supporters being particular targets in the US. Other efforts target the radical left in Europe. The Russians are equal opportunists at promoting division wherever they can find it. For every “stop the radical Antifa” meme online, there is a corresponding “stop the fascists” meme. Many come from the same source. Still, attempts at right-wing agitation appear more successful in achieving electoral success and street violence (so far, anyway).[iv] In part, this may be due to a unity of message on the far right that combines a sense of economic and social frustration with racial appeals and nationalism – a combination that seems to work in multiple countries and has a long and ugly history. Even in places with a long history of leftist agitation, left-wing radicals tend to fragment more and lack a unifying message, or struggle to find a wider audience – perhaps in part due to the left’s traditional emphasis on ideological purity, a central problem even in the Cold War, when communists often spent more time denouncing each other than capitalists.

Whatever the reason, disinformation seems to target all radical movements but, at least currently, works better on the right than the left at achieving violent action and electoral success. A few studies suggest a psychological effect: those with greater authoritarian leanings appear more likely to support lying by leaders. While authoritarianism isn’t bound by ideology, left-wing populist authoritarianism like that practiced by Evo Morales or Hugo Chavez has generally declined in popularity over the last decade even as the right-wing variety grew and took its place. Perhaps it goes in waves?

The common thread from the beginning, though, is how disinformation plays on emotion and fear – and therefore, it is no surprise that the initial test runs in Atlanta and Louisiana involved some aspect of CBRN/HazMat. Since then, those efforts have developed a quicker reaction time and shifted from creating events to capitalizing on those already occurring. Disinformation is being pumped into any disaster, any emergency, any political event, in real time: COVID-19, BLM, wildfires, hurricanes, and the 2020 election cycle. Hence the sudden “meet-ups” of armed counter-protesters to “protect” statues or other businesses/buildings during “Antifa” protests. The goal is to incite violence between protestors, which has achieved limited success. If it continues, it will likely result in a mass shooting sooner rather than later – which is probably the point.

What does it all mean for emergency management and responders? To be blunt – you are at war, and you are on your own. It is an unacknowledged war in which the problem is known, and the national responses around the free world are hopelessly incompetent. While pressure builds nationally and internationally for social media giants Facebook and Twitter to police their services better, they are outmanned and outgunned. Plus, disinformation peddlers find other ways of reaching the same audiences: through direct messaging apps like WhatsApp and Viber, or via message boards like radical offshoots of 4chan and Reddit. Others create mini media empires, like Alex Jones and InfoWars, and his many would-be imitators.

Part of the complication facing the US response is that competing bureaucracies in the US federal government hold opposing viewpoints regarding the response to these attacks. Politics plays a role here too – both parties are loath to admit any benefit they may receive from disinformation while gleefully highlighting that of their opponents. Like all else, this is now a partisan issue. That is one goal of the disinformation campaigns being waged – make everything an argument. Free speech and the First Amendment, while essential to Western and US democracy, are the very means by which we are under attack. This irony doubtless brings joy to our quasi-totalitarian attackers, whose own governments struggled against Western information campaigns in the Cold War.

What can you do? For starters, you have to emphasize public information more than anything you’ve ever done in the past. If a significant event happens in your community, be it a train derailment or a protest, state and non-state actors are actively scanning media reports around the world for a mere mention of it, ready to strike. They will act before you do. That makes your messaging and rumor control operation essential. You have to implement an information operation early and get the media to help. Traditional media were often seen as the enemy by many response and emergency management organizations. Now, you genuinely need them on your side. Unfortunately, in many cases, there are a lot fewer of them than there used to be. Still, those that do exist have greater reach than your organization can achieve.

You also need to go on the offensive. In the past, DHS and FEMA generally pushed guidance regarding reactive rumor control – like the creation of “rumor control” pages where officials could quash rumors. As the COVID-19 response shows across the United States, that’s a losing proposition. The disinformation peddlers are going to win that fight every day. Nor can you adopt a criminal investigative approach to the problem alone – deterrence, in this case, is only marginally useful. You might get to prosecute a teenager tweeting from his mom’s basement or a neo-nazi in his bunker years after the event, but not the warehouse full of FIS trolls in St. Petersburg or the People’s Liberation Army Unit 61398 in Shanghai. Unless and until the United States federal government decides to take more direct action, foreign actors remain largely untouchable. That means that when foreign intelligence and military units target your town for disinformation, you will have to take them on alone. That seems like a daunting task, but you do have a few weapons.

First, unity of message – first, last, and always, this is your best defense in all responses and the heart of information management: speak with one voice, communicate the same message, and keep doing it. You can drown out more noise than you think just by practicing that simple dictum.

Second, get traditional media on your side – when a rumor pops up, communicate it to them, give them the facts, and then get them to quash it for you – they may not be as trusted as they were in the past. However, they still have reach and can generate doubt about misinformation and disinformation in all but the most conspiratorially minded.

Whatever you do, never, ever, ever, answer any media question or respond to social media or post anything to social media unless you are absolutely sure about the facts. If you don’t know the answer, say so and promise to find out, then find out and communicate your findings. Any hint of obfuscation, denial, or sign of inaccuracy will be seized on by disinformation operators. It is okay not to have an answer. It is always wrong to make one up or cover up something – you will fail and lose trust – the most valuable commodity you have.

Third, local responders and emergency management have more trust than they recognize. People know you, often personally, especially in smaller communities. As with incumbent politicians, emergency management gets the benefit of the doubt: “Vote the bastards out of Congress, except for my congressman, I like him.” Similar sentiments apply to first responders, generally speaking. Most emergency management and first responders have a lot of public goodwill, especially fire, EMS, and medical personnel. Times are changing, though. Police departments and public health officials have a growing problem in this regard, especially in certain communities, and may not make the best face for your operation, depending on circumstances, unless they have managed to regain public trust by addressing longstanding concerns about inequitable treatment, mixed messaging, or political interference. Once lost, trust takes a long time to regain.

Fourth, leverage the signs and symbols of your authority. Uniforms are symbols of authority. Wear the best one you have. There is a reason politicians like to have police, fire, and military personnel stand in the background when they speak. It projects authority. People are generally inclined to respond to such symbols. Try this experiment if you don’t believe me. Go to where a gathering of people is milling around waiting for something to happen – say, a conference at a hotel by the sign-in desk, or a spot where tourists gather. Bring a clipboard and a walkie-talkie. Wear a generic name badge and dress nicely (no uniform). Walk up and introduce yourself: “Hi, I’m John Smith. If you’d please follow me.” Walk them somewhere nearby. They will follow you. Stop, turn, and tell them, “Someone will be along in a minute to take it from here. Please wait while I go get them.” Then leave. Just three symbols – a clipboard, a walkie-talkie, and a name badge – can get almost any group to follow you without much question. Often a single one of those symbols will suffice. We are conditioned for such behavior from a very young age at school.

Uniforms, symbols, responders in the background, and buildings project authority. Use them. Don’t give sidewalk press conferences if you have an imposing backdrop in the lobby – image matters in this game. Disinformation operators utilize similar methods online, creating fake websites and news pages or co-opting your symbols with official-looking seals, fake documents, or even pictures or videos of people in uniforms. You have the advantage over them: your symbols are the real deal. Fight them on that ground and protect your symbols – anytime an imposter crops up, be quick to debunk them, loudly, and, if possible, use humor to point out the things they get wrong. Hide verification marks in your seals and symbols that make it easier to spot and point out fakes. Research and use authenticated photo and video tools – software that embeds a code into official photos and videos to make verification, and identification of altered media, easier.
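One low-tech way to approach that last point is to publish an authentication tag alongside every official photo or video, so anyone who grabs a circulating copy can check whether it has been altered. The sketch below is illustrative only: the `SIGNING_KEY` and function names are hypothetical, and it uses Python’s standard `hmac`/`hashlib` modules. A real deployment would use public-key signatures or a content-credential standard so verification doesn’t require sharing a secret.

```python
import hashlib
import hmac

# Hypothetical agency signing key. In practice this would live in a
# hardware security module or key-management service, never in source code.
SIGNING_KEY = b"example-only-not-a-real-key"

def sign_media(data: bytes) -> str:
    """Produce a publishable authentication tag for an official photo/video."""
    digest = hashlib.sha256(data).digest()
    return hmac.new(SIGNING_KEY, digest, hashlib.sha256).hexdigest()

def verify_media(data: bytes, tag: str) -> bool:
    """Check whether a circulating file matches the tag the agency published."""
    return hmac.compare_digest(sign_media(data), tag)

original = b"...official press-photo bytes..."
tag = sign_media(original)

assert verify_media(original, tag)             # an authentic copy checks out
assert not verify_media(original + b"x", tag)  # any alteration breaks the tag
```

The design point is the one made above: the agency holds the real symbols (here, the key), so imposters can fabricate a convincing-looking file but never a tag that verifies, which makes debunking fast and public.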

Fifth, use humor and emotional appeals. The ridiculous nature of much disinformation invites ridicule - from chemtrails to reptilian overlords. This is a fine line, however – using humor to degrade the authority of an imposter or a troll is powerful, but never ridicule or degrade anyone who might have fallen for them. You want them to come to your side while they disown the discredited source. Have people on your staff who can manage this. One of the most powerful uses of social media is a savvy user who can leverage humor. Find one. Hire a comic. Whatever it takes, make it work for you. Humor is one of your most powerful weapons.

Use emotional appeals – disinformation operators use them to garner a response. You can do the same through empathy and genuine emotion. Never put up a video of someone in a command post that reads statements like an animatronic robot, even if they are the ones in charge. Emotional intelligence is essential in communication. This is why politicians sometimes make better spokesmen in emergencies than professional emergency responders or managers (at least, if they can stick to the unity of message). Most politicians are better at reaching people. They do it for a living.

Sixth, build social media messaging and disinformation control into every exercise, drill, and day-to-day operation you perform. You may need to find funding and personnel to make it happen, but find them. You are at war, and you will lose without the training, practice, and tools to fight it. Get that mayor or county judge to stand up and give a press conference in an exercise, with actual local media prepped to challenge them using disinformation and rumors. Get local media to participate in your exercises (not just cover them); give them some inside access to entice them if needed. Get your social media team to practice against a live target – create a social media red team to spin up false narratives in an exercise and have your team counter them. Get your incident commanders and EOCs to play in the same social media exercise – let them see the effects of disinformation through injects that directly complicate their response.

Seventh, prepare for politicization. The goal of both state-level disinformers and non-state extremists is to create and exacerbate divisions. Don’t fall for that game. As an emergency manager, you may face pressure from political leaders to fall into line if they perceive political benefit from some piece of disinformation. Some extremist groups may have supporters in your own ranks. That is a profound challenge, one of the most complex related to COVID-19 and the recent protests. The best way to deal with it is to create conditions that mitigate it, rather than respond to it. Responding to it will likely create disunity of message, as happened with COVID-19, or create images like police officers offering support to armed right-wing counter-protestors. The more aware everyone is of the challenge and the problem, the less likely they are to give the perception of partiality or fall for the bait offered by some guy in a Russian office trolling you while eating borscht on his lunch break.

The political dimension is the seam between response and politics that sophisticated disinformation operations carefully target in an attempt not only to divide the public about your response but divide your response and get you off your unity of message. Traditional media sometimes plays the same game, especially the national networks. The way to deal with it is to build trust and knowledge about such efforts and their targets. Hence the push to get politicians and the media to play in your exercises and drills. The more aware both groups are of the game being played, the less likely they are to fall for it, and the more likely they are to assist you in your efforts.

Finally, name names and get social media companies to help. Citizens reporting fake news and accounts to social media companies is one thing; state and local government officials doing so is another. The response is still likely to be slow and less than optimal, but those companies are improving in this area. Public appeals can help – get the populace to report fake news and rumors, both to you and to the social media companies. The louder the squeak, the more likely you are to get the oil. The social media companies can help by blocking accounts and removing posts. It would be good if they were roped into the national exercise program; alas, as stated, the US federal government lags far behind in this area. If you can, get specialized law enforcement involved early – talk to the state and federal agents who work these areas, and make those contacts now. Lastly, if you identify a fake account, out it publicly, especially if it becomes apparent it is linked to a state/foreign actor or an extremist group. Disinformation campaigns depend heavily on anonymity. The more you deny them that, the less effective they become.

That’s all folks. Until next time, keep on fighting the good fight and remember, when in doubt, keep calm and decon.

[i] The two countries even play ding-dong-ditch against each other.

[ii] Various covert and clandestine campaigns form the basis of many conspiracy theories invoking the CIA, with kernels of truth at their core based on historical events expanded beyond credulity.

[iii] The FPO case is particularly interesting. In addition to a fake Russian heiress leading to the party’s fall from government last year, the FPO publicly acknowledges its association, via an official agreement, with United Russia, Putin’s political party – which almost certainly limits its appeal in Austria, where anti-Russian feeling runs strong in a sizeable portion of the population going back to the postwar occupation. In fact, Putin attended an FPO minister’s wedding in Austria while the FPO was still the junior partner in government. The common link between these groups isn’t just Russia, either – they traffic in the same conspiracies and anti-immigrant/racialist messaging, though oddly enough it was the FPO that rejected Steve Bannon’s attempt to forge a cross-border cooperative agreement. Extreme far-right groups have, in some cases, also co-opted symbols. In some places the Confederate flag serves as a stand-in for banned Nazi symbols, and Alex Jones’s InfoWars site draws a sizeable amount of traffic from Europe – so much that he now has a dedicated European page.

[iv] UKIP accomplished Brexit, the AfD managed to gain support throughout the former GDR and obtain seats in the Bundestag, and the FPO was the junior partner in the Austrian government until a series of scandals, culminating in the very public exposure of the Vice Chancellor’s shady dealings with a fake Russian heiress, led to his ouster. Then there is, of course, whatever you want to think about the US 2016 election… Contrarily, there are few examples of leftist groups achieving anything near the success of the far right, at least so far. The exception may come in France, where Le Pen has struggled and the left and its “yellow vests” are increasingly empowered against economic reforms instituted by President Macron. Also, unlike the German-speaking countries and the UK, France has always leaned further left, but time will tell.

The White Powder Rodeo: Lessons from the Amerithrax Response [for a Global Pandemic]
