Does the Military Really Have an Extremism Problem?
The empirical data raises far more questions than it answers
On April 9, 2021, U.S. Secretary of Defense Lloyd Austin directed “immediate actions to counter extremism in the military.” According to the New York Times, not all extremism was equally concerning to the Secretary of Defense. “Right-wing extremism” would be the focus of the Department of Defense’s counterextremism efforts.
A key part of the counterextremism campaign was to establish the “Countering Extremist Activity Within the Department of Defense Working Group” (CEAWG). In December 2021, the working group issued its findings. According to the report,
“The CEAWG incorporated quantitative and qualitative data from internal and external experts to help inform this report, including: A briefing from the University of Maryland National Consortium for the Study of Terrorism and Responses to Terrorism (START).”
“START data indicated recent spikes in the instances of domestic violent extremism, and an uptick in veteran participation in these cases.”
In conjunction with the release of the CEAWG report, START released a research brief based on an extension of their Profiles of Individual Radicalization in the United States (PIRUS) dataset. In the brief ( “Extremism in the Ranks and After”), the research team explicitly identified their desire to assist with Secretary Austin’s goal of pursuing a “department-wide stand-down to address the problem of extremism in the ranks.” As the authors write, the brief was
“intended to help in this effort by providing information on the military service backgrounds of individuals who committed extremist crimes in the U.S. from 1990 through the first eleven months of 2021.”
Upon its publication, the START brief was covered in a number of media outlets (e.g. Defense One’s article entitled “At Least 458 U.S. Crimes Tied to Extremism Involved Veterans, Active Duty Troops”):
On July 20 of this year, the Senate Armed Services Committee called on the Defense Department to halt its programs to prevent and root out extremism in the ranks. In a report accompanying the Senate’s National Defense Authorization Act, the committee states that “spending additional time and resources to combat exceptionally rare instances of extremism in the military is an inappropriate use of taxpayer funds, and should be discontinued by the Department of Defense immediately.” Shortly thereafter, the START team released a new research brief (still titled “Extremism in the Ranks and After”). This new brief increased the estimated number of criminal military extremists from 458 to 545 between 1990 and July 2022.
The Plan for this Post
A central goal of this newsletter is to encourage skepticism about the growing use of social science research in the policymaking process. My motivating concern is that we might be implementing costly, inefficient and, possibly, counterproductive policies on the basis of bad or misappropriated social science research. When I see policymakers like the Secretary of Defense citing research from places like START, I am not confident that good decisions will follow.
I’m also concerned about second-order effects of using imprecise social science research in our policymaking process. Specifically, I’m worried that headlines like the ones above might needlessly erode confidence in America’s military. The military is one of the only institutions in the country that a majority of the public still trusts. This trust will not last long, however, if Americans come to see the military as a haven for criminal extremists (right-wing or otherwise). Before leveling charges of widespread extremism, we need to be sure to have good evidence to back these accusations up. Sadly, we rarely do.
I will start by saying that I’m actually agnostic about whether military extremism is increasing or decreasing. I’m also agnostic about whether military extremism is, at the moment, a sufficiently large problem to warrant aggressive actions to root it out. What I am not agnostic about, however, is whether the data used to justify the Secretary of Defense’s efforts (and media reports about them) provide a sufficiently solid empirical foundation for deciding any of these issues one way or the other. To state the matter simply, there are too many unanswered questions about the data at the center of these internally-focused counter-extremism debates. We should be wary of letting it guide us.
In this post, I am going to walk through five questions raised by START’s PIRUS project data on extremism and point out how we might be drawing bad conclusions about both the overall level of military extremism and its changes over time. My hope is that this discussion will help us improve our data and improve the way we incorporate that data into military decisionmaking.
Question #1: How much extremism is too much extremism?
The START PIRUS team claims, “This project includes all known cases of individuals who served in the U.S. military and committed extremist crimes in the United States from 1990 through 2021.” The July 2022 brief includes the following chart summarizing the yearly number of “US extremists with military backgrounds” between 1990 and July 2022. As Graph #1 shows, there were 545 criminal military extremists identified by their data collection efforts over the last 32 years:
Graph #1: US Extremists with Military Backgrounds (1990-2022)
There are nearly 19 million veterans currently living in the United States. In 2021, there were 2.1 million active-duty and reserve U.S. Department of Defense members, including officers and enlisted personnel. In other words, in 2021 (which had nearly three times more criminal military extremists than any other year), the “extremism” rate was .0007% (or 7 in one million). In more typical years (e.g. 2010, 2015, 2016, 2018, 2019), the yearly extremism rate among current and former members of the military was .0001% (or 1 in one million).
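The arithmetic behind these rates is easy to reproduce. A minimal sketch in Python (the population base comes from the figures above; the yearly case counts are approximations inferred from the brief, not official numbers):

```python
def per_million(cases: int, population: int) -> float:
    """Incidents per one million people."""
    return cases / population * 1_000_000

# Population base from the figures above: ~19M veterans + ~2.1M active-duty/reserve
military_population = 19_000_000 + 2_100_000

# 2021 spike year: roughly 150 cases (approximate, inferred from the brief's totals)
print(round(per_million(150, military_population), 1))  # ~7 per million, i.e. ~.0007%

# A more typical year: roughly 20 cases
print(round(per_million(20, military_population), 1))   # ~1 per million, i.e. ~.0001%
```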
While most “one in a million” problems are not worth our serious attention, some are (e.g. nuclear plant meltdowns, nuclear weapons attacks, plane crashes, etc.). Is criminal military extremism one of these issues? There are certainly high profile instances throughout American history where military extremism has proven catastrophic. In 2009, for example, Nidal Malik Hasan killed 13 people and injured more than 30 others in the Fort Hood mass shooting. In 1995, Timothy McVeigh, a former Army soldier, killed 168 people (including 19 children) when he and his accomplice Terry Nichols bombed the Alfred P. Murrah Federal Building in Oklahoma City.
These are powerful examples of the dangers of extremism from individuals with a military background. But how typical are these extreme examples of extremism? The PIRUS data from 1990 to 2018 can help provide some context (as I will discuss below, comparable data is not available from the project after 2018).
Between 1990 and 2018, the PIRUS data includes information on 148 criminal military extremists. 15 of the 148 individuals with a military background were “charged, arrested, or indicted” for misdemeanor crimes (e.g. trespassing, vandalism, etc.). A majority (75 out of 148) of the criminal military extremists identified in the PIRUS data were not “violent” (i.e. did not “actively participate in ideologically motivated operations/actions that resulted in casualties/injuries or clearly intended to result in casualties/injuries (but failed), or charged with conspiracy to kill or injure but were interdicted in the plotting phase”). In other words, a significant portion of criminal military extremists appear to avoid engaging in the most extreme forms of criminal activity.
Another way of contextualizing the number of criminal military extremists over the last 32 years is to compare extremism by members of the military to extremism from other demographic groups. The most obvious comparison here is with Islamic extremism. According to the PIRUS data, there were 251 criminal Muslim extremists between 1990 and 2018. Given that the overall number of individuals with a military background during this period was at least six times greater than the number of Muslims living in the United States (roughly 20,000,000 compared to roughly 3,000,000), we get much higher rates of criminal extremism in the Muslim community than among those with a military background (see Graph #2):
Graph #2: Extremism Rates among Muslims and Military Members (1990-2018)
Even higher rates of criminal extremism can be found among some first generation immigrants. The PIRUS data show 30 first generation Pakistani immigrants arrested, charged, or indicted for criminal extremism between 1990 and 2018. In 2006, when there were only 300,000 Pakistani immigrants living in the country, six were arrested for criminal extremism (a rate of .002% or 20 in one million). The PIRUS data indicate there were 27 first generation Somali immigrants arrested, charged, or indicted for criminal extremism between 2007 and 2018, producing a criminal extremism rate as high as .003% (30 in one million) in 2015.
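Annualized per-capita rates make the cross-group comparison in Graph #2 concrete. A rough sketch using the approximate population figures quoted above (these are the text's ballpark numbers, not official counts):

```python
def annual_rate_per_million(cases: int, population: int, years: int) -> float:
    """Average yearly incidents per one million people."""
    return cases / (population * years) * 1_000_000

YEARS = 29  # 1990-2018

# Approximate population figures from the text
military = annual_rate_per_million(148, 20_000_000, YEARS)
muslim = annual_rate_per_million(251, 3_000_000, YEARS)

print(round(military, 2))        # ~0.26 per million per year
print(round(muslim, 2))          # ~2.89 per million per year
print(round(muslim / military))  # the per-capita Muslim rate is roughly 11x higher
```

Both rates are vanishingly small in absolute terms, which is the broader point of this section.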
What about the problem highlighted by the New York Times and the Secretary of Defense: “right-wing” extremism among current and former military members? Of the 148 criminal military extremists included in the PIRUS data between 1990 and 2018, 87 (58.7%) were radicalized by “right-wing” movements. Despite the fact that there were only 5,895 Muslim Americans serving in the military during this period, 35 of the 148 criminal military extremists in the PIRUS data (23.6%) were radicalized by “Islamist or jihadist” movements.
I am NOT arguing here that these differential rates justify more scrutiny of Muslim Americans, first generation Pakistani immigrants or first generation Somali immigrants. I am also NOT pointing this out to suggest the need for “counter extremism” policies targeting any of these groups. On the contrary, the rates discussed here are to suggest how tiny ALL of these problems are and, further, how strange it is to treat them so differently in our thinking about extremism. Any discussion of extremism (and efforts to combat it) must start with a clear eyed look at how rare the problem of military extremism truly is.
Question #2: Who is an Extremist?
Who is an “extremist”? I will start here by saying that defining extremism has proven to be a nearly impossible task. Consider the tautological hopelessness of the standard dictionary definition of “extremism”:
What are “extreme political or religious views”? In his recent book Extremism, JM Berger attempts to answer this basic question with the following definition:
“an extremist ideology is a collection of texts that describe who is part of the in-group, who is part of an out-group, and how the in-group should interact with the out-group.”
Using this definition, nearly anything can be an extremist ideology. Consequently, nearly anyone can be an extremist. The danger, then, is that “extremism” becomes nothing more than a rhetorical label for views one doesn’t like.
To the PIRUS project’s credit, they have attempted to avoid this trap by being transparent about how they define extremism. Despite their efforts at transparency, however, they run into the same basic problems that have plagued all other efforts at measuring “extremism.” Consider the criteria for classifying extremism:
Notice that each of these criteria for extremism refers back to a host of other impossible-to-define and highly subjective concepts (e.g. “ideologically-motivated,” “ideological activities,” “active participation,” “self-identified association,” etc.). It’s hard to imagine producing reliable data when the definitional criteria are so fuzzy.
An additional problem is that the PIRUS project completely changed its definition of extremism as it transitioned from studying all extremists to studying only military extremists. To make matters worse, it has effectively obfuscated this difference in its public-facing presentations (e.g. the December 2021 and July 2022 briefs).
The PIRUS project’s FAQ makes it clear that not all “extremists” have been arrested for or convicted of criminal activity. As they write, an extremist is someone, “who radicalized to the point of violent or non-violent ideologically motivated criminal activity, or ideologically motivated association with a foreign or domestic extremist organization.” Their criteria for identifying extremists also make it clear that criminal activity is not a necessary condition for inclusion in the dataset because an individual “must meet at least one of the following five criteria” and all three of the criteria listed under points #6, #7, and #8 (see above). As point #5 indicates, an individual might be classified as an extremist if they have an “official membership,” “a membership claimed by a government,” or a “self-identified association” with a “violent extremist group.” This association would be sufficient for inclusion in the PIRUS data even if the individual never committed a crime.
The PIRUS project stopped collecting data using this classification scheme after 2018. According to the PIRUS website, “Due to resource limitations, the research team was only able to collect and perform quality control on data for individuals through the end of 2018. Plans for continued data collection are dependent upon future research funding.”
Apparently, no additional funding has been made available to track extremism in general. The PIRUS team did receive more funding from “the Office of the Under Secretary of Defense for Intelligence and Security,” however, to “compile an auxiliary dataset to PIRUS that contains all known cases of individuals with military backgrounds who committed extremist criminal acts in the United States over the past 32 years.”
This newly funded effort uses a completely different set of criteria for defining extremism. According to the December 2021 and July 2022 briefs, criminal offenses are now a necessary condition for inclusion in the PIRUS dataset:
Yet, strangely, the briefs insist on ignoring the differences between this new, military-focused data collection effort and the PIRUS project’s initial goal of capturing a “representative sample” of all kinds of “extremism” (including but not limited to criminal behavior). Highly subjective, unclear, and changing definitions do not inspire confidence in the estimates of extremism that are guiding the Department of Defense’s policies.
Question #3: Why has the recorded number of military extremists suddenly doubled for the years 1990 to 2018?
The most recent dataset the PIRUS project has made available is from May 2020 and contains data through 2018. As discussed above, this dataset includes information about all extremists (regardless of military background or criminality). The July 2022 brief, by contrast, includes data only on criminal military extremists. Table 1 shows the difference between the number of criminal military extremists contained in the May 2020 data and contained in the July 2022 data (both for the period 1990 to 2018):
Table 1 - Differences between July 2022 Data and May 2020 Data
As Table 1 shows, the July 2022 data include a larger number of criminal military extremists than the May 2020 data for nearly every single year between 1990 and 2018. In total, there were 138 additional cases of criminal military extremism discovered by the PIRUS researchers sometime after the publication of the May 2020 dataset. A majority of these cases were added for relatively recent years. Specifically, 70 new criminal military extremists were added to the data for the years 2012-2018 alone. It is also true, however, that a significant number of cases were added from the 1990s. The July 2022 data included 30 more criminal military extremists than the May 2020 data for the period between 1990 and 1999.
The massive growth in cases is puzzling. The PIRUS researchers have been engaged in a comprehensive effort to track all extremists using the same methods since at least 2016. Their previous efforts (discussed in further detail below) should have identified all criminal extremists from the 1990s, 2000s, and 2010s long before July 2022. Why were the methods used to build the May 2020 dataset so deeply flawed that they missed almost half of the cases (138 of 286) of criminal military extremism?
The huge changes in the number of criminal military extremists are deeply concerning given that the PIRUS team has chosen to highlight (to the public in their briefs and, apparently, to the CEAWG) what their data tells us about changes in criminal military extremism over time. How are we to trust PIRUS’s narrative about increases in criminal military extremism when the historical estimate offered by PIRUS has nearly doubled in the space of only two years? If they missed 138 cases of military extremism prior to May 2020, what are they missing now? What did their previous datasets underestimate about extremism from other, non-military sources? How might these underestimates shape our view of the relative amount of military extremism? Before letting these numbers guide our policies, we need some answers to these basic questions.
Question #4: Is there relatively more military extremism or relatively less military extremism?
The July 2022 brief claims that there has been a large increase in military extremism since 2010:
From 1990-2010, an average of 7 subjects per year with U.S. military backgrounds were included in the PIRUS data. Over the last decade, that number has more than quadrupled to 33.1 subjects per year.
Given that the PIRUS data for all extremists stops in 2018, we can't be sure whether this increase is a function of far more Americans overall becoming radicalized or just a growing number of Americans with military backgrounds becoming radicalized. Indeed, it is possible that current and former military members actually represent a declining percentage of all extremists (i.e. everyone else is radicalizing more quickly than those with a military background). If, for example, the number of non-military Americans labeled “extremist” by the criteria of the PIRUS project has grown by 7x since 2010, we would actually have relatively less military extremism (not more). The implication would be that we should study the military for its effectiveness at limiting extremism in the context of rising societal radicalization (not interrogate it for its failures to prevent extremism).
Although we don’t have PIRUS data on the total number of extremists for 2019, 2020, 2021, or 2022, it seems entirely plausible that extremism from non-military individuals has increased at a greater rate than extremism from military individuals during these years. In 2020, for example, there were approximately 17,000 protest-related arrests during the first few weeks of June alone. At least 96 people were arrested in late July 2020 in connection with protests and rioting at the federal courthouse in Portland.
Nearly all of these individuals would meet the PIRUS July 2022 brief’s first criterion for being an extremist (i.e. to the extent that they were “radicalized,” they were “radicalized in the United States”) and its third (i.e. they “committed a criminal offense that was clearly motivated by their ideological views and resulted in their arrest”). While not all would meet the second criterion (“adhered to or espoused views that justify the use of illegal means, including violence, to achieve political, economic, religious, or social goals”), a small number would. Even if only 2% (or 340) of the more than 17,000 individuals arrested in June 2020 “adhered to or espoused views that justify illegal means,” it would mean that extremism among non-military individuals could be framed as increasing at a greater rate than military extremism.
It is important to consider the project’s potential treatment of arrests from 2020 in light of how the project has chosen to handle January 6 defendants with a military background. As Graph #1 suggests, the increase in military extremism identified by the PIRUS project is primarily a consequence of January 6. Without January 6, there would have been fewer than 10 cases of criminal military extremism in 2021 (the lowest number since 2014).
What’s more, most of the reported increase in military extremism would look far less dramatic if not for the fact that every single one of the 151 current and former military members arrested for their participation in January 6 was classified as an “extremist.” Additionally, Ashli Babbitt (an “Air Force veteran who was killed while breaching the Capitol”) was classified as an extremist (raising the number to 152). While there were undoubtedly many extremists arrested in conjunction with January 6, there were also undoubtedly some that would not typically be labeled “extremist.” As a low-end estimate, for example, a recent Harvard study found that 7% of arrestees were at the Capitol to “peacefully protest” (i.e. they did not “adhere to or espouse views that justify the use of illegal means, including violence, to achieve political, economic, religious, or social goals”).
As stated above, the PIRUS project claims it is no longer collecting data on extremism in general. We cannot know, therefore, how the increase in criminal military extremism they document compares with the overall increase in criminal extremism throughout American society. In other words, it is impossible with the data PIRUS gives us to answer the one question that would truly help us determine whether the military is doing well or poorly when it comes to extremism: is there more or less relative extremism in the military today compared to the rest of American society?
Question #5: More extremism committed or more extremism found?
Searching the police reports and court records
How did the PIRUS project find its criminal extremists? The January 2022 report on criminal military extremism from the PIRUS project explains that “all data for this project were coded from public sources, including federal and state court records, public police reports, and print and online news media.” Unfortunately, we have no further details about exactly which of these documents were searched. As the Defense One article cited above explains, “It’s likely the data set did not capture all arrests, because some state and county records were either difficult to obtain or did not identify military service.” There are no details on how much “some” covers. Does “some” mean that 5% of state and county records were difficult to obtain, or does “some” mean that 50% of these records were not included in the study? Does “some” mean that 5% of included records contained no information about military service, or is it more like 75%? Relatedly, which of the thousands upon thousands of “print and online news media” were searched? The materials provided by the PIRUS project do not answer these questions.
A major concern with data collections such as PIRUS is that any observed change over time may merely be a function of the number and composition of the data sources themselves changing over time. Indeed, how can we be sure that the documented increase in military extremism since 1990 is not merely an artifact of there being a larger number of more thoroughly completed “court records, public police reports, and print and online news media”? Given how small the numbers of overall extremists are, a small change in the reporting practices (e.g. a few counties adding a line in police and court documents about previous military service) and accessibility practices (e.g. uploading all police and court documents to a searchable online archive) of government agencies could easily account for all of the observed increases in the PIRUS data since 1990.
I should point out here that this kind of problem is common in studies using over time data cobbled together from disparate government sources. Let’s consider the case of hate crime reporting between 2016 and 2017. According to data released by the FBI, there were 7,175 hate crimes in 2017, which was an increase from the 6,121 reported incidents in 2016. This 17% increase in hate crimes was breathlessly covered in the media (see, for example, the Al Jazeera headline below) and, more often than not, attributed to President Trump’s divisive leadership.
The problem with these claims, however, is that the 2017 data was based on reports from an entirely different and much larger group of law enforcement agencies. To be exact, the FBI collected hate crime data from 1,000 more agencies in 2017 than it did in 2016. As a result, we can’t know whether there were more hate crimes committed or just more hate crimes reported. As Robby Soave wrote, “this means it's not obviously the case that hate crimes are more prevalent in 2017. Maybe the government just did a better job of counting them.”
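The size of this artifact is easy to illustrate. In the sketch below, the agency counts are hypothetical round numbers for illustration only (the text tells us only that roughly 1,000 more agencies reported in 2017 than in 2016):

```python
# Reported hate crime incidents (figures from the text)
incidents_2016 = 6_121
incidents_2017 = 7_175

# Hypothetical agency counts: ~1,000 more agencies reported in 2017
agencies_2016 = 15_000
agencies_2017 = 16_000

headline_change = (incidents_2017 / incidents_2016 - 1) * 100
per_agency_change = ((incidents_2017 / agencies_2017)
                     / (incidents_2016 / agencies_2016) - 1) * 100

print(f"{headline_change:.0f}%")    # 17% - the number the media reported
print(f"{per_agency_change:.0f}%")  # ~10% under these assumed agency counts
```

Under these assumed counts, a large share of the headline increase disappears once you normalize by the number of reporting agencies, which is exactly the ambiguity Soave describes.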
Does the PIRUS project control for variations in reporting practices? Can they be sure that any observed changes in extremism are not simply an artifact of changes in the police and court reporting practices that they rely on to identify their cases? We can’t know from what the PIRUS project has publicly disclosed. These concerns are even more worrisome given that the number of historical cases in the reported data has changed significantly since May 2020. Far more detail is needed here.
Searching news reports
In addition to the problems associated with having only “some” of the police and court documents, there are major problems with “print and online news media” as a source of data on extremism. News reports are not a perfect reflection of reality. Countless decisions must be made in every newsroom about what to cover and what to ignore. Political, ideological, commercial, and professional considerations mean that not every event or issue is equally likely to receive attention in the news media. A secondary set of decisions involves “how” to report on those events and issues that do receive attention. Are we, for example, looking at a “riot,” “an insurrection” or a “mostly peaceful protest”?
Those making these decisions are overwhelmingly on the political left, with only 7% of journalists now identifying as Republican. Party identification may actually underestimate the liberalism of today’s journalistic class. According to a recent paper assessing the ideological composition of reporters based on their Twitter networks, the typical journalist is somewhere to the left of Bernie Sanders.
This ideological homogeneity, however, does not tell the whole story of media bias. After 2016, journalistic norms related to news coverage began to change considerably. Fearful of Trump and his supporters, reporters began to believe that it was their job to expose any instance of possible “right-wing extremism” regardless of its size or significance. Previous norms about proportionality, objectivity, and neutrality were jettisoned in favor of strident activism in “defense of democracy.” No one has more eloquently or succinctly summarized this shift in perspective than Wesley Yang here:
The Trump presidency radicalized America’s governing and chattering classes, who saw in his election the fulfillment of one of the dark possibilities of democracy—that the people would elect a demagogue intent on bending the arc of history backward—and felt themselves summoned to act as guardians of the Republic righting the course of that arc. We were in a state of exception that it was both their warrant and their duty to decide.
The standards and practices that marked our professional classes as elites deserving of our trust in ordinary times (impartiality, procedural correctness) were no longer applicable. In a time of “literal white nationalists in the White House” putting “babies in cages,” these protocols would in practice end up colluding with an existential danger. Departures from those practices become not just excusable but a moral imperative. Thus was undertaken a principled abandonment of scrupulousness in reporting, proportionality in judging, and the neutral application of rules once held to be constitutive of professional authority, all in favor of a politics of emergency. The new politics demanded loyalty and unanimity in an effort to defeat the usurper at any cost. The loss of proportionality in judging and scrupulousness in reporting created an echo chamber in which the bulk of the governing and chattering classes confirmed and exacerbated self-generated fantasies and fears of foreign subversion and fascists on the march.
These impulses gave us a long and embarrassing list of national “stories” focused on the actions and statements of obscure, right-leaning members of the public after 2016. We had CNN threatening to expose the identity of the Reddit user who created an anti-CNN wrestling meme. CNN also confronted an elderly woman on her front lawn for having used her Facebook page to advertise a pro-Trump event they claimed was engineered by Russians (picture below). The Daily Beast celebrated in a headline that they “Found the Guy Behind the Viral ‘Drunk Pelosi’ Video.” Nearly every major news outlet covered Covington Catholic High School student Nicholas Sandmann’s “smirk” towards Native American protester Nathan Phillips.
With this context in mind, let’s return to the PIRUS data and its reliance on “news reports.” At any given point in time, news coverage is almost certainly more likely to cover some kinds of “extremism” than others. People who use intriguing tactics, communicate in novel ways, or represent polarizing demographic groups make for “better” stories from the perspective of journalists and are, therefore, more likely to be covered in the news media. A corollary is that the kinds of extremism that receive attention are also likely to change over time. If the media is covering “right-wing” movements more (and using words connected with extremism in that coverage) because journalists and their audiences are more concerned about these movements, any study relying on media coverage will show an increase in right-wing extremism (even if the underlying “reality” of right-wing extremism is unchanged or declining).
Importantly, the PIRUS researchers recognized this as a massive problem with their dataset when it began identifying the large numbers of Islamic extremists discussed above (and illustrated in Graph #2). As a result, they were quite clear about their data’s limitations. In the project’s FAQs, the researchers write:
"Given our reliance on open-sources, the sample likely reflects news reporting trends over time. That is, as reporters shift their primary focus from one ideology or movement to another, it becomes increasingly easier to identify individuals who are associated with the groups that are under intense media scrutiny, and increasingly harder to identify those who are not. For example, the post-9/11 period in the PIRUS data is likely over-representative of Islamist extremists compared to individuals affiliated with other extremist ideologies."
If you replace “Islamist” with “right-wing” and “post-9/11 period” with “Trump era” in the final sentence, you see that there is very likely a problem with concluding anything about changes in right-wing extremism after 2016. The media’s increasing focus on conservative individuals during the Trump era will almost necessarily produce an increase in this measure’s level of right-wing extremism. If military members are more likely to associate with movements of the right, they will be significantly overrepresented under the PIRUS approach regardless of whether there is an actual increase in their extremist activity. Unfortunately, however, this likely overestimate does not receive the same attention in the team’s reporting that their potential overestimate of Islamic extremists received.
Searching for the Search Terms
What exactly were the researchers searching for in these “court records, public police reports, and print and online news media” to locate the “radicalized” individuals to include in their dataset (i.e. what specific search terms were used to identify the 4,000 people the study identified as potential extremists)? It’s obvious that you will only ever find what you are searching for. If the researchers searched “public sources” only for the word “Muslim,” the data they collect will inevitably identify Muslims as the main source of extremism. More generally, the researchers could be using search terms that are disproportionately likely to pick up some kinds of extremism (ideologically, tactically, demographically) and not others. Of course, they may not even be using search terms at all. We just don't know because these details are not revealed anywhere in the project’s documentation.
Where to go from here?
The PIRUS data is not the only source of information on extremism, but most alternatives either suffer from the same methodological challenges or reach the same conclusions about the relative infrequency of military extremism. The Center for Strategic and International Studies, for example, compiles data on domestic terrorist attacks perpetrated by active-duty or reserve service members. They rely on reports from third-party monitors (e.g. the Anti-Defamation League, START’s Global Terrorism Database, the Extremism, Anti-Semitism, and Terrorism Map, etc.) whose data coverage and quality have varied considerably over time. Consistent with what I have emphasized here, their data show that active-duty or reserve military personnel accounted for 0 of 40 terrorist attacks in 2018, 1 of 65 in 2019, and 7 of 110 in 2020.
There will always be questions in large and complicated data collection efforts. These questions become more important when the data is being used to guide fundamentally important matters of government policy and has the potential to undermine faith in key governmental institutions. It is very likely that the PIRUS researchers and policymakers at the Department of Defense have good answers to all of the questions posed here. Unfortunately, they have not made these answers clear enough.
Thanks for reading The Missing Data Depot! Subscribe for free to receive new posts and support my work.