The Myth of the Rational Voter: Why Democracies Choose Bad Policies
In theory, democracy is a bulwark against socially harmful policies. In practice, however, democracies frequently adopt and maintain policies that are damaging. How can this paradox be explained? The influence of special interests and voter ignorance are two leading explanations. I offer an alternative story of how and why democracy fails. The central idea is that voters are worse than ignorant; they are, in a word, irrational -- and they vote accordingly. Despite their lack of knowledge, voters are not humble agnostics; instead, they confidently embrace a long list of misconceptions. Economic policy is the primary activity of the modern state. And if there is one thing that the public deeply misunderstands, it is economics. People do not grasp the "invisible hand" of the market, with its ability to harmonize private greed and the public interest. I call this anti-market bias. They underestimate the benefits of interaction with foreigners. I call this anti-foreign bias. They equate prosperity not with production, but with employment. I call this make-work bias. Finally, they are overly prone to think that economic conditions are bad and getting worse. I call this pessimistic bias. In the minds of many, Winston Churchill's famous aphorism cuts the conversation short: "Democracy is the worst form of government, except all those other forms that have been tried from time to time." But this saying overlooks the fact that governments vary in scope as well as form. In democracies the main alternative to majority rule is not dictatorship, but markets. A better understanding of voter irrationality advises us to rely less on democracy and more on the market
Networks, law, and the paradox of cooperation
There is a tension between libertarians’ optimism about private supply of public goods and skepticism of the viability of voluntary collusion (Cowen 1992, Cowen and Sutter 1999). Playing off this asymmetry, Cowen (1992) advances the novel argument that the “free market in defense services” favored by anarcho-capitalists is a network industry where collusion is especially feasible. The current article dissolves Cowen’s asymmetry, showing that he fails to distinguish between self-enforcing and non-self-enforcing interaction. Case study evidence on network behavior before and after antitrust supports our analysis. Furthermore, libertarians’ joint beliefs on public goods and collusion are, contrary to Cowen and Sutter (1999), theoretically defensible.
Keywords: networks; anarcho-capitalism; collusion
Roundtable on Epistemic Democracy and its Critics
On September 3, 2015, the Political Epistemology/Ideas, Knowledge, and Politics section of the American Political Science Association sponsored a roundtable on epistemic democracy as part of the APSA’s annual meetings. Chairing the roundtable was Daniel Viehoff, Department of Philosophy, University of Sheffield. The other participants were Jack Knight, Department of Political Science and the Law School, Duke University; Hélène Landemore, Department of Political Science, Yale University; and Nadia Urbinati, Department of Political Science, Columbia University. We thank the participants for permission to republish their remarks, which they edited for clarity after the fact
In an Age of Anti-Intellectualism, What is the Value of Expertise?
Abstract
The nature of the scientific enterprise is sometimes misunderstood by large sections of the public. Failure to understand how progress occurs within scientific disciplines can lead to nonadherence to expert recommendations, with devastating consequences. Why does the public put enough stock in scientific research and medical science to comply with certain research findings yet remain skeptical of others? Through careful attention to history, current socio-cultural contexts, scientific data, and knowledge development within professions, we argue in favor of greater public deference to expertise. Though we rely on a variety of examples rooted in medical science, our focus is on general conceptions of expertise and what might be learned from reflection on its proper role. We situate our analysis within the context of current discussions in philosophy, bioethics, and public policy around the idea of ‘wicked problems’.
Introduction
Expertise, broadly speaking, has come under serious attack in recent years. The current presidential administration is engaged in a large-scale dismantling of the scientific and public health infrastructure at a variety of federal agencies. This is happening concurrently with ongoing attacks on academic institutions. This should concern not just members of the scientific and academic community but also the public at large. Unfortunately, the nature of the scientific enterprise is sometimes misunderstood by large sections of the public.[1] Although trust in science remains high among Americans, the level of trust varies sharply with political affiliation.[2] There has been growing hostility toward science itself.[3] This can have devastating consequences when a failure to understand how progress occurs within scientific disciplines leads to ignorance of, and even nonadherence to, expert recommendations. How ought members of the public to be persuaded to put enough stock in scientific research and medical science to listen to and comply with sound research findings while remaining appropriately questioning of expert advice? Recent polling data, for instance, suggest that a majority of US adults are concerned that agencies such as the CDC will make decisions that are influenced by politics.[4]
Through careful attention to history, current socio-cultural contexts, the role of expertise, and knowledge development within the scientific professions, we argue in favor of greater, albeit prudent, deference to expertise. Our hope is that lessons will be learned for better responses to future public health crises and other collective action problems that require public support to be properly addressed. We are careful to focus our efforts on the value of expertise, which is hard-won and takes years to cultivate. We situate our analysis within the context of current discussions of pandemics as ‘wicked problems,’[5] respond to potential critiques rooted in concerns about epistemic justice,[6] and suggest bioethics as a working model for addressing these concerns and bioethicists as potential leaders in future public policy discussions.
The Authority of Experts
The authority of experts has increasingly attracted the attention of ethicists. For instance, some ethicists have argued that we should be wary of the authority of experts because experts are human beings that may have conflicts of interest, speak outside their area of expertise (“epistemic trespassing”), or offer advice that is not useful.[7]
Managing deference to the authority of experts has long been a challenge in the United States, where expert authority has waxed and waned over time. During the era of Jacksonian democracy in the early 19th century, for instance, the authority of experts was under assault.[8] Licensure of professionals was repealed, many irregular physicians practiced medicine, and the prevailing ideology held that each individual could and should decide how to manage their own health. This remained the state of affairs for much of the 19th century. But with the rise of laboratory science and the demonstrable efficacy of the germ theory of disease, the authority of physicians grew. As chronicled in the classic book The Social Transformation of American Medicine, physicians enjoyed an unparalleled run of cultural and moral authority throughout much of the 20th century.[9] Today, health care professionals such as physicians and nurses typically hold the top rung in Gallup surveys of public trust (although they suffered a downturn as a result of COVID).[10]
Public health experts have always struggled to attain the same level of cultural authority as physicians. Despite the successes of public health in the 20th century, medicine garnered the lion’s share of resources. Respect for those treating individual patients surpassed respect for public health officials, who offered prevention strategies and focused on infectious disease avoidance, vaccination, and food safety. It also surpassed respect for those engaged in the practice of medicine in clinics and publicly funded venues serving individuals who were stigmatized – the poor, minorities, individuals with disabilities, individuals with mental illness or addiction, and immigrants. Public health journalist Laurie Garrett highlights the difference in a rather stark way by comparing the public health school at Harvard with the medical school building: “The medical school is all marble, with these grand columns….The school of public health is this funky building, the ugliest possible architecture, with the ceilings falling in….That’s America.”[11]
During the COVID-19 pandemic, for example, the authority of public health and scientific experts came under withering attack. President Trump dismissed the authority of scientific experts when the pandemic first started. Trust in expertise was further eroded by right-wing media personalities attacking figures such as Anthony Fauci. Ideally, public health and medicine would acknowledge mistakes and be transparent with the public. Robert F. Kennedy, Jr., the current head of the Department of Health and Human Services (HHS), has continued the administration’s undermining of science and expertise by promoting falsehoods about autism, vaccines, chemtrails, and fluoride. Trust, a precious form of social capital, is diminished when officials in positions of power promote mistruths.
It is this social capital of trust that is essential to preserving the authority of credible experts. Since no one is an expert in everything, we must place some level of trust in the expertise of certain individuals. But who merits our trust? With the vast amount of information available, it is increasingly difficult to determine who is trustworthy and who is not. We have witnessed a stratification of trust in individuals and institutions based on income and education. For instance, more highly educated individuals tend to trust academic experts. And this trust extends beyond individuals to institutions. The CDC has enjoyed a good deal of trust throughout its history. Yet even this agency has seen its perceived trustworthiness diminish following its COVID-era recommendations on school closings, mandatory vaccination, isolation, and masking. Institutional trustworthiness is historically more precarious than the trustworthiness of individuals, but both have been undermined in recent years.
Historically, professional experts have at times grossly abused their authority. We have witnessed this with unethical experiments in Nazi Germany and with human rights abuses in the US, including Tuskegee, Willowbrook, the Iowa ‘monster’ stuttering study, the Havasupai Tribe gene studies, and the Stanford prison experiment, among many others. Ordinary people rightly question the authority of experts when such a legacy of abuse exists.
The COVID pandemic highlighted a great deal of disagreement among a variety of experts. Dr. Vinay Prasad, now a senior official at the FDA, emerged as a staunch critic of mainstream public health measures, such as vaccine and mask mandates.[12] So did those who authored the Great Barrington Declaration.[13] As Mark Battersby put it, “Disagreement among experts renders appeals to authority fallacious. But many of the interesting cases one deals with will involve conflict among experts.”[14] The pandemic reflected not only disagreement among experts but also an undermining of the very notion of expertise.
Others have highlighted the role of anti-intellectualism in American life. Anti-intellectualism reflects a suspicion of experts and the expert class. The scholar and public intellectual Tom Nichols chronicled this phenomenon in his 2017 book The Death of Expertise.[15] It also harbors a suspicion of ‘the academy,’ or institutions of higher education. (Nichols provides a stinging critique of the academy and its failure to promote logical discourse and reason.) This kind of suspicion and mistrust has persisted throughout US history and has seen a resurgence in recent years with the election and re-election of Trump. Such attitudes support populist or even conspiratorial explanations for complex phenomena, such as the COVID-19 pandemic, chemtrails, and climate change. As Merkley and Loewen state: “People who are highly distrusting of experts are not simply willing to put aside their distrust of these sources to resolve the crisis and return to normalcy. Relaying information from experts is unlikely to be of use in persuading these individuals, even in times of crisis. Other communication strategies are needed.”[16]
For many individuals who are distrustful of public health advice, their experience with COVID-19 may have only reinforced that distrust. Yet asymptomatic people and those with very mild symptoms still spread the virus to vulnerable members of society, who were more likely to experience severe COVID-19 complications. The failure to heed public health advice thus placed a significant burden on the US healthcare system, even if some individual persons had mild or asymptomatic cases. Although awareness of how viruses are transmitted improved during and after the pandemic, misinformation and disinformation often undermined understanding. Viruses that change or mutate, such as SARS-CoV-2, the virus that causes COVID-19, make it extremely difficult for experts to communicate new information to the public. These challenges are further compounded in the current climate of assault on experts and the value of expertise. Before turning to the value of expertise itself, it is important to respond to a set of arguments that has received wide attention of late.
Politicization in the Face of Expertise
One argument against trusting experts is that experts’ advice has been compromised by politics. This argument stems from the intertwining of politics and healthcare (broadly understood). Politicians who find themselves offering recommendations at odds with experts have trotted out this reasoning, particularly those in leading roles in the Trump administration. Dr. Josh Sharfstein suggests that HHS Secretary Robert F. Kennedy, Jr., engages in this technique. In his analysis of the role of expertise relative to political power (referring to Kennedy’s guidance), Sharfstein noted that, “Kennedy has been sowing doubt in public health for years. He's basically saying, here's my view; here's some thoughts, and you can't trust what people who – you know, their whole career, their whole goal is to support health, and they have all this expertise – you can't trust them at all.”[17] Understanding what this claim means is important to seeing why it is both mistaken and wrongheaded; it might mean at least one of four things.
First, consumers of information can sometimes lack the relevant knowledge and critical skills to properly assess who might be an expert. Mistaken and fraudulent “experts” are a real and all-too-common occurrence. Second, given the diversity of thought and the lack of critical assessment skills and opportunities, it is possible for people to find “experts” to support their own pre-existing views. Third, the view that everyone is or can be an expert is arguably common. This point is of special interest to bioethicists, who generally recognize that all people are (or might be) experts with respect to their own sets of values, but not with respect to other bodies of knowledge, such as medical science. The idea that anyone can be an expert can be coupled with an effort-focused qualification (“they could be if they just put their mind to it”; consider, e.g., the meme ‘I did my own research’). Fourth, some argue that those who are dubbed experts are selected based only on non-relevant factors, such as their political stances, as opposed to their knowledge of medicine or of epidemiology. Experts can then be seen as political tools, chosen for reasons that are not relevant to their expertise. Some may feel that they have all “sold out” to someone or something: big pharma, big law, billionaire philanthropists, big food, celebrity, tenure, etc.
Concerns about conflicts of interest, personal expertise, the ability to become an expert, post hoc expert shopping, and the lack of critical reasoning and familiarity with the relevant knowledge base should give pause to those who understand the importance of expertise. These considerations, however, are not without response. One line of response to the politics charge is simply to drive a wedge between politics and science (whether that be medical science, scientific research, or another body of knowledge with practical importance and application that is distinct from politics). For instance, the former director of the National Institutes of Health, Francis Collins, said, “When you mix politics and science, you just get politics.”[18] Others argue that the only way for science to escape the political allegation is to keep politics completely out of science. Neither view offers a viable response, however, as the health needs of the public are – in different ways – the purview of both sets of expertise (politics and science). Further, it is unclear how driving such a wedge rebuts the lack of trust in experts, nor is it clear whether such a wedge could be driven between domains that are so intertwined.
Politics plays a part in what research gets done, who gets to do it, what messaging is used, and whether that messaging is amplified or suppressed. The way to manage the dismissal of expertise as “just politics” is neither to deny the reality of politics nor to pretend that it can be safely isolated, but rather to insist on methods that tamp down the worst consequences of each on the other. There are at least two methods that can be employed for this purpose.
First, it is important to remember that though individuals are referred to as experts, expertise belongs to groups as well as to individuals. Peer consensus among individual experts is what drives the legitimization of expertise, including the adjudication of false or would-be “experts”; it is not a matter of who is the loudest, most educated, flashiest, best funded, most quoted, or the closest friend of those in power. Checks on the claims and assessments offered by experts, performed by other, relevantly situated experts, are essential. This notion is the basis for peer review, among other common substantiations of the claims of experts.
Second, expertise must be vindicated by application. Consider, for example, the importance of replicability in scientific experimentation. Consider also the importance of consensus that follows from an understanding of expertise as something that belongs to a group, not merely to a set of individuals. If, for example, the majority of climate experts successfully predict continuing rising temperatures, more humidity, more algae outbreaks, species extinctions, excessive heatwaves, more wildfires, and nastier storms, then climate change deniers lose credible standing. When the predicted instances occur, the usefulness of proper expertise is highlighted. If a scientist or a political official dismisses the value of vaccines but then witnesses outbreaks of measles, mumps, hepatitis, or flu, then that scientist or official ought to lose credibility. This is not to say that experts cannot make mistakes; however, in such cases, explanations of the mistake and revisions of expertise-dependent recommendations are needed. An official or scientist who rejects the germ theory of disease, for example, shoulders the burden of explaining the reduced morbidity and mortality that occurred in the early 1900s after that theory gained prominence and physicians (and others) began washing their hands more diligently. In other words, the ability to apply expertise successfully falsifies the claim that it is ‘just’ or ‘all’ politics.[19]
Expertise
(i) Expertise
In popular discussions, the idea that one must do something for “10,000 hours”[20] has permeated considerations of who counts as an expert. It takes time and resources to develop expertise in a specific area, as well as particular skill sets to be able to master a practice or gain command of the tools of a discipline. That said, our aim in this section is not to articulate a full description of what counts as having expertise, but rather to highlight two features of expertise relative to its boundaries, which should inform the role of expertise within public discussions of public health.
In their classic work on the nature of expertise, Chi, Glaser, and Farr describe seven key characteristics of experts, two of which bear on the limits of expertise. The first is that “Experts excel mainly in their own domains” (Chi, Glaser, and Farr, xvii). When reflecting on and evaluating “authority,” it is important that notions of expertise are taken seriously. It is not mere authority which marks out proper expertise. Experts are such only in their particular domain, and we make a significant mistake if we defer to the authority of an expert in the wrong domain. For example, we might look to the famed Red Sox baseball player and author of The Science of Hitting, Ted Williams, for advice on how to hit a baseball. Williams was, by all accounts, an expert at it. However, it does not follow that society should have relied on him for recommendations on the legal interpretation of baseball stadium and game regulations, on cooking (even of ballpark franks), or on public health measures (even about crowding at baseball games). Williams likely had views about these latter matters; he may even have had informed opinions, having spent time in many stadiums around the country, but he did not hold any expertise in these areas.
The second boundary-relevant criterion that Chi, Glaser, and Farr note is that “Experts have strong self-monitoring skills” (Chi, Glaser, and Farr, xx) (Chi, 2009). The ability to restrain oneself in one’s capacity as an expert is important for at least two reasons. First, because we often appeal to experts in situations that are complex (in ways relevant to their expertise), laypersons making the appeal are likely to lack the ability, or to struggle, to draw clear lines as to when the expert has overstepped. The domains of expertise, especially where multidisciplinary and interprofessional teams are needed to properly respond to an issue, are a good example of the sometimes vague or murky boundaries of expertise and of the need for experts to self-monitor carefully. Second, self-monitoring is important not only when the bounds of expertise are murky, but also in situations where the dissemination, use, and influence of the results of that expertise occur through channels that may not involve the expert. If an ordinary manager of a baseball team aims to share the results of Ted Williams’ hitting expertise, he needs to take care both to ensure (to the best of his ability) that Williams has not overstepped and to avoid conflating Williams’ standing as a star with the results of the expertise being shared.
Public health is an especially important example of this potential confusion and conflation for several reasons. First, public health responses are complex, requiring multidisciplinary and interprofessional teams of experts. Second, those disseminating and then acting upon the information shared by particular experts within the team often have standing to share this information due to other kinds of expertise or a different kind of authority. Thus, a political leader – a president or governor, for example – may be charged with the care of the public, and so it is within their purview to disseminate relevant scientific information that they receive. However, it would be an abuse of authority, or at the very least a conflation, to fail to self-monitor and, say, to share one’s own opinions under the auspices of expertise.
It is especially important in our current socio-cultural and political moment to pay attention to the bounds of expertise. During the COVID-19 pandemic, many individuals frequently stepped outside of the proper bounds of their expertise. Merely because someone is an expert on a particular subject matter does not mean, as noted above, that they are competent to discuss complicated matters which, even if relevant to the subject area, require different training, experiences, or knowledge bases to master, or those especially challenging problems which require an interdisciplinary team to fully address.
Consider another pandemic-relevant example: an expert in supply chain management might possess sufficient expertise to comment on and evaluate the way in which COVID-19 vaccine materials made their way from production and transportation mechanisms to distribution centers. However, they should show restraint and not comment on the merits of particular vaccines, as that would properly fall to biochemists, virologists, vaccinologists, or other relevant experts.
During a public h
Self-Interest and Public Interest: The Motivations of Political Actors
Self-Interest and Public Interest in Western Politics showed that the public, politicians, and bureaucrats are often public spirited. But this does not invalidate public-choice theory. Public-choice theory is an ideal type, not a claim that self-interest explains all political behavior. Instead, public-choice theory is useful in creating rules and institutions that guard against the worst case, which would be universal self-interestedness in politics. In contrast, the public-interest hypothesis is neither a comprehensive explanation of political behavior nor a sound basis for institutional design
Rational Irrationality: A Framework for the Neoclassical-Behavioral Debate
Critics of behavioral economics often argue that apparent irrationality arises mainly because test subjects lack adequate incentives; the defenders of behavioral economics typically reply that their findings are robust to this criticism. The current paper presents a simple theoretical model of "rational irrationality" to clarify this debate, reducing the neoclassical-behavioral dispute to a controversy over the shape of agents' wealth/irrationality indifference curves. Many experimental anomalies are consistent with small deviations from polar "neoclassical" preferences, but even mildly relaxing standard assumptions about preferences has strong implications. Rational irrationality can explain both standard, costly biases and wealth-enhancing irrationality, but it remains inconsistent with evidence that intensifying financial incentives for rationality makes biases more pronounced.
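One way to see what is at stake in that reduction is a minimal formal sketch of such wealth/irrationality indifference curves; the notation below (baseline wealth w_0, price of irrationality p, chosen degree of irrationality b) is our own illustrative reconstruction under standard assumptions, not necessarily the exact model in the paper:

\[
\max_{b \ge 0} \; U(b) = u\!\left(w_0 - p\,b\right) + v(b), \qquad u' > 0,\; v' \ge 0,\; v'' \le 0,
\]

where \(b\) measures deviation from the rational belief and \(p\) is the expected material cost per unit of irrationality. Polar "neoclassical" preferences correspond to \(v \equiv 0\), so any positive price drives the chosen \(b^*\) to zero, while even a mild taste for irrationality yields a large \(b^*\) whenever \(p \approx 0\) (as when a single vote is almost never decisive). On this reading, raising the price of irrationality should weaken biases, which is why evidence that stronger incentives make biases more pronounced would cut against the framework.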
Mises, Bastiat, public opinion, and public choice
The political economy of Ludwig von Mises and Frederic Bastiat has been largely ignored even by their admirers. We argue that Mises' and Bastiat's views in this area were both original and insightful. While traditional public choice generally maintains that democracy fails because voters' views are rational but ignored, the Mises-Bastiat view is that democracy fails because voters' views are irrational but heeded. Mises and Bastiat anticipate many of the most effective criticisms of traditional public choice to emerge during the last decade and point to many avenues for future research
