1,952 research outputs found

    Floridi on Disinformation


    What is disinformation?

    Prototypical instances of disinformation include deceptive advertising (in business and in politics), government propaganda, doctored photographs, forged documents, fake maps, internet frauds, fake websites, and manipulated Wikipedia entries. Disinformation can cause significant harm if people are misled by it. In order to address this critical threat to information quality, we first need to understand exactly what disinformation is. This paper surveys the various analyses of this concept that have been proposed by information scientists and philosophers (most notably, Luciano Floridi). It argues that these analyses are either too broad (that is, they include things that are not disinformation) or too narrow (that is, they exclude things that are disinformation), or both. Indeed, several of these analyses exclude important forms of disinformation, such as true disinformation, visual disinformation, side-effect disinformation, and adaptive disinformation. After considering the shortcomings of these analyses, the paper argues that disinformation is misleading information that has the function of misleading. Finally, in addition to responding to Floridi’s claim that such a precise analysis of disinformation is not necessary, it briefly discusses how this analysis can help us develop techniques for detecting disinformation and policies for deterring its spread.

    Design and commissioning of a timestamp-based data acquisition system for the DRAGON recoil mass separator

    The DRAGON recoil mass separator at TRIUMF exists to study radiative proton and alpha capture reactions, which are important in a variety of astrophysical scenarios. DRAGON experiments require a data acquisition system that can be triggered on either reaction product ($\gamma$ ray or heavy ion), with the additional requirement of being able to promptly recognize coincidence events in an online environment. To this end, we have designed and implemented a new data acquisition system for DRAGON which consists of two independently triggered readouts. Events from both systems are recorded with timestamps from a 20 MHz clock that are used to tag coincidences in the earliest possible stage of the data analysis. Here we report on the design, implementation, and commissioning of the new DRAGON data acquisition system, including the hardware, trigger logic, coincidence reconstruction algorithm, and live time considerations. We also discuss the results of an experiment commissioning the new system, which measured the strength of the $E_{\text{c.m.}} = 1113$ keV resonance in the $^{20}$Ne$(p,\gamma)^{21}$Na radiative proton capture reaction. Comment: 11 pages, 7 figures, accepted for publication in EPJ A "tools for experiment and theory".
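    To make the coincidence-tagging idea concrete, here is a minimal sketch of timestamp-based matching between two independently triggered, time-ordered event streams. The two-pointer sweep, the 200-tick window, and the stream names are illustrative assumptions, not the actual DRAGON algorithm or parameters.

```python
# Minimal sketch of timestamp-based coincidence matching between two
# independently triggered, time-ordered event streams ("head" = gamma-ray
# detector, "tail" = heavy-ion detector). Timestamps are in ticks of a
# 20 MHz clock (50 ns per tick). The 200-tick window and the two-pointer
# sweep are illustrative assumptions, not the actual DRAGON algorithm.

def match_coincidences(head_ts, tail_ts, window_ticks=200):
    """Return (head_index, tail_index) pairs whose timestamps differ by at
    most window_ticks, assuming both lists are sorted in ascending order."""
    pairs = []
    j = 0
    for i, t in enumerate(head_ts):
        # drop tail events that can no longer match this or any later head event
        while j < len(tail_ts) and tail_ts[j] < t - window_ticks:
            j += 1
        # collect every tail event inside the window around this head event
        k = j
        while k < len(tail_ts) and tail_ts[k] <= t + window_ticks:
            pairs.append((i, k))
            k += 1
    return pairs

# Example with timestamps in clock ticks: the first and last head events
# have a matching tail event; the middle one is a singles event.
head = [1_000, 5_000, 9_000]
tail = [1_100, 8_900, 20_000]
print(match_coincidences(head, tail))   # [(0, 0), (2, 1)]
```

    Because both streams are already time-ordered, a single forward sweep suffices, so the matching cost stays linear in the number of events.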

    Toward a Formal Analysis of Deceptive Signaling

    Deception has long been an important topic in philosophy (see Augustine 1952; Kant 1996; Chisholm & Feehan 1977; Mahon 2007; Carson 2010). However, the traditional analysis of the concept, which requires that a deceiver intentionally cause her victim to have a false belief, rules out the possibility of much deception in the animal kingdom. Cognitively unsophisticated species, such as fireflies and butterflies, have simply evolved to mislead potential predators and/or prey. To capture such cases of “functional deception,” several researchers (e.g., Sober 1994; Hauser 1997; Searcy & Nowicki 2005; Skyrms 2010) have endorsed the broader view that deception only requires that a deceiver benefit from sending a misleading signal. Moreover, in order to facilitate game-theoretic study of deception in the context of Lewisian sender-receiver games, Brian Skyrms has proposed an influential formal analysis of this view. Such formal analyses have the potential to enhance our philosophical understanding of deception in humans as well as animals. However, as we argue in this paper, Skyrms's analysis and two recently proposed alternative analyses (viz., Godfrey-Smith 2011; McWhirter 2016) are seriously flawed and can lead us to draw unwarranted conclusions about deception.
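    As a rough illustration of the kind of formal treatment at issue, the sketch below sets up a toy Lewis sender-receiver game and flags a signal as "deceptive" when it is misleading (it lowers the receiver's posterior in the actual state relative to the prior) and the sender benefits from the receiver's response. The particular game, the payoffs, and the prior-based baseline for "benefit" are assumptions made for this example; they are not Skyrms's, Godfrey-Smith's, or McWhirter's formal definitions.

```python
# Toy Lewis sender-receiver game. A signal sent in a state is flagged as
# "deceptive" here when (i) it is misleading -- the receiver's posterior in
# the actual state drops below the prior -- and (ii) the sender benefits,
# i.e. the receiver's response to the signal pays the sender more than the
# act the receiver would choose on the prior alone. The game, payoffs, and
# this baseline for "benefit" are illustrative assumptions only.

import numpy as np

prior = np.array([0.4, 0.6])            # P(state)
send = np.array([[1.0, 0.0],            # P(signal | state): state 0 always
                 [0.6, 0.4]])           # sends signal 0; state 1 usually does
sender_pay = np.array([[1.0, 0.0],      # sender_pay[state, act]:
                       [1.0, 0.0]])     # sender always prefers act 0
receiver_pay = np.array([[1.0, 0.0],    # receiver_pay[state, act]:
                         [0.0, 1.0]])   # receiver wants act to match state

def posterior(signal):
    joint = prior * send[:, signal]
    return joint / joint.sum()

def best_act(probs):
    # act maximizing the receiver's expected payoff under beliefs `probs`
    return int(np.argmax(probs @ receiver_pay))

baseline_act = best_act(prior)          # what the receiver does with no signal

for state in range(2):
    for signal in range(2):
        if send[state, signal] == 0:    # signal never sent in this state
            continue
        post = posterior(signal)
        misleading = post[state] < prior[state]
        act = best_act(post)
        benefits = sender_pay[state, act] > sender_pay[state, baseline_act]
        print(f"state={state} signal={signal} misleading={misleading} "
              f"benefits={benefits} deceptive={misleading and benefits}")
```

    In this toy game the only deceptive case is the half-truth: signal 0 sent in state 1, which drags the receiver's credence in state 1 below its prior and flips the chosen act to the sender's preferred one.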

    The Impact of regulatory capital regulation on balance sheet structure, intermediation cost and growth

    Working paper URL: http://ces.univ-paris1.fr/cesdp/cesdp2016.html. Documents de travail du Centre d'Economie de la Sorbonne 2016.61 - ISSN: 1955-611X. As Europe is subject to a protracted recession, it should be asked whether the reform of the financial sector is not costly in terms of potential growth. Our analysis shows that the negative effects of the Basel III package expected by the pre-QE studies are almost entirely neutralized today. The recession must then have other causes: falling corporate lending volumes resulted from falling demand in the aftermath of the financial crisis, but this is no longer the case. The EU is trying to incentivize corporate lending via forward guidance as well as a ‘supporting factor’ cutting down the Basel capital requirements. Macroeconomic theorists are trying to account for the future success of monetary policy around the zero nominal interest rate via the risk-taking channel. All these clever initiatives have failed to deliver. As a consequence, we might infer that banks are simply not taking any risks: rather than appealing to risk aversion, we argue that banks seem especially hampered by future regulatory developments, which appear remote and uncertain. The binding constraint for corporate lending and growth in the EU is then plausibly a combination of banks' expectations of future regulation and strong uncertainty aversion. While we offer some mitigation prospects, we hope that the theoretical work of recent years will quickly yield both theoretical advances and practical results. Despite the recent easing of banking regulation intended to allow a recovery in corporate lending, the Basel III package appears to have a detrimental effect on growth. The paper asks which of the new rules is the binding constraint. Banks seem to have become highly averse to uncertainty, in particular to uncertainty about the evolution of the rules imposed on them. We therefore propose modifying the nature of capital provisions and the number of regulators with decision-making power over capital levels.

    Double-beta decay Q values of 130Te, 128Te, and 120Te

    The double-beta decay Q values of 130Te, 128Te, and 120Te have been determined from parent-daughter mass differences measured with the Canadian Penning Trap mass spectrometer. The 132Xe-129Xe mass difference, which is precisely known, was also determined to confirm the accuracy of these results. The 130Te Q value was found to be 2527.01(32) keV, which is 3.3 keV lower than the 2003 Atomic Mass Evaluation recommended value, but in agreement with the most precise previous measurement. The uncertainty has been reduced by a factor of 6 and is now significantly smaller than the resolution achieved or foreseen in experimental searches for neutrinoless double-beta decay. The 128Te and 120Te Q values were found to be 865.87(131) keV and 1714.81(125) keV, respectively. For 120Te, this reduction in uncertainty of nearly a factor of 8 opens up the possibility of using this isotope for sensitive searches for neutrinoless double-electron capture and electron capture with positron emission. Comment: 5 pages, 2 figures, submitted to Physical Review Letters.
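    For orientation, the Q value quoted above is simply the parent-daughter atomic mass difference (the electron masses cancel for double-beta-minus decay), and a Penning trap obtains that difference from a ratio of cyclotron frequencies. The relations below are standard textbook ones, not formulas taken from the paper.

```latex
% Standard relations (not from the paper): the double-beta decay Q value as an
% atomic mass difference, and the cyclotron-frequency ratio a Penning trap compares.
\begin{align}
  Q_{\beta\beta} &= \left[ m\!\left({}^{130}\mathrm{Te}\right) - m\!\left({}^{130}\mathrm{Xe}\right) \right] c^{2}, \\
  \nu_{c} &= \frac{qB}{2\pi m}
  \qquad\Longrightarrow\qquad
  \frac{m_{1}}{m_{2}} = \frac{q_{1}}{q_{2}} \, \frac{\nu_{c,2}}{\nu_{c,1}} .
\end{align}
```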

    Mass measurements near the r-process path using the Canadian Penning Trap mass spectrometer

    The masses of 40 neutron-rich nuclides from Z = 51 to 64 were measured at an average precision of $\delta m/m = 10^{-7}$ using the Canadian Penning Trap mass spectrometer at Argonne National Laboratory. The measurements, of fission fragments from a $^{252}$Cf spontaneous fission source in a helium gas catcher, approach the predicted path of the astrophysical $r$ process. Where overlap exists, this data set is largely consistent with previous measurements from Penning traps, storage rings, and reaction energetics, but large systematic deviations are apparent in $\beta$-endpoint measurements. Differences in mass excess from the 2003 Atomic Mass Evaluation of up to 400 keV are seen, as well as systematic disagreement with various mass models. Comment: 15 pages, 16 figures. v2 updated, published in Physical Review
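    For scale, the mass excess and the absolute uncertainty implied by the quoted relative precision follow from standard definitions (the A of roughly 140 used below is a rough stand-in for these fission fragments, not a number from the paper); an uncertainty of order 10 keV is what makes deviations of up to 400 keV from the 2003 Atomic Mass Evaluation significant.

```latex
% Standard definitions (not from the paper): the mass excess, and the absolute
% uncertainty implied by the quoted relative precision for a typical A ~ 140 fragment.
\begin{align}
  \Delta(Z, A) &= \left[ m(Z, A) - A\,u \right] c^{2}, \\
  \delta m \, c^{2} &\approx \frac{\delta m}{m} \, A \, u \, c^{2}
    \approx 10^{-7} \times 140 \times 931.5~\mathrm{MeV} \approx 13~\mathrm{keV}.
\end{align}
```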

    The Uses of Argument in Mathematics

    Stephen Toulmin once observed that ‘it has never been customary for philosophers to pay much attention to the rhetoric of mathematical debate’. Might the application of Toulmin's layout of arguments to mathematics remedy this oversight? Toulmin's critics fault the layout as requiring so much abstraction as to permit incompatible reconstructions. Mathematical proofs may indeed be represented by fundamentally distinct layouts. However, cases of genuine conflict characteristically reflect an underlying disagreement about the nature of the proof in question. Comment: 10 pages, 5 figures. To be presented at the Ontario Society for the Study of Argumentation Conference, McMaster University, May 2005 and LOGICA 2005, Hejnice, Czech Republic, June 2005.