2,095 research outputs found

    Antitrust Error

    Fueled by economics, antitrust has evolved into a highly sophisticated body of law. Its malleable doctrine enables courts to tailor optimal standards to a wide variety of economic phenomena. Indeed, economic theory has been so revolutionary that modern U.S. competition law bears little resemblance to that which prevailed fifty years ago. Yet, for all the contributions of economics, its explanatory powers are subject to important limitations. Profound questions remain at the borders of contemporary antitrust enforcement, but answers remain elusive. It is because of the epistemological limitations of economic analysis that antitrust remains unusually vulnerable to error. The fear of mistakenly ascribing anticompetitive labels to innocuous conduct is now pervasive. The Supreme Court has repeatedly framed its rulings in a manner that shows sensitivity to the unavoidability of error. In doing so, it has adopted the decision-theoretic premise that Type I errors (false condemnations) are generally more costly than Type II errors (false acquittals). It has crafted a pro-defendant body of jurisprudence accordingly. In 2008, the Justice Department picked up the gauntlet and published the first definitive attempt at extrapolating optimal error rules. Yet, in 2009, the new administration promptly withdrew the report, opining that it could “separate the wheat from the chaff,” thus marginalizing the issue of error. Notwithstanding this confident proclamation, error remains as visible as ever. Intel’s behavior in offering rebates has been subject to wildly fluctuating analysis by the U.S. and E.U. enforcement agencies. In a marked departure from precedent, the DOJ is again viewing vertical mergers with concern. And the agency has reversed course on the legality of exclusionary payments in the pharmaceutical industry. Antitrust divergence, both within and outside the United States, remains painfully apparent, demonstrable proof that vulnerability to error remains systemic. For this reason, error analysis may be the single most important unresolved issue facing modern competition policy. This Article seeks to challenge the contemporary mode of error analysis in antitrust law. We explain the causes and consequences of antitrust error and articulate a variety of suggested cures. In doing so, we debunk the current presumption that false positives are necessarily more harmful than false negatives. We highlight a variety of cases in which the contemporary bias in favor of underenforcement should be revisited.
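    The decision-theoretic trade-off described above can be made concrete with a back-of-the-envelope expected-cost comparison. The Python sketch below uses purely hypothetical error probabilities and harm estimates (none are drawn from the Article) to show how the attractiveness of a pro-defendant rule hinges on the relative weight assigned to false positives and false negatives.

    # All probabilities and costs below are illustrative placeholders,
    # not figures from the Article.

    def expected_error_cost(p_fp, p_fn, cost_fp, cost_fn):
        """Expected social cost of a liability rule's errors."""
        return p_fp * cost_fp + p_fn * cost_fn

    # A pro-defendant rule: few false positives, many false negatives.
    pro_defendant = expected_error_cost(0.05, 0.30, cost_fp=100.0, cost_fn=40.0)

    # A pro-plaintiff rule with the opposite error profile.
    pro_plaintiff = expected_error_cost(0.30, 0.05, cost_fp=100.0, cost_fn=40.0)

    # Raise the assumed cost of a false negative and the ranking flips,
    # which is exactly the presumption the Article asks courts to revisit.
    print(pro_defendant, pro_plaintiff)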

    Pure phase-encoded MRI and classification of solids

    Here, the authors combine a pure phase-encoded magnetic resonance imaging (MRI) method with a new tissue-classification technique to make geometric models of a human tooth. They demonstrate the feasibility of three-dimensional imaging of solids using a conventional 11.7-T NMR spectrometer. In solid-state imaging, confounding line-broadening effects are typically eliminated using coherent averaging methods. Instead, the authors circumvent them by detecting the proton signal at a fixed phase-encode time following the radio-frequency excitation. By a judicious choice of the phase-encode time in the MRI protocol, the authors differentiate enamel and dentine sufficiently to apply a new classification algorithm successfully. This tissue-classification algorithm identifies the distribution of different material types, such as enamel and dentine, in volumetric data. In this algorithm, the authors treat a voxel as a volume, not as a single point, and assume that each voxel may contain more than one material. They use the distribution of MR image intensities within each voxel-sized volume to estimate the relative proportion of each material using a probabilistic approach. This combined approach, involving MRI and data classification, is directly applicable to bone imaging and hard-tissue contrast-based modeling of biological solids.
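    The voxel-histogram idea can be illustrated with a toy calculation. The minimal Python sketch below assumes that each material (labelled "enamel" and "dentine" purely for illustration) produces roughly Gaussian MR intensities with known mean and spread, and estimates the mixing fraction inside a single voxel-sized volume by maximizing the likelihood of that voxel's intensity samples over a grid of candidate fractions; the intensity models and the grid search are assumptions, not the authors' exact algorithm.

    import numpy as np

    # Hypothetical intensity models (mean, standard deviation) for two
    # materials; values are illustrative, not measured tooth intensities.
    MATERIALS = {"enamel": (0.25, 0.05), "dentine": (0.65, 0.08)}

    def gaussian(x, mu, sigma):
        return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

    def estimate_enamel_fraction(voxel_samples, grid=np.linspace(0.0, 1.0, 101)):
        """Treat the voxel as a mixture of two materials and return the
        enamel fraction that best explains the intensities inside it."""
        (mu_e, sd_e), (mu_d, sd_d) = MATERIALS["enamel"], MATERIALS["dentine"]
        log_likelihoods = [
            np.sum(np.log(f * gaussian(voxel_samples, mu_e, sd_e)
                          + (1.0 - f) * gaussian(voxel_samples, mu_d, sd_d)
                          + 1e-12))
            for f in grid
        ]
        return grid[int(np.argmax(log_likelihoods))]

    # Example: a boundary voxel whose samples straddle both materials.
    rng = np.random.default_rng(0)
    samples = np.concatenate([rng.normal(0.25, 0.05, 40), rng.normal(0.65, 0.08, 60)])
    print(estimate_enamel_fraction(samples))   # roughly 0.4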

    Why citizens don’t like paying for public goods with their taxes – and how institutions can change that

    Why are Americans so against paying taxes to fund basic government functions such as roads and education? In new research, Alan M. Jacobs and J. Scott Matthews find that many citizens object to paying for public investment because they do not trust politicians to spend new revenues as promised. Using online experiments with voting-age US citizens, they find that support for using taxation to pay for investment depended on how much voters trusted the institution charged with carrying out the work. Local governments and the military were trusted to a much greater degree than Congress, especially among conservatives. Citizens were also willing to pay more for public goods when they were told that the new taxes would be set aside in a dedicated trust fund account.

    Method and apparatus for shadow aperture backscatter radiography (SABR) system and protocol

    A shadow aperture backscatter radiography (SABR) system includes at least one penetrating radiation source for providing a penetrating radiation field, and at least one partially transmissive radiation detector, wherein the partially transmissive radiation detector is interposed between an object region to be interrogated and the radiation source. The partially transmissive radiation detector transmits a portion of the illumination radiation field. A shadow aperture, having a plurality of radiation-attenuating regions with apertures between them, is disposed between the radiation source and the detector. The apertures allow the illumination radiation field to reach the object region; backscattered radiation from the object is then detected in the regions of the detector shadowed by the radiation-attenuating regions, and an image is generated from that signal.
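    One way to picture the image-formation step is that the detector records both the transmitted beam and backscatter, and the backscatter image is read out only from the pixels that the attenuating regions shadow from direct illumination. The short Python sketch below expresses that masking step; the array names, the Boolean shadow mask, and the NaN fill are illustrative assumptions, not the patented protocol.

    import numpy as np

    def sabr_backscatter_image(detector_counts, shadow_mask):
        """Keep only the detector pixels shadowed from the direct beam.

        detector_counts : 2-D array of counts from the partially
            transmissive detector (direct beam plus backscatter).
        shadow_mask : Boolean array, True where a radiation-attenuating
            region blocks the direct beam, so the signal recorded there
            is predominantly backscatter from the object region.
        """
        return np.where(shadow_mask, detector_counts, np.nan)

    # Illustrative 6x6 example with a striped shadow aperture.
    counts = np.random.default_rng(1).poisson(50, size=(6, 6)).astype(float)
    mask = np.zeros((6, 6), dtype=bool)
    mask[:, ::2] = True            # hypothetical shadowed columns
    print(sabr_backscatter_image(counts, mask))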

    Radiography by selective detection of scatter field velocity components

    A reconfigurable collimated radiation detector, system, and related method include at least one collimated radiation detector. The detector has an adjustable collimator assembly including at least one feature, such as a fin, optically coupled thereto. Adjusting the collimator selects the particular directions of travel along which scattered radiation emitted from an irradiated object reaches the detector. The collimated detector is preferably a collimated detector array in which the collimators are independently adjustable. This independent motion makes it possible to focus the image by selecting the desired scatter-field components. When an array of reconfigurable collimated detectors is provided, separate image data can be obtained from each detector, and the respective images can be cross-correlated and combined to form an enhanced image.
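    The cross-correlation and combination step lends itself to a simple numerical sketch. The Python example below registers each detector image to the first one by finding the integer pixel shift that maximizes their circular cross-correlation, then averages the aligned images; the FFT-based registration and plain averaging are assumptions made for illustration, since the text does not specify a particular algorithm.

    import numpy as np

    def align_by_cross_correlation(reference, image):
        """Shift image by the integer offset that best aligns it with reference."""
        corr = np.fft.ifft2(np.fft.fft2(reference) * np.conj(np.fft.fft2(image))).real
        shift = np.unravel_index(np.argmax(corr), corr.shape)
        # Interpret shifts larger than half the image size as negative offsets.
        shift = tuple(s if s <= dim // 2 else s - dim for s, dim in zip(shift, corr.shape))
        return np.roll(image, shift, axis=(0, 1))

    def combine_detector_images(images):
        """Cross-correlate each detector image against the first, then average."""
        reference = images[0]
        aligned = [reference] + [align_by_cross_correlation(reference, im) for im in images[1:]]
        return np.mean(aligned, axis=0)

    # Illustrative use: three noisy, slightly shifted views of the same scene.
    rng = np.random.default_rng(3)
    base = rng.random((32, 32))
    views = [base, np.roll(base, (2, -1), axis=(0, 1)), np.roll(base, (-3, 4), axis=(0, 1))]
    views = [v + rng.normal(0, 0.05, v.shape) for v in views]
    print(combine_detector_images(views).shape)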

    Dysregulated methylation at imprinted genes in prostate tumor tissue detected by methylation microarray.

    BACKGROUND: Imprinting is an important epigenetic regulator of gene expression that is often disrupted in cancer. While loss of imprinting (LOI) has been reported for two genes in prostate cancer (IGF2 and TFPI2), disease-related changes in methylation across all imprinted gene regions have not been investigated. METHODS: Using an Illumina Infinium Methylation Assay, we analyzed methylation of 396 CpG sites in the promoter regions of 56 genes in a pooled sample of 12 pairs of prostate tumor and adjacent normal tissue. Selected LOI identified from the array was validated using the Sequenom EpiTYPER assay on individual samples and further confirmed by expression data from publicly available datasets. RESULTS: Methylation significantly increased at 52 sites and significantly decreased at 17 sites across 28 unique genes (P < 0.05), and the strongest evidence for loss of imprinting was demonstrated in the tumor suppressor genes DLK1, PLAGL1, SLC22A18, TP73, and WT1. Differential expression of these five genes in prostate tumor versus normal tissue, using array data from a publicly available database, was consistent with the observed LOI patterns, and WT1 hypermethylation was confirmed using quantitative DNA methylation analysis. CONCLUSIONS: Together, these findings suggest a more widespread dysregulation of genetic imprinting in prostate cancer than previously reported and warrant further investigation.
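    As a rough illustration of the kind of per-site comparison described above, the Python sketch below applies a paired t-test to tumor versus adjacent-normal beta values at each CpG site and flags sites with P < 0.05, split by direction of change. The paired t-test, the simulated beta values, and the absence of any multiple-testing correction are assumptions for illustration; the abstract does not state the authors' exact statistical procedure.

    import numpy as np
    from scipy import stats

    def differential_methylation(tumor_beta, normal_beta, alpha=0.05):
        """Flag CpG sites whose methylation differs between paired samples.

        tumor_beta, normal_beta : arrays of shape (n_pairs, n_sites) of
            beta values (0 = unmethylated, 1 = fully methylated).
        Returns indices of sites with P < alpha, split by direction.
        """
        _, p_values = stats.ttest_rel(tumor_beta, normal_beta, axis=0)
        mean_diff = tumor_beta.mean(axis=0) - normal_beta.mean(axis=0)
        gained = np.where((p_values < alpha) & (mean_diff > 0))[0]
        lost = np.where((p_values < alpha) & (mean_diff < 0))[0]
        return gained, lost

    # Toy data: 12 tumor/normal pairs across 396 CpG sites, with a few
    # artificially hypermethylated sites to show the expected output.
    rng = np.random.default_rng(2)
    normal = rng.beta(2, 5, size=(12, 396))
    tumor = normal + rng.normal(0, 0.02, normal.shape)
    tumor[:, :10] += 0.2                    # hypothetical hypermethylation
    gained, lost = differential_methylation(np.clip(tumor, 0, 1), normal)
    print(len(gained), len(lost))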

    Method and Apparatus for Computed Imaging Backscatter Radiography

    Systems and methods of x-ray backscatter radiography are provided. A single-sided, non-destructive imaging technique that uses x-ray radiation to image subsurface features is disclosed; it scans a region through a fan beam aperture and gathers data using rotational motion.

    PCNA Ubiquitination Is Important, But Not Essential for Translesion DNA Synthesis in Mammalian Cells

    Translesion DNA synthesis (TLS) is a DNA damage tolerance mechanism in which specialized low-fidelity DNA polymerases bypass replication-blocking lesions, and it is usually associated with mutagenesis. In Saccharomyces cerevisiae, a key event in TLS is the monoubiquitination of PCNA, which enables recruitment of the specialized polymerases to the damaged site through their ubiquitin-binding domains. In mammals, however, the requirement for ubiquitinated PCNA (PCNA-Ub) in TLS is debated. We show that UV-induced Rpa foci, indicative of single-stranded DNA (ssDNA) regions caused by UV, accumulate faster and disappear more slowly in Pcna(K164R/K164R) cells, which are resistant to PCNA ubiquitination, than in Pcna(+/+) cells, consistent with a TLS defect. Direct analysis of TLS in these cells, using gapped plasmids with site-specific lesions, showed that TLS is strongly reduced across UV lesions and the cisplatin-induced intrastrand GG crosslink. A similar effect was obtained in cells lacking Rad18, the E3 ubiquitin ligase that monoubiquitinates PCNA. Consistently, cells lacking Usp1, the enzyme that de-ubiquitinates PCNA, exhibited increased TLS across a UV lesion and the cisplatin adduct. In contrast, cells lacking the Rad5 homologs Shprh and Hltf, which polyubiquitinate PCNA, exhibited normal TLS. Knocking down the expression of the TLS genes Rev3L, PolH, or Rev1 in Pcna(K164R/K164R) mouse embryo fibroblasts each caused increased sensitivity to UV radiation, indicating the existence of TLS pathways that are independent of PCNA-Ub. Taken together, these results indicate that PCNA-Ub is required for maximal TLS. However, TLS polymerases can also be recruited to damaged DNA in the absence of PCNA-Ub and perform TLS, albeit at a significantly lower efficiency and with altered mutagenic specificity.

    Geology for planning in St. Clair County, Illinois.

    Cover title. At head of title: State of Illinois, Department of Registration and Education. Includes bibliographical references (pages 32-35).