991 research outputs found

    “some kind of thing it aint us but yet its in us”: David Mitchell, Russell Hoban, and metafiction after the millennium

    This article appraises the debt that David Mitchell’s Cloud Atlas owes to the novels of Russell Hoban, including, but not limited to, Riddley Walker. After mapping a history of Hoban’s philosophical perspectives and Mitchell’s intertextual genre-impersonation practice, the article assesses the degree to which Mitchell’s metatextual methods indicate a nostalgia for bygone radical aesthetics rather than reaching for new modes of their own. The article not only proposes several new backdrops against which Mitchell’s novel can be read but also conducts the first in-depth appraisal of Mitchell’s formal linguistic replication of Riddley Walker.

    Unbounded randomness certification using sequences of measurements

    Unpredictability, or randomness, of the outcomes of measurements made on an entangled state can be certified provided that the statistics violate a Bell inequality. In the standard Bell scenario, where each party performs a single measurement on its share of the system, only a finite amount of randomness, of at most 4 log_2 d bits, can be certified from a pair of entangled particles of dimension d. Our work shows that this fundamental limitation can be overcome using sequences of (non-projective) measurements on the same system. More precisely, we prove that one can certify any amount of random bits from a pair of qubits in a pure state as the resource, even if it is arbitrarily weakly entangled. In addition, this certification is achieved by near-maximal violation of a particular Bell inequality for each measurement in the sequence. Comment: 4 + 5 pages (1 + 3 images), published version.
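    The dimension bound quoted in this abstract is a simple expression that can be made concrete; below is a minimal sketch (the function name is illustrative, not from the paper):

```python
import math

def single_round_randomness_bound(d: int) -> float:
    """Upper bound, in bits, on the randomness certifiable from a pair of
    entangled d-dimensional particles in the standard Bell scenario
    (one measurement per party), as quoted in the abstract: 4 * log2(d)."""
    return 4 * math.log2(d)

# For qubits (d = 2) the standard scenario tops out at 4 bits; the paper's
# point is that sequential measurements remove this cap entirely.
```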

    Generalized Bell Inequality Experiments and Computation

    We consider general settings of Bell inequality experiments with many parties, where each party chooses from a finite number of measurement settings, each with a finite number of outcomes. We investigate the constraints that Bell inequalities place upon the correlations possible in local hidden variable theories using a geometrical picture of correlations. We show that local hidden variable theories can be characterized in terms of limited computational expressiveness, which allows us to characterize families of Bell inequalities. The limited computational expressiveness for many settings (each with many outcomes) generalizes previous results about the many-party situation in which each party chooses between two possible measurements (each with two outcomes). Using this computational picture, we present generalizations of the Popescu-Rohrlich non-local box for many parties and non-binary inputs and outputs at each site. Finally, we comment on the effect of pre-processing of measurement data in our generalized setting and show that it becomes problematic outside of the binary setting, in that it allows local hidden variable theories to simulate maximally non-local correlations such as those of these generalized Popescu-Rohrlich non-local boxes. Comment: 16 pages, 2 figures, supplemental material available upon request. Typos corrected and references added.
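    The standard two-party, binary Popescu-Rohrlich box that this abstract generalizes can be written down directly; the sketch below (names are illustrative) checks that it attains the algebraic maximum of the CHSH expression:

```python
# The PR box outputs bits a, b satisfying a XOR b = x AND y with uniform
# marginals. Its correlator E(x, y) = P(a = b | x, y) - P(a != b | x, y)
# is therefore +1 unless x = y = 1, where it is -1.

def pr_correlator(x: int, y: int) -> float:
    return -1.0 if (x == 1 and y == 1) else 1.0

def chsh_value(E) -> float:
    # CHSH expression: E(0,0) + E(0,1) + E(1,0) - E(1,1)
    return E(0, 0) + E(0, 1) + E(1, 0) - E(1, 1)

# Local hidden variable theories obey chsh_value(E) <= 2, quantum mechanics
# reaches 2*sqrt(2), and the PR box attains the algebraic maximum of 4.
```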

    Learning to Teach Argumentation: Research and development in the science classroom

    The research reported in this study focuses on an investigation into the teaching of argumentation in secondary science classrooms. Over a one-year period, a group of 12 teachers from schools in the greater London area attended a series of workshops to develop materials and strategies to support the teaching of argumentation in scientific contexts. Data were collected at the beginning and end of the year by audio- and video-recording lessons in which the teachers attempted to implement argumentation. To assess the quality of argumentation, analytical tools derived from Toulmin's argument pattern (TAP) were developed and applied to classroom transcripts. Analysis shows there was development in teachers' use of argumentation across the year. Results indicate that the pattern of use of argumentation is teacher-specific, as is the nature of change. To inform future professional development programmes, transcripts of five teachers (three showing significant change, two showing none) were analysed in more detail to identify features of teachers' oral contributions that facilitated and supported argumentation. The analysis showed that all teachers attempted to encourage a variety of processes involved in argumentation and that the teachers whose lessons included the highest quality of argumentation (by TAP analysis) also encouraged higher-order processes in their teaching. The analysis of teachers' facilitation of argumentation has helped to guide the development of in-service materials and to identify the barriers to learning in the professional development of less experienced teachers.

    Non-adaptive Measurement-based Quantum Computation and Multi-party Bell Inequalities

    Quantum correlations exhibit behaviour that cannot be resolved with a local hidden variable picture of the world. In quantum information, they are also used as resources for information processing tasks, such as Measurement-based Quantum Computation (MQC). In MQC, universal quantum computation can be achieved via adaptive measurements on a suitable entangled resource state. In this paper, we look at a version of MQC in which we remove the adaptivity of measurements and aim to understand what computational abilities still remain in the resource. We show that there are explicit connections between this model of computation and the question of non-classicality in quantum correlations. We demonstrate this by focussing on deterministic computation of Boolean functions, in which natural generalisations of the Greenberger-Horne-Zeilinger (GHZ) paradox emerge; we then explore probabilistic computation, via which multipartite Bell inequalities can be defined. We use this correspondence to define families of multi-party Bell inequalities, which we show to have a number of interesting contrasting properties. Comment: 13 pages, 4 figures, final version accepted for publication.
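    The GHZ paradox that this abstract generalises can be verified numerically; a minimal sketch, assuming the standard three-qubit GHZ state and Pauli measurements (not code from the paper):

```python
import numpy as np

# Pauli matrices and the three-qubit GHZ state (|000> + |111>) / sqrt(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
ghz = np.zeros(8, dtype=complex)
ghz[0] = ghz[7] = 1 / np.sqrt(2)

def correlator(a, b, c):
    """Expectation value <GHZ| a (x) b (x) c |GHZ>."""
    op = np.kron(np.kron(a, b), c)
    return float(np.real(ghz.conj() @ op @ ghz))

# Deterministic quantum predictions: <XXX> = +1, <XYY> = <YXY> = <YYX> = -1.
# A local hidden variable assignment forces the product of the four values
# to be +1 (each local observable appears an even number of times), while
# quantum mechanics gives (+1)(-1)(-1)(-1) = -1: the GHZ contradiction.
```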

    Characterizing the Effects of Chronic 2G Centrifugation on the Rat Skeletal System

    During weightlessness, the skeletal system of astronauts is negatively affected by decreased calcium absorption and bone mass loss. Therefore, it is necessary to counteract these changes for long-term skeletal health during space flights. Our long-term plan is to assess artificial gravity (AG) as a possible solution to mitigate these changes. In this study, we aim to determine the skeletal acclimation to chronic centrifugation. We hypothesize that a 2G hypergravity environment causes an anabolic response in growing male rats. Specifically, we predict chronic 2G to increase tissue mineral density and the bone volume fraction of the cancellous tissue, and to increase overall bone strength. Systemically, we predict that bone formation markers (i.e., osteocalcin) are elevated and resorption markers (i.e., tartrate-resistant acid phosphatase) are decreased or unchanged from controls. The experiment has three groups, each with n = 8: chronic 2G, cage control (housed on the centrifuge, but not spun), and a vivarium control (normal rat caging). Pre-pubescent, male Long-Evans rats were used to assess our hypothesis. This group was subjected to 90 days of 2G via centrifugation performed at the Chronic Acceleration Research Unit (CARU) at the University of California, Davis. After 90 days, animals were euthanized and tissues collected. Blood was drawn via cardiac puncture and the right leg collected for structural (via micro-computed tomography) and strength quantification. Understanding how to counteract these skeletal changes will have major impacts for both space-faring astronauts and people living on Earth.

    Trends in Pennsylvania 8th and 11th Grade Student Test Performance Since the Common Core Implementation

    Abstract The purpose of this research study was to explore trends in student test performance since the Common Core implementation in 8th and 11th grades in Pennsylvania. After receiving failing grades for the Pennsylvania State Standards when compared with other states, legislators adopted the Pennsylvania Common Core Standards in 2013. Much of this decision was grounded in the belief that with new standards, Pennsylvania student test scores would move from 35-45% proficiency levels in Reading and Math to 100% proficiency (Hamilton, 2007). Research questions focused on the trends in students’ scores over time as reported by the PSSA and Keystone exams, administered each year. A quantitative analysis was performed with repeated measures for 8th grade from 2015-2017 and for 11th grade from 2013-2017, looking for statistical significance in the general population, the “Historically Underperforming” population, and in locales (urban, suburban, rural, and town). Where significance was found, correlations were run with the covariates of Black/Hispanic and poor student populations. Results showed significant growth in 8th grade math scores over time, with negative correlations with race and poverty, which also affected 8th grade ELA scores in the “Historically Underperforming” population. Eleventh grade scores showed no significance except negative correlations associated with race among the “Historically Underperforming” reading students. When drilling down to locales, significance was found in growth made by city and rural schools in 8th grade math and in short-term gains in 11th grade math.
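    The negative correlations this abstract reports are standard Pearson computations; a minimal sketch with made-up district-level numbers (illustrative only, not the study’s PSSA/Keystone data):

```python
import numpy as np

# Hypothetical values: change in 8th grade math proficiency (percentage
# points) alongside district poverty rate. Both arrays are invented for
# illustration; the study's actual data are not reproduced here.
delta_math = np.array([5.1, 3.2, -1.0, 4.4, 0.8, -2.3])
poverty_rate = np.array([12.0, 20.0, 55.0, 15.0, 40.0, 60.0])

# Pearson correlation coefficient; a value below zero matches the reported
# pattern of score growth being dampened in higher-poverty districts.
r = np.corrcoef(delta_math, poverty_rate)[0, 1]
```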
