
    Understanding Student Computational Thinking with Computational Modeling

    Recently, the National Research Council's framework for the Next Generation Science Standards highlighted "computational thinking" as one of its "fundamental practices". Ninth-grade students taking a physics course that employed the Modeling Instruction curriculum were taught to construct computational models of physical systems. Student computational thinking was assessed using a proctored programming assignment, a written essay, and a series of think-aloud interviews in which the students produced and discussed a computational model of a baseball in motion in a high-level programming environment (VPython). Roughly a third of the students in the study successfully completed the programming assignment. Student success on this assessment was tied to how they synthesized their knowledge of physics and computation. On the essay and interview assessments, students displayed distinct views of the relationship between force and motion; those who spoke of this relationship in causal (rather than observational) terms tended to have more success in the programming exercise.
    Comment: preprint to submit to PERC proceedings 201
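    The abstract does not include the students' code, but a minimal sketch of the kind of VPython model described (a baseball stepped forward in time from the net force) might look as follows; the mass, initial velocity, and time step are placeholder values, not the assignment's.

        from vpython import sphere, vector, rate, color

        # Placeholder parameters for illustration; not from the study.
        m = 0.145                 # baseball mass, kg
        g = vector(0, -9.8, 0)    # gravitational acceleration, m/s^2
        dt = 0.01                 # time step, s

        ball = sphere(pos=vector(0, 1, 0), radius=0.07,
                      color=color.white, make_trail=True)
        v = vector(10, 10, 0)     # initial velocity, m/s

        while ball.pos.y >= 0:
            rate(100)                     # limit the animation to 100 steps/s
            F = m * g                     # net force: gravity only
            v = v + (F / m) * dt          # force changes the velocity
            ball.pos = ball.pos + v * dt  # velocity changes the position

    The update loop makes the causal force-motion relationship discussed in the essays explicit: the net force changes the velocity, and the velocity changes the position.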

    Hypersonic simulations using open-source CFD and DSMC solvers

    Hypersonic hybrid hydrodynamic-molecular gas flow solvers must satisfy the two essential requirements of any high-speed reacting code: physical accuracy and computational efficiency. The James Weir Fluids Laboratory at the University of Strathclyde is currently developing an open-source hybrid code that will eventually reconcile the direct simulation Monte Carlo method, using the OpenFOAM application dsmcFoam, with the newly coded open-source two-temperature computational fluid dynamics solver hy2Foam. In conjunction with employing the CVDV chemistry-vibration model in hy2Foam, novel use is made of the QK rates in a CFD solver. In this paper, further testing is performed, in particular on the CFD solver, to ensure its efficacy before more advanced test cases are considered. The hy2Foam and dsmcFoam codes have been shown to compare reasonably well, providing a useful basis for other codes to compare against.
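    Hybrid CFD-DSMC solvers of this kind typically decide where the continuum description breaks down using a criterion such as the gradient-length local Knudsen number; the abstract does not say which criterion the Strathclyde code uses, so the following is only an illustrative sketch with a commonly quoted threshold of 0.05.

        import numpy as np

        def gradient_length_knudsen(q, dx, mean_free_path):
            """Gradient-length local Knudsen number for a 1D flow property q:
            Kn_GLL = lambda * |dq/dx| / q."""
            return mean_free_path * np.abs(np.gradient(q, dx)) / q

        def select_dsmc_cells(q, dx, mean_free_path, threshold=0.05):
            """Flag cells where the continuum (CFD) description is assumed to
            break down and a particle (DSMC) treatment would be used."""
            return gradient_length_knudsen(q, dx, mean_free_path) > threshold

        # Example: a smooth temperature profile with a sharp, shock-like jump.
        x = np.linspace(0.0, 1.0, 200)
        T = 300.0 + 2000.0 / (1.0 + np.exp(-(x - 0.5) / 0.005))
        flags = select_dsmc_cells(T, x[1] - x[0], mean_free_path=1e-3)
        print(f"{flags.sum()} of {flags.size} cells flagged for DSMC")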

    Cost-Effective Use of Silver Dressings for the Treatment of Hard-to-Heal Chronic Venous Leg Ulcers

    Aim: To estimate the cost-effectiveness of silver dressings using a health economic model based on time to wound healing in hard-to-heal chronic venous leg ulcers (VLUs). Background: Chronic venous ulceration affects 1–3% of the adult population and typically has a protracted course of healing, resulting in considerable costs to the healthcare system. The pathogenesis of VLUs includes excessive and prolonged inflammation, which is often related to critical colonisation and early infection. The use of silver dressings to control this bioburden and improve wound healing rates remains controversial. Methods: A decision tree was constructed to evaluate the cost-effectiveness of treatment with silver compared with non-silver dressings for four weeks in a primary care setting. Three outcomes ('healed ulcer', 'healing ulcer' and 'no improvement') were defined, reflecting the relative reduction in ulcer area from baseline to four weeks of treatment. A data set from a recent meta-analysis, based on four RCTs, was applied to the model. Results: Treatment with silver dressings for an initial four weeks gave a total cost saving of £141.57 compared with treatment with non-silver dressings. In addition, patients treated with silver dressings had faster wound closure than those treated with non-silver dressings. Conclusion: The use of silver dressings improves healing time and can lead to overall cost savings. These results can be used to guide healthcare decision makers in evaluating the economic aspects of treatment with silver dressings in hard-to-heal chronic VLUs.
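    A decision-tree model of this kind reduces to an expected-cost calculation over the outcome branches. The sketch below shows the mechanics only; the probabilities and costs are invented placeholders, not the figures from the meta-analysis.

        # Expected cost of one treatment arm: sum over outcome branches of
        # (branch probability) x (cost accrued along that branch).
        def expected_cost(branches):
            return sum(p * cost for p, cost in branches.values())

        # Placeholder values for illustration only; NOT from the study.
        silver = {
            "healed ulcer":   (0.30, 400.0),
            "healing ulcer":  (0.50, 550.0),
            "no improvement": (0.20, 900.0),
        }
        non_silver = {
            "healed ulcer":   (0.20, 350.0),
            "healing ulcer":  (0.45, 600.0),
            "no improvement": (0.35, 950.0),
        }

        saving = expected_cost(non_silver) - expected_cost(silver)
        print(f"cost saving with silver dressings: £{saving:.2f} per patient")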

    Growth and properties of GaSbBi alloys

    Molecular-beam epitaxy has been used to grow GaSb$_{1-x}$Bi$_x$ alloys with $x$ up to 0.05. The Bi content, lattice expansion, and film thickness were determined by Rutherford backscattering and x-ray diffraction, which also indicate high crystallinity and that >98% of the Bi atoms are substitutional. The observed Bi-induced lattice dilation is consistent with density functional theory calculations. Optical absorption measurements and valence band anticrossing modeling indicate that the room temperature band gap varies from 720 meV for GaSb to 540 meV for GaSb$_{0.95}$Bi$_{0.05}$, corresponding to a reduction of 36 meV/%Bi or 210 meV per 0.01 Å change in lattice constant.
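    The quoted rate follows directly from the two measured end points: the gap shrinks by 180 meV as the Bi fraction rises from 0 to 5%, i.e.

        \frac{E_g(\mathrm{GaSb}) - E_g(\mathrm{GaSb_{0.95}Bi_{0.05}})}{\Delta x}
          = \frac{720~\mathrm{meV} - 540~\mathrm{meV}}{5\,\%~\mathrm{Bi}}
          = 36~\mathrm{meV}/\%~\mathrm{Bi}.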

    Search for first generation leptoquark pair production in the electron + missing energy + jets final state

    We present a search for the pair production of first generation scalar leptoquarks (LQ) in data corresponding to an integrated luminosity of 5.4 fb$^{-1}$ collected with the D0 detector at the Fermilab Tevatron Collider in $p\bar{p}$ collisions at $\sqrt{s}=1.96$ TeV. In the channel $LQ\bar{LQ} \rightarrow e\nu_e qq'$, where $q$, $q'$ are $u$ or $d$ quarks, no significant excess of data over background is observed, and we set a 95% C.L. lower limit of 326 GeV on the leptoquark mass, assuming equal probabilities of leptoquark decays to $eq$ and $\nu_e q'$.
    Comment: 7 pages, 6 figures, submitted to PRD-R
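    The assumption of equal decay probabilities fixes the branching fraction $\beta = B(LQ \rightarrow eq) = 0.5$, so by a standard counting argument for pair decays (not spelled out in the abstract) the mixed $e\nu_e qq'$ final state occurs for half of all leptoquark pairs:

        P(e\nu_e qq') = 2\beta(1-\beta) = 2 \times 0.5 \times 0.5 = 0.5.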

    Phylogeny of snakes (Serpentes): combining morphological and molecular data in likelihood, Bayesian and parsimony analyses

    Copyright © 2007 The Natural History Museum.
    The phylogeny of living and fossil snakes is assessed using likelihood and parsimony approaches and a dataset combining 263 morphological characters with mitochondrial (2693 bp) and nuclear (1092 bp) gene sequences. The 'no common mechanism' (NCMr) and 'Markovian' (Mkv) models were employed for the morphological partition in likelihood analyses; likelihood scores under the NCMr model were more closely correlated with parsimony tree lengths. Both models accorded relatively less weight to the molecular data than did parsimony, with the effect being milder under the NCMr model. Partitioned branch and likelihood support values indicate that the mtDNA and nuclear gene partitions agree more closely with each other than with morphology. Despite differences between data partitions in phylogenetic signal, analytic models, and relative weighting, the parsimony and likelihood analyses all retrieved the following widely accepted groups: scolecophidians, alethinophidians, cylindrophiines, macrostomatans (sensu lato) and caenophidians. Anilius alone emerged as the most basal alethinophidian; the combined analyses resulted in a novel and stable position of uropeltines and cylindrophiines as the second-most basal clade of alethinophidians. The limbed marine pachyophiids, along with Dinilysia and Wonambi, were always basal to all living snakes. Other results stable in all combined analyses include: Xenopeltis and Loxocemus were sister taxa (fide morphology) but clustered with pythonines (fide molecules), and Ungaliophis clustered with a boine-erycine clade (fide molecules). Tropidophis remains enigmatic; it emerges as a basal alethinophidian in the parsimony analyses (fide molecules) but as a derived form in the likelihood analyses (fide morphology), largely due to the different relative weighting accorded to the data partitions.
    Michael S. Y. Lee, Andrew F. Hugall, Robin Lawson & John D. Scanlon
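    For context, the 'Markovian' (Mkv) model referred to here is, on our reading, Lewis's (2001) Mk model conditioned on characters being variable, since invariant morphological characters are generally not scored; schematically, for a character $d$,

        \Pr(d \mid \text{variable}) = \frac{\Pr(d)}{1 - \sum_{j} \Pr(\text{constant pattern } j)},

    which corrects the likelihood for the ascertainment bias introduced by omitting constant characters.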

    A justification of whistleblowing

    Penultimate version accepted for publication.
    Whistleblowing is the act of disclosing information from a public or private organization in order to reveal cases of corruption that pose immediate or potential danger to the public. Blowing the whistle involves personal risk, especially when legal protection is absent, and invites charges of betrayal, which often come in the form of legal prosecution under treason laws. In this article we argue that whistleblowing is justified when disclosures are made with the proper intent and fulfill specific communicative constraints in addressing issues of public interest. Three communicative constraints (informativeness, truthfulness and evidence) are discussed in this regard. We develop a 'harm test' to assess the intent of disclosures, concluding that it is not sufficient for justification. Along with the proper intent, a successful act of whistleblowing should provide information that serves the public interest. Taking cognizance of the varied conceptions of public interest, we present an account of public interest that fits the framework of whistleblowing disclosures. In particular, we argue that whistleblowing is justified, inter alia, when the information it conveys is of presumptive interest to a public insofar as it reveals an instance of injustice or a violation of a civil or political right committed against, and unbeknown to, some members of a polity.
    Project: 'Change of Direction. Fostering Whistleblowing in the Fight against Corruption', co-funded by the Internal Security Fund of the European Union (Grant Agreement Number: HOME/2014/ISFP/AG/EFCE/7233); SFRH/BPD/108669/2015

    Reconstruction of primary vertices at the ATLAS experiment in Run 1 proton–proton collisions at the LHC

    This paper presents the method and performance of primary vertex reconstruction in proton–proton collision data recorded by the ATLAS experiment during Run 1 of the LHC. The studies presented focus on data taken during 2012 at a centre-of-mass energy of $\sqrt{s}=8$ TeV. The performance has been measured as a function of the number of interactions per bunch crossing over a wide range, from one to seventy. The measurement of the position and size of the luminous region and its use as a constraint to improve the primary vertex resolution are discussed. A longitudinal vertex position resolution of about 30 μm is achieved for events with a high multiplicity of reconstructed tracks. The transverse position resolution is better than 20 μm and is dominated by the precision on the size of the luminous region. An analytical model is proposed to describe the primary vertex reconstruction efficiency as a function of the number of interactions per bunch crossing and of the longitudinal size of the luminous region. Agreement between the data and the predictions of this model is better than 3% up to seventy interactions per bunch crossing.
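    The gain from the beam-spot constraint can be understood as an inverse-variance-weighted combination of the track-only vertex fit with the luminous-region measurement. The sketch below shows that standard combination; the input resolutions are placeholder numbers, not ATLAS values.

        import math

        def constrained_resolution(sigma_fit, sigma_beamspot):
            """Inverse-variance combination of two independent Gaussian
            estimates: 1/sigma^2 = 1/sigma_fit^2 + 1/sigma_beamspot^2."""
            return 1.0 / math.sqrt(sigma_fit**-2 + sigma_beamspot**-2)

        # Placeholder inputs, for illustration only.
        sigma_fit = 40e-6    # transverse track-only vertex resolution, m
        sigma_beam = 15e-6   # transverse luminous-region size, m
        sigma = constrained_resolution(sigma_fit, sigma_beam)
        print(f"beam-spot-constrained resolution: {sigma * 1e6:.1f} um")

    With these inputs the constrained resolution approaches the luminous-region size itself, consistent with the abstract's observation that the transverse resolution is dominated by the precision on the size of the luminous region.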