An Investigation of Poly(N-isopropylacrylamide) for Applications with Microfluidic Paper-Based Analytical Devices
N,N′-methylenebisacrylamide-crosslinked poly(N-isopropylacrylamide), also known as P(NIPAM), was developed as a fluid delivery system for use with microfluidic paper-based analytical devices (microPADs). MicroPADs are postage-stamp-sized devices made out of paper that can be used as platforms for low-cost, simple-to-use point-of-care diagnostic assays. P(NIPAM) is a thermally responsive polymer that absorbs aqueous solutions at room temperature and expels the solutions to microPADs when heated. The fluid delivery characteristics of P(NIPAM) were assessed, and P(NIPAM) was able to deliver multiple solutions to microPADs in specific sequences or simultaneously in a laminar-flow configuration. P(NIPAM) was then shown to be suitable for delivering four classes of reagents to microPADs: small molecules, enzymes, antibodies and DNA. P(NIPAM) successfully delivered a series of standard concentrations of glucose (0–5 mM) to microPADs equipped to perform a colorimetric glucose assay. The results of these tests were used to produce an external calibration curve, which in turn was used to accurately determine the concentrations of glucose in sample solutions. P(NIPAM) successfully delivered fluorescein-labeled IgG and fluorescein-labeled oligonucleotides (20 base pairs) to microPADs in a variety of concentrations. P(NIPAM) also successfully delivered horseradish peroxidase (HRP) to microPADs, and it was determined that HRP could be stored in P(NIPAM) for 35 days with minimal loss in activity. The combination of P(NIPAM) with microPADs will allow for more complex assays to be performed with minimal user input, will facilitate the preparation of external calibration curves in the field, and may be useful in extending the shelf life of microPADs by stabilizing reagents.
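The external-calibration workflow described above can be sketched in a few lines: fit a line to the standard concentrations versus their measured signals, then invert the fit to recover the concentration of an unknown sample. This is a minimal illustration only; the intensity readings below are invented, not data from the study.

```python
# Hypothetical sketch of an external calibration curve for a colorimetric
# glucose assay: fit signal vs. concentration for the standards, then invert
# the fit to estimate an unknown sample's concentration.

def linear_fit(xs, ys):
    """Ordinary least-squares fit y = slope * x + intercept."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    sxx = sum((x - mean_x) ** 2 for x in xs)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Standard glucose concentrations (mM) and invented colorimetric intensities.
standards_mM = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
intensities = [0.02, 0.11, 0.21, 0.30, 0.41, 0.50]  # illustrative values only

slope, intercept = linear_fit(standards_mM, intensities)

def concentration_from_intensity(i):
    """Invert the calibration curve to estimate concentration (mM)."""
    return (i - intercept) / slope

unknown_mM = concentration_from_intensity(0.25)  # a sample reading
```

In practice the fit quality (e.g. R²) would be checked before the curve is trusted, and readings outside the calibrated 0–5 mM range should not be extrapolated.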
Cluster Approximation for the Farey Fraction Spin Chain
We consider the Farey fraction spin chain in an external field h. Utilising ideas from dynamical systems, the free energy of the model is derived by means of an effective cluster energy approximation. This approximation is valid for divergent cluster sizes, and hence appropriate for the discussion of the magnetizing transition. We calculate the phase boundaries and the scaling of the free energy. At h = 0 we reproduce the rigorously known asymptotic temperature dependence of the free energy. For h ≠ 0, our results are largely consistent with those found previously using mean field theory and renormalization group arguments.
Comment: 17 pages, 3 figures
Didaktische Modelle von Computerspieltutorials
Computer games are based upon rules. Learning these rules is a precondition for entering the game. To acquire the corresponding basic skills, players need an instruction manual or at least a virtual environment that integrates these instructions into the game world.
This diploma thesis analyses the tutorials of four selected computer games in terms of their didactic approach in order to answer the following research question: "What methods are used to present computer game tutorials didactically?"
The theoretical chapter defines computer games as games within the meaning of Johan Huizinga and presents an overview of the literature relating to the topic. This is followed by definitions of the terms relevant to the research question. The research design and the development of a method for the didactic analysis of computer game tutorials are the focus of the final part of this chapter.
The empirical chapter first discusses each game individually. The analysis is based on evaluation sheets derived from educational action research and the categorization built upon them. On this basis, a didactic model is first prepared for each game, followed by an overall model.
Following the overall model, the research question is answered by examining the sub-questions derived from it.
The main results of the diploma thesis are as follows: all of the computer game tutorials examined are structured progressively, rarely demand the player's own initiative, are based upon partial tests, and are mostly framed narratively.
Pulse frequency techniques for automatic control
This thesis describes an investigation into the suitability of pulse frequency modulation (PFM) as a standard form of signal for representing quantities in process control. The encoding and decoding of PFM signals into both analogue and digital forms is examined in some detail. PFM is shown to be well suited for high-accuracy telemetry at moderate cost, provided ample channel bandwidth is available. The processing of the information in PFM signals by means of binary logic devices is treated systematically. Functional building blocks are identified, and shown to be capable of performing all the basic algebraic and differential operations needed for control.
The thesis concludes with an examination of applications and a discussion of PFM transducers, actuators and hierarchical control schemes. The performance constraints of two different process controllers are identified. Both controllers show 'P+I' action; one works continuously, the other has a cyclic action; both employ PFM techniques. They are shown to offer dynamic responses similar to those of conventional analogue controllers, in conjunction with high accuracy (e.g. errors less than ½%), computer compatibility and the facility for digital display.
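The core idea of PFM encoding and decoding described above can be sketched simply: a quantity is encoded as a pulse train whose frequency is proportional to the value, and decoded by counting pulses within a fixed gate time. The sketch below is an illustrative model, not the thesis's circuitry; the scale factors and timestamps are invented.

```python
# Minimal sketch of PFM: encode a value as pulse timestamps whose frequency
# is proportional to value / full_scale, then decode by pulse counting over
# a fixed gate time. Parameters here are illustrative assumptions.

def encode_pfm(value, full_scale, max_freq_hz, duration_s):
    """Return pulse timestamps for a PFM signal at freq = max_freq * value/FS."""
    freq = max_freq_hz * value / full_scale
    if freq <= 0:
        return []
    period = 1.0 / freq
    times = []
    t = 0.0
    while t < duration_s:
        times.append(t)
        t += period
    return times

def decode_pfm(pulse_times, full_scale, max_freq_hz, gate_s):
    """Count pulses within the gate window and map the frequency to a value."""
    count = sum(1 for t in pulse_times if t < gate_s)
    freq = count / gate_s
    return freq * full_scale / max_freq_hz

pulses = encode_pfm(50.0, 100.0, 10_000.0, 1.0)   # encode 50% of full scale
decoded = decode_pfm(pulses, 100.0, 10_000.0, 1.0)
```

The gate time sets the accuracy/bandwidth trade-off the abstract alludes to: a longer gate counts more pulses and so resolves the value more finely, at the cost of a slower measurement.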
Policy challenges from the "White" Senate inquiry into workplace-related health impacts of toxic dusts and nanoparticles
On 22 June 2005 the Senate of the Commonwealth of Australia voted to establish an inquiry into workplace harm related to toxic dust and emerging technologies (including nanoparticles). The inquiry became known as the "White" Inquiry after Mr Richard White, a financially uncompensated sufferer of industrial sandblasting-induced lung disease who was instrumental in its establishment. The "White" Inquiry delivered its final report and recommendations on 31 May 2006. This paper examines whether these recommendations and their implementation may provide a unique opportunity to modernize not only the relevant monitoring standards and processes, but also the related compensation systems for disease associated with workplace-related exposure to toxic dusts. It critically analyzes the likely role of the new Australian Safety and Compensation Council (ASCC) in this area. It also considers whether recommendations related to potential workplace-related harm from exposure to nanoparticles could commence a major shift in Australian healthcare regulation.
High-Speed Tracer Analysis of Metabolism (HS-TrAM)
Tracing the fate of stable isotopically-enriched nutrients is a sophisticated method of describing and quantifying the activity of metabolic pathways. Nuclear Magnetic Resonance (NMR) offers high-resolution data, yet is under-utilised owing to the length of time required to collect the data, the need for multiple samples for quantification, and complicated analysis. Here we present two techniques, quantitative spectral filters and enhancement of the splitting due to J-coupling in 1H,13C-HSQC NMR spectra, which allow the rapid collection of NMR data in a quantitative manner on a single sample. The reduced duration of HSQC spectral data acquisition opens up the possibility of real-time tracing of metabolism, including the study of metabolic pathways in vivo. We show how these novel techniques can be used to trace the fate of labelled nutrients in a whole-organ model of kidney preservation prior to transplantation, using a porcine kidney as a model organ, and also show how the use of multiple nutrients, differentially labelled with 13C and 15N, can be used to provide additional information with which to profile metabolic pathways.
Allele-Specific HLA Loss and Immune Escape in Lung Cancer Evolution
Immune evasion is a hallmark of cancer. Losing the ability to present neoantigens through human leukocyte antigen (HLA) loss may facilitate immune evasion. However, the polymorphic nature of the locus has precluded accurate HLA copy-number analysis. Here, we present loss of heterozygosity in human leukocyte antigen (LOHHLA), a computational tool to determine HLA allele-specific copy number from sequencing data. Using LOHHLA, we find that HLA LOH occurs in 40% of non-small-cell lung cancers (NSCLCs) and is associated with a high subclonal neoantigen burden, APOBEC-mediated mutagenesis, upregulation of cytolytic activity, and PD-L1 positivity. The focal nature of HLA LOH alterations, their subclonal frequencies, enrichment in metastatic sites, and occurrence as parallel events suggest that HLA LOH is an immune escape mechanism subject to strong microenvironmental selection pressures later in tumor evolution. Characterizing HLA LOH with LOHHLA refines neoantigen prediction and may have implications for our understanding of resistance mechanisms and immunotherapeutic approaches targeting neoantigens. Development of the bioinformatics tool LOHHLA allows precise measurement of allele-specific HLA copy number, improves the accuracy of neoantigen prediction, and uncovers insights into how immune escape contributes to tumor evolution in non-small-cell lung cancer.
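The allele-specific copy-number problem underlying this work can be illustrated with the standard purity-corrected relationships between coverage log-ratio (logR), B-allele frequency (BAF), and the two allele copy numbers. This is a hedged sketch of that generic model (ASCAT-style notation), not the LOHHLA implementation; the purity and allele states below are invented for the example.

```python
# Sketch of allele-specific copy-number recovery from tumor purity p,
# coverage log-ratio logR, and B-allele frequency (BAF), under the usual
# mixture model: observed signal = (1-p) * normal diploid + p * tumor.
# This illustrates the generic math, not LOHHLA's actual algorithm.

import math

def allele_specific_cn(log_r, baf, purity):
    """Invert the logR/BAF model, assuming an ideal platform response.

    coverage ratio: r   = (2*(1-p) + p*(nA+nB)) / 2
    BAF:            baf = ((1-p) + p*nB) / (2*(1-p) + p*(nA+nB))
    """
    r = 2.0 ** log_r
    n_total = (2.0 * r - 2.0 * (1.0 - purity)) / purity
    n_b = (baf * 2.0 * r - (1.0 - purity)) / purity
    n_a = n_total - n_b
    return n_a, n_b

# Example: one HLA allele lost (nA=1, nB=0) in an 80%-pure tumor.
# Forward-simulate the expected logR and BAF, then invert them.
p = 0.8
nA, nB = 1, 0
r = (2 * (1 - p) + p * (nA + nB)) / 2.0
baf = ((1 - p) + p * nB) / (2 * (1 - p) + p * (nA + nB))
log_r = math.log2(r)
est_a, est_b = allele_specific_cn(log_r, baf, p)  # recovers (1.0, 0.0)
```

Note how contamination by normal cells pulls the BAF toward 0.5 and the logR toward 0, which is why purity correction is essential before calling an allele lost; the HLA-specific difficulty LOHHLA addresses on top of this is aligning reads to the correct polymorphic allele in the first place.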
