
    ‘Multi View Graphing’: Linked Multi Visualization utilising Brushing, Binning and Clustering

    Visualization can provide a distinctly advantageous overview of data, enabling the rapid identification of anomalies, patterns or correlations that would not otherwise be obvious. Different visualization techniques each offer their own unique insight into the same data; however, the similarities that exist between them are not always clear. High data density can also be a significant issue when exploring data visually: the densest datasets can cause even well-suited visualization methodologies to succumb to usability issues. The most powerful data analysis environments are arguably those that provide interactive exploration; however, visual feedback in such environments is sometimes undesirably limited. The concept of linking different visualization styles using interactive techniques, such as brushing, is currently evident in multiple publicly available software environments. To explore the concept of linked visualization, a prototype application was produced, allowing up to four unique visual styles to be generated from the same data at the same time. Current brushing methodologies were extended and included in order to provide the ability to affect each visualization from within every other. The issue of data density was tackled through a novel approach to binning based around a uniform grid. Visual cues were used extensively throughout the prototype, ranging from representing a brushing area through to defining the basic starting parameters of a clustering algorithm. Three distinctly different test cases are presented to demonstrate the techniques showcased within the prototype, each in conjunction with external collaborators. Results suggest that linked multi visualization is a more effective method of data analysis, offering greater insight than a lone visualization technique.
    In tackling data density, the grid-based binning offers an easily disseminated overview of even extremely cluttered visualizations. The extensive use of visual cues in the prototype vindicated the theory that offering clear feedback within interactive environments is of the utmost importance. The interactive definition of clustering parameters via visual cues also shows promise as a concept, but one requiring further research. This study suggests that the current trend towards linking multiple visualization techniques within advanced data analysis environments is well founded; it also introduces novel brushing, binning and clustering concepts worthy of further investigation.
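The grid-based binning idea described above can be sketched in a few lines. This is a minimal illustration assuming 2D points and square cells; the prototype's actual cell sizing and rendering are not given in the abstract, so `grid_bin` and its parameters are hypothetical names for the general technique.

```python
from collections import Counter

def grid_bin(points, cell_size):
    """Aggregate 2D points into a uniform grid of square cells.

    Each point is mapped to the integer cell containing it; the
    returned Counter gives a per-cell count that can be rendered
    in place of the raw (cluttered) points.
    """
    return Counter(
        (int(x // cell_size), int(y // cell_size)) for x, y in points
    )

points = [(0.2, 0.3), (0.4, 0.1), (1.6, 1.7), (1.9, 1.2)]
bins = grid_bin(points, cell_size=1.0)
# Two points fall in cell (0, 0) and two in cell (1, 1).
```

Rendering one shaded cell per bin instead of every raw point is what gives the "easily disseminated overview" of a dense scatter, since the visual load grows with the number of occupied cells rather than the number of data points.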

    Numerical Modelling and Visualization of the Evolution of Extensional Fault Systems

    The purpose of this work falls into two categories: the first was to analyse the application of real-time Physics Engine software libraries to calculating a geological numerical model; the second was to analyse the applicability of glyph and implicit surface based visualization techniques for exploring the fault systems produced by the model. The current state of the art in Physics Engines was explored by redeveloping a Discrete Element Model to be calculated using NVIDIA's PhysX engine. Analyses of the suitability of the engine in terms of numerical accuracy and developmental capabilities are given, as well as the definition of a specialised and bespoke parallelisation technique. The use of various glyph based visualizations was explored to define a new standardised taxonomy for geological data, and the MetaBall visualization technique was applied to reveal three dimensional fault structures as an implicit surface. Qualitative analysis was undertaken in the form of a user study, comprising interviews with expert geologists. The processing pipeline used by many Physics Engines was found to be comparable to the design of Discrete Element Model software; however, aspects of their design, such as integration accuracy, limitation to single precision floating point and imposed limits on the scale of the n-body problem, mean their suitability is restricted to specific modelling cases. Glyph and implicit surface based visualization have been shown to be an effective way to present a geological Discrete Element Model, with the majority of experts interviewed able to perceive the fault structures it contained. Development of a new engine, or modification of an existing one in accordance with the findings of this thesis, would result in a library extremely well suited to the problem of rigid-body simulation for the sciences.
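The single-precision limitation noted above can be illustrated with a toy experiment, not the thesis's model: integrating the same harmonic oscillator twice, once with every value rounded to 32-bit floats (as a physics engine would store it) and once in native 64-bit doubles. The integrator choice (symplectic Euler) and step count are assumptions for the sketch.

```python
import struct

def to_f32(x):
    """Round a Python float to the nearest IEEE 754 single-precision value."""
    return struct.unpack('f', struct.pack('f', x))[0]

def oscillate(steps, dt, single_precision=False):
    """Symplectic Euler for a unit harmonic oscillator (x'' = -x)."""
    rnd = to_f32 if single_precision else (lambda v: v)
    x, v = rnd(1.0), rnd(0.0)
    for _ in range(steps):
        v = rnd(v - x * dt)
        x = rnd(x + v * dt)
    return x

x64 = oscillate(100_000, 1e-3)
x32 = oscillate(100_000, 1e-3, single_precision=True)
drift = abs(x64 - x32)  # accumulated single-precision error
```

The per-step rounding is tiny, but it accumulates over the long runs typical of geological DEM, which is one concrete reason a library capped at single precision can be unsuited to certain scientific modelling cases.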

    Visualizing a spherical geological discrete element model of fault evolution

    Discrete Element Modelling (DEM) is a numerical technique that uses a system of interacting discrete bodies to simulate the movement of material exposed to external forces. This technique is often used to simulate granular systems; however, by adding further elements that inter-connect the bodies, it can be used to simulate the deformation of a large volume of material. This method has precedent in the Earth Sciences and recently, with the increase in available computing power, it has been put to good use simulating the evolution of extensional faults in large scale crustal experiments involving over half a million individual spherical bodies. An interactive environment providing high quality rendering is presented, showing that interactivity is key in allowing the intelligent application of visualization methods such as colour-mapping and visibility thresholds in order to extract fault information from a geological DEM. It is also shown that glyph representation alone is not sufficient to provide full insight into the complex three dimensional geometries of the faults found within the model. To overcome this, a novel use of the MetaBall method is described, which results in implicit surface representations of sphere sub-sets. The surfaces produced are shown to provide greater insight into the faults found within the data, but also raise questions as to their meaning. © The Eurographics Association 2012
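The MetaBall idea can be sketched as follows. Each sphere contributes a field that decays with distance, and the implicit surface is the set of points where the summed field crosses a threshold, so nearby spheres blend into one smooth surface. The classic inverse-square falloff used here is one common formulation, not necessarily the exact one in the paper.

```python
def metaball_field(point, spheres):
    """Summed influence of a set of spheres at a query point.

    Each sphere is (cx, cy, cz, r); the field r^2 / d^2 decays with
    the squared distance d^2 from the sphere centre.
    """
    total = 0.0
    for cx, cy, cz, r in spheres:
        d2 = (point[0] - cx) ** 2 + (point[1] - cy) ** 2 + (point[2] - cz) ** 2
        if d2 > 0.0:
            total += r * r / d2
    return total

spheres = [(0.0, 0.0, 0.0, 1.0), (1.5, 0.0, 0.0, 1.0)]
threshold = 1.0
# A point midway between the two spheres lies inside the blended surface:
inside = metaball_field((0.75, 0.0, 0.0), spheres) >= threshold
```

Extracting the isosurface at the threshold (e.g. with marching cubes) over a sub-set of spheres is what turns a cloud of discrete glyphs into a single continuous fault surface.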

    Analysing the use of Real-time Physics Engines for Scientific Simulation: Exploring the Theoretical and Practical Benefits for Discrete Element Modelling

    Within computer science, reusability of specific modular software components is generally accepted as best practice. Simulation techniques such as Discrete Element Modelling (DEM) rely on the well defined problems of Newtonian physics, and while differences exist in the methods defined to compute solutions to these problems, each method follows the same basic set of premises. Recently, libraries termed Physics Engines (PEs) have been released that are designed to solve physics based problems. This paper considers the features of a range of PEs and explores whether their techniques and design methodologies are applicable to the design and implementation of a working simulation. The NVIDIA PhysX engine has been utilised in a practical DEM implementation to simulate the evolution of extensional fault systems in rock. Through understanding the general processing pipeline implemented by a PE, obvious similarities with a range of DEM implementations became apparent. Discussed are areas that are compatible, as well as areas within the PE that have proved unsuited to large scale DEM simulation. It is shown that current versions of PEs may not provide access to techniques giving high enough numerical accuracy for certain applications, but the basic premise of an easy to use and highly optimised library, designed to allow researchers to construct complex simulation scenarios, is compelling.
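The shared premise of DEM codes and physics-engine pipelines, detecting contacts and then resolving them with a force law, can be sketched with the linear spring-dashpot contact model. This is a standard DEM choice used here for illustration, not necessarily the model in this work, and the stiffness and damping values are arbitrary.

```python
import math

def contact_force(p1, p2, r1, r2, v1, v2, k=1e4, c=5.0):
    """Linear spring-dashpot normal force on particle 1 from particle 2.

    Particles are 2D: positions p, radii r, velocities v. Returns
    (fx, fy); zero when the spheres do not overlap.
    """
    dx, dy = p1[0] - p2[0], p1[1] - p2[1]
    dist = math.hypot(dx, dy)
    overlap = (r1 + r2) - dist
    if overlap <= 0.0 or dist == 0.0:
        return (0.0, 0.0)
    nx, ny = dx / dist, dy / dist                      # unit normal, 2 -> 1
    rel_vn = (v1[0] - v2[0]) * nx + (v1[1] - v2[1]) * ny  # normal relative speed
    f = k * overlap - c * rel_vn                       # spring repulsion + damping
    return (f * nx, f * ny)

# Two unit spheres overlapping by 0.1, approaching head-on:
fx, fy = contact_force((0.0, 0.0), (1.9, 0.0), 1.0, 1.0,
                       (1.0, 0.0), (-1.0, 0.0))
```

Every PE and DEM code iterates some variant of this per-pair force resolution inside its pipeline; the differences the paper discusses lie in integration accuracy, precision, and how the pair list is built and parallelised.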

    Discrete element modelling using a parallelised physics engine

    Discrete Element Modelling (DEM) is a technique used widely throughout science and engineering. It offers a convenient method with which to numerically simulate a system prone to developing discontinuities within its structure. The technique often gets overlooked, as designing and implementing a model on a scale large enough to be worthwhile can be both time consuming and demanding of specialist programming skills. Currently there are a few notable efforts to produce homogenised software allowing researchers to quickly design and run DEMs with in excess of 1 million elements. However, these applications, while open source, are still complex in nature and require significant input from their original publishers in order to include new features as a researcher needs them. Recently, software libraries known as physics engines have emerged, notably from the computer gaming and graphics industries. These are designed specifically to calculate the physical movement and interaction of a system of independent rigid bodies. They provide conceptual equivalents of real world constructions with which an approximation of a realistic scenario can be quickly built. This paper presents a method to utilise the most notable of these engines, NVIDIA's PhysX, to produce a parallelised geological DEM capable of supporting in excess of a million elements. © The Eurographics Association 2009
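A broad-phase step of the kind such engines parallelise can be sketched with a uniform spatial hash. This is a generic technique, not a description of PhysX's internal implementation: each sphere is hashed into a grid cell one diameter wide, so candidate contacts need only be searched in the same and adjacent cells, and cells can be processed independently across threads.

```python
from collections import defaultdict
from itertools import product

def candidate_pairs(centres, radius):
    """Broad-phase contact candidates via uniform spatial hashing.

    The cell size equals one sphere diameter, so any overlapping pair
    of equal-radius spheres must sit in the same or adjacent cells.
    """
    cell = 2.0 * radius
    grid = defaultdict(list)
    for i, (x, y, z) in enumerate(centres):
        grid[(int(x // cell), int(y // cell), int(z // cell))].append(i)
    pairs = set()
    for (cx, cy, cz), members in grid.items():
        for dx, dy, dz in product((-1, 0, 1), repeat=3):
            for j in grid.get((cx + dx, cy + dy, cz + dz), ()):
                for i in members:
                    if i < j:
                        pairs.add((i, j))
    return pairs

centres = [(0.0, 0.0, 0.0), (1.5, 0.0, 0.0), (50.0, 0.0, 0.0)]
pairs = candidate_pairs(centres, radius=1.0)  # only the near pair survives
```

Reducing the pair search from O(n²) to near O(n) is what makes million-element models tractable, and the per-cell independence is the hook that engines like PhysX exploit for parallel execution.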

    Microbial Protein for Human Consumption: Towards Sustainable Protein Production

    © 2025 The Author(s). Nutrition Bulletin published by John Wiley & Sons Ltd on behalf of the British Nutrition Foundation. Protein from animal sources significantly contributes to greenhouse gas emissions, driving the need for sustainable alternative protein sources to meet global dietary demands while reducing environmental impact. This project explores microbial protein, derived through cellular agriculture using fermentation technology, as a viable, sustainable and high-quality protein for human consumption. This report describes a multidisciplinary approach to assessing the feasibility of incorporating microbial protein into human food systems, guided by four key objectives. First, a market analysis will identify opportunities and challenges for incorporating microbial protein into existing food products, assessing its potential to improve the protein quality of plant-based foods. Second, the project will evaluate the protein quality and digestibility of reformulated products using advanced models simulating human gastrointestinal processes. Third, consumer perceptions of, and barriers to, adopting bacterial-based proteins will be investigated, addressing safety, health and sustainability concerns. Finally, the overall findings will inform the development of a technical document outlining actionable recommendations for commercialising microbial proteins as food ingredients. This multidisciplinary project aims to support the sustainable diversification of dietary protein sources, contributing to global efforts towards achieving sustainable food systems. The project is funded by the Start Healthy, Stay Healthy (STAR) Hub, a Diet and Health Open Innovation Research Club (OIRC) funded by the UK Research and Innovation (UKRI) Biotechnology and Biological Sciences Research Council (BBSRC).

    The public health challenge of obesity: is it the new smoking?

    This article considers the complexity of obesity and its health implications, highlighting what these mean for community practitioners.

    Coupling Molecular Dynamics and Direct Simulation Monte Carlo using a general and high-performance code coupling library

    A domain-decomposed method to simultaneously couple the classical Molecular Dynamics (MD) and Direct Simulation Monte Carlo (DSMC) methods is proposed. This approach utilises an MPI-based general coupling library, the Multiscale Universal Interface. The method provides a direct coupling strategy and utilises two OpenFOAM based solvers, mdFoam+ and dsmcFoam+, enabling scenarios where both solvers assume one discrete particle is equal to one molecule or atom. The ultimate goal of this work is to enable complex multi-scale simulations involving micro, meso and macroscopic elements, as found in problems like evaporation. Results are presented to show the fundamental capabilities of the method in terms of mass and kinetic energy conservation between simulation regions handled by the different solvers. We demonstrate the capability of the method by deploying it onto a large supercomputing resource, with attention paid to scalability for a canonical NVT ensemble (a constant number of atoms N, constant volume V and constant temperature T) of Argon atoms. The results show that the method performs as expected in terms of mass conservation, and the solution is also shown to scale reasonably on a supercomputing resource, within the known performance limits of the coupled codes. The wider future of this work is also considered, with focus placed on the next steps to expand the capabilities of the methodology to allow for indirect coupling (where the coarse-graining capability of the DSMC method is used), as well as how this will then fit into a larger coupled framework allowing a complete micro-meso-macro approach to be tackled.
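The mass and kinetic-energy bookkeeping described above can be sketched in miniature. This is an illustrative check, not the paper's mdFoam+/dsmcFoam+ implementation: under one-to-one (direct) coupling, a particle migrating across the domain interface keeps its mass and velocity, so totals summed over both regions are unchanged.

```python
def migrate(md, dsmc, boundary_x):
    """Reassign particles across a planar interface at x = boundary_x.

    Each particle is (x, vx, mass). Direct coupling: a particle keeps
    its mass and velocity when it changes region.
    """
    everything = md + dsmc
    new_md = [p for p in everything if p[0] < boundary_x]
    new_dsmc = [p for p in everything if p[0] >= boundary_x]
    return new_md, new_dsmc

def totals(*regions):
    """Total mass and kinetic energy summed over all regions."""
    mass = sum(m for reg in regions for _, _, m in reg)
    ke = sum(0.5 * m * vx * vx for reg in regions for _, vx, m in reg)
    return mass, ke

md = [(0.4, 1.0, 1.0), (0.6, -2.0, 1.0)]   # second particle has crossed
dsmc = [(0.7, 0.5, 1.0)]
before = totals(md, dsmc)
md, dsmc = migrate(md, dsmc, boundary_x=0.5)
after = totals(md, dsmc)                   # conserved across the exchange
```

Indirect coupling, where DSMC coarse-grains many molecules into one simulated particle, breaks this one-to-one assumption, which is why the abstract flags it as the next extension requiring further work.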

    Is the GehD lipase from Staphylococcus epidermidis a collagen binding adhesin?

    The opportunistic human pathogen Staphylococcus epidermidis is the major cause of nosocomial biomaterial infections. S. epidermidis has the ability to attach to indwelling materials coated with extracellular matrix proteins such as fibrinogen, fibronectin, vitronectin, and collagen. To identify the proteins necessary for S. epidermidis attachment to collagen, we screened an expression library using digoxigenin-labeled collagen as well as two monoclonal antibodies generated against the Staphylococcus aureus collagen adhesin, Cna, as probes. These monoclonal antibodies recognize collagen binding epitopes on the surface of S. aureus and S. epidermidis cells. Using this approach, we identified GehD, the extracellular lipase originally found in S. epidermidis 9, as a collagen-binding protein. Despite the monoclonal antibody cross-reactivity, the GehD amino acid sequence and predicted structure are radically different from those of Cna. The circular dichroism spectrum of mature GehD differs from that of Cna but strongly resembles that of a mammalian cell-surface collagen binding receptor, the alpha(1) integrin I domain, suggesting that they have similar secondary structures. The GehD protein is translated as a preproenzyme, secreted, and post-translationally processed into the mature lipase. GehD does not have the conserved LPXTG C-terminal motif present in cell wall-anchored proteins, but it can be detected in lysostaphin cell wall extracts. A recombinant version of mature GehD binds to collagen types I, II, and IV adsorbed onto microtiter plates in a dose-dependent, saturable manner. Recombinant mature GehD protein and anti-GehD antibodies can inhibit the attachment of S. epidermidis to immobilized collagen. These results provide evidence that GehD may be a bifunctional molecule, acting not only as a lipase but also as a cell surface-associated collagen adhesin.