
    Pile-up solutions for some systems of conservation laws modelling dislocation interaction in crystals

    Some continuum models for dislocation interactions in a simple crystal geometry are studied. The simplest models are mixed systems of conservation laws, which are shown to exhibit singularities and instabilities. These are then regularized, leading to parabolic free-boundary problems. In both cases, solutions describing the formation of structures such as dislocation pile-ups are discussed.
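    A minimal sketch of the kind of structure described, not the specific dislocation model of the paper (the unknowns u and v, the fluxes f and g, and the small parameter \varepsilon are assumed for illustration): a mixed 2x2 system of conservation laws

        \partial_t u + \partial_x f(u,v) = 0, \qquad \partial_t v + \partial_x g(u,v) = 0,

    and a parabolic regularization of it, obtained by adding small diffusive terms,

        \partial_t u + \partial_x f(u,v) = \varepsilon\,\partial_{xx} u, \qquad \partial_t v + \partial_x g(u,v) = \varepsilon\,\partial_{xx} v, \qquad 0 < \varepsilon \ll 1.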

    The role of ontologies in creating and maintaining corporate knowledge: a case study from the aero industry

    The Designers' Workbench is a system developed to support designers in large organizations, such as Rolls-Royce, by making sure that the design is consistent with the specification for the particular design as well as with the company's design rule book(s). The evolving design is described against a jet engine ontology. Currently, to capture the constraint information, a domain expert (design engineer) has to work with a knowledge engineer to identify the constraints, and it is then the task of the knowledge engineer to encode these into the Workbench's knowledge base (KB). This is an error-prone and time-consuming task. It is highly desirable to relieve the knowledge engineer of this task, and so we have developed a tool, ConEditor+, that enables domain experts themselves to capture and maintain these constraints. The tool allows the user to combine selected entities from the domain ontology with keywords and operators of a constraint language to form a constraint expression. Further, we hypothesize that to apply constraints appropriately, it is necessary to understand the context in which each constraint is applicable. We refer to this as "application conditions". We show that an explicit representation of application conditions, in a machine-interpretable format, along with the constraints and the domain ontology, can be used to support the verification and maintenance of constraints.
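    As a rough illustration of the idea (the entity names, constraint syntax and evaluation below are invented for this sketch, not ConEditor+'s actual interface), a constraint built from domain-ontology terms and operators can be paired with an application condition stating when it applies:

        from dataclasses import dataclass

        @dataclass
        class Constraint:
            expression: str             # design rule over domain-ontology terms and operators
            application_condition: str  # context in which the rule is applicable

        # Hypothetical rule: the names are illustrative, not taken from the real jet engine ontology.
        rule = Constraint(
            expression="fan_blade_tip_clearance >= casing_min_clearance",
            application_condition="engine_family == 'LargeCivil'",
        )

        def holds(expr: str, context: dict) -> bool:
            # A real system would evaluate a constraint language over the ontology;
            # eval() over a flat context dictionary stands in for that here.
            return bool(eval(expr, {}, context))

        design = {"engine_family": "LargeCivil",
                  "fan_blade_tip_clearance": 1.4,
                  "casing_min_clearance": 1.2}

        if holds(rule.application_condition, design):   # the constraint applies in this context
            print("constraint satisfied:", holds(rule.expression, design))

    Checking the application condition first is what keeps a rule from being applied outside its intended context.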

    ConEditor+: Capture and Maintenance of Constraints in Engineering Design

    The Designers' Workbench is a system developed to support designers in large organizations, such as Rolls-Royce, by making sure that the design is consistent with the specification for the particular design as well as with the company's design rule book(s). Currently, to capture the constraint information, a domain expert (design engineer) has to work with a knowledge engineer to identify the constraints, and it is then the task of the knowledge engineer to encode these into the Workbench's knowledge base (KB). This is an error-prone and time-consuming task. It is highly desirable to relieve the knowledge engineer of this task, and so we have developed a tool, ConEditor+, that enables domain experts themselves to capture and maintain these constraints. The tool allows the user to combine selected entities from the domain ontology with keywords and operators of a constraint language to form a constraint expression. Further, we hypothesize that to apply constraints appropriately, it is necessary to understand the context in which each constraint is applicable. We refer to this as "application conditions". We show that an explicit representation of application conditions, in a machine-interpretable format, along with the constraints and the domain ontology, can be used to support the verification and maintenance of constraints.

    Inventive interpretations


    Strokes for Representing Univariate Vector Field Maps

    Particle systems make an excellent tool for creating tracks (which we call strokes) in vector fields. The question addressed in this paper is how such tracks should be made to vary in size and colour in order to reveal properties such as local direction and strength of the field. We find that for strokes that vary from large to small, direction is indicated by the large end. We also find that for strokes that vary in colour, the colour of the background is the most important determinant of perceived direction.
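    A minimal sketch of the underlying mechanism (the field, step size and taper values are invented for illustration): advect a particle through a 2-D vector field to build one stroke, then taper its width from one end to the other so that the large end can carry the direction cue the paper investigates.

        import numpy as np

        def field(p):
            """Toy vector field (a simple vortex); stands in for real map data."""
            x, y = p
            return np.array([-y, x])

        def trace_stroke(seed, n_steps=50, h=0.05):
            """Advect a particle through the field to build one stroke (a polyline)."""
            pts = [np.asarray(seed, dtype=float)]
            for _ in range(n_steps):
                v = field(pts[-1])
                speed = np.linalg.norm(v)
                if speed < 1e-9:
                    break
                pts.append(pts[-1] + h * v / speed)   # unit-speed Euler step
            return np.array(pts)

        def stroke_widths(pts, w_max=4.0, w_min=0.5):
            """Taper the width linearly along the stroke, from large to small."""
            t = np.linspace(0.0, 1.0, len(pts))
            return w_max + t * (w_min - w_max)

        stroke = trace_stroke(seed=(1.0, 0.0))
        print(stroke.shape, stroke_widths(stroke)[:3])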

    Greenhouse gas emissions, inventories and validation

    The emission of greenhouse gases has become a high-priority research and environmental policy issue due to their effects on global climate. Changes in global atmospheric concentrations of greenhouse gases since the industrial revolution are well documented, and the global budgets are reasonably well known. However, even at this scale there are important uncertainties in the budgets; for example, in the case of methane, while the main sources and sinks have been identified, temporal changes in the global average concentration since the early 1990s are not understood. In the absence of a quantitative explanation with appropriate experimental support, it is clear that current knowledge of the causes of changes in the global methane budget is inadequate to predict the effect of changes in specific emission sectors. In developing control strategies to reduce emissions, it is necessary to validate national emissions and their spatial disaggregation. The methodology to underpin such a process is at an early stage of development and is not fully implemented in any country, even though target emission reductions have already been announced. Furthermore, the scale of the required reductions is large (e.g. 60% reductions by 2050 relative to a 1990 baseline). There is therefore an urgent requirement for measurement-based verification processes to support such challenging emission reductions. In this paper we provide the background on greenhouse gas emissions globally and in the UK, followed by examples of approaches to validating emissions at the UK scale and within its regions.

    Constraint capture and maintenance in engineering design

    The Designers' Workbench is a system developed by the Advanced Knowledge Technologies (AKT) consortium to support designers in large organizations, such as Rolls-Royce, in ensuring that the design is consistent with the specification for the particular design as well as with the company's design rule book(s). In the principal application discussed here, the evolving design is described against a jet engine ontology. Design rules are expressed as constraints over the domain ontology. Currently, to capture the constraint information, a domain expert (design engineer) has to work with a knowledge engineer to identify the constraints, and it is then the task of the knowledge engineer to encode these into the Workbench's knowledge base (KB). This is an error-prone and time-consuming task. It is highly desirable to relieve the knowledge engineer of this task, and so we have developed a system, ConEditor+, that enables domain experts themselves to capture and maintain these constraints. Further, we hypothesize that in order to apply, maintain and reuse constraints appropriately, it is necessary to understand the underlying assumptions and context in which each constraint is applicable. We refer to these as "application conditions", and they form part of the rationale associated with the constraint. We propose a methodology to capture the application conditions associated with a constraint and demonstrate that an explicit, machine-interpretable representation of application conditions (rationales), together with the corresponding constraints and the domain ontology, can be used by a machine to support the maintenance of constraints. Support for the maintenance of constraints includes detecting inconsistency, subsumption, redundancy and fusion between constraints, and suggesting appropriate refinements. The proposed methodology provides immediate benefits to the designers and hence should encourage them to input the application conditions (rationales).
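    A rough sketch of the kind of maintenance check this enables (the representation and the redundancy rule are invented for illustration, not ConEditor+'s actual algorithm): with application conditions recorded explicitly, a new constraint can be flagged as subsumed when an existing one has the same expression and already applies in every context the new one covers.

        from dataclasses import dataclass

        @dataclass(frozen=True)
        class Constraint:
            expression: str        # design rule over domain-ontology entities
            applies_to: frozenset  # contexts (application conditions) in which it holds

        def subsumed(candidate: Constraint, existing: Constraint) -> bool:
            """candidate adds nothing if existing has the same expression and
            covers at least the same application contexts."""
            return (candidate.expression == existing.expression
                    and candidate.applies_to <= existing.applies_to)

        kb = [Constraint("fan_blade_tip_clearance >= casing_min_clearance",
                         frozenset({"LargeCivil", "Regional"}))]
        new_rule = Constraint("fan_blade_tip_clearance >= casing_min_clearance",
                              frozenset({"LargeCivil"}))

        print(any(subsumed(new_rule, c) for c in kb))   # True: flag for the designer to review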

    Editorial


    Quantum computing with nearest neighbor interactions and error rates over 1%

    Large-scale quantum computation will only be achieved if experimentally implementable quantum error correction procedures are devised that can tolerate experimentally achievable error rates. We describe a quantum error correction procedure that requires only a 2-D square lattice of qubits that can interact with their nearest neighbors, yet can tolerate quantum gate error rates over 1%. The precise maximum tolerable error rate depends on the error model, and we calculate values in the range 1.1% to 1.4% for various physically reasonable models. Even the lowest value represents the highest threshold error rate calculated to date in a geometrically constrained setting, and a 50% improvement over the previous record.
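    To make the notion of a threshold error rate concrete, here is a toy Monte Carlo that is not the topological, nearest-neighbor scheme of the paper: a distance-d repetition code under independent bit-flip errors, decoded by majority vote. Below this toy code's (much higher) threshold, the logical error rate falls as the code is made larger; above it, growing the code does not help.

        import random

        def logical_error_rate(p, distance, trials=20000):
            """Estimate the failure rate of majority-vote decoding when each of the
            'distance' qubits suffers an independent bit flip with probability p."""
            failures = 0
            for _ in range(trials):
                flips = sum(random.random() < p for _ in range(distance))
                if flips > distance // 2:   # majority flipped: the decoder picks the wrong value
                    failures += 1
            return failures / trials

        for p in (0.05, 0.2):   # both well below the toy code's 50% threshold
            print(p, [round(logical_error_rate(p, d), 4) for d in (3, 7, 11)])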