970 research outputs found

    Influence of combined impact and cyclic loading on the overall fatigue life of forged steel, EA4T

    The performance of forged steel EA4T, used in the rail industry, under simulated in-service conditions, i.e. combined impact and cyclic loading, was investigated through a comprehensive experimental programme. The standard Paris-Erdogan fatigue design curve parameters, m and C, were calibrated to account for the effect of the impact component of loading. A minimum threshold for the impact load component, identified in the experiments, was also incorporated into the proposed empirical model. Comparison with experimental findings indicated that this "modified" fatigue design curve can predict the fatigue life of pre-impact-loaded specimens with sufficient accuracy. It was therefore suggested that the modified model may be used as a novel design tool for predicting the overall fatigue life of components made of this material under the specified combined impact and fatigue loading conditions.
    Publisher Statement: The final publication is available at Springer via http://dx.doi.org/10.1007/s12206-016-0923-
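The Paris-Erdogan law mentioned above, da/dN = C (ΔK)^m, can be integrated numerically to estimate a fatigue life. A minimal sketch with purely illustrative parameter values, not the calibrated EA4T constants from the paper:

```python
import math

def paris_life(a0, af, C, m, delta_sigma, Y=1.0, n_steps=100_000):
    """Numerically integrate the Paris-Erdogan law
        da/dN = C * (dK)**m,   dK = Y * delta_sigma * sqrt(pi * a),
    from initial crack size a0 to final size af, returning the
    number of load cycles N."""
    da = (af - a0) / n_steps
    a, N = a0, 0.0
    for _ in range(n_steps):
        dK = Y * delta_sigma * math.sqrt(math.pi * a)
        N += da / (C * dK ** m)   # dN = da / (C * dK^m)
        a += da
    return N

# Purely illustrative values (crack sizes in m, stress range in MPa,
# C in matching units) -- NOT the calibrated EA4T constants:
N = paris_life(a0=1e-3, af=10e-3, C=1e-11, m=3.0, delta_sigma=100.0)
```

Calibrating C and m against impact-affected data, as the paper does, shifts the predicted life without changing this integration scheme.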

    Running Spectral Index and Formation of Primordial Black Hole in Single Field Inflation Models

    A broad range of single-field models of inflation are analyzed in light of all relevant recent cosmological data, checking whether they can lead to the formation of long-lived Primordial Black Holes (PBHs). To that end we calculate the spectral index of the power spectrum of primordial perturbations as well as its first and second derivatives. PBH formation is possible only if the spectral index increases significantly at small scales, i.e. at large wave number k. Since current data indicate that the first derivative α_S of the spectral index n_S(k_0) is negative at the pivot scale k_0, PBH formation is only possible in the presence of a sizable and positive second derivative ("running of the running") β_S. Among the three small-field and five large-field models we analyze, only one small-field model, the "running mass" model, allows PBH formation, for a narrow range of parameters. We also note that none of the models we analyze can account for a large and negative value of α_S, which is weakly preferred by current data.
    Comment: 26 pages, 5 figures, Refs. added, minor textual change; version to appear in JCA
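The running and running-of-the-running enter through the standard expansion n_S(k) = n_S(k_0) + α_S ln(k/k_0) + (β_S/2) ln²(k/k_0). A small sketch with illustrative values (not the paper's fits) shows how a positive β_S can overcome a negative α_S at large k:

```python
import math

def n_s(k, k0, ns0, alpha_s, beta_s):
    """Spectral index with running alpha_s and running-of-the-running
    beta_s, expanded about the pivot scale k0:
        n_s(k) = n_s(k0) + alpha_s*L + (beta_s/2)*L**2,  L = ln(k/k0)."""
    L = math.log(k / k0)
    return ns0 + alpha_s * L + 0.5 * beta_s * L * L

# Illustrative numbers only: a negative alpha_s lowers n_s at first,
# but a positive beta_s dominates through the L**2 term at the very
# large k relevant for PBH formation.
print(n_s(k=1e15, k0=0.05, ns0=0.96, alpha_s=-0.01, beta_s=0.002))
```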

    A systematic approach to the Planck LFI end-to-end test and its application to the DPC Level 1 pipeline

    The Level 1 of the Planck LFI Data Processing Centre (DPC) is devoted to the handling of the scientific and housekeeping telemetry. It is a critical component of the Planck ground segment, which must adhere strictly to the project schedule to be ready for the launch and flight operations. In order to guarantee the quality necessary to achieve the objectives of the Planck mission, the design and development of the Level 1 software has followed the ESA Software Engineering Standards. A fundamental step in the software life cycle is the verification and validation of the software. The purpose of this work is to show an example of procedures, test development and analysis successfully applied to a key software project of an ESA mission. We present the end-to-end validation tests performed on the Level 1 of the LFI-DPC, detailing the methods used and the results obtained. Different approaches have been used to test the scientific and housekeeping data processing. Scientific data processing has been tested by injecting signals with known properties directly into the acquisition electronics, in order to generate a test dataset of real telemetry data and to reproduce nominal conditions as closely as possible. For the housekeeping telemetry processing, validation software has been developed to inject known parameter values into a set of real housekeeping packets and to compare them with the corresponding timelines generated by the Level 1. With the proposed validation and verification procedure, in which the on-board and ground processing are viewed as a single pipeline, we demonstrated that the scientific and housekeeping processing of the Planck-LFI raw data is correct and meets the project requirements.
    Comment: 20 pages, 7 figures; this paper is part of the Prelaunch status LFI papers published on JINST: http://www.iop.org/EJ/journal/-page=extra.proc5/jins
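The housekeeping validation idea, injecting known parameter values and comparing them against the reconstructed timeline, can be sketched as a round-trip check. The packet layout and function names below are hypothetical stand-ins, not the actual Level 1 interfaces:

```python
import struct

def decode_hk_packet(raw):
    """Decode one hypothetical housekeeping packet: a 2-byte
    parameter id followed by a big-endian 32-bit float value."""
    pid, value = struct.unpack(">Hf", raw)
    return pid, value

def validate(injected, timeline, tol=1e-6):
    """Return the injected parameters whose reconstructed timeline
    value is missing or differs from the injected one beyond tol."""
    return [(pid, exp, timeline.get(pid))
            for pid, exp in injected.items()
            if timeline.get(pid) is None or abs(timeline[pid] - exp) > tol]

# Round trip: inject known values, encode, decode, compare.
injected = {1: 21.5, 2: -3.25}           # values exactly representable in float32
packets = [struct.pack(">Hf", pid, v) for pid, v in injected.items()]
timeline = dict(decode_hk_packet(p) for p in packets)
assert validate(injected, timeline) == []
```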

    Graph-based simulated annealing: a hybrid approach to stochastic modeling of complex microstructures

    A stochastic model is proposed for the efficient simulation of complex three-dimensional microstructures consisting of two different phases. The model is based on a hybrid approach, in which a graph model is first developed using ideas from stochastic geometry. Subsequently, the microstructure model is built by applying simulated annealing to the graph model. As an example of application, the model is fitted to a tomographic image describing the microstructure of electrodes in Li-ion batteries. The goodness of fit is validated by comparing morphological characteristics of experimental and simulated data.
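The second step, simulated annealing, can be illustrated generically; the energy function and move below are toy stand-ins, not the authors' microstructure model:

```python
import math, random

def anneal(state, energy, propose, t0=1.0, cooling=0.995,
           steps=5000, seed=0):
    """Generic simulated annealing: always accept a move that lowers
    the energy; accept an uphill move with probability exp(-dE / T),
    with the temperature T decaying geometrically."""
    rng = random.Random(seed)
    e = energy(state)
    best, best_e = state, e
    t = t0
    for _ in range(steps):
        cand = propose(state, rng)
        ce = energy(cand)
        if ce < e or rng.random() < math.exp(-(ce - e) / t):
            state, e = cand, ce
            if e < best_e:
                best, best_e = state, e
        t *= cooling
    return best, best_e

# Toy stand-in for a microstructure energy: minimize (x - 3)^2
# over the integers with +/-1 moves.
best, best_e = anneal(0, lambda x: (x - 3) ** 2,
                      lambda x, rng: x + rng.choice([-1, 1]))
```

In the paper's setting the state would be the graph-based microstructure and the energy a discrepancy against target morphological characteristics.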

    Unified Treatment of Asymptotic van der Waals Forces

    In a framework for long-range density-functional theory we present a unified full-field treatment of the asymptotic van der Waals interaction for atoms, molecules, surfaces, and other objects. The only input needed consists of the electron densities of the interacting fragments and the static polarizability or the static image plane, which can be easily evaluated in a ground-state density-functional calculation for each fragment. Results for separated atoms, molecules, and for atoms/molecules outside surfaces are in agreement with those of other, more elaborate, calculations.
    Comment: 6 pages, 5 figures
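For the standard single-oscillator (London) model of the polarizability, the Casimir-Polder integral C6 = (3/π) ∫₀^∞ α_A(iω) α_B(iω) dω has the closed form (3/2) α_A α_B ω_A ω_B / (ω_A + ω_B), which makes a useful numerical cross-check. A sketch under that model assumption (illustrative units, not the paper's full-field scheme):

```python
import math

def c6_casimir_polder(a1, w1, a2, w2, wmax=1e3, n=200_000):
    """C6 = (3/pi) * integral_0^inf alpha1(iw) * alpha2(iw) dw, with
    single-oscillator polarizabilities alpha(iw) = alpha0 / (1 + (w/w0)**2),
    evaluated by the midpoint rule on [0, wmax]."""
    dw = wmax / n
    s = 0.0
    for i in range(n):
        w = (i + 0.5) * dw
        s += (a1 / (1 + (w / w1) ** 2)) * (a2 / (1 + (w / w2) ** 2)) * dw
    return 3.0 / math.pi * s

def c6_london(a1, w1, a2, w2):
    """Closed-form (London) result for the same oscillator model."""
    return 1.5 * a1 * a2 * w1 * w2 / (w1 + w2)
```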

    Thermo-Mechanical Treatment Effects on Stress Relaxation and Hydrogen Embrittlement of Cold-Drawn Eutectoid Steels

    The effects of the temperature and stretching levels used in the stress-relieving treatment of cold-drawn eutectoid steel wires are evaluated with the aim of improving the stress relaxation behavior and the resistance to hydrogen embrittlement. Five industrial treatments are studied, combining three temperatures (330, 400, and 460 °C) and three stretching levels (38, 50, and 64% of the rupture load). The change in the residual stress produced by the treatments is taken into consideration to account for the results. Surface residual stresses allow us to explain the time to failure in standard hydrogen embrittlement tests.

    Capacity Analysis of MIMO-WLAN Systems with Single Co-Channel Interference

    In this paper, the channel capacity of multiple-input multiple-output wireless local area network (MIMO-WLAN) systems with single co-channel interference (CCI) is calculated. A ray-tracing approach is used to calculate the channel frequency response, which is then used to calculate the corresponding channel capacity. The abilities of a simple uniform linear array (ULA) and a polarization diversity array (PDA) to combat CCI in MIMO-WLAN systems are investigated. The effects caused by the two antenna arrays for the desired system and the CCI are also quantified. Numerical results show that MIMO-PDA performs better than MIMO-ULA when interference is present.
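The capacity of a MIMO link with a single co-channel interferer is commonly written as C = log2 det(I + (P/Nt) H Hᴴ R⁻¹), with R the interference-plus-noise covariance. A sketch using random Rayleigh-like channels in place of the paper's ray-traced responses:

```python
import numpy as np

def capacity_with_cci(H, Hi, snr, inr):
    """Capacity (bits/s/Hz) of a MIMO link with one co-channel
    interferer: C = log2 det(I + (snr/nt) H H^H R^-1), where
    R = I + (inr/nti) Hi Hi^H is the interference-plus-noise
    covariance (noise power normalized to one)."""
    nr, nt = H.shape
    nti = Hi.shape[1]
    R = np.eye(nr) + (inr / nti) * (Hi @ Hi.conj().T)
    A = np.eye(nr) + (snr / nt) * (H @ H.conj().T) @ np.linalg.inv(R)
    _, logdet = np.linalg.slogdet(A)
    return logdet / np.log(2)

# Random complex Gaussian channels stand in for the ray-traced ones.
rng = np.random.default_rng(0)
H = (rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))) / np.sqrt(2)
Hi = (rng.standard_normal((4, 2)) + 1j * rng.standard_normal((4, 2))) / np.sqrt(2)
c_clean = capacity_with_cci(H, Hi, snr=100.0, inr=0.0)   # CCI off
c_cci = capacity_with_cci(H, Hi, snr=100.0, inr=10.0)    # CCI on
```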

    Nearfield Summary and Statistical Analysis of the Second AIAA Sonic Boom Prediction Workshop

    A summary is provided for the Second AIAA Sonic Boom Workshop, held 8-9 January 2017 in conjunction with AIAA SciTech 2017. The workshop used three required models of increasing complexity: an axisymmetric body, a wing body, and a complete configuration with a flow-through nacelle. An optional complete configuration with propulsion boundary conditions was also provided. These models are designed with similar nearfield signatures to isolate geometry and shock/expansion interaction effects. Eleven international participant groups submitted nearfield signatures with forces, pitching moment, and iterative convergence norms. Statistics and grid convergence of these nearfield signatures are presented. The submissions are propagated to the ground, and noise levels are computed, which allows the grid convergence and the statistical distribution of the noise levels to be assessed. While progress since the first workshop is documented, improvements to the analysis methods for a possible subsequent workshop are suggested. The complete configuration with flow-through nacelle showed the most dramatic improvement between the two workshops. The current workshop cases are more relevant to vehicles with lower loudness, with the potential for lower annoyance, than the first workshop cases. The quieter ground noise levels of this workshop's models exposed weaknesses in analysis, particularly in convective discretization.

    A clustering method for graphical handwriting components and statistical writership analysis

    Handwritten documents can be characterized by their content or by the shape of the written characters. We focus on the problem of comparing a person's handwriting to a document of unknown provenance using the shape of the writing, as is done in forensic applications. To do so, we first propose a method for processing scanned handwritten documents to decompose the writing into small graphical structures, often corresponding to letters. We then introduce a measure of distance between two such structures that is inspired by the graph edit distance, and a measure of center for a collection of the graphs. These measurements are the basis for an outlier-tolerant K-means algorithm to cluster the graphs based on structural attributes, thus creating a template for sorting new documents. Finally, we present a Bayesian hierarchical model to capture the propensity of a writer for producing graphs that are assigned to certain clusters. We illustrate the methods using documents from the Computer Vision Lab dataset. We show results of the identification task under the cluster assignments and compare to the same modeling, but with a less flexible grouping method that is not tolerant of incidental strokes or outliers.
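The outlier-tolerant clustering idea can be sketched as a trimmed K-means. This one-dimensional toy uses plain absolute difference as the distance and the arithmetic mean as the center, not the graph-edit-distance-inspired measures of the paper:

```python
def trimmed_kmeans(points, k, trim=0.1, iters=25):
    """1-D k-means that, on every iteration, discards the `trim`
    fraction of points farthest from their nearest center before
    updating the centers, so an isolated outlier (an incidental
    stroke, say) cannot drag a cluster mean."""
    centers = list(points[:k])          # simple deterministic init
    for _ in range(iters):
        # (distance to nearest center, center index, point), sorted
        assign = sorted(
            min((abs(p - c), i) for i, c in enumerate(centers)) + (p,)
            for p in points)
        kept = assign[: max(k, int(len(points) * (1 - trim)))]
        for i in range(k):
            members = [p for _, j, p in kept if j == i]
            if members:
                centers[i] = sum(members) / len(members)
    return sorted(centers)

# Two tight groups plus one far outlier; the outlier is trimmed
# instead of pulling the second center toward it.
data = [1.0, 1.1, 0.9, 5.0, 5.1, 4.9, 100.0]
centers = trimmed_kmeans(data, k=2)
```

Replacing `abs(p - c)` with a graph distance and the mean with a graph center recovers the structure of the method described above.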