
    The Methods to Improve Quality of Service by Accounting Secure Parameters

    A solution to the problem of ensuring quality of service is proposed, providing a greater number of services with higher efficiency while taking network security into account. Experiments were conducted to analyze the effect of self-similarity and attacks on quality-of-service parameters. Two methods are proposed: a method of buffering and channel-capacity control, and a method of calculating routing cost in the network; both take into account the multifractal parameters of the traffic and the probability of detecting attacks in telecommunication networks. Both methods respect the given restrictions on delay time and on the number of lost packets for every quality-of-service traffic class. During simulation, the parameters of the transmitted traffic (self-similarity, intensity) and of the network (current channel load, node buffer size) were varied, and the maximum allowable network load was determined. The analysis shows that the overload observed when transmitting traffic over a switched channel is associated with multifractal traffic characteristics and the presence of attacks. The proposed methods were shown to reduce data loss and improve the efficiency of network resource use. Comment: 10 pages, 1 figure, 1 equation, 1 table. arXiv admin note: text overlap with arXiv:1904.0520
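
    A minimal sketch in Python of what a security-aware routing cost of this kind could look like, under stated assumptions: the function name, the weights, and the use of a Hurst exponent as the self-similarity indicator are illustrative choices, not the formulation from the paper.

        # Hypothetical per-link routing cost that penalizes links whose traffic is
        # strongly self-similar (Hurst exponent above 0.5) or likely to carry an
        # attack, while enforcing per-class delay and loss budgets.
        def link_cost(base_cost, hurst, attack_prob, delay, loss,
                      max_delay, max_loss, alpha=1.0, beta=2.0):
            """Return an adjusted cost, or None if the link violates the QoS limits."""
            if delay > max_delay or loss > max_loss:
                return None  # link unusable for this traffic class
            burstiness_penalty = alpha * max(0.0, hurst - 0.5)
            attack_penalty = beta * attack_prob
            return base_cost * (1.0 + burstiness_penalty + attack_penalty)

        # Example: pick the cheapest admissible link for a delay-sensitive class.
        links = [
            {"name": "A", "base": 10, "hurst": 0.55, "p_att": 0.01, "delay": 12, "loss": 0.001},
            {"name": "B", "base": 8,  "hurst": 0.85, "p_att": 0.20, "delay": 15, "loss": 0.004},
        ]
        costs = {l["name"]: link_cost(l["base"], l["hurst"], l["p_att"], l["delay"],
                                      l["loss"], max_delay=20, max_loss=0.005)
                 for l in links}
        best = min((n for n, c in costs.items() if c is not None), key=costs.get)
        print(costs, "->", best)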

    A SEARCHING ALGORITHM FOR TEXT WITH MISTAKES

    The paper presents a new text-searching method, a modification of the Boyer-Moore algorithm, that enables a user to find the places in a text where a given substring occurs possibly with errors, i.e. the string in the text and the query may not coincide exactly yet still be considered identical. The idea is to divide the search into two phases: in the first phase a fuzzy variant of the Boyer-Moore algorithm is applied; in the second phase the Dice metric is used. The advantage of the suggested technique over known methods that use a fixed value for the number of mistakes is that it 1) does not precompute an auxiliary table comparable in size to the original text and 2) more flexibly captures the semantics of erroneous text substrings even for a large number of mistakes. This extends the possibilities of the Boyer-Moore method by admitting a larger number of possible mistakes in the text while preserving its semantics. The suggested method also provides more accurate control of the upper bound on the number of text mistakes, which distinguishes it from known methods whose fixed maximum number of mistakes does not depend on the text size. Moreover, in those methods the upper bound is defined as a Levenshtein distance, which is not suitable for evaluating the relevance of the found text to a query, whereas the Dice metric provides such a relevance measure: if the maximum Levenshtein distance is 3, how can one judge whether this value is large or small enough to ensure relevance of the search results? Consequently, the suggested method is more flexible and finds relevant answers even when the text contains a large number of mistakes. The worst-case efficiency of the suggested method is O(nc), with the constant c defining the largest allowable number of mistakes.
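
    A minimal Python sketch of the two-phase idea, under stated assumptions: the first phase below is a simple character-overlap filter standing in for the paper's fuzzy Boyer-Moore variant (which is not reproduced here), and the second phase ranks candidate windows by the Dice coefficient over character bigrams; all names and thresholds are illustrative.

        # Phase 1: cheaply filter candidate windows; Phase 2: score them with Dice.
        def bigrams(s):
            return {s[i:i + 2] for i in range(len(s) - 1)}

        def dice(a, b):
            ba, bb = bigrams(a), bigrams(b)
            if not ba or not bb:
                return 0.0
            return 2 * len(ba & bb) / (len(ba) + len(bb))

        def fuzzy_find(text, query, threshold=0.6):
            m = len(query)
            qchars = set(query)
            hits = []
            for i in range(len(text) - m + 1):
                window = text[i:i + m]
                # Phase 1: skip windows sharing too few characters with the query.
                if len(set(window) & qchars) < m // 2:
                    continue
                # Phase 2: the Dice similarity decides relevance.
                score = dice(window, query)
                if score >= threshold:
                    hits.append((i, window, round(score, 2)))
            return hits

        # Finds "brown fox" despite the transposition error in the query.
        print(fuzzy_find("the quick brown fox jumps over the lazy dog", "brwon fox"))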

    Quantum interferometry with three-dimensional geometry

    Quantum interferometry uses quantum resources to improve phase estimation with respect to classical methods. Here we propose and theoretically investigate a new quantum interferometric scheme based on three-dimensional waveguide devices. These can be implemented by femtosecond laser waveguide writing, recently adopted for quantum applications. In particular, the multiarm interferometers include a "tritter" and a "quarter" as basic elements, corresponding to the generalization of a beam splitter to a 3-port and a 4-port splitter, respectively. By injecting Fock states into the input ports of such interferometers, fringe patterns characterized by nonclassical visibilities are expected. This makes it possible to outperform, in phase estimation, the quantum Fisher information obtained with classical fields. We also discuss the possibility of simultaneously estimating more than one optical phase. This approach is expected to open new perspectives for quantum-enhanced sensing and metrology in integrated photonics. Comment: 7 pages (+4 Supplementary Information), 5 figures
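
    The abstract does not give the transfer matrices explicitly; a standard idealization (an assumption here, not necessarily the device of the paper) models a balanced tritter as the 3-mode discrete Fourier transform, the direct generalization of a 50:50 beam splitter, spreading each input photon with equal probability 1/3 over the three outputs:

        \[
        U_{\mathrm{tritter}} = \frac{1}{\sqrt{3}}
        \begin{pmatrix}
        1 & 1 & 1 \\
        1 & e^{i 2\pi/3} & e^{i 4\pi/3} \\
        1 & e^{i 4\pi/3} & e^{i 2\pi/3}
        \end{pmatrix},
        \qquad
        \bigl(U_{\mathrm{tritter}}\bigr)_{jk} = \frac{1}{\sqrt{3}}\, e^{i 2\pi jk/3},
        \quad j,k = 0,1,2,
        \]

    and, analogously, a balanced "quarter" as the 4-mode discrete Fourier transform with entries \( \tfrac{1}{2} e^{i 2\pi jk/4} \).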

    Co-transplantation of Human Embryonic Stem Cell-derived Neural Progenitors and Schwann Cells in a Rat Spinal Cord Contusion Injury Model Elicits a Distinct Neurogenesis and Functional Recovery

    Co-transplantation of neural progenitors (NPs) with Schwann cells (SCs) might be a way to overcome the low rate of neuronal differentiation of NPs following transplantation in spinal cord injury (SCI) and to improve locomotor recovery. In this study, we initially generated NPs from human embryonic stem cells (hESCs) and investigated their potential for neuronal differentiation and functional recovery when co-cultured with SCs in vitro and co-transplanted in a rat model of acute contused SCI. Co-cultivation results revealed that the presence of SCs provided a consistent environment for hESC-NPs and shifted their differentiation toward a predominantly neuronal fate. Following transplantation, significant functional recovery was observed in all engrafted groups (NPs, SCs, NPs+SCs) relative to the vehicle and control groups. We also observed that animals receiving co-transplants reached a better state as assessed by the BBB functional test. Immunohistofluorescence evaluation five weeks after transplantation showed enhanced neuronal differentiation and limited proliferation in the co-transplanted group when compared to the group grafted with hESC-NPs alone. These findings demonstrate that co-transplantation of SCs with hESC-NPs can offer a synergistic effect, promoting neuronal differentiation and functional recovery.

    Evaluation of hydrometric network efficacy and user requirements in the Republic of Ireland via expert opinion and statistical analysis

    Decreased funding and shifting governmental priorities have resulted in a contraction of hydrometric measurement in many regions over the past two decades. Moreover, concerns exist with respect to appropriate data usage and (transboundary) exchange, in addition to the compatibility and extent of existing hydrometric datasets. These issues are undoubtedly magnified by enhanced data demands and increased financial pressures on network managers, requiring new approaches to optimising the societal benefits and overall efficacy of hydrometric information for future socio-hydrological resilience. The current study employed a quantitative cross-sectional expert elicitation of 203 respondents to collate, analyse and assess hydrometric network users’ opinions, knowledge and experience. Current usage patterns, perceived network strengths, requirements, and limitations have been identified and discussed within the context of hydrometric resilience in a changing social, economic and natural environment. Findings indicate that small (<30 km²) catchment data are most frequently employed in the Republic of Ireland, particularly with respect to extreme event prediction and flood management. Similarly, small catchments and areas characterised by previous/recent flooding were prioritised for resilience management via network amendment. Over half of those surveyed (50.5%) reported the current network as inadequate for their professional requirements. Conversely, respondents indicated that network efficacy has improved (53.2%) or remained stable (26.6%) over the course of their professional careers; however, improvements (as defined by individual respondents, i.e. network density, data quality, data availability) have not occurred at a sufficient rate. User-defined efficacy (adequacy, resilience) was found to be a somewhat vague, multivariate concept with no single predictor identified; general data quality, network density, and urban catchment data were the most significant issues among respondents. A significant majority (85.4%) of respondents indicated that future resilience would be best achieved via amendment of network density, with over 60% favouring geographically and/or categorically focused network increases, as opposed to more general national increases.