
    NATCracker: NAT Combinations Matter

    In this paper, we report our experience in working with Network Address Translators (NATs). Traditionally, only four types of NAT were distinguished, and for each type the (im)possibility of traversal is well known. Recently, the NAT community has provided a deeper dissection of NAT behaviors, resulting in at least 27 types, and has documented the (im)possibility of traversal for some of them. There are, however, two fundamental issues that were not previously tackled by the community. First, given this more elaborate set of behaviors, it is incorrect to reason about traversing a single NAT; combinations must be considered instead, and we have found no study that comprehensively states, for every possible combination, whether direct connectivity with no relay is feasible. Such a statement is the first outcome of this paper. Second, there is a serious need for a formalism for reasoning about NATs, which is the second outcome of this paper. The results were obtained using our own scheme, an augmentation of currently known traversal methods. The scheme is validated by reasoning in our formalism, by simulation, and by implementation in a real P2P network.
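    Why pairwise combinations matter can be illustrated with a deliberately simplified model. The sketch below encodes only the RFC 4787 mapping behaviors and a commonly cited rule of thumb for UDP hole punching; it is not the paper's formalism, and real feasibility also depends on filtering and port-allocation behaviors, which is precisely why the paper enumerates full combinations.

```python
# Illustrative sketch (not the paper's 27-type taxonomy): classify each NAT
# by its RFC 4787 mapping behavior only, and apply the rough rule that
# plain UDP hole punching fails when BOTH NATs allocate a fresh,
# unpredictable mapping per destination.

EIM = "endpoint-independent"          # same external port reused for all peers
ADM = "address-dependent"             # new mapping per destination address
APDM = "address-and-port-dependent"   # new mapping per destination addr:port

def hole_punching_feasible(nat_a: str, nat_b: str) -> bool:
    """Direct connectivity is (roughly) feasible unless both NATs
    allocate per-destination mappings that the peer cannot predict."""
    unpredictable = {ADM, APDM}
    return not (nat_a in unpredictable and nat_b in unpredictable)

# Enumerate every combination of the three mapping behaviors:
for a in (EIM, ADM, APDM):
    for b in (EIM, ADM, APDM):
        print(f"{a} <-> {b}: {hole_punching_feasible(a, b)}")
```

    Even this toy model shows that feasibility is a property of the pair, not of either NAT alone: an address-and-port-dependent NAT is traversable when facing an endpoint-independent one, but not when facing another of its own kind.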

    Mesmerizer: An Effective Tool for a Complete Peer-to-Peer Software Development Life-cycle

    In this paper we present what are, in our experience, the best practices in Peer-to-Peer (P2P) application development and how we combined them in a middleware platform called Mesmerizer. We explain how simulation is an integral part of the development process and not just an assessment tool. We then present our component-based, event-driven framework for P2P application development, which can be used to execute multiple instances of the same application in a strictly controlled manner over an emulated network layer for simulation and testing, or a single application in a concurrent environment for deployment purposes. We highlight modeling aspects that are of critical importance for designing and testing P2P applications, e.g. the emulation of Network Address Translation and of bandwidth dynamics. We show how our simulator scales when emulating low-level bandwidth characteristics of thousands of concurrent peers while preserving a good degree of accuracy compared to a packet-level simulator.
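    The key property of such a framework is that the same components run unchanged whether events come from an emulated network or from real sockets, and that simulated runs are strictly controlled, i.e. reproducible. A minimal sketch of an event-driven core with deterministic ordering (names are illustrative, not Mesmerizer's API):

```python
import heapq

class Scheduler:
    """Tiny discrete-event scheduler: events fire in simulated-time order,
    with a sequence counter as tie-breaker so runs are deterministic."""

    def __init__(self):
        self._queue = []
        self._seq = 0

    def post(self, time, handler, payload):
        # The seq tie-breaker guarantees FIFO order among same-time events.
        heapq.heappush(self._queue, (time, self._seq, handler, payload))
        self._seq += 1

    def run(self):
        while self._queue:
            time, _, handler, payload = heapq.heappop(self._queue)
            handler(time, payload)

log = []
sched = Scheduler()
sched.post(2.0, lambda t, p: log.append((t, p)), "second")
sched.post(1.0, lambda t, p: log.append((t, p)), "first")
sched.run()
print(log)
```

    In a deployment build, the same handlers would instead be driven by a concurrent runtime reading from real network I/O; the components themselves need not change.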

    Robust design of steel and concrete composite structures

    Accidental events, such as impact loading, are rare events with a very low probability of occurrence, but their effects often lead to very high human losses and severe economic consequences. An adequate design should not only reduce the risk to the lives of the occupants, but should also minimize the disastrous results and enable quick rebuilding and reuse. A robust design prevents the complete collapse of the structure when only a limited part is damaged or destroyed. Design against disproportionate collapse is usually based on the residual strength or the alternate load path methods. Identification of an alternate path may lead to an effective and cost-efficient design for progressive collapse mitigation by redistributing the loads within the structure. The continuity of the frame and of the floor represents an essential factor contributing to a robust structural response, as it enables the development of 3D membrane action. A European project focusing on the robustness of steel and steel-concrete composite structures subjected to accidental loads is still ongoing. Within this project, the authors concentrated their studies on the redundancy of the structure provided by slab-beam floor systems as well as by ductile joint design. To this aim, two 3D full-scale substructures were extracted from a reference building and experimentally investigated to gain insight into the mechanisms that allow the activation of the alternate load path resources when a column collapses. The paper illustrates the main features of the specimens tested and of the experimental campaign. The preliminary results of the tests are presented and discussed.

    On The Feasibility Of Centrally-Coordinated Peer-To-Peer Live Streaming

    In this paper we present an exploration of central coordination as a way of managing P2P live-streaming overlays. The main point is to show the elements needed to construct a system based on this approach. A key element in its feasibility is a near-real-time optimization engine for peer selection. Organizing peers in a way that enables high bandwidth utilization, combined with peer selection optimized on multiple utility factors, makes it possible to achieve large source-bandwidth savings and to provide a high quality of user experience. The benefits of our approach are most evident when NAT constraints come into play.

    A GPU-enabled solver for time-constrained linear sum assignment problems

    This paper deals with solving large instances of the Linear Sum Assignment Problem (LSAP) under real-time constraints, using Graphical Processing Units (GPUs). The motivating scenario is an industrial application for P2P live streaming that is moderated by a central tracker which periodically solves LSAP instances to optimize the connectivity of thousands of peers. However, our findings are generic enough to be applied in other contexts. Our main contribution is a parallel version of a heuristic algorithm called Deep Greedy Switching (DGS) on GPUs using the CUDA programming language. DGS sacrifices absolute optimality in favor of a substantial speedup in comparison to classical LSAP solvers such as the Hungarian and auction methods. We show the modifications needed to parallelize the DGS algorithm and the performance gains of our approach compared to a sequential CPU-based implementation of DGS and a mixed CPU/GPU-based implementation of it.
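    The general shape of a "greedy + switching" heuristic for the LSAP can be sketched sequentially: build a cheap greedy assignment, then repeatedly apply improving exchanges until none remains. The sketch below is a simplified illustration of this family of local searches, not the authors' DGS algorithm or its CUDA parallelization.

```python
# Illustrative greedy-plus-switching local search for a min-cost LSAP
# (square cost matrix, row i assigned to exactly one column).

def greedy_switching(cost):
    n = len(cost)
    # Greedy phase: each row takes the cheapest still-free column.
    assign = [-1] * n
    free = set(range(n))
    for i in range(n):
        j = min(free, key=lambda c: cost[i][c])
        assign[i] = j
        free.remove(j)
    # Switching phase: 2-exchange between rows while total cost improves.
    improved = True
    while improved:
        improved = False
        for i in range(n):
            for k in range(i + 1, n):
                old = cost[i][assign[i]] + cost[k][assign[k]]
                new = cost[i][assign[k]] + cost[k][assign[i]]
                if new < old:
                    assign[i], assign[k] = assign[k], assign[i]
                    improved = True
    return assign, sum(cost[i][assign[i]] for i in range(n))

cost = [[4, 1, 3],
        [2, 0, 5],
        [3, 2, 2]]
assignment, total = greedy_switching(cost)
print(assignment, total)  # [1, 0, 2] 5
```

    Like DGS, such a heuristic trades a guarantee of optimality for speed; the independence of the candidate exchanges is what makes the switching phase amenable to massive parallelization on a GPU.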

    Direct Injection Liquid Chromatography High-Resolution Mass Spectrometry for Determination of Primary and Secondary Terrestrial and Marine Biomarkers in Ice Cores

    Many atmospheric organic compounds are long-lived enough to be transported from their sources to polar regions and high-mountain environments, where they can be trapped in ice archives. While inorganic components in ice archives have been studied extensively to identify past climate changes, organic compounds have rarely been used to assess paleo-environmental changes, mainly due to the lack of suitable analytical methods. This study presents a new method of direct-injection HPLC-MS analysis, without the need to pre-concentrate the melted ice, for the determination of a series of novel biomarkers in ice-core samples indicative of primary and secondary terrestrial and marine organic aerosol sources. Eliminating a preconcentration step reduces the contamination potential and decreases the required sample volume, thus allowing a higher time resolution in the archives. The method is characterised by limits of detection (LODs) in the range of 0.01-15 ppb, depending on the analyte, and its accuracy was evaluated through an interlaboratory comparison. We find that many components of secondary organic aerosol (SOA) are clearly detectable at concentrations comparable to those previously observed in replicate preconcentrated ice samples from the Belukha glacier, Russian Altai Mountains. Some compounds with low recoveries in preconcentration steps are now detectable with this new direct-injection method, significantly increasing the range of environmental processes and sources that become accessible for paleo-climate studies.

    Comparison of different methods of aggregation of model ensemble outcomes in the validation and reconstruction of real power plant signals

    Sensors are placed at various locations in a production plant to monitor the state of the processes and components. For plant state monitoring to be effective, the sensors themselves must be monitored, both to detect anomalies in their functioning and to reconstruct the correct values of the signals measured. In this work, the task of sensor monitoring and signal reconstruction is tackled with an ensemble of Principal Component Analysis (PCA) models handling individual overlapping groups of sensor signals, randomly generated according to the Random Feature Selection Ensemble (RFSE) technique. The outcomes of these models are combined using a Local Fusion (LF) technique based on the evaluation of the models' performance on a set of training patterns similar to the test pattern under reconstruction. The performance obtained using the LF method is compared to that of classical aggregation methods such as the Simple Mean (SM), Globally Weighted Average (GWA), Median (MD) and Trimmed Mean (TM), on a real case study concerning 215 signals monitored at a Finnish Pressurized Water Reactor (PWR) nuclear power plant.
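    The classical aggregation rules compared in the abstract are simple to state for a single signal reconstructed by several ensemble members. The sketch below illustrates them on hypothetical reconstructions; the Local Fusion weights, which in the paper come from each model's performance on training patterns similar to the test pattern, are represented here only by a generic weighted average.

```python
# Illustrative aggregation of ensemble reconstructions of one sensor value.

def simple_mean(xs):
    return sum(xs) / len(xs)

def median(xs):
    s = sorted(xs)
    n, mid = len(s), len(s) // 2
    return s[mid] if n % 2 else (s[mid - 1] + s[mid]) / 2

def trimmed_mean(xs, trim=1):
    # Drop the `trim` lowest and highest reconstructions, then average.
    s = sorted(xs)[trim:len(xs) - trim]
    return sum(s) / len(s)

def weighted_average(xs, ws):
    # Stand-in for performance-based fusion: models judged more reliable
    # (here, hypothetical weights) contribute more to the reconstruction.
    return sum(x * w for x, w in zip(xs, ws)) / sum(ws)

# Five PCA models reconstruct the same sensor value; one is an outlier.
recons = [10.1, 9.9, 10.0, 10.2, 14.0]
print(simple_mean(recons))          # pulled upward by the outlier
print(median(recons))               # robust to the outlier
print(trimmed_mean(recons))         # robust to the outlier
print(weighted_average(recons, [1, 1, 1, 1, 0.2]))
```

    The example shows why robust aggregators (median, trimmed mean) or performance-weighted fusion can outperform the simple mean when one ensemble member misreconstructs a signal.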

    Progressive collapse: the case of composite steel-concrete frames

    Residual strength and alternate load paths are two fundamental design strategies to ensure adequate resistance against the progressive collapse of structures. This paper presents an experimental study carried out on two full-scale steel and concrete composite frames to investigate their structural behaviour in case of a column collapse. The study focuses on the redundancy of the structure as provided by the beam-slab floor system as well as by the ductile beam-to-column joints. The specimens were ground-floor sub-frames 'extracted' from two reference buildings designed in accordance with the Eurocodes. The frames have the same overall dimensions but different column layouts, one symmetric and one asymmetric. In both tests, the collapse of an internal column was simulated. The paper presents the main features of the frames and the principal outcomes of the test on the symmetric frame.

    Steel-concrete frames under the column loss scenario: An experimental study

    Accidental events, such as impact loading or explosions, are rare events with a very low probability of occurrence. However, their effects often lead to very high human losses and economic consequences. An adequate design against these events should reduce the risk to the lives of the occupants, minimize the extent of the damage and enable quick rebuilding and reuse. A structure fulfilling these requirements is 'robust'. Different strategies can be pursued against accidental events; among them, methods based on the residual strength or the alternate load path are frequently adopted because they are applicable to a vast range of structures. Adequate design strategies based on them require an in-depth knowledge of the load transfer mechanisms from the damaged to the undamaged part of the structure. As for frames, the important role of joint ductility was pointed out in recent studies. Besides, the flooring systems substantially affect the spread of the damage, but research on this subject is still very limited. The present study focuses on steel-concrete composite frames under the column loss scenario. It aims to better understand the influence of both frame continuity and floor systems on the development of 3D membrane action. Two geometrically different 3D steel-concrete composite full-scale substructures were extracted from reference buildings and tested simulating the column collapse scenario. This paper illustrates the preparatory studies, the main features of the specimens and the outcomes of the first test. The test provided insight into the need for an enhanced design of joints and pointed out the key features of the response of the floor system.