
    Safe abstractions of data encodings in formal security protocol models

    When using formal methods, security protocols are usually modeled at a high level of abstraction. In particular, data encoding and decoding transformations are often abstracted away. However, if no assumptions at all are made on the behavior of such transformations, they could trivially lead to security faults, for example leaking secrets or breaking freshness by collapsing nonces into constants. In order to address this issue, this paper formally states sufficient conditions, checkable on sequential code, such that if an abstract protocol model is secure under a Dolev-Yao adversary, then a refined model, which takes into account a wide class of possible implementations of the encoding/decoding operations, is also secure under the same adversary model. The paper also indicates possible exploitations of this result in the context of methods based on formal model extraction from implementation code and of methods based on automated code generation from formally verified models.
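    The paper's formal conditions are not reproduced in the abstract, but the intuition can be illustrated with a simple property check: an encoding that is injective and invertible cannot collapse distinct nonces into constants. The encode/decode pair below is purely hypothetical, chosen only to make the properties concrete, and is not taken from the paper.

```python
import os

# Hypothetical encode/decode pair, for illustration only (not the paper's model).
# A length-prefixed encoding is injective: distinct messages always yield
# distinct encodings, so nonce freshness is preserved.
def encode(msg: bytes) -> bytes:
    return len(msg).to_bytes(4, "big") + msg

def decode(blob: bytes) -> bytes:
    n = int.from_bytes(blob[:4], "big")
    return blob[4:4 + n]

# Two properties a "safe" transformation should satisfy:
nonces = [os.urandom(16) for _ in range(100)]
assert all(decode(encode(x)) == x for x in nonces)       # decoding inverts encoding
assert len({encode(x) for x in nonces}) == len(nonces)   # no two nonces collapse
```

    A transformation violating either check (for example, one that truncates every message to a fixed constant) would satisfy a naive abstract model while silently breaking secrecy or freshness in the refined one.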

    Formal Verification of Security Protocol Implementations: A Survey

    Automated formal verification of security protocols has been mostly focused on analyzing high-level abstract models which, however, are significantly different from real protocol implementations written in programming languages. Recently, some researchers have started investigating techniques that bring automated formal proofs closer to real implementations. This paper surveys these attempts, focusing on approaches that target the application code that implements protocol logic, rather than the libraries that implement cryptography. Under these approaches, the libraries are assumed to correctly implement some models. The aim is to derive formal proofs that, under this assumption, give assurance about the application code that implements the protocol logic. The two main approaches of model extraction and code generation are presented, along with the main techniques adopted for each approach.

    The role of TcdB and TccC subunits in secretion of the Photorhabdus Tcd toxin complex

    The Toxin Complex (TC) is a large multi-subunit toxin encoded by a range of bacterial pathogens. The best-characterized examples are from the insect pathogens Photorhabdus, Xenorhabdus and Yersinia. They consist of three large protein subunits, designated A, B and C, that assemble in a 5:1:1 stoichiometry. Oral toxicity to a range of insects means that some have the potential to be developed as pest control technology. The three subunit proteins do not encode any recognisable export sequences and, as such, little progress has been made in understanding their secretion. We have developed heterologous TC production and secretion models in E. coli and used them to ascribe functions to different domains of the crucial B+C sub-complex. We have determined that the B and C subunits use a secretion mechanism that is either encoded by the proteins themselves or relies on an as yet undefined system common to laboratory strains of E. coli. We demonstrate that the N-terminal domains of both the B and C subunits are required for secretion of the whole complex. We propose a model whereby the N-terminus of the C-subunit toxin exports the B+C sub-complex across the inner membrane while that of the B-subunit allows passage across the outer membrane. We also demonstrate that, even in the absence of the B-subunit, the C-subunit can facilitate secretion of the larger A-subunit. The recognition of this novel export system is likely to be of importance to future protein secretion studies. Finally, the identification of homologues of B and C subunits in diverse bacterial pathogens, including Burkholderia and Pseudomonas, suggests that these toxins are likely to be important in a range of different hosts, including humans.

    Homogeneously derived transit timings for 17 exoplanets and reassessed TTV trends for WASP-12 and WASP-4

    19 pages, 4 figures, 6 tables; revised manuscript submitted to MNRAS; online-only supplements are in the download archive. We homogeneously analyse ∼3.2 × 10⁵ photometric measurements for ∼1100 transit light curves belonging to 17 exoplanet hosts. The photometric data cover 16 years (2004–2019) and include amateur and professional observations. Old archival light curves were reprocessed using up-to-date exoplanetary parameters and empirically debiased limb-darkening models. We also derive self-consistent transit and radial-velocity fits for 13 targets. We confirm the nonlinear transit timing variation (TTV) trend in the WASP-12 data at high significance and with a consistent magnitude. However, Doppler data reveal hints of a radial acceleration of about −7.5 ± 2.2 m s⁻¹ yr⁻¹, indicating the presence of unseen distant companions and suggesting that roughly 10 per cent of the observed TTV was induced via the light-travel (or Roemer) effect. For WASP-4, a similar TTV trend suspected after the recent TESS observations appears controversial and model-dependent. It is not supported by our homogeneous TTV sample, including 10 ground-based EXPANSION light curves obtained in 2018 simultaneously with TESS. Even if the TTV trend itself does exist in WASP-4, its magnitude and tidal nature are uncertain. Doppler data cannot entirely rule out the Roemer effect induced by possible distant companions. Peer reviewed
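    The quoted radial acceleration maps onto a timing drift through the standard light-travel (Roemer) relation: a constant line-of-sight acceleration a displaces the host star by a·t²/2, delaying observed transit times by that distance divided by c. The back-of-the-envelope sketch below reuses the paper's quoted numbers but is not code from the paper's analysis.

```python
# Back-of-the-envelope Roemer (light-travel) timing drift from a constant
# radial acceleration: line-of-sight displacement a*t**2/2, divided by the
# speed of light. Illustrative order-of-magnitude only.
C = 2.998e8      # speed of light, m/s
YEAR = 3.156e7   # seconds per year

a = 7.5 / YEAR   # |radial acceleration| quoted for WASP-12, converted to m/s^2
t = 16 * YEAR    # observation baseline (2004-2019), in seconds

delta_t = 0.5 * a * t**2 / C   # accumulated transit-time shift, seconds
print(f"Roemer drift over 16 yr: {delta_t:.0f} s (~{delta_t / 60:.1f} min)")
```

    This gives a shift on the order of a hundred seconds over the full baseline; how much of the observed TTV it explains depends on the paper's full joint fit, which the sketch does not reproduce.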

    Contemporary presence of dynamical and statistical production of intermediate mass fragments in midperipheral ^{58}Ni+^{58}Ni collisions at 30 MeV/nucleon

    The ^{58}Ni+^{58}Ni reaction at 30 MeV/nucleon has been experimentally investigated at the Superconducting Cyclotron of the INFN Laboratori Nazionali del Sud. In midperipheral collisions, the production of massive fragments (4 ≤ Z ≤ 12), consistent with the statistical fragmentation of the projectile-like residue and the dynamical formation of a neck joining projectile-like and target-like residues, has been observed. The fragments coming from these different processes differ both in charge distribution and in isotopic composition. In particular, it is shown that these fragment-production mechanisms act simultaneously within the same event. Comment: 9 pages, minor corrections

    Ductal lavage: a way of carefully tracing the breast-secreting duct

    Background: Breast cancer is the most commonly diagnosed neoplasia in women after nonmelanoma skin tumors. Unfortunately, present-day diagnostic methods are unable to identify the presence of a cancer until it has been developing for several years. Currently, ductal lavage seems to represent a new method of reaching an early diagnosis of breast cancer. Materials & methods: This study analyzed 30 patients with ages ranging from 40 to 55 years; in 26 of these patients, we were able to obtain a sufficient quantity of material for cytological and biomolecular analysis. Results & conclusion: We propose an easy, reproducible method that makes it possible to obtain a detailed map of the nipple, in order to re-identify the duct orifice and take a series of repeated samples from it over a period of time. This procedure is a promising screening and translational research tool since it provides the quantity and quality of ductal fluid required for subsequent cytological and biomolecular analyses.

    A novel approach for security function graph configuration and deployment

    Network virtualization has increased the versatility of security protection by easing the development of new security function implementations. The drawback of this opportunity is that a security provider, in charge of configuring and deploying a security function graph, has to choose the best virtual security functions from a pool so large that manual decisions are unfeasible. In light of this problem, the paper proposes a novel approach for synthesizing virtual security services by introducing a functionality abstraction. This new level of abstraction makes it possible to work at the virtual level without considering the different function implementations, with the objective of postponing function selection, jointly with deployment, until after the configuration of the virtual graph. This makes it possible to optimize the function selection even when the pool of available functions is very large. A framework supporting this approach has been implemented, and it showed adequate scalability for the requirements of modern virtual networks.

    ICZM and WTP of stakeholders for beach conservation: Policymaking suggestions from an Italian case study

    In accordance with integrated coastal zone management (ICZM), private stakeholders could be asked to pay for the benefits from beach conservation projects. Since a private contribution is measured by the amount of other goods a person is willing to give up for beach quality, it can be solicited in monetary terms or, when possible, in other forms, such as specific works. In this paper, by analysing the results of two surveys in Italy concerning stakeholders' perceptions of ICZM and their willingness to pay for these benefits, suggestions for beach management are provided to policymakers. One survey focuses on beach visitors, who are asked to pay in monetary terms, while the other focuses on sunbathing establishment managers, who are asked to pay not only in monetary terms but also through beach works. The results show that the majority of these stakeholders are fully or partially aware of what ICZM is, and are unwilling to pay. However, regression analysis of those willing to pay suggests that promoting an information and education campaign about ICZM may be important if stakeholders' probability of paying is to be increased.

    Reshaping disaster management: An integrated community-led approach

    The management of disasters has traditionally involved public, private, and nongovernmental organisations working together. While scholars have examined the value of collaborations among these entities, less is known about how to successfully engage and empower communities in disaster management. Based on network governance theory, this article contributes to the growing body of public management literature on community engagement by presenting findings from an Australian research initiative conducted after the 2019/20 Black Summer bushfires in New South Wales. Through workshops and semi-structured interviews with a total of 58 members from local communities and emergency agencies, this paper identifies differing perspectives on power distribution among stakeholders, indicating complexities in achieving an integrated and community-led disaster management approach. The findings underscore the need to shift from exclusively centralised to more inclusive systems, recognising the unique contributions of nonofficial community-based groups. To address this, the study suggests: a funded community consultation committee, ensuring government and local community representation; collaborative debriefing sessions, leveraging technology for knowledge capture; and the adoption of different leadership styles able to identify, include, and integrate communities as both steerers and rowers within established hierarchical arrangements. Points for practitioners: The current centralised emergency management system, which relies on recognised experts and state-controlled facilities, limits the integration of nonofficial resources and community-based knowledge. A shift towards a more community-centric and integrated approach (collaborative polycentric governance) is needed to enhance disaster resilience and response in Australia. Different stages of disaster reduction could and should have different leadership styles: a transformational, collaborative, community-based style should be implemented before and after the disaster, while a transactional leadership style, more focused on restructuring the system or how it is applied, should be adopted during the disaster.