
    Thermo-mechanical behaviour of a compacted swelling clay

    Compacted unsaturated swelling clay is often considered a possible buffer material for deep nuclear waste disposal. An isotropic cell permitting simultaneous control of suction, temperature and pressure was used to study the thermo-mechanical behaviour of this clay. Tests were performed at total suctions ranging from 9 to 110 MPa, temperatures from 25 to 80 °C and isotropic pressures from 0.1 to 60 MPa. It was observed that heating at constant suction and pressure induces either swelling or contraction. Compression tests at constant suction and temperature showed that, at lower suction, the yield pressure was lower and the elastic and plastic compressibility parameters were higher. At a given suction, on the other hand, the yield pressure was only slightly influenced by temperature and the compressibility parameters were insensitive to temperature changes. Thermal hardening was also evidenced by following a thermo-mechanical path of loading, heating, cooling and reloading.
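
    For readers less familiar with the terminology, the elastic and plastic compressibility parameters and the yield pressure referred to above are conventionally defined through Cam-Clay-type compression relations; the following is standard notation in unsaturated soil mechanics, sketched here as an aid and not taken from the paper itself:

```latex
% Conventional (Cam-Clay-type) notation for the parameters named above;
% standard usage in unsaturated soil mechanics, not the paper's own equations.
% v: specific volume, p: mean (isotropic) net stress, p_0: yield pressure,
% \kappa: elastic compressibility parameter, \lambda: plastic compressibility parameter.
\[
  dv = -\kappa \,\frac{dp}{p} \quad \text{(elastic, } p < p_0\text{)},
  \qquad
  dv = -\lambda \,\frac{dp}{p} \quad \text{(plastic/virgin, } p \ge p_0\text{)}.
\]
```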

    Performance of highly sensitive cardiac troponin T assay to detect ischaemia at PET-CT in low-risk patients with acute coronary syndrome: a prospective observational study.

    The highly sensitive troponin T (hs-TnT) assay has improved clinical decision-making for patients admitted with chest pain. However, this assay's performance in detecting myocardial ischaemia in a low-risk population has been poorly documented. Our aim was to assess the hs-TnT assay's performance in detecting myocardial ischaemia at positron emission tomography/CT (PET-CT) in low-risk patients admitted with chest pain. Patients admitted for chest pain with an inconclusive ECG and negative standard cardiac troponin T results at admission and after 6 hours were prospectively enrolled. hs-TnT samples were taken at T0, T2 and T6. Physicians were blinded to hs-TnT results. All patients underwent PET-CT at rest and during adenosine-induced stress; all patients with a positive PET-CT result underwent coronary angiography. Forty-eight patients were included. Six had ischaemia at PET-CT, and all of them had ≥1 significant stenosis at coronary angiography. Areas under the curve (95% CI) for predicting significant ischaemia at PET-CT using hs-TnT were 0.764 (0.515 to 1.000) at T0, 0.812 (0.616 to 1.000) at T2 and 0.813 (0.638 to 0.989) at T6. The receiver operating characteristic-based optimal cut-off value for hs-TnT at T0, T2 and T6 needed to exclude significant ischaemia at PET-CT was <4 ng/L. Using this value, the sensitivity, specificity, positive and negative predictive values of hs-TnT to predict significant ischaemia were 83%/38%/16%/94% at T0, 100%/40%/19%/100% at T2 and 100%/43%/20%/100% at T6, respectively. Our findings suggest that, in low-risk patients, the hs-TnT assay with a cut-off value of 4 ng/L has an excellent negative predictive value for excluding myocardial ischaemia at PET-CT, at the expense of weak specificity and positive predictive value. ClinicalTrials.gov Identifier: NCT01374607.
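
    To make the reported trade-off concrete, the following minimal Python sketch recomputes the positive and negative predictive values from the T2 summary figures (48 patients, 6 with ischaemia, sensitivity 100%, specificity 40%); the exact 2x2 counts are not given in the abstract, so they are reconstructed here under the assumption that the reported sensitivity and specificity are exact:

```python
# Minimal sketch: back-of-the-envelope PPV/NPV from the reported T2 figures.
# The 2x2 counts below are assumptions consistent with the reported summary
# statistics, not numbers taken directly from the study.

n_total = 48
n_pos = 6                      # patients with ischaemia at PET-CT
n_neg = n_total - n_pos        # 42 patients without ischaemia

sensitivity = 1.00             # reported at T2
specificity = 0.40             # reported at T2

tp = sensitivity * n_pos       # 6 true positives
fn = n_pos - tp                # 0 false negatives
tn = specificity * n_neg       # ~17 true negatives
fp = n_neg - tn                # ~25 false positives

ppv = tp / (tp + fp)           # ~0.19, matching the reported 19%
npv = tn / (tn + fn)           # 1.00, matching the reported 100%
print(f"PPV ~ {ppv:.2f}, NPV ~ {npv:.2f}")
```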

    The Doughnut framework: from theory to local applications in Switzerland—literature review & practical lessons

    The Doughnut conceptual framework, originally developed by the economist Kate Raworth, delineates a “safe and just space” for human activities, located between a social foundation and an ecological ceiling. A targeted literature review shows that over the past decade this visually appealing, holistic and scientifically grounded framework has gained attention for its potential to guide socio-ecological transitions globally. At the same time, it has faced various theoretical critiques related to its scientific validity, social justice considerations, and challenges in local adaptations. This article seeks to bring clarity to the critiques often directed at the Doughnut, distinguishing those that refer to the Doughnut framework itself from those arising when attempting to implement it locally. It does so by drawing on two action research projects conducted in Switzerland using specific approaches for its practical application at the local level: the Doughnut Unrolled methodology and the Data Portrait quantification. Moreover, it identifies the difficulties in maintaining the integrity of the framework and its strong sustainability principles when applied at regional or municipal scales. This article thus contributes to the discussion on the often-overlooked gap between the global conceptual framework and its practical implementation at the local level, particularly in Global North contexts. To address this gap and avoid any kind of “Doughnut-washing”, this article proposes six guiding principles for maintaining the framework’s integrity in local implementations. It argues that, by applying these principles, the Doughnut framework can retain its transformative potential while remaining scientifically robust and actionable at various governance scales.

    Digital technology, a societal choice compatible with the ecological transition? Telework as a case study

    Over the past five years, studies and calls for attention concerning the environmental and social impact of digital technology have multiplied. Given the urgency of returning within planetary boundaries, this Commentary proposes avenues for grasping the issues raised by the mobilisation of digital tools in societal choices. The first part shows that digital technology accounted for 2-4% of global greenhouse gas (GHG) emissions in 2021, with an annual growth rate of 6-9%, which could double its impact before 2030. Considering the need to reduce global emissions by 50% by 2030 and by 95% by 2050 in order to reach carbon neutrality and comply with the Paris Agreement, the current and anticipated state of digital technology worldwide therefore appears unsustainable. At the European level, the well-documented case of France shows that user devices (computers, phones, TVs) are the source of 79% to 87% of (direct and indirect) GHG emissions and of waste production. In a digital sufficiency approach, what is required is therefore a drastic reduction in the demand for new devices, together with limits on over-availability and on incentives to buy new products such as connected objects. The second part classifies the impact benefits theoretically offered by digital technology according to the Avoid-Shift-Improve (ASI) model and shows that a fundamental questioning of perceived needs and of demand (Avoid) appears necessary within a consumer society. The case study that follows shows that telework could save 60-90% of GHG emissions when commuting is done by car, but that public and active transport remain less impactful when the round-trip home-office distance is below 30 km (trolleybus) or 100 km (train). The third part broadens the scope of the case study and identifies at least nine mechanisms inducing a rebound effect, classified into four types (microeconomic: direct effect and indirect effect; macroeconomic: market effect and growth effect). Based on the Doughnut model, at least four further types of undesirable effects generated by digital technology are added (on physical and mental health, working conditions, social equity, and biodiversity). In light of these elements, this Commentary concludes that it appears essential to carry out an extensive and rigorous examination of the harmful effects that could be generated by the current use of digital technology and, even more so, by future choices to deploy related technologies. Proposals for digital sufficiency at the societal level thus appear to be the most fruitful line of thinking for the future.
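
    As a quick arithmetic check of the doubling claim in the first part, the following minimal Python sketch takes the 2021 baseline and the reported 6-9% annual growth rates at face value (constant exponential growth is an assumption of the sketch):

```python
# Back-of-the-envelope check of the "doubling before 2030" claim above:
# with the digital footprint growing 6-9% per year from 2021, how long
# until it doubles?
import math

for growth in (0.06, 0.09):
    doubling_years = math.log(2) / math.log(1 + growth)
    print(f"{growth:.0%} annual growth -> doubling in ~{doubling_years:.1f} years "
          f"(around {2021 + doubling_years:.0f})")
# ~11.9 years at 6% (around 2033), ~8.0 years at 9% (around 2029):
# the upper end of the reported growth range doubles the impact just before 2030.
```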

    Four Lessons in Versatility or How Query Languages Adapt to the Web

    Exposing not only human-centered information but also machine-processable data on the Web is one of the commonalities of recent Web trends. It has enabled a new kind of applications and businesses where the data is used in ways not foreseen by the data providers. Yet this exposition has fractured the Web into islands of data, each in different Web formats: some providers choose XML, others RDF, still others JSON or OWL, even in similar domains. This fracturing stifles innovation, as application builders have to cope not only with one Web stack (e.g., XML technology) but with several, each of considerable complexity. With Xcerpt we have developed a rule- and pattern-based query language that aims to shield application builders from much of this complexity: in a single query language, XML and RDF data can be accessed, processed, combined, and re-published. Though the need for combined access to XML and RDF data has been recognized in previous work (including the W3C’s GRDDL), our approach differs in four main aspects: (1) We provide a single language (rather than two separate or embedded languages), thus minimizing the conceptual overhead of dealing with disparate data formats. (2) Both the declarative (logic-based) and the operational semantics are unified in that they apply to querying XML and RDF in the same way. (3) We show that the resulting query language can be implemented reusing traditional database technology, if desirable. Nevertheless, we also give a unified evaluation approach based on interval labelings of graphs that is at least as fast as existing approaches for tree-shaped XML data, yet provides linear-time and linear-space querying also for many RDF graphs. We believe that Web query languages are the right tool for declarative data access in Web applications and that Xcerpt is a significant step towards more convenient, yet highly efficient, data access in a “Web of Data”.
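
    As a rough illustration of the "islands of data" problem described above, the following Python sketch queries the same fact once through an XML stack (lxml/XPath) and once through an RDF stack (rdflib/SPARQL); the sample documents, names and URIs are invented for the sketch, and the code is ordinary Python, not Xcerpt syntax. This duplication of query machinery is exactly what a single rule- and pattern-based language aims to remove:

```python
# Illustration (not Xcerpt): the same fact exposed as XML and as RDF today
# requires two separate query stacks. Sample data and vocabulary are made up.
from lxml import etree          # XML stack: XPath
from rdflib import Graph        # RDF stack: SPARQL

XML_DOC = "<authors><author><name>Ada Lovelace</name></author></authors>"
RDF_DOC = """
@prefix foaf: <http://xmlns.com/foaf/0.1/> .
<http://example.org/ada> foaf:name "Ada Lovelace" .
"""

# Query 1: XPath against the XML island.
root = etree.fromstring(XML_DOC)
xml_names = root.xpath("/authors/author/name/text()")

# Query 2: SPARQL against the RDF island.
g = Graph()
g.parse(data=RDF_DOC, format="turtle")
rdf_names = [str(row.name) for row in
             g.query("SELECT ?name WHERE { ?s <http://xmlns.com/foaf/0.1/name> ?name }")]

print(xml_names, rdf_names)     # same information, two query languages
```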

    (Preprint) The Doughnut framework: from theory to local applications in Switzerland — literature review & practical lessons

    The Doughnut conceptual framework, developed by the economist Kate Raworth, is gaining considerable momentum. It is often framed as a representation of the normative objective that socio-ecological transitions are intended to achieve. This contribution sets out the main strengths and weaknesses of the framework, drawing on the one hand on the increasingly prolific literature on the subject and, on the other, on our recent practical experience of downscaling the Doughnut to a Swiss territory and a Swiss institution. It shows that, since its creation, the Doughnut has been used and remodelled along a continuum ranging from conceptual and theoretical purity, inspired by the planetary boundaries framework and theories of basic human needs, to a pragmatic tool aimed at guiding public action at the local level. In this respect, the local reinterpretation of the Doughnut raises several practical questions, which in turn can distance the local tool from the original framework. We suggest that, if the Doughnut is to remain a strong sustainability tool, some additional principles should be adopted by practitioners while downscaling it. Hence, we propose six guiding principles for maintaining the integrity of the Doughnut’s conceptual framework in its local variations.

    Is the Fed reacting to stock price fluctuations? Evidence from the Internet bubble

    The second half of the 1990s saw a major bull market in equities in the United States, followed by a bear market that began in Spring 2000. This experience has led a number of academics, journalists, and businesspeople to question the appropriate monetary policy response to a sharp run-up in stock prices. The present study contributes to this line of research by assessing whether the Federal Reserve (Fed) actually took stock prices into account when implementing its monetary policy during the Internet bubble of the late 1990s. Taylor (1993) claimed that a linear function of the deviation of current inflation from an inflation target and the output gap tracked the Fed funds rate fairly well between 1987 and 1992. Similarly, the empirical analysis carried out in this dissertation shows that this rule provides a good description of monetary policy over the 1987-1995 period. However, estimations over a more recent period suggest that the Fed funds rate did not follow the Taylor rule over 1996-2006, which raises the question of what explains this result. The hypothesis in this dissertation is that the weight of the two traditional Taylor rule variables was reduced in favour of a new variable, of increasing importance, to which the Fed reacted over this period. The main result of the analysis indicates a decreased influence of inflation on monetary policy decisions over the period under consideration, while the Fed was increasingly taking stock price fluctuations into account. Even though stock price fluctuations are not a stated goal of US monetary policy, the Fed's reactions to the extreme variations in stock prices during the Internet bubble decreased the importance of the goals of price stability and sustainable growth, which was made possible by the reduced inflation observed over the period.
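
    For reference, Taylor's (1993) original rule, described above as a linear function of the deviation of inflation from target and the output gap, is conventionally written as follows (a standard textbook formulation, not quoted from the dissertation):

```latex
% Taylor (1993) rule, standard formulation (not quoted from the dissertation).
% i_t   : target federal funds rate
% \pi_t : inflation over the previous four quarters
% y_t   : output gap (percent deviation of real GDP from trend)
% r^*   : equilibrium real rate, \pi^* : inflation target (both 2% in Taylor 1993)
\[
  i_t = r^* + \pi_t + 0.5\,(\pi_t - \pi^*) + 0.5\,y_t .
\]
% The dissertation's hypothesis amounts to adding a further reaction term,
% e.g. + \gamma\, s_t for some measure s_t of stock price fluctuations;
% the exact specification used there is not given in the abstract.
```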

    Detection and classification of zero-day malware through data mining and machine learning

    Given the constant increase, in both number and complexity, of cyber attacks, conventional detection mechanisms prove inefficient in most scenarios. In this context, the present research sets out to determine whether data mining and machine learning techniques can be used effectively to train algorithms capable of correctly detecting and classifying new types of threats.

    Detection and classification of zero-day malware through data mining and machine learning

    Many studies suggest that in recent years there has been an exponential increase in cyber attacks, causing organisations financial losses in the order of millions. While many companies devote time and resources to developing antivirus software, the complexity, propagation speed and polymorphic capabilities of modern viruses pose enormous challenges for these firms. Motivated by the search for new alternatives, the data science community has found that using machine learning and deep learning techniques for malware detection and classification can offer a highly competitive option. This research begins by extracting information from a dataset composed of eleven thousand ASM and bytes files belonging to nine distinct malware families. Then, by implementing machine learning algorithms, these malware samples are classified into their corresponding families. As a complement, a binary malware/non-malware classification is performed with a small set of benign programs, concluding with comparisons and conclusions.
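
    As a sketch of the kind of pipeline described above (feature extraction from the bytes files followed by supervised family classification), the following Python example uses scikit-learn; the file layout, the label file, the line format of the .bytes files and the choice of hashed byte bigrams with a logistic regression classifier are illustrative assumptions, not the thesis's actual implementation:

```python
# Minimal sketch: byte-bigram features from .bytes files, multi-class model
# for the nine malware families. Paths, label columns and the .bytes line
# format ("ADDRESS B1 B2 ... B16") are assumptions for illustration.
from pathlib import Path

import pandas as pd
from sklearn.feature_extraction.text import HashingVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

labels = pd.read_csv("trainLabels.csv")          # assumed columns: Id, Class (1..9)

def load_bytes(sample_id: str) -> str:
    # Drop the leading address column of each line, keep the hex byte tokens.
    path = Path("train") / f"{sample_id}.bytes"
    return " ".join(tok for line in path.read_text().splitlines()
                    for tok in line.split()[1:])

docs = [load_bytes(i) for i in labels["Id"]]
y = labels["Class"]

# Hash byte unigrams/bigrams (the "words" are hex byte tokens) into a fixed space.
vec = HashingVectorizer(analyzer="word", ngram_range=(1, 2),
                        n_features=2**18, alternate_sign=False)
X = vec.transform(docs)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2,
                                          stratify=y, random_state=0)
clf = LogisticRegression(max_iter=1000)
clf.fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te)))
```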