192 research outputs found

    Ultrathin Acoustic Parity-Time Symmetric Metasurface Cloak

    Invisibility or unhearability cloaks have been made possible by using metamaterials that enable light or sound to flow around an obstacle without a trace of reflections or shadows. Metamaterials are known for being flexible building units that can mimic a host of unusual and extreme material responses, which are essential when engineering artificial material properties to realize a coordinate-transforming cloak. Bending and stretching the coordinate grid in space requires stringent material parameters; therefore, small inaccuracies and inevitable material losses become sources of unwanted scattering that are detrimental to the desired effect. These obstacles further limit the possibility of achieving robust concealment of sizeable objects from either radar or sonar detection. By using an elaborate arrangement of gain and lossy acoustic media respecting parity-time symmetry, we built a one-way unhearability cloak able to hide objects seven times larger than the acoustic wavelength. Generally speaking, our approach has no limits in terms of working frequency, shape, or size; specifically, we demonstrate how, in principle, an object the size of a human can be hidden from audible sound.

    Theory of holey twistsonic media

    Rotating two overlapping lattices relative to each other produces the well-known moiré interference patterns and has surprisingly led to strongly correlated superconductivity in twisted bilayer graphene. This seminal effect, which is associated with electrons occupying flat dispersion bands, has stimulated a surge of activity in classical wave physics, such as acoustics, to explore equivalent scenarios. Here, we mimic twisted-bilayer physics by employing a rigorous sound wave expansion technique to conduct band engineering in holey bilayer plates, i.e., twistsonic media. Our numerical findings show how one can flexibly design moiré sound interference characteristics that are controlled solely by the twist angle and the interlayer air separation. More specifically, our numerical approach provides a significant advantage in both computational speed and storage size in comparison with widely used commercial finite-element-method solvers. We foresee that our findings should stimulate further studies in terms of band engineering and exotic topological twisted phases.
    J.C. acknowledges the support of the European Research Council (ERC) through the Starting Grant 714577 PHONOMETA. Z.Z. acknowledges the support of the NSFC (12104226), the China National Postdoctoral Program for Innovative Talents (BX20200165), and the China Postdoctoral Science Foundation (2020M681541). D.T. acknowledges the support of MINECO through a Ramón y Cajal grant (Grant No. RYC-2016-21188) and of the Ministry of Science, Innovation and Universities through project number RTI2018-093921-A-C42.
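As a rough illustration of the plane-wave-type expansion family the abstract refers to, the sketch below solves a one-dimensional scalar analogue: Bloch bands of sound in a periodic two-speed medium. The bilayer-plate formulation in the paper is far more elaborate; the geometry, parameters, and scalar wave equation here are illustrative assumptions only:

```python
# 1-D plane-wave expansion for u'' + w^2 * chi(x) * u = 0 with periodic
# chi(x) = 1/c(x)^2 (two alternating sound speeds). All parameters are
# illustrative, not taken from the paper.
import numpy as np
from scipy.linalg import eigh

a = 1.0                        # lattice period (arbitrary units)
c1, c2, fill = 1.0, 3.0, 0.5   # layer sound speeds and filling fraction
N = 15                         # plane-wave cutoff: G = 2*pi*n/a, |n| <= N
G = 2 * np.pi / a * np.arange(-N, N + 1)

# Fourier-coefficient matrix of chi(x) for a step (two-layer) profile.
Gd = G[:, None] - G[None, :]
B = (Gd == 0) / c2**2 + (1 / c1**2 - 1 / c2**2) * fill * np.sinc(Gd * fill * a / (2 * np.pi))

def bands(k):
    """Bloch frequencies at wavenumber k: (k+G)^2 u_G = w^2 sum_G' chi_{G-G'} u_G'."""
    A = np.diag((k + G) ** 2)
    w2 = eigh(A, B, eigvals_only=True)   # generalized eigenproblem A u = w^2 B u
    return np.sqrt(np.clip(w2, 0.0, None))

w0 = bands(0.0)          # Gamma point: the acoustic branch starts at w = 0
wX = bands(np.pi / a)    # zone edge: a band gap opens whenever c1 != c2
print(round(w0[0], 6), wX[1] - wX[0] > 0)
```

The same structure (a diagonal kinetic matrix against a Toeplitz material matrix) underlies higher-dimensional expansion methods; the speed advantage claimed in the abstract comes from how cheaply such dense but small eigenproblems solve compared with volumetric finite-element meshes.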

    Multiple scattering theory of non-Hermitian sonic second-order topological insulators

    Topological phases of sound enable unconventional confinement of acoustic energy at the corners of higher-order topological insulators. These unique states, which go beyond the conventional bulk-boundary correspondence, have recently been extended to non-Hermitian wave physics comprising finite crystal structures that include loss and gain units. We use a multiple scattering theory to calculate these topologically trapped complex states, which agree very well with finite-element predictions. Moreover, our semi-numerical tool allows us to compute the spectral dependence of corner states in the presence of defects, illustrating the limits of the topological resilience of these confined non-Hermitian acoustic states.

    The K-T and Tertiary-Pleistocene South American mammalian turnovers: similar phenomena?

    The history of South American mammals has been episodic, apparently "stratified", with the "strata" relatively few in number and, as a rule, sharply and clearly separable. This is a consequence of the physical history of the continent. The fossil record shows that there were two great episodes characterized by drastic turnovers of mammal communities; both appear related to two of the most drastic physical changes withstood by the continent. The older episode is related to the separation of Africa from the other Gondwanan continents (shaping the primordial outlines of the eastern coast of the incipient Southern Atlantic Ocean), and to a sporadic connection of the South American plate with the North American plate. This led to the first great turnover: with the exception of two Gondwanan taxa (Monotremes and Gondwanatheres), and probably a third (Dryolestida), all the Gondwanan mammals (all non-tribosphenic taxa) became extinct and were "replaced" by Laurasian tribosphenic marsupial and placental immigrants. Because of the early extinction (early Paleocene) of the Gondwanan non-tribosphenic survivors, and the subsequent isolation of the continent (including, at least, the Antarctic Peninsula), unique communities composed solely of quite endemic (native) marsupials and placentals were built up. As a consequence of the inter-American connection via the newborn Central America, an increasing biotic interchange began. The second great turnover, involving dispersal, extinction, and survival, built up quite peculiar mammalian communities. These are the new basic mammal communities that, after the "Megafaunal Extinction" and the addition of a few select immigrants, distinguish the present Neotropical Region. Apparently this second great turnover was accomplished by replacement, not by displacement as long thought. Because no mammals have been found in rocks representing the K-T transition, there is no record with which to analyze the modus operandi of the transcendental first turnover.
    Facultad de Ciencias Naturales y Muse

    LABOR DISCRIMINATION PRACTICES IN MEXICO BASED ON CRIMINAL RECORDS

    The general objective of this research is to analyze in depth the problems surrounding discriminatory practices in the workplace against formerly convicted persons in Mexico, so that a viable solution can be proposed. Its main purpose is to examine and describe the elements that make up labor discrimination in Mexico, particularly discrimination based on criminal records, setting out a series of definitions in order to dispel doubts and settle implicit questions, since non-discrimination is of course necessary for the proper and better functioning of a society. The topic of equality will probably recur throughout this research; in the words of the distinguished Dr. Jesús Rodríguez Zepeda: "The text insists again and again on the status of non-discrimination as a fundamental right, for access to socially available opportunities", above all the right to work. With respect to the topic at hand, it is necessary to define what a just society is, understanding that "a just society is one in which there is no, or at least no significant, contemptuous treatment of entire groups on account of a characteristic or attribute (such as having a criminal record) that has, moreover, been stigmatized and associated with inferiority and lack of worth". This refers to the stigma under which persons with criminal records are mistakenly categorized, and to their exclusion from the labor market.

    Resource Optimization at the Food Bank of Madrid: Addressing Purchasing and Provisioning Needs

    Máster Universitario en Ingeniería Industrial + Máster en Industria Conectada / Master in Smart Industry
    RESOURCE OPTIMIZATION FOR THE FOOD BANK OF MADRID: ADDRESSING PURCHASING AND PROVISIONING NEEDS IN THE CONTEXT OF COVID-19
    The Covid-19 pandemic places Spanish society at a juncture in which both the health system and the social conscience are put to the test while the economy falters. In this context, the coming months will likely be marked, among other considerations, by an enormous effort to foresee and limit the magnitude of the coming economic recession (whose effects are already beginning to be glimpsed) on the different groups that make up society.
    In a democratic, humane, and socially responsible environment, it is relevant to pay special attention to those groups that, during recessions, may find themselves in a situation of greater urgency or economic need. Lacking sufficient means to fight the recession on their own, these groups will find themselves in a situation of greater defenselessness and exclusion; if appropriate means are not deployed to cover their basic needs, this situation might even promote a certain social destabilization affecting the remaining groups. Driven by this concern and uncertainty, the main goal of this work is to support the labor of organizations that assist disadvantaged groups, aiming to contribute to the satisfaction of their most basic and urgent needs. More specifically, this project undertakes a strategic study of provisioning for the beneficiaries of the Food Bank of Madrid, seeking to determine how to maximize the social and nutritional utility of the economic resources available to the organization.
    To undertake this study, the project relies on a mathematical-financial linear programming model that optimizes the food supply by minimizing the cost of covering the nutritional requirements of beneficiaries, who are clustered by variables such as age and sex. The model uses decision parameters such as 1) the macronutrient content and wholesale market price per kg of the different food groups available in the Spanish market, 2) the amount and type of food received as donations, and 3) the number of beneficiaries per age group to be served, among others. From these inputs, the model computes the weekly purchase basket that matches the nutritional needs of beneficiaries with the available resources, under a cost-minimization criterion in which the Food Bank of Madrid acts as a centralized decision-maker.
    Besides food purchase costs, an economic penalty has been introduced for each unit of nutrient not supplied to a population group. This penalty represents the social cost derived from failing to satisfy a given nutritional requirement for a particular population group. For instance, in the case of essential fatty acids, the penalty variable would be constituted, among other factors, by a quantitative metric capturing the cost of the medical treatments needed for the diseases caused by the nutrient deficit, such as obstructions in blood flow, liver disease, etc.
    To test the model and probe its sensitivities, two case studies are posed: 1) the response of the model in a basic case with a single individual per population group, which allows the coherence of the results to be checked quickly and serves to validate the formulation; and 2) a general case based on the real 2018 operation of the Food Bank of Madrid, using as inputs the donations produced during that year (scaled to a weekly basis) and the real number of beneficiaries by age and sex. Beyond testing the model on a realistic case, this second study may identify cost-cutting opportunities in the provisioning decisions of the Food Bank of Madrid. The objective of the model is not to produce a precise forecast of the final provisioning cost (which would require a more detailed representation of logistics and of possible price seasonality for some food groups), but rather to extract qualitative conclusions about the most relevant variables in the problem of provisioning food to a disadvantaged population. The results thus indicate how purchases of the different food groups should be planned over a weekly horizon and how they should be distributed among the beneficiary groups.
    From the obtained results, the following conclusions have been drawn:
    • The macronutrient distribution required by minors is more easily satisfied than that of adults, and women's needs are more easily covered than men's.
    • A quantitative model for optimizing food purchases in an organization such as the Food Bank of Madrid could reduce provisioning costs by more than 10%.
    • The objectives of minimizing provisioning costs and minimizing logistics costs (storage and distribution) are aligned, since optimizing provisioning reduces food quantities by over 9%.
    • The optimization leads to a "zero hunger" result.
    Ultimately, the proposed model, which satisfactorily represents the nutritional requirements of the beneficiary groups, has the potential to express the cost of covering those needs, and may therefore serve as a tool for designing new provisioning strategies or for evaluating further economic and social aid to the organization. From these studies, a research paper entitled "The Food Bank of Madrid: A Linear Model for Optimal Nutrition", first-authored by the author of this master's thesis, has been produced and is under review at the European Journal of Nutrition.
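The cost-minimization formulation described above (purchase costs plus a penalty on unmet nutrients, subject to nutritional requirements) can be sketched as a toy linear program. All food groups, prices, nutrient contents, requirements, and the penalty value below are invented for illustration; they are not the thesis data, and the real model covers many more foods, nutrients, and population clusters:

```python
# Toy sketch of the purchase-cost + social-penalty linear program.
# Every number here is illustrative, not taken from the thesis.
from scipy.optimize import linprog

foods = ["cereals", "legumes", "oil"]
price = [0.8, 1.5, 2.0]               # EUR per kg (illustrative)
protein = [10.0, 22.0, 0.0]           # g of protein per kg
fat = [2.0, 1.5, 100.0]               # g of fat per kg
req_protein, req_fat = 400.0, 500.0   # weekly requirements (illustrative)
penalty = 5.0                         # EUR per g of unmet nutrient (social cost)

# Variables: kg of each food, plus one slack per nutrient (unmet amount).
c = price + [penalty, penalty]

# supplied nutrient + slack >= requirement, rewritten as A_ub @ x <= b_ub.
A_ub = [
    [-p for p in protein] + [-1.0, 0.0],
    [-f for f in fat] + [0.0, -1.0],
]
b_ub = [-req_protein, -req_fat]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 5)
# The optimizer buys the cheapest per-gram nutrient sources and leaves
# the slack (unmet requirement) at zero when the penalty dominates.
print(dict(zip(foods, res.x[:3].round(2))), round(res.fun, 2))
```

Scaling the penalty relative to food prices reproduces the model's key behavior: when the unmet-nutrient penalties dominate, the slack variables are driven to zero, i.e., the "zero hunger" outcome reported above.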

    Ancient Landslide Reactivation at the Viaduct No. 1 Located on the Caracas-La Guaira Highway in Venezuela

    Reactivation of an ancient landslide, detected in 1987, affected the southern side of Viaduct No. 1 on the Caracas-La Guaira highway, which connects Caracas, the capital of Venezuela, with its main seaport and the Simon Bolivar International Airport. The Viaduct was built in 1953 and crossed a gorge approximately 300 m wide. It consisted of three parallel double-hinged arch ribs made of plain concrete, spanning approximately 152 m, and two smaller access viaducts on either side of the arch-rib span. This paper summarizes the results of the geotechnical investigation, the evaluation of inclinometer readings and surface control points, and the main rehabilitation measures conducted on the structure.

    KheOps: Cost-effective Repeatability, Reproducibility, and Replicability of Edge-to-Cloud Experiments

    Distributed infrastructures for computation and analytics are now evolving towards an interconnected ecosystem allowing complex scientific workflows to be executed across hybrid systems spanning from IoT Edge devices to Clouds, and sometimes to supercomputers (the Computing Continuum). Understanding the performance trade-offs of large-scale workflows deployed on such a complex Edge-to-Cloud Continuum is challenging. To achieve this, one needs to perform experiments systematically, to enable their reproducibility, and to allow other researchers to replicate the study and its conclusions on different infrastructures. This boils down to the tedious process of reconciling the numerous experimental requirements and constraints with low-level infrastructure design choices. To address the limitations of the main state-of-the-art approaches for distributed, collaborative experimentation, such as Google Colab, Kaggle, and Code Ocean, we propose KheOps, a collaborative environment specifically designed to enable cost-effective reproducibility and replicability of Edge-to-Cloud experiments. KheOps is composed of three core elements: (1) an experiment repository; (2) a notebook environment; and (3) a multi-platform experiment methodology. We illustrate KheOps with a real-life Edge-to-Cloud application. The evaluations explore the point of view of the authors of an experiment described in an article (who aim to make their experiments reproducible) and the perspective of their readers (who aim to replicate the experiment). The results show how KheOps helps authors systematically perform repeatable and reproducible experiments on the Grid5000 + FIT IoT LAB testbeds. Furthermore, KheOps helps readers cost-effectively replicate the authors' experiments on different infrastructures, such as the Chameleon Cloud + CHI@Edge testbeds, and obtain the same conclusions with high accuracy (> 88% for all performance metrics).

    ProvLight: Efficient Workflow Provenance Capture on the Edge-to-Cloud Continuum

    Modern scientific workflows require hybrid infrastructures combining numerous decentralized resources on the IoT/Edge interconnected with Cloud/HPC systems (aka the Computing Continuum) to enable their optimized execution. Understanding and optimizing the performance of such complex Edge-to-Cloud workflows is challenging. Capturing the provenance of key performance indicators, with their related data and processes, may assist in understanding and optimizing workflow executions. However, the capture overhead can be prohibitive, particularly on resource-constrained devices such as those on the IoT/Edge. To address this challenge, based on a performance analysis of existing systems, we propose ProvLight, a tool enabling efficient provenance capture on the IoT/Edge. We leverage simplified data models, data compression and grouping, and lightweight transmission protocols to reduce overheads. We further integrate ProvLight into the E2Clab framework to enable workflow provenance capture across the Edge-to-Cloud Continuum. This integration makes E2Clab a promising platform for the performance optimization of applications through reproducible experiments. We validate ProvLight at large scale with synthetic workloads on 64 real-life IoT/Edge devices in the FIT IoT LAB testbed. Evaluations show that ProvLight outperforms state-of-the-art systems such as ProvLake and DfAnalyzer on resource-constrained devices: ProvLight is 26-37x faster at capturing and transmitting provenance data; uses 5-7x less CPU and 2x less memory; transmits 2x less data; and consumes 2-2.5x less energy. ProvLight and E2Clab are available as open-source tools.
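The overhead-reduction ideas listed above (a simplified data model, record grouping, and compression before transmission) can be illustrated generically. The record fields, batch size, and zlib-over-JSON encoding below are invented for illustration and are not ProvLight's actual data model or wire protocol:

```python
# Generic sketch of provenance grouping + compression, in the spirit of the
# techniques the abstract describes. Record schema and batch size are
# invented; ProvLight's real data model and transmission protocol differ.
import json
import zlib

def capture(task_id: str, inputs: list, outputs: list) -> dict:
    """A simplified provenance record: a task id plus its data dependencies."""
    return {"t": task_id, "i": inputs, "o": outputs}

def pack(records: list, batch_size: int = 64) -> list:
    """Group records into batches and compress each batch for transmission."""
    batches = []
    for k in range(0, len(records), batch_size):
        payload = json.dumps(records[k:k + batch_size]).encode()
        batches.append(zlib.compress(payload))
    return batches

# On a constrained IoT/Edge device, batching amortizes per-message overhead
# and compression shrinks the highly repetitive record structure.
records = [capture(f"task-{n}", [f"in-{n}"], [f"out-{n}"]) for n in range(200)]
batches = pack(records)
raw = sum(len(json.dumps(r).encode()) for r in records)
sent = sum(len(b) for b in batches)
print(len(batches), sent < raw)  # fewer messages, fewer bytes on the wire
```

The capture-side savings reported above (fewer transmissions, less CPU and energy) come from exactly this kind of trade: a little local buffering and compression work in exchange for far less network traffic from the edge device.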