An Efficient Monte Carlo-based Probabilistic Time-Dependent Routing Calculation Targeting a Server-Side Car Navigation System
Incorporating speed probability distributions into the route-planning computation of car navigation systems yields more accurate and precise responses. In this paper, we propose a novel approach for dynamically selecting the number of samples used in the Monte Carlo simulation that solves the Probabilistic Time-Dependent Routing (PTDR) problem, thus improving computation efficiency. The proposed method proactively determines the number of simulations needed to extract the travel-time estimate for each specific request while respecting an error threshold on the output quality. The methodology requires little effort on the application-development side: we adopted an aspect-oriented programming language (LARA) to instrument the code and a flexible dynamic autotuning library (mARGOt) to make tuning decisions on the number of samples, improving execution efficiency. Experimental results demonstrate that the proposed adaptive approach saves a large fraction of simulations (between 36% and 81%) with respect to a static approach across different traffic situations, paths, and error requirements. Given the negligible runtime overhead of the proposed approach, this translates into an execution-time speedup between 1.5x and 5.1x. The speedup is reflected at the infrastructure level as a reduction of around 36% in the computing resources needed to support the whole navigation pipeline.
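A minimal sketch of the underlying idea, assuming per-segment speed distributions and not reproducing the paper's actual PTDR kernel or the mARGOt/LARA interfaces: the number of Monte Carlo samples is grown until the estimated relative error of the travel-time mean falls below the requested threshold.

```python
import numpy as np

def estimate_travel_time(segment_speed_samplers, lengths_km,
                         error_threshold=0.02, batch=500,
                         max_samples=100_000, z=1.96):
    """Monte Carlo travel-time estimate with an adaptive sample count.

    segment_speed_samplers: one callable per road segment, returning n
    random speeds in km/h (hypothetical stand-ins for the per-segment
    speed probability distributions used in PTDR). Sampling stops once
    the half-width of the ~95% confidence interval of the mean, relative
    to the mean itself, drops below error_threshold.
    """
    samples = []
    while len(samples) < max_samples:
        # Draw one batch of end-to-end travel times (hours).
        speeds = np.stack([draw(batch) for draw in segment_speed_samplers])
        samples.extend((np.asarray(lengths_km)[:, None] / speeds).sum(axis=0))
        mean = np.mean(samples)
        rel_err = z * np.std(samples, ddof=1) / np.sqrt(len(samples)) / mean
        if rel_err < error_threshold:
            break
    return mean, len(samples)

# Hypothetical two-segment route with log-normal speed distributions.
rng = np.random.default_rng(0)
samplers = [lambda n: rng.lognormal(np.log(50), 0.3, n),
            lambda n: rng.lognormal(np.log(90), 0.2, n)]
eta, n_used = estimate_travel_time(samplers, lengths_km=[5.0, 20.0])
print(f"estimated travel time: {eta:.3f} h using {n_used} samples")
```

With a tight error threshold the sample count grows; with a loose one it stops early, which is the trade-off the adaptive approach exploits per request.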
Heavy Rainfall Identification within the Framework of the LEXIS Project: The Italian Case Study
The LEXIS (Large-scale EXecution for Industry and Society) H2020 project is currently developing an advanced Big Data analysis system that takes advantage of interacting large-scale, geographically distributed HPC infrastructure and cloud services. More specifically, the LEXIS Weather and Climate Large-Scale Pilot workflows ingest data coming from different sources, such as global/regional weather models, conventional and unconventional meteorological observations, application models, and socio-economic impact models, in order to provide enhanced meteorological information at the European scale. Within this pilot, CIMA Research Foundation runs a 7.5 km resolution WRF (Weather Research and Forecasting) model with European coverage, radar assimilation over the Italian area, and daily updates with a 48-hour forecast. The WRF data are then processed by ITHACA ERDS (Extreme Rainfall Detection System - http://erds.ithacaweb.org), an early warning system for the monitoring and forecasting of heavy rainfall events. The WRF model provides more detailed information than GFS (Global Forecast System) data, the most widely used source of rainfall forecasts, which is also implemented in ERDS. The entire WRF-ERDS workflow was applied to two of the most severe heavy rainfall events that affected Italy in 2020.

The first case study concerns an intense rainfall event that affected Toscana during the afternoon and evening of 4th June 2020. In this case, the Italian Civil Protection issued an orange alert for thunderstorms, on a scale from yellow (low) to orange (medium) to red (high). In several locations in the northern part of the region, more than 100 mm of rainfall were recorded in 3 hours, corresponding to an estimated return period equal to or greater than 200 years; over the 24-hour interval, instead, the estimated return period decreases to 10-50 years. Despite a slight underestimation, the WRF model properly forecast the spatial distribution of the rainfall pattern. In addition, thanks to the WRF data, precise information about the locations that would be affected was available in the early morning, several hours before the event reached these areas.

The second case study concerns the heavy rainfall event that affected Palermo (Southern Italy) during the afternoon of 15th July 2020. According to SIAS (Servizio Informativo Agrometeorologico Siciliano), more than 130 mm of rain fell in about 2.5 hours, producing widespread damage due to urban flooding. The event was not properly forecast by the meteorological models operational at the time, and the Italian Civil Protection did not issue an alert for that area (including Palermo); during that day, in fact, only a yellow alert for thunderstorms was issued for north-central and western Sicily. Within LEXIS, no alert was issued using GFS data because of a severe underestimation of the forecast rainfall amount. Conversely, a WRF modelling experiment (three nested domains with 22.5, 7.5, and 2.5 km grid spacing, the innermost over Italy) was executed, assimilating the national radar reflectivity mosaic and in-situ weather stations from the Italian Civil Protection Department; it predicted a peak rainfall depth of about 35 mm in 1 hour and 55 mm in 3 hours, roughly 30 km from the actually affected area, values that would support at least a yellow alert over the Palermo area.
The obtained results highlight how improved rainfall forecasts, made available thanks to the use of HPC resources, significantly increase the capabilities of an operational early warning system for extreme rainfall detection. Global-scale, low-resolution rainfall forecasts such as GFS are in fact widely known to be good sources of information for identifying large-scale precipitation patterns, but they lack precision for local-scale applications.
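Purely as an illustration of how a threshold-based alert check on forecast rainfall depths might look (the thresholds below are hypothetical and are not the ones used by ERDS or the Italian Civil Protection):

```python
# Hypothetical alert classification for accumulated rainfall depth (mm)
# over a given time window; the thresholds are illustrative only.
ALERT_THRESHOLDS_MM = {"yellow": 30.0, "orange": 60.0, "red": 100.0}

def classify_alert(rainfall_depth_mm: float) -> str:
    """Return the highest alert level whose threshold is exceeded."""
    level = "none"
    for name, threshold in sorted(ALERT_THRESHOLDS_MM.items(), key=lambda kv: kv[1]):
        if rainfall_depth_mm >= threshold:
            level = name
    return level

# E.g. the ~55 mm / 3 h peak predicted by the WRF experiment over Palermo
# would already exceed a yellow-level threshold in this toy scheme.
print(classify_alert(55.0))   # -> "yellow"
```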
A Portable Drug Discovery Platform for Urgent Computing
Drug discovery is a long and costly process. Recent studies demonstrated how the introduction of an in-silico stage, named virtual screening, that suggests which molecules to test in-vitro increases the probability of drug discovery success. In the context of urgent computing, where it is important to find a therapeutic solution in a short time frame, the number of candidates that we can virtually screen is limited only by the available computational power. In this paper, we focus on LiGen, the virtual screening application of the EXSCALATE platform. In particular, we address two challenges of performing an extreme-scale virtual screening campaign on a modern HPC system. The first is posed by hardware heterogeneity, where GPUs from different vendors account for a large fraction of the systems' performance. The second concerns the operational difficulties of running the campaign, since it requires significant effort and technical skills that are not common among domain experts. We show how hinging on SYCL and the LEXIS Platform is the solution that the EXSCALATE platform uses to address these challenges.
Event-Related Potential Effects of Object Recognition depend on Attention and Part-Whole Configuration
The effects of spatial attention and part-whole configuration on the recognition of repeated objects were investigated with behavioral and event-related potential (ERP) measures. Short-term repetition effects were measured for probe objects as a function of whether a preceding prime object was shown as an intact image or coarsely scrambled (split into two halves) and whether or not it had been attended during the prime display. In line with previous behavioral experiments, priming effects were observed from both intact and split primes for attended objects, but only from intact (repeated same-view) objects when they were unattended. These behavioral results were reflected in ERP waveforms at occipito-temporal locations as more negative-going deflections for repeated items in the time window between 220 and 300 ms after probe onset (N250r). Attended intact images showed generally stronger repetition effects than split ones. Unattended images showed repetition effects only when presented in an intact configuration, and this finding was limited to right-hemisphere electrodes. Repetition effects in earlier (before 200 ms) time windows were limited to attended conditions at occipito-temporal sites during the N1, a component linked to the encoding of object structure, while repetition effects at central locations during the same time window (P150) were found for attended and unattended probes, but only when objects were repeated in the same intact configuration. The data indicate that view generalization is mediated by a combination of analytic (part-based) representations and automatic view-dependent representations.
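The repetition effects described above are window measures over the averaged waveforms; a minimal numpy sketch, with entirely hypothetical data, of how a mean-amplitude difference in the 220-300 ms (N250r) window could be computed:

```python
import numpy as np

def window_mean_amplitude(erp, times, t_start, t_end):
    """Mean amplitude of averaged waveforms (channels x samples) in [t_start, t_end) seconds."""
    mask = (times >= t_start) & (times < t_end)
    return erp[:, mask].mean(axis=1)

# Hypothetical averaged waveforms for repeated vs. new probes at two
# occipito-temporal channels, sampled at 500 Hz from -100 to 500 ms.
times = np.arange(-0.1, 0.5, 0.002)
repeated = np.random.randn(2, times.size)   # stand-in data
new = np.random.randn(2, times.size)

# N250r-style repetition effect: repeated minus new, 220-300 ms window.
n250r_effect = (window_mean_amplitude(repeated, times, 0.220, 0.300)
                - window_mean_amplitude(new, times, 0.220, 0.300))
print(n250r_effect)  # more negative values indicate a stronger repetition effect
```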
Application of the New Mapping Method to Complex Three Coupled Maccari’s System Possessing M-Fractional Derivative
In this academic investigation, an innovative mapping approach is applied to the complex three coupled Maccari's system to unveil novel soliton solutions. This is achieved through the use of the M-truncated fractional derivative together with the new mapping method and a computer algebra system (CAS) such as Maple. The derived solutions take the form of hyperbolic and trigonometric functions. Our study elucidates a variety of soliton solutions, such as periodic, singular, dark, kink, bright, and dark-bright soliton solutions. To facilitate comprehension, certain solutions are visually depicted through 2-dimensional, contour, 3-dimensional, and phase plots showing bifurcation characteristics, using Maple software. Furthermore, the incorporation of the M-truncated derivative enables a more extensive exploration of solution patterns. Our study establishes a connection between computer science and soliton physics, emphasizing the pivotal role of soliton phenomena in advancing simulations and computational modeling. Analytical solutions are generated through the application of the new mapping method, and the dynamic nature of the equation is then examined from various perspectives. In essence, understanding the dynamic characteristics of systems is of great importance for predicting outcomes and advancing new technologies. This research contributes to the convergence of theoretical mathematics and applied computer science, emphasizing the crucial role of solitons in scientific disciplines.
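For reference, the truncated M-fractional derivative referred to above is usually defined (following Sousa and de Oliveira; stated here from the standard definition rather than from this paper) as

\[ \mathcal{D}_{M}^{\alpha,\beta} f(t) = \lim_{\varepsilon \to 0} \frac{f\bigl(t\,{}_{i}E_{\beta}(\varepsilon t^{-\alpha})\bigr) - f(t)}{\varepsilon}, \qquad 0 < \alpha < 1,\ \beta > 0, \]

where \({}_{i}E_{\beta}(z) = \sum_{k=0}^{i} z^{k}/\Gamma(\beta k + 1)\) is the truncated Mittag-Leffler function of one parameter; for \(\alpha \to 1\) it recovers the ordinary derivative up to a constant factor.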
Modeling and simulations for the mitigation of atmospheric carbon dioxide through forest management programs
The growing global population causes more anthropogenic carbon dioxide (CO2) emissions and raises the need for forest products, which in turn causes deforestation and elevated CO2 levels. A rise in the concentration of carbon dioxide in the atmosphere is the major reason for global warming, and carbon dioxide concentrations must be reduced soon to mitigate climate change. Forest management programs provide a way to manage atmospheric CO2 levels. For this purpose, we considered a nonlinear fractional model to analyze the impact of forest management policies on mitigating the atmospheric CO2 concentration. In this investigation, the fractional differential equations were solved by utilizing the Atangana-Baleanu-Caputo derivative operator, which captures memory effects and shows resilience and efficiency in capturing the system dynamics with modest processing power. The model consists of four compartments: the concentration of carbon dioxide, the human population, the forest biomass, and the forest management programs at any time. The existence and uniqueness of the solution for the fractional model are shown, and physical properties of the solution, non-negativity and boundedness, are also proven. The equilibrium points of the model were computed and further analyzed for local and global asymptotic stability. For the numerical solution of the suggested model, the Atangana-Toufik numerical scheme was employed. The acquired results validate the analytical results and show the significance of the arbitrary fractional order. The effects of deforestation activities and forest management strategies on the dynamics of atmospheric carbon dioxide and forest biomass were also analyzed under the suggested technique. The illustrated results show that the CO2 concentration can be minimized if deforestation activities are controlled and proper forest management policies are developed and implemented. Furthermore, it is determined that switching to low-carbon energy sources and developing and implementing more effective mitigation measures will further contribute to the mitigation of atmospheric CO2.
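For reference, the Atangana-Baleanu derivative in the Caputo sense used above is commonly defined (standard definition, not taken from this paper), for order \(\alpha \in (0,1)\), as

\[ {}^{ABC}_{\;\;\;a}D_{t}^{\alpha} f(t) = \frac{B(\alpha)}{1-\alpha} \int_{a}^{t} f'(\tau)\, E_{\alpha}\!\left(-\frac{\alpha\,(t-\tau)^{\alpha}}{1-\alpha}\right) d\tau, \]

where \(E_{\alpha}\) is the Mittag-Leffler function and \(B(\alpha)\) is a normalization function with \(B(0)=B(1)=1\); the nonlocal Mittag-Leffler kernel is what provides the memory effect mentioned above.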
Pegasus: Performance Engineering for Software Applications Targeting HPC Systems
Developing and optimizing software applications for high performance and energy efficiency is a very challenging task, even when considering a single target machine. For instance, optimizing for multicore-based computing systems requires in-depth knowledge about programming languages, application programming interfaces, compilers, performance tuning tools, and computer architecture and organization. Many tasks of performance engineering methodologies require manual effort and the use of different tools that are not always part of an integrated toolchain. This paper presents Pegasus, a performance engineering approach supported by a framework that consists of a source-to-source compiler, controlled and guided by strategies programmed in a Domain-Specific Language, and an autotuner. Pegasus is a holistic and versatile approach spanning the various decision layers that compose the software stack, and it exploits the system capabilities and workloads effectively through runtime autotuning. The Pegasus approach helps developers by automating tasks related to the efficient implementation of software applications on multicore computing systems. These tasks focus on application analysis, profiling, code transformations, and the integration of runtime autotuning. Pegasus allows developers to program their own strategies or to automatically apply existing strategies to software applications in order to ensure compliance with non-functional requirements, such as performance and energy efficiency. We show how to apply Pegasus and demonstrate its applicability and effectiveness in a complex case study, which includes tasks from a smart navigation system.
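As a very rough sketch of the runtime-autotuning idea only (this is not the mARGOt library or a LARA strategy, and the knob, kernel, and latency goal below are hypothetical): the autotuner measures the application under different knob settings and keeps the most aggressive setting that still meets a non-functional requirement.

```python
import time

def autotune(kernel, knob_values, max_latency_s):
    """Keep the largest knob value whose measured run time meets the latency goal.

    kernel: callable taking one knob value (e.g. a sample count or quality level).
    A toy measurement-based selection loop, not the mARGOt interface.
    """
    best = None
    for value in knob_values:          # assumed sorted from cheapest to costliest
        start = time.perf_counter()
        kernel(value)
        elapsed = time.perf_counter() - start
        if elapsed <= max_latency_s:
            best = value               # still within budget, remember it
    return best

# Hypothetical kernel whose cost grows with the knob value.
def kernel(n):
    s = 0.0
    for i in range(n):
        s += i * 0.5
    return s

print(autotune(kernel, knob_values=[10_000, 100_000, 1_000_000], max_latency_s=0.05))
```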
Addressing docking pose selection with structure-based deep learning: Recent advances, challenges and opportunities
Molecular docking is a widely used technique in drug discovery to predict the binding mode of a given ligand to its target. However, the identification of the near-native binding pose in docking experiments still represents a challenging task: the scoring functions currently employed by docking programs are parametrized to predict binding affinity and therefore often fail to correctly identify the ligand's native binding conformation. Selecting the correct binding mode is crucial to obtaining meaningful results and to conveniently optimizing new hit compounds. Deep learning (DL) algorithms have been an area of growing interest in this regard for their capability to extract the relevant information directly from the protein-ligand structure. Our review presents the recent advances in the development of DL-based pose-selection approaches, discussing their limitations and possible future directions. Moreover, a comparison between the performance of some classical scoring functions and DL-based methods concerning their ability to select the correct binding mode is reported. In this regard, two novel DL-based pose selectors developed by us are presented.
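To make the pose-selection task concrete: a docked pose is typically counted as near-native when its RMSD to the experimental ligand conformation is below about 2 Å, and a selector is judged by whether its top-ranked pose meets that cutoff. A minimal numpy sketch with hypothetical coordinates and scores (not one of the reviewed DL models):

```python
import numpy as np

def rmsd(coords_a, coords_b):
    """RMSD between two (n_atoms, 3) coordinate arrays with a fixed atom
    correspondence (no alignment or symmetry handling)."""
    return np.sqrt(np.mean(np.sum((coords_a - coords_b) ** 2, axis=1)))

def top1_success(pose_coords, pose_scores, native_coords, cutoff=2.0):
    """True if the best-scored pose is within `cutoff` angstrom RMSD of the native pose."""
    best_pose = pose_coords[int(np.argmax(pose_scores))]
    return rmsd(best_pose, native_coords) <= cutoff

# Hypothetical example: 10 docked poses of a 20-atom ligand with selector scores.
rng = np.random.default_rng(1)
native = rng.normal(size=(20, 3))
poses = native + rng.normal(scale=1.5, size=(10, 20, 3))   # perturbed poses
scores = rng.random(10)                                     # stand-in selector scores
print(top1_success(poses, scores, native))
```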
New genetic loci link adipose and insulin biology to body fat distribution.
Body fat distribution is a heritable trait and a well-established predictor of adverse metabolic outcomes, independent of overall adiposity. To increase our understanding of the genetic basis of body fat distribution and its molecular links to cardiometabolic traits, here we conduct genome-wide association meta-analyses of traits related to waist and hip circumferences in up to 224,459 individuals. We identify 49 loci (33 new) associated with waist-to-hip ratio adjusted for body mass index (BMI), and an additional 19 loci newly associated with related waist and hip circumference measures (P < 5 × 10⁻⁸). In total, 20 of the 49 waist-to-hip ratio adjusted for BMI loci show significant sexual dimorphism, 19 of which display a stronger effect in women. The identified loci were enriched for genes expressed in adipose tissue and for putative regulatory elements in adipocytes. Pathway analyses implicated adipogenesis, angiogenesis, transcriptional regulation and insulin resistance as processes affecting fat distribution, providing insight into potential pathophysiological mechanisms
