
    Enhanced matching engine for improving the performance of semantic web service discovery

    Web services are the means of realizing the Service-Oriented Architecture (SOA) paradigm. One of the key tasks related to Web services is discovery, also known as matchmaking: the act of locating suitable Web services to fulfill a specific goal. Adding semantic descriptions to Web services is the key to enabling an automated, intelligent discovery process. Current Semantic Web service discovery approaches are primarily classified into logic-based, non-logic-based, and hybrid categories. An important challenge yet to be addressed by current approaches is exploiting the available constructs in Web service descriptions to achieve better matchmaking performance, where performance is defined in terms of precision and recall, the well-known metrics of the information retrieval field. Moreover, when matchmaking a large number of Web services, maintaining a reasonable execution time becomes a crucial challenge. To address these challenges, this research proposes a matching engine comprising a new logic-based matchmaker and a new non-logic-based matchmaker to improve the performance of Semantic Web service discovery. The two matchmakers are also combined into a hybrid matchmaker for further performance improvement, and a pre-matching filter is used in the matching engine to reduce the execution time of matchmaking. The components of the matching engine were developed as prototypes and evaluated by benchmarking the results against data from a standard repository of Web services. Comparative evaluations of performance and execution time showed the superiority of the proposed matching engine over existing, prominent matchmakers, demonstrating that it improves both the performance and the execution time of Semantic Web service discovery.
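
    The abstract does not include implementation details; the following is a minimal, hypothetical sketch of how a hybrid matchmaker of this kind might combine a logic-based degree of match with a non-logic-based text-similarity score, with a cheap pre-matching filter applied first. All names, weights, and the toy ontology are illustrative assumptions, not the authors' actual design.

```python
from difflib import SequenceMatcher

# Illustrative degrees of match used by many logic-based matchmakers
# (exact > plugin > subsumes > fail); not necessarily the paper's scheme.
LOGIC_SCORES = {"exact": 1.0, "plugin": 0.8, "subsumes": 0.6, "fail": 0.0}

def logic_match(request_concept, offer_concept, ontology):
    """Toy subsumption check over a parent map {child: parent}."""
    if request_concept == offer_concept:
        return "exact"
    c = offer_concept                      # offer more specific than request?
    while c in ontology:
        c = ontology[c]
        if c == request_concept:
            return "plugin"
    c = request_concept                    # request more specific than offer?
    while c in ontology:
        c = ontology[c]
        if c == offer_concept:
            return "subsumes"
    return "fail"

def text_similarity(request_text, offer_text):
    """Non-logic-based score: plain string similarity as a stand-in for an
    IR-style measure over the service descriptions."""
    return SequenceMatcher(None, request_text.lower(), offer_text.lower()).ratio()

def prefilter(offers, keyword):
    """Pre-matching filter: drop offers that cannot match, to cut execution time."""
    return [o for o in offers if keyword in o["description"].lower()]

def hybrid_score(request, offer, ontology, w_logic=0.6, w_text=0.4):
    """Hybrid matchmaker: weighted sum of the logic-based and non-logic-based scores."""
    logic = LOGIC_SCORES[logic_match(request["concept"], offer["concept"], ontology)]
    text = text_similarity(request["description"], offer["description"])
    return w_logic * logic + w_text * text

# Example with a toy parent-map ontology
ontology = {"SUV": "Car", "Car": "Vehicle"}
request = {"concept": "Vehicle", "description": "book a car rental service"}
offer = {"concept": "SUV", "description": "SUV rental booking service"}
print(hybrid_score(request, offer, ontology))   # plugin match blended with text similarity
```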

    Increasing Coverage in Wireless Sensor Networks by Minimizing Displacements Using a Greedy Method based on Nodes' Location and Neighborhood

    The successful operation of a wireless sensor network depends on proper coverage of the environment, which in turn is affected by the number and location of sensors. In most cases, the sensors are placed randomly in the deployment region, so the initial deployment typically does not achieve maximum coverage. One of the major challenges in network design is therefore to determine a placement strategy so that the deployed nodes cover as much of the region as possible. The objective of this study is to solve this problem while keeping the energy consumption of the nodes minimal, since the power supply of a sensor node is a non-rechargeable battery. The proposed approach divides the deployment region and detects uncovered sub-regions, then applies a greedy method based on the topology and properties of the nodes and the deployment region to select the best nodes to relocate and cover those sub-regions. The proposed approach is simulated, and the evaluation results show a decrease in the displacement of sensors needed to increase coverage and a reduction in energy consumption compared with similar works.
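
    As a rough illustration only (the paper's actual division and selection rules are not given in the abstract), the sketch below divides the region into grid cells, detects uncovered cells, and greedily relocates the sensor whose displacement to an uncovered cell is smallest, since shorter moves cost less energy. All function names and parameters are assumptions.

```python
import math

def uncovered_cells(cells, sensors, sensing_range):
    """Cells of the divided deployment region not covered by any sensor."""
    return [c for c in cells
            if all(math.dist(c, s) > sensing_range for s in sensors)]

def greedy_cover(cells, sensors, sensing_range, max_moves=20):
    """Greedily relocate the sensor whose displacement to an uncovered cell is
    smallest; shorter moves spend less of the non-rechargeable battery."""
    sensors = list(sensors)
    moves = []
    for _ in range(max_moves):
        holes = uncovered_cells(cells, sensors, sensing_range)
        if not holes:
            break
        i, hole = min(((i, h) for i in range(len(sensors)) for h in holes),
                      key=lambda ih: math.dist(sensors[ih[0]], ih[1]))
        moves.append((i, hole))
        sensors[i] = hole
    return moves, sensors

# Example: detect holes in a 4x4 grid of cell centres and greedily relocate three sensors
cells = [(x + 0.5, y + 0.5) for x in range(4) for y in range(4)]
moves, final_positions = greedy_cover(cells, [(0.5, 0.5), (1.5, 2.5), (3.0, 3.0)],
                                      sensing_range=1.0)
print(len(moves), "relocations within the move budget")
```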

    Improving Software Effort Estimation through a Hybrid Approach of Metaheuristic Algorithms in Analogy-based Method

    Project management in software development is one of the most crucial activities, as it encompasses the entire software development process from start to finish. Estimating the effort required for software projects is a significant challenge in project management, and such estimates are necessary for efficient and effective management of these projects. Analogy-based estimation of software effort compares new projects to completed ones. However, this method can be ineffective because features differ in importance and exhibit dependencies; to address this, weights are assigned to features using optimization techniques such as metaheuristic algorithms. Yet these algorithms may get stuck in local optima, yielding suboptimal results. This study proposes an approach to software effort estimation that seeks globally optimal feature weights by combining the particle swarm optimization and genetic metaheuristic algorithms. The hybrid approach leverages both particle motion and recombination to generate solutions, increasing the likelihood of finding the global optimum and overcoming local-optima issues. The algorithm computes feature weights that are then used for project estimation with the analogy-based method. The proposed approach was tested and assessed on two datasets, Maxwell and Desharnais, and the experimental results indicated improvements in the evaluation criteria, including MMRE, MdMRE, and PRED, compared with similar research works.
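
    The sketch below is a hedged illustration of the general idea, assuming a weighted Euclidean similarity for analogy-based estimation, leave-one-out MMRE as the fitness function, and a PSO loop with a GA-style crossover step; the paper's actual operators, parameters, and fitness function may differ.

```python
import math
import random

def weighted_distance(a, b, w):
    """Weighted Euclidean distance between two feature vectors."""
    return math.sqrt(sum(wi * (ai - bi) ** 2 for wi, ai, bi in zip(w, a, b)))

def analogy_estimate(project, history, w, k=1):
    """Estimate effort as the mean effort of the k most similar past projects."""
    nearest = sorted(history, key=lambda h: weighted_distance(project, h["features"], w))[:k]
    return sum(h["effort"] for h in nearest) / k

def mmre(w, history, k=1):
    """Fitness: mean magnitude of relative error under leave-one-out."""
    errs = []
    for i, p in enumerate(history):
        rest = history[:i] + history[i + 1:]
        est = analogy_estimate(p["features"], rest, w, k)
        errs.append(abs(est - p["effort"]) / p["effort"])
    return sum(errs) / len(errs)

def hybrid_pso_ga(history, dim, n_particles=20, iters=50):
    """PSO over feature weights with a GA-style crossover on the personal bests,
    intended to help particles escape local optima."""
    particles = [[random.random() for _ in range(dim)] for _ in range(n_particles)]
    velocity = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in particles]
    gbest = min(pbest, key=lambda w: mmre(w, history))
    for _ in range(iters):
        for i, p in enumerate(particles):
            for d in range(dim):
                velocity[i][d] = (0.7 * velocity[i][d]
                                  + 1.5 * random.random() * (pbest[i][d] - p[d])
                                  + 1.5 * random.random() * (gbest[d] - p[d]))
                p[d] = min(1.0, max(0.0, p[d] + velocity[i][d]))
            if mmre(p, history) < mmre(pbest[i], history):
                pbest[i] = p[:]
        # GA element: uniform crossover between two random personal bests
        a, b = random.sample(pbest, 2)
        child = [a[d] if random.random() < 0.5 else b[d] for d in range(dim)]
        gbest = min(pbest + [gbest, child], key=lambda w: mmre(w, history))
    return gbest
```

    With `history` given as a list of `{"features": [...], "effort": ...}` records, `hybrid_pso_ga(history, dim=len(history[0]["features"]))` returns the weight vector that `analogy_estimate` then uses for new projects.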

    An Approach to Improve the Live Migration Using Asynchronized Cache and Prioritized IP Packets

    Live migration of a virtual machine is a method of moving virtual machines across hosts within a virtualized data center. Two main parameters should be considered when evaluating live migration: the total duration of the migration and its downtime. This paper focuses on optimizing live migration in a Xen environment where memory pages are dirtied rapidly. An approach is proposed that manages dirty pages in a cache during migration and prioritizes the packets at the network level. According to the evaluations, when the system is under a heavy workload or running a stress tool, the virtual machines write to memory intensively; in these conditions the proposed approach outperforms the default method in terms of the number of transferred pages, total migration time, and downtime. Experimental results showed that, as the workload increased, the proposed approach reduced the number of sent pages by 47.4%, the total migration time by 10%, and the downtime by 27.7% during live migration.
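
    To make the mechanism concrete, here is a simplified, hypothetical model of pre-copy migration in which pages dirtied more often than a threshold are held back in a cache and sent only during the final stop-and-copy phase, a rough stand-in for the asynchronous-cache idea; the packet-prioritization part is not modeled, and all names, thresholds, and numbers are illustrative.

```python
import random

def precopy_migrate(pages, dirtied_each_round, max_rounds=5, dirty_threshold=3):
    """Simplified pre-copy migration: pages dirtied more than `dirty_threshold`
    times are cached and sent only in the final stop-and-copy phase instead of
    being retransmitted in every round. `dirtied_each_round(r)` returns the set
    of pages the guest dirtied during round r."""
    dirty_count = {p: 0 for p in pages}
    to_send = set(pages)                          # round 0: transfer all pages
    sent = 0
    for round_no in range(max_rounds):
        cached = {p for p in to_send if dirty_count.get(p, 0) > dirty_threshold}
        sent += len(to_send - cached)             # transmit only the non-hot pages
        dirtied = set(dirtied_each_round(round_no))
        for p in dirtied:
            dirty_count[p] = dirty_count.get(p, 0) + 1
        to_send = dirtied | cached                # retransmit pages dirtied meanwhile
        if len(to_send) < 10:                     # remaining set is small: stop and copy
            break
    downtime_pages = len(to_send)                 # sent while the VM is paused
    return sent + downtime_pages, downtime_pages

# Toy workload: 1,000 pages with a hot working set of 50 pages dirtied every round
pages = list(range(1000))
hot = set(random.sample(pages, 50))
total_sent, downtime_pages = precopy_migrate(pages, lambda r: hot)
print(total_sent, "pages sent in total,", downtime_pages, "during downtime")
```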

    Extracorporeal Membrane Oxygenation for Severe Acute Respiratory Distress Syndrome associated with COVID-19: An Emulated Target Trial Analysis.

    RATIONALE: Whether COVID-19 patients may benefit from extracorporeal membrane oxygenation (ECMO) compared with conventional invasive mechanical ventilation (IMV) remains unknown. OBJECTIVES: To estimate the effect of ECMO on 90-day mortality versus IMV only. METHODS: Among 4,244 critically ill adult patients with COVID-19 included in a multicenter cohort study, we emulated a target trial comparing the treatment strategies of initiating ECMO vs. no ECMO within 7 days of IMV in patients with severe acute respiratory distress syndrome (PaO2/FiO2 <80 or PaCO2 ≥60 mmHg). We controlled for confounding using a multivariable Cox model based on predefined variables. MAIN RESULTS: 1,235 patients met the full eligibility criteria for the emulated trial, among whom 164 initiated ECMO. The ECMO strategy had a higher survival probability at day 7 from the onset of eligibility (87% vs 83%; risk difference: 4%, 95% CI 0 to 9%), which decreased during follow-up (survival at day 90: 63% vs 65%; risk difference: -2%, 95% CI -10 to 5%). However, ECMO was associated with higher survival when performed in high-volume ECMO centers or in regions where a specific ECMO network organization was set up to handle high demand, and when initiated within the first 4 days of mechanical ventilation and in profoundly hypoxemic patients. CONCLUSIONS: In an emulated trial based on a nationwide COVID-19 cohort, we found differential survival over time with an ECMO strategy compared with a no-ECMO strategy. However, ECMO was consistently associated with better outcomes when performed in high-volume centers and in regions with ECMO capacities specifically organized to handle high demand. This article is open access and distributed under the terms of the Creative Commons Attribution Non-Commercial No Derivatives License 4.0 (http://creativecommons.org/licenses/by-nc-nd/4.0/).
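
    The adjustment method (a multivariable Cox proportional-hazards model) is standard; a minimal sketch of such a fit using the lifelines library is shown below on synthetic placeholder data. The variable names, covariates, and effect sizes are assumptions, not the study's predefined confounders or results.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Synthetic stand-in cohort (placeholder variables, not the study's data)
rng = np.random.default_rng(0)
n = 200
ecmo = rng.integers(0, 2, n)                      # 1 = ECMO strategy, 0 = no ECMO
age = rng.normal(60, 10, n)
pf_ratio = rng.normal(70, 10, n)                  # PaO2/FiO2 at eligibility
hazard = np.exp(0.03 * (age - 60) - 0.3 * ecmo)   # arbitrary effect sizes
time = rng.exponential(60 / hazard)               # follow-up time in days
df = pd.DataFrame({
    "time_days": np.minimum(time, 90),            # administrative censoring at day 90
    "died": (time <= 90).astype(int),
    "ecmo": ecmo,
    "age": age,
    "pf_ratio": pf_ratio,
})

# Multivariable Cox proportional-hazards model adjusting for the listed covariates
cph = CoxPHFitter()
cph.fit(df, duration_col="time_days", event_col="died")
print(cph.summary[["coef", "exp(coef)", "p"]])    # hazard ratios, incl. ECMO vs no ECMO
```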

    An Approach to Reduce Energy Consumption in Cloud data centers using Harmony Search Algorithm

    The rapid development of knowledge and communication technologies has established a new computational style known as cloud computing. One of the main concerns of cloud infrastructure providers is to minimize costs and maximize profitability, and energy management in cloud data centers is essential to achieving this goal. Energy consumption can be reduced either by releasing idle nodes or by reducing virtual machine migrations; for the latter, a key challenge is choosing the placement of migrated virtual machines on appropriate nodes. In this paper, an approach to reducing energy consumption in cloud data centers is proposed. The approach adapts the harmony search algorithm to migrate virtual machines and performs placement by sorting the nodes and virtual machines in descending order of priority, where priority is calculated from the workload. The proposed approach is simulated, and the evaluation results show a reduction in virtual machine migrations, an increase in efficiency, and a reduction in energy consumption.
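
    As a hedged sketch of the general technique (not the paper's exact algorithm), the code below adapts harmony search to a VM-to-host placement problem with a toy linear power model, considering hosts and virtual machines in descending order of capacity and load as the abstract's priority sort suggests. The energy model, parameters, and improvisation rule are all assumptions.

```python
import random

def placement_energy(placement, vm_load, host_capacity, idle_power=100, peak_power=250):
    """Toy linear power model: an active host draws idle power plus a share of
    (peak - idle) proportional to utilisation; hosts with no VMs are switched off."""
    host_load = {}
    for vm, host in placement.items():
        host_load[host] = host_load.get(host, 0) + vm_load[vm]
    energy = 0.0
    for host, load in host_load.items():
        if load > host_capacity[host]:
            return float("inf")                    # infeasible placement
        energy += idle_power + (peak_power - idle_power) * load / host_capacity[host]
    return energy

def harmony_search_placement(vm_load, host_capacity, memory_size=10, iters=200,
                             hmcr=0.9, par=0.3):
    """Harmony search over VM-to-host assignments; VMs and hosts are considered
    in descending order of load/capacity, echoing the abstract's priority sort."""
    vms = sorted(vm_load, key=vm_load.get, reverse=True)
    hosts = sorted(host_capacity, key=host_capacity.get, reverse=True)
    def cost(s):
        return placement_energy(s, vm_load, host_capacity)
    memory = [{vm: random.choice(hosts) for vm in vms} for _ in range(memory_size)]
    for _ in range(iters):
        new = {}
        for vm in vms:
            if random.random() < hmcr:             # reuse a value from harmony memory
                new[vm] = random.choice(memory)[vm]
                if random.random() < par:           # pitch adjustment: try another host
                    new[vm] = random.choice(hosts)
            else:                                   # random improvisation
                new[vm] = random.choice(hosts)
        worst = max(memory, key=cost)
        if cost(new) < cost(worst):
            memory[memory.index(worst)] = new
    return min(memory, key=cost)

# Example: place four VMs on three identical hosts
vm_load = {"vm1": 2, "vm2": 3, "vm3": 1, "vm4": 4}
host_capacity = {"h1": 8, "h2": 8, "h3": 8}
best = harmony_search_placement(vm_load, host_capacity)
print(best, placement_energy(best, vm_load, host_capacity))
```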

    A glimpse of Semantic Web trust


    Prediction of relevance between requests and Web services using ANN and LR models

    An approach to Web service matching is proposed in this paper. It adopts semantic similarity measuring techniques to calculate the matching level between a pair of service descriptions, expressing their similarity as a numeric value. Determining a threshold for this value is a challenge in all similar matching approaches. To address this challenge, we propose the use of classification methods to predict the relevance of requests and Web services. In recent years, outcome prediction models using Logistic Regression and Artificial Neural Networks have been developed in many research areas. We compare the performance of these methods on the OWLS-TC v3 service library, using classification accuracy as the performance measure. The experimental results show that both methods are effective at predicting new cases; however, the Artificial Neural Network with a sensitivity analysis model outperforms the Logistic Regression method.
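
    A minimal sketch of such a comparison is shown below using scikit-learn's LogisticRegression and MLPClassifier on synthetic placeholder features; the real study derives its similarity features and relevance labels from OWLS-TC v3, and its network architecture and sensitivity analysis are not reproduced here.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Placeholder data: each row is a vector of similarity scores between one
# request and one service (e.g. per-parameter semantic similarities); the
# label indicates whether the pair was judged relevant.
rng = np.random.default_rng(0)
X = rng.random((500, 4))
y = (X.mean(axis=1) + 0.1 * rng.standard_normal(500) > 0.5).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

lr = LogisticRegression().fit(X_tr, y_tr)
ann = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X_tr, y_tr)

# Classification accuracy is the comparison metric used in the abstract
print("LR accuracy: ", accuracy_score(y_te, lr.predict(X_te)))
print("ANN accuracy:", accuracy_score(y_te, ann.predict(X_te)))
```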

    Evaluation and classification of trophic status in coastal waters of Hormozgan province using the scaled (TRIXCS) and unscaled (UNTRIXCS) trophic indices

    One of the most important issues in the environmental management of coastal waters is the evaluation of eutrophication. In this study, the trophic status of the coastal waters of Hormozgan province was evaluated in 2012 using the scaled (TRIXCS) and unscaled (UNTRIXCS) trophic indices. The results showed that TRIXCS values were in the range of 3.6 to 5.9 (meso-eutrophic in area 1, mesotrophic in areas 2 and 3, and oligo-mesotrophic in area 4) and UNTRIXCS values ranged from 3.4 to 5.8 (no risk in area 4 and high risk in the other areas). Based on the TQR TRIX index, the eutrophication status of areas 1, 2, and 3 was good, while that of area 4 was very good. The trophic efficiency coefficient was 4.92±0.03, suggesting that the amount of nutrients consumed by phytoplankton near the shore of Bandar Abbas (in area 1) was higher than in the other areas. Altogether, in line with the comments and suggestions of many scientists, the scaled trophic state index (TRIXCS) can be used for the evaluation of eutrophication in coastal waters after further revision and development.
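
    The abstract does not reproduce the index formula. For orientation only, the sketch below implements the widely cited Vollenweider TRIX formulation (an unscaled log-product and a scaled form using the customary 1.5 and 1.2 constants); the study's coastal-specific TRIXCS/UNTRIXCS variants may use different constants and components, and the example values are arbitrary.

```python
import math

def trix(chl_a, oxygen_dev, din, tp):
    """Commonly cited TRIX formulation (the study's TRIXCS/UNTRIXCS variants
    may differ in constants and components).

    chl_a      : chlorophyll-a (ug/L)
    oxygen_dev : absolute % deviation of dissolved oxygen saturation from 100%
    din        : dissolved inorganic nitrogen (ug/L)
    tp         : total phosphorus (ug/L)
    """
    untrix = math.log10(chl_a * oxygen_dev * din * tp)   # unscaled form
    return (untrix + 1.5) / 1.2                          # scaled to roughly 0-10

# Example with arbitrary values; prints a score around the middle of the 0-10 scale
print(trix(chl_a=2.0, oxygen_dev=15.0, din=80.0, tp=25.0))
```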