Autonomic regulation therapy to enhance myocardial function in heart failure patients: the ANTHEM-HFpEF study.
Background: Approximately half of the patients presenting with new-onset heart failure (HF) have HF with preserved left ventricular ejection fraction (HFpEF) or HF with mid-range left ventricular ejection fraction (HFmrEF). These patients show neurohormonal activation similar to that of HF with reduced ejection fraction; however, beta-blockers and angiotensin-converting enzyme inhibitors have not been shown to improve their outcomes, and current treatment is symptom-based and empiric. Sympathoinhibition using parasympathetic stimulation has been shown to improve central and peripheral aspects of the cardiac nervous system and reflex control, to induce myocyte cardioprotection, and to lead to regression of left ventricular hypertrophy. Beneficial effects of autonomic regulation therapy (ART) using vagus nerve stimulation (VNS) have also been observed in several animal models of HFpEF, suggesting a potential role for ART in patients with this disease. Methods: The Autonomic Neural Regulation Therapy to Enhance Myocardial Function in Patients with Heart Failure and Preserved Ejection Fraction (ANTHEM-HFpEF) study is designed to evaluate the feasibility, tolerability, and safety of ART using right cervical VNS in patients with chronic, stable HFpEF and HFmrEF. Patients with symptomatic HF and HFpEF or HFmrEF who fulfil the enrolment criteria will receive chronic ART with a subcutaneous VNS system attached to the right cervical vagus nerve. Safety parameters will be continuously monitored, and cardiac function and HF symptoms will be assessed every 3 months during a post-titration follow-up period of at least 12 months. Conclusions: The ANTHEM-HFpEF study is expected to provide valuable information that will expand our understanding of the potential role of ART in patients with chronic symptomatic HFpEF and HFmrEF.
A CONCEPTUAL MOBILE GIS SYSTEM FOR GLOBAL EMISSION MONITORING
Information systems play a key role in the new century, and the geographical information system (GIS) is a special class of information system. Computing is now moving from wired to wireless networks, that is, towards a mobile environment (anywhere, anytime, anything computing). In this application a mobile GIS, operating over a network environment, is proposed that can monitor emission levels at the city, country, or global level. The concept is as follows: the analog data of automated weather monitoring devices (AWMD) are converted into digital form, and the respective client device transfers the data to the server using the wireless application protocol (WAP). The hardware and software need to be integrated so that the mobile GIS components, namely atmospheric emission data receivers/analyzers, WAP clients of GPS-enabled systems, and routers, are linked to a GIS with relevant software. In the light of the Kyoto Protocol, speedy and accurate determination of emission levels is the need of the hour, especially for signatory countries. In this context, fixed and mobile automated weather stations can be connected to a mobile GIS based database on a network or web based GIS, enabling daily, weekly, monthly, or yearly monitoring of hazardous parameters in the atmosphere. In this paper the mobile GIS based system, its principle of operation, hardware and software requirements, and the conceptualization are presented. This system is believed to be well suited for the monitoring and quantification of emissions.
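As an illustration only, and not part of the proposed system itself, the following minimal Python sketch shows one way the client-to-server hand-off described above could look. A hypothetical AWMD client digitises a reading and posts it to an assumed GIS server endpoint; the URL, field names, and the `EmissionReading` structure are assumptions for this example, and a plain HTTP POST stands in for the WAP transfer mentioned in the abstract.

```python
import json
import urllib.request
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

# Hypothetical endpoint of the mobile GIS server; not from the paper.
GIS_SERVER_URL = "http://example.org/mobile-gis/emissions"

@dataclass
class EmissionReading:
    """A digitised sample from an automated weather monitoring device (AWMD)."""
    station_id: str
    latitude: float          # from the GPS-enabled client
    longitude: float
    co2_ppm: float           # example emission parameters (assumed)
    so2_ppb: float
    timestamp: str

def send_reading(reading: EmissionReading) -> int:
    """Post one reading to the GIS server and return the HTTP status code."""
    payload = json.dumps(asdict(reading)).encode("utf-8")
    request = urllib.request.Request(
        GIS_SERVER_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return response.status

if __name__ == "__main__":
    reading = EmissionReading(
        station_id="AWMD-017",
        latitude=17.385, longitude=78.487,
        co2_ppm=412.3, so2_ppb=4.1,
        timestamp=datetime.now(timezone.utc).isoformat(),
    )
    print("server responded with", send_reading(reading))
```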
Performance and Comparative Analysis of the Two Contrary Approaches for Detecting Near Duplicate Web Documents in Web Crawling
Recent years have witnessed the drastic growth of the World Wide Web (WWW). Information is accessible at one's fingertips anytime, anywhere through the massive web repository. The performance and reliability of search engines therefore face huge problems due to the presence of an enormous amount of web data. The voluminous amount of web documents has created problems for search engines, making search results less relevant to the user. In addition, the presence of duplicate and near-duplicate web documents creates additional overhead for search engines, critically affecting their performance. The demand for integrating data from heterogeneous sources also leads to near-duplicate web pages. The detection of near-duplicate documents within a collection has recently become an area of great interest. In this research we present an efficient approach for the detection of near-duplicate web pages in web crawling that uses keywords and a distance measure. G. S. Manku et al.'s fingerprint-based approach, proposed in 2007, is considered one of the "state-of-the-art" algorithms for finding near-duplicate web pages. We have implemented both approaches and conducted an extensive comparative study between our similarity-score-based approach and Manku et al.'s fingerprint-based approach. We have analyzed the results in terms of time complexity, space complexity, memory usage, and confusion-matrix parameters. Taking these performance factors into account, the comparative study clearly shows our approach to be the better (less complex) of the two.
DOI: http://dx.doi.org/10.11591/ijece.v2i6.1746
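For orientation, here is a minimal Python sketch of a fingerprint-based near-duplicate check in the spirit of simhash, the technique underlying Manku et al.'s approach; the hash width, tokenisation, and Hamming-distance threshold are illustrative assumptions, and this is not the paper's keyword-and-distance method.

```python
import hashlib

def simhash(text: str, bits: int = 64) -> int:
    """Build a simhash fingerprint from whitespace-separated tokens."""
    weights = [0] * bits
    for token in text.lower().split():
        digest = int(hashlib.md5(token.encode("utf-8")).hexdigest(), 16)
        for i in range(bits):
            # Add +1 for a set bit, -1 for a clear bit.
            weights[i] += 1 if (digest >> i) & 1 else -1
    fingerprint = 0
    for i, weight in enumerate(weights):
        if weight > 0:
            fingerprint |= 1 << i
    return fingerprint

def hamming_distance(a: int, b: int) -> int:
    return bin(a ^ b).count("1")

def near_duplicates(a: str, b: str, threshold: int = 3) -> bool:
    """Documents are near-duplicates if their fingerprints differ in few bits."""
    return hamming_distance(simhash(a), simhash(b)) <= threshold

if __name__ == "__main__":
    page1 = "web crawling detects near duplicate documents efficiently"
    page2 = "web crawling detects near duplicate web documents efficiently"
    print(near_duplicates(page1, page2))  # likely True for such similar text
```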
Evaluation of fly ash bricks, Steel and RMC by application of TQM in Residential Building Construction
Materials are the key resources in the construction industry, since no production is possible without them. They form a major part of the cost of construction, so proper control over their procurement, storage, issue, movement, and consumption is necessary. When material of the required quality and quantity is made available, there is a substantial saving in cost and time and an improvement in construction quality. Material cost is typically in the range of 50% to 60% of the total project cost, and an uncontrolled supply of material may result in excessive supply. Total Quality Management (TQM) is one of the most popular modern management concepts. TQM is a quality management system that pursues excellence in customer satisfaction through continuous improvement of products and processes by the total involvement and dedication of everyone involved in the process. TQM is a long-term process and takes on a strategic dimension. The aims of TQM are to achieve customer satisfaction, cost effectiveness, and defect-free work through a relentless pursuit of the "war on waste". Customer satisfaction is achieved by focusing on process improvement, customer and supplier involvement, teamwork, training, and education. This paper aims to evaluate the effectiveness of applying Total Quality Management principles to materials provided by suppliers at S. J. Contracts Pvt. Ltd. A qualitative research approach is adopted, in which a questionnaire is distributed to suppliers. To stay competitive in today's market, maintaining material quality is a most important factor, so the level of quality practice in the organization needs to be identified. Material quality was analyzed through TQM application to determine whether the rules of total quality management are followed; by maintaining the quality of materials, the quality of deliverables is achieved. For further analysis, laboratory tests on materials such as fly ash brick, steel, and RMC were conducted, and the materials were analyzed using tools such as Six Sigma and benchmarking. The fly ash brick analysis gave the following results: the brick length is within the expected benchmarked variation and is at 2-sigma performance; considering zero defects as the ultimate goal, the defect rate for length is 25%. Material suppliers need to focus more on the manufacturing process and the quality of materials.
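As a purely illustrative aside, not taken from the paper's data, the short Python sketch below shows how a sigma level and a defect rate for brick length might be estimated from sample measurements against an assumed benchmark length and tolerance; all numbers are made up.

```python
import statistics

# Hypothetical benchmark for fly ash brick length (mm); not the paper's data.
TARGET_LENGTH = 230.0
TOLERANCE = 5.0            # acceptable deviation either side of the target

# Made-up sample of measured brick lengths (mm).
lengths = [228.0, 231.5, 226.0, 233.0, 229.0, 236.0, 227.5, 230.5]

mean = statistics.mean(lengths)
stdev = statistics.stdev(lengths)

# Process sigma level: how many standard deviations fit between the mean
# and the nearest specification limit.
upper, lower = TARGET_LENGTH + TOLERANCE, TARGET_LENGTH - TOLERANCE
sigma_level = min(upper - mean, mean - lower) / stdev

# Simple defect rate: fraction of bricks falling outside the tolerance band.
defects = sum(1 for x in lengths if not lower <= x <= upper)
defect_rate = defects / len(lengths)

print(f"mean = {mean:.1f} mm, stdev = {stdev:.2f} mm")
print(f"sigma level = {sigma_level:.1f}")
print(f"defect rate = {defect_rate:.0%}")
```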
Simultaneous Extraction of Zinc and Manganese Dioxide from Zinc Sulphide Concentrate and Manganese Ore
Direct leaching studies of a mixture of zinc sulphide concentrate (concentrate) of Himalayan origin and manganese ore (ore), taken in stoichiometric proportion according to the reaction
ZnS + MnO2 + 2H2SO4 = ZnSO4 + MnSO4 + S° + 2H2O
in dilute sulphuric acid (present in spent liquor from electro-winning cells), were carried out to test its amenability to a novel process developed at the National Metallurgical Laboratory (NML). The liberated sulphur in the sludge was extracted in pure form. The effects of the controlling parameters, viz. acid concentration, temperature, and duration of leaching, were investigated to optimize the leaching conditions. Extraction efficiencies of zinc and sulphur under the best conditions were ~82% each, while that of manganese was ~95%. The unextracted portion of zinc sulphide was investigated with the help of XRD analysis of the concentrate and the final residue. The leach solution, purified by conventional hydrometallurgical techniques, was fed to an electrolytic cell for simultaneous electro-winning of zinc and electrolytic manganese dioxide (EMD). The EMD was of the γ-variety, as shown by TG/DTA and XRD analyses.
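As a small worked illustration, not taken from the paper, the stoichiometric mass proportion of ZnS to MnO2 implied by the reaction above can be computed from molar masses; the Python sketch below assumes pure ZnS and MnO2, ignoring the gangue actually present in the concentrate and ore.

```python
# Molar masses in g/mol (standard atomic weights, rounded).
M = {"Zn": 65.38, "S": 32.06, "Mn": 54.94, "O": 16.00, "H": 1.008}

m_ZnS = M["Zn"] + M["S"]                     # ~97.4 g/mol
m_MnO2 = M["Mn"] + 2 * M["O"]                # ~86.9 g/mol
m_H2SO4 = 2 * M["H"] + M["S"] + 4 * M["O"]   # ~98.1 g/mol

# ZnS + MnO2 + 2 H2SO4 -> ZnSO4 + MnSO4 + S + 2 H2O
# so one mole of ZnS reacts with one mole of MnO2 and two moles of acid.
print(f"ZnS : MnO2 mass ratio   = {m_ZnS / m_MnO2:.2f} : 1")
print(f"H2SO4 needed per kg ZnS = {2 * m_H2SO4 / m_ZnS:.2f} kg")
```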
Classification of Epileptic and Non-Epileptic Electroencephalogram (EEG) Signals Using Fractal Analysis and Support Vector Regression
Seizures are a common symptom of epilepsy, a neurological condition caused by the discharge of brain nerve cells at an excessively fast rate. Chaos, nonlinearity, and other nonlinear features are common in scalp and intracranial electroencephalogram (EEG) data recorded in clinics. Because of their complexity, EEG signal characteristics that are not immediately evident are challenging to categorize. The Gradient Boosting Decision Tree (GBDT) classifier was used to classify most of the EEG signal segments automatically. According to this study, the Hurst exponent, in combination with AFA, is an efficient way to identify epileptic signals. As with any fractal analysis approach, there are problems and factors to keep in mind, such as identifying whether or not linear scaling regions are present. The signals were classified as either epileptic or non-epileptic using a combination of GBDT and Support Vector Regression (SVR), and the combined method's identification accuracy was 98.23%. This study sheds light on the effectiveness of AFA feature extraction and GBDT classifiers in EEG classification. The findings can be used to develop theoretical guidance for the clinical identification and prediction of epileptic EEG signals.
DOI: 10.28991/ESJ-2022-06-01-011
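Purely as an orienting sketch, and not the authors' pipeline, the Python snippet below estimates a rescaled-range (R/S) Hurst exponent per signal segment and feeds it to a gradient-boosting classifier from scikit-learn; the synthetic signals, segment length, and single-feature design are assumptions made for illustration.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

def hurst_rs(x: np.ndarray) -> float:
    """Crude rescaled-range (R/S) estimate of the Hurst exponent of a 1-D signal."""
    n = len(x)
    sizes = [n // k for k in (1, 2, 4, 8) if n // k >= 16]
    rs_values, log_sizes = [], []
    for size in sizes:
        rs = []
        for start in range(0, n - size + 1, size):
            seg = x[start:start + size]
            dev = np.cumsum(seg - seg.mean())
            r = dev.max() - dev.min()          # range of the cumulative deviations
            s = seg.std()                      # standard deviation of the window
            if s > 0:
                rs.append(r / s)
        if rs:
            rs_values.append(np.log(np.mean(rs)))
            log_sizes.append(np.log(size))
    slope, _ = np.polyfit(log_sizes, rs_values, 1)  # slope of log(R/S) vs log(size)
    return slope

rng = np.random.default_rng(0)
# Synthetic stand-ins: "non-epileptic" = white noise, "epileptic" = smoother drifting signal.
normal = [rng.normal(size=512) for _ in range(50)]
epileptic = [np.cumsum(rng.normal(size=512)) * 0.1 for _ in range(50)]

X = np.array([[hurst_rs(s)] for s in normal + epileptic])
y = np.array([0] * len(normal) + [1] * len(epileptic))

clf = GradientBoostingClassifier().fit(X, y)
print("training accuracy:", clf.score(X, y))
```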
A Novel Algorithm for Discovering Frequent Closures and Generators
The construction of many kinds of association rules requires the computation of frequent closed itemsets and frequent generator itemsets (FCIS/FGIS). However, these two tasks are very rarely combined. Most existing methods apply a level-wise breadth-first search. Although the efficiency of depth-first search depends on the characteristics of the data, it is often better than the alternatives. Hence, this paper proposes the FCFG algorithm, which combines the mining of frequent closed itemsets and frequent generators. The proposed algorithm first extracts frequent itemsets (FIs) using a depth-first search, then extracts the FCIS and FGIS from the FIs by a level-wise approach, and finally associates the generators with their closures. In the FCFG algorithm, a generic technique is extended from an arbitrary FI-miner algorithm in order to support the generation of minimal non-redundant association rules. Experimental results indicate that the FCFG algorithm performs better than other level-wise methods in most cases.
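To make the terminology concrete, here is a minimal Python sketch (not the FCFG algorithm itself) that, given frequent itemsets and their supports, identifies closed itemsets (no proper superset with equal support) and generators (no proper subset with equal support), then pairs each generator with its closure; the tiny transaction database is made up.

```python
from itertools import combinations

# Made-up transaction database.
transactions = [
    {"a", "b", "c"},
    {"a", "b"},
    {"a", "c"},
    {"b", "c"},
]
MIN_SUPPORT = 2

def support(itemset):
    return sum(itemset <= t for t in transactions)

# Enumerate all frequent itemsets by brute force (fine for a toy example).
items = sorted(set().union(*transactions))
frequent = {
    frozenset(c): support(set(c))
    for k in range(1, len(items) + 1)
    for c in combinations(items, k)
    if support(set(c)) >= MIN_SUPPORT
}

# Closed itemsets: no proper superset has the same support.
closed = {
    s for s, sup in frequent.items()
    if not any(s < t and frequent[t] == sup for t in frequent)
}

# Generators: no proper subset has the same support.
generators = {
    s for s, sup in frequent.items()
    if not any(t < s and frequent[t] == sup for t in frequent)
}

# Closure of a generator: the smallest closed superset with equal support.
for g in sorted(generators, key=sorted):
    closure = min(
        (c for c in closed if g <= c and frequent[c] == frequent[g]),
        key=len,
    )
    print(sorted(g), "-> closure", sorted(closure), "support", frequent[g])
```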
Processing of spent tanning and chrome plating solutions for chromium recovery
Use of chromium in the metal plating and leather tanning industries generates a lot of effluent containing Cr(III) and Cr(VI). Besides causing severe water pollution, a substantial amount of chromium is lost due to the prevalent removal-disposal practice followed the world over. Processes based on a recovery-and-reuse methodology are currently projected as excellent means of meeting environmental regulations while producing chromium salts/solutions for recycling. This paper details the composition, conditions, and quantity of spent liquors/effluents generated in the electroplating and leather tanning industries and their treatment by removal-disposal as well as recovery-reuse methods. The approach based on precipitation, ion exchange, and liquid-liquid extraction, particularly for chromium recovery, is highlighted. The results of solvent extraction with D2EHPA and CYANEX 272 for chromium recovery from spent tanning baths are also summarised.
Solvent extraction in copper metallurgy recovery of acid and metals from copper bleed stream
Solvent extraction in copper metallurgy has been the first major application of the technique for producing nonferrous metals beyond the production of rare earths and nuclear metals. With the advent of solvent extraction (SX), several lean-grade, complex, multimetal and pocket deposits, including byproducts, could be processed to produce copper economically. Though the SX technology is proven internationally, it has yet to find an industrial application in the Indian context. This paper outlines the possibility of using solvent extraction in copper metallurgy, particularly in the Indian scenario. A specific example of the processing of the copper bleed stream has also been mentioned. Some of the details and the flowsheet given here show how sulphuric acid can be recovered from the copper bleed stream for recycling in the system. Besides, the recovery of copper and nickel in convenient forms, like metal sulphates and electrolytic-grade metal cathodes, can be taken up.
Projecting Active Contours with Diminutive Sequence Optimality
Active contours are widely used in image segmentation. To cope with missing or misleading features in image frames taken in contexts such as spatial imaging and surveillance, researchers have introduced various ways to model the prior of shapes and use that prior to constrain active contours. However, the shape prior is frequently learnt from a large set of annotated data, which is not always accessible in practice. In addition, it is often doubtful that the shapes in the training set will be sufficient to model a new instance in the testing image. In this paper we propose to use a diminutive sequence of image frames to learn the missing contour of the input images. Central median minimization is a simple and effective way to impose the proposed constraint on existing active contour models. Moreover, we derive a fast algorithm to solve the proposed model using the accelerated proximal method. Experiments using image frames acquired from surveillance demonstrate that the proposed method can consistently improve the performance of active contour models and increase their robustness against image defects such as missing boundaries.
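For readers unfamiliar with the baseline model, the following Python sketch runs a plain active contour (snake) from scikit-image on a synthetic image; it illustrates only the standard model that the paper builds on, not the proposed shape-prior or median-minimization method, and the image, initial contour, and parameters are arbitrary choices.

```python
import numpy as np
from skimage.filters import gaussian
from skimage.segmentation import active_contour

# Synthetic test image: a bright disk on a dark background.
yy, xx = np.mgrid[0:200, 0:200]
image = ((yy - 100) ** 2 + (xx - 100) ** 2 < 40 ** 2).astype(float)

# Initial contour: a circle placed loosely around the object.
theta = np.linspace(0, 2 * np.pi, 200)
init = np.column_stack([100 + 60 * np.sin(theta), 100 + 60 * np.cos(theta)])

# Classic snake evolution on a smoothed image (parameters chosen ad hoc).
snake = active_contour(
    gaussian(image, sigma=3),
    init,
    alpha=0.015,   # elasticity: penalises contour length
    beta=10.0,     # rigidity: penalises curvature
    gamma=0.001,   # time step
)

print("contour points:", snake.shape)   # (200, 2) array of (row, col) points
```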
