257 research outputs found

    A CONCEPTUAL MOBILE GIS SYSTEM FOR GLOBAL EMISSION MONITORING

    Today, information systems play a key role, and the geographical information system (GIS) is a special class of information system. We are moving from wired to wireless networks, i.e., the mobile environment (anywhere, anytime, anything computing, shown in Fig. 2), which plays a key role in this new century. In this application, a mobile GIS operating in a network environment is proposed that can monitor emission levels at the city, country, or global level. The concept is as follows: the analog data of Automated Weather Monitoring Devices (AWMD) are converted into digital form, whereby the respective client device transfers the data to the server using the Wireless Application Protocol (WAP). The hardware and software need to be integrated so that the mobile GIS components (atmospheric emission data receivers/analyzers, WAP clients of GPS-enabled systems, and routers) are linked to a GIS with the relevant software. In the light of the Kyoto Protocol, speedy and accurate determination of emission levels is the need of the hour, especially for the protocol's signatory countries. In this context, fixed and mobile automated weather stations can be connected to a mobile GIS-based database on a network/web-based GIS, enabling daily, weekly, monthly, or yearly monitoring of hazardous atmospheric parameters. In this paper, the mobile GIS-based system, its principle of operation, its hardware and software requirements, and the conceptualization are presented. This system is believed to be well suited for emission monitoring and quantification.
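As a sketch of the proposed data flow, the snippet below packages a digitized AWMD reading together with its GPS fix into a payload a client device could transmit to the central GIS server. The field names, station identifier, and JSON encoding are illustrative assumptions only; the paper proposes WAP as the transport, which is not modeled here.

```python
import json

def package_awmd_reading(station_id, lat, lon, timestamp, readings):
    """Bundle a digitized AWMD emission reading with its GPS fix into a
    payload the client could transmit to the GIS server (hypothetical schema)."""
    return json.dumps({
        "station_id": station_id,
        "position": {"lat": lat, "lon": lon},
        "timestamp": timestamp,
        "emissions_ppm": readings,  # pollutant -> measured concentration
    }, sort_keys=True)

payload = package_awmd_reading(
    "AWMD-017", 17.385, 78.486, "2024-06-01T06:00:00Z",
    {"CO2": 412.5, "SO2": 0.012, "NOx": 0.031},
)
```

On the server side, such payloads would be georeferenced via the embedded GPS fix and aggregated into the GIS database for daily to yearly monitoring.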

    A Security Scheme for Textual & Graphical Passwords

    Authentication is the process of identifying an individual, usually based on a username and password; it merely ensures that the individual is who he or she claims to be, and it forestalls attacks against confidentiality and integrity. Shoulder surfing is the main problem of graphical passwords. To overcome it, we introduce a novel scheme. The scheme presents a login image, consisting of a set of characters, each time the user logs in; using his password, the user clicks pass characters that differ from session to session, as explained in the proposed scheme. To provide better results, a neural network is used for the authentication.
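One way session-dependent pass characters could be derived is sketched below. The keyed-hash construction, parameters, and function names are hypothetical stand-ins, shown only to illustrate why the characters the user clicks differ between sessions even though the underlying password is fixed.

```python
import hashlib
import random
import string

def session_pass_characters(password, session_id, image_chars, k=4):
    """Derive which characters of the displayed login image the user must
    click this session: a hash of (password, session_id) selects k positions,
    so the clicked pass characters change from session to session."""
    digest = hashlib.sha256(f"{password}:{session_id}".encode()).digest()
    positions = [digest[i] % len(image_chars) for i in range(k)]
    return [image_chars[p] for p in positions]

# The login screen shows a fresh arrangement of characters (seeded for demo).
rng = random.Random(42)
login_image = rng.sample(string.ascii_uppercase + string.digits, 25)

clicks_s1 = session_pass_characters("s3cret", "session-1", login_image)
clicks_s2 = session_pass_characters("s3cret", "session-2", login_image)
```

An observer who shoulder-surfs one session learns only that session's clicked characters, which are not reusable in the next session.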

    Performance and Comparative Analysis of the Two Contrary Approaches for Detecting Near Duplicate Web Documents in Web Crawling

    Recent years have witnessed the drastic development of the World Wide Web (WWW). Information is accessible at one's fingertips anytime, anywhere, through the massive web repository. The performance and reliability of web search engines thus face huge problems due to the enormous amount of web data, and the voluminous number of web documents means that search results are often of low relevance to the user. In addition, the presence of duplicate and near-duplicate web documents creates extra overhead for search engines, critically affecting their performance. The demand for integrating data from heterogeneous sources also leads to near-duplicate web pages. The detection of near-duplicate documents within a collection has recently become an area of great interest. In this research, we present an efficient approach for detecting near-duplicate web pages in web crawling that uses keywords and a distance measure. The fingerprint-based approach proposed by G. S. Manku et al. in 2007 is considered one of the "state-of-the-art" algorithms for finding near-duplicate web pages. We implemented both approaches and conducted an extensive comparative study between our similarity-score-based approach and G. S. Manku et al.'s fingerprint-based approach, analyzing the results in terms of time complexity, space complexity, memory usage, and the confusion-matrix parameters. Taking these performance factors into account, the comparative study clearly shows our approach to be the better (less complex) of the two.
    DOI: http://dx.doi.org/10.11591/ijece.v2i6.1746
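The fingerprint baseline can be sketched as a simhash in the style of Manku et al.: every token votes on every bit position, and near-duplicate documents end up within a small Hamming distance of each other. This is a minimal pure-Python illustration (64-bit fingerprints from MD5 token hashes are an assumption), not the authors' implementation or the exact 2007 algorithm.

```python
import hashlib
import re

def simhash(text, bits=64):
    """Simhash fingerprint: hash each token to `bits` bits; each bit of the
    fingerprint is the sign of the summed votes across all tokens."""
    vote = [0] * bits
    for token in re.findall(r"\w+", text.lower()):
        h = int(hashlib.md5(token.encode()).hexdigest(), 16)
        for i in range(bits):
            vote[i] += 1 if (h >> i) & 1 else -1
    return sum(1 << i for i in range(bits) if vote[i] > 0)

def hamming(a, b):
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

doc1 = "search engines index the massive web repository anytime anywhere"
doc2 = "search engines index the massive web repository anytime anywhere today"
doc3 = "red mud sinter reducibility blue dust additive metallurgy"
```

Near-duplicates such as `doc1` and `doc2` share almost all token votes, so their fingerprints differ in far fewer bits than those of unrelated documents.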

    Predicting Software Reliability Using Ant Colony Optimization Technique with Travelling Salesman Problem for Software Process – A Literature Survey

    Computer software has become an essential foundation in several versatile domains, including medicine and engineering. Consequently, with such widespread application of software, there is a need to ensure software reliability and quality. To measure them directly, one must wait until the software is implemented, tested, and put to use for a certain period. Several software metrics have been proposed in the literature to avoid this lengthy and costly process, and they have proved to be a good means of estimating software reliability; for this purpose, software reliability prediction models are built. Software reliability, one of the important software quality attributes, is defined as the probability that the software will operate without failure for a specific period of time in a specified environment. When estimated in the early phases of the software development life cycle, it saves a lot of money and time, as it prevents spending huge amounts on fixing defects after the software has been deployed to the client; however, reliability prediction is very challenging in those early phases. Software reliability estimation has thus become an important research area, as every organization aims to produce reliable, good-quality, defect-free software. Many software reliability growth models are used to assess or predict the reliability of software, and these models help in developing robust, fault-tolerant systems. Although many such models have been proposed in the past few years, developing accurate reliability prediction models is difficult due to the frequent changes in data in the domain of software engineering.
    As a result, software reliability prediction models built on one dataset show a significant decrease in accuracy when used with new data. The main aim of this paper is to introduce a new approach that optimizes the accuracy of software reliability prediction models when used with raw data. The Ant Colony Optimization Technique (ACOT) is proposed to predict software reliability based on data collected from the literature. An ant colony system combined with a Travelling Salesman Problem (TSP) algorithm has been used, modified with different algorithms and extra functionality, in an attempt to achieve better software reliability results with new data for the software process. The intelligent behavior of the ant colony framework, by means of a colony of cooperating artificial ants, yields very promising results.
    Keywords: Software Reliability, Reliability Predictive Models, Bio-inspired Computing, Ant Colony Optimization Technique, Ant Colony
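A minimal sketch of the underlying ant colony mechanics on a TSP instance is shown below. The parameter values and pheromone-update rule are generic textbook choices, not the ACOT configuration of the surveyed work; it only illustrates how cooperating ants bias the search via pheromone trails.

```python
import math
import random

def aco_tsp(cities, n_ants=10, n_iters=50, alpha=1.0, beta=3.0, rho=0.5, seed=1):
    """Minimal ant colony sketch for the TSP: ants build tours guided by
    pheromone (tau) and inverse distance; pheromone evaporates each
    iteration and is reinforced along the best tour found so far."""
    rng = random.Random(seed)
    n = len(cities)
    dist = [[math.dist(a, b) for b in cities] for a in cities]
    tau = [[1.0] * n for _ in range(n)]
    best_tour, best_len = None, float("inf")
    for _ in range(n_iters):
        for _ in range(n_ants):
            tour = [rng.randrange(n)]
            unvisited = set(range(n)) - {tour[0]}
            while unvisited:
                i = tour[-1]
                cand = sorted(unvisited)
                weights = [tau[i][j] ** alpha * (1.0 / dist[i][j]) ** beta
                           for j in cand]
                j = rng.choices(cand, weights=weights)[0]
                tour.append(j)
                unvisited.remove(j)
            length = sum(dist[tour[k]][tour[(k + 1) % n]] for k in range(n))
            if length < best_len:
                best_tour, best_len = tour, length
        # evaporate, then deposit pheromone along the best tour so far
        tau = [[(1 - rho) * t for t in row] for row in tau]
        for k in range(n):
            i, j = best_tour[k], best_tour[(k + 1) % n]
            tau[i][j] += 1.0 / best_len
            tau[j][i] += 1.0 / best_len
    return best_tour, best_len

pts = [(0, 0), (0, 2), (2, 2), (2, 0), (1, 3)]
tour, length = aco_tsp(pts)
```

In the reliability-prediction setting, the tour construction is replaced by selecting model parameters or data orderings, with "tour length" standing in for prediction error.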

    Utilization of iron values of red mud for metallurgical applications

    A brief overview of the utilization of the iron values of red mud is presented, along with the results of some recent investigations conducted at the National Metallurgical Laboratory. Red mud from Nalco, characterized by a high iron content, is used in the studies. Two different strategies are explored: (a) extraction of iron and other metal values from red mud using a patented process, named the Elgai process, available for the removal of alumina from iron ores; and (b) use of red mud as an additive in iron ore sintering. The second approach has yielded particularly interesting results: sinter with acceptable physical properties and reducibility could be produced with red mud additions from 50 to 125 kg/tonne of sinter. Red mud addition leads to dilution of the iron content of the sinter; it is suggested that this problem can be circumvented by adding blue dust, a waste material, along with the red mud.

    Classification of Epileptic and Non-Epileptic Electroencephalogram (EEG) Signals Using Fractal Analysis and Support Vector Regression

    Seizures are a common symptom of epilepsy, a neurological condition caused by the discharge of brain nerve cells at an excessively fast rate. Chaos, nonlinearity, and other nonlinear features are common in scalp and intracranial electroencephalogram (EEG) data recorded in clinics. EEG signals whose patterns are not immediately evident are challenging to categorize because of their complexity. A Gradient Boosting Decision Tree (GBDT) classifier was used to classify the majority of the EEG signal segments automatically. According to this study, the Hurst exponent, in combination with adaptive fractal analysis (AFA), is an efficient way to identify epileptic signals. As with any fractal analysis approach, there are problems and factors to keep in mind, such as identifying whether or not linear scaling regions are present. The signals were classified as either epileptic or non-epileptic using a combination of GBDT and Support Vector Regression (SVR); the combined method's identification accuracy was 98.23%. This study sheds light on the effectiveness of AFA feature extraction and GBDT classifiers in EEG classification, and the findings can be used to develop theoretical guidance for the clinical identification and prediction of epileptic EEG signals.
    DOI: 10.28991/ESJ-2022-06-01-011
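For context, the Hurst exponent feature can be estimated by classical rescaled-range (R/S) analysis, sketched below. The window sizes and fitting choices here are generic assumptions; the study's AFA procedure, which handles nonstationary signals more carefully, is not reproduced.

```python
import math
import random

def hurst_rs(series, min_chunk=8):
    """Estimate the Hurst exponent by rescaled-range (R/S) analysis:
    the slope of log(R/S) versus log(window size) is the estimate H."""
    n = len(series)
    sizes, rs_vals = [], []
    size = min_chunk
    while size <= n // 2:
        rs_list = []
        for start in range(0, n - size + 1, size):
            chunk = series[start:start + size]
            mean = sum(chunk) / size
            dev = [x - mean for x in chunk]
            cum, s = [], 0.0
            for d in dev:
                s += d
                cum.append(s)
            r = max(cum) - min(cum)          # range of cumulative deviations
            sd = math.sqrt(sum(d * d for d in dev) / size)
            if sd > 0:
                rs_list.append(r / sd)
        if rs_list:
            sizes.append(math.log(size))
            rs_vals.append(math.log(sum(rs_list) / len(rs_list)))
        size *= 2
    # least-squares slope of log(R/S) against log(window size)
    m = len(sizes)
    mx, my = sum(sizes) / m, sum(rs_vals) / m
    return (sum((x - mx) * (y - my) for x, y in zip(sizes, rs_vals))
            / sum((x - mx) ** 2 for x in sizes))

rng = random.Random(0)
white_noise = [rng.gauss(0, 1) for _ in range(4096)]
h = hurst_rs(white_noise)  # near 0.5 for uncorrelated noise
```

Persistent signals (such as a random walk) yield estimates near 1, while uncorrelated noise sits near 0.5, which is what makes H a useful discriminative feature for EEG segments.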

    Effective Brain Tumor Classification Using Deep Residual Network-Based Transfer Learning

    Brain tumor classification is an essential task in medical image processing that assists doctors in making accurate diagnoses and treatment plans. A Deep Residual Network (ResNet-50) based transfer learning approach applied to a fully convolutional Convolutional Neural Network (CNN) is proposed to classify brain tumors in Magnetic Resonance Images (MRI) from the BRATS 2020 dataset. The dataset consists of a variety of pre-operative MRI scans for segmenting brain tumors, namely gliomas, that vary widely in appearance, shape, and histology. The 50-layered residual network deeply convolves the multiple categories of tumor images in the classification task using convolution blocks and identity blocks. Limitations such as the limited accuracy and algorithmic complexity of the CNN-based ME-Net and classification issues in YOLOv2 inceptions are resolved by the proposed model. The trained CNN learns boundary and region tasks and extracts useful contextual information from MRI scans at minimal computation cost. Tumor segmentation and classification are performed in one step using a U-Net architecture, which helps retain the spatial features of the image, and multimodality fusion is implemented to perform classification and regression tasks by integrating dataset information. The Dice scores of the proposed model for Enhanced Tumor (ET), Whole Tumor (WT), and Tumor Core (TC) are 0.88, 0.97, and 0.90 on the BRATS 2020 dataset, and the model also achieved 99.94% accuracy, 98.92% sensitivity, 98.63% specificity, and 99.94% precision.
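The reported Dice scores follow the standard Dice similarity coefficient for segmentation masks, which can be computed as in the sketch below; the toy masks are illustrative, not BRATS data, and the network itself is not reproduced.

```python
def dice_score(pred, target):
    """Dice similarity coefficient between two binary segmentation masks
    (flattened to flat 0/1 lists here): 2*|A intersect B| / (|A| + |B|)."""
    inter = sum(p and t for p, t in zip(pred, target))
    total = sum(pred) + sum(target)
    return 2.0 * inter / total if total else 1.0

pred   = [1, 1, 0, 1, 0, 0, 1, 0]
target = [1, 0, 0, 1, 0, 1, 1, 0]
score = dice_score(pred, target)  # 3 overlapping pixels of 4+4 -> 0.75
```

A per-region score (ET, WT, TC) is obtained by building one binary mask per tumor region and applying the same formula.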

    A Novel Algorithm for Discovering Frequent Closures and Generators

    Get PDF
    The construction of many important association rules requires the computation of frequent closed itemsets and frequent generator itemsets (FCIS/FGIS). However, these two tasks are rarely combined. Most existing methods apply a level-wise, breadth-first search, although a depth-first search, which exploits different characteristics of the data, is often better. Hence, this paper proposes the FCFG algorithm, which combines frequent closed itemsets and frequent generators. The algorithm first extracts frequent itemsets (FIs) in a depth-first manner, then extracts the FCIS and FGIS from the FIs by a level-wise approach, and finally associates the generators with their closures. In the FCFG algorithm, a generic technique is extended from an arbitrary FI-miner algorithm in order to support the generation of minimal non-redundant association rules. Experimental results indicate that the FCFG algorithm performs better than other level-wise methods in most cases.
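To make the FCIS/FGIS terminology concrete, the brute-force sketch below marks each frequent itemset of a toy dataset as closed (no proper superset with equal support) and/or a generator (no proper subset with equal support). It illustrates the definitions only, not FCFG's depth-first miner, and the empty itemset is ignored for brevity.

```python
from itertools import combinations

def frequent_itemsets(transactions, min_support):
    """Enumerate all frequent itemsets of a small dataset by brute force,
    then split them into closed itemsets and generators by support equality."""
    items = sorted({i for t in transactions for i in t})
    support = {}
    for k in range(1, len(items) + 1):
        for cand in combinations(items, k):
            s = sum(1 for t in transactions if set(cand) <= t)
            if s >= min_support:
                support[frozenset(cand)] = s
    # closed: no frequent proper superset has the same support
    closed = {x for x in support
              if not any(x < y and support[y] == support[x] for y in support)}
    # generator: no frequent proper subset has the same support
    generators = {x for x in support
                  if not any(y < x and support[y] == support[x] for y in support)}
    return support, closed, generators

txns = [{"a", "b"}, {"a", "b"}, {"a", "c"}]
support, closed, gens = frequent_itemsets(txns, min_support=2)
```

Here {b} is a generator whose closure is {a, b}: both have support 2, so {b} is not closed and {a, b} is not a generator. Associating each generator with its closure is exactly the final step FCFG performs.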

    An augmented swirling and round jet impinging on a heated flat plate and its heat transfer characteristics

    A geometrical mechanism that generates augmented swirling and round jets is proposed. The proposed geometry has an axial inlet port and three tangential inlet ports, each of diameter 10 mm. A parameter called the split ratio, defined as the percentage of airflow split through these inlet ports, is introduced for the augmented jet. Flow at four different split ratios (SR 1, 2, 3, and 4) results in a single augmented jet, combining swirling and round jets, of diameter D = 30 mm, for which impingement heat transfer is predicted using 3D RANS numerical simulations. Computations are also performed for conventional round jets and for swirling jets generated by an in-house geometrical vane swirler with vane angles of 30, 45, and 60 degrees, each of jet diameter D = 30 mm, for Reynolds numbers Re = 6000 to 15,000 and jet-plate distances H = 1.5D to 4D. A comparative study of the flow structures of all the jets is carried out computationally, followed by a limited discussion of Particle Image Velocimetry (PIV) flow visualization results, and the impingement heat transfer of all the jets is studied numerically. It is inferred that at the smaller jet-plate distance H = 1.5D, the augmented jet and the vane-swirler jets show improved heat transfer from the impingement surface (the heated flat plate), whereas the conventional round jets show maximum heat transfer at H = 4D. From the comparative study, the impingement heat transfer characteristics of the proposed augmented jet are best at the optimized jet-plate distance H = 1.5D and split ratio SR 4, with an enhancement in the average Nusselt number (Nu_avg) of 88% over the conventional round jet and 101% over the vane-swirler jet counterpart. Similarly, an enhancement in the stagnation Nusselt number (Nu_stg) of 189% over the round jet is predicted for the proposed augmented jet at SR 4.
    Comment: 35 pages, 28 figures