1,497 research outputs found

    Survey of Machine Learning Techniques for Malware Analysis

    Coping with malware is getting more and more challenging, given its relentless growth in complexity and volume. One of the most common approaches in the literature is to use machine learning techniques, to automatically learn models and patterns behind such complexity, and to develop technologies for keeping pace with the speed of development of novel malware. This survey aims at providing an overview of the way machine learning has been used so far in the context of malware analysis. We systematize surveyed papers according to their objectives (i.e., the expected output of the analysis), the information about malware they specifically use (i.e., the features), and the machine learning techniques they employ (i.e., the algorithm used to process the input and produce the output). We also outline a number of problems concerning the datasets used in the considered works, and finally introduce the novel concept of malware analysis economics, concerning the study of existing tradeoffs among key metrics, such as analysis accuracy and economic costs.

    A Blockchain-Based Solution for Enabling Log-Based Resolution of Disputes in Multi-party Transactions

    We are witnessing an ongoing global trend towards the automation of almost any transaction through the employment of some Internet-based means. Furthermore, the wide spread of cloud computing and the massive emergence of the software as a service (SaaS) paradigm have unveiled many opportunities to combine distinct services, provided by different parties, to establish higher-level and more advanced services that can be offered to end users and enterprises. Business-to-business (B2B) integration and third-party authorization (i.e., using standards like OAuth) are examples of processes requiring multiple parties to interact with each other to deliver some desired functionality. These kinds of interactions mostly consist of transactions and are usually regulated by some agreement which defines the obligations that the involved parties have to comply with. In case one of the parties claims a violation of some clause of such an agreement, disputes can occur if the party accused of the infraction refuses to recognize its fault. Moreover, in case of auditing, for convenience reasons a party may deny having taken part in a given transaction, or may forge historical records related to that transaction. Solutions based on a trusted third party (TTP) have drawbacks: high overhead due to the involvement of an additional party, possible fees to pay for each transaction, and the risks stemming from having to blindly trust another party. If it were possible to rely on transaction logs alone to sort disputes out, then it would be feasible to get rid of any TTP and its related shortcomings. In this paper we propose SLAVE, a blockchain-based solution which does not require any TTP. Storing transactions in a public blockchain like Bitcoin's or Ethereum's provides strong guarantees on transactions' integrity, hence they can actually be used as proofs when controversies arise. The solution we propose defines how to embed transaction logs in a public blockchain, so that each involved party can verify the identity of the others while keeping the content of transactions confidential.
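    The log-anchoring idea can be illustrated with a salted hash commitment: the value published on-chain proves a log entry's integrity without revealing its content. This is only a minimal sketch of the general technique, not the actual SLAVE protocol; the entry fields and helper names are invented for illustration.

```python
import hashlib
import json
import os

def commit_log_entry(entry: dict):
    """Build a salted SHA-256 commitment for a transaction log entry.

    Only the hex commitment would be published on-chain; the entry and
    the salt stay with the involved parties, keeping the content
    confidential while still making it verifiable later.
    """
    salt = os.urandom(16)
    payload = salt + json.dumps(entry, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest(), salt

def verify_log_entry(entry: dict, salt: bytes, commitment: str) -> bool:
    """Check a disclosed entry and salt against the on-chain commitment."""
    payload = salt + json.dumps(entry, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest() == commitment

# A party later discloses (entry, salt) to an auditor, who recomputes
# the hash and compares it with the value stored in the blockchain.
entry = {"from": "serviceA", "to": "serviceB", "action": "token_grant"}
commitment, salt = commit_log_entry(entry)
```

    Any tampering with the disclosed entry changes the recomputed hash, so the immutable on-chain record settles who is right.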

    Android Malware Family Classification Based on Resource Consumption over Time

    The vast majority of today's mobile malware targets Android devices. This has pushed the research effort in Android malware analysis in recent years. An important task of malware analysis is the classification of malware samples into known families. Static malware analysis is known to fall short against techniques that change static characteristics of the malware (e.g., code obfuscation), while dynamic analysis has proven effective against such techniques. To the best of our knowledge, the most notable work on Android malware family classification purely based on dynamic analysis is DroidScribe. With respect to DroidScribe, our approach is easier to reproduce. Our methodology only employs publicly available tools, does not require any modification to the emulated environment or Android OS, and can collect data from physical devices. The latter is a key factor, since modern mobile malware can detect the emulated environment and hide its malicious behavior. Our approach relies on resource consumption metrics available from the proc file system. Features are extracted through detrended fluctuation analysis and correlation. Finally, an SVM is employed to classify malware into families. We provide an experimental evaluation on malware samples from the Drebin dataset, where we obtain a classification accuracy of 82%, proving that our methodology achieves an accuracy comparable to that of DroidScribe. Furthermore, we make the software we developed publicly available, to ease the reproducibility of our results. Comment: Extended Version
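    As a rough illustration of the feature-extraction step, detrended fluctuation analysis reduces a resource-consumption time series to a single scaling exponent, which can then feed an SVM. This is a hedged sketch of the general technique, not the paper's implementation; the scales, signal lengths and the two synthetic "families" below are invented for illustration.

```python
import numpy as np
from sklearn.svm import SVC

def dfa_exponent(signal, scales=(4, 8, 16, 32, 64)):
    """Detrended fluctuation analysis: slope of log F(n) vs log n,
    where F(n) is the RMS fluctuation of the integrated signal
    around local linear trends in windows of length n."""
    profile = np.cumsum(signal - np.mean(signal))
    flucts = []
    for n in scales:
        rms = []
        for i in range(len(profile) // n):
            seg = profile[i * n:(i + 1) * n]
            t = np.arange(n)
            trend = np.polyval(np.polyfit(t, seg, 1), t)
            rms.append(np.sqrt(np.mean((seg - trend) ** 2)))
        flucts.append(np.mean(rms))
    alpha, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return alpha

# Two synthetic "families": uncorrelated noise (alpha ~ 0.5) vs a
# random walk (alpha ~ 1.5), mimicking different resource dynamics.
rng = np.random.default_rng(0)
X = [[dfa_exponent(rng.normal(size=512))] for _ in range(20)] + \
    [[dfa_exponent(np.cumsum(rng.normal(size=512)))] for _ in range(20)]
y = [0] * 20 + [1] * 20
clf = SVC(kernel="rbf").fit(X, y)
```

    Because the two dynamics produce well-separated exponents, even this one-dimensional feature lets the SVM tell the families apart.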

    Timely processing of big data in collaborative large-scale distributed systems

    Today’s Big Data phenomenon, characterized by huge volumes of data produced at very high rates by heterogeneous and geographically dispersed sources, is fostering the employment of large-scale distributed systems in order to leverage parallelism, fault tolerance and locality awareness with the aim of delivering suitable performance. Among the several areas where Big Data is gaining increasing significance, the protection of Critical Infrastructure is one of the most strategic, since it impacts the stability and safety of entire countries. Intrusion detection mechanisms can benefit greatly from novel Big Data technologies, because these make it possible to exploit much more information in order to sharpen the accuracy of threat discovery. A key aspect for further increasing the amount of data available for detection purposes is collaboration (meant as information sharing) among distinct actors that share the common goal of maximizing the chances of recognizing malicious activities earlier. Indeed, if an agreement can be found to share their data, they all have the possibility to markedly improve their cyber defenses. The abstraction of Semantic Room (SR) allows interested parties to form trusted and contractually regulated federations, the Semantic Rooms, for the sake of secure information sharing and processing. Another crucial point for the effectiveness of cyber protection mechanisms is the timeliness of the detection, because the sooner a threat is identified, the faster proper countermeasures can be put in place so as to confine any damage. Within this context, the contributions reported in this thesis are threefold:
    * As a case study to show how collaboration can enhance the efficacy of security tools, we developed a novel algorithm for the detection of stealthy port scans, named R-SYN (Ranked SYN port scan detection). We implemented it in three distinct technologies, all of them integrated within an SR-compliant architecture that allows for collaboration through information sharing: (i) in a centralized Complex Event Processing (CEP) engine (Esper), (ii) in a framework for distributed event processing (Storm) and (iii) in Agilis, a novel platform for batch-oriented processing which leverages the Hadoop framework and a RAM-based storage for fast data access. Regardless of the employed technology, all the evaluations have shown that increasing the number of participants (that is, increasing the amount of available input data) improves the detection accuracy. The experiments made clear that a distributed approach allows for lower detection latency and for keeping up with higher input throughput, compared with a centralized one.
    * Distributing the computation over a set of physical nodes introduces the issue of improving the way available resources are assigned to the elaboration tasks to execute, with the aim of minimizing the time the computation takes to complete. We investigated this aspect in Storm by developing two distinct scheduling algorithms, both aimed at decreasing the average elaboration time of a single input event by decreasing the inter-node traffic. Experimental evaluations showed that these two algorithms can improve the performance by up to 30%.
    * Computations in online processing platforms (like Esper and Storm) run continuously, and the need to refine running computations or add new ones, together with the need to cope with the variability of the input, requires the possibility of adapting the resource allocation at runtime, which entails a set of additional problems. Among them, the most relevant concern how to cope with incoming data and processing state while the topology is being reconfigured, and the issue of temporarily reduced performance. To this end, we also explored the alternative approach of running the computation periodically on batches of input data: although it involves a performance penalty on the elaboration latency, it eliminates the great complexity of dynamic reconfigurations. We chose Hadoop as the batch-oriented processing framework and developed some strategies specifically for dealing with computations based on time windows, which are very likely to be used for pattern recognition purposes, as in the case of intrusion detection. Our evaluations provided a comparison of these strategies and made evident the kind of performance that this approach can provide.
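    The window-based batching idea can be sketched as follows: each sliding time window is materialized as an independent batch of events that a batch job (e.g. a Hadoop run) can process in isolation. This is a sketch of the general idea, not the thesis's Hadoop strategies; the function name and parameters are invented for illustration.

```python
from collections import defaultdict

def assign_to_windows(events, window_size, slide):
    """Assign timestamped events to sliding time windows of length
    `window_size` whose start times advance by `slide`, so that each
    window can later be processed as an independent batch."""
    windows = defaultdict(list)
    for ts, payload in events:
        # Smallest window start s (a multiple of `slide`) whose span
        # [s, s + window_size) still covers this event's timestamp.
        start = max(0, ((ts - window_size) // slide + 1) * slide)
        while start <= ts:
            windows[start].append((ts, payload))
            start += slide
    return dict(windows)
```

    With `window_size=10` and `slide=5`, an event at time 7 lands in the batches starting at 0 and 5; overlapping windows trade extra storage and recomputation for the ability to rerun any window without dynamic reconfiguration of a running topology.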

    A Wearable Wireless Magnetic Eye-Tracker, in-vitro and in-vivo tests

    A wireless, wearable magnetic eye tracker is described and characterized. The proposed instrumentation enables simultaneous evaluation of eye and head angular displacements. Such a system can be used to determine the absolute gaze direction as well as to analyze spontaneous eye re-orientation in response to stimuli consisting of head rotations. The latter feature has implications for the analysis of the vestibulo-ocular reflex and constitutes an interesting opportunity to develop medical (oto-neurological) diagnostics. Details of data analysis are reported, together with some results obtained in vivo or with simple mechanical simulators that enable measurements under controlled conditions.

    Severe early onset preeclampsia: short and long term clinical, psychosocial and biochemical aspects

    Preeclampsia is a pregnancy-specific disorder commonly defined as de novo hypertension and proteinuria after 20 weeks' gestational age. It occurs in approximately 3-5% of pregnancies and is still a major cause of both foetal and maternal morbidity and mortality worldwide [1]. As extensive research has not yet elucidated the aetiology of preeclampsia, there are no rational preventive or therapeutic interventions available. The only rational treatment is delivery, which benefits the mother but is not in the interest of the foetus, if remote from term. Early onset preeclampsia (<32 weeks' gestational age) occurs in less than 1% of pregnancies. It is, however, often associated with maternal morbidity, as the risk of progression to severe maternal disease is inversely related to gestational age at onset [2]. Resulting prematurity is therefore the main cause of neonatal mortality and morbidity in patients with severe preeclampsia [3]. Although the discussion is ongoing, perinatal survival is suggested to be increased in patients with preterm preeclampsia by expectant, non-interventional management. This temporising treatment option to lengthen pregnancy includes the use of antihypertensive medication to control hypertension, magnesium sulphate to prevent eclampsia and corticosteroids to enhance foetal lung maturity [4]. With optimal maternal haemodynamic status and a reassuring foetal condition, this results on average in an extension of 2 weeks. Prolongation of these pregnancies poses a great challenge for clinicians, who must balance potential maternal risks on the one hand against possible foetal benefits on the other. Clinical controversies regarding prolongation of preterm preeclamptic pregnancies still exist, also taking into account that preeclampsia is the leading cause of maternal mortality in the Netherlands [5], a debate which is even more pronounced in very preterm pregnancies with questionable foetal viability [6-9]. Do the maternal risks of prolonging these very early pregnancies outweigh the chances of neonatal survival? Counselling of women with very early onset preeclampsia not only comprises knowledge of the outcome of those particular pregnancies; knowledge of the outcomes of future pregnancies of these women is also of major clinical importance. This thesis opens with a review of the literature on identifiable risk factors of preeclampsia.

    Financial Performance Assessment of Cooperatives in Pelalawan Regency

    This paper describes the development and financial performance of cooperatives in the Pelalawan Regency during 2007-2008. The study covers primary and secondary cooperatives in 12 sub-districts. The method used in this study measures performance in terms of the productivity, efficiency, growth, liquidity, and solvency of the cooperatives. The productivity of cooperatives in Pelalawan was high, but efficiency was still low. Profit and income were high, the liquidity of the cooperatives was very high, and solvency was good.

    Impacts of the Tropical Pacific/Indian Oceans on the Seasonal Cycle of the West African Monsoon

    The current consensus is that drought has developed in the Sahel during the second half of the twentieth century as a result of remote effects of oceanic anomalies amplified by local land–atmosphere interactions. This paper focuses on the impacts of oceanic anomalies upon West African climate and specifically aims to identify those from SST anomalies in the Pacific/Indian Oceans during the spring and summer seasons, when they were significant. Idealized sensitivity experiments are performed with four atmospheric general circulation models (AGCMs). The prescribed SST patterns used in the AGCMs are based on the leading mode of covariability between SST anomalies over the Pacific/Indian Oceans and summer rainfall over West Africa. The results show that such oceanic anomalies in the Pacific/Indian Ocean lead to a northward shift of an anomalous dry belt from the Gulf of Guinea to the Sahel as the season advances. In the Sahel, the magnitude of rainfall anomalies is comparable to that obtained by other authors using SST anomalies confined to the proximity of the Atlantic Ocean. The mechanism connecting the Pacific/Indian SST anomalies with West African rainfall has a strong seasonal cycle. In spring (May and June), anomalous subsidence develops over both the Maritime Continent and the equatorial Atlantic in response to the enhanced equatorial heating. Precipitation increases over continental West Africa in association with stronger zonal convergence of moisture. In addition, precipitation decreases over the Gulf of Guinea. During the monsoon peak (July and August), the SST anomalies move westward over the equatorial Pacific and the two regions where subsidence occurred earlier in the season merge over West Africa. The monsoon weakens and rainfall decreases over the Sahel, especially in August.