
    Decoding-complexity-aware HEVC encoding using a complexity–rate–distortion model

    The energy consumption of Consumer Electronic (CE) devices during media playback is inexorably linked to the computational complexity of decoding compressed video. Reducing a CE device's energy consumption is therefore becoming ever more challenging with increasing video resolutions and the growing complexity of video coding algorithms. To this end, this paper proposes a framework that alters the video bit stream to reduce the decoding complexity while simultaneously limiting the impact on coding efficiency. In this context, this paper (i) first performs an analysis to determine the trade-off between decoding complexity, video quality, and bit rate with respect to a reference decoder implementation on a General Purpose Processor (GPP) architecture. Thereafter, (ii) a novel generic decoding-complexity-aware video coding algorithm is proposed to generate decoding complexity-rate-distortion optimized High Efficiency Video Coding (HEVC) bit streams. The experimental results reveal that the bit streams generated by the proposed algorithm achieve 29.43% and 13.22% decoding complexity reductions for a similar video quality, with minimal coding efficiency impact compared to the state-of-the-art approaches, when applied to the HM 16.0 and openHEVC decoder implementations, respectively. In addition, analysis of the energy consumption behavior for the same scenarios reveals up to 20% energy consumption reductions while achieving a similar video quality to that of HM 16.0 encoded HEVC bit streams.
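
    A decoding-complexity-rate-distortion optimization of this kind can be pictured as an extension of the usual Lagrangian mode decision, with a complexity term added to the cost. The sketch below is a minimal illustration of that idea, not the paper's algorithm: the candidate modes, complexity estimates, and the weights lambda_rd and gamma_c are all hypothetical.

```python
def crd_cost(distortion, rate, complexity, lambda_rd, gamma_c):
    """Lagrangian cost extended with a decoding-complexity term:
    J = D + lambda * R + gamma * C."""
    return distortion + lambda_rd * rate + gamma_c * complexity

def select_mode(candidates, lambda_rd=0.85, gamma_c=0.02):
    """Pick the candidate coding mode with the lowest C-R-D cost.
    Each candidate carries a measured distortion (e.g. SSE), a rate in
    bits, and an estimated decoding complexity (e.g. predicted cycles)."""
    return min(candidates,
               key=lambda m: crd_cost(m["distortion"], m["rate"],
                                      m["complexity"], lambda_rd, gamma_c))

# A cheaper-to-decode mode wins once complexity is priced into the cost.
modes = [
    {"name": "intra_angular", "distortion": 100.0, "rate": 220, "complexity": 9000},
    {"name": "merge_skip",    "distortion": 112.0, "rate": 180, "complexity": 2500},
]
print(select_mode(modes)["name"])  # -> merge_skip
```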

    Ferromagnetism at 300 K in spin-coated films of Co doped anatase and rutile TiO2

    Thin films of Ti1-xCoxO2 (x = 0 and 0.03) have been prepared on sapphire substrates by a spin-on technique starting from metalorganic precursors. When heat treated in air at 550 °C and 700 °C, respectively, these films present pure anatase and rutile structures, as shown both by X-ray diffraction and Raman spectroscopy. Optical absorption measurements indicate a high degree of transparency in the visible region. Such films show a very small magnetic moment at 300 K. However, when the anatase and the rutile films are annealed in a vacuum of 1×10⁻⁵ Torr at 500 °C and 600 °C, respectively, the magnetic moment at 300 K is strongly enhanced, reaching 0.36 μB/Co for the anatase sample and 0.68 μB/Co for the rutile one. The ferromagnetic Curie temperature of these samples is above 350 K.

    Optimized resource distribution for interactive TV applications

    This paper proposes a novel resource optimization scheme for cloud-based interactive television applications, which are increasingly believed to be the future of television broadcasting and media consumption in general. The varying distribution of groups of users and the need for on-the-fly media processing inherent to this type of application necessitate a mechanism to efficiently allocate resources at both the content and network levels. A heuristic solution is proposed in order to (a) generate end-to-end delay-bounded multicast trees for individual groups of users and (b) co-locate multiple multicast trees such that a minimum group quality metric can be satisfied. The performance of the proposed heuristic solution is evaluated in terms of the serving probability (i.e., the resource utilization efficiency) and the execution time of the resource allocation decision-making process. It is shown that improvements in the serving probability of up to 50%, in comparison with existing resource allocation schemes, and a reduction of several orders of magnitude in execution time, in comparison to the linear programming approach to solving the optimization problem, can be achieved.
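
    As a rough illustration of the delay-bound constraint at the heart of such multicast tree construction, the following sketch computes shortest-delay paths from a serving node and admits only the receivers that meet the bound. It is a simplified stand-in for one step of the paper's heuristic, not the heuristic itself; the graph, link delays, and bound are invented.

```python
import heapq

def shortest_delays(graph, src):
    """Dijkstra over per-link delays: graph[u] = [(v, delay_ms), ...]."""
    dist = {src: 0.0}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

def admit_group(graph, src, receivers, delay_bound_ms):
    """Return the receivers whose end-to-end delay satisfies the bound."""
    dist = shortest_delays(graph, src)
    return [r for r in receivers if dist.get(r, float("inf")) <= delay_bound_ms]

graph = {"cloud": [("a", 10), ("b", 30)], "a": [("c", 15)], "b": [("c", 5)]}
print(admit_group(graph, "cloud", ["a", "b", "c"], delay_bound_ms=28))  # ['a', 'c']
```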

    Content-adaptive feature-based CU size prediction for fast low-delay video encoding in HEVC

    Determining the best partitioning structure of a Coding Tree Unit (CTU) is one of the most time-consuming operations in HEVC encoding. Specifically, it is the evaluation of the quadtree hierarchy using Rate-Distortion (RD) optimization that has the most significant impact on the encoding time, especially in the case of High Definition (HD) and Ultra High Definition (UHD) videos. In order to expedite the encoding for low-delay applications, this paper proposes a Coding Unit (CU) size selection and encoding algorithm for inter-prediction in HEVC. To this end, it describes (i) two CU classification models based on Inter N×N mode motion features and RD cost thresholds to predict the CU split decision, (ii) an online training scheme for dynamic content adaptation, (iii) a motion vector reuse mechanism to expedite the motion estimation process, and finally introduces (iv) a computational complexity to coding efficiency trade-off process to enable flexible control of the algorithm. The experimental results reveal that the proposed algorithm achieves consistent average encoding time reductions ranging from 55%–58% and 57%–61%, with average Bjøntegaard Delta Bit Rate (BDBR) increases of 1.93%–2.26% and 2.14%–2.33%, compared to the HM 16.0 reference software for the low delay P and low delay B configurations, respectively, across a wide range of content types and bit rates.
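
    The split-versus-no-split prediction described in (i) can be thought of as a three-way gate placed in front of the full RD search. The sketch below is purely illustrative: the features, thresholds, and decision bands are hypothetical stand-ins, not the paper's trained classification models.

```python
def predict_cu_split(rd_cost, motion_mag, rd_threshold, motion_threshold):
    """Early CU split decision from RD-cost- and motion-style features.

    Low cost and low motion suggest a homogeneous CU: stop splitting and
    skip the child evaluations. High cost or strong motion suggest finer
    partitioning will pay off: split without the parent's full mode search.
    The band in between falls back to exhaustive RD optimization.
    """
    if rd_cost < 0.5 * rd_threshold and motion_mag < 0.5 * motion_threshold:
        return "no_split"
    if rd_cost > rd_threshold or motion_mag > motion_threshold:
        return "split"
    return "evaluate"

print(predict_cu_split(rd_cost=1200.0, motion_mag=0.4,
                       rd_threshold=5000.0, motion_threshold=2.0))  # no_split
```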

    A framework for automated anomaly detection in high frequency water-quality data from in situ sensors

    River water-quality monitoring is increasingly conducted using automated in situ sensors, enabling timelier identification of unexpected values. However, anomalies caused by technical issues confound these data, while the volume and velocity of data prevent manual detection. We present a framework for automated anomaly detection in high-frequency water-quality data from in situ sensors, using turbidity, conductivity, and river level data. After identifying end-user needs and defining anomalies, we ranked their importance and selected suitable detection methods. High priority anomalies included sudden isolated spikes and level shifts, most of which were classified correctly by regression-based methods such as autoregressive integrated moving average models. However, using other water-quality variables as covariates reduced performance due to complex relationships among variables. Classification of drift and of periods of anomalously low or high variability improved when we replaced anomalous measurements with forecasts, but this inflated false positive rates. Feature-based methods also performed well on high priority anomalies, but were less proficient at detecting lower priority anomalies, resulting in high false negative rates. Unlike regression-based methods, all feature-based methods produced low false positive rates and did not require training or optimization. Rule-based methods successfully detected impossible values and missing observations. Thus, we recommend using a combination of methods to improve anomaly detection performance whilst minimizing false detection rates. Furthermore, our framework emphasizes the importance of communication between end-users and analysts for optimal outcomes with respect to both detection performance and end-user needs. Our framework is applicable to other types of high-frequency time-series data and anomaly detection applications.
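
    The recommended combination of methods can be illustrated with a toy detector that layers a rule-based range check over a forecast-residual test. This is a hedged sketch only: the AR(1) forecast stands in for the ARIMA models used in the study, and the valid range, persistence parameter phi, and threshold k are invented.

```python
def detect_anomalies(series, valid_range=(0.0, 5000.0), phi=0.95, k=4.0):
    """Flag impossible values and sudden spikes in a sensor series.

    A reading is anomalous if it falls outside the physically valid range
    (rule-based) or deviates from a one-step AR(1) forecast by more than
    k running standard deviations of past residuals (regression-based).
    A flagged reading is replaced by its forecast before moving on, as the
    framework suggests, so one spike does not mask the next.
    """
    flags, resids, prev = [], [], None
    for i, x in enumerate(series):
        if not valid_range[0] <= x <= valid_range[1]:
            flags.append(i)                    # rule-based: impossible value
            continue                           # keep last good value as prev
        if prev is not None:
            resid = x - phi * prev             # one-step forecast residual
            if len(resids) > 5:
                mean = sum(resids) / len(resids)
                std = (sum((r - mean) ** 2 for r in resids) / len(resids)) ** 0.5
                if abs(resid - mean) > k * std:
                    flags.append(i)            # sudden isolated spike
                    prev = phi * prev          # substitute forecast for anomaly
                    continue
            resids.append(resid)
        prev = x
    return flags

data = [10, 11, 10, 12, 11, 10, 11, 250, 11, -3]
print(detect_anomalies(data))  # -> [7, 9]: the spike and the impossible value
```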

    ceylon: An R package for plotting the maps of Sri Lanka

    The rapid evolution of computer science, data science, and artificial intelligence has significantly transformed the use of data for decision-making, and data visualisation plays a critical role in any work that involves data. Visualising data on maps is common across many fields: it not only transforms raw data into visually comprehensible representations but also converts complex spatial information into a simple, understandable form. Locating the data files necessary for map creation, however, can be a challenging task; a centralised repository alleviates the task of finding shapefiles, allowing users to efficiently discover geographic data. The ceylon R package is designed to make simple feature data on Sri Lanka's administrative boundaries, as well as its rivers and streams, accessible to a diverse range of R users. With straightforward functionality, the package allows users to quickly plot and explore the administrative boundaries, rivers, and streams of Sri Lanka.

    Distributed Lag Nonlinear Modelling Approach to Identify Relationship between Climatic Factors and Dengue Incidence in Colombo District, Sri Lanka

    Dengue fever and its more severe and deadly complication, dengue hemorrhagic fever, are infectious mosquito-borne diseases. The rise in dengue fever has placed a heavy economic burden on the country. Climate variability is considered the major determinant of dengue transmission, and Sri Lanka has favorable climatic conditions for the development and transmission of dengue. Hence, the aim of this study is to estimate the effect of diverse climatic variables on dengue transmission while taking lag and nonlinear effects into account. Weekly data on dengue cases were obtained from January 2009 to September 2014. Temperature, precipitation, visibility, humidity, and wind speed were also recorded as weekly averages. Poisson regression combined with a distributed lag nonlinear model (DLNM) was used to quantify the impact of the climatic factors. The results of the DLNM revealed that mean temperatures of 25 °C–27 °C at lags of 1–8 weeks, precipitation above 70 mm at lags of 1–5 weeks and of 20–50 mm at lags of 10–20 weeks, humidity of 65%–80% at lags of 10–18 weeks, and visibility greater than 14 km had a positive impact on the occurrence of dengue incidence, while mean temperatures above 28 °C at lags of 6–25 weeks, maximum temperature at lags of 4–6 weeks, precipitation above 65 mm at lags of 15–20 weeks, humidity below 70% at lags of 4–9 weeks, visibility below 14 km, and high wind speed had a negative impact. These findings can aid the targeting of vector control interventions and the planning of dengue vaccine implementation.
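
    A DLNM captures effects that are nonlinear in both the covariate and the lag dimension. As a much-simplified illustration of the lag dimension alone, the sketch below fits a Poisson regression with linear weekly-lag terms on synthetic data; the data-generating process, lag depth, and coefficients are all invented, and the linear-in-lags design matrix is a stand-in for the cross-basis functions of a full DLNM.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
weeks = 300
# Synthetic weekly climate series: seasonal temperature and gamma rainfall.
temp = 26 + 2 * np.sin(np.arange(weeks) * 2 * np.pi / 52) + rng.normal(0, 0.5, weeks)
rain = rng.gamma(2.0, 20.0, weeks)

def lagged(x, max_lag):
    """Stack x shifted by 1..max_lag weeks into a design matrix."""
    return np.column_stack([np.roll(x, k) for k in range(1, max_lag + 1)])

max_lag = 8
# Drop the first max_lag rows, which are contaminated by np.roll wrap-around.
X = np.column_stack([lagged(temp, max_lag), lagged(rain, max_lag)])[max_lag:]

# Simulated dengue counts driven by temperature at lag 1 and rain at lag 4.
eta = 0.05 * temp[max_lag - 1:-1] + 0.004 * rain[max_lag - 4:-4]
y = rng.poisson(np.exp(1.0 + eta - eta.mean()))

fit = sm.GLM(y, sm.add_constant(X), family=sm.families.Poisson()).fit()
print(fit.params[:3])  # intercept and the first two temperature-lag effects
```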

    tsdataleaks: An R Package to Detect Potential Data Leaks in Forecasting Competitions

    Forecasting competitions are of increasing importance as a means to learn best practices and gain knowledge. Data leakage is one of the most common issues found in competitions, and data leaks can happen when the training data contains information about the test data. There are a variety of ways in which data leaks can occur with time series data, for example: (i) randomly chosen blocks of time series are concatenated to form a new time series; (ii) series are scale-shifted; (iii) patterns repeat within a time series; or (iv) white noise is added to the original time series to form a new one. This work introduces a novel tool to detect these data leaks. The tsdataleaks package provides a simple and computationally efficient algorithm to exploit data leaks in time series data. This paper demonstrates the package design and its power to detect data leakages with an application to forecasting competition data.
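
    The core matching idea behind such a tool can be sketched as a sliding-window search for near-perfect correlations, which survive additive and multiplicative scale shifts. The Python code below is a hypothetical illustration, not the tsdataleaks implementation (an R package); the function name, window length h, and tolerance are all invented.

```python
import numpy as np

def find_leak(candidate, pool, h, tol=1e-6):
    """Return (series_index, offset) where the last h values of `candidate`
    correlate almost perfectly with a window of a series in `pool`.

    A Pearson correlation of ~1 is invariant to additive and multiplicative
    scale shifts, so rescaled or concatenated blocks are still caught.
    """
    tail = np.asarray(candidate[-h:], dtype=float)
    if np.std(tail) == 0:
        return None
    for si, series in enumerate(pool):
        s = np.asarray(series, dtype=float)
        for off in range(len(s) - h + 1):
            win = s[off:off + h]
            if np.std(win) == 0:
                continue
            if abs(np.corrcoef(tail, win)[0, 1] - 1.0) < tol:
                return si, off
    return None

rng = np.random.default_rng(0)
train = np.cumsum(rng.normal(size=120))   # a series from the training pool
test = 3.0 * train[40:52] + 7.0           # a scale-shifted block reappears
print(find_leak(test, [train], h=12))     # -> (0, 40)
```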