
    Progressive Label Distillation: Learning Input-Efficient Deep Neural Networks

    Much of the focus in the area of knowledge distillation has been on distilling knowledge from a larger teacher network to a smaller student network. However, there has been little research on how the concept of distillation can be leveraged to distill the knowledge encapsulated in the training data itself into a reduced form. In this study, we explore the concept of progressive label distillation, where we leverage a series of teacher-student network pairs to progressively generate distilled training data for learning deep neural networks with greatly reduced input dimensions. To investigate the efficacy of the proposed progressive label distillation approach, we experimented with learning a deep limited-vocabulary speech recognition network based on generated 500ms input utterances distilled progressively from 1000ms source training data, and demonstrated a significant increase in test accuracy of almost 78% compared to direct learning.

    Comment: 9 pages
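The teacher-student step described above can be sketched with a toy example: a teacher is trained on full-length inputs, its soft predictions become the distilled labels, and a student is then trained on truncated inputs against those soft labels. This is a minimal numpy sketch under illustrative assumptions; the linear softmax model, synthetic data, and single distillation stage are stand-ins, not the deep speech network or multi-stage pipeline from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_softmax(X, Y, epochs=300, lr=0.5):
    """Fit a linear softmax classifier to (possibly soft) label targets Y."""
    W = np.zeros((X.shape[1], Y.shape[1]))
    for _ in range(epochs):
        logits = X @ W
        P = np.exp(logits - logits.max(axis=1, keepdims=True))
        P /= P.sum(axis=1, keepdims=True)
        W -= lr * X.T @ (P - Y) / len(X)   # gradient of cross-entropy loss
    return W

def soft_labels(X, W):
    """Softmax probabilities of a trained classifier, used as distilled targets."""
    logits = X @ W
    P = np.exp(logits - logits.max(axis=1, keepdims=True))
    return P / P.sum(axis=1, keepdims=True)

# Toy "utterances": 1000-dim inputs whose class depends only on the first half.
X = rng.normal(size=(400, 1000))
y = (X[:, :500].mean(axis=1) > 0).astype(int)
Y = np.eye(2)[y]                            # hard one-hot labels

# Stage 1: teacher sees the full 1000-dim input.
W_teacher = train_softmax(X, Y)

# Stage 2: student sees only the first 500 dims and trains on the
# teacher's soft predictions (the distilled labels), not the hard labels.
X_half = X[:, :500]
Y_soft = soft_labels(X, W_teacher)
W_student = train_softmax(X_half, Y_soft)

acc = (soft_labels(X_half, W_student).argmax(1) == y).mean()
```

In the paper this step is repeated: each student becomes the teacher for the next, shorter-input student, progressively shrinking the input dimension.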

    Sleep pattern disruption of flight attendants operating on the Asia–Pacific route

    Jet lag is a common issue for flight attendants on international flights, as they cross several time zones back and forth while their sleep patterns are disrupted by the legally required rest times between flights, which normally take place at different locations. This research aimed to investigate the sleep quality of a sample of flight attendants operating between New Zealand and Asia. Twenty flight attendants were surveyed. The research found that flight attendants typically took a nap immediately after arriving in New Zealand, reporting a sound sleep of about 6 hours. After this nap, however, they had problems falling asleep on subsequent nights. Some flight attendants then tried to adapt to local light conditions, while others preferred to keep the sleep patterns they had back home; the two groups reported different trends in sleep quality.

    Rescattering effects in hadron-nucleus and heavy-ion collisions

    We review the extension of the factorization formalism of perturbative QCD to {\it coherent} soft rescattering associated with hard scattering in high-energy nuclear collisions. We emphasize the ability to quantify high-order corrections and the predictive power of the factorization approach in terms of universal nonperturbative matrix elements. Although coherent rescattering effects are power-suppressed by the hard scales of the scattering, they are enhanced by the nuclear size and could play an important role in understanding the novel nuclear dependence observed in high-energy nuclear collisions.

    Comment: 8 pages, 13 figures, to be published in the Proceedings of the 1st International Conference on Hard and Electromagnetic Probes of High Energy Nuclear Collisions (Hard Probe 2004), Ericeira, Portugal, Nov. 4-10, 200
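The interplay of power suppression and nuclear enhancement mentioned above can be written schematically. This is the standard counting for nuclear-enhanced power corrections, not a formula quoted from the text; the coefficient $c$ lumps together the relevant nonperturbative matrix elements:

```latex
\sigma_{hA} \;\simeq\; \sigma^{\rm LT}
\left[\, 1 \;+\; \frac{c\, A^{1/3}}{Q^2}
\;+\; \mathcal{O}\!\left(\frac{A^{2/3}}{Q^4}\right) \right],
```

where $Q$ is the hard scale of the scattering and the $A^{1/3}$ factor reflects the medium length traversed: each additional coherent rescattering costs a power of $1/Q^2$ but gains a factor of the nuclear size, so the corrections remain sizable at moderate $Q$.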

    Generalized Non-orthogonal Joint Diagonalization with LU Decomposition and Successive Rotations

    Non-orthogonal joint diagonalization (NJD) free of prewhitening has been widely studied in the context of blind source separation (BSS), array signal processing, and related fields. However, NJD is used to retrieve the jointly diagonalizable structure of a single set of target matrices, which are mostly formulated from a single dataset, and is thus insufficient for handling multiple datasets with inter-set dependences, a scenario often encountered in joint BSS (J-BSS) applications. We therefore present a generalized NJD (GNJD) algorithm that simultaneously performs asymmetric NJD on multiple sets of target matrices with mutually linked loading matrices, using LU decomposition and successive rotations, to enable J-BSS over multiple datasets while indicating and exploiting their mutual dependences. Experiments with synthetic and real-world datasets are provided to illustrate the performance of the proposed algorithm.

    Comment: Signal Processing, IEEE Transactions on (Volume: 63, Issue: 5
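The jointly diagonalizable structure that NJD seeks can be illustrated directly: target matrices built as $A D_k A^T$ with diagonal $D_k$ are all diagonalized by the demixing matrix $B = A^{-1}$, and the usual NJD objective is the residual off-diagonal energy. This is a minimal numpy sketch of the problem setup and cost function only, not of the GNJD algorithm (LU factors, successive rotations, or the multi-set coupling) itself:

```python
import numpy as np

rng = np.random.default_rng(1)

def off_cost(B, Cs):
    """Sum of squared off-diagonal entries of B @ C @ B.T over all targets --
    the standard NJD objective, zero iff B jointly diagonalizes every C."""
    total = 0.0
    for C in Cs:
        M = B @ C @ B.T
        total += np.sum(M**2) - np.sum(np.diag(M)**2)
    return total

n, K = 4, 6
A = rng.normal(size=(n, n))                  # non-orthogonal mixing matrix
Cs = [A @ np.diag(rng.normal(size=n)) @ A.T  # targets share the A D_k A^T structure
      for _ in range(K)]

B_true = np.linalg.inv(A)                    # the demixing matrix NJD searches for
low = off_cost(B_true, Cs)                   # ~0, up to round-off
high = off_cost(np.eye(n), Cs)               # large: identity does not diagonalize
```

Because $A$ is non-orthogonal, no prewhitening-based (orthogonal) method can reach the zero of this cost; that is the motivation for NJD, and GNJD extends the search to several linked matrix sets at once.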