334 research outputs found

    Universal Dependencies Parsing for Colloquial Singaporean English

    Full text link
    Singlish can be interesting to the ACL community both linguistically, as a major creole based on English, and computationally, for information extraction and sentiment analysis of regional social media. We investigate dependency parsing of Singlish by constructing a dependency treebank under the Universal Dependencies scheme, and then training a neural network model that integrates English syntactic knowledge into a state-of-the-art parser trained on the Singlish treebank. Results show that English knowledge can lead to a 25% relative error reduction, resulting in a parser with 84.47% accuracy. To the best of our knowledge, we are the first to use neural stacking to improve cross-lingual dependency parsing for low-resource languages. We make both our annotation and parser available for further research. Comment: Accepted by ACL 2017
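
    In outline, neural stacking feeds the hidden representations of an encoder trained on English into the input of an encoder trained on the Singlish treebank, so the target parser can exploit English syntax while adapting to Singlish. Below is a minimal PyTorch sketch of that stacking connection; the class name, layer sizes, and BiLSTM choice are illustrative assumptions, not the paper's actual parser architecture.

```python
import torch
import torch.nn as nn

class StackedEncoder(nn.Module):
    """Singlish encoder stacked on a frozen English-trained encoder."""
    def __init__(self, vocab_size, emb_dim=100, hidden=200):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Base BiLSTM, pretrained on an English treebank and then frozen.
        self.en_encoder = nn.LSTM(emb_dim, hidden, batch_first=True,
                                  bidirectional=True)
        for p in self.en_encoder.parameters():
            p.requires_grad = False
        # The target-side BiLSTM sees each word embedding concatenated
        # with the English encoder's hidden state: the stacking connection.
        self.sg_encoder = nn.LSTM(emb_dim + 2 * hidden, hidden,
                                  batch_first=True, bidirectional=True)

    def forward(self, tokens):                   # tokens: (B, T) word ids
        x = self.embed(tokens)                   # (B, T, emb_dim)
        en_feats, _ = self.en_encoder(x)         # (B, T, 2*hidden)
        out, _ = self.sg_encoder(torch.cat([x, en_feats], dim=-1))
        return out       # (B, T, 2*hidden), fed to arc/label scorers
```

    Only the Singlish-side parameters are updated during target-language training; the frozen English encoder simply supplies syntactic features.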

    Unexpected Enhancement of Three-Dimensional Low-Energy Spin Correlations in the Quasi-Two-Dimensional Fe$_{1+y}$Te$_{1-x}$Se$_{x}$ System at High Temperature

    Full text link
    We report inelastic neutron scattering measurements of low-energy ($\hbar\omega < 10$ meV) magnetic excitations in the "11" system Fe$_{1+y}$Te$_{1-x}$Se$_{x}$. The spin correlations are two-dimensional (2D) in the superconducting samples at low temperature, but appear much more three-dimensional when the temperature rises well above $T_c \sim 15$ K, with a clear increase of the (dynamic) spin correlation length perpendicular to the Fe planes. The spontaneous change of dynamic spin correlations from 2D to 3D on warming is unexpected and cannot be naturally explained when only the spin degree of freedom is considered. Our results suggest that the low-temperature physics in the "11" system, in particular the evolution of low-energy spin excitations towards better satisfying the nesting condition for mediating superconducting pairing, is driven by changes in orbital correlations.
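
    For reference, the inter-plane (dynamic) spin correlation length is conventionally extracted from the width of constant-energy scans along the out-of-plane direction. A common parameterization, though not necessarily the exact fit used in this work, is a Lorentzian, \[ S(Q_\perp, \omega) \propto \frac{\kappa_c}{(Q_\perp - Q_0)^2 + \kappa_c^2}, \qquad \xi_c = \frac{1}{\kappa_c}, \] so the reported 2D-to-3D crossover corresponds to the out-of-plane width $\kappa_c$ narrowing, i.e. $\xi_c$ growing, on warming.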

    Substitution of Ni for Fe in superconducting Fe$_{0.98}$Te$_{0.5}$Se$_{0.5}$ depresses the normal-state conductivity but not the magnetic spectral weight

    Full text link
    We have performed systematic resistivity and inelastic neutron scattering measurements on Fe$_{0.98-z}$Ni$_{z}$Te$_{0.5}$Se$_{0.5}$ samples to study the impact of Ni substitution on the transport properties and the low-energy ($\le 12$ meV) magnetic excitations. We find that, with increasing Ni doping, both the conductivity and superconductivity are gradually suppressed; in contrast, the low-energy magnetic spectral weight changes little. Comparing with the impact of Co and Cu substitution, we find that the effects on conductivity and superconductivity for the same degree of substitution grow systematically as the atomic number of the substituent deviates from that of Fe. The impact of the substituents as scattering centers appears to be greater than any contribution to the carrier concentration. The fact that the low-energy magnetic spectral weight is not reduced by increased electron scattering indicates that the existence of antiferromagnetic correlations does not depend on electronic states close to the Fermi energy. Comment: 6 pages, 5 figures
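
    For context, the measured magnetic intensity and the underlying spectral weight are conventionally related through the Bose factor, \[ S(\mathbf{Q}, \omega) = \frac{\chi''(\mathbf{Q}, \omega)}{1 - e^{-\hbar\omega/k_B T}}, \] and the low-energy spectral weight compared across dopings is the integral of $\chi''(\mathbf{Q}, \omega)$ over the measured $\mathbf{Q}$ window and the $\hbar\omega \le 12$ meV energy range. These are standard relations in neutron scattering analysis, not details specific to this paper.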

    Coupling of spin and orbital excitations in the iron-based superconductor FeSe$_{0.5}$Te$_{0.5}$

    Full text link
    We present a combined analysis of neutron scattering and photoemission measurements on superconducting FeSe$_{0.5}$Te$_{0.5}$. The low-energy magnetic excitations disperse only in the direction transverse to the characteristic wave vector (1/2, 0, 0), whereas the electronic Fermi surface near (1/2, 0, 0) appears to consist of four incommensurate pockets. While the spin resonance occurs at an incommensurate wave vector compatible with nesting, neither spin-wave nor Fermi-surface-nesting models can describe the magnetic dispersion. We propose that a coupling of spin and orbital correlations is key to explaining this behavior. If correct, it follows that these nematic fluctuations are involved in the resonance and could be relevant to the pairing mechanism. Comment: 4 pages, 4 figures; accepted version

    Impact of artificial intelligence adoption on online returns policies

    Get PDF
    The shift to e-commerce has led to an astonishing increase in online sales for retailers. However, the number of returns on online purchases is also increasing and has a profound impact on retailers' operations and profits. Hence, retailers need to balance minimizing product returns against allowing them. This study examines an offline showroom versus an artificial intelligence (AI) online virtual-reality webroom, and how these settings affect customers' purchase decisions and retailers' return decisions. A case study is used to illustrate the AI application. Our results show that adopting artificial intelligence helps sellers set better returns policies, maximize the value recovered from reselling returns, and reduce the risks of leftovers and shortages. Our findings unlock the potential of artificial intelligence applications in retail operations and should interest practitioners and researchers in online retailing, especially those concerned with online returns policies and the personalized consumer service experience.
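
    As a toy illustration of the leftover/shortage trade-off the abstract alludes to, consider a newsvendor-style calculation in which a fraction of sales is refunded and the returned units are resold. The model, parameter values, and names below are illustrative assumptions, not the paper's case-study model.

```python
import numpy as np

rng = np.random.default_rng(0)

price, cost, salvage = 60.0, 35.0, 10.0   # sell / purchase / clearance
return_rate = 0.2                          # share of sales refunded
resale_value = 0.8 * price                 # resale price of a return

def expected_profit(q, demand):
    """Mean profit of ordering q units under sampled demand."""
    sales = np.minimum(q, demand)
    leftovers = q - sales                  # unsold stock, salvaged
    returns = return_rate * sales          # refunded at full price
    revenue = price * sales - price * returns
    resale = resale_value * returns        # returned units resold
    return np.mean(revenue + resale + salvage * leftovers - cost * q)

demand = rng.normal(100, 30, size=50_000).clip(min=0)
qs = np.arange(40, 200)
best_q = qs[np.argmax([expected_profit(q, demand) for q in qs])]
print("profit-maximizing order quantity:", best_q)
```

    Tighter demand and return forecasts (for example, from richer AI webroom signals) shrink the spread of the demand distribution, which in this toy model pulls the optimal order quantity toward mean demand and reduces both leftover and shortage risk.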

    In-Phase Bias Modulation Mode of Scanning Ion Conductance Microscopy With Capacitance Compensation

    Full text link

    Uncertainty Estimation by Fisher Information-based Evidential Deep Learning

    Full text link
    Uncertainty estimation is a key factor in making deep learning reliable in practical applications. Recently proposed evidential neural networks explicitly account for different uncertainties by treating the network's outputs as evidence that parameterizes a Dirichlet distribution, and they achieve impressive performance in uncertainty estimation. However, for samples with high data uncertainty that are annotated with one-hot labels, the evidence-learning process for the mislabeled classes is over-penalized and remains hindered. To address this problem, we propose a novel method, Fisher Information-based Evidential Deep Learning ($\mathcal{I}$-EDL). In particular, we introduce the Fisher Information Matrix (FIM) to measure the informativeness of the evidence carried by each sample, according to which we dynamically reweight the objective loss terms to make the network focus more on the representation learning of uncertain classes. The generalization ability of our network is further improved by optimizing the PAC-Bayesian bound. As demonstrated empirically, our proposed method consistently outperforms traditional EDL-related algorithms on multiple uncertainty estimation tasks, especially in the more challenging few-shot classification settings.
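
    A minimal PyTorch sketch of an evidential loss with an information-based reweighting is shown below. The trigamma-based weight stands in for the full Dirichlet Fisher Information Matrix, and the PAC-Bayesian term is omitted, so this is a deliberate simplification rather than the paper's exact $\mathcal{I}$-EDL objective.

```python
import torch
import torch.nn.functional as F

def edl_loss(logits, labels, num_classes):
    evidence = F.softplus(logits)             # non-negative evidence
    alpha = evidence + 1.0                    # Dirichlet parameters
    strength = alpha.sum(-1, keepdim=True)    # total evidence S
    prob = alpha / strength                   # expected class probabilities
    y = F.one_hot(labels, num_classes).float()
    # Standard EDL expected squared error under the Dirichlet.
    err = (y - prob) ** 2
    var = prob * (1.0 - prob) / (strength + 1.0)
    per_class = err + var
    # trigamma(alpha) shrinks as evidence grows, so classes with little
    # accumulated evidence get larger weight and keep receiving gradient.
    weight = torch.polygamma(1, alpha)
    weight = weight / weight.sum(-1, keepdim=True)
    return (weight.detach() * per_class).sum(-1).mean()
```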

    Sim-T: Simplify the Transformer Network by Multiplexing Technique for Speech Recognition

    Full text link
    In recent years, a great deal of attention has been paid to the Transformer network for speech recognition tasks due to its excellent model performance. However, the Transformer network involves heavy computation and a large number of parameters, causing serious deployment problems on devices with limited computational resources or storage memory. In this paper, a new lightweight model called Sim-T is proposed to expand the generality of the Transformer model. With the help of a newly developed multiplexing technique, Sim-T can efficiently compress the model with a negligible sacrifice in performance. More precisely, the proposed technique comprises two parts: module weight multiplexing and attention score multiplexing. Moreover, a novel decoder structure is proposed to facilitate attention score multiplexing. Extensive experiments have been conducted to validate the effectiveness of Sim-T. On the Aishell-1 dataset, when the proposed Sim-T has 48% fewer parameters than the baseline Transformer, a 0.4% CER improvement is obtained. Alternatively, a 69% parameter reduction can be achieved if Sim-T matches the performance of the baseline Transformer. On the HKUST and WSJ eval92 datasets, CER and WER improve by 0.3% and 0.2%, respectively, when Sim-T has 40% fewer parameters than the baseline Transformer.
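
    A minimal sketch of module weight multiplexing: a small set of physical Transformer layers is reused across a deeper stack of virtual layers, cutting the parameter count roughly by the reuse factor. Names and sizes below are hypothetical, and the attention score multiplexing with its modified decoder is omitted.

```python
import torch.nn as nn

class MultiplexedEncoder(nn.Module):
    """Encoder whose weights are shared across several virtual layers."""
    def __init__(self, d_model=256, nhead=4, num_virtual_layers=12,
                 reuse_factor=3):
        super().__init__()
        # Only num_virtual_layers / reuse_factor distinct layers exist.
        self.layers = nn.ModuleList(
            nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
            for _ in range(num_virtual_layers // reuse_factor))
        self.reuse_factor = reuse_factor

    def forward(self, x):                       # x: (B, T, d_model)
        for layer in self.layers:
            for _ in range(self.reuse_factor):  # reuse the same weights
                x = layer(x)
        return x
```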