
    The 2nd Generation VLTI path to performance

    The upgrade of the VLTI infrastructure for the 2nd generation instruments is now complete with the transformation of the laboratory and the installation of star separators on both the 1.8-m Auxiliary Telescopes (ATs) and the 8-m Unit Telescopes (UTs). The Gravity fringe tracker has had a full semester of commissioning on the ATs and a first look at the UTs. The CIAO infrared wavefront sensor is about to demonstrate its performance relative to the visible wavefront sensor MACAO. First astrometric measurements on the ATs and astrometric qualification of the UTs are ongoing. Now is a good time to revisit the performance roadmap for VLTI that was initiated in 2014, which aimed at coherently driving the developments of the interferometer, and especially its performance, in support of the new generation of instruments: Gravity and MATISSE.
    Comment: 9 pages, 6 figures, 1 table, Proc. SPIE 201

    Ontology Domain Modeling Support for Multilingual Services in e-commerce: MKBEEM

    One of the main objectives of a truly user-friendly Information Society is to focus on advanced human language technologies enabling cost-effective interchange across languages and cultures, and more natural interfaces to digital services. The recently launched IST-1999-10589 project MKBEEM (Multilingual Knowledge Based European Electronic Marketplace, 1st Feb. 2000 - 1st Aug. 2002) is a step in that direction: the work addresses written language technologies and their use in the key sector of global business and electronic commerce. In particular, MKBEEM will focus on adding multilinguality to all stages of the information cycle, including multilingual content generation and maintenance, automated translation and interpretation, and enhancing the natural interactivity and usability of the service with unconstrained language input. On the knowledge engineering side, the MKBEEM ontologies will provide a consensual representation of the electronic commerce field in three typical domains (Tourism, Mail order, Retailers), allowing exchanges independent of the language of the end user, the service, or the content provider. Ontologies will be used for classifying and indexing catalogues, for filtering the user's queries, for facilitating multilingual man-machine dialogues between user and software agent, and for inferring information that is relevant to the user's request. This paper concentrates on ontology issues, while the human language processing approaches used will be presented in detail in later papers.

    Manufacturing and Installation of the Compound Cryogenic Distribution Line for the Large Hadron Collider

    The Large Hadron Collider (LHC) [1] currently under construction at CERN will make use of superconducting magnets operating in superfluid helium below 2 K. A compound cryogenic distribution line (QRL) will feed the local elementary cooling loops in the cryomagnet strings with helium at different temperatures and pressures. Low heat inleak at all temperature levels is essential for the overall LHC cryogenic performance. Following competitive tendering, CERN awarded the contract for the series line to Air Liquide (France) in 2001. This paper reviews the main features of the technical specification and reports the project status. The basic choices and achievements for the industrialization phase of the series production are also presented, as well as the installation issues and status.

    PIONIER: a visitor instrument for the VLTI

    PIONIER is a 4-telescope visitor instrument for the VLTI, planned to see its first fringes in 2010. It combines four ATs or four UTs using a pairwise ABCD integrated optics combiner that can also be used in scanning mode. It provides low spectral resolution in the H and K bands. PIONIER is designed for imaging, with a specific emphasis on fast fringe recording to allow closure phases and visibilities to be measured precisely. In this work we provide a detailed description of the instrument and present its updated status.
    Comment: Proceedings of SPIE conference Optical and Infrared Interferometry II (Conference 7734), San Diego 201
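The pairwise ABCD scheme mentioned above samples each fringe at four phase quadratures (0, 90, 180, 270 degrees), from which phase and fringe contrast follow directly. A minimal sketch of the standard ABCD estimator on a synthetic fringe (illustrative only, not the PIONIER pipeline; all names are assumptions):

```python
import math

def abcd_estimate(a, b, c, d):
    """Estimate fringe phase and contrast from four outputs
    sampled at 0, 90, 180 and 270 degrees of fringe phase."""
    x = a - c                      # in-phase component
    y = d - b                      # quadrature component
    phase = math.atan2(y, x)
    total = a + b + c + d          # total flux
    contrast = 2.0 * math.hypot(x, y) / total
    return phase, contrast

# Synthetic fringe with known phase and contrast
true_phase, v, flux = 0.7, 0.5, 1.0
a0 = flux * (1 + v * math.cos(true_phase))
b0 = flux * (1 + v * math.cos(true_phase + math.pi / 2))
c0 = flux * (1 + v * math.cos(true_phase + math.pi))
d0 = flux * (1 + v * math.cos(true_phase + 3 * math.pi / 2))
phase, vis = abcd_estimate(a0, b0, c0, d0)  # recovers 0.7 and 0.5
```

The estimator recovers the injected phase and contrast exactly on noiseless data; in practice throughput and phase-shift imbalances between the four outputs must be calibrated first.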

    Increasing the imaging capabilities of the VLTI using integrated optics

    Several scientific topics linked to the observation of extended structures around astrophysical sources (dust tori around AGN, disks around young stars, envelopes around AGBs) require imaging capability at milli-arcsecond spatial resolution. The current VLTI instruments, AMBER and MIDI, will provide in the coming months the required high angular resolution, yet without actual imaging. As a rule of thumb, the image quality accessible with an optical interferometer is directly related to the number of telescopes used simultaneously: the more apertures, the better and the faster the reconstruction of the image. We propose an instrument concept to achieve interferometric combination of N telescopes (4 ≤ N ≤ 8) thanks to planar optics technology: 4 x 8-m telescopes in the short term and/or 8 x 1.8-m telescopes in the long term. The foreseen image reconstruction quality in the visible and/or in the near infrared will be equivalent to that achieved with millimeter radio interferometers, and the achievable spatial resolution will be better than that foreseen with ALMA. This instrument would be able to acquire 1 mas resolution images routinely. A sensitivity of magnitude 13 to 20 in spectral ranges from 0.6 to 2.5 μm is expected, depending on the choice of the phase referencing guide source. High dynamic range, even on faint objects, is achievable thanks to the high accuracy provided by integrated optics for visibility amplitude and phase measurements. Based on the recent validations of integrated optics presented here, an imaging instrument concept can be proposed. The results obtained using the VLTI facilities demonstrate the potential of the proposed technique.
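The rule of thumb above can be made concrete with standard interferometry counting: an array of N telescopes measures N(N-1)/2 baselines simultaneously and yields (N-1)(N-2)/2 independent closure phases, so the information available per snapshot grows quadratically with N. A short sketch of that scaling (textbook counting, not code from the paper):

```python
def baselines(n):
    """Number of simultaneous baselines (telescope pairs) for n apertures."""
    return n * (n - 1) // 2

def independent_closure_phases(n):
    """Number of independent closure phases for n apertures."""
    return (n - 1) * (n - 2) // 2

# Scaling over the 4 <= N <= 8 range considered in the abstract
for n in (4, 6, 8):
    print(n, baselines(n), independent_closure_phases(n))
```

Going from 4 to 8 telescopes raises the simultaneous baseline count from 6 to 28, which is why combining more apertures speeds up image reconstruction so markedly.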

    VLTI status update: a decade of operations and beyond

    We present the latest update of the European Southern Observatory's Very Large Telescope Interferometer (VLTI). The operations of VLTI have greatly improved in the past years: reduced execution time; a broader offering of telescope configurations; improvements to the AMBER limiting magnitudes; study and control of polarization effects in single-mode fibres; fringe tracking real-time data; etc. We present some of these improvements and also quantify the operational gains using a performance metric. We take the opportunity of the first decade of operations to reflect on the VLTI community, which we analyze quantitatively and qualitatively. Finally, we briefly present the preparatory work for the arrival of the second generation instruments GRAVITY and MATISSE.
    Comment: 10 pages, 7 figures, Proceedings of the SPIE, 9146-1

    Single-Round Proofs of Quantumness from Knowledge Assumptions

    A proof of quantumness is an efficiently verifiable interactive test that an efficient quantum computer can pass, but all efficient classical computers cannot (under some cryptographic assumption). Such protocols play a crucial role in the certification of quantum devices. Existing single-round protocols (like asking the quantum computer to factor a large number) require large quantum circuits, whereas multi-round ones use smaller circuits but require experimentally challenging mid-circuit measurements. As such, current proofs of quantumness are out of reach for near-term devices. In this work, we construct efficient single-round proofs of quantumness based on existing knowledge assumptions. While knowledge assumptions have not been previously considered in this context, we show that they provide a natural basis for separating classical and quantum computation. Specifically, we show that multi-round protocols based on Decisional Diffie-Hellman (DDH) or Learning With Errors (LWE) can be "compiled" into single-round protocols using a knowledge-of-exponent assumption or knowledge-of-lattice-point assumption, respectively. We also prove an adaptive hardcore-bit statement for a family of claw-free functions based on DDH, which might be of independent interest. Previous approaches to constructing single-round protocols relied on the random oracle model and thus incurred the overhead associated with instantiating the oracle with a cryptographic hash function. In contrast, our protocols have the same resource requirements as their multi-round counterparts without necessitating mid-circuit measurements, making them, arguably, the most efficient single-round proofs of quantumness to date. Our work also helps in understanding the interplay between black-box/white-box reductions and cryptographic assumptions in the design of proofs of quantumness.
    Comment: 51 page
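For intuition, the factoring example mentioned above shows the defining asymmetry of a proof of quantumness: the verifier's check is cheap even though producing a passing answer is believed classically hard at scale. A toy sketch of the verifier side (illustrative only; real challenges are thousands of bits, far beyond the toy number here):

```python
def verify_factoring(n, p, q):
    """Accept iff (p, q) is a nontrivial factorization of the challenge n.
    Checking takes one multiplication, regardless of how hard finding
    p and q was for the prover."""
    return 1 < p < n and 1 < q < n and p * q == n

challenge = 15          # toy challenge; a real one would be cryptographically large
ok = verify_factoring(challenge, 3, 5)       # honest prover's answer: accepted
bad = verify_factoring(challenge, 1, 15)     # trivial factorization: rejected
```

A quantum prover running Shor's algorithm passes this single-round test with a large circuit; the paper's point is to get the same one-round structure from much smaller circuits via knowledge assumptions.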

    Single-Round Proofs of Quantumness from Knowledge Assumptions

    A proof of quantumness is an efficiently verifiable interactive test that an efficient quantum computer can pass, but all efficient classical computers cannot (under some cryptographic assumption). Such protocols play a crucial role in the certification of quantum devices. Existing single-round protocols based solely on a cryptographic hardness assumption (like asking the quantum computer to factor a large number) require large quantum circuits, whereas multi-round ones use smaller circuits but require experimentally challenging mid-circuit measurements. In this work, we construct efficient single-round proofs of quantumness based on existing knowledge assumptions. While knowledge assumptions have not been previously considered in this context, we show that they provide a natural basis for separating classical and quantum computation. Our work also helps in understanding the interplay between black-box/white-box reductions and cryptographic assumptions in the design of proofs of quantumness. Specifically, we show that multi-round protocols based on Decisional Diffie-Hellman (DDH) or Learning With Errors (LWE) can be "compiled" into single-round protocols using a knowledge-of-exponent assumption [Bitansky et al., 2012] or knowledge-of-lattice-point assumption [Loftus et al., 2012], respectively. We also prove an adaptive hardcore-bit statement for a family of claw-free functions based on DDH, which might be of independent interest.

    Evaluation of the Patient Acceptable Symptom State in a pooled analysis of two multicentre, randomised, double-blind, placebo-controlled studies evaluating lumiracoxib and celecoxib in patients with osteoarthritis

    Patient Acceptable Symptom State (PASS) is an absolute threshold proposed for symptomatic variables in osteoarthritis (OA) to determine the point beyond which patients consider themselves well and, as such, are satisfied with treatment. Two large previously reported studies of knee OA have shown that both lumiracoxib and celecoxib were superior to placebo in terms of conventional outcome measures. To assess the clinical relevance of these results from the patient's perspective, the pooled data from these two studies were analysed with respect to the PASS. In total, 3,235 patients were included in two multicentre, randomised, double-blind studies of identical design. Patients were randomly assigned to receive lumiracoxib 100 mg once daily (n = 811), lumiracoxib 100 mg once daily with an initial dose of lumiracoxib 200 mg once daily for the first 2 weeks (100 mg once daily with initial dose [n = 805]), celecoxib 200 mg once daily (n = 813), or placebo (n = 806) for 13 weeks. Treatments were compared with respect to the PASS criteria (for OA pain, patient's global assessment of disease activity, and the Western Ontario and McMaster Universities Osteoarthritis Index Likert version 3.1 [WOMAC™ LK 3.1] Function [difficulty in performing daily activities] subscale score). At week 13, 43.3%, 45.3%, and 42.2% of patients in the lumiracoxib 100 mg once daily, lumiracoxib 100 mg once daily with initial dose, and celecoxib 200 mg once daily groups, respectively, considered their current states satisfactory, versus 35.5% in the placebo group. Similar results were observed for patient's global assessment of disease activity and the WOMAC™ LK 3.1 Function subscale score. This post hoc analysis suggests that the statistical significance of the results observed with lumiracoxib or celecoxib compared with placebo using conventional outcome variables is complemented by clinical relevance to the patient. Trial registration numbers: NCT00366938 and NCT00367315.

    Exploring the Local Landscape in the Triangle Network

    Characterizing the set of distributions that can be realized in the triangle network is a notoriously difficult problem. In this work, we investigate inner approximations of the set of local (classical) distributions of the triangle network. A quantum distribution that appears to be nonlocal is the Elegant Joint Measurement (EJM) [Entropy. 2019; 21(3):325], which motivates us to study distributions having the same symmetries as the EJM. We compare analytical and neural-network-based inner approximations and find a remarkable agreement between the two methods. Using neural network tools, we also conjecture network Bell inequalities that give a trade-off between the levels of correlation and symmetry that a local distribution may feature. Our results considerably strengthen the conjecture that the EJM is nonlocal.
    Comment: 8 + 19 pages, 19 figure