
    Exact and Approximate Determinization of Discounted-Sum Automata

    A discounted-sum automaton (NDA) is a nondeterministic finite automaton with edge weights, valuing a run by the discounted sum of visited edge weights. More precisely, the weight in the i-th position of the run is divided by $\lambda^i$, where the discount factor $\lambda$ is a fixed rational number greater than 1. The value of a word is the minimal value of the automaton runs on it. Discounted summation is a common and useful measuring scheme, especially for infinite sequences, reflecting the assumption that earlier weights are more important than later weights. Unfortunately, determinization of NDAs, which is often essential in formal verification, is, in general, not possible. We provide positive news, showing that every NDA with an integral discount factor is determinizable. We complete the picture by proving that the integers characterize exactly the discount factors that guarantee determinizability: for every nonintegral rational discount factor $\lambda$, there is a nondeterminizable $\lambda$-NDA. We also prove that the class of NDAs with integral discount factors enjoys closure under the algebraic operations min, max, addition, and subtraction, which is not the case for general NDAs nor for deterministic NDAs. For general NDAs, we look into approximate determinization, which is always possible as the influence of a word's suffix decays. We show that the naive approach, of unfolding the automaton computations up to a sufficient level, is doubly exponential in the discount factor. We provide an alternative construction for approximate determinization, which is singly exponential in the discount factor, in the precision, and in the number of states. We also prove matching lower bounds, showing that the exponential dependency on each of these three parameters cannot be avoided. All our results hold equally for automata over finite words and for automata over infinite words.
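    As a concrete illustration of the valuation scheme just described (not of the paper's determinization constructions), the sketch below computes the discounted sum of a single run and the value of a word as the minimum over all runs of a toy NDA. The transition relation, the 0-based position indexing, and all names are illustrative assumptions, not taken from the paper.

```python
def run_value(weights, discount):
    """Discounted sum of a run: the weight at position i is divided by discount**i."""
    return sum(w / discount ** i for i, w in enumerate(weights))


def word_value(transitions, initial, word, discount):
    """Value of a finite word: the minimum discounted sum over all runs of the NDA on it.

    `transitions` maps (state, letter) to a list of (next_state, weight) pairs.
    """
    # Each frontier entry is (current state, discounted sum accumulated so far).
    frontier = [(initial, 0.0)]
    for i, letter in enumerate(word):
        frontier = [
            (nxt, acc + w / discount ** i)
            for state, acc in frontier
            for nxt, w in transitions.get((state, letter), [])
        ]
    return min((acc for _, acc in frontier), default=float("inf"))


# Toy 2-state NDA with integral discount factor 2; the word "ab" has two runs.
toy = {
    ("q0", "a"): [("q0", 1.0), ("q1", 3.0)],
    ("q0", "b"): [("q0", 2.0)],
    ("q1", "b"): [("q1", 0.0)],
}
print(word_value(toy, "q0", "ab", 2))  # min(1 + 2/2, 3 + 0/2) = 2.0
```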

    Rabin vs. Streett Automata

    The Rabin and Streett acceptance conditions are dual. Accordingly, deterministic Rabin and Streett automata are dual. Yet, when adding nondeterminism, the picture changes dramatically. In fact, the state blowup involved in translations between Rabin and Streett automata is a longstanding open problem, having an exponential gap between the known lower and upper bounds. We resolve the problem, showing that the translation of Streett to Rabin automata involves a state blowup in $\Theta(n^2)$, whereas in the other direction, the translations of both deterministic and nondeterministic Rabin automata to nondeterministic Streett automata involve a state blowup in $2^{\Theta(n)}$. Analyzing this substantial difference between the two directions, we conclude that when studying translations between automata, one should consider not only the state blowup but also the size blowup, where the latter takes into account all of the automaton elements. More precisely, the size of an automaton is defined to be the maximum of the alphabet length, the number of states, the number of transitions, and the acceptance condition length (index). Indeed, size-wise, the results are opposite. That is, the translation of Rabin to Streett involves a size blowup in $\Theta(n^2)$ and of Streett to Rabin in $2^{\Theta(n)}$. The core difference between state blowup and size blowup stems from the tradeoff between the index and the number of states. (Recall that the index of Rabin and Streett automata might be exponential in the number of states.) We continue with resolving the open problem of translating deterministic Rabin and Streett automata to the weaker types of deterministic co-Büchi and Büchi automata, respectively. We show that the state blowup involved in these translations, when possible, is in $2^{\Theta(n)}$, whereas the size blowup is in $\Theta(n^2)$.
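    Spelled out in symbols (notation ours), for an automaton $A$ with alphabet $\Sigma$, state set $Q$, transition relation $\Delta$, and acceptance condition $\alpha$ of index $|\alpha|$, the size measure used for these results is:

```latex
\mathit{size}(A) \;=\; \max\bigl( |\Sigma|,\ |Q|,\ |\Delta|,\ |\alpha| \bigr)
```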

    VLT/VIMOS Observations of an Occulting Galaxy Pair: Redshifts and Effective Extinction Curve

    We present VLT/VIMOS IFU observations of an occulting galaxy pair previously discovered in HST observations. The foreground galaxy is a low-inclination spiral disk, which causes clear attenuation features seen against the bright bulge and disk of the background galaxy. We find redshifts of $z = 0.064 \pm 0.003$ and $z = 0.065$ for the foreground and background galaxy, respectively. This relatively small difference does not rule out gravitational interaction between the two galaxies. Emission-line ratios point to a star-forming, not AGN-dominated, foreground galaxy. We fit the Cardelli, Clayton & Mathis (CCM) extinction law to the spectra of individual fibres to derive the slope ($R_V$) and normalization ($A_V$). The normalization agrees with the HST attenuation map, and the slope is lower than the Milky Way value ($R_V < 3.1$), which is likely linked to the spatial sampling of the disk. We speculate that the values of $R_V$ point to either coherent ISM structures in the disk larger than usual ($\sim 9$ kpc) or higher starting values of $R_V$, indicative of recent processing of the dust. The foreground galaxy is a low stellar mass spiral ($M_* \sim 3 \times 10^9\,M_\odot$) with a high dust content ($M_{\rm dust} \sim 0.5 \times 10^6\,M_\odot$). The dust disk geometry visible in the HST image would explain the observed SED properties of smaller galaxies: a lower mean dust temperature and a high dust-to-stellar mass ratio, but relatively little optical attenuation. Ongoing efforts to find occulting pairs with small foreground galaxies will show how common this geometry is. Comment: 16 pages, 3 tables, 13 figures, accepted for publication in MNRAS.
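    The abstract does not spell out the fitting procedure, so the sketch below is only a generic illustration of deriving $A_V$ and $R_V$ by least-squares fitting the CCM law to a per-fibre attenuation curve. It assumes the third-party Python extinction package (whose ccm89(wave, a_v, r_v) helper evaluates the CCM curve in magnitudes for wavelengths in Angstroms) and uses mock fibre data; none of it is taken from the paper.

```python
# Minimal sketch, not the paper's pipeline: fit A_lambda = CCM(lambda; A_V, R_V) to the
# attenuation curve measured in one fibre, recovering normalization A_V and slope R_V.
import numpy as np
from scipy.optimize import curve_fit
from extinction import ccm89  # assumed third-party implementation of Cardelli, Clayton & Mathis (1989)


def ccm_model(wave_aa, a_v, r_v):
    """CCM extinction in magnitudes at the given wavelengths (Angstroms)."""
    return ccm89(wave_aa, a_v, r_v)


# Hypothetical fibre measurement: A_lambda = -2.5 log10(F_observed / F_intrinsic), where
# F_intrinsic would come from the unocculted part of the background galaxy.
wave = np.linspace(4000.0, 7000.0, 50)  # VIMOS-like optical range, Angstroms
a_lambda = ccm89(wave, 0.8, 2.5) + np.random.normal(0.0, 0.02, wave.size)  # mock data: A_V=0.8, R_V=2.5

popt, pcov = curve_fit(ccm_model, wave, a_lambda, p0=[1.0, 3.1])
(a_v_fit, r_v_fit), (a_v_err, r_v_err) = popt, np.sqrt(np.diag(pcov))
print(f"A_V = {a_v_fit:.2f} +/- {a_v_err:.2f},  R_V = {r_v_fit:.2f} +/- {r_v_err:.2f}")
```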

    Approximate Determinization of Quantitative Automata

    Quantitative automata are nondeterministic finite automata with edge weights. They value a run by some function from the sequence of visited weights to the reals, and value a word by its minimal/maximal run. They generalize boolean automata, and have gained much attention in recent years. Unfortunately, important automaton classes, such as sum, discounted-sum, and limit-average automata, cannot be determinized. Yet, the quantitative setting provides the potential of approximate determinization. We define approximate determinization with respect to a distance function, and investigate this potential. We show that sum automata cannot be determinized approximately with respect to any distance function. However, restricting to nonnegative weights allows for approximate determinization with respect to some distance functions. Discounted-sum automata allow for approximate determinization, as the influence of a word's suffix is decaying. However, the naive approach, of unfolding the automaton computations up to a sufficient level, is shown to be doubly exponential in the discount factor. We provide an alternative construction that is singly exponential in the discount factor, in the precision, and in the number of states. We prove matching lower bounds, showing exponential dependency on each of these three parameters. Average and limit-average automata are shown to prohibit approximate determinization with respect to any distance function, and this is the case even for two weights, 0 and 1.
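    The "sufficient level" for unfolding can be made concrete with a back-of-envelope bound (ours, not the paper's): with weights bounded by $W$ in absolute value and positions indexed from 0, the suffix beyond position $n$ contributes at most $W \cdot \lambda^{1-n}/(\lambda-1)$ to a discounted sum, so it suffices to unfold until that tail drops below the desired precision.

```python
import math


def unfolding_depth(max_abs_weight, discount, epsilon):
    """Smallest n such that the discounted tail beyond position n is at most epsilon.

    With |weights| <= W and positions indexed from 0, the tail is bounded by
    W * sum_{i >= n} discount**(-i) = W * discount**(1 - n) / (discount - 1).
    """
    w, d = max_abs_weight, discount
    return max(0, math.ceil(1 + math.log(w / (epsilon * (d - 1)), d)))


# Weights in [-10, 10], discount factor 2, precision 0.01: unfolding to depth 11 suffices.
print(unfolding_depth(10.0, 2.0, 0.01))  # -> 11
```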

    Do Simulation-Based Skill Exercises and Post-Encounter Notes Add Additional Value to a Standardized Patient-Based Clinical Skills Examination?

    Background. Standardized patient (SP) clinical assessments have limited utility in assessing higher-level clinical competencies. This study explores the value of including simulation exercises and post-encounter notes in an SP clinical skills examination. Methods. Two exercises involving cardiac auscultation and ophthalmic funduscopy simulations, along with written post-encounter notes, were added to an SP-based performance examination. Descriptive analyses of students' performance and correlations with SP-based performance measures were obtained. Results. Students' abilities to detect abnormalities on physical exam were highly variable. There were no correlations between SP-based and simulation-derived measures of physical examination competency. Limited correlations were found between students' abilities to perform and document physical examinations and their formulation of appropriate differential diagnoses. Conclusions. Clinical simulation exercises add depth to SP-based assessments of performance. Evaluating the content of post-encounter notes offers some insight into students' integrative abilities, and this appears to be improved by the addition of simulation-based post-encounter skill exercises. However, further refinement of this methodology is needed.

    Towards an Axiomatization of Simple Analog Algorithms

    We propose a formalization of analog algorithms, extending the framework of abstract state machines to continuous-time models of computation.