
    Best Practices in Mental Health at Corrections Facilities

    Police, court personnel, and correctional staff interact with, stabilize, and treat more persons with mental illness than any other system in America, making criminal justice agencies the largest mental health provider in the United States. Yet a wide gap exists between the training of corrections staff and the enormous responsibility they carry for the day-to-day management of mental health issues. To narrow this gap in jail and prison settings, best practices include training programs, screening procedures, communication between staff, and good documentation. Quality mental health services help maintain security by reducing inmate and staff stress levels and by facilitating offender participation in rehabilitative programming. They increase the likelihood of successful reintegration of mentally ill offenders into the community by promoting adequate community-based mental health follow-up, thereby contributing to reduced recidivism. By following these best practices, correctional organizations can also reduce the likelihood of expensive civil litigation or other legal actions that can result from inadequate correctional mental health services.

    The Price of Information in Combinatorial Optimization

    Consider a network design application where we wish to lay down a minimum-cost spanning tree in a given graph; however, we have only stochastic information about the edge costs. To learn the precise cost of any edge, we have to conduct a study that incurs a price. Our goal is to find a spanning tree while minimizing the disutility, which is the sum of the tree cost and the total price that we spend on the studies. In a different application, each edge gives a stochastic reward value. Our goal is to find a spanning tree while maximizing the utility, which is the tree reward minus the prices that we pay. Situations such as the above two often arise in practice where we wish to find a good solution to an optimization problem, but we start with only partial knowledge about the parameters of the problem. The missing information can be found only after paying a probing price, which we call the price of information. What strategy should we adopt to optimize our expected utility/disutility? A classical example of the above setting is Weitzman's "Pandora's box" problem, where we are given probability distributions on the values of n independent random variables. The goal is to choose a single variable with a large value, but we can find the actual outcomes only after paying a price. Our work is a generalization of this model to other combinatorial optimization problems such as matching, set cover, facility location, and prize-collecting Steiner tree. We give a technique that reduces such problems to their non-price counterparts, and use it to design exact/approximation algorithms to optimize our utility/disutility. Our techniques extend to situations where there are additional constraints on which parameters can be probed or where we can simultaneously probe a subset of the parameters.
    Comment: SODA 201
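Weitzman's optimal policy for the single-choice Pandora's box instance mentioned above is the classical index rule: give each box a reservation value sigma solving E[(V - sigma)^+] = c, open boxes in decreasing order of this index, and stop once the best observed value exceeds every unopened box's index. A minimal Python sketch, with all names (`Box`, `reservation_value`, `pandora`) our own illustrative choices and distributions assumed finite and discrete:

```python
# Hedged sketch of Weitzman's index policy for Pandora's box.
# Assumes each box holds a value with a known finite discrete
# distribution and that opening (probing) it costs `cost`.
from dataclasses import dataclass

@dataclass
class Box:
    outcomes: list[tuple[float, float]]  # (value, probability) pairs
    cost: float                          # price to open the box

def reservation_value(box: Box, lo=-1e9, hi=1e9, iters=100) -> float:
    """Solve E[(V - sigma)^+] = cost for sigma by bisection."""
    def excess(sigma):
        return sum(p * max(v - sigma, 0.0) for v, p in box.outcomes)
    for _ in range(iters):
        mid = (lo + hi) / 2
        if excess(mid) > box.cost:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def pandora(boxes, draw):
    """Open boxes in decreasing reservation-value order; stop when the
    best value seen beats every remaining index. `draw(i)` reveals the
    realized value of box i. Returns (best value, total price paid)."""
    order = sorted(range(len(boxes)),
                   key=lambda i: reservation_value(boxes[i]),
                   reverse=True)
    best, paid = float("-inf"), 0.0
    for i in order:
        if best >= reservation_value(boxes[i]):
            break                # no remaining box is worth opening
        paid += boxes[i].cost
        best = max(best, draw(i))
    return best, paid
```

For the single-selection maximization variant this rule is optimal (Weitzman, 1979); the paper's contribution lies in extending such adaptive probing strategies to richer combinatorial problems like spanning tree and matching, where a single index rule no longer suffices.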

    Generic absence of strong singularities and geodesic completeness in modified loop quantum cosmologies

    Different regularizations of the Hamiltonian constraint in loop quantum cosmology yield modified loop quantum cosmologies, namely mLQC-I and mLQC-II, which lead to qualitatively different Planck-scale physics. We perform a comprehensive analysis of the resolution of various singularities in these modified loop cosmologies using the effective spacetime description, and compare with earlier results in standard loop quantum cosmology. We show that the volume remains non-zero and finite in finite-time evolution for all of the loop cosmological models considered. Interestingly, even though the expansion scalar and energy density are bounded due to quantum geometry, curvature invariants can still potentially diverge due to pressure singularities at finite volume. These divergences are shown to be harmless, since geodesic evolution does not break down and no strong singularities are present in the effective spacetimes of loop cosmologies. Using a phenomenological matter model, various types of exotic strong and weak singularities, including big rip, sudden, big freeze, and type-IV singularities, are studied. We show that, as in standard loop quantum cosmology, big rip and big freeze singularities are resolved in mLQC-I and mLQC-II, but quantum geometric effects do not resolve sudden and type-IV singularities.
    Comment: Minor revision to match published version in CQ
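For context, the boundedness of the energy density referred to above originates, in standard loop quantum cosmology, from an effective Friedmann equation with a quadratic density correction. Writing $H$ for the Hubble rate and $\gamma$ for the Barbero-Immirzi parameter, the standard LQC relation (quoted from the wider literature as a baseline, not the mLQC-I/II forms analyzed in this paper) reads:

```latex
H^2 = \frac{8\pi G}{3}\,\rho\left(1 - \frac{\rho}{\rho_c}\right),
\qquad
\rho_c = \frac{\sqrt{3}}{32\pi^2\gamma^3 G^2\hbar} \approx 0.41\,\rho_{\mathrm{Pl}} ,
```

so $\rho \le \rho_c$ and the classical big bang is replaced by a bounce. mLQC-I and mLQC-II modify this relation further, which is why their Planck-scale physics differs qualitatively from standard LQC.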

    Resolution of strong singularities and geodesic completeness in loop quantum Bianchi-II spacetimes

    Generic resolution of singularities and geodesic completeness in the loop quantization of Bianchi-II spacetimes with arbitrary minimally coupled matter is investigated. Using the effective Hamiltonian approach, we examine the two available quantizations: one based on the connection operator, and a second that treats the extrinsic curvature as the connection via gauge fixing. It turns out that for the connection-based quantization, either inverse triad modifications or the imposition of the weak energy condition is necessary to obtain a resolution of all strong singularities and geodesic completeness. In contrast, the extrinsic curvature based quantization generically resolves all strong curvature singularities and results in a geodesically complete effective spacetime without inverse triad modifications or energy conditions. In both quantizations, weak curvature singularities can occur, resulting from divergences in the pressure and its derivatives at finite densities. These are harmless events beyond which geodesics can be extended. Our work generalizes previous results on the generic resolution of strong singularities in the loop quantization of isotropic, Bianchi-I, and Kantowski-Sachs spacetimes.
    Comment: 24 pages. Revised version to appear in CQG. Clarifications on quantization prescriptions and triad orientations added
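The distinction between strong and weak singularities used above follows the standard Tipler and Krolak criteria: along a causal geodesic with tangent $u^a$ and affine parameter $\tau$ approaching the singular event, a necessary condition for a strong curvature singularity is the divergence of curvature integrals along the geodesic, sketched here for the Ricci term only:

```latex
\text{Tipler:}\quad
\int_0^{\tau}\!\mathrm{d}\tau' \int_0^{\tau'}\!\mathrm{d}\tau''\, R_{ab}\,u^a u^b \;\to\; \infty,
\qquad
\text{Kr\'olak:}\quad
\int_0^{\tau}\!\mathrm{d}\tau'\, R_{ab}\,u^a u^b \;\to\; \infty .
```

When these integrals remain finite, as for the pressure-divergence events described in the abstract, the singularity is weak and geodesics can be extended through it.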

    Collaborative Reuse of Streaming Dataflows in IoT Applications

    Distributed Stream Processing Systems (DSPS) like Apache Storm and Spark Streaming enable composition of continuous dataflows that execute persistently over data streams. They are used by Internet of Things (IoT) applications to analyze sensor data from Smart City cyber-infrastructure and to make active utility management decisions. As the ecosystem of such IoT applications that leverage shared urban sensor streams continues to grow, applications will perform duplicate pre-processing and analytics tasks. This offers the opportunity to collaboratively reuse the outputs of overlapping dataflows, thereby improving resource efficiency. In this paper, we propose \emph{dataflow reuse algorithms} that, given a submitted dataflow, identify the intersection of reusable tasks and streams from a collection of running dataflows to form a \emph{merged dataflow}. Similar algorithms to unmerge dataflows when they are removed are also proposed. We implement these algorithms for the popular Apache Storm DSPS, and validate their performance and resource savings for 35 synthetic dataflows based on public OPMW workflows with diverse arrival and departure distributions, and for 21 real IoT dataflows from RIoTBench.
    Comment: To appear in IEEE eScience Conference 201
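The core reuse idea, matching a newly submitted task against running tasks that apply the same operation to the same upstream inputs, can be caricatured with a signature-based registry. This is our illustrative simplification (`Task`, `signature`, and `Registry` are invented names), not the paper's merge/unmerge algorithms:

```python
# Illustrative sketch of reuse-by-signature merging: a task is reusable
# when it applies the same operation to the same (already reusable)
# inputs, so signature equality implies the whole upstream subgraph
# is identical. This hashing scheme is an assumption of this sketch.
from dataclasses import dataclass

@dataclass(frozen=True)
class Task:
    op: str        # operation name, e.g. "parse" or "avg-window"
    inputs: tuple  # signatures of upstream tasks / source streams

def signature(task: Task) -> tuple:
    return (task.op, task.inputs)

class Registry:
    """Running tasks indexed by signature; submit() reports, per task,
    whether it was reused from a running dataflow or newly deployed."""
    def __init__(self):
        self.running = {}

    def submit(self, dataflow):
        report = {}
        for task in dataflow:          # assumed topologically ordered
            sig = signature(task)
            if sig in self.running:
                report[sig] = "reused"
            else:
                self.running[sig] = task
                report[sig] = "deployed"
        return report
```

A real DSPS merge must additionally unmerge on dataflow removal (e.g. by reference-counting shared tasks) and handle overlapping-but-not-identical subgraphs, which is where the paper's algorithms go beyond this sketch.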