WHY WE LEAVE: THE ROLE OF APPROACH AND AVOIDANCE MOTIVATIONS IN ROMANTIC RELATIONSHIP DISSOLUTION
Romantic relationship dissolution (i.e., a breakup) is one of the most stressful events a person can experience. Breakup initiators often encounter a range of emotions (e.g., guilt, depression, anxiety) after leaving their romantic partner. Yet, little is known about how an initiator’s unique motivations for leaving impact their breakup strategies and emotions in the aftermath. The current study examined the association between dissolution motives, breakup strategies, and experienced outcomes in 273 marital and non-marital breakup initiators. As predicted, initiators who left due to approach dissolution motives (e.g., seeking more freedom; draws to leave) experienced greater positive outcomes following a breakup; however, avoidance dissolution motives (e.g., avoiding future conflict; barriers to staying) predicted greater positive outcomes (e.g., relief, sense of control) over and above approach dissolution motives and previously considered variables (e.g., attachment anxiety/avoidance, closeness, relationship satisfaction, and reward/threat sensitivity). Contrary to predictions, approach dissolution motives were related to both direct and indirect breakup strategy use. Avoidance dissolution motives significantly predicted more indirect breakup strategy use and were unrelated to the overall use of direct breakup strategies. These results help elucidate the motivations behind breaking romantic social bonds and move researchers closer to understanding the impact of dissolution considerations on behavior and outcomes above and beyond frequently considered factors such as attachment.
Experimentation with Raw Data Vault Data Warehouse Loading
The principal novelty in this work is raw Data Vault (DV) loads from source systems, and experiments with the effects of allowing certain kinds of permissible errors to be kept in the Data Vault until correct values are supplied.
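As a rough illustration of what a raw load with deferred error handling might look like, the following is a minimal Python sketch; the hub_customer/sat_customer tables, column names, and the "blank postcode" rule are hypothetical and not taken from the paper. A row with a permissible error is loaded and flagged rather than rejected or transformed.

import sqlite3, hashlib
from datetime import datetime, timezone

def hub_key(business_key: str) -> str:
    # Deterministic hash key for the hub record (a common Data Vault 2.0 practice).
    return hashlib.md5(business_key.encode()).hexdigest()

def load_customer(conn, source_rows):
    # Land source rows as-is; a blank postcode is treated as a permissible error,
    # so the row is loaded and merely flagged for later correction.
    now = datetime.now(timezone.utc).isoformat()
    for row in source_rows:
        hk = hub_key(row["customer_id"])
        conn.execute(
            "INSERT OR IGNORE INTO hub_customer(hub_key, customer_id, load_dts) VALUES (?, ?, ?)",
            (hk, row["customer_id"], now))
        error_flag = 0 if row.get("postcode") else 1
        conn.execute(
            "INSERT INTO sat_customer(hub_key, postcode, permissible_error, load_dts) VALUES (?, ?, ?, ?)",
            (hk, row.get("postcode"), error_flag, now))
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE hub_customer(hub_key TEXT PRIMARY KEY, customer_id TEXT, load_dts TEXT);
CREATE TABLE sat_customer(hub_key TEXT, postcode TEXT, permissible_error INTEGER, load_dts TEXT);
""")
load_customer(conn, [
    {"customer_id": "C1", "postcode": "46000"},
    {"customer_id": "C2", "postcode": ""},       # permissible error: kept and flagged
])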
Lead Release from Mexican Earthenware
Lead-glazed vessels could be a source of lead poisoning for rural Mexicans. The degree of poisoning depends on the amount of lead released from the handcrafted ware and on how much lead is then ingested with food or beverages. The poisoning is subtle, and people may not recognize the connection between the disease and its source. Therefore, education and technological changes are necessary to reduce the risk of lead poisoning.
I investigated the problem of lead release from Mexican earthenware using several approaches. First, the ceramic materials and methods involved in pottery making in central Mexico were studied. Second, I discussed the problem with people who live in a pottery-producing area, Michoacán, to understand their attitudes toward lead glazes and poisoning. Third, chemical analysis was performed on pots collected in the region and produced by similar means.
I have reached four conclusions concerning the problem of lead release. First, Mexican earthenware does release unsafe amounts of lead. Second, the potters and consumers of such ware risk at least insidious poisoning from the production and use of the pottery. Third, these people are not, however, aware of the disease or its probable source. Fourth, technological changes would be accepted if they prove to be economically feasible for rural pottery making.
Fingerphoto Deblurring using Wavelet Style Transfer
This work focuses on a deblurring network designed specifically for the task of deblurring contactless fingerprints, a.k.a. fingerphotos. The network takes the general idea of style transfer and applies it to deblurring. The standard use case for style transfer networks is artistic style transfer, which is largely used for recreational purposes. Here, however, style transfer is used in the context of deblurring: since style transfer can transfer artistic styles from one image to another, who's to say it can't transfer clarity and sharpness as well? That is the focus of this work: taking a blurry fingerphoto and deblurring it using the sharpness and clarity found in a sharp fingerphoto. To accomplish this, the Discrete Wavelet Transform is used to split images into their directionally specific sub-bands. The corresponding blurry and sharp sub-bands are then combined, resulting in a deblurred set of sub-bands. Finally, these sub-bands are sent to a neural decoder for reconstruction and refinement. The result is a deblurred version of the original blurry fingerphoto, with minimal loss of fine details and minutiae.
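As a rough illustration of the wavelet-domain idea, the Python sketch below decomposes a blurry and a sharp image into DWT sub-bands with PyWavelets, fuses the direction-specific detail bands, and reconstructs. The fixed 50/50 fusion is only a placeholder for the learned combination and neural decoder described in the work, and all variable names and the synthetic test images are assumptions for the example.

import numpy as np
import pywt   # PyWavelets

def wavelet_fuse_deblur(blurry: np.ndarray, sharp_ref: np.ndarray, wavelet: str = "haar") -> np.ndarray:
    # dwt2 returns (LL, (LH, HL, HH)): the approximation band plus the
    # horizontal, vertical, and diagonal detail bands.
    ll_b, (lh_b, hl_b, hh_b) = pywt.dwt2(blurry, wavelet)
    ll_s, (lh_s, hl_s, hh_s) = pywt.dwt2(sharp_ref, wavelet)

    # Keep the blurry image's approximation (overall content) and blend in the
    # sharp reference's detail bands (edge/ridge "style").
    fused_details = tuple(0.5 * b + 0.5 * s
                          for b, s in ((lh_b, lh_s), (hl_b, hl_s), (hh_b, hh_s)))
    return pywt.idwt2((ll_b, fused_details), wavelet)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    sharp = rng.random((128, 128))                 # stand-in "fingerphotos"
    blurry = 0.25 * (sharp + np.roll(sharp, 1, 0) + np.roll(sharp, 1, 1) + np.roll(sharp, 2, 0))
    out = wavelet_fuse_deblur(blurry, sharp)
    print(out.shape)                               # (128, 128)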
Testing Data Vault-Based Data Warehouse
Data warehouse (DW) projects are undertakings that require integration of disparate sources of data, a well-defined mapping of the source data to the reconciled data, and effective Extract, Transform, and Load (ETL) processes. Owing to the complexity of data warehouse projects, great emphasis must be placed on an agile-based approach with properly developed and executed test plans throughout the various stages of designing, developing, and implementing the data warehouse, to guard against budget overruns, missed deadlines, low customer satisfaction, and outright project failures. Yet there are often attempts to test the data warehouse exactly like traditional back-end databases and legacy applications, or to downplay the role of quality assurance (QA) and testing, which only serve to fuel frustration with, and mistrust of, data warehouse and business intelligence (BI) systems. Nevertheless, there are a number of steps that can be taken to ensure DW/BI solutions are successful, highly trusted, and stable. In particular, adopting a Data Vault (DV)-based Enterprise Data Warehouse (EDW) can simplify and enhance various aspects of testing and curtail delays common in non-DV-based DW projects. A major area of focus in this research is raw DV loads from source systems, keeping transformations to a minimum in the ETL process that loads the DV from the source. Certain load errors, classified as permissible errors under the business rules, are kept in the Data Vault until correct values are supplied. Major transformation activities are pushed further downstream to the next ETL process, which loads and refreshes the Data Mart (DM) from the Data Vault.
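Because a raw load neither filters nor transforms rows, reconciliation testing becomes straightforward. The sketch below reuses the hypothetical hub_customer/sat_customer schema from the loading example above; the particular assertions are illustrative of the kind of QA checks such a load supports, not tests taken from the research.

import sqlite3

def test_raw_load_reconciliation(conn: sqlite3.Connection, source_rows: list) -> None:
    # 1. Completeness: every distinct source business key landed in the hub.
    src_keys = {r["customer_id"] for r in source_rows}
    hub_keys = {k for (k,) in conn.execute("SELECT customer_id FROM hub_customer")}
    assert src_keys == hub_keys, f"missing hub keys: {src_keys - hub_keys}"

    # 2. Row-count parity: one satellite row per source row, because the raw load
    #    does not filter or merge anything on the way in.
    (sat_count,) = conn.execute("SELECT COUNT(*) FROM sat_customer").fetchone()
    assert sat_count == len(source_rows)

    # 3. Deferred correction: rows with permissible errors are present and flagged,
    #    not silently dropped or "fixed" during the load.
    (flagged,) = conn.execute(
        "SELECT COUNT(*) FROM sat_customer WHERE permissible_error = 1").fetchone()
    assert flagged == sum(1 for r in source_rows if not r.get("postcode"))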
Mixed integer programming on transputers
Mixed Integer Programming (MIP) problems occur in many industries and their practical solution can be challenging in terms of both time and effort. Although faster computer hardware has allowed the solution of more MIP problems in reasonable times, there will come a point when the hardware cannot be speeded up any more. One way of improving the solution times of MIP problems without further speeding up the hardware is to improve the effectiveness of the solution algorithm used.
The advent of accessible parallel processing technology and techniques provides the opportunity to exploit any parallelism within MIP solving algorithms in order to accelerate the solution of MIP problems. Many of the MIP problem solving algorithms in the literature contain a degree of exploitable parallelism. Several algorithms were considered as candidates for parallelisation within the constraints imposed by the currently available parallel hardware and techniques.
A parallel Branch and Bound algorithm was designed for and implemented on an array of transputers hosted by a PC. The parallel algorithm was designed to operate as a process farm, with a master passing work to various slave processors. A message-passing harness was developed to allow full control of the slaves and the work sent to them.
The effects of using various node selection techniques were studied and a default node selection strategy was decided upon for the parallel algorithm. The parallel algorithm was also designed to take full advantage of the structure of MIP problems formulated using global entities such as general integers and special ordered sets. The presence of parallel processors makes practicable the idea of performing more than two branches on an unsatisfied global entity. Experiments were carried out using multiway branching strategies and a default branching strategy was decided upon for appropriate types of MIP problem.
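The process-farm organisation described above can be sketched in modern terms. The following minimal Python example is not the thesis's transputer harness: the toy 0-1 knapsack instance, the simple additive bound, and the queue-based master/slave protocol are assumptions chosen only to illustrate how a master distributes branch-and-bound nodes to slave processes and collects bounds, children, and incumbents.

from multiprocessing import Process, Queue

# Toy 0-1 knapsack instance (hypothetical data).
VALUES   = [10, 13, 7, 4, 9]
WEIGHTS  = [3, 4, 2, 1, 3]
CAPACITY = 7
N = len(VALUES)

def evaluate(node):
    # Slave-side work: bound a partial assignment and, if promising, branch.
    # A node is a tuple of 0/1 decisions for the first len(node) items.
    weight = sum(w for w, x in zip(WEIGHTS, node) if x)
    value  = sum(v for v, x in zip(VALUES, node) if x)
    if weight > CAPACITY:
        return ("pruned", 0, None)
    if len(node) == N:
        return ("leaf", value, node)
    # Optimistic bound: assume every undecided item can still be taken.
    bound = value + sum(VALUES[len(node):])
    return ("children", bound, [node + (0,), node + (1,)])

def slave(task_q, result_q):
    # Process-farm worker: take a node, evaluate it, send the result back.
    for node in iter(task_q.get, None):          # None is the shutdown sentinel
        result_q.put(evaluate(node))

def master(n_slaves=3):
    task_q, result_q = Queue(), Queue()
    workers = [Process(target=slave, args=(task_q, result_q)) for _ in range(n_slaves)]
    for w in workers:
        w.start()

    pool = [()]                                  # open nodes (root = empty assignment)
    best_value, best_node = 0, None
    outstanding = 0
    while pool or outstanding:
        while pool:                              # farm out all currently open nodes
            task_q.put(pool.pop())
            outstanding += 1
        kind, value, payload = result_q.get()
        outstanding -= 1
        if kind == "leaf" and value > best_value:
            best_value, best_node = value, payload
        elif kind == "children" and value > best_value:   # value is the node bound here
            pool.extend(payload)                 # only keep children worth exploring

    for _ in workers:                            # shut the farm down
        task_q.put(None)
    for w in workers:
        w.join()
    return best_value, best_node

if __name__ == "__main__":
    print(master())                              # (24, (0, 1, 1, 1, 0)) for this instance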
Weak convergence of spectral shift functions revisited
We study convergence of the spectral shift function for the restrictions of a pair of full-line Schrödinger operators to a finite interval with coupled boundary conditions at the endpoints, in the case when these finite interval restrictions are compared with those subject to Dirichlet boundary conditions. Using a Krein-type resolvent identity, we show that the spectral shift function for the finite interval restrictions converges weakly to that for the pair of full-line Schrödinger operators as the length of the interval tends to infinity.
Data Vault and HQDM Principles
The paper explores the applicability of high quality data modeling (HQDM) principles to Data Vault modeling.
