6,574 research outputs found
The reduced form as an empirical tool: a cautionary tale from the financial veil
An analysis of the limitations of the reduced-form empirical strategy as a method of testing the Modigliani-Miller model of corporate financial structure, demonstrating that an empirical strategy not closely tied to an underlying economic theory of behavior will usually yield estimates too imprecise or too unreliable to form a basis for policy.
Corporations - Finance; Investments
Collaborative Collective Algorithms to Coordinate UGVs
Sentel/Brilliant Innovations has developed autonomous UGVs (unmanned ground vehicles) capable of generating a map of an unknown location through exploration, using local software and the power of Google Tango technology. This project was tasked with developing an efficient and capable map-stitching solution that allows multiple UGVs to coordinate their movements and share information, greatly improving the speed at which these drones can generate maps. The solution uses the processing power of a Raspberry Pi to pull maps from a Redis server and stitch them together. Once stitched, the maps are redistributed via the Redis server back through the network, giving every UGV the opportunity to obtain the global map. All of the stitching is performed on a single UGV, freeing the other drones to focus on generating and uploading their own unique maps to the server. The drones can use this new information to better inform their next move and prevent multiple drones from mapping the same location. In the future, Sentel/Brilliant Innovations hopes to attach more advanced sensors to the drones, allowing them to add greater detail of the environment to the map rather than simply drawing boundaries. These drones have many potential applications, such as search and rescue, seeking out potential hazards, and intelligence for military and civil use.
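The abstract describes the stitch-and-redistribute step but gives no implementation. As a rough, hypothetical illustration of the merge itself (the real system pulls and pushes maps through a Redis server; here the maps are simply modeled as occupancy grids, a representation the abstract does not specify):

```python
# Hypothetical sketch: each UGV's local map is modeled as a dict of
# (x, y) grid cell -> occupancy flag. In the described system these maps
# would be fetched from and republished to a Redis server; only the
# stitching logic is shown here so the example is self-contained.

def stitch_maps(local_maps):
    """Merge per-UGV occupancy grids into one global map.

    A cell is marked occupied if any UGV observed it as occupied;
    cells seen by only one UGV are copied through unchanged.
    """
    global_map = {}
    for ugv_map in local_maps:
        for cell, occupied in ugv_map.items():
            global_map[cell] = global_map.get(cell, False) or occupied
    return global_map

# Two UGVs with partially overlapping coverage of the same area
ugv_a = {(0, 0): False, (0, 1): True}
ugv_b = {(0, 1): False, (1, 1): True}
print(stitch_maps([ugv_a, ugv_b]))
# {(0, 0): False, (0, 1): True, (1, 1): True}
```

Redistributing the merged result (for example via a Redis `SET` followed by a pub/sub notification) would then let each UGV avoid re-exploring cells already present in the global map.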
Exploring the movement dynamics of deception
Both the science and the everyday practice of detecting a lie rest on the same assumption: hidden cognitive states that the liar would like to remain hidden nevertheless influence observable behavior. This assumption is well supported. The insights of professional interrogators, anecdotal evidence, and body language textbooks have built up a sizeable catalog of non-verbal cues claimed to distinguish deceptive from truthful behavior. Typically, these cues are discrete, individual behaviors, such as a hand touching a mouth or the rise of a brow, that distinguish lies from truths solely in terms of their frequency or duration. Research to date has failed to establish any of these non-verbal cues as a reliable marker of deception. Here we argue that this may be because simple tallies of behavior miss the rich but subtle organization of behavior as it unfolds over time. Research in cognitive science from a dynamical systems perspective has shown that behavior is structured across multiple timescales, with varying regularity and structure. Using tools that are sensitive to these dynamics, we analyzed body motion data from an experiment that put participants in a realistic situation of choosing, or not, to lie to an experimenter. Our analyses indicate that when participants are being deceptive, continuous fluctuations of movement in the upper face, and to a lesser extent in the arms, exhibit less dynamical stability but greater complexity. For the upper face, these distinctions are present despite no apparent differences in the overall amount of movement between deception and truth. We suggest that these distinctive dynamical signatures of motion are indicative of both the cognitive demands inherent to deception and the need to respond adaptively in a social context.
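The abstract does not name the specific dynamical measures used. One standard measure of time-series complexity in this literature is sample entropy; the sketch below (a hypothetical illustration, not the paper's actual analysis pipeline) shows how it separates a perfectly regular movement signal from an irregular one:

```python
import math

def sample_entropy(series, m=2, r=0.2):
    """Sample entropy: -ln(A/B), where B counts pairs of length-m
    templates that match within tolerance r (Chebyshev distance) and
    A counts matching pairs of length m+1. Higher values indicate a
    less predictable, more complex signal."""
    def count_matches(k):
        templates = [series[i:i + k] for i in range(len(series) - k + 1)]
        count = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    count += 1
        return count
    b = count_matches(m)
    a = count_matches(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")

# A perfectly periodic "movement" signal vs. a chaotic one
# (logistic map, a stand-in for irregular motion fluctuations)
regular = [0.0, 1.0] * 20
x, chaotic = 0.4, []
for _ in range(40):
    x = 3.9 * x * (1.0 - x)
    chaotic.append(x)

print(sample_entropy(regular) < sample_entropy(chaotic))  # True
```

Applied to frame-by-frame motion magnitudes, such a measure can differ between conditions even when the total amount of movement does not, which is the pattern the abstract reports for the upper face.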
Simulation of Alternative Marketing Strategies for U.S. Cotton
Three marketing strategies (selling a put option, cash sale at harvest, and cash sale in June) are simulated based on historical values and ranked based on certainty equivalents for a representative irrigated and dryland cotton farm. Scenario analysis is also used to compare varying yield values.
Simulation, Marketing, Cotton, Risk, Research Methods/Statistical Methods
History, exam, and labs: Is one enough to diagnose acute adult appendicitis?
No, none of the three (history, exam, or labs) is sufficiently accurate on its own to diagnose acute appendicitis (strength of recommendation [SOR]: A, based on meta-analysis of high-quality studies). When combined, the following tests are helpful: an elevated C-reactive protein (CRP), an elevated total white blood cell (WBC) count, an elevated percentage of polymorphonuclear leukocyte (PMN) cells (left shift), and the presence of guarding or rebound on physical examination. The combination of any 2 of these tests yields a very high positive likelihood ratio (LR+), but the absence of these does not exclude appendicitis (SOR: A, based on meta-analysis of high-quality studies).
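The likelihood-ratio reasoning above can be made concrete. A short sketch of the standard definitions (the sensitivity, specificity, and pre-test probability below are illustrative placeholders, not values from the cited meta-analysis):

```python
def positive_likelihood_ratio(sensitivity, specificity):
    """LR+ = sensitivity / (1 - specificity): how much more likely a
    positive result is in patients with the disease than without."""
    return sensitivity / (1.0 - specificity)

def posttest_probability(pretest_prob, lr):
    """Update a pre-test probability via odds: post-test odds =
    pre-test odds * LR, then convert back to a probability."""
    odds = pretest_prob / (1.0 - pretest_prob) * lr
    return odds / (1.0 + odds)

# Hypothetical combined-test performance: sensitivity and specificity
# of 0.9 each give LR+ = 9; a 25% pre-test probability then rises to 75%.
lr = positive_likelihood_ratio(0.9, 0.9)
p = posttest_probability(0.25, lr)
```

This is why a high LR+ for a combination of findings is clinically useful, while the same combination being absent (a modest negative likelihood ratio) cannot rule the diagnosis out.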
IDA: An implicit, parallelizable method for calculating drainage area
Models of landscape evolution or hydrological processes typically depend on the accurate determination of upslope drainage area from digital elevation data, but such calculations can be very computationally demanding when applied to high-resolution topographic data. To overcome this limitation, we propose calculating drainage area in an implicit, iterative manner using linear solvers. The basis of this method is a recasting of the flow routing problem as a sparse system of linear equations, which can be solved using established computational techniques. This approach is highly parallelizable, enabling data to be spread over multiple computer processors. Good scalability is exhibited, rendering it suitable for contemporary high-performance computing architectures with many processors, such as graphics processing units (GPUs). In addition, the iterative nature of the computational algorithms we use to solve the linear system creates the possibility of accelerating the solution by providing an initial guess, making the method well suited to iterative calculations such as numerical landscape evolution models. We compare this method with a previously proposed parallel drainage area algorithm and present several examples illustrating its advantages, including a continent-scale flow routing calculation at 3 arc sec resolution, improvements to models of fluvial sediment yield, and acceleration of drainage area calculations in a landscape evolution model. We additionally describe a modification that allows the method to be used for parallel basin delineation.
National Science Foundation (U.S.), Geomorphology and Land-Use Dynamics Program (Award EAR-0951672)
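The recasting of flow routing as a linear system can be sketched as follows: each cell's drainage area A_i equals its own area plus the weighted drainage areas of its upslope neighbors, A_i = a_i + Σ_j w_ji A_j, which is a sparse system (I − W)A = a. The toy solver below uses plain fixed-point iteration on a tiny 1-D hillslope; it is an illustration of the formulation only, not the paper's actual (preconditioned, GPU-parallel) solver:

```python
def drainage_area_iterative(num_cells, receivers, weights,
                            cell_area=1.0, iters=50):
    """Solve A_i = a_i + sum_j w_ji * A_j by fixed-point iteration,
    the implicit linear-system formulation of flow accumulation.

    receivers[j]: list of cells that receive flow from cell j.
    weights[j]:   matching flow fractions (sum to 1 where j drains).
    """
    area = [cell_area] * num_cells
    for _ in range(iters):
        new = [cell_area] * num_cells        # each cell's own area a_i
        for j in range(num_cells):
            for i, w in zip(receivers[j], weights[j]):
                new[i] += w * area[j]        # add upslope contribution
        area = new
    return area

# 1-D hillslope: cell 0 drains to 1, 1 drains to 2, 2 is the outlet
receivers = [[1], [2], []]
weights = [[1.0], [1.0], []]
print(drainage_area_iterative(3, receivers, weights))
# [1.0, 2.0, 3.0]
```

Because the solve is iterative, warm-starting `area` with the previous time step's solution (instead of `cell_area` everywhere) is exactly the "initial guess" acceleration the abstract describes for landscape evolution models.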
Resilience of the Internet of Things (IoT) from an Information Assurance (IA) Perspective
Internet infrastructure developments and the rise of IoT socio-technical systems (STS) have frequently produced insecure protocols to facilitate rapid intercommunication between the plethora of IoT devices. Whereas current development of the IoT has focused mainly on enabling and effectively meeting the functionality requirements of digital-enabled enterprises, scant regard has been paid to their IA architecture, marginalizing system resilience and relegating cyber defence to an afterthought. Whilst interconnected IoT devices do facilitate and expand information sharing, they also increase risk exposure and the potential loss of trust in their socio-technical systems. A change in the IoT paradigm is needed to enable a security-first mind-set if the trusted sharing of information, built upon the dependable and resilient growth of the IoT, is to be established and maintained. We argue that Information Assurance is paramount to the success of the IoT, specifically its resilience and dependability in continuing to safely support our digital economy.
Containers for Portable, Productive, and Performant Scientific Computing
Containers are an emerging technology that holds promise for improving productivity and code portability in scientific computing. The authors examine Linux container technology for the distribution of a nontrivial scientific computing software stack and its execution on a spectrum of platforms, from laptop computers through high-performance computing systems. For Python code run on large parallel computers, the runtime is reduced inside a container due to faster library imports. The software distribution approach and data that the authors present will help developers and users decide whether container technology is appropriate for them. The article also offers guidance to vendors of HPC systems that rely on proprietary libraries for performance on how to make containers work seamlessly and without a performance penalty.
