Response-Surface Methods in R, Using rsm
This article describes the recent package rsm, which was designed to provide R support for standard response-surface methods. Functions are provided to generate central-composite and Box-Behnken designs. For analysis of the resulting data, the package provides for estimating the response surface, testing its lack of fit, displaying an ensemble of contour plots of the fitted surface, and doing follow-up analyses such as steepest ascent, canonical analysis, and ridge analysis. It also implements a coded-data structure to aid in this essential aspect of the methodology. The functions are designed in hopes of providing an intuitive and effective user interface. Potential exists for expanding the package in a variety of ways.
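The core computation behind the analyses the abstract lists can be sketched briefly. This is not the rsm package itself (which is an R library); it is a minimal Python illustration, on made-up data, of fitting a second-order response surface by least squares and locating the stationary point used in canonical analysis:

```python
import numpy as np

# Sketch (NOT the rsm package): fit the second-order model
# y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
# by ordinary least squares, then find its stationary point.
def fit_second_order(x1, x2, y):
    X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

def stationary_point(beta):
    # The gradient vanishes where B @ xs = -b, with B the symmetric
    # matrix of second-order coefficients and b the linear terms.
    b = beta[1:3]
    B = np.array([[2 * beta[3], beta[5]],
                  [beta[5], 2 * beta[4]]])
    return np.linalg.solve(B, -b)

# Made-up, noise-free data from y = 10 - (x1-1)^2 - (x2+0.5)^2,
# whose maximum sits at (1, -0.5).
rng = np.random.default_rng(0)
x1 = rng.uniform(-2, 2, 30)
x2 = rng.uniform(-2, 2, 30)
y = 10 - (x1 - 1) ** 2 - (x2 + 0.5) ** 2
xs = stationary_point(fit_second_order(x1, x2, y))
print(xs)  # close to [1.0, -0.5]
```

In rsm itself the same steps run on coded data, with lack-of-fit tests and contour plots around the fitted surface.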
SASweave: Literate Programming Using SAS
SASweave is a collection of scripts that allow one to embed SAS code into a LaTeX document, and automatically incorporate the results as well. SASweave is patterned after Sweave, which does the same thing for code written in R. In fact, a document may contain both SAS and R code. Besides the convenience of being able to easily incorporate SAS examples in a document, SASweave facilitates the concept of "literate programming": having code, documentation, and results packaged together. Among other things, this helps to ensure that the SAS output in the document is in concordance with the code.
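For readers unfamiliar with the workflow, Sweave (which SASweave is patterned after) embeds code in noweb-style chunks inside a LaTeX source file; the chunk label and R code below are illustrative only:

```latex
\documentclass{article}
\begin{document}
The tool replaces the chunk below with the code and its output:
<<example, echo=TRUE>>=
x <- rnorm(100)
mean(x)
@
\end{document}
```

SASweave processes SAS chunks analogously, so a single document can carry both SAS and R examples alongside their results.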
A coding approach for detection of tampering in write-once optical disks
We present coding methods for protecting against tampering of write-once optical disks, which turns them into a secure digital medium for applications where critical information must be stored in a way that prevents or allows detection of an attempt at falsification. Our method involves adding a small amount of redundancy to a modulated sector of data. This extra redundancy is not used for normal operation, but can be used for determining, say, as a testimony in court, that a disk has not been tampered with.
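The paper's modulation-level code is not reproduced here, but the general mechanism — extra per-sector redundancy that normal reads ignore and a verifier can check — can be sketched with a simple CRC. This is purely conceptual: a CRC alone could be recomputed by a forger, so real tamper evidence depends on the write-once physics combined with the code's structure.

```python
import zlib

# Conceptual sketch only (not the paper's scheme): append a check value
# to each sector so later alteration of the data no longer matches it.
SECTOR_SIZE = 2048  # bytes; assumed sector payload size for illustration

def write_sector(data: bytes) -> bytes:
    assert len(data) <= SECTOR_SIZE
    check = zlib.crc32(data).to_bytes(4, "big")
    return data + check  # redundancy unused in normal operation

def verify_sector(stored: bytes) -> bool:
    data, check = stored[:-4], stored[-4:]
    return zlib.crc32(data).to_bytes(4, "big") == check

sector = write_sector(b"critical record: payment of 100 EUR")
assert verify_sector(sector)
tampered = sector.replace(b"100", b"900")  # alter the stored data
assert not verify_sector(tampered)
```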
Viscoelastic properties of green wood across the grain measured by harmonic tests in the range of 0 °C to 95 °C. Hardwood vs. softwood and normal wood vs. reaction wood
The viscoelastic properties of wood have been investigated with a dynamic mechanical analyser (DMA) specifically conceived for wooden materials, the WAVET device (environmental vibration analyser for wood). Measurements were carried out on four wood species in the temperature range of 0 °C to 100 °C at frequencies varying between 5 mHz and 10 Hz. Wood samples were tested in water-saturated conditions, in the radial and tangential directions. As expected, the radial direction always revealed a higher storage modulus than the tangential direction. Great differences were also observed in the loss factor: the tan δ peak and the internal friction are higher in the tangential direction than in the radial direction. This behaviour is attributed to the direction-dependent role of the anatomical elements. The viscoelastic behaviour of reaction wood differs from that of normal or opposite wood. Compression wood of spruce, which has a higher lignin content, is denser and stiffer in the transverse directions than normal wood, and has a lower softening temperature (Tg). In tension wood, the G-layer is weakly attached to the rest of the wall layers, which may explain why the storage modulus and the softening temperature of tension wood are lower than those of the opposite wood. In this work, we also point out that time–temperature equivalence holds only around the transition region, i.e. between Tg and Tg + 30 °C. Outside this region, the wood response combines the effects of all constitutive polymers, so that the equivalence is no longer valid.
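The time–temperature equivalence mentioned above is commonly expressed through a shift factor; one standard form (not stated in the abstract, shown for context) is the Williams–Landel–Ferry (WLF) equation, valid in a limited range above Tg, with material constants $C_1$ and $C_2$:

```latex
% WLF shift factor: time-temperature superposition near the glass transition
\log a_T = \frac{-C_1\,(T - T_g)}{C_2 + (T - T_g)}
```

Outside the transition region each wood polymer contributes its own relaxation, so no single shift factor superposes the curves — consistent with the abstract's finding.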
Object Manipulation in Virtual Reality Under Increasing Levels of Translational Gain
Room-scale Virtual Reality (VR) has become an affordable consumer reality, with applications ranging from entertainment to productivity. However, the limited physical space available for room-scale VR in the typical home or office environment poses a significant problem. To solve this, physical spaces can be extended by amplifying the mapping of physical to virtual movement (translational gain). Although amplified movement has been used since the earliest days of VR, little is known about how it influences reach-based interactions with virtual objects, now a standard feature of consumer VR. Consequently, this paper explores the picking and placing of virtual objects in VR for the first time, with translational gains of between 1x (a one-to-one mapping of a 3.5 m × 3.5 m virtual space to the same-sized physical space) and 3x (10.5 m × 10.5 m virtual mapped to 3.5 m × 3.5 m physical). Results show that reaching accuracy is maintained for gains of up to 2x; however, going beyond this diminishes accuracy and increases simulator sickness and perceived workload. We suggest gain levels of 1.5x to 1.75x can be utilized without compromising the usability of a VR task, significantly expanding the bounds of interactive room-scale VR.
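Translational gain simply amplifies tracked physical movement into virtual movement. A minimal sketch, assuming (as is common, though not specified in the abstract) that the gain is applied uniformly in the horizontal plane about the tracking origin with the vertical axis left unscaled:

```python
def apply_gain(physical_pos, gain):
    # physical_pos is (x, y, z) in metres with y up; only the horizontal
    # plane is scaled, so the user's eye height stays natural.
    x, y, z = physical_pos
    return (x * gain, y, z * gain)

# A 3x gain maps a 3.5 m x 3.5 m physical space onto 10.5 m x 10.5 m virtual.
assert apply_gain((3.5, 1.7, 3.5), 3.0) == (10.5, 1.7, 10.5)
```

At 1x the mapping is the identity; the study's finding is that the usable range of this scalar tops out around 2x before accuracy and comfort degrade.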
Standardized or simple effect size: what should be reported?
It is regarded as best practice for psychologists to report effect size when disseminating quantitative research findings. Reporting of effect size in the psychological literature is patchy – though this may be changing – and when it is reported it is far from clear that appropriate effect size statistics are employed. This paper considers the practice of reporting point estimates of standardized effect size and explores factors such as reliability, range restriction and differences in design that distort standardized effect size unless suitable corrections are employed. For most purposes simple (unstandardized) effect size is more robust and versatile than standardized effect size. Guidelines for deciding what effect size metric to use and how to report it are outlined. Foremost among these are: (i) a preference for simple effect size over standardized effect size, and (ii) the use of confidence intervals to indicate a plausible range of values the effect might take. Deciding on the appropriate effect size statistic to report always requires careful thought and should be influenced by the goals of the researcher, the context of the research and the potential needs of readers.
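The contrast the abstract draws can be made concrete with made-up data: the simple effect size is the raw mean difference in the original measurement units, reported with a confidence interval, while Cohen's d rescales that difference by the pooled sample SD — which is exactly where unreliability and range restriction distort the estimate.

```python
import math
import statistics as stats

# Illustrative (fabricated) scores for two groups, same measurement scale.
a = [5.1, 6.3, 5.8, 7.0, 6.1, 5.5, 6.8, 6.0]
b = [4.2, 5.0, 4.8, 5.9, 4.5, 5.1, 4.9, 5.3]

# Simple effect size: raw mean difference, with an approximate 95% CI
# (normal approximation used here for brevity).
diff = stats.mean(a) - stats.mean(b)
se = math.sqrt(stats.variance(a) / len(a) + stats.variance(b) / len(b))
ci = (diff - 1.96 * se, diff + 1.96 * se)

# Standardized effect size: Cohen's d with the pooled sample SD.
sp = math.sqrt(((len(a) - 1) * stats.variance(a)
                + (len(b) - 1) * stats.variance(b))
               / (len(a) + len(b) - 2))
d = diff / sp

print(f"mean difference = {diff:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
print(f"Cohen's d = {d:.2f}")
```

Anything that inflates the sample SD (measurement error, a restricted range of scores) shrinks d without changing the raw difference, which is the paper's argument for preferring the simple effect size plus an interval.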
