Tests for predicting complications of pre-eclampsia: A protocol for systematic reviews
Background
Pre-eclampsia is associated with several complications. Early prediction of complications and timely management are needed in the clinical care of these patients to avert fetal and maternal mortality and morbidity. There is a need to identify the best testing strategies in pre-eclampsia for detecting women at increased risk of complications. We aim to determine the accuracy of various tests to predict complications of pre-eclampsia through systematic quantitative reviews.
Method
We performed an extensive search in MEDLINE (1951–2004) and EMBASE (1974–2004), and will also include manual searches of the bibliographies of primary and review articles. An initial search revealed 19,500 citations. Two reviewers will independently select studies and extract data on study characteristics, quality and accuracy. Accuracy data will be used to construct 2 × 2 tables. Data synthesis will involve assessment for heterogeneity and appropriate pooling of results to produce summary Receiver Operating Characteristic (ROC) curves and summary likelihood ratios.
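The accuracy measures that feed the planned meta-analysis follow directly from the 2 × 2 table of test result against outcome. As an illustration only (the counts below are hypothetical, not data from the review), the per-study quantities can be sketched as:

```python
# Sketch: accuracy measures from a diagnostic 2x2 table. The TP/FP/FN/TN
# counts are hypothetical and for illustration only.

def accuracy_measures(tp, fp, fn, tn):
    """Return sensitivity, specificity, and positive/negative likelihood ratios."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    lr_pos = sensitivity / (1 - specificity)   # LR+ = sens / (1 - spec)
    lr_neg = (1 - sensitivity) / specificity   # LR- = (1 - sens) / spec
    return sensitivity, specificity, lr_pos, lr_neg

sens, spec, lrp, lrn = accuracy_measures(tp=90, fp=20, fn=10, tn=80)
```

Per-study pairs of sensitivity and (1 − specificity) are the points from which a summary ROC curve is fitted, while the likelihood ratios are pooled directly.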
Discussion
This review will generate predictive information and integrate it with therapeutic effectiveness to determine the absolute benefit and harm of available therapy in reducing complications in women with pre-eclampsia.
Looming struggles over technology for border control
New technologies under development, capable of inflicting pain on masses of people, could be used for border control against asylum seekers. Implementation might be rationalized by the threat of mass migration due to climate change, nuclear disaster or exaggerated fears of refugees created by governments. We focus on taser anti-personnel mines, suggesting both technological countermeasures and ways of making the use of such technology politically counterproductive. We also outline several other types of ‘non-lethal’ technology that could be used for border control and raise human rights concerns: high-powered microwaves, armed robots, wireless tasers, acoustic devices/vortex rings, ionizing and pulsed energy lasers, chemical calmatives, convulsants, bioregulators and malodorants. Whether all these possible border technologies will be implemented is a matter for speculation, but their serious human rights implications warrant advance scrutiny.
Recommendations on the nature of the passenger response time distribution to be used in the MSC 1033 assembly time analysis based on data derived from sea trials
The passenger response time distributions adopted by the International Maritime Organisation (IMO) in their assessment of the assembly time for passenger ships involve two key assumptions. The first is that the response time distribution assumes the form of a uniform random distribution, and the second concerns the actual response times. These two assumptions are core to the validity of the IMO analysis but are not based on real data, being the recommendations of an IMO committee. In this paper, response time data collected from assembly trials conducted at sea on a real passenger vessel using actual passengers are presented and discussed. Unlike the IMO-specified response time distributions, the data collected from these trials display a log-normal distribution, similar to that found in land-based environments. Based on these data, response time distributions for use in the IMO assembly time analysis for the day and night scenarios are suggested.
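The distinction between the two distributional assumptions is easy to make concrete. A minimal sketch, with entirely hypothetical parameter values (the paper's fitted day/night parameters are not reproduced here), of drawing response times under each assumption:

```python
import random

# Sketch: passenger response times (seconds) under the two competing
# assumptions. The mu/sigma and (low, high) values are hypothetical
# placeholders, not the trial-fitted parameters.
def sample_lognormal(n, mu=4.0, sigma=0.5, seed=1):
    """Log-normal response times, as observed in the sea trials."""
    rng = random.Random(seed)
    return [rng.lognormvariate(mu, sigma) for _ in range(n)]

def sample_uniform(n, low=0.0, high=300.0, seed=1):
    """Uniform random response times, as assumed in the IMO analysis."""
    rng = random.Random(seed)
    return [rng.uniform(low, high) for _ in range(n)]

times = sample_lognormal(1000)
```

The log-normal form concentrates most passengers at short response times with a long right tail of slow responders, which is qualitatively different from the flat uniform profile and so changes the predicted assembly-time distribution.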
Four-dimensional Cone Beam CT Reconstruction and Enhancement using a Temporal Non-Local Means Method
Four-dimensional Cone Beam Computed Tomography (4D-CBCT) has been developed
to provide respiratory phase resolved volumetric imaging in image guided
radiation therapy (IGRT). An inadequate number of projections in each phase bin
results in low-quality 4D-CBCT images with obvious streaking artifacts. In this
work, we propose two novel 4D-CBCT algorithms: an iterative reconstruction
algorithm and an enhancement algorithm, utilizing a temporal nonlocal means
(TNLM) method. We define a TNLM energy term for a given set of 4D-CBCT images.
Minimization of this term favors those 4D-CBCT images such that any anatomical
features at one spatial point at one phase can be found in a nearby spatial
point at neighboring phases. 4D-CBCT reconstruction is achieved by minimizing a
total energy containing a data fidelity term and the TNLM energy term. As for
the image enhancement, 4D-CBCT images generated by the FDK algorithm are
enhanced by minimizing the TNLM function while keeping the enhanced images
close to the FDK results. A forward-backward splitting algorithm and a
Gauss-Jacobi iteration method are employed to solve the problems. The
algorithms are implemented on GPU to achieve a high computational efficiency.
The reconstruction algorithm and the enhancement algorithm generate visually
similar 4D-CBCT images, both better than the FDK results. Quantitative
evaluations indicate that, compared with the FDK results, our reconstruction
method improves contrast-to-noise-ratio (CNR) by a factor of 2.56~3.13 and our
enhancement method increases the CNR by 2.75~3.33 times. The enhancement method
also removes over 80% of the streak artifacts from the FDK results. The total
computation time is ~460 sec for the reconstruction algorithm and ~610 sec for
the enhancement algorithm on an NVIDIA Tesla C1060 GPU card. Comment: 20 pages, 3 figures, 2 tables
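The core of the TNLM idea is an energy that is small when features at one phase can be matched to similar features at neighboring phases. A minimal 1D sketch of such an energy term (the actual operator works on 3D volumes with patch-based weights; the search window and smoothing parameter h here are hypothetical):

```python
import math

# Sketch of a temporal non-local means (TNLM) energy on 1D signals standing in
# for two neighboring 4D-CBCT phase images. Single-pixel "patches", a small
# search window, and the parameter h are simplifying assumptions.
def tnlm_energy(phase_a, phase_b, search=2, h=1.0):
    """Sum over x and nearby y of w(x, y) * (a[x] - b[y])^2, where the
    Gaussian weight w is large for similar intensities."""
    n = len(phase_a)
    energy = 0.0
    for x in range(n):
        for y in range(max(0, x - search), min(n, x + search + 1)):
            diff = phase_a[x] - phase_b[y]
            w = math.exp(-(diff * diff) / (h * h))
            energy += w * diff * diff
    return energy

flat = tnlm_energy([2.0] * 5, [2.0] * 5)   # identical phases: zero energy
```

Minimizing a total objective of data fidelity plus this term then favors 4D-CBCT phase images whose anatomy varies smoothly across the respiratory cycle, which is what suppresses the per-phase streaking.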
A GPU-based finite-size pencil beam algorithm with 3D-density correction for radiotherapy dose calculation
Targeting at the development of an accurate and efficient dose calculation
engine for online adaptive radiotherapy, we have implemented a finite size
pencil beam (FSPB) algorithm with a 3D-density correction method on GPU. This
new GPU-based dose engine is built on our previously published ultrafast FSPB
computational framework [Gu et al. Phys. Med. Biol. 54 6287-97, 2009].
Dosimetric evaluations against Monte Carlo dose calculations are conducted on
10 IMRT treatment plans (5 head-and-neck cases and 5 lung cases). For all
cases, the 3D-density correction improves on the conventional FSPB algorithm,
and for most cases the improvement is significant.
Regarding the efficiency, because of the appropriate arrangement of memory
access and the usage of GPU intrinsic functions, the dose calculation for an
IMRT plan can be accomplished well within 1 second (except for one case) with
this new GPU-based FSPB algorithm. Compared to the previous GPU-based FSPB
algorithm without 3D-density correction, this new algorithm, though slightly
sacrificing the computational efficiency (~5-15% lower), has significantly
improved the dose calculation accuracy, making it more suitable for online IMRT
replanning
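The essence of a density correction in a pencil beam model is to evaluate the depth-dose curve at the radiological (density-scaled) depth rather than the geometric depth, so low-density lung voxels attenuate the beam less. A toy sketch of that idea, with a hypothetical exponential depth-dose curve standing in for the paper's measured kernels:

```python
import math

# Sketch of density correction for a pencil-beam depth-dose lookup.
# The exponential model, mu, voxel size, and densities are all
# hypothetical illustrations, not the paper's FSPB kernel.
def radiological_depth(densities, voxel_size):
    """Cumulative density-scaled depth along the beam axis, one value per voxel."""
    depth, out = 0.0, []
    for rho in densities:
        depth += rho * voxel_size
        out.append(depth)
    return out

def pencil_beam_dose(densities, voxel_size=0.5, mu=0.05):
    # Evaluate the toy depth-dose curve at each voxel's radiological depth.
    return [math.exp(-mu * d) for d in radiological_depth(densities, voxel_size)]

water_lung_water = [1.0, 1.0, 0.3, 0.3, 1.0]   # relative densities
dose = pencil_beam_dose(water_lung_water)
```

In this sketch the dose falls off more slowly across the low-density voxels, which is the qualitative behavior a 3D-density correction restores relative to a water-equivalent pencil beam.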
3D tumor localization through real-time volumetric x-ray imaging for lung cancer radiotherapy
Recently we have developed an algorithm for reconstructing volumetric images
and extracting 3D tumor motion information from a single x-ray projection. We
have demonstrated its feasibility using a digital respiratory phantom with
regular breathing patterns. In this work, we present a detailed description and
a comprehensive evaluation of the improved algorithm. The algorithm was
improved by incorporating respiratory motion prediction. The accuracy and
efficiency were then evaluated on 1) a digital respiratory phantom, 2) a
physical respiratory phantom, and 3) five lung cancer patients. These
evaluation cases include both regular and irregular breathing patterns that are
different from the training dataset. For the digital respiratory phantom with
regular and irregular breathing, the average 3D tumor localization error is
less than 1 mm. On an NVIDIA Tesla C1060 GPU card, the average computation time
for 3D tumor localization from each projection ranges between 0.19 and 0.26
seconds, for both regular and irregular breathing, which is about a 10%
improvement over previously reported results. For the physical respiratory
phantom, an average tumor localization error below 1 mm was achieved with an
average computation time of 0.13 and 0.16 seconds on the same GPU card, for
regular and irregular breathing, respectively. For the five lung cancer
patients, the average tumor localization error is below 2 mm in both the axial
and tangential directions. The average computation time on the same GPU card
ranges between 0.26 and 0.34 seconds
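The reconstruction step can be pictured as fitting the coefficients of a deformation model so that the simulated projection of the deformed volume matches the measured x-ray projection. A deliberately tiny 1D sketch of that fit (one principal component, sum-along-ray projection, grid search; all values hypothetical):

```python
# Toy sketch of single-projection localization: deform a mean image along one
# principal component and pick the coefficient whose simulated projection best
# matches the measured one. All vectors and values here are hypothetical.

def project(image):
    return sum(image)  # toy "x-ray projection": sum along the ray

def localize(mean_img, pc, measured_proj, coeffs):
    best_c, best_err = None, float("inf")
    for c in coeffs:
        deformed = [m + c * p for m, p in zip(mean_img, pc)]
        err = abs(project(deformed) - measured_proj)
        if err < best_err:
            best_c, best_err = c, err
    return best_c

mean_img = [1.0, 2.0, 3.0]
pc = [0.5, -0.2, 0.1]
true_c = 2.0
measured = project([m + true_c * p for m, p in zip(mean_img, pc)])
c_hat = localize(mean_img, pc, measured, [i * 0.5 for i in range(-8, 9)])
```

The recovered coefficient, applied to the motion model, yields the 3D tumor position; in the actual algorithm this per-projection fit is what the respiratory motion prediction warm-starts and the GPU accelerates.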
PCA-based lung motion model
Organ motion induced by respiration may cause clinically significant
targeting errors and greatly degrade the effectiveness of conformal
radiotherapy. It is therefore crucial to be able to model respiratory motion
accurately. A recently proposed lung motion model based on principal component
analysis (PCA) has been shown to be promising on a few patients. However, there
is still a need to understand the underlying reason why it works. In this
paper, we present a much deeper and detailed analysis of the PCA-based lung
motion model. We provide the theoretical justification of the effectiveness of
PCA in modeling lung motion. We also prove that, under certain conditions, the
PCA motion model is equivalent to the 5D motion model, which is based on the
physiology and anatomy of the lung. The modeling power of the PCA model was
tested on clinical data, and the average 3D error was found to be below 1 mm.
Comment: 4 pages, 1 figure. Submitted to the International Conference on the Use of Computers in Radiation Therapy 201
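The PCA model represents each displacement vector field as the sample mean plus a few principal components scaled by time-varying coefficients. A minimal sketch of extracting the dominant component by power iteration on the sample covariance (the two-sample data here are hypothetical stand-ins for real displacement fields):

```python
# Sketch: dominant principal component of displacement-field samples via
# power iteration on the (unnormalised) sample covariance. The two 2D
# "displacement fields" below are hypothetical.

def dominant_component(samples, iters=200):
    n = len(samples[0])
    mean = [sum(s[i] for s in samples) / len(samples) for i in range(n)]
    centered = [[s[i] - mean[i] for i in range(n)] for s in samples]
    v = [1.0] * n
    for _ in range(iters):
        # w = C v with C = sum_k x_k x_k^T
        w = [0.0] * n
        for x in centered:
            dot = sum(xi * vi for xi, vi in zip(x, v))
            for i in range(n):
                w[i] += dot * x[i]
        norm = sum(wi * wi for wi in w) ** 0.5
        v = [wi / norm for wi in w]
    return mean, v

# Two breathing phases whose variation lies entirely along (1, 1)/sqrt(2):
samples = [[0.0, 0.0], [2.0, 2.0]]
mean, pc = dominant_component(samples)
```

A motion state is then approximated as mean + c(t) * pc; the equivalence result in the paper says that, under certain conditions, such low-rank coefficients play the same role as the physiological parameters of the 5D model.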
Rodenticide exposure of red fox (Vulpes vulpes) in Scotland, before and after the introduction of an industry stewardship scheme
Development of a GPU-based Monte Carlo dose calculation code for coupled electron-photon transport
Monte Carlo simulation is the most accurate method for absorbed dose
calculations in radiotherapy. Its efficiency still requires improvement for
routine clinical applications, especially for online adaptive radiotherapy. In
this paper, we report our recent development on a GPU-based Monte Carlo dose
calculation code for coupled electron-photon transport. We have implemented the
Dose Planning Method (DPM) Monte Carlo dose calculation package (Sempau et al,
Phys. Med. Biol., 45(2000)2263-2291) on GPU architecture under CUDA platform.
The implementation has been tested with respect to the original sequential DPM
code on CPU in phantoms with water-lung-water or water-bone-water slab
geometry. A 20 MeV mono-energetic electron point source or a 6 MV photon point
source is used in our validation. The results demonstrate adequate accuracy of
our GPU implementation for both electron and photon beams in radiotherapy
energy range. Speed-up factors of about 5.0~6.6 have been observed using an
NVIDIA Tesla C1060 GPU card against a 2.27 GHz Intel Xeon CPU processor.
Comment: 13 pages, 3 figures, and 1 table. Paper revised. Figures updated
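The basic Monte Carlo loop behind such a dose engine is simple to sketch: sample each photon's distance to interaction from an exponential distribution and score energy in the voxel where it interacts. This is a drastically simplified 1D illustration (hypothetical attenuation coefficient and geometry, local energy deposition only), not the coupled electron-photon physics that DPM actually tracks:

```python
import random

# Highly simplified Monte Carlo sketch: photons with exponentially sampled
# free paths deposit energy in 1D slab voxels. mu (per cm) and the geometry
# are hypothetical; real DPM transports secondary electrons as well.
def run_photons(n_photons, n_voxels=10, voxel_cm=1.0, mu=0.2, seed=7):
    rng = random.Random(seed)
    dose = [0.0] * n_voxels
    for _ in range(n_photons):
        depth = rng.expovariate(mu)        # distance to first interaction
        voxel = int(depth // voxel_cm)
        if voxel < n_voxels:               # photons past the slab escape
            dose[voxel] += 1.0             # deposit all energy locally
    return dose

dose = run_photons(10000)
```

Because every photon history is independent, this loop parallelizes naturally, one thread per history, which is what makes the GPU/CUDA implementation and its reported speed-ups possible.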
