The Swing of the Regulatory Pendulum in Europe: From Precautionary Principle to (Regulatory) Impact Analysis
Regulation in Europe is currently driven by three distinct, yet not entirely unrelated, factors: competitiveness, sustainable development and governance. Increasingly these factors influence both the need for, and concepts of, what the European Commission (the Commission) refers to as "better regulation". To ensure better regulation, two regulatory philosophies have been put forward, namely the precautionary principle and impact assessment. In this paper, I first briefly describe the current drivers of better regulation. Then I examine the use of these two regulatory philosophies in helping to achieve better regulation. In the final section I offer some speculations on the future development of European Union (EU) regulation. Will elements of the Commission and the EU member states operate in an even more precautionary environment, or will the implementation of the precautionary principle be seen as too costly, forcing regulators to resort to an even greater use of impact analysis?
Evaluation of Siting Strategies: The Case of Two UK Waste Tire Incinerators
Examines the circumstances that may have contributed to differing outcomes in the siting of similar facilities in the UK.
Risk Communication and Management in the 21st Century
Environmental, food, and health regulation in the UK and in many other European countries is in a state of crisis. Following a series of regulatory scandals, regulators decided that drastic changes were needed. There would be no more consensual-style regulation with closed-door deliberations between industry and regulators. The public's trust in regulators had disappeared, and the continually deteriorating situation was not helped by an increasingly aggressive media trying, directly or indirectly, to discredit the regulators by unnecessarily amplifying risks and, in many cases, manufacturing uncertainty. Regulators and their advisors took the view that the best way out of this quagmire would be to put forward a new model of regulatory decision-making. This model would be based on transparency throughout the regulatory process and would encourage public and stakeholder deliberation. The model would also promote risk-averse decision-making, such as the adoption of the precautionary principle, as regulators fear possible scandals lurking around the corner. Finally, in the new model, scientists are to a certain degree demoted. The new model of regulatory decision-making is not problem-free, however. It has a number of teething problems, which this paper addresses.
The Plateau-ing of the European Better Regulation Agenda: An Analysis of Activities Carried Out by the Barroso Commission
This paper examines the European Commission's (EC) Better Regulation Agenda from the time President Barroso came to power, in November 2004, to the 2006 summer recess. It focuses in particular on whether the Commission's regulatory thinking has moved away from the precautionary principle and towards Regulatory Impact Analysis (RIA), something I predicted in 2004 (Lofstedt 2004). The article summarizes the papers and communications in the Better Regulation area put forward by the Commission since November 2004, and makes a number of observations about how the Better Regulation Agenda may develop in the future. In conclusion I argue that the Commission's Better Regulation Agenda has plateaued, and that Commissioner Verheugen will not be successful in pushing the Agenda further forward because of issues such as REACH and opposition from member states, notably France. The analysis is based on a combination of desk research and interviews with policy-makers, regulators, academics and stakeholders who have been involved either in shaping or fighting the Better Regulation Agenda.
Continuation of Nesterov's Smoothing for Regression with Structured Sparsity in High-Dimensional Neuroimaging
Predictive models can be used on high-dimensional brain images for diagnosis of a clinical condition. Spatial regularization through structured sparsity offers new perspectives in this context and reduces the risk of overfitting the model, while providing interpretable neuroimaging signatures by forcing the solution to adhere to domain-specific constraints. Total Variation (TV) enforces spatial smoothness of the solution while segmenting predictive regions from the background. We consider the problem of minimizing the sum of a smooth convex loss, a non-smooth convex penalty (whose proximal operator is known) and a wide range of possible complex, non-smooth convex structured penalties such as TV or overlapping group Lasso. Existing solvers are limited either in the functions they can minimize or in their practical capacity to scale to high-dimensional imaging data. Nesterov's smoothing technique can be used to minimize a large number of non-smooth convex structured penalties, but reasonable precision requires a small smoothing parameter, which slows down the convergence speed. To benefit from the versatility of Nesterov's smoothing technique, we propose a first-order continuation algorithm, CONESTA, which automatically generates a sequence of decreasing smoothing parameters. The generated sequence maintains the optimal convergence speed towards any globally desired precision. Our main contributions are: an expression of the duality gap that probes the current distance to the global optimum, used to adapt the smoothing parameter and hence the convergence speed; and a convergence rate that improves over classical proximal gradient smoothing methods. We demonstrate on both simulated and high-dimensional structural neuroimaging data that CONESTA significantly outperforms many state-of-the-art solvers with regard to convergence speed and precision.
Comment: 11 pages, 6 figures, accepted in IEEE TMI, IEEE Transactions on Medical Imaging 201
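To make the idea concrete, the sketch below illustrates Nesterov smoothing with continuation for a one-dimensional total-variation penalty: the non-smooth TV term is replaced by a smoothed surrogate whose gradient is available in closed form, the smoothed problem is solved by plain gradient descent, and the smoothing parameter mu is then tightened geometrically. This is only a minimal illustration of the technique, not the paper's CONESTA: the duality-gap-driven schedule and the accelerated (FISTA-type) inner solver are omitted, and all names and parameter values are illustrative.

```python
import numpy as np

def tv_smoothed_grad(beta, lam, mu):
    """Gradient of the Nesterov-smoothed 1D total-variation penalty lam * ||D beta||_1."""
    d = np.diff(beta)                      # D beta, first differences
    alpha = np.clip(d / mu, -1.0, 1.0)     # dual maximizer of the smoothed penalty
    grad = np.zeros_like(beta)
    grad[:-1] -= alpha                     # accumulate D^T alpha
    grad[1:] += alpha
    return lam * grad

def smoothing_continuation(X, y, lam, mu0=1.0, n_outer=8, n_inner=200):
    """Minimize 0.5*||X beta - y||^2 + lam*TV(beta) by Nesterov smoothing with a
    geometrically decreasing smoothing parameter (illustrative, not CONESTA itself)."""
    p = X.shape[1]
    beta = np.zeros(p)
    lip_loss = np.linalg.norm(X, 2) ** 2   # Lipschitz constant of the loss gradient
    mu = mu0
    for _ in range(n_outer):
        lip = lip_loss + lam * 4.0 / mu    # + Lipschitz constant of the smoothed TV gradient
        step = 1.0 / lip
        for _ in range(n_inner):
            grad = X.T @ (X @ beta - y) + tv_smoothed_grad(beta, lam, mu)
            beta = beta - step * grad
        mu *= 0.5                          # continuation: tighten the smoothing
    return beta

# toy usage: recover a piecewise-constant signal from noisy linear measurements
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 50))
beta_true = np.concatenate([np.zeros(20), np.ones(15), np.zeros(15)])
y = X @ beta_true + 0.1 * rng.standard_normal(100)
beta_hat = smoothing_continuation(X, y, lam=1.0)
```

The continuation loop is the point of interest: each outer pass halves mu, so early passes make fast progress on a coarse surrogate and later passes refine the solution against a surrogate that is closer to the true non-smooth objective.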
Simulated Data for Linear Regression with Structured and Sparse Penalties
A very active field of research in Bioinformatics is the integration of structure into Machine Learning methods. Recently developed methods claim to simultaneously link the computed model to the graphical structure of the data set and select a handful of important features in the analysis. However, there is still no way to simulate data for which we can separate the three properties that such methods claim to achieve. These properties are: (i) the sparsity of the solution, i.e., the fact that the model is based on a few features of the data; (ii) the structure of the model; (iii) the relation between the structure of the model and the graphical model behind the generation of the data.
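As a point of reference, here is a small, hypothetical sketch of simulated data exhibiting the three properties: a sparse coefficient vector (i), organized in contiguous blocks along a chain graph (ii), drawn on a design whose correlation decays along the same graph (iii). It only illustrates the kind of data in question, not the simulation scheme the paper proposes; all sizes and constants are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(42)
n, p = 200, 100

# (iii) design whose correlation follows a chain graph: corr(x_j, x_k) = 0.7^|j-k|
cov = 0.7 ** np.abs(np.subtract.outer(np.arange(p), np.arange(p)))
X = rng.multivariate_normal(np.zeros(p), cov, size=n)

# (i) sparsity and (ii) structure: a few non-zero weights, constant over contiguous blocks
beta = np.zeros(p)
beta[10:20] = 1.0      # one active, spatially contiguous block
beta[60:65] = -0.5     # a second, smaller block

# response with a fixed signal-to-noise ratio
snr = 3.0
signal = X @ beta
noise = rng.standard_normal(n)
y = signal + noise * np.linalg.norm(signal) / (snr * np.linalg.norm(noise))
```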
Why aren't more veterinary practices owned or led by women?
The increasing proportion of women among the body of UK veterinary surgeons practising clinical medicine has been consistently highlighted in RCVS surveys (RCVS 2006, 2010, 2014a). Despite women outnumbering men in clinical practice (57% v 43%) in 2014 (RCVS 2014a), they do not own veterinary practices or hold practice partnerships or leadership positions in the proportions that might be expected, even when adjusting for age and experience (RCVS 2014b).
Common and Distinct Components in Data Fusion
In many areas of science, multiple sets of data are collected pertaining to the same system. Examples are food products that are characterized by different sets of variables, bio-processes that are sampled on-line with different instruments, or biological systems of which different genomics measurements are obtained. Data fusion is concerned with analyzing such sets of data simultaneously to arrive at a global view of the system under study. One of the upcoming areas of data fusion is exploring whether the data sets have something in common or not. This gives insight into the common and distinct variation in each data set, thereby facilitating understanding of the relationships between the data sets. Unfortunately, research on methods to distinguish common and distinct components is fragmented, both in terminology and in methods: there is no common ground, which hampers comparing methods and understanding their relative merits. This paper provides a unifying framework for this subfield of data fusion using rigorous arguments from linear algebra. The most frequently used methods for distinguishing common and distinct components are explained in this framework, and some practical examples of these methods are given in the areas of (medical) biology and food science.
Comment: 50 pages, 12 figures
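A minimal illustration of the common-versus-distinct idea, assuming two blocks that share the same samples in their rows: the dominant subspace of each block is extracted by SVD, and the cosines of the principal angles between the two subspaces indicate which directions are (nearly) common and which are distinct. This is a generic heuristic sketch, not any of the specific methods reviewed in the paper; the function name and the threshold are made up for the illustration.

```python
import numpy as np

def common_distinct(X1, X2, r1, r2, tol=0.9):
    """Heuristic split of two blocks' dominant sample-mode subspaces into
    common and distinct directions via principal angles (illustrative only)."""
    # orthonormal bases of the dominant column subspaces (samples in rows)
    U1 = np.linalg.svd(X1, full_matrices=False)[0][:, :r1]
    U2 = np.linalg.svd(X2, full_matrices=False)[0][:, :r2]
    # singular values of U1^T U2 are the cosines of the principal angles
    Uc, cosines, _ = np.linalg.svd(U1.T @ U2)
    n_common = int(np.sum(cosines >= tol))   # near-parallel directions = common
    common_basis = U1 @ Uc[:, :n_common]     # common subspace, in block 1's frame
    return common_basis, cosines

# toy usage: two blocks sharing one latent factor plus a block-specific factor each
rng = np.random.default_rng(1)
t_common = rng.standard_normal((50, 1))
X1 = t_common @ rng.standard_normal((1, 30)) + rng.standard_normal((50, 1)) @ rng.standard_normal((1, 30))
X2 = t_common @ rng.standard_normal((1, 20)) + rng.standard_normal((50, 1)) @ rng.standard_normal((1, 20))
basis, cosines = common_distinct(X1, X2, r1=2, r2=2)
```

Directions whose cosine is close to one would be read as common variation; the remaining dominant directions of each block would be read as distinct.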
