Estimation in the group action channel
We analyze the problem of estimating a signal from multiple measurements on a \mbox{group action channel} that linearly transforms a signal by a random group action followed by a fixed projection and additive Gaussian noise. This channel is motivated by applications such as multi-reference alignment and cryo-electron microscopy. We focus on the large-noise regime prevalent in these applications. We give a lower bound on the mean square error (MSE) of any asymptotically unbiased estimator of the signal's orbit in terms of the signal's moment tensors, which implies that the MSE is bounded away from 0 when $N/\sigma^{2d}$ is bounded from above, where $N$ is the number of observations, $\sigma$ is the noise standard deviation, and $d$ is the so-called \mbox{moment order cutoff}. In contrast, the maximum likelihood estimator is shown to be consistent if $N/\sigma^{2d}$ diverges.
Comment: 5 pages, conference
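As a rough illustration of this observation model (a sketch with hypothetical values; the cyclic-shift group of multi-reference alignment stands in for a general group action, and the projection is taken to be the identity), one can simulate the channel and form the empirical moment tensors on which such estimators rely:

```python
import numpy as np

# Hypothetical multi-reference alignment instance of the group action
# channel: each observation is a random cyclic shift of the signal x
# plus additive Gaussian noise of standard deviation sigma.
rng = np.random.default_rng(0)

L, N, sigma = 16, 1000, 0.5          # signal length, observations, noise std
x = rng.standard_normal(L)           # unknown signal (illustrative values)

shifts = rng.integers(0, L, size=N)  # random group elements (cyclic shifts)
obs = np.stack([np.roll(x, s) for s in shifts]) \
      + sigma * rng.standard_normal((N, L))

# Empirical first and second moment tensors of the observations.
m1 = obs.mean(axis=0)                       # estimates a shift-invariant of x
m2 = np.einsum('ni,nj->ij', obs, obs) / N   # second moment tensor

print(round(float(m1.mean()), 3), round(float(x.mean()), 3))
```

Because the group action is unknown, only quantities invariant under it (such as these moment tensors, after bias correction for the noise) carry information about the orbit of the signal.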
Putting the wood back into our rivers: an experiment in river rehabilitation
This paper presents an overview of a project established to assess the effectiveness of woody debris (WD) reintroduction as a river rehabilitation tool. An outline of an experiment is presented that aims to develop and assess the effectiveness of engineered log jams (ELJs) under Australian conditions, and to demonstrate the potential for using a range of ELJs to stabilise a previously de-snagged, high-energy gravel-bed channel. Furthermore, the experiment will test the effectiveness of a reach-based rehabilitation strategy to increase geomorphic variability and hence habitat diversity. While primarily focusing on the geomorphic and engineering aspects of the rehabilitation strategy, the project is also monitoring fish and freshwater mussel populations. The project is located within an 1100 m reach of the Williams River, NSW. Twenty separate ELJ structures were constructed, incorporating a total of 430 logs placed without any artificial anchoring (e.g., no cabling or imported ballast). A geomorphic control reach was established 3.1 km upstream of the project reach. In the 6 months since the structures were built, the study site has experienced 6 flows that have overtopped most structures; 3 of the flows exceeded the mean annual flood, inundating 19 of the ELJs by 2-3 m and one by 0.5 m. Early results indicate that, with the exception of LS4 and LS5, all structures are performing as intended and that the geomorphic variability of the reach has substantially increased.
Local Algorithms for Block Models with Side Information
There has been a recent interest in understanding the power of local
algorithms for optimization and inference problems on sparse graphs. Gamarnik
and Sudan (2014) showed that local algorithms are weaker than global algorithms
for finding large independent sets in sparse random regular graphs. Montanari
(2015) showed that local algorithms are suboptimal for finding a community with
high connectivity in sparse Erd\H{o}s-R\'enyi random graphs. For the
symmetric planted partition problem (also named community detection for the
block models) on sparse graphs, a simple observation is that local algorithms
cannot have non-trivial performance.
In this work we consider the effect of side information on local algorithms
for community detection under the binary symmetric stochastic block model. In
the block model with side information each of the $n$ vertices is labeled $+$ or $-$ independently and uniformly at random; each pair of vertices is connected independently with probability $a/n$ if both of them have the same label or with probability $b/n$ otherwise. The goal is to estimate the underlying vertex labeling given 1) the graph structure and 2) side information in the form of a vertex labeling positively correlated with the true one. Assuming that the ratio between in- and out-degree $a/b$ is $\Theta(1)$ and the average degree $(a+b)/2 = n^{o(1)}$, we characterize three different regimes under which a
local algorithm, namely, belief propagation run on the local neighborhoods,
maximizes the expected fraction of vertices labeled correctly. Thus, in
contrast to the case of symmetric block models without side information, we
show that local algorithms can achieve optimal performance for the block model
with side information.
Comment: Due to the limitation "The abstract field cannot be longer than 1,920 characters", the abstract here is shorter than that in the PDF file
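As a minimal sketch of the kind of local procedure the abstract describes (all parameter values here are hypothetical, and the edge coupling w = (1/2)log(a/b) is the usual sparse-graph approximation, not necessarily the paper's exact message rule), one round of belief propagation seeded by the side information looks like:

```python
import numpy as np

# Hypothetical binary symmetric SBM with noisy side information, followed
# by a single local belief-propagation-style update per vertex.
rng = np.random.default_rng(1)

n, a, b, alpha = 2000, 8.0, 2.0, 0.7   # vertices, edge rates, side-info accuracy
labels = rng.choice([-1, 1], size=n)    # hidden community labels

# Sample edges: probability a/n within a community, b/n across.
same = labels[:, None] == labels[None, :]
upper = np.triu(rng.random((n, n)) < np.where(same, a / n, b / n), k=1)
adj = (upper | upper.T).astype(float)

side = np.where(rng.random(n) < alpha, labels, -labels)  # noisy vertex labels

h = 0.5 * np.log(alpha / (1 - alpha)) * side   # half-LLR from side information
w = 0.5 * np.log(a / b)                         # edge coupling (approximation)

# One BP round: combine own evidence with attenuated neighbor evidence,
# then take the sign of the belief as the label estimate.
belief = h + adj @ np.arctanh(np.tanh(w) * np.tanh(h))
estimate = np.sign(belief)

accuracy = float(np.mean(estimate == labels))
print(round(accuracy, 3))   # typically improves on the side info alone
```

Note that the side information breaks the +/- label symmetry, so accuracy can be measured directly against the true labels; without it, local algorithms have no non-trivial performance on the symmetric block model, as the abstract notes.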
Super-resolution provided by the arbitrarily strong superlinearity of the blackbody radiation
Blackbody radiation is a fundamental phenomenon in nature, and its explanation by Planck marks a cornerstone in the history of physics. In this theoretical work, we show that the spectral radiance given by Planck's law is strongly superlinear with temperature, with an arbitrarily large local exponent for decreasing wavelengths. From that scaling analysis, we propose a new concept of super-resolved detection and imaging: if a focused beam of energy is scanned over an object that absorbs and linearly converts that energy into heat, a highly nonlinear thermal radiation response is generated, and its point spread function can be made arbitrarily smaller than the excitation beam focus. Based on a few practical scenarios, we propose to extend the notion of super-resolution beyond its current niche in microscopy to various kinds of excitation beams, a wide range of spatial scales, and a broader diversity of target objects.
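The superlinearity claim can be checked directly from Planck's law: for B(lambda, T) proportional to lambda^-5 / (e^x - 1) with x = hc/(lambda*k_B*T), the local exponent d ln B / d ln T equals x*e^x/(e^x - 1), which grows without bound as the wavelength decreases. A short numerical sketch (illustrative wavelengths and temperature):

```python
import numpy as np

# Local temperature exponent of Planck's spectral radiance,
# d ln B / d ln T = x * e^x / (e^x - 1),  x = h*c / (lambda * k_B * T).
# Constants in SI units (CODATA exact values).
h, c, kB = 6.62607015e-34, 2.99792458e8, 1.380649e-23

def local_exponent(wavelength, T):
    """d ln B / d ln T for Planck's law at a given wavelength and temperature."""
    x = h * c / (wavelength * kB * T)
    return x * np.exp(x) / (np.exp(x) - 1.0)

T = 300.0  # room temperature
for lam in (10e-6, 1e-6, 0.5e-6):          # 10 um, 1 um, 0.5 um
    print(lam, round(float(local_exponent(lam, T)), 1))
```

At 300 K the thermal radiance already scales roughly as the 5th power of T at 10 um and near the 100th power at 0.5 um, consistent with the arbitrarily strong nonlinearity the abstract invokes.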
Women and Philanthropy: The Next Fundraising Frontier
As universities seek private contributions from the community, they must turn to an obvious source of income that has been under-asked and under-cultivated: women. Historically, women have not provided sacrificial gifts, primarily because they have not been financially or psychologically prepared to do so. Today, however, several forces (education and career choices, business acumen, charitable giving attitudes, wealth, and life expectancy) converge to make this the opportune time to solicit women for major gifts. Yet it is not enough to recognize this fact. Universities must create new models of fund-raising strategies that appeal to women. In doing so, universities must also consider generational differences that significantly affect women's decisions to give.
Hard axis magnetization behavior and the surface spin flop transition in antiferromagnetic Fe/Cr(100) superlattices
Premature Cell Senescence and T Cell Receptor‐Independent Activation of CD8+ T Cells in Juvenile Idiopathic Arthritis
Peer Reviewed
http://deepblue.lib.umich.edu/bitstream/2027.42/99044/1/art38015.pd
The Congressional Bureaucracy
Congress has a bureaucracy. This Article introduces the concept of the “congressional bureaucracy,” and theorizes what it means for Congress to have an internal workforce of more than 4,000 nonpartisan, highly specialized, and long-serving experts, without which the modern Congress could not function. These experts—not elected Members or their political staffs—write the text of the laws, audit implementation, research policy, estimate bills’ economic effects, decide which committees control legislation and which amendments can be made, edit and rearrange already-enacted (!) legislation into the law as we see it in the U.S. Code, and much more. The congressional bureaucracy furthers internal and external separation of powers, revives theories of Congress as a rational actor, and supplies key insight for statutory interpretation. But courts, lawyers, and legal scholars have almost entirely ignored its existence. This project is based on two years of confidential interviews with high-level staffers in Congress’s nine nonpartisan legislative institutions—the Office of the Law Revision Counsel; the Offices of the Legislative Counsels; the Congressional Research Service; the Government Accountability Office; the Parliamentarians; the Congressional Budget Office; the Joint Committee on Taxation; MedPAC and MACPAC—and additional interviews with partisan staff. The project furthers a new line of legislation scholarship about the value to theory and doctrine of understanding how Congress actually works. Courts cannot claim the doctrines of statutory interpretation are democratically linked to Congress, as virtually all judges do, without understanding how it writes legislation. Our research reveals that the congressional bureaucracy serves purposes previously unimagined by legal scholarship. Classic bureaucracy literature posits that Congress loses power when it delegates. 
But the congressional bureaucracy was explicitly founded so that Congress could reclaim and safeguard its own powers against an executive branch that was encroaching on the legislative process. The bureaucracy also safeguards Congress’s own internal separation of powers, the salutary decentralization of law-producing responsibilities among a collection of nonpartisan actors, preventing any one aspect of the lawmaking process from coming under undue political or centralized control. Understanding the congressional bureaucracy’s work also provocatively deconstructs the concept of a “statutory text.” The words Congress enacts are the result of a highly dialogic process that is triggered by and includes assumptions about critical inputs from the bureaucracy. Members and partisan staff focus on the substance of legislation at the macro level, not the specific words chosen at the micro level—that is the bureaucracy’s job. What we see when we open the statute books often is not even what Congress enacted or how Congress arranged it, because OLRC reorganizes and edits the laws after passage. So conceived, the concept of a “statute” is much more capacious than merely the “text” at the moment of the vote. None of this is illegitimate; Congress has set itself up this way. All of these inputs are part of the “text” as Congress intends it to be understood. Together, these institutions paint a picture of a Congress that is not as irrational as the public considers it to be. They also have on-the-ground lessons for statutory interpretation, highlighting critical inputs that courts miss and numerous statutory cues—from code placement to consistency of language to CBO scores—some of which courts dramatically overread, others of which should be attractive even to textualists because they result from formalist, objective, collectively congressional action. 
The field is now engaged in emerging debates about whether doctrine can absorb this kind of detail about the legislative process; understanding the congressional bureaucracy is a critical new piece of this account.
THE ENDURING RELEVANCE OF CONGRESS DESPITE THE COURT'S SHIFT TO “ORDINARY READER” STATUTORY INTERPRETATION
Has Congress become irrelevant to statutory interpretation? The dominant theoretical and doctrinal paradigm in American statutory interpretation has always been the conversation between Congress and the courts. Today, however, the Court’s new, second-generation textualists claim they have left Congress behind. They argue they have changed textualism’s perspective, from an “insider” perspective focused on Congress’s textual choices, to an “outsider” perspective based on how “ordinary people” read statutes. The Court’s self-professed shift away from a legiscentric approach, if true, would be a seismic shift in the conception of the judicial role. Whereas judges and scholars—including first-generation textualists—had for a century focused on legislative supremacy and Congress’s practices and intentions, today, a majority of the Court claims its role is something entirely different. Rather than serve as a “junior partner” of the legislature, the Court says its role is to enforce a populist conception of how regular people encounter statutes, as well as the value of fair notice. But as it turns out, divorcing statutory meaning from Congress is not as simple as it looks. Our review of statutory interpretation cases over the past six Terms illustrates that, despite their protests, even the most ardent textualists’ opinions that purport to turn on ordinary meaning are in fact riddled with implicit—and sometimes explicit—assumptions about congressional intent and how Congress drafts, including surreptitious uses of legislative history. This Essay explores the Court’s rhetorical shift and why it has not been complete in doctrinal implementation. The congressional perspective in fact remains ubiquitous in the Court’s interpretive work, even as the Court disavows it.
Penalty Corner Routines in Elite Women’s Indoor Field Hockey: Prediction of Outcomes based on Tactical Decisions
Indoor hockey is a highly competitive international sport, yet no research to date has investigated the key actions within this sport. As with outdoor field hockey, penalty corners represent one of the most likely situations in which goals can be scored. All 36 matches of the round-robin phase of the 2010-2011 England Hockey League Women’s Premier Division ‘Super Sixes’ competition were analysed with the purpose of establishing which factors can predict the scoring of a goal, using binary logistic regression analysis. Seventy-two (22.6%) of the 319 observed penalty corners resulted in a goal. The strongest predictor of scoring a goal was taking the penalty corner from the goalkeeper’s right. Based on the odds ratio (OR), the odds of the attacking team scoring were 2.27 (CI = 1.41-3.65) times higher with penalty corners taken from the goalkeeper’s right as opposed to the left. Additionally, if the goalkeeper decided to rush to the edge of the circle, the odds of the attacking team failing to score were 2.19 (CI = 1.18-4.08) times higher compared to when the goalkeeper remained near the goal line. These results suggest that strategic decisions by the players and coaches have an important part to play in the success of penalty corners. Future research should investigate the impact of goalkeepers’ movement and further examine the technical and tactical intricacies of penalty corners.
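For readers unfamiliar with the statistic, odds ratios of the kind reported above can be reproduced from a 2x2 outcome table; the counts below are hypothetical (the study's per-side counts are not given in the abstract), but the computation is the standard one:

```python
import math

# Hypothetical 2x2 table of penalty-corner outcomes:
#                 goal   no goal
# from the right  a=45    b=115
# from the left   c=27    d=132
a, b, c, d = 45, 115, 27, 132

odds_ratio = (a * d) / (b * c)                 # cross-product ratio
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)   # standard error of ln(OR)

# 95% confidence interval via the normal approximation on the log scale.
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(round(odds_ratio, 2), (round(lo, 2), round(hi, 2)))
```

An interval that excludes 1, as in both effects reported in the abstract, indicates a statistically reliable association between the tactical choice and the outcome.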
