Eulerian video magnification for revealing subtle changes in the world
Our goal is to reveal temporal variations in videos that are difficult or impossible to see with the naked eye and display them in an indicative manner. Our method, which we call Eulerian Video Magnification, takes a standard video sequence as input, and applies spatial decomposition, followed by temporal filtering to the frames. The resulting signal is then amplified to reveal hidden information. Using our method, we are able to visualize the flow of blood as it fills the face and also to amplify and reveal small motions. Our technique can run in real time to show phenomena occurring at the temporal frequencies selected by the user.
Funding: United States. Defense Advanced Research Projects Agency (DARPA SCENICC program); National Science Foundation (U.S.) (NSF CGV-1111415); Quanta Computer (Firm); Nvidia Corporation (Graduate Fellowship)
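The pipeline described above (spatial decomposition, temporal filtering, amplification) can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: it assumes a grayscale video held in a NumPy array and uses a single Gaussian-blurred band in place of a full spatial pyramid; all function and parameter names are invented for the example.

```python
# Minimal Eulerian-style magnification sketch (illustration only).
# Assumes `frames` is a float array of shape (T, H, W) with values in [0, 1].
import numpy as np
from scipy.ndimage import gaussian_filter
from scipy.signal import butter, filtfilt

def magnify(frames, fps, low_hz, high_hz, alpha, spatial_sigma=5.0):
    """Amplify temporal variations within a user-selected frequency band."""
    # 1. Spatial decomposition: one Gaussian low-pass band stands in for the
    #    multi-level pyramid used by the full method.
    lowpass = np.stack([gaussian_filter(f, spatial_sigma) for f in frames])

    # 2. Temporal filtering: band-pass each pixel's time series.
    b, a = butter(2, [low_hz, high_hz], btype="band", fs=fps)
    band = filtfilt(b, a, lowpass, axis=0)

    # 3. Amplify the filtered signal and add it back to the original frames.
    return np.clip(frames + alpha * band, 0.0, 1.0)

# Example: amplify variations near a ~1 Hz pulse in a 30 fps clip.
# out = magnify(frames, fps=30, low_hz=0.8, high_hz=1.2, alpha=50)
```

The pass band in the usage comment is one example of the user-selected temporal frequency range the abstract refers to; the amplification factor trades visibility of the revealed signal against noise.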
Objectivity and the Transactional Theory of Perception
The visual demonstrations of Professor Adelbert Ames support the transactional theory of perception. This theory asserts that the very contents of our sense experiences are shaped by our past experiences, as well as our expectations of future experiences. This theory, in turn, supports a critical realism about the relationship between perception and reality.
SUNY Brockport, Philosophic Exchange
Percolation theory applied to measures of fragmentation in social networks
We apply percolation theory to a recently proposed measure of fragmentation
$F$ for social networks. The measure $F$ is defined as the ratio between the
number of pairs of nodes that are not connected in the fragmented network after
removing a fraction $q$ of nodes and the total number of pairs in the original
fully connected network. We compare $F$ with the traditional measure used in
percolation theory, $P_{\infty}$, the fraction of nodes in the largest cluster
relative to the total number of nodes. Using both analytical and numerical
methods from percolation, we study Erd\H{o}s-R\'{e}nyi (ER) and scale-free (SF)
networks under various types of node removal strategies. The removal strategies
are: random removal, high degree removal and high betweenness centrality
removal. We find that for a network obtained after removal (all strategies) of
a fraction $q$ of nodes above the percolation threshold, $F \approx 1 - P_{\infty}^{2}$.
For fixed $P_{\infty}$ and close to the percolation threshold ($q \approx q_c$),
we show that $F$ better reflects the actual fragmentation. Close
to $q_c$, for a given $P_{\infty}$, $F$ has a broad distribution and it is
thus possible to improve the fragmentation of the network. We also study and
compare the fragmentation measure $F$ and the percolation measure $P_{\infty}$
for a real social network of workplaces linked by the households of the
employees and find similar results.
Comment: submitted to PR
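For concreteness, the two measures compared in this abstract can be computed directly from their definitions. The sketch below assumes a networkx graph from which a fraction of nodes has already been removed, with n_original the size of the original network; the helper names are illustrative, not from the paper.

```python
# Sketch of the fragmentation measure F and the percolation measure P_inf
# from their definitions above (illustrative helper names).
import random
import networkx as nx

def fragmentation_F(G, n_original):
    """Pairs of nodes not connected in the fragmented network, divided by
    the number of pairs in the original fully connected network."""
    total_pairs = n_original * (n_original - 1) / 2
    connected_pairs = sum(len(c) * (len(c) - 1) / 2
                          for c in nx.connected_components(G))
    return 1.0 - connected_pairs / total_pairs

def percolation_P_inf(G, n_original):
    """Fraction of nodes in the largest cluster relative to the original size."""
    sizes = [len(c) for c in nx.connected_components(G)]
    return max(sizes, default=0) / n_original

# Example: random removal of a fraction q = 0.3 of an ER network's nodes.
G = nx.erdos_renyi_graph(1000, 0.01)
G.remove_nodes_from(random.sample(list(G.nodes), 300))
print(fragmentation_F(G, 1000), percolation_P_inf(G, 1000))
```

Well above the percolation threshold the giant cluster dominates the connected pairs, so the connected-pair count scales as $(P_{\infty} N)^{2}/2$ and $F \approx 1 - P_{\infty}^{2}$, consistent with the relation quoted above.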
Correlation between centrality metrics and their application to the opinion model
In recent decades, a number of centrality metrics describing network
properties of nodes have been proposed to rank the importance of nodes. In
order to understand the correlations between centrality metrics and to
approximate a high-complexity centrality metric by a strongly correlated
low-complexity metric, we first study the correlation between centrality
metrics in terms of their Pearson correlation coefficient and their similarity
in ranking of nodes. In addition to considering the widely used centrality
metrics, we introduce a new centrality measure, the degree mass. The m-th order
degree mass of a node is the sum of the weighted degrees of the node and of its
neighbors no further than m hops away. We find that the betweenness B_{n}, the
closeness, and the components of the principal eigenvector x_{1} are strongly
correlated with the degree, the
1st-order degree mass and the 2nd-order degree mass, respectively, in both
network models and real-world networks. We then theoretically prove that the
Pearson correlation coefficient between x_{1} and the 2nd-order degree mass is
larger than that between x_{1} and a lower order degree mass. Finally, we
investigate the effect of the inflexible antagonists selected based on
different centrality metrics in helping one opinion to compete with another in
the inflexible antagonists opinion model. Interestingly, we find that selecting
the inflexible antagonists based on the leverage, the B_{n}, or the degree is
more effective in opinion competition than using other centrality metrics in
all types of networks. This observation is supported by our previous
observations, i.e., that there is a strong linear correlation between the
degree and the B_{n}, as well as a high centrality similarity between the
leverage and the degree.
Comment: 20 pages
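As a concrete reading of the degree-mass definition above, the sketch below computes the m-th order degree mass of a node as the sum of the degrees of the node and of every node within m hops. It uses networkx and plain (unweighted) degrees; the paper's exact weighting may differ, so this is an illustration rather than the authors' definition.

```python
# m-th order degree mass, read directly from the definition in the abstract:
# the sum of the degrees of a node and of its neighbors within m hops.
# (Plain degrees are used here; the paper's exact weighting may differ.)
import networkx as nx

def degree_mass(G, node, m):
    # Nodes at shortest-path distance <= m from `node`, including `node` itself.
    within_m = nx.single_source_shortest_path_length(G, node, cutoff=m)
    return sum(G.degree(v) for v in within_m)

# With m = 0 this reduces to the node's own degree; m = 1 and m = 2 give the
# 1st- and 2nd-order degree masses discussed above.
G = nx.karate_club_graph()
print(degree_mass(G, 0, 1), degree_mass(G, 0, 2))
```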
Sizes and intensities of mesoscale precipitation areas as depicted by digital radar data.
Thesis (M.S.)--Massachusetts Institute of Technology, Dept. of Meteorology, 1976. Microfiche copy available in Archives and Science. Bibliography: leaf 86.
Development of an Oral Form of Azacytidine: 2′3′5′Triacetyl-5-Azacytidine
Myelodysplastic syndromes (MDSs) represent a group of incurable stem-cell malignancies that are predominantly treated by supportive care. Epigenetic silencing through promoter methylation of a number of genes is present in poor-risk subtypes of MDS and often predicts transformation to acute myelogenous leukemia (AML). Azacitidine and decitabine, two FDA-approved DNA methyltransferase (DNMT) inhibitors, are able to improve overall response, although their poor oral bioavailability complicates their clinical use. This study evaluated 2′,3′,5′-triacetyl-5-azacitidine (TAC) as a potential prodrug for azacitidine. The prodrug demonstrated significant pharmacokinetic improvements in bioavailability, solubility, and stability over the parent compound. In vivo analyses indicated a lack of general toxicity coupled with significantly improved survival. Pharmacodynamic analyses confirmed its ability to suppress global methylation in vivo. These data indicate that esterified nucleoside derivatives may be effective prodrugs for azacitidine and encourage further investigation of the metabolism and activity of TAC, as well as its possible clinical evaluation.
Reusable Launch Vehicle Technology Program
Industry/NASA Reusable Launch Vehicle (RLV) Technology Program efforts are underway to design, test, and develop technologies and concepts for viable commercial launch systems that also satisfy national needs at acceptable recurring costs. Significant progress has been made in understanding the technical challenges of fully reusable launch systems and the accompanying management and operational approaches for achieving a low-cost program. This paper reviews the current status of the Reusable Launch Vehicle Technology Program, including the DC-XA, X-33, and X-34 flight systems and associated technology programs. It describes the specific technologies being tested to address the technical and operability challenges of reusable launch systems, including reusable cryogenic propellant tanks, composite structures, thermal protection systems, improved propulsion, and subsystem operability enhancements. The recently concluded DC-XA test program demonstrated some of these technologies in ground and flight tests. Contracts were awarded recently for both the X-33 and X-34 flight demonstrator systems. The Orbital Sciences Corporation X-34 flight test vehicle will demonstrate an air-launched reusable vehicle capable of flight to speeds of Mach 8. The Lockheed-Martin X-33 flight test vehicle will expand the test envelope for critical technologies to flight speeds of Mach 15. A propulsion program to test the X-33 linear aerospike rocket engine using a NASA SR-71 high-speed aircraft as a test bed is also discussed. The paper also describes the management and operational approaches that address the challenge of new, cost-effective, reusable launch vehicle systems.
