Browsing Large Image Datasets through Voronoi Diagrams
Conventional browsing of image collections uses mechanisms such as thumbnails arranged on a regular grid or along a line, often mounted on a scrollable panel. However, this approach does not scale well with the size of the dataset (the number of images). In this paper, we propose a new thumbnail-based interface for browsing large collections of images. Our approach is based on weighted centroidal anisotropic Voronoi diagrams.
A dynamically changing subset of images is represented by thumbnails shown on the screen. Thumbnails are shaped as general polygons to cover screen space better, while still reflecting the original aspect ratios and orientations of the represented images. During browsing, thumbnails are dynamically rearranged, reshaped, and rescaled. The objective is to devote more screen space (more numerous and larger thumbnails) to the parts of the dataset closer to the current region of interest, and progressively less to parts farther from it, while keeping the dataset visible as a whole. Temporal coherence is maintained throughout. A GPU implementation easily guarantees the frame rates needed for fully smooth interactivity.
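The abstract's weighted centroidal Voronoi computation is GPU-based and anisotropic; a minimal CPU sketch of the underlying idea (a Lloyd-style relaxation on a pixel grid, with per-site weights scaling the distance so heavier sites claim larger regions) might look like the following. All names are ours, not the paper's:

```python
import math

def weighted_lloyd(sites, weights, width, height, iterations=10):
    """Hypothetical sketch of a weighted centroidal Voronoi relaxation.

    Each grid cell is assigned to the site minimising distance/weight
    (larger weight -> larger region), then every site moves to the
    centroid of its region. `sites` is a list of (x, y) positions.
    """
    sites = [list(s) for s in sites]
    for _ in range(iterations):
        sums = [[0.0, 0.0, 0] for _ in sites]  # x-sum, y-sum, cell count
        for px in range(width):
            for py in range(height):
                # weighted distance: dividing by the weight enlarges
                # the region of heavily weighted sites
                best = min(
                    range(len(sites)),
                    key=lambda i: math.hypot(px - sites[i][0],
                                             py - sites[i][1]) / weights[i],
                )
                sums[best][0] += px
                sums[best][1] += py
                sums[best][2] += 1
        for i, (sx, sy, n) in enumerate(sums):
            if n:  # move the site to the centroid of its region
                sites[i] = [sx / n, sy / n]
    return sites
```

A real implementation would rasterise the weighted (and anisotropic) distance field on the GPU instead of the O(pixels x sites) loop above.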
Almost Isometric Mesh Parameterization through Abstract Domains
In this paper, we propose a robust, automatic technique to build a global high-quality parameterization of a two-manifold triangular mesh. An adaptively chosen 2D domain of the parameterization is built as part of the process. The produced parameterization exhibits very low isometric distortion, because it is globally optimized to preserve both areas and angles. The domain is a collection of equilateral triangular 2D regions enriched with explicit adjacency relationships (it is abstract in the sense that no 3D embedding is necessary). It is tailored to minimize isometric distortion, resulting in excellent parameterization quality, even when meshes with complex shape and topology are mapped into domains composed of a small number of large continuous regions. Moreover, this domain is, in turn, remapped into a collection of 2D square regions, unlocking many advantages found in quad-based domains (e.g., ease of packing). The technique is tested on a variety of cases, including challenging ones, and compares very favorably with known approaches. An open-source implementation is made available.
Recovering the star formation rate in the solar neighborhood
This paper develops a method for obtaining the star formation history of a mixed, resolved population through the use of color-magnitude diagrams (CMDs). The method provides insight into the local star formation rate by analyzing the observations of the Hipparcos satellite through a comparison with synthetic CMDs computed for different histories with an updated stellar evolution library. Parallax and photometric uncertainties are included explicitly and corrected using the Bayesian Richardson-Lucy algorithm. We first describe our verification studies using artificial data sets. From this sensitivity study, the critical factors determining the success of a recovery for a known star formation rate are partial knowledge of the IMF and the age-metallicity relation, and sample contamination by clusters and moving groups (special populations whose histories differ from that of the whole sample); unresolved binaries are a less important impediment. We highlight how these limit the method. For the real field sample, complete to Mv < 3.5, we find that the solar neighborhood star formation rate has a characteristic timescale for variation of about 6 Gyr, with a maximum of activity close to 3 Gyr ago. The similarity of this finding to column-integrated star formation rates may indicate a global origin, possibly a collision with a satellite galaxy. We also discuss applications of this technique to general photometric surveys of other complex systems (e.g., Local Group dwarf galaxies) where the distances are well known.
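The Richardson-Lucy algorithm mentioned above is a general iterative deconvolution scheme: given a blurred observation and a known smearing kernel, it repeatedly multiplies the current estimate by the back-projected ratio of observation to re-blurred estimate. A self-contained 1D sketch (our own toy helper, not the paper's implementation) is:

```python
def richardson_lucy(observed, psf, iterations=50):
    """Minimal 1D Richardson-Lucy deconvolution sketch.

    `observed` is the blurred, non-negative signal; `psf` is a
    normalised blurring kernel of odd length. Returns an estimate
    of the unblurred signal.
    """
    n, k = len(observed), len(psf)
    half = k // 2
    estimate = [1.0] * n  # flat, positive initial guess

    def convolve(signal):
        out = []
        for j in range(n):
            s = 0.0
            for m in range(k):
                i = j - m + half
                if 0 <= i < n:
                    s += signal[i] * psf[m]
            out.append(s)
        return out

    for _ in range(iterations):
        blurred = convolve(estimate)
        # ratio of data to the current estimate's re-blurring
        ratio = [o / b if b > 0 else 0.0 for o, b in zip(observed, blurred)]
        for i in range(n):
            # correlate the ratio with the psf (psf flipped vs. convolution)
            corr = 0.0
            for m in range(k):
                j = i + m - half
                if 0 <= j < n:
                    corr += ratio[j] * psf[m]
            estimate[i] *= corr
    return estimate
```

Deconvolving a delta blurred by a symmetric kernel recovers a sharp peak at the original location, which is the behaviour the paper relies on to undo parallax and photometric smearing in the CMD.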
White Dwarf Cosmochronology in the Solar Neighborhood
The study of the stellar formation history in the solar neighborhood is a powerful technique to recover information about the early stages and evolution of the Milky Way. We present a new method which consists of directly probing the formation history from the nearby stellar remnants. We rely on the volume-complete sample of white dwarfs within 20 pc, for which accurate cooling ages and masses have been determined. The well-characterized initial-final mass relation is employed to recover the initial masses (1 < M/Msun < 8) and total ages for the local degenerate sample. We correct for the moderate biases that are necessary to transform our results into a global stellar formation rate, which can be compared to similar studies based on the properties of main-sequence stars in the solar neighborhood. Our method provides precise formation rates for all ages except in very recent times, and the results suggest an enhanced formation rate for the solar neighborhood in the last 5 Gyr compared to the range 5 < Age (Gyr) < 10. Furthermore, the observed total age of ~10 Gyr for the oldest white dwarfs in the local sample is consistent with the early seminal studies that determined the age of the Galactic disk from stellar remnants. The main shortcoming of our study is the small size of the local white dwarf sample; however, the presented technique can be applied to larger samples in the future.
Comment: 25 pages, 10 figures, accepted for publication in the Astrophysical Journal
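The core bookkeeping of the method is simple: a white dwarf's total age is its cooling age plus its progenitor's pre-white-dwarf lifetime, with the progenitor mass recovered by inverting the initial-final mass relation (IFMR). A toy sketch follows; the linear IFMR coefficients and the rough main-sequence lifetime scaling are illustrative assumptions, not the paper's calibrations:

```python
def total_age_gyr(wd_mass_msun, cooling_age_gyr):
    """Illustrative sketch: total age = cooling age + progenitor lifetime.

    Assumes a linear IFMR M_final = 0.08 * M_initial + 0.38 (Msun) and
    the rough scaling t_MS = 10 * M_initial**-2.5 Gyr; neither is the
    paper's exact choice.
    """
    initial_mass = (wd_mass_msun - 0.38) / 0.08  # invert the assumed IFMR
    t_progenitor = 10.0 * initial_mass ** -2.5   # rough pre-WD lifetime, Gyr
    return cooling_age_gyr + t_progenitor
```

For a typical 0.6 Msun white dwarf the assumed IFMR gives a ~2.75 Msun progenitor, so the progenitor lifetime adds under a Gyr to the cooling age; for lower-mass remnants the progenitor term dominates, which is why the oldest total ages are the hardest to pin down.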
Joint Interactive Visualization of 3D Models and Pictures in Walkable Scenes
The 3D digitization of buildings, urban scenes, and the like is now a mature technology: highly complex, densely sampled, reasonably accurate 3D models can be obtained with range scanners and even with image-based reconstruction methods applied to dense image collections. Acquiring the bare geometry is not enough in Cultural Heritage applications, where the surface colors (e.g., pictorial data) are clearly of central importance. Moreover, the 3D geometry cannot be expected to be complete: context may be missing, along with parts made of materials like glass and metal, hard-to-reach surfaces, etc. Easily captured photographs are the natural source of the appearance data missing from the 3D geometry. In spite of the recent availability of reliable technologies to align 2D images with 3D data, the two sides of the dataset are not easy to combine satisfactorily in a visualization. Texture-mapping techniques, perhaps the most obvious candidate for the task, assume a strict content consistency (3D to 2D, and 2D to 2D) that these datasets do not and should not exhibit (the advantage of pictures consists precisely in their ability to feature details, lighting conditions, non-persistent items, etc. that are absent from the 3D models or from the other pictures). In this work, we present a simple but effective technique to jointly and interactively visualize 2D and 3D data of this kind. The technique is used within PhotoCloud [IV12], a flexible open-source tool being designed to browse, navigate, and visualize large, remotely stored 3D-2D datasets, with an emphasis on scalability, usability, and the ability to cope with heterogeneous data from various sources.
PolyCube-Maps
Standard texture mapping of real-world meshes suffers from the presence of seams that need to be introduced in order to avoid excessive distortions and to make the topology of the mesh compatible with that of the texture domain. In contrast, cube maps provide a mechanism that could be used for seamless, low-distortion texture mapping, but only if the object roughly resembles a cube. We extend this concept to arbitrary meshes by using as texture domain the surface of a polycube whose shape is similar to that of the given mesh. Our approach leads to a seamless texture mapping method that is simple enough to be implemented in currently available graphics hardware.
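The cube-map lookup that PolyCube-Maps generalise reduces to picking the face by the dominant axis of a direction vector and projecting the remaining two coordinates onto that face. A minimal sketch (naming and axis conventions are ours, simplified from the hardware conventions, which differ per API):

```python
def cubemap_face_uv(x, y, z):
    """Pick the cube face hit by direction (x, y, z) and return
    (face, u, v) with u, v in [0, 1]. Hypothetical helper with
    simplified axis conventions.
    """
    ax, ay, az = abs(x), abs(y), abs(z)
    m = max(ax, ay, az)  # the dominant axis decides the face
    if m == ax:
        face = "+x" if x > 0 else "-x"
        u, v = y / m, z / m
    elif m == ay:
        face = "+y" if y > 0 else "-y"
        u, v = x / m, z / m
    else:
        face = "+z" if z > 0 else "-z"
        u, v = x / m, y / m
    # remap the projected coordinates from [-1, 1] to [0, 1]
    return face, (u + 1) / 2, (v + 1) / 2
```

A polycube generalises this from one cube to a union of axis-aligned cubes, so the lookup additionally selects which cubelet's face applies before projecting.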
Practical quad mesh simplification
In this paper we present an innovative approach to incremental quad mesh simplification, i.e. the task of producing a low-complexity quad mesh starting from a high-complexity one. The process is based on a novel set of strictly local operations which preserve quad structure. We show how good tessellation quality (e.g. in terms of vertex valencies) can be achieved by pursuing uniform lengths and canonical proportions of edges and diagonals. The decimation process is interleaved with smoothing in tangent space; the latter strongly contributes to identifying a suitable sequence of local modification operations. The method extends naturally to preserving feature lines (e.g. creases) and to varying (e.g. adaptive) tessellation densities. We also present an original triangle-to-quad conversion algorithm that behaves well in terms of geometric complexity and tessellation quality, which we use to obtain the initial quad mesh from a given triangle mesh.
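One example of a strictly local, quad-preserving operation in the spirit of the abstract is a diagonal collapse: a quad is removed by merging the two endpoints of one of its diagonals, and every surrounding face remains a quad. A toy sketch on a quad list (names and representation are ours; a real implementation would also guard against degenerate neighbours):

```python
def collapse_diagonal(quads, quad_index, keep, remove):
    """Collapse quad `quad_index` along the diagonal (keep, remove):
    the quad disappears and `remove` is merged into `keep`.
    `quads` is a list of 4-tuples of vertex indices.
    """
    q = quads[quad_index]
    assert keep in q and remove in q
    # the two vertices must be diagonally opposite in the quad
    assert (q.index(keep) - q.index(remove)) % 4 == 2
    new_quads = []
    for i, quad in enumerate(quads):
        if i == quad_index:
            continue  # the collapsed quad is deleted
        # every other face stays a quad: just redirect `remove` to `keep`
        new_quads.append(tuple(keep if v == remove else v for v in quad))
    return new_quads
```

Because each such operation touches only one quad and its ring, collapses can be ordered by a priority (e.g. deviation from uniform edge/diagonal lengths) and interleaved with tangent-space smoothing, as the abstract describes.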
NGC 2849 and NGC 6134: two more BOCCE open clusters
We present CCD photometry of two southern open clusters. As part of the Bologna Open Cluster Chemical Evolution (BOCCE) project, we obtained BVI and UBVI imaging for NGC 2849 and NGC 6134, respectively. By means of the synthetic colour-magnitude diagram method, and using several sets of stellar evolution tracks with various metallicities, we simultaneously determined age, distance, and reddening. We also determined an approximate metallicity for NGC 2849, for which this information is not available from sounder methods such as high-resolution spectroscopy. NGC 2849 turned out to be 0.85-1.0 Gyr old, with solar metallicity, a foreground reddening E(B - V) = 0.28-0.32, and a true distance modulus (m - M)0 = 13.8-13.9. For NGC 6134 we did not obtain fully consistent answers from the V, B - V and the V, V - I photometry, an unexpected problem, since both the metallicity and the reddening are known (from high-resolution spectroscopy and from the U - B, B - V two-colour diagram, respectively). This may either indicate a difficulty of current models (evolutionary tracks and/or model atmospheres) in accurately reproducing colours, or be related to differences between the metal mixture assumed by the models and that of the cluster. Assuming the spectroscopic abundance and the colour excess [E(B - V) = 0.35] from the U - B, B - V plot, we derived a best age between 0.82 and 0.95 Gyr and a distance modulus of 10.5. In agreement with previous studies, the NGC 6134 colour-magnitude diagram also shows a clear main-sequence gap at V ~ 15 and B - V ~ 0.9-1.0 that is unexplained by canonical stellar evolution models.
Authors: Ahumada, A. V. (CONICET / Observatorio Astronómico de Córdoba, Argentina); Cignoni, M. (Università di Bologna); Bragaglia, A. (INAF-Osservatorio Astronomico di Bologna); Donati, P. (Università di Bologna); Tosi, M. (INAF-Osservatorio Astronomico di Bologna); Marconi, G. (European Southern Observatory, Chile)
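The true distance moduli quoted above convert directly to distances via the standard relation (m - M)0 = 5 log10(d / 10 pc). A one-line helper makes the arithmetic explicit:

```python
def distance_pc(distance_modulus):
    """Convert a true distance modulus (m - M)_0 to a distance in
    parsecs via (m - M)_0 = 5 * log10(d / 10 pc)."""
    return 10 ** (1 + distance_modulus / 5)
```

With (m - M)0 = 13.8-13.9, NGC 2849 lies at roughly 5.8-6.0 kpc; NGC 6134's modulus of 10.5 corresponds to about 1.26 kpc.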
Load-Balancing for Parallel Delaunay Triangulations
Computing the Delaunay triangulation (DT) of a given point set in R^d is one of the fundamental operations in computational geometry. Recently, Funke and Sanders (2017) presented a divide-and-conquer DT algorithm that merges two partial triangulations by re-triangulating a small subset of their vertices (the border vertices) and combining the three triangulations efficiently via parallel hash table lookups. The input point division should therefore yield roughly equal-sized partitions for good load balancing and also result in a small number of border vertices for fast merging. In this paper, we present a novel divide-step based on partitioning the triangulation of a small sample of the input points. In experiments on synthetic and real-world data sets, we achieve nearly perfectly balanced partitions and small border triangulations. On inputs exhibiting an exploitable underlying structure, this almost halves the running time compared to non-data-sensitive division schemes.
Comment: Short version submitted to EuroPar 201
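The divide-step idea above (partition something small, then route every input point through it) can be sketched in a few lines. In this toy version (all names ours) the sample is split by a crude coordinate sort rather than by partitioning its triangulation, and each input point joins the partition of its nearest sample point:

```python
import random

def sample_based_partition(points, num_partitions, sample_size=64, seed=0):
    """Data-sensitive divide-step sketch: label a small random sample
    with partition ids, then assign every input point to the partition
    of its nearest sample point. `points` is a list of (x, y) tuples.
    """
    rng = random.Random(seed)
    sample = rng.sample(points, min(sample_size, len(points)))
    sample.sort()  # sort by x (then y): a crude spatial ordering
    chunk = (len(sample) + num_partitions - 1) // num_partitions
    # contiguous chunks of the sorted sample become the partitions
    labels = {p: i // chunk for i, p in enumerate(sample)}

    def nearest_label(p):
        return labels[min(sample,
                          key=lambda s: (s[0] - p[0]) ** 2 + (s[1] - p[1]) ** 2)]

    partitions = [[] for _ in range(num_partitions)]
    for p in points:
        partitions[nearest_label(p)].append(p)
    return partitions
```

Because the sample mirrors the input's density, dense regions get proportionally more sample points and the resulting partitions stay balanced even on clustered inputs, which is exactly the property the paper exploits (using the sample's triangulation, rather than a sort, to define the regions).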
