The Interface Region Imaging Spectrograph (IRIS)
The Interface Region Imaging Spectrograph (IRIS) small explorer spacecraft
provides simultaneous spectra and images of the photosphere, chromosphere,
transition region, and corona with 0.33-0.4 arcsec spatial resolution, 2 s
temporal resolution and 1 km/s velocity resolution over a field-of-view of up
to 175 arcsec x 175 arcsec. IRIS was launched into a Sun-synchronous orbit on
27 June 2013 using a Pegasus-XL rocket and consists of a 19-cm UV telescope
that feeds a slit-based dual-bandpass imaging spectrograph. IRIS obtains
spectra in passbands from 1332-1358, 1389-1407 and 2783-2834 Angstrom including
bright spectral lines formed in the chromosphere (Mg II h 2803 Angstrom and Mg
II k 2796 Angstrom) and transition region (C II 1334/1335 Angstrom and Si IV
1394/1403 Angstrom). Slit-jaw images in four different passbands (C II 1330, Si
IV 1400, Mg II k 2796 and Mg II wing 2830 Angstrom) can be taken simultaneously
with spectral rasters that sample regions up to 130 arcsec x 175 arcsec at a
variety of spatial samplings (from 0.33 arcsec and up). IRIS is sensitive to
emission from plasma at temperatures between 5000 K and 10 MK and will advance
our understanding of the flow of mass and energy through an interface region,
formed by the chromosphere and transition region, between the photosphere and
corona. This highly structured and dynamic region not only acts as the conduit
of all mass and energy feeding into the corona and solar wind, it also requires
an order of magnitude more energy to heat than the corona and solar wind
combined. The IRIS investigation includes a strong numerical modeling component
based on advanced radiative-MHD codes to facilitate interpretation of
observations of this complex region. Approximately eight Gbytes of data (after
compression) are acquired by IRIS each day and made available for unrestricted
use within a few days of the observation.
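To make the quoted 1 km/s velocity resolution concrete, the sketch below converts a wavelength shift of the Mg II k line (rest wavelength taken as 2796 Angstrom, as listed above) into a line-of-sight Doppler velocity. The measured centroid value is a hypothetical placeholder, not IRIS data.

```python
# Minimal sketch: converting a measured line-centroid shift into a Doppler
# velocity, as in IRIS spectral analysis. The rest wavelength (Mg II k,
# 2796 Angstrom) comes from the abstract; the example shift is hypothetical.

C_KM_S = 299_792.458  # speed of light in km/s

def doppler_velocity(lambda_obs, lambda_rest):
    """Line-of-sight velocity (km/s) from observed vs. rest wavelength."""
    return C_KM_S * (lambda_obs - lambda_rest) / lambda_rest

if __name__ == "__main__":
    lambda_rest = 2796.0      # Mg II k rest wavelength [Angstrom]
    lambda_obs = 2796.0093    # hypothetical measured centroid [Angstrom]
    v = doppler_velocity(lambda_obs, lambda_rest)
    print(f"Doppler shift: {v:.2f} km/s")  # ~1 km/s, matching the quoted velocity resolution
```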
On the Use of Electrooculogram for Efficient Human Computer Interfaces
The aim of this study is to present electrooculogram (EOG) signals that can be used efficiently for human-computer interfaces. Establishing an efficient alternative channel for communication without overt speech and hand movements is important to increase the quality of life for patients suffering from Amyotrophic Lateral Sclerosis or other illnesses that prevent correct limb and facial muscular responses. We performed several experiments to compare the P300-based BCI speller with the new EOG-based system. A five-letter word can be written on average in 25 seconds with the new system and in 105 seconds with the EEG-based device. Giving a message such as “clean-up” could be performed in 3 seconds with the new system. The new system is more efficient than the P300-based BCI system in terms of accuracy, speed, applicability, and cost efficiency. Using EOG signals, it is possible to improve the communication abilities of those patients who can move their eyes.
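As a rough illustration of the timings quoted above (a five-letter word in about 25 s with the EOG system versus about 105 s with the P300 speller), the small sketch below converts them into spelling rates; the helper function is ours, the numbers come from the abstract.

```python
# Back-of-the-envelope comparison of the spelling rates quoted in the abstract:
# a five-letter word in ~25 s with the EOG system vs. ~105 s with the
# P300-based (EEG) speller.

def letters_per_minute(n_letters, seconds):
    return 60.0 * n_letters / seconds

eog_rate = letters_per_minute(5, 25)    # ~12 letters/min
p300_rate = letters_per_minute(5, 105)  # ~2.9 letters/min
print(f"EOG: {eog_rate:.1f} letters/min, P300: {p300_rate:.1f} letters/min "
      f"(~{eog_rate / p300_rate:.1f}x faster)")
```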
Evolutionary history and species delimitations: a case study of the hazel dormouse, Muscardinus avellanarius
Robust identification of species and evolutionarily significant units (ESUs) is essential to implement appropriate conservation strategies for endangered species. However, definitions of species or ESUs are numerous and
sometimes controversial, which might lead to biased conclusions, with serious consequences for the management of
endangered species. The hazel dormouse, an arboreal rodent of conservation concern throughout Europe, is an
ideal model species to investigate the relevance of species identification for conservation purposes. This species is a
member of the Gliridae family, which is protected in Europe and seriously threatened in the northern part of its
range. We assessed the extent of genetic subdivision in the hazel dormouse by sequencing one mitochondrial gene
(cytb) and two nuclear genes (BFIBR, APOB) and genotyping 10 autosomal microsatellites. These data were analysed using a combination of phylogenetic analyses and species delimitation methods. Multilocus analyses revealed
the presence of two genetically distinct lineages (approximately 11 % cytb genetic divergence, no nuclear alleles
shared) for the hazel dormouse in Europe, which presumably diverged during the Late Miocene. The phylogenetic
patterns suggest that Muscardinus avellanarius populations could be split into two cryptic species, distributed in
western Europe and in central-eastern Europe and Anatolia, respectively. However, the comparison of several species
definitions and methods yielded estimates ranging from 1 to 10 species. Our results reveal the difficulty of
choosing and applying an appropriate criterion and appropriate markers to identify species, and highlight the fact that consensus
guidelines are essential for species delimitation in the future. In addition, this study contributes to a better
knowledge of the evolutionary history of the species.
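To make the quoted ~11% cytb divergence concrete, the sketch below computes an uncorrected pairwise distance (p-distance) between two aligned sequences. The short sequences are hypothetical placeholders, not real Muscardinus cytb data, and the study's analyses used model-based phylogenetic methods rather than this simple measure.

```python
# Minimal sketch of the uncorrected pairwise distance (p-distance) used to
# express cytb divergence. The two short sequences are hypothetical placeholders.

def p_distance(seq_a, seq_b):
    """Proportion of differing sites between two aligned sequences (gaps skipped)."""
    if len(seq_a) != len(seq_b):
        raise ValueError("sequences must be aligned to the same length")
    pairs = [(a, b) for a, b in zip(seq_a, seq_b) if a != "-" and b != "-"]
    if not pairs:
        return 0.0
    diffs = sum(1 for a, b in pairs if a != b)
    return diffs / len(pairs)

western = "ATGACCAACATCCGAAAAACCCACCCACTA"
eastern = "ATGACTAATATCCGAAAAACTCACCCACTA"
print(f"p-distance: {p_distance(western, eastern):.3f}")  # 0.100 for these toy sequences
```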
Linear, Deterministic, and Order-Invariant Initialization Methods for the K-Means Clustering Algorithm
Over the past five decades, k-means has become the clustering algorithm of
choice in many application domains primarily due to its simplicity, time/space
efficiency, and invariance to the ordering of the data points. Unfortunately,
the algorithm's sensitivity to the initial selection of the cluster centers
remains its most serious drawback. Numerous initialization methods have
been proposed to address this drawback. Many of these methods, however, have
time complexity superlinear in the number of data points, which makes them
impractical for large data sets. On the other hand, linear methods are often
random and/or sensitive to the ordering of the data points. These methods are
generally unreliable in that the quality of their results is unpredictable.
Therefore, it is common practice to perform multiple runs of such methods and
take the output of the run that produces the best results. Such a practice,
however, greatly increases the computational requirements of the otherwise
highly efficient k-means algorithm. In this chapter, we investigate the
empirical performance of six linear, deterministic (non-random), and
order-invariant k-means initialization methods on a large and diverse
collection of data sets from the UCI Machine Learning Repository. The results
demonstrate that two relatively unknown hierarchical initialization methods due
to Su and Dy outperform the remaining four methods with respect to two
objective effectiveness criteria. In addition, a recent method due to Erisoglu
et al. performs surprisingly poorly.
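For illustration, the sketch below implements a deterministic, order-invariant initialization in the spirit of the variance-based hierarchical splitting attributed to Su and Dy: repeatedly bisect the cluster with the largest within-cluster sum of squared errors along its highest-variance feature, cutting at the mean. It is a simplified reconstruction, not the authors' implementation.

```python
# Simplified sketch of a deterministic, order-invariant k-means initialization
# based on variance partitioning (in the spirit of Su & Dy); illustrative only.
import numpy as np

def var_part_init(X, k):
    """Return k initial centers by repeatedly bisecting the cluster with the
    largest within-cluster SSE along its highest-variance feature, cutting
    at that feature's mean."""
    clusters = [X]
    while len(clusters) < k:
        sses = [((c - c.mean(axis=0)) ** 2).sum() for c in clusters]
        c = clusters.pop(int(np.argmax(sses)))    # split the worst cluster
        j = int(np.argmax(c.var(axis=0)))         # its highest-variance feature
        cut = c[:, j].mean()
        left, right = c[c[:, j] <= cut], c[c[:, j] > cut]
        if len(left) == 0 or len(right) == 0:
            raise ValueError("cannot split a cluster with zero spread")
        clusters.extend([left, right])
    return np.array([c.mean(axis=0) for c in clusters])

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(mu, 0.3, size=(100, 2)) for mu in (0.0, 1.0, 8.0)])
print(var_part_init(X, 3))   # centers close to the three blob means
```

Because every step depends only on cluster statistics, the result is unaffected by the ordering of the data points.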
A polynomial-time algorithm for the discrete facility location problem with limited distances and capacity constraints
The objective in the facility location problem with limited distances is to minimize the sum of distance functions from the facility to its clients, with a limit on each of these distances beyond which the corresponding function becomes constant. The problem is applicable in situations where the service provided by the facility becomes insensitive to distance beyond given thresholds. In this paper, we propose a polynomial-time algorithm for the discrete version of the problem with capacity constraints on the number of served clients. These constraints are relevant for introducing quality measures into facility location decision processes, as well as for justifying the creation of the facility.
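The paper's polynomial-time algorithm is not reproduced here; the brute-force sketch below only illustrates one plausible reading of the objective, in which each client pays its distance capped at its limit and at most a fixed number of clients can actually be served at less than their limit. All names, data, and the capacity rule are assumptions made for illustration.

```python
# Brute-force sketch of an assumed reading of the objective: client i costs
# min(d_i, limit_i); at most `capacity` clients may be served (pay d_i instead
# of limit_i), so the facility serves the clients with the largest savings.
import math

def site_cost(site, clients, limits, capacity):
    d = [math.dist(site, c) for c in clients]
    savings = sorted((li - di for di, li in zip(d, limits) if di < li), reverse=True)
    base = sum(limits)                     # every client pays its limit by default
    return base - sum(savings[:capacity])  # serve the `capacity` best clients

def best_site(candidates, clients, limits, capacity):
    return min(candidates, key=lambda s: site_cost(s, clients, limits, capacity))

clients = [(0, 0), (2, 1), (5, 5), (6, 4)]
limits = [3.0, 3.0, 3.0, 3.0]              # distance beyond which cost is constant
candidates = [(1, 1), (4, 4), (5, 4)]
print(best_site(candidates, clients, limits, capacity=2))
```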
BNCI Horizon 2020 - Towards a Roadmap for Brain/Neural Computer Interaction
In this paper, we present BNCI Horizon 2020, an EU Coordination and Support Action (CSA) that will provide a roadmap for brain-computer interaction research for the coming years, starting in 2013 and aiming at research efforts until 2020 and beyond. The project is a successor of the earlier EU-funded Future BNCI CSA that started in 2010 and produced a roadmap for a shorter time period. We present how we, a consortium of the main European BCI research groups as well as companies and end user representatives, expect to tackle the problem of designing a roadmap for BCI research. In this paper, we define the field and its recent developments, in particular by considering publications and EU-funded research projects, and we discuss how we plan to involve research groups, companies, and user groups in our effort to pave the way for useful and fruitful EU-funded BCI research for the next ten years.
Workload measurement in a communication application operated through a P300-based brain-computer interface
The Estimation of Cortical Activity for Brain-Computer Interface: Applications in a Domotic Context
In order to analyze whether the use of cortical activity, estimated from noninvasive EEG recordings, could be useful to detect mental states related to the imagination of limb movements, we estimated cortical activity from high-resolution EEG recordings in a group of healthy subjects by using realistic head models. Such cortical activity was estimated in regions of interest associated with the subjects' Brodmann areas by using a depth-weighted minimum norm technique. Results showed that the use of the estimated cortical activity instead of the unprocessed EEG improves the recognition of the mental states associated with limb movement imagination in the group of normal subjects. The BCI methodology presented here has been used in a group of disabled patients in order to give
them suitable control of several electronic devices installed in a three-room environment devoted to neurorehabilitation. Four of the six patients were able to control several electronic devices in this domotic context with the BCI system.
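The depth-weighted minimum norm technique mentioned above admits a compact linear-algebra sketch: with lead field L, data y, and a diagonal weight matrix W that compensates for the depth bias, the source estimate is x_hat = W L^T (L W L^T + lambda I)^{-1} y. The toy dimensions, weighting exponent, and data below are assumptions, not the authors' pipeline.

```python
# Minimal sketch of a depth-weighted minimum-norm inverse:
#   x_hat = W L^T (L W L^T + lambda * I)^{-1} y,
# where W down-weights the penalty on deep sources via the column norms of L.
# Toy dimensions and data; not the authors' pipeline.
import numpy as np

def depth_weighted_mne(L, y, lam=1e-2, gamma=0.5):
    """Weighted minimum-norm estimate of source amplitudes."""
    col_norms = np.linalg.norm(L, axis=0)
    W = np.diag(col_norms ** (-2 * gamma))        # depth weighting
    G = L @ W @ L.T + lam * np.eye(L.shape[0])    # regularized sensor-space matrix
    return W @ L.T @ np.linalg.solve(G, y)

rng = np.random.default_rng(1)
L = rng.normal(size=(32, 500))          # 32 electrodes, 500 cortical sources (toy)
x_true = np.zeros(500)
x_true[42] = 1.0                        # one active simulated source
y = L @ x_true + 0.01 * rng.normal(size=32)
x_hat = depth_weighted_mne(L, y)
print("peak estimate at index:", int(np.argmax(np.abs(x_hat))))  # simulated source was at 42
```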
A survey on feature weighting based K-Means algorithms
In a real-world data set there is always the possibility, rather high in our opinion, that different features may have different degrees of relevance. Most machine learning algorithms deal with this fact by either selecting or deselecting features in the data preprocessing phase. However, we maintain that even among relevant features there may be different degrees of relevance, and this should be taken into account during the clustering process. With over 50 years of history, K-Means is arguably the most popular partitional clustering algorithm there is. The first K-Means based clustering algorithm to compute feature weights was designed just over 30 years ago. Various such algorithms have been designed since, but there has not been, to our knowledge, a survey integrating empirical evidence of cluster recovery ability, common flaws, and possible directions for future research. This paper elaborates on the concept of feature weighting and addresses these issues by critically analysing some of the most popular, or innovative, feature weighting mechanisms based on K-Means.
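As an illustration of the kind of algorithm the survey covers, the sketch below follows the W-K-Means idea (feature weights inversely related to each feature's within-cluster dispersion, used with an exponent beta in the distance). It is a simplified reconstruction for illustration, not code from the survey, and the deterministic initialization is our own choice.

```python
# Compact sketch of a feature-weighted K-Means in the spirit of W-K-Means:
# each feature gets a weight w_v (weights sum to 1) inversely related to its
# within-cluster dispersion D_v, and distances use w_v ** beta (beta > 1).
import numpy as np

def weighted_kmeans(X, k, beta=2.0, n_iter=20):
    n, m = X.shape
    # deterministic init: k points spread along the highest-variance feature
    j0 = int(X.var(axis=0).argmax())
    order = np.argsort(X[:, j0])
    centers = X[order[np.linspace(0, n - 1, k).astype(int)]].astype(float)
    w = np.full(m, 1.0 / m)
    for _ in range(n_iter):
        # assignment with feature-weighted squared Euclidean distance
        d2 = (((X[:, None, :] - centers[None, :, :]) ** 2) * w ** beta).sum(axis=2)
        labels = d2.argmin(axis=1)
        for j in range(k):                          # update centers
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
        # per-feature within-cluster dispersion and weight update
        D = sum(((X[labels == j] - centers[j]) ** 2).sum(axis=0) for j in range(k)) + 1e-12
        w = 1.0 / ((D[:, None] / D[None, :]) ** (1.0 / (beta - 1))).sum(axis=1)
    return labels, centers, w

rng = np.random.default_rng(1)
X = np.vstack([np.column_stack([rng.normal(mu, 0.3, 100), rng.normal(0.0, 1.0, 100)])
               for mu in (0.0, 6.0)])               # feature 0 informative, feature 1 noise
labels, centers, w = weighted_kmeans(X, k=2)
print("feature weights:", np.round(w, 3))           # feature 0 should get the larger weight
```

On data where only one feature separates the clusters, the learned weights concentrate on that feature, which is the behaviour these methods are designed to exploit.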
DME production via methanol dehydration with H form and desilicated ZSM-5 type zeolitic catalysts: Study on the correlation between acid sites and conversion
Methanol (MeOH) dehydration for dimethyl ether (DME) production is one of the possible pathways to produce a green, synthetic fuel that can substitute fossil/conventional fuels in automotive/transportation applications. DME synthesis in the gas phase usually occurs in the presence of an acid catalyst at moderate temperature (up to 250 °C). This work deals with the use of MFI-type zeolitic catalysts. H-form and desilicated zeolite samples were synthesized, characterized, and tested to investigate their catalytic activity in the MeOH dehydration reaction. Ammonia temperature-programmed desorption (NH3-TPD) and Fourier-transform infrared spectroscopy (FT-IR) analyses were carried out to elucidate the amount and the nature of acid sites. The zeolite sample desilicated for 60 minutes presented a higher amount of Brønsted acid sites (which can be correlated with its superior catalytic activity), while the turnover frequency (TOF), referred to the amount of Brønsted acid sites, is very similar for the investigated samples. Finally, a preliminary kinetic investigation via linear fitting of experimental data on the Arrhenius plot was carried out for simple first- and second-order kinetic models.
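The preliminary kinetic fit described above (linear fitting on the Arrhenius plot) amounts to regressing ln k on 1/T: the slope gives -Ea/R and the intercept gives ln A. The temperatures and rate constants in the sketch below are hypothetical placeholders, not data from this study.

```python
# Minimal sketch of an Arrhenius-plot fit: ln k = ln A - Ea / (R * T), so a
# linear fit of ln k against 1/T yields the apparent activation energy Ea from
# the slope. The temperatures and rate constants below are hypothetical.
import numpy as np

R = 8.314  # gas constant, J / (mol K)

T = np.array([453.0, 473.0, 493.0, 523.0])       # K (roughly 180-250 °C)
k = np.array([2.1e-4, 5.6e-4, 1.4e-3, 4.8e-3])   # hypothetical rate constants

slope, intercept = np.polyfit(1.0 / T, np.log(k), 1)
Ea = -slope * R          # apparent activation energy, J/mol
A = np.exp(intercept)    # pre-exponential factor
print(f"Ea ≈ {Ea / 1000:.1f} kJ/mol, A ≈ {A:.3g}")
```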
