20 research outputs found
CECM: Constrained evidential c-means algorithm
In clustering applications, prior knowledge about cluster membership is sometimes available. To integrate such auxiliary information, constraint-based (or semi-supervised) methods have been proposed in the hard and fuzzy clustering frameworks. This approach is extended to evidential clustering, in which the membership of objects to clusters is described by belief functions. A variant of the Evidential C-means (ECM) algorithm taking into account pairwise constraints is proposed. These constraints are translated into the belief function framework and integrated in the cost function. Experiments with synthetic and real data sets demonstrate the effectiveness of the method. In particular, an application to medical image segmentation is presented.
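A minimal sketch of the idea of penalizing pairwise constraint violations in a membership-based cost, assuming fuzzy memberships for simplicity; the actual CECM cost is defined on credal partitions (mass functions over subsets of clusters), which this illustration does not reproduce.

```python
import numpy as np

def constraint_penalty(U, must_link, cannot_link):
    """Illustrative penalty term for pairwise constraints on a fuzzy
    membership matrix U (n_objects x n_clusters, rows summing to 1).
    NOTE: simplified stand-in, not the actual CECM objective, which
    operates on belief functions over subsets of clusters."""
    penalty = 0.0
    # Must-link pairs should have similar membership vectors:
    # penalize the chance that they fall in different clusters.
    for i, j in must_link:
        penalty += 1.0 - np.dot(U[i], U[j])
    # Cannot-link pairs should not share a cluster:
    # penalize the chance that they fall in the same cluster.
    for i, j in cannot_link:
        penalty += np.dot(U[i], U[j])
    return penalty

# Toy usage: 4 objects, 2 clusters
U = np.array([[0.9, 0.1],
              [0.8, 0.2],
              [0.2, 0.8],
              [0.1, 0.9]])
print(constraint_penalty(U, must_link=[(0, 1)], cannot_link=[(0, 2)]))
```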
Belief Theory for Large-Scale Multi-label Image Classification
Classifier combination is known to generally perform better than each individual classifier by taking into account the complementarity between the input pieces of information. Dempster-Shafer theory is a framework of interest for performing such fusion at the decision level, and it additionally allows handling the conflict that can arise between the classifiers as well as the uncertainty that remains about the sources of information. In this contribution, we present an approach for classifier fusion in the context of large-scale multi-label and multi-modal image classification that improves classification accuracy. The complexity of the calculations is reduced by considering only a subset of the frame of discernment. Classification results on a large dataset of 18,000 images and 99 classes show that the proposed method yields higher performance than the individual classifiers considered separately, while keeping the computational cost tractable.
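A short sketch of Dempster's rule of combination, the decision-level fusion mechanism referred to above, on a two-label toy frame; the paper's restriction to a subset of the frame of discernment for scalability is not shown, and the example mass values are purely illustrative.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions (dicts mapping frozenset -> mass)
    over the same frame of discernment with Dempster's rule.
    Conflicting mass K is redistributed via the 1/(1-K) normalization."""
    combined = {}
    conflict = 0.0
    for (A, mA), (B, mB) in product(m1.items(), m2.items()):
        inter = A & B
        if inter:
            combined[inter] = combined.get(inter, 0.0) + mA * mB
        else:
            conflict += mA * mB
    if conflict >= 1.0:
        raise ValueError("Total conflict: sources cannot be combined")
    return {A: v / (1.0 - conflict) for A, v in combined.items()}

# Two classifiers expressing uncertain opinions about labels {'cat', 'dog'}
frame = frozenset({'cat', 'dog'})
m_clf1 = {frozenset({'cat'}): 0.6, frame: 0.4}                     # partial ignorance
m_clf2 = {frozenset({'cat'}): 0.3, frozenset({'dog'}): 0.3, frame: 0.4}
print(dempster_combine(m_clf1, m_clf2))
```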
New Surface Treatments and New Fibers: The Challenge to Satisfy New Requirements for Technical Textiles
Evidential Logistic Regression for Binary SVM Classifier Calibration
The theory of belief functions has been successfully used in many classification tasks. It is especially useful when combining multiple classifiers and when dealing with high uncertainty. Many classification approaches, such as k-nearest neighbors, neural networks, and decision trees, have been formulated with belief functions. In this paper, we propose an evidential calibration method that transforms the output of a classifier into a belief function. The calibration, which is based on logistic regression, is computed from a likelihood-based belief function. The uncertainty of the calibration step depends on the number of training samples and is encoded within a belief function. We apply our method to the calibration and combination of several SVM classifiers trained with different amounts of data.
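A sketch of the probabilistic base that such a calibration builds on, namely fitting a logistic (Platt-style) model on held-out SVM decision scores; the evidential step described in the paper, where the calibration uncertainty is encoded as a likelihood-based belief function depending on the amount of calibration data, is not reproduced here, and the dataset and split are illustrative.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC

# Synthetic binary problem; half the data trains the SVM, half calibrates it.
X, y = make_classification(n_samples=600, n_features=10, random_state=0)
X_fit, X_cal, y_fit, y_cal = train_test_split(X, y, test_size=0.5, random_state=0)

svm = LinearSVC(C=1.0).fit(X_fit, y_fit)                # uncalibrated margin classifier
scores = svm.decision_function(X_cal).reshape(-1, 1)    # raw SVM scores
calibrator = LogisticRegression().fit(scores, y_cal)    # sigmoid (logistic) calibration

new_scores = svm.decision_function(X_cal[:3]).reshape(-1, 1)
print(calibrator.predict_proba(new_scores)[:, 1])       # calibrated P(y=1 | score)
```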
Application of a semi-Hertzian method to the simulation of vehicles in high-speed switches
Logistic regression revisited: belief function analysis
We show that the weighted sum and softmax operations performed in logistic regression classifiers can be interpreted in terms of evidence aggregation using Dempster's rule of combination. From that perspective, the output probabilities from such classifiers can be seen as normalized plausibilities, for some mass functions that can be laid bare. This finding suggests that the theory of belief functions is a more general framework for classifier construction than is usually considered.
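A small sketch of the two operations mentioned in the abstract, with the outputs read as normalized plausibilities per the paper's interpretation; the explicit construction of the underlying mass functions is not reproduced, and all numerical values are illustrative.

```python
import numpy as np

def softmax(z):
    z = z - z.max()               # shift-invariance: only evidence differences matter
    e = np.exp(z)
    return e / e.sum()

# Logistic regression: each feature contributes a per-class weight, and the
# weighted sum aggregates this evidence into one score (logit) per class.
rng = np.random.default_rng(0)
W = rng.normal(size=(3, 4))       # 3 classes, 4 features (illustrative values)
b = rng.normal(size=3)
x = rng.normal(size=4)

evidence = W @ x + b              # per-class aggregated evidence (logits)
probs = softmax(evidence)         # read as normalized plausibilities under the
                                  # belief-function interpretation of the paper
print(evidence, probs, probs.sum())
```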
