An Exploration of Some Pitfalls of Thematic Map Assessment Using the New Map Tools Resource
A variety of metrics are commonly employed by map producers and users to assess and compare the quality of thematic maps, but their use and interpretation are inconsistent. This problem is exacerbated by a shortage of tools that allow easy calculation and comparison of metrics from different maps or as a map's legend is changed. In this paper, we introduce a new website and a collection of R functions to facilitate map assessment. We apply these tools to illustrate some pitfalls of error metrics and point out existing and newly developed solutions to them. Some of these problems have been noted before, but all of them are under-appreciated and persist in the published literature. We show that binary and categorical metrics that include information about true-negative classifications are inflated for rare categories, and that more robust alternatives should be chosen. Most metrics are useful for comparing maps only if their legends are identical. We also demonstrate that combining land-cover classes has the often-neglected consequence of apparent improvement in accuracy, particularly if the combined classes are easily confused (e.g., different forest types). However, we show that the average mutual information (AMI) of a map is relatively robust to combining classes and reflects the information that is lost in this process; we also introduce a modified AMI metric that credits only correct classifications. Finally, we introduce a method of evaluating statistical differences in the information content of competing maps, and show that this method is an improvement over others in more common use. We end with a series of recommendations for the meaningful use of accuracy metrics by map users and producers.
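The paper's tools are provided as a website and R functions; as a language-agnostic sketch of the accuracy-inflation pitfall it describes, the hypothetical Python example below computes overall accuracy and a plain mutual-information measure (one common basis for AMI-style metrics) from a confusion matrix, before and after merging two easily confused forest classes. All counts are invented for illustration and are not from the paper.

```python
import numpy as np

def overall_accuracy(cm):
    """Proportion of correctly classified samples: diagonal sum / total."""
    cm = np.asarray(cm, dtype=float)
    return np.trace(cm) / cm.sum()

def mutual_information_bits(cm):
    """Mutual information (in bits) between map and reference labels,
    estimated from the joint distribution implied by the confusion matrix."""
    p = np.asarray(cm, dtype=float)
    p = p / p.sum()                      # joint probability P(map = i, reference = j)
    pm = p.sum(axis=1, keepdims=True)    # marginal distribution of map classes
    pr = p.sum(axis=0, keepdims=True)    # marginal distribution of reference classes
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = p * np.log2(p / (pm * pr))
    return float(np.nansum(terms))       # treat 0 * log(0) terms as 0

# Hypothetical confusion matrix: rows = map class, columns = reference class.
# Classes 0 and 1 stand for two forest types that are frequently confused.
cm = np.array([[40, 35,  5],
               [30, 45,  5],
               [ 5,  5, 80]])

# Merge the two confusable forest classes into a single class.
merged = np.array([[cm[:2, :2].sum(), cm[:2, 2].sum()],
                   [cm[2, :2].sum(),  cm[2, 2]]])

for name, m in [("original", cm), ("merged", merged)]:
    print(f"{name}: OA = {overall_accuracy(m):.2f}, "
          f"MI = {mutual_information_bits(m):.2f} bits")
```

With these invented counts, overall accuracy jumps from 0.66 to 0.92 after the merge, while the mutual information stays near 0.55 bits in both cases, mirroring the paper's point that AMI-style measures are more robust to combining classes.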
LACO-Wiki: A land cover validation tool and a new, innovative teaching resource for remote sensing and the geosciences
The validation of land cover products is an important step in the workflow of generating a land cover map from remotely sensed imagery. Many students of remote sensing are given exercises on classifying a land cover map, followed by the validation process. Many algorithms exist for classification, embedded within proprietary image processing software or, increasingly, available as open source tools. However, there is little standardization of land cover validation, and no set of open tools is available for implementing this process. The LACO-Wiki tool was developed to fill this gap, bringing together standardized land cover validation methods and workflows into a single portal. This includes the storage and management of land cover maps and validation data; step-by-step instructions to guide users through the validation process; sound sampling designs; an easy-to-use environment for interpreting validation samples; and the generation of accuracy reports based on the validation process. The tool was developed for a range of users, including producers of land cover maps, researchers, teachers and students. Its use could be embedded within the curriculum of remote sensing courses at the university level, but it is simple enough for students aged 13-18. A beta version of the tool is available for testing at: http://www.lacowiki.net
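The abstract does not spell out which sampling designs LACO-Wiki implements beyond calling them "sound"; as a generic illustration of one common design in land cover validation, the hypothetical Python sketch below draws a stratified random sample of pixels from an invented classified raster. The class codes, raster size and sample size are all assumptions, not part of the tool.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical classified map: a 2D array of class codes (1 = forest, 2 = cropland, 3 = water).
land_cover = rng.choice([1, 2, 3], size=(500, 500), p=[0.5, 0.4, 0.1])

def stratified_sample(class_raster, n_per_class):
    """Return (row, col, class) tuples for a stratified random sample,
    drawing n_per_class pixels from every class present in the raster."""
    samples = []
    for cls in np.unique(class_raster):
        rows, cols = np.nonzero(class_raster == cls)
        idx = rng.choice(len(rows), size=min(n_per_class, len(rows)), replace=False)
        samples.extend((int(rows[i]), int(cols[i]), int(cls)) for i in idx)
    return samples

validation_points = stratified_sample(land_cover, n_per_class=50)
print(len(validation_points))   # 150 points in total, 50 per class
```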
LACO-Wiki: A New Online Land Cover Validation Tool Demonstrated Using GlobeLand30 for Kenya
Accuracy assessment, also referred to as validation, is a key process in the workflow of developing a land cover map. To make this process open and transparent, we have developed a new online tool called LACO-Wiki, which encapsulates the process in four simple steps: uploading a land cover map, creating a sample from the map, interpreting the sample with very high resolution satellite imagery, and generating a report with accuracy measures. The aim of this paper is to present the main features of this new tool, followed by an example of how it can be used for the accuracy assessment of a land cover map. For the purpose of illustration, we have chosen GlobeLand30 for Kenya. Two different samples were interpreted by three individuals: one sample was provided by the GlobeLand30 team as part of their international efforts in validating GlobeLand30 with GEO (Group on Earth Observations) member states, while a second sample was generated using LACO-Wiki. Using satellite imagery from Google Maps, Bing and Google Earth, the results show overall accuracies between 53% and 61%, which is lower than the global accuracy assessment of GlobeLand30 but may be reasonable given the complex landscapes found in Kenya. Statistical models were then fit to the data to determine what factors affect the agreement between the three interpreters, such as the land cover class, the presence of very high resolution satellite imagery and the age of the image in relation to the baseline year for GlobeLand30 (2010). The results showed that all of these factors had a significant effect on agreement.
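The abstract does not give the exact model specification; as a hedged sketch of the kind of analysis described (interpreter agreement modelled as a function of land cover class, imagery availability and image age), the Python example below fits a logistic regression to simulated records. All data, column names and coefficients are invented and are not the study's.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 300

# Hypothetical interpretation records: one row per validation point.
df = pd.DataFrame({
    "lc_class":  rng.choice(["cropland", "forest", "grassland", "water"], size=n),
    "has_vhr":   rng.integers(0, 2, size=n),   # very high resolution imagery available?
    "image_age": rng.integers(0, 6, size=n),   # years between the image and the 2010 baseline
})

# Simulate whether all three interpreters agree at each point (invented effect sizes).
linpred = -0.3 + 0.9 * df["has_vhr"] - 0.25 * df["image_age"] + 0.5 * (df["lc_class"] == "water")
df["agree"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-linpred)))

# Logistic regression of agreement on land cover class, imagery availability and image age.
model = smf.logit("agree ~ C(lc_class) + has_vhr + image_age", data=df).fit(disp=0)
print(model.summary())
```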
A global dataset of crowdsourced land cover and land use reference data
Global land cover is an essential climate variable and a key biophysical driver for earth system models. While remote sensing technology, particularly satellites, has played a key role in providing land cover datasets, large discrepancies have been noted among the available products. Global land use is typically more difficult to map and in many cases cannot be remotely sensed. In-situ or ground-based data and high resolution imagery are thus an important requirement for producing accurate land cover and land use datasets, and this is precisely what is lacking. Here we describe the global land cover and land use reference data derived from the Geo-Wiki crowdsourcing platform via four campaigns. These global datasets provide information on human impact, land cover disagreement, wilderness, and land cover and land use. Hence, they are relevant for the scientific community that requires reference data for global satellite-derived products, as well as for those interested in monitoring global terrestrial ecosystems in general.
Multivariate Analysis of Dopaminergic Gene Variants as Risk Factors of Heroin Dependence
BACKGROUND: Heroin dependence is a debilitating psychiatric disorder with complex inheritance. Since the dopaminergic system has a key role in the rewarding mechanism of the brain, which is directly or indirectly targeted by most drugs of abuse, we focus on the effects of, and interactions among, dopaminergic gene variants. OBJECTIVE: To study the potential association between allelic variants of the dopamine D2 receptor (DRD2), ANKK1 (ankyrin repeat and kinase domain containing 1), dopamine D4 receptor (DRD4), catechol-O-methyltransferase (COMT) and dopamine transporter (SLC6A3) genes and heroin dependence in Hungarian patients. METHODS: 303 heroin-dependent subjects and 555 healthy controls were genotyped for 7 single nucleotide polymorphisms (SNPs): rs4680 of the COMT gene; rs1079597 and rs1800498 of the DRD2 gene; rs1800497 of the ANKK1 gene; and rs1800955, rs936462 and rs747302 of the DRD4 gene. Four variable number of tandem repeats (VNTRs) were also genotyped: the 120 bp duplication and the 48 bp VNTR in exon 3 of DRD4, and the 40 bp VNTR and intron 8 VNTR of SLC6A3. We also performed a multivariate analysis of associations using Bayesian networks in Bayesian multilevel analysis (BN-BMLA). FINDINGS AND CONCLUSIONS: In single marker analysis, the TaqIA (rs1800497) and TaqIB (rs1079597) variants were associated with heroin dependence. Moreover, the -521 C/T SNP (rs1800955) of the DRD4 gene showed nominal association, with a possible protective effect of the C allele. After applying the Bonferroni correction, TaqIB remained significant, suggesting that the minor (A) allele of the TaqIB SNP is a risk component in the genetic background of heroin dependence. The findings of the additional multiple marker analysis are consistent with the results of the single marker analysis, but this method was also able to reveal an indirect effect of a promoter polymorphism (rs936462) of the DRD4 gene; this effect is mediated through the -521 C/T (rs1800955) polymorphism in the promoter.
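For readers unfamiliar with the single-marker workflow summarised above, the sketch below shows the generic pattern: a chi-square test on per-variant genotype counts (cases vs. controls), followed by a Bonferroni-adjusted significance threshold across all markers tested. The counts are hypothetical and are not the study's data.

```python
from scipy.stats import chi2_contingency

# Hypothetical 2x3 genotype count tables (cases row, controls row) for each variant tested.
variants = {
    "rs1800497 (TaqIA)":    [[45, 130, 128], [55, 230, 270]],
    "rs1079597 (TaqIB)":    [[50, 128, 125], [52, 225, 278]],
    "rs1800955 (-521 C/T)": [[70, 150,  83], [160, 270, 125]],
}

alpha = 0.05
bonferroni_alpha = alpha / len(variants)   # correct for the number of markers tested

for name, table in variants.items():
    chi2, p, dof, _ = chi2_contingency(table)
    flag = "significant after Bonferroni" if p < bonferroni_alpha else "not significant"
    print(f"{name}: chi2 = {chi2:.2f}, p = {p:.4f} ({flag})")
```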
LACO-Wiki: A New Open Access Online Portal for Land Cover Validation with High Resolution Imagery
Evaluation of Elevated Tritium Levels in Groundwater Downgradient from the 618-11 Burial Ground Phase I Investigations
This report describes the results of the preliminary investigation of elevated tritium in groundwater discovered near the 618-11 burial ground, located in the eastern part of the Hanford Site. Tritium in one well downgradient of the burial ground was detected at levels up to 8,140,000 pCi/L. The 618-11 burial ground received a variety of radioactive waste from the 300 Area between 1962 and 1967. The burial ground covers 3.5 hectares (8.6 acres) and contains trenches, large-diameter caissons, and vertical pipe storage units. The burial ground was stabilized with a covering of native sediment. The Energy Northwest reactor complex was constructed immediately east of the burial ground.
LACO-Wiki Mobile: An Open Source Application for In Situ Data Collection and Land Cover Validation
LACO-Wiki Mobile is a smartphone application for in situ data collection, which is being developed as a free and open source software project in the framework of the European Space Agency funded project CrowdVal. The mobile application works in tandem with the online land cover validation tool LACO-Wiki (https://www.laco-wiki.net), where users can generate a statistically robust sample for validation purposes. The land cover legend is read directly from the map uploaded to LACO-Wiki for generating the sample. The user must also indicate the type of validation to be undertaken: in blind validation the user chooses the land cover type from a pre-defined legend; in plausibility validation users can see the land cover type from the map while on the ground and can then choose to accept it as correct or not; and enhanced plausibility allows users to indicate the correct land cover type when it is incorrectly specified in the land cover map. This sample is then transferred to the mobile application, which directs users to specific locations on the ground to collect the validation data, i.e. operating in a 'directed' mode. The user can see the locations of the points on the mobile phone and, as they approach a point, is given the option to validate the land cover. These points can then be used in the accuracy assessment of the land cover map. The application also operates in an 'opportunistic' mode, i.e. allowing the user to collect land cover at any location of their choice, e.g. while driving along a road. Such data collection can be useful for verifying visually interpreted samples or complementing training data for the development of land cover maps. Although the open source application is still under development, a version will be openly accessible on GitHub by the time of the conference.
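A minimal sketch of what the 'directed' mode described above could look like internally (this is not the LACO-Wiki Mobile source; the data structure, field names and 50 m threshold are assumptions): compute the great-circle distance from the current GPS position to each sample point and prompt for validation once an unvalidated point is within range.

```python
from dataclasses import dataclass
from math import asin, cos, radians, sin, sqrt

@dataclass
class SamplePoint:
    point_id: int
    lat: float
    lon: float
    map_class: str          # land cover class read from the uploaded map's legend
    validated: bool = False

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 coordinates."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371000 * asin(sqrt(a))

def points_in_range(position, points, threshold_m=50.0):
    """Return the unvalidated sample points within threshold_m of the current position."""
    lat, lon = position
    return [p for p in points
            if not p.validated and haversine_m(lat, lon, p.lat, p.lon) <= threshold_m]

# Hypothetical sample transferred from the LACO-Wiki portal.
sample = [SamplePoint(1, -1.2921, 36.8219, "cropland"),
          SamplePoint(2, -1.3000, 36.8300, "grassland")]
for p in points_in_range((-1.2923, 36.8221), sample):
    print(f"Point {p.point_id}: map says '{p.map_class}' - confirm or correct on the ground")
```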
Geochemical Characterization of Chromate Contamination in the 100 Area Vadose Zone at the Hanford Site
The major objectives of the study were to: (1) determine the leaching characteristics of hexavalent chromium [Cr(VI)] from contaminated sediments collected from 100 Area spill sites; (2) elucidate possible Cr(VI) mineral and/or chemical associations that may be responsible for Cr(VI) retention in the Hanford Site 100 Areas through (i) macroscopic leaching studies and (ii) microscale characterization of contaminated sediments; and (3) provide information to construct a conceptual model of Cr(VI) geochemistry in the Hanford 100 Area vadose zone. In addressing these objectives, additional benefits accrued were: (1) a fuller understanding of Cr(VI) entrained in the vadose zone that can be utilized in modeling potential Cr(VI) source terms, and (2) acceleration of the Columbia River 100 Area corridor cleanup by providing valuable information for developing remedial actions based on a fundamental understanding of Cr(VI) vadose zone geochemistry. A series of macroscopic column experiments was conducted with contaminated and uncontaminated sediments to study Cr(VI) desorption patterns in aged and freshly contaminated sediments; evaluate the transport characteristics of dichromate liquid retrieved from old pipelines of the 100 Area; and estimate the effect of a strongly reducing liquid on the reduction and transport of Cr(VI). Column experiments used the < 2 mm fraction of the sediment samples and a simulated Hanford groundwater solution. Periodic stop-flow events were applied to evaluate the change in elemental concentration during periods of no flow and greater fluid residence time. The results were fit using a two-site, one-dimensional reactive transport model. Sediments were characterized for the spatial and mineralogical associations of the contamination using an array of microscale techniques such as XRD, SEM, EDS, XPS, XMP, and XANES. The following are important conclusions and implications. Results from column experiments indicated that most of the contaminant Cr travels quickly through the sediments and appears as Cr(VI) in the effluents; the significance of this for groundwater concentrations would, however, depend on the mass flux of recharge to the water table. Adsorption of Cr(VI) to sediments from spiked Cr(VI) solution is low; calculated retardation coefficients are close to one. Calcium polysulfide solutions readily reduced Cr(VI) to Cr(III) in the column experiments; however, a significant amount of the Cr(VI) was mobilized ahead of the polysulfide solution front. This has significant implications for in-situ reductive remediation techniques: the experiments suggest that it would be difficult to design a remedial measure using infiltration of liquid-phase reductants without increasing the transport of Cr(VI) toward the water table. The microscopic characterization results are consistent with the column studies. Cr(VI) is found as ubiquitous coatings on sediment grain surfaces. Small, higher-concentration chromium sites are associated with secondary clay mineral inclusions, with occasional barium chromate minerals, and with Cr reduced to Cr(III) in association with iron oxides that are most likely primary magnetite. Within the restricted-access domains of the sediment matrix, ferrous iron could also diffuse from in situ, high-surface-area minerals to cause the reductive immobilization of chromate. This process may be favored in microscale geochemical zones where ferrous iron can be supplied. Once nucleated, micrometer-scale precipitates are favored as locales for further accumulation and growth, causing the formation of discrete zones of Cr(III).
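For context on the "retardation coefficients close to one" finding, the sketch below evaluates the textbook linear-equilibrium retardation factor R = 1 + (rho_b / theta) * Kd rather than the report's two-site model; the bulk density, porosity and Kd values are illustrative assumptions, chosen only to show that a near-zero Kd gives R near one, i.e. Cr(VI) moving at roughly the pore-water velocity.

```python
def retardation_factor(bulk_density_g_cm3, porosity, kd_ml_g):
    """Linear-equilibrium retardation factor R = 1 + (rho_b / theta) * Kd.
    R ~ 1 means the solute moves at roughly the pore-water velocity."""
    return 1.0 + (bulk_density_g_cm3 / porosity) * kd_ml_g

# Illustrative values for a coarse, Hanford-like sediment (assumed, not measured).
rho_b, theta = 1.6, 0.35          # bulk density in g/cm^3, dimensionless porosity
for kd in (0.0, 0.05, 1.0):       # Kd in mL/g; weakly sorbing Cr(VI) has Kd near zero
    print(f"Kd = {kd:>4} mL/g  ->  R = {retardation_factor(rho_b, theta, kd):.2f}")
```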
