An indirect assessment of thematic accuracy in the geologic habitat maps of Oregon and Washington

Chris Romsos, Chris Goldfinger, Rondi Robison, and Jason Chaytor
Active Tectonics and Seafloor Mapping Lab, College of Oceanic and Atmospheric Sciences, Oregon State University, Corvallis, OR 97331

What is Thematic Accuracy and how is it commonly assessed?

Introduction:
The surficial geologic habitat maps for Oregon and Washington are thematic maps: they show the distribution of benthic habitat classes across the continental margin of Oregon and Washington. Traditionally, map accuracy has described only positional accuracy (e.g., elevations accurate to within ±10 m). A thematic map of marine geologic habitat, the product of an interpretive process, introduces another type of accuracy, termed "thematic accuracy," which concerns the misidentification or omission of a habitat class. An assessment of thematic accuracy for the geologic habitat maps is needed because the maps are being used in high-level decision-making processes and modeling efforts. To address this problem, a map set of weighted data density is produced. The map set serves as an estimate of thematic accuracy, based on the assumption that data-rich areas yield the highest-quality interpretations of habitat classes. The map set portrays, for each of the underlying data types, a continuous "density" surface weighted according to the unique qualities of that dataset. "Unique" qualities relate to our assessment of the utility of a particular data type for the strict purpose of interpreting the physiographic and lithologic character of mapped habitats. A final composite weighted density surface, or "shadow map," serves as a visual guide to data-rich and data-poor regions and as a surrogate for an interpretive quality assessment or calibration study.

Methods:
The weighted data-density mapping method evaluates the "quality" of each data type independently on a scale of one to ten, and then in aggregate (the final composite map). Quality ranks for each data type are determined from the nature and shape of its density distribution and from our interpretation of its utility; that is, each data type is standardized against a qualitative assessment of its value for habitat mapping. This standardized ranking procedure allows disparate data types to be combined into the final assessment of overall "quality." Providing a ranked density map for each data type also allows map users to reclassify or reassemble the final maps using alternate criteria (e.g., selecting the maximum quality score at any location instead of the sum of scores).
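To make the ranking step concrete, the following is a minimal Python sketch that reclassifies a grid of sounding counts into 1–10 quality ranks using the soundings break values from Table 1. It is an illustration only: the original workflow used an MB-System extension and GIS reclassification, the input grid here is synthetic, and the handling of bin boundaries and the variable names are assumptions.

```python
import numpy as np

def rank_soundings(density):
    """Reclassify soundings per 100 m grid cell to a 1-10 quality rank.

    Break values follow the soundings scheme in Table 1; how counts that fall
    exactly on a listed boundary (e.g. 5) are binned is an assumption, since
    the published ranges overlap at their endpoints.
    """
    rank = np.ones_like(density)   # 0 soundings -> rank 1
    rank[density >= 1] = 2         # 1 sounding  -> rank 2
    rank[density >= 2] = 3         # 2-5         -> rank 3
    rank[density >= 5] = 5         # 5-60        -> rank 5
    rank[density > 60] = 10        # >60         -> rank 10
    return rank

# Synthetic 100 m grid of sounding counts, purely for illustration; in the
# original workflow the counts were produced by an MB-System extension.
density = np.random.default_rng(0).integers(0, 200, size=(200, 200))
soundings_rank = rank_soundings(density)
```

The same standardization applies to the other layers, except that their ranks are assigned per dataset or survey (Table 1) rather than per density bin.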
Results:
The weighted data-density maps were specifically designed to be incorporated into the Bayesian Network modeling of Essential Fish Habitat (EFH) by MRAG Americas, contractors to NOAA Fisheries and the Pacific Fisheries Management Council. There are five individual shadow maps of data density and quality, permitting exploration of dependency at model nodes. The first four maps are each unique to a particular data type or survey technique (bathymetric, sample, seismic reflection, and sidescan data types). The fifth map is a composite of the principal four, created by summing quality scores at each grid cell. The raster data format permits easy spatial queries. Also provided are the raw data distribution maps used to create the weighted density surfaces (not shown). In this format, the deliverable is not a dead-end product: it remains possible to view, re-order, or re-render any map according to the needs of the research question at hand.

Table 1. Data Weighting Schemes

Soundings Weighted Data-Density (soundings per 100 m grid cell):
  0    -> quality/rank 1
  1    -> quality/rank 2
  2-5  -> quality/rank 3
  5-60 -> quality/rank 5
  >60  -> quality/rank 10

Samples Weighted Data-Density:
  All sediment samples (buffered at 500 m radius) -> quality/rank 10

Seismic Weighted Data-Density:
  USGS, Corliss Cruise (Twichell, 1998)          -> quality/rank 10
  MCAR (McCrory, 1998)                           -> quality/rank 10
  OSU (Goldfinger, 1997)                         -> quality/rank 10
  Industry Dataset 1                             -> quality/rank 10
  Industry Dataset 2                             -> quality/rank 5
  Industry Dataset 3 (unpublished)               -> quality/rank 5
  USGS, Boomer                                   -> quality/rank 5
  UW (Palmer, 1998)                              -> quality/rank 5
  Digicon (Goldfinger, 1992)                     -> quality/rank 5
  Sonne (Flueh, 1996)                            -> quality/rank 5
  Industry Dataset 4                             -> quality/rank 1
  Silver (Silver, 1972)                          -> quality/rank 1
  UW TT79                                        -> quality/rank 1
  USGS Open File Report 87-607 (Snavely, 87-607) -> quality/rank 1

Sidescan Sonar Weighted Data-Density:
  Gloria EEZ Survey                 -> quality/rank 1
  High Resolution Deep-Tow Surveys  -> quality/rank 10
  High Resolution Nearshore Surveys -> quality/rank 10

Figure 1. Weighted Sounding Density. The total number of soundings within each 100 m grid cell is determined using an extension of MB-System. The resultant grid is reclassified according to the soundings scheme in Table 1. This scheme emphasizes the lower portion of the density range, where small increases in density correspond to large increases in bathymetric quality.

Figure 2. Weighted Sample Density. Sample points are buffered at a 500 m radius using the geoprocessing tools of ArcGIS, creating a polygon dataset. Each polygon buffer is assigned a rank of 10 (we assume that all sample data are excellent; however, their utility degrades rapidly away from the sample point). The polygon dataset is converted to a 100 m raster in the final step.

Figure 3. Weighted Seismic Density. The quality of seismic data for mapping habitat and predicting rock outcrop varies greatly among the individual datasets (Table 1), a result of the varying acoustic frequencies and technologies used to collect these data. Processing steps are similar to those in Figure 2: all seismic survey lines are buffered at 500 m.

Figure 4. Weighted Sidescan Density. Sidescan sonar data are high quality, with the lone exception of the low-frequency Gloria EEZ survey; low-frequency systems are not typically used to infer surficial geology. All sidescan datasets are ranked according to the scheme in Table 1. The cell size of the final raster is 100 m, though the resolution of sidescan data is typically much higher.

Figure 5. Composite map showing the additive weighted data-density value at each grid cell among all datasets (the sum of the weighted densities in Figures 1-4).

Conclusions/Implications:
• This assessment of thematic accuracy is based solely on the quantity and quality of the input data. Thematic accuracy should also be assessed using traditional methods (reference datasets) as they become available.
• These maps make it possible to prioritize continued acoustic and in-situ sampling programs, ensuring that additional data are collected over areas that are poorly covered at present.

References:
• Crist, P., and R. Deitner, 2000. Assessing Land Cover Map Accuracy, Version 2.0.0 (16 February 2000). A handbook for conducting Gap Analysis. http://www.gap.uidaho.edu/LandCoverAssessment/default.htm
• Romsos, C., Goldfinger, C., and Chaytor, J., 2003. Mapping Data Density and Quality on the Oregon and Washington Continental Margin. Technical Memorandum.

Support for this project provided by:
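As a closing illustration of the point that the deliverables are not a dead-end product, the sketch below assembles the composite in two ways: the published additive rule (sum of quality scores, as in Figure 5) and the alternate criterion mentioned in the Methods (the maximum quality score at each location). The four ranked grids are stubbed with synthetic arrays; in practice they would be read from the delivered 100 m rasters, and coding cells with no coverage as 0 is an assumption made here for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-ins for the four co-registered 100 m ranked rasters (Figures 1-4);
# in practice these would be read from the delivered grids.
soundings_rank = rng.choice([1, 2, 3, 5, 10], size=(200, 200))
samples_rank   = rng.choice([0, 10],          size=(200, 200))
seismic_rank   = rng.choice([0, 1, 5, 10],    size=(200, 200))
sidescan_rank  = rng.choice([0, 1, 10],       size=(200, 200))

layers = np.stack([soundings_rank, samples_rank, seismic_rank, sidescan_rank])

# Published composite (Figure 5): additive quality score at each grid cell.
composite_sum = layers.sum(axis=0)

# Alternate re-rendering: quality of the single best dataset at each cell.
composite_max = layers.max(axis=0)
```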