SOIL MAPPER®

soil mapper® is fully automatic software that generates land cover classification maps through the analysis of multispectral satellite data in the optical domain.

As input, it requires multispectral remotely sensed (RS) images calibrated to Top of Atmosphere (TOA) physical values: TOA reflectance for the Visible (VIS), Near Infrared (NIR), Short Wave Infrared (SWIR) and Mid-Wave Infrared (MIR) bands, and brightness temperature (BT) for the Thermal Infrared (TIR) bands.
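
For illustration, the following minimal sketch (Python/NumPy, not part of the soil mapper® package) shows the standard DN-to-TOA conversions that typically produce such inputs: TOA reflectance for the solar bands and brightness temperature for the thermal bands. The calibration coefficients (gain, bias, ESUN, K1, K2) are sensor-specific values taken from the scene metadata.

    import numpy as np

    def toa_reflectance(dn, gain, bias, esun, sun_elevation_deg, earth_sun_dist_au):
        """Convert a VIS/NIR/SWIR/MIR band from digital numbers (DN) to TOA
        reflectance via the standard at-sensor radiance formulation."""
        radiance = gain * dn.astype(np.float64) + bias                  # W m^-2 sr^-1 um^-1
        sun_zenith = np.deg2rad(90.0 - sun_elevation_deg)
        return (np.pi * radiance * earth_sun_dist_au ** 2) / (esun * np.cos(sun_zenith))

    def brightness_temperature(dn, gain, bias, k1, k2):
        """Convert a TIR band from DN to at-sensor brightness temperature
        (Kelvin) using the sensor calibration constants K1 and K2."""
        radiance = gain * dn.astype(np.float64) + bias
        return k2 / np.log(k1 / radiance + 1.0)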

As output, it generates a preliminary classification map where each pixel is associated with one label belonging to a discrete set of spectral categories.
In its latest version, soil mapper® has been extended to support additional satellite sensors, and its performance has been improved through a new spectral classification system and an upgraded cloud detection technique.
Another key improvement is the standardization of classification outputs across the supported satellite sensors, which allows land cover classifications derived from images of different sensors to be compared directly.

To run, soil mapper® requires neither user supervision nor ground truth data samples, i.e., it is fully automatic (unsupervised).
Spectral classes detected by soil mapper® carry a semantic meaning and belong to the following categories (an illustrative labeling sketch follows the list):

  • Vegetation,
  • Bare soil / Built-up,
  • Snow / Ice,
  • Clouds,
  • Water / Shadows,
  • Outliers.
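
As an illustration of how per-pixel spectral rules can map calibrated TOA values onto such categories, the toy sketch below combines a few common indexes with simple thresholds. The rules, thresholds and class codes are illustrative assumptions only and do not reproduce the actual soil mapper® rule set (see the Baraldi et al. paper cited at the end of this page for the real spectral rules).

    import numpy as np

    def toy_spectral_labels(blue, green, red, nir, swir, bt_kelvin):
        """Toy per-pixel spectral rules assigning each pixel to one of six broad
        categories. Reflectance inputs are TOA reflectance in [0, 1]; bt_kelvin
        is the TIR brightness temperature in Kelvin."""
        eps = 1e-6
        ndvi = (nir - red) / (nir + red + eps)
        ndsi = (green - swir) / (green + swir + eps)
        brightness = (blue + green + red + nir) / 4.0

        conditions = [
            (nir < 0.05) & (ndvi < 0.0),                 # water / shadows: dark in the NIR
            (ndsi > 0.4) & (nir > 0.11),                 # snow / ice: high NDSI, bright enough in NIR
            (brightness > 0.35) & (bt_kelvin < 290.0),   # clouds: bright and cold
            ndvi > 0.3,                                  # vegetation: high NDVI
            brightness > 0.15,                           # bare soil / built-up: bright, low NDVI
        ]
        codes = [5, 3, 4, 1, 2]                          # first matching rule wins
        return np.select(conditions, codes, default=6).astype(np.uint8)  # 6 = outliers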

soil mapper® output can be used directly as a soil classification map, or it can serve as a Baseline Map for further analyses in the following fields:

  • specific analyses focused on a defined stratum or group of strata
  • advanced semantic-geographic queries
  • multitemporal applications
  • other soil analysis applications

In general, a classification map is suitable for driving further second-stage (e.g., supervised, hierarchical) image analysis algorithms (e.g., segmentation, classification, clustering, topographic correction, etc.) on a stratified image basis.
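
A minimal sketch of such stratified, second-stage use is given below; the class code and the chosen statistic are assumptions for illustration, not part of soil mapper® itself.

    import numpy as np

    VEGETATION = 1   # assumed class code of the vegetation stratum in the baseline map

    def stratum_statistics(class_map, ndvi, stratum=VEGETATION):
        """Second-stage analysis restricted to one stratum: compute NDVI
        statistics only over the pixels assigned to that stratum."""
        values = ndvi[class_map == stratum]
        if values.size == 0:
            return None
        return {"pixels": int(values.size),
                "ndvi_mean": float(values.mean()),
                "ndvi_std": float(values.std())}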

soil mapper® has been extended to support the most common satellite optical sensors (from medium to high and very high resolution), such as: MODIS, AVHRR, AATSR, MERIS, Landsat 5 TM/7 ETM+, ASTER, SPOT-4 HRVIR, SPOT-5 HRG, IRS 1-C/-D, IRS-P6, IKONOS, ALOS/AVNIR-2, QuickBird, WorldView-2.
Further ingestion modules for other existing and future missions (e.g. MSG/SEVIRI, CBERS, RapidEye, Pleiades, GeoEye-1) or specific sensors (e.g. airborne multispectral sensors such as AVIRIS) can be implemented.

In its current version, soil mapper® automatically generates three output classification maps corresponding to three different classification level sets (each classification set has the same number of classes for all the supported satellite sensors; see the remapping sketch after the list):

  • “Complete classification set”, identifying the largest possible set of output spectral categories, consisting of 56 land cover classes
  • “Intermediate classification set”, identifying a reduced set of spectral categories, consisting of 26 land cover classes
  • “Corine Land Cover-like classification set”, identifying a minimal set of spectral categories, consisting of a semantic classification with 12 classes in accordance with the CORINE Land Cover classification system
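
If the three level sets are read as nested aggregations of one another (they range from 56 down to 12 classes), a coarser map can be derived from a finer one with a simple lookup table, as in the sketch below. The class codes and groupings are hypothetical placeholders, not the actual soil mapper® legends.

    import numpy as np

    # Hypothetical fragment of a legend: the 56 complete-set class codes (0..55)
    # aggregated into coarser intermediate-set codes. The real legends differ.
    COMPLETE_TO_INTERMEDIATE = np.zeros(56, dtype=np.uint8)
    COMPLETE_TO_INTERMEDIATE[0:10] = 1     # e.g. dense-vegetation subclasses -> one class
    COMPLETE_TO_INTERMEDIATE[10:20] = 2    # sparse-vegetation subclasses
    COMPLETE_TO_INTERMEDIATE[20:30] = 3    # bare soil / built-up subclasses
    # ... remaining codes assigned in the same way ...

    def aggregate(class_map, lookup=COMPLETE_TO_INTERMEDIATE):
        """Remap a fine-level classification map to a coarser level set by
        indexing the lookup table with the per-pixel class codes."""
        return lookup[class_map]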

Besides the three-level classification maps, the latest version of soil mapper® generates a series of masks and spectral indexes (Value Added Products) that enrich the software package output and provide continuous spectral indexes potentially useful for further application-dependent image analysis, such as:

  • Vegetation index
  • Normalized difference vegetation index
  • Normalized difference snow index
  • Cloud mask
  • Water mask
  • Vegetation mask

Moreover, a series of masked indexes (combinations of the indexes and masks: VI*M_VEGETATION, NDVI*M_VEGETATION, NDSI*M_WATER) can also be generated; see the sketch below.
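
A minimal sketch of these Value Added Products is shown below. NDVI and NDSI use their standard definitions; the simple-ratio VI, the thresholds and the mask definitions are illustrative assumptions rather than soil mapper® internals.

    import numpy as np

    def value_added_products(green, red, nir, swir):
        """Compute continuous spectral indexes, derive binary masks and combine
        them into the masked indexes listed above. Inputs are TOA reflectance."""
        eps = 1e-6
        vi = nir / (red + eps)                          # simple-ratio vegetation index (assumed form)
        ndvi = (nir - red) / (nir + red + eps)          # normalized difference vegetation index
        ndsi = (green - swir) / (green + swir + eps)    # normalized difference snow index

        m_vegetation = ndvi > 0.3                       # illustrative vegetation mask
        m_water = (nir < 0.05) & (ndvi < 0.0)           # illustrative water mask

        return {
            "VI*M_VEGETATION": vi * m_vegetation,       # each index is kept only where the
            "NDVI*M_VEGETATION": ndvi * m_vegetation,   # corresponding binary mask is true
            "NDSI*M_WATER": ndsi * m_water,
        }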

soil mapper® takes its inspiration from a paper published in the RS literature: Baraldi et al. (2006), “Automatic Spectral Rule-Based Preliminary Mapping of Calibrated Landsat TM and ETM+ Images”, IEEE Transactions on Geoscience and Remote Sensing, 44(9), 2563-2586.