Smart, texture-sensitive instrument classification for in situ rock and layer analysis

K. L. Wagstaff, D. R. Thompson, W. Abbey, A. Allwood, D. L. Bekker, N. A. Cabrol, T. Fuchs, K. Ortega

Research output: Contribution to journal › Article › peer-review

17 Scopus citations

Abstract

Science missions have limited lifetimes, necessitating an efficient investigation of the field site. The efficiency of onboard cameras, critical for planning, is limited by the need to downlink images to Earth for every decision. Recent advances have enabled rovers to take follow-up actions without waiting hours or days for new instructions. We propose using built-in processing by the instrument itself for adaptive data collection, faster reconnaissance, and increased mission science yield. We have developed a machine learning pixel classifier that is sensitive to texture differences in surface materials, enabling more sophisticated onboard classification than was previously possible. This classifier can be implemented in a Field Programmable Gate Array (FPGA) for maximal efficiency and minimal impact on the rest of the system's functions. In this paper, we report on initial results from applying the texture-sensitive classifier to three example analysis tasks using data from the Mars Exploration Rovers.

Key Points:

  • Smart instruments can analyze their own data for science investigations
  • Random forest classifiers can effectively address texture-based image tasks
  • Future spacecraft can train and deploy their own image classifiers
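The abstract describes a random forest pixel classifier driven by texture differences but does not reproduce the algorithm itself. As a minimal sketch of the general idea, the following example trains scikit-learn's `RandomForestClassifier` on simple per-pixel texture features (local mean and local standard deviation over a sliding window) computed from a synthetic two-texture image. The window size, feature set, and synthetic data are illustrative assumptions, not the paper's actual feature pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def texture_features(image, window=5):
    """Per-pixel texture features: local mean and local std over a square window.

    These two statistics are a deliberately simple stand-in for richer texture
    descriptors; the window size of 5 is an illustrative choice.
    """
    pad = window // 2
    padded = np.pad(image, pad, mode="reflect")
    h, w = image.shape
    feats = np.zeros((h, w, 2))
    for i in range(h):
        for j in range(w):
            patch = padded[i:i + window, j:j + window]
            feats[i, j, 0] = patch.mean()  # local brightness
            feats[i, j, 1] = patch.std()   # local roughness
    return feats.reshape(-1, 2)

# Synthetic scene: a smooth region (left) next to a noisy, "rough" region (right),
# mimicking two surface materials with the same mean brightness structure but
# different texture.
rng = np.random.default_rng(0)
img = np.zeros((32, 32))
img[:, 16:] = rng.normal(0.5, 0.3, size=(32, 16))
labels = np.zeros((32, 32), dtype=int)
labels[:, 16:] = 1

X = texture_features(img)
y = labels.ravel()

clf = RandomForestClassifier(n_estimators=50, random_state=0)
clf.fit(X, y)
acc = clf.score(X, y)
```

On this toy scene the local standard deviation alone separates the two materials almost perfectly, which is the same property a texture-sensitive classifier exploits: pixels with identical intensity can still be distinguished by the statistics of their neighborhood.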

Original language: English
Pages (from-to): 4188-4193
Number of pages: 6
Journal: Geophysical Research Letters
Volume: 40
Issue number: 16
DOIs
State: Published - 28 Aug 2013
Externally published: Yes

Keywords

  • classification
  • onboard analysis

