Cancer-cell deep-learning classification by integrating quantitative-phase spatial and temporal fluctuations

Shani Ben Baruch, Noa Rotman-Nativ, Alon Baram, Hayit Greenspan, Natan T. Shaked

Research output: Contribution to journal › Article › peer-review


Abstract

We present a new classification approach for live cells that integrates the spatial and temporal fluctuation maps and the quantitative optical thickness map of the cell, as acquired by common-path quantitative-phase dynamic imaging and processed with a deep-learning framework. We demonstrate this approach by classifying two cancer cell lines of different metastatic potential originating from the same patient. The approach builds on the fact that both the cancer-cell morphology and its mechanical properties, as indicated by the cell's temporal and spatial fluctuations, change as the disease progresses. We tested different fusion methods for inputting both the morphological optical thickness maps and the coinciding spatio-temporal fluctuation maps of the cells into the classifying network framework. We show that the proposed integrated triple-path deep-learning architecture improves on deep-learning classification based only on morphological evaluation of the cell via its quantitative optical thickness map, demonstrating the benefit of acquiring the cells over time and extracting their spatio-temporal fluctuation maps as an additional input to the classifying deep neural network.
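The triple-path idea described above can be sketched as three independent encoding paths, one per input map, whose features are fused before a shared classification head. The following is a minimal NumPy illustration, not the authors' implementation: the layer sizes, single-dense-layer paths, and concatenation-based late fusion are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def branch(x, w, b):
    """One path: a single dense layer with ReLU on a flattened input map."""
    return np.maximum(0.0, x @ w + b)

# Toy 16x16 flattened maps standing in for the three per-cell inputs:
# quantitative optical thickness, spatial fluctuations, temporal fluctuations.
n_pix, n_feat = 16 * 16, 32
thickness = rng.standard_normal(n_pix)
spatial_fluct = rng.standard_normal(n_pix)
temporal_fluct = rng.standard_normal(n_pix)

# Independent weights per path (hypothetical sizes, not from the paper).
params = [(rng.standard_normal((n_pix, n_feat)) * 0.01, np.zeros(n_feat))
          for _ in range(3)]

# Each map is encoded by its own path; the three feature vectors are
# then fused by concatenation (late fusion) before the classifier head.
features = np.concatenate([
    branch(x, w, b)
    for x, (w, b) in zip([thickness, spatial_fluct, temporal_fluct], params)
])

# Two-class head: softmax over logits for low vs. high metastatic potential.
w_head = rng.standard_normal((3 * n_feat, 2)) * 0.01
logits = features @ w_head
probs = np.exp(logits) / np.exp(logits).sum()
```

Fusing after separate paths lets each map be encoded with its own weights before the classifier sees the joint representation, which is one common way to combine heterogeneous inputs of this kind.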

Original language: English
Article number: 3353
Journal: Cells
Volume: 10
Issue number: 12
State: Published - Dec 2021
Externally published: Yes

Keywords

  • Cancer cells
  • Classification
  • Deep learning
  • Fluctuations
