Classification of breast ultrasound with human-rating BI-RADS scores using mined diagnostic patterns and optimized neuro-network

Qinghua Huang, Zhaoji Miao, Jiawei Li, Longzhong Liu, Xuelong Li

Research output: Contribution to journal › Article › peer-review

4 Scopus citations

Abstract

Breast ultrasound (BUS) is a powerful screening tool for the examination of breast lesions. Recently, research attention has been paid to combining doctors' opinions with machine learning technology to build better computer-aided diagnosis (CAD) systems. In this paper, we propose an improved approach that uses human-rated BI-RADS scores to classify BUS samples. A BI-RADS feature scoring scheme is first adopted to standardize the descriptions of breast lesions, and diagnostic patterns are then mined from the collected BI-RADS feature score dataset by a biclustering algorithm. Given an input sample, each diagnostic pattern is activated to a degree determined by the distance between the sample and the pattern, and these activation degrees serve as high-level features. The high-level features of the sample are fed into a multi-layer perceptron neural network (MLPNN), and a cost matrix converts the network's output from class probabilities to classification costs. The structure of the MLPNN and the element values of the cost matrix are optimized by particle swarm optimization, and the optimized network finally classifies the input BUS sample. In comparative experiments against other CAD approaches and experienced sonographers, the proposed approach achieved the best sensitivity, indicating that it can serve as an assistive diagnostic system in clinical practice.
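The two later stages of the pipeline described in the abstract can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the inverse-distance activation, the toy patterns, and the example cost values are all assumptions introduced here. It shows (1) turning a BI-RADS feature-score vector into high-level features via its distance to each mined diagnostic pattern, and (2) using a cost matrix to convert class probabilities into expected classification costs, so a false negative can be penalized more heavily than a false positive.

```python
import numpy as np

def pattern_activations(sample, patterns):
    """High-level features: each mined pattern is activated more strongly
    the closer the input sample lies to it (one plausible monotone mapping)."""
    dists = np.linalg.norm(patterns - sample, axis=1)
    return 1.0 / (1.0 + dists)

def cost_sensitive_decision(probs, cost_matrix):
    """Expected cost of predicting each class; pick the cheapest class.
    cost_matrix[i, j] = cost of predicting class j when the truth is class i."""
    expected_cost = probs @ cost_matrix
    return int(np.argmin(expected_cost))

# Toy example: two mined patterns over five BI-RADS feature scores (values invented).
patterns = np.array([[1, 2, 1, 1, 2],    # a "benign-like" pattern
                     [4, 5, 4, 3, 5]])   # a "malignant-like" pattern
sample = np.array([4, 4, 4, 3, 5])
feats = pattern_activations(sample, patterns)  # high-level feature vector

# Suppose an MLPNN produced these class probabilities (benign, malignant) and
# a missed malignancy is penalized five times more than a false alarm.
probs = np.array([0.6, 0.4])
cost_matrix = np.array([[0.0, 1.0],    # truth benign: predict benign / malignant
                        [5.0, 0.0]])   # truth malignant: false negative costs 5x
decision = cost_sensitive_decision(probs, cost_matrix)  # 1, i.e. malignant
```

Note that the cost-sensitive decision selects "malignant" even though it has the lower probability, which is the mechanism by which such a cost matrix (here with values tuned by PSO in the paper's approach) can trade specificity for the higher sensitivity reported in the abstract.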

Original language: English
Pages (from-to): 536-542
Number of pages: 7
Journal: Neurocomputing
Volume: 417
DOIs
State: Published - 5 Dec 2020

Keywords

  • Adaptive filter approach
  • BI-RADS
  • Cost-sensitive MLPNN
  • Ultrasound CAD approaches
