Development of depth level estimation algorithm for breast self-examination

Date of Publication


Document Type

Master's Thesis

Degree Name

Master of Science in Electronics and Communications Engineering


Gokongwei College of Engineering


Electronics and Communications Engineering


The previous studies of Mohammadi et al. [1] and Masilang et al. [2] proposed a computer vision-based Breast Self-Examination (BSE) guidance system. Using only a simple webcam and a desktop computer, it could indicate to the user which area of the breast should be palpated. However, a comprehensive BSE guidance system requires not only that all areas of the breast be palpated, but also that each palpation apply the proper pressure level. The goal of this thesis is to develop a depth level estimation algorithm that recognizes whether a palpation is low, medium, or deep using only a simple RGB camera and a desktop computer.

To do this, we gathered RGB and depth images of subjects with varying cup sizes performing BSE, using a Kinect for Xbox 360. A medical expert supervised the recording of this dataset. This study also introduces an evaluation scheme that quantifies low, medium, and deep pressure levels from the dataset using a fuzzy-like membership relation. The same evaluation scheme was then used to obtain quantitative accuracy figures for previous studies on depth estimation for BSE.
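The abstract does not spell out the form of the fuzzy-like membership relation, but the idea can be illustrated with a minimal sketch: each measured palpation depth receives a degree of membership in the low, medium, and deep classes, and a hard label is taken as the class with the highest degree. The triangular shape and the breakpoints (5, 15, and 25 mm) below are illustrative assumptions, not the thesis' calibrated values.

```python
import numpy as np

def triangular(x, a, b, c):
    """Triangular membership function: rises on [a, b], falls on [b, c]."""
    x = np.asarray(x, dtype=float)
    left = np.clip((x - a) / (b - a), 0.0, 1.0)
    right = np.clip((c - x) / (c - b), 0.0, 1.0)
    return np.minimum(left, right)

def depth_memberships(depth_mm):
    """Return (low, medium, deep) membership degrees for a depth in mm.

    The breakpoints are hypothetical placeholders for illustration.
    """
    low = triangular(depth_mm, -5.0, 5.0, 15.0)
    medium = triangular(depth_mm, 5.0, 15.0, 25.0)
    deep = triangular(depth_mm, 15.0, 25.0, 35.0)
    return low, medium, deep

def label(depth_mm):
    """Hard label = the class with the highest membership degree."""
    names = ("low", "medium", "deep")
    return names[int(np.argmax(depth_memberships(depth_mm)))]
```

Overlapping memberships of this kind let a borderline press (e.g. 10 mm) count partially toward two classes, which is what makes the scheme useful for scoring earlier depth-estimation systems quantitatively.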

For estimating depth with a simple camera, this study proposes Local Binary Pattern (LBP) global histogram features and the nine Laws texture energy histograms as the feature extraction scheme. These features serve as the input to a Support Vector Machine (SVM) classifier. The resulting depth level estimation algorithm classifies depth levels with an overall test accuracy of 77.71%, a 250% higher accuracy than the state of the art.
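The two feature extractors named above can be sketched with plain numpy: an 8-neighbour LBP code image reduced to a 256-bin global histogram, and nine Laws masks (outer products of the classic L5, E5, and S5 vectors) whose absolute filter responses are binned into per-mask histograms. This is a minimal illustration of the named techniques, not the thesis' implementation; the bin count and the choice of three Laws vectors are assumptions, and the SVM stage (e.g. scikit-learn's `SVC`) would consume the concatenated feature vector.

```python
import numpy as np

def lbp_histogram(img):
    """256-bin global histogram of 8-neighbour LBP codes (normalized)."""
    img = np.asarray(img, dtype=float)
    h, w = img.shape
    center = img[1:-1, 1:-1]
    code = np.zeros_like(center, dtype=np.uint8)
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    for bit, (dy, dx) in enumerate(offsets):
        neigh = img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
        code |= (neigh >= center).astype(np.uint8) << bit
    hist = np.bincount(code.ravel(), minlength=256).astype(float)
    return hist / hist.sum()

def laws_energy_histograms(img, bins=16):
    """Histograms of absolute responses to the 9 Laws 5x5 masks.

    Masks are outer products of the L5 (level), E5 (edge), and S5 (spot)
    vectors; `bins` is an illustrative choice, not the thesis' setting.
    """
    img = np.asarray(img, dtype=float)
    L5 = np.array([1.0, 4.0, 6.0, 4.0, 1.0])
    E5 = np.array([-1.0, -2.0, 0.0, 2.0, 1.0])
    S5 = np.array([-1.0, 0.0, 2.0, 0.0, -1.0])
    windows = np.lib.stride_tricks.sliding_window_view(img, (5, 5))
    feats = []
    for a in (L5, E5, S5):
        for b in (L5, E5, S5):
            mask = np.outer(a, b)
            resp = np.abs(np.tensordot(windows, mask,
                                       axes=([2, 3], [0, 1])))
            hist, _ = np.histogram(resp, bins=bins)
            feats.append(hist / hist.sum())
    return np.concatenate(feats)  # 9 * bins values

def extract_features(img):
    """Concatenated LBP + Laws feature vector (input to an SVM)."""
    return np.concatenate([lbp_histogram(img), laws_energy_histograms(img)])
```

In practice the per-frame feature vectors and their pressure-level labels would be passed to an SVM trainer; the normalization of each histogram keeps the feature scale comparable across frames of different sizes.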

Abstract Format






Accession Number


Shelf Location

Archives, The Learning Commons, 12F Henry Sy Sr. Hall

Physical Description

1 computer optical disc ; 4 3/4 in.
