CAN UNCLASSIFIED

Defence Research and Development Canada
External Literature (P)
DRDC-RDDC-2018-P005
January 2018

Atlas-Based Segmentation of Abdominal Organs in 3D Ultrasound, and its Application in Automated Kidney Segmentation

Mahdi Marsousi and Konstantinos N. Plataniotis
Electrical and Computer Engineering Department, University of Toronto

Stergios Stergiopoulos
Defence Research and Development Canada - Toronto Research Centre

IEEE International Conference on Engineering in Medicine and Biology Society (EMBC15), pp. 2001-2005, 2015.
Date of Publication from External Publisher: August 2015


CAN UNCLASSIFIED


© Her Majesty the Queen in Right of Canada (Department of National Defence), 2018 © Sa Majesté la Reine en droit du Canada (Ministère de la Défense nationale), 2018


IMPORTANT INFORMATIVE STATEMENTS

Disclaimer: This document is not published by the Editorial Office of Defence Research and Development Canada, an agency of the Department of National Defence of Canada, but is to be catalogued in the Canadian Defence Information System (CANDIS), the national repository for Defence S&T documents. Her Majesty the Queen in Right of Canada (Department of National Defence) makes no representations or warranties, expressed or implied, of any kind whatsoever, and assumes no liability for the accuracy, reliability, completeness, currency or usefulness of any information, product, process or material included in this document. Nothing in this document should be interpreted as an endorsement for the specific use of any tool, technique or process examined in it. Any reliance on, or use of, any information, product, process or material included in this document is at the sole risk of the person so using it or relying on it. Canada does not assume any liability in respect of any damages or losses arising out of or in connection with the use of, or reliance on, any information, product, process or material included in this document.

This document was reviewed for Controlled Goods by Defence Research and Development Canada (DRDC) using the Schedule to the Defence Production Act.


Atlas-Based Segmentation of Abdominal Organs in 3D Ultrasound, and its Application in Automated Kidney Segmentation

Mahdi Marsousi1, Konstantinos N. Plataniotis2 and Stergios Stergiopoulos3

Abstract— Automated segmentation of abdominal organs in 3D ultrasound images is an important and challenging task toward computer assisted emergency diagnosis. However, speckle noise, low-contrast organ tissues, intensity-profile inhomogeneity, and partial organ visibility are ultrasound challenges which limit the utility of automated diagnosis solutions. In this paper, an atlas-based method to automatically segment an organ of interest in abdominal 3D ultrasound images is proposed. The atlas model contains texture information and shape knowledge of the organ, which facilitates an accurate discrimination of organ from non-organ voxels in input 3D ultrasound images. The proposed method offers a mechanism to automatically detect the organ, and therefore it eliminates the need for manual initialization of organ segmentation. The proposed method is applied to automatically segment the right kidney in 3D ultrasound images. The experimental results indicate that the proposed method provides higher detection and segmentation accuracy compared to the state-of-the-art.

I. INTRODUCTION

The utility of three-dimensional (3D) ultrasound imaging in medical diagnosis has increased, due to the quality improvement and faster image acquisition recently achieved in 3D ultrasound devices [1]. 3D ultrasound imagery provides a non-invasive and portable tool to visualize abdominal organs and blunt trauma. In emergency diagnosis, ultrasound imaging has advantages over magnetic resonance imaging (MRI) and computed tomography, since (1) it rapidly acquires abdominal views without imposing any side effects on the patient, and (2) owing to the portability of ultrasound imaging devices, it eliminates the need to move an unstable patient from a resuscitation room to an imaging room [2]. For computer assisted medical diagnosis, 3D ultrasound imaging offers advantages over 2D ultrasound imaging in terms of reliability, ease of use and accuracy [2]. A single 3D ultrasound image of a particular abdominal view reduces the viewing-angle dependency that exists in 2D systems, which facilitates the design of automated medical diagnosis in emergency applications [3]. Despite the advantages provided by 3D ultrasound imagery, its excessive data complexity compared to 2D ultrasound raises an immediate need for computerized tools to interpret, analyze and visualize volumetric data.

1 M. Marsousi is with the Electrical and Computer Engineering Department at the University of Toronto. [email protected]

2 K. Plataniotis is with the Electrical and Computer Engineering Department, University of Toronto. [email protected]

3 S. Stergiopoulos is with Defence Research and Development Canada - Toronto, an agency of the Canadian Department of National Defence. [email protected]

Automated organ segmentation is at the heart of computerized medical diagnosis systems. In order to provide automated organ segmentation, a robust and accurate organ detection module, which eliminates the need for manual initialization of the segmentation, is essential. Although the tasks of abdominal organ detection and segmentation in 3D MRI and CT images have been extensively investigated [4]–[6], there are relatively few methods addressing organ segmentation in 3D ultrasound images. This is because processing and analyzing ultrasound images face the following challenges:

• Ultrasound-specific challenges: speckle noise degrades the quality of images, low contrast reduces the separability of an organ from its surroundings, and inconsistent intensity and discontinuities along the organ's boundary result in poor segmentation quality [7].

• Organ-specific difficulties: the location of an organ and neighboring tissues with high beam-scattering characteristics may result in partial invisibility of the organ shape, leading to incorrect decisions [8].

• Operator-specific problems: misalignment of the ultrasound probe due to the operator's inexperience and limited training leads to partial visibility of the organ, causing a mis-characterization of the organ's shape [8].

The task of abdominal organ segmentation in 3D ultrasound images has been addressed in a few papers [8]–[11]. Martín-Fernández and Alberola-López [9] proposed a method to segment the kidney in 3D ultrasound images by combining Markov random fields and active contours (MRF-AC). The MRF applies intensity information of neighboring pixels in the contour evolution process, which reduces the segmentation sensitivity to noise. The kidney segmentation task is performed on 2D slices, and the segmented contours are then concatenated to build up a 3D kidney shape. This results in shape discontinuity along the z-axis. Yang et al. [10] proposed a method based on atlas registration and statistical texture priors to segment the prostate in transrectal ultrasound (TRUS) images. The method in [10] uses a prostate atlas to train SVM classifiers, and thereby it does not require initialization of a shape model. The atlas database comprises a set of TRUS images and ground-truth (mask) data. Each input image is registered on the atlas images to find deformation fields. The deformation fields are applied on the images and masks in the atlas database, and then SVM classifiers are trained to discriminate between prostate and non-prostate voxels. Finally, the input image is classified into prostate and non-prostate voxels by the trained SVMs. Since SVM classifiers must be trained for each input image, this method carries a massive computational load. Recently, Noll et al. [11] proposed a method to detect and segment the kidney in 3D ultrasound volumes.


In their method, the ultrasound volumes are preprocessed to reduce speckle noise and improve intensity contrast. The method in [11] starts by placing candidate nodes on some of the brightest points in the input volume. Then, a search-graph strategy is employed in which radial rays are emitted on the three coordinate planes to find zero-to-one transitions. Each radial ray with a zero-to-one transition is labeled "on", and otherwise "off". A candidate node with more than 80% of its rays labeled "on" is selected as the detected kidney center point. To segment the detected kidney, the on-rays are used to initialize the fast marching approach [12]. The fast marching output is then used to create an initial level-set function, and finally the edge-based level-set method [13] is applied to segment the kidney. This method often makes false positive detections by selecting non-kidney regions. In addition, the edge-based level-set method is highly sensitive to noise and to discontinuities along the kidney's boundary.
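For illustration, the ray-labelling rule described above reduces to a few lines. The sketch below is a simplified NumPy rendering of that criterion and is not code from [11]; the function names and the 1D binary ray profiles are assumptions made here for clarity.

```python
import numpy as np

def ray_is_on(profile):
    """A ray is labelled 'on' if its binarized intensity profile contains
    at least one zero-to-one transition, as described for [11]."""
    profile = np.asarray(profile, dtype=int)
    return bool(np.any(np.diff(profile) == 1))

def is_kidney_candidate(ray_profiles, min_on_fraction=0.8):
    """Accept a candidate node when more than ~80% of its radial rays are 'on'."""
    on_flags = [ray_is_on(p) for p in ray_profiles]
    return np.mean(on_flags) >= min_on_fraction
```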

Marsousi et al. [8] proposed a shape-based kidney detection and segmentation method. An input 3D ultrasound volume is preprocessed to address the ultrasound-specific challenges. By applying local thresholding on the preprocessed volume, a binarized volume is obtained that classifies voxels into kidney and non-kidney candidates. Then, an area-based rigid registration supporting a translational deformation is applied to find the value and location of the maximum match between a kidney shape model and the binarized volume. If the maximum match is greater than a threshold, the method decides that the kidney exists in the input ultrasound volume. The fitted shape is then deformed by rigid and elastic deformations (region-based level-set) to segment the detected kidney. Since this method relies on area-based registration, it is not robust to organ orientation and scaling.
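The translational matching step described above can be pictured as a cross-correlation between the binarized volume and a binary kidney shape template. The sketch below is an illustrative NumPy reading of that step, not the implementation of [8] (which, as noted in Section III, was written in MATLAB); the threshold value and the choice of un-normalized correlation are assumptions.

```python
import numpy as np
from scipy.signal import fftconvolve

def detect_by_template_matching(binary_volume, shape_template, threshold):
    """Correlate a binarized volume with a binary shape template over all
    translations and report a detection when the best match exceeds a threshold."""
    score = fftconvolve(binary_volume.astype(float),
                        shape_template[::-1, ::-1, ::-1].astype(float),
                        mode="same")                      # cross-correlation map
    best = score.max()
    location = np.unravel_index(np.argmax(score), score.shape)
    return best >= threshold, location, best
```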

In this paper, an atlas-based method is presented to segment abdominal organs in 3D ultrasound images. The atlas database includes a reference volume, an organ shape model, and spatially aligned neural network (SANN) classifiers. The proposed method consists of training and segmentation processes. In the training step, a set of training ultrasound volumes along with their ground-truth data are rigidly registered on the reference volume. Then, the registered training volumes are used to generate the atlas database. In the segmentation step, a feature-based rigid registration is applied to fit the organ shape of the input volume on the organ shape of the reference volume. Afterward, SANNs are applied to classify voxels into organ and non-organ candidates based on texture information, and a binarized volume is obtained. Then, the organ shape model is rigidly deformed to fit inside the binarized volume. The fitted shape model is used to initialize a level-set function. Finally, the region-based level-set method is applied to segment the organ. Compared to the method in [8], this paper offers the following contributions:

• Texture information is utilized to classify voxels into organ and non-organ candidates, instead of using local thresholding, which reduces false positive detections.

• A feature-based registration is applied to match input volumes on the organ shape in the reference volume, which improves the performance of the voxel classification.

• An affine registration based on cross-correlation is applied to fit the organ shape model on the classified voxels, and to decide whether the organ exists. This increases the segmentation robustness against organ deformations.

The rest of the paper is organized as follows. In Section II, the proposed method, including the training and segmentation steps, is introduced in detail. In Section III, experiments are presented to evaluate the proposed solution. Finally, Section IV concludes the paper.

II. METHODOLOGY

This paper introduces an approach to segment abdominal organs in 3D ultrasound images using an atlas model. The atlas model is generated through a training process and comprises an organ shape model and SANN classifiers. The atlas model is then used in the segmentation process, which comprises classification, rigid deformation, and elastic deformation. In this section, the training and segmentation processes are presented.

A. Training Process

The training process is designed to generate an atlas model which represents shape and texture information of an organ of interest. Assume a training set of ultrasound volumes, $\{V^{tr}_i \in \Re^{S_x \times S_y \times S_z} \mid i \in [1, 2, \cdots, N_{tr}]\}$, exists in which each volume entirely visualizes the organ shape. Ground-truth data, in the form of binarized masks outlining the organ shapes, are manually generated for each training volume as $\phi^{tr}_i$, where $\phi^{tr}_i(x, y, z) = 1$ for voxels belonging to the organ and $\phi^{tr}_i(x, y, z) = -1$ otherwise. We are interested in a statistical shape model representing an average organ shape and its variability. A reference shape, $\phi^{ref}$, is arbitrarily selected from the training shapes, and all the other training shapes are rigidly registered on the reference shape via similarity transformations, $\{M_i \mid i \in [1, 2, \cdots, N_{tr}]\}$. The transformation matrices $M_i$ are applied on both the training volumes and masks, yielding $\{V^{reg}_i, \phi^{reg}_i \mid i \in [1, 2, \cdots, N_{tr}]\}$. Then, the statistical shape model is generated as $\Phi = \frac{1}{N_{tr}} \sum_{n=1}^{N_{tr}} \phi^{reg}_n$ [8].
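As a concrete illustration of the shape-model construction, the following NumPy sketch averages the registered $\pm 1$ masks. It assumes the masks have already been rigidly registered as described above; the paper's own implementation is in MATLAB (see Section III), so this is only an illustrative rendering.

```python
import numpy as np

def statistical_shape_model(registered_masks):
    """Average the rigidly registered training masks (+1 inside the organ,
    -1 outside) to obtain the statistical shape model Phi of Section II-A."""
    masks = np.stack(registered_masks, axis=0)   # shape: (N_tr, Sx, Sy, Sz)
    return masks.mean(axis=0)
```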

Texture information of the organ of interest provides a means to discriminate organ from non-organ voxels. 3D Gabor filters are recognized as an effective tool to extract texture information [14], and they are utilized to extract texture features from the registered training volumes as [14],

$$\tilde{g}(\vec{X}) = g(\vec{X}) \cdot e^{\,2\pi j (F \sin\theta_y \cos\theta_z\, x + F \sin\theta_y \sin\theta_z\, y + F \cos\theta_y\, z)}, \quad (1)$$

$$g(x, y, z) = \frac{1}{(2\pi)^{3/2}\, \sigma^3} \, e^{-\frac{x^2 + y^2 + z^2}{2\sigma^2}}, \quad (2)$$

where $F$ is the radial center frequency, $\vec{X} = [x, y, z]^T$, $\sigma$ is the standard deviation which determines the Gaussian scale, and $\theta_y$ and $\theta_z$ are rotation angles about the y- and z-axes, respectively. A set of Gabor filters with $\sigma \in \{0.3, 0.6\}$, $\theta_z \in \{0, \frac{\pi}{3}, \frac{2\pi}{3}\}$ and $\theta_y \in \{0, \frac{\pi}{4}, \frac{\pi}{2}\}$ is applied on the registered training volumes, resulting in 18 volumetric features, $\{V^{reg_i}_{g_j} = V^{reg_i} * g_j \mid j \in [1, 2, \cdots, 18]\}$.
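A minimal sketch of the 18-filter bank of Eqs. (1)-(2) is given below. The radial center frequency $F$ and the kernel half-width are not stated in this excerpt, so the values used here are placeholders, and taking the magnitude of the complex response as the texture feature is likewise an assumption of this sketch.

```python
import numpy as np
from scipy.signal import fftconvolve

def gabor_bank_3d(F=0.2, half_width=5):
    """Build the 2 x 3 x 3 = 18 3D Gabor kernels of Eqs. (1)-(2).
    F and half_width are illustrative assumptions."""
    ax = np.arange(-half_width, half_width + 1, dtype=float)
    x, y, z = np.meshgrid(ax, ax, ax, indexing="ij")
    bank = []
    for sigma in (0.3, 0.6):                                  # Gaussian scales
        envelope = np.exp(-(x**2 + y**2 + z**2) / (2 * sigma**2)) \
                   / ((2 * np.pi) ** 1.5 * sigma**3)
        for theta_y in (0.0, np.pi / 4, np.pi / 2):           # rotations about y
            for theta_z in (0.0, np.pi / 3, 2 * np.pi / 3):   # rotations about z
                carrier = np.exp(2j * np.pi * F * (
                    np.sin(theta_y) * np.cos(theta_z) * x +
                    np.sin(theta_y) * np.sin(theta_z) * y +
                    np.cos(theta_y) * z))
                bank.append(envelope * carrier)
    return bank

def gabor_features(volume, bank):
    """Convolve a registered volume with each kernel (V^reg_i * g_j);
    the magnitude of the complex response is used as the feature here."""
    return [np.abs(fftconvolve(volume, g, mode="same")) for g in bank]
```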


Fig. 1. Processing pipeline of training the spatially aligned neural network (SANN) classifiers: Gabor filtering of the training volumes, extraction of sub-volumes, vectorization, concatenation of features and labels of the sub-volumes, and training of the neural network classifiers.

Fig. 2. An arbitrary organ split into Nx = 3 and Ny = 3 rectangular regions within the image domain. For ease of understanding, a 2D model is depicted.

The texture features are fed into SANN classifiers, where each classifier is trained and applied on a rectangular region that partially covers the organ of interest. The combination of the classifiers encompasses the entire organ shape (Fig. 2). Let $N_x$, $N_y$ and $N_z$ denote the numbers of spatially aligned classifiers over the x-, y- and z-axes, respectively. Also, we consider 50% pixel overlapping for aligning the rectangular regions. For an organ of size $W_x \times W_y \times W_z$ with its top-left corner point located at $B = [B_x, B_y, B_z]^T$, the boundaries of the rectangular region $\Phi_{[k_x,k_y,k_z]}$ are calculated as (a short indexing sketch is given after the list below)

$$B_x + \frac{W_x(k_x-1)}{N_x+1} \le x < B_x + \frac{W_x(k_x+1)}{N_x+1}, \quad B_y + \frac{W_y(k_y-1)}{N_y+1} \le y < B_y + \frac{W_y(k_y+1)}{N_y+1}, \quad B_z + \frac{W_z(k_z-1)}{N_z+1} \le z < B_z + \frac{W_z(k_z+1)}{N_z+1},$$

where $k_x \in [1, 2, \cdots, N_x]$, $k_y \in [1, 2, \cdots, N_y]$ and $k_z \in [1, 2, \cdots, N_z]$. For each training volume, sub-volumes of $V^{reg_i}_{g_j}$ and $\phi^{reg_i}$ are extracted as $V^{reg_i}_{g_j,[k_x,k_y,k_z]}$ and $\phi^{reg_i}_{[k_x,k_y,k_z]}$. In the next step, feature vectors $f^{reg_i}_{g_j,[k_x,k_y,k_z]}$ and label vectors $l^{reg_i}_{[k_x,k_y,k_z]}$ are obtained by vectorizing $V^{reg_i}_{g_j,[k_x,k_y,k_z]}$ and $\phi^{reg_i}_{[k_x,k_y,k_z]}$, respectively. Then, a feature matrix $f^{reg_i}_{[k_x,k_y,k_z]}$ is formed by concatenating the feature vectors. To form the total feature matrix $f_{[k_x,k_y,k_z]}$, all the features are vertically concatenated as $f_{[k_x,k_y,k_z]} = [f^{reg_1}_{[k_x,k_y,k_z]}; f^{reg_2}_{[k_x,k_y,k_z]}; \cdots; f^{reg_{N_{tr}}}_{[k_x,k_y,k_z]}]$. The total label vector $l_{[k_x,k_y,k_z]}$ is obtained by vertical concatenation of the labels as $l_{[k_x,k_y,k_z]} = [l^{reg_1}_{[k_x,k_y,k_z]}; l^{reg_2}_{[k_x,k_y,k_z]}; \cdots; l^{reg_{N_{tr}}}_{[k_x,k_y,k_z]}]$. Finally, for each sub-volume, a neural network classifier $NET_{[k_x,k_y,k_z]}$ is trained with the total feature matrix $f_{[k_x,k_y,k_z]}$ and the total label vector $l_{[k_x,k_y,k_z]}$. This architecture, shown in Fig. 1, provides two advantages:

• Texture information varies considerably throughout the entire organ shape, and training a single classifier leads to poor sensitivity in detecting voxels of the organ of interest. Using multiple spatially aligned classifiers therefore improves the segmentation sensitivity [15].

• By spatially aligning the classifiers, irrelevant voxels, which lie outside the spatial range of the classifiers, are not entered into classification, resulting in fewer false positive detections.
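The 50%-overlap partition defined above can be written compactly as follows. Rounding the region bounds to integer voxel indices is an assumption not specified in the paper, and the helper names are illustrative only.

```python
import numpy as np

def region_bounds(B, W, N, k):
    """Half-open [lo, hi) bounds along one axis for region index k (1-based),
    following Section II-A: B + W*(k-1)/(N+1) <= coord < B + W*(k+1)/(N+1)."""
    lo = B + W * (k - 1) / (N + 1)
    hi = B + W * (k + 1) / (N + 1)
    return int(np.floor(lo)), int(np.ceil(hi))

def extract_subvolume(volume, B, W, N, k):
    """Crop the sub-volume assigned to classifier NET_[kx,ky,kz].
    B = (Bx, By, Bz) corner, W = (Wx, Wy, Wz) organ extent,
    N = (Nx, Ny, Nz) classifier counts, k = (kx, ky, kz) 1-based indices."""
    slices = tuple(slice(*region_bounds(B[d], W[d], N[d], k[d])) for d in range(3))
    return volume[slices]
```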

B. Segmentation Process

In this sub-section, we introduce how the generated atlas database is applied to automatically segment the abdominal organ in input 3D ultrasound images (Fig. 3). In order to achieve an automated segmentation method, an organ detection process is needed that (1) decides whether the organ exists in the input 3D image, and (2) finds the alignment of the detected organ with respect to the reference shape. Assume $V^{in}$ is an input 3D ultrasound image. First, the voxels of $V^{in}$ are classified into organ and non-organ candidates using the trained SANN classifiers, $NET_{[k_x,k_y,k_z]}$, and $V^{cl}$ is obtained. Then, the generated shape model, $\Phi$, is rigidly registered on $V^{cl}$ using an affine transformation, and $\Phi^{reg}$ is obtained. If the maximum cross-correlation, $\Gamma$, is greater than a threshold level, the organ detection process decides that the organ of interest exists in $V^{in}$. Then, $\Phi^{reg}$ is used to initialize a level-set function, and finally the level-set function is propagated using the region-based level-set approach until the organ of interest is segmented [8].

1) Organ Detection: The organ detection process starts with denoising $V^{in}$ by the 3D Gaussian-Hamming finite impulse response (FIR) filter introduced in [8], and $V^{dn}$ is obtained. Before applying the SANN classifiers, an existing organ of interest in $V^{in}$ should be correctly aligned within the boundaries of the SANNs; otherwise its voxels are not correctly classified. Hence, a feature-based registration based on an affine transformation is applied to align the organ (if it exists) in $V^{in}$ on $\phi^{ref}$. First, we need to select some featured points on the reference shape, and then extract features of the selected points. The featured points are selected on $\phi^{ref}$ as $\{[x_k, y_k, z_k]^T \mid \phi^{ref}(x_k, y_k, z_k) > 0\}$, where $k \in [1, 2, \cdots, N_{fe}]$. The $N_{fe}$ featured points are selected on $\phi^{ref}$ with equal distances from each other. Then, for each featured point, its feature is extracted as $f_{e_k} = \{V^{ref}(n, m, l) \mid x_k - w_{fe} \le n \le x_k + w_{fe},\; y_k - w_{fe} \le m \le y_k + w_{fe},\; z_k - w_{fe} \le l \le z_k + w_{fe}\}$, where $w_{fe}$ is the half-width of the features. Afterward, for each extracted feature $f_{e_k}$, we search for a corresponding point, $[\hat{x}_k, \hat{y}_k, \hat{z}_k]^T$, on $V^{dn}$ based on a maximum cross-correlation metric. Having the pairs of points, the feature-based registration problem is formulated as

$$\min_{\vec{p}_{aff}} \sum_{k=1}^{N_{fe}} \left\| \begin{bmatrix} \hat{x}_k \\ \hat{y}_k \\ \hat{z}_k \end{bmatrix} - T_{\vec{p}_{aff}} \begin{bmatrix} x_k \\ y_k \\ z_k \end{bmatrix} \right\|^2, \quad (3)$$

$$T_{\vec{p}_{aff}} = \begin{bmatrix} s_x \cdot r_{11} & r_{12} & r_{13} & t_x \\ r_{21} & s_y \cdot r_{22} & r_{23} & t_y \\ r_{31} & r_{32} & s_z \cdot r_{33} & t_z \\ \vec{0}^T & & & 1 \end{bmatrix}, \qquad R_{\theta_x} \cdot R_{\theta_y} \cdot R_{\theta_z} = \begin{bmatrix} r_{11} & r_{12} & r_{13} \\ r_{21} & r_{22} & r_{23} \\ r_{31} & r_{32} & r_{33} \end{bmatrix}, \quad (4)$$

where $\vec{p}_{aff} = [t_x, t_y, t_z, s_x, s_y, s_z, \theta_x, \theta_y, \theta_z]$ is the parameter vector of the affine transformation, $t_x, t_y, t_z$ are the translation parameters, $s_x, s_y, s_z$ are the scaling parameters, and $\theta_x, \theta_y, \theta_z$ are the orientation parameters. $R_{\theta_x}$, $R_{\theta_y}$ and $R_{\theta_z}$ are the 3D rotation matrices about the x-, y- and z-axes, respectively. Equation (3) is solved using the method proposed by Horn [16]. The calculated affine transformation is then applied on $V^{dn}$ to align its organ (if it exists) on the reference shape, and $V^{reg}$ is obtained.
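The transform of Eq. (4) can be assembled directly from the nine parameters. The sketch below follows the equation as written, including the detail that the scaling factors multiply only the diagonal entries of $R$; the rotation-matrix sign conventions are standard assumptions, since the paper does not spell them out.

```python
import numpy as np

def rot_x(t):
    c, s = np.cos(t), np.sin(t)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(t):
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(t):
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def affine_matrix(p_aff):
    """Homogeneous 4x4 transform of Eq. (4) from
    p_aff = [tx, ty, tz, sx, sy, sz, theta_x, theta_y, theta_z].
    As written in Eq. (4), scaling multiplies only the diagonal of R."""
    tx, ty, tz, sx, sy, sz, ax, ay, az = p_aff
    R = rot_x(ax) @ rot_y(ay) @ rot_z(az)
    A = R.copy()
    A[0, 0] *= sx
    A[1, 1] *= sy
    A[2, 2] *= sz
    T = np.eye(4)
    T[:3, :3] = A
    T[:3, 3] = [tx, ty, tz]
    return T
```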

In order to classify the voxels of the registered volume, $V^{reg}$, into organ and non-organ candidates, its texture features are extracted using equation (1) with $\sigma \in \{0.3, 0.6\}$, $\theta_z \in \{0, \frac{\pi}{3}, \frac{2\pi}{3}\}$ and $\theta_y \in \{0, \frac{\pi}{4}, \frac{\pi}{2}\}$, resulting in 18 volumetric features, $\{V^{reg}_{g_j} \mid j \in [1, 2, \cdots, 18]\}$. Then, all the sub-volumes are extracted and vectorized to form feature vectors $f_{[k_x,k_y,k_z]}$, where $k_x \in [1, 2, \cdots, N_x]$, $k_y \in [1, 2, \cdots, N_y]$ and $k_z \in [1, 2, \cdots, N_z]$. Afterward, the feature vectors are fed into their corresponding classifiers, $NET_{[k_x,k_y,k_z]}$, and label vectors $l_{[k_x,k_y,k_z]}$ are obtained. Then, the label vectors are reshaped to form sub-volumes $Y_{[k_x,k_y,k_z]}$, and by combining the sub-volumes, adding the overlapped sub-volumes, and dividing the result by 8, the classified volume $V^{cl}$ is obtained, where $0 \le V^{cl}(x, y, z) \le 1$.
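The recombination of the per-region classifier outputs into $V^{cl}$ can be sketched as an accumulate-and-divide step. How the network outputs are produced and reshaped is left abstract here; the division by 8 follows the description above (with 50% overlap, each interior voxel is covered by eight regions), and the data layout is an assumption of this sketch.

```python
import numpy as np

def combine_subvolume_outputs(sub_outputs, bounds, volume_shape):
    """Recombine per-region soft labels Y_[kx,ky,kz] into V_cl.
    sub_outputs: list of 3D arrays with values in [0, 1];
    bounds: matching list of ((x0, x1), (y0, y1), (z0, z1)) index ranges."""
    acc = np.zeros(volume_shape, dtype=float)
    for Y, ((x0, x1), (y0, y1), (z0, z1)) in zip(sub_outputs, bounds):
        acc[x0:x1, y0:y1, z0:z1] += Y
    return acc / 8.0          # overlapped contributions averaged as in Section II-B
```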

The organ detection process is finalized by performing an area-based registration to fit the shape model on $V^{cl}$. It searches for an affine transform, $T_{\vec{p}_{aff}}$, that maximizes the cross-correlation between $V^{cl}$ and the transformed shape model, $\Phi(T'_{\vec{p}_{aff}} [x, y, z]^T)$, where $T'$ and $\vec{p}_{aff}$ are defined in equation (3). This registration problem is formulated as

$$\{\vec{p}_{aff}, \vec{X}_c, \Gamma\} = \max_{\vec{p}_{aff}} \max_{\vec{X}_c \in \Omega} \sum_{\vec{Y} \in \Omega_\Phi} V^{cl}(\vec{X}_c + \vec{Y}) \, \Phi(T'_{\vec{p}_{aff}} \vec{Y}), \quad (5)$$

where $\Gamma$ is the maximum cross-correlation of $V^{cl}$ and the deformed $\Phi$, and $\vec{X}_c = [x_c, y_c, z_c]^T$ is the candidate organ shape center. $\Omega$ and $\Omega_\Phi$ are the domains of the input ultrasound volume and the organ shape model, respectively. In order to accept the fitted shape model as a detected organ, the maximum cross-correlation should be higher than the threshold value, $\Gamma > \Gamma_{min}$. The gradient descent method is used to solve equation (5), in which 12 updating vectors are utilized to iteratively update $\vec{p}_{aff}$ toward a sub-optimal solution. The updating vectors perturb the scaling and orientation parameters by $\pm\delta$: $\vec{e}_{1,2} = [\pm\delta s_x, 0, 0, 0, 0, 0]$, $\vec{e}_{3,4} = [0, \pm\delta s_y, 0, 0, 0, 0]$, $\vec{e}_{5,6} = [0, 0, \pm\delta s_z, 0, 0, 0]$, $\vec{e}_{7,8} = [0, 0, 0, \pm\delta\theta_x, 0, 0]$, $\vec{e}_{9,10} = [0, 0, 0, 0, \pm\delta\theta_y, 0]$ and $\vec{e}_{11,12} = [0, 0, 0, 0, 0, \pm\delta\theta_z]$.
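The search of Eq. (5) can be sketched as a greedy $\pm\delta$ perturbation of the scaling and orientation parameters, with the inner maximization over the shape centre handled by a volumetric cross-correlation. This is an illustrative reading of the procedure, not the authors' MATLAB code; the starting parameters, step sizes, iteration cap, and interpolation order are assumptions.

```python
import numpy as np
from scipy.ndimage import affine_transform
from scipy.signal import fftconvolve

def fit_shape_model(V_cl, Phi, build_affine, p0, deltas, max_iter=50):
    """Greedy +/-delta search sketch for Eq. (5).
    build_affine: callable returning the 4x4 matrix of Eq. (4) from p_aff
                  (e.g. the affine_matrix helper sketched earlier).
    p0: starting parameters, typically unit scales and zero angles.
    deltas: list of (parameter index, delta) pairs for the three scaling and
            three orientation parameters, i.e. the 12 updating vectors,
            e.g. [(3, .05), (4, .05), (5, .05), (6, .1), (7, .1), (8, .1)]."""

    def score(p):
        A = build_affine(p)[:3, :3]
        warped = affine_transform(Phi, np.linalg.inv(A), order=1)   # deformed shape model
        corr = fftconvolve(V_cl, warped[::-1, ::-1, ::-1], mode="same")  # cross-correlation
        centre = np.unravel_index(np.argmax(corr), corr.shape)      # best X_c
        return corr[centre], centre

    p = np.asarray(p0, dtype=float)
    gamma, X_c = score(p)
    for _ in range(max_iter):
        improved = False
        for j, d in deltas:
            for step in (+d, -d):
                q = p.copy()
                q[j] += step
                g, c = score(q)
                if g > gamma:
                    gamma, X_c, p, improved = g, c, q, True
        if not improved:
            break
    return p, X_c, gamma      # fitted p_aff, candidate centre, Gamma
```

The detection decision then reduces to comparing the returned $\Gamma$ against $\Gamma_{min}$.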

Fig. 3. Block diagram of the proposed organ segmentation. The input volume $V^{in}$ is preprocessed to give $V^{dn}$; feature-based registration (select featured points on $V^{ref}$, extract features, find corresponding points, calculate and apply the affine transformation $T_{\vec{p}_{aff}}$) yields $V^{reg}$; 3D Gabor feature extraction, the SANN classifiers, and vectorization and concatenation produce $V^{cl}$; affine registration of the shape model $\Phi$ gives $\Phi^{reg}$ and $\Gamma$; if $\Gamma > \Gamma_{min}$ (organ detection), level-set initialization and propagation output the segmented organ.

III. EXPERIMENTS AND RESULTS

The proposed method is applied to automatically segment the right kidney in abdominal 3D ultrasound images. We set up experiments to evaluate the accuracy of the proposed approach compared to the existing 3D kidney segmentation methods, including MRF-AC [9], Marsousi et al.-EMBC14 [8], and Noll et al. [11]. The proposed approach is implemented using parallel processing in MATLAB R2014b (parfor) on a processor with 8 cores (3 MHz). All the other methods are also developed in MATLAB. A set of 36 ultrasound volumes is used, acquired with a GE Voluson e ultrasound machine from 8 healthy male and female volunteers; each volume has a size of 178 × 250 × 178 voxels. 21 volumes are acquired from the right upper quadrant view of the eFAST exam [17], each of which visualizes the right kidney ("with-kidney"). The other volumes are randomly acquired from other abdominal ultrasound views ("without-kidney"). We split the volumes into training and evaluation sets. The training set contains 6 volumes used to create the atlas database. The evaluation set consists of 30 volumes, of which 15 are with-kidney and the other 15 are without-kidney. For each with-kidney volume, a ground truth, GT, in the form of a 3D binarized mask, is manually drawn using TurtleSeg [18].

To evaluate the kidney detection methods, we use the accuracy measure $Acc_{KD} = 100\% \cdot \frac{N_{TP} + N_{TN}}{N_{TP} + N_{TN} + N_{FP} + N_{FN}}$, where $N_{TP}$, $N_{TN}$, $N_{FP}$ and $N_{FN}$ are the numbers of true positive, true negative, false positive and false negative kidney detections, respectively. To evaluate the kidney segmentation methods, we use Dice's coefficient $DSC = \frac{2TP}{2TP + FN + FP}$, the accuracy measure $ACC(\%) = 100\% \cdot \frac{TP + TN}{TP + TN + FP + FN}$, and the mean distance $MD = \frac{1}{|AS|} \int_{p' \in AS} e(p', GT)\, dp'$, where $e(p, GT)$ is the minimum L2 norm from a voxel $p$ to $GT$. Here $TP$, $TN$, $FP$ and $FN$ are the true positive, true negative, false positive and false negative segmentation regions, respectively.

We applied the evaluation set to assess the kidney detection accuracy of the proposed method, Marsousi et al.-EMBC14, and Noll et al. The results are shown in Table I. Accordingly, the proposed method provides the highest detection accuracy, without any false positive detections. The kidney segmentation results of the proposed method, MRF-AC, Marsousi et al.-EMBC14, and Noll et al. for the 15 with-kidney volumes in the evaluation set are presented in Table II. Accordingly, the proposed method provides the highest Dice's coefficient, $DSC = 0.51 \pm 0.17$, the highest accuracy, $ACC(\%) = 94.01 \pm 1.93$, and the smallest mean distance error, $MD = 3.84 \pm 2.12$. Compared to Marsousi et al., the proposed method provides a higher segmentation accuracy due to the use of SANN classifiers and the affine registration of the shape model on $V^{cl}$. Fig. 4 shows the segmentation results of a volume in the evaluation set.

TABLE I
COMPARING ACCURACY OF THE PROPOSED KIDNEY DETECTION METHOD WITH MARSOUSI et al.-EMBC14 AND NOLL et al. [11].

Method              N_TP   N_TN   N_FP   N_FN   Acc_KD (%)
Noll et al.           8     11      3      8      63.33
Marsousi et al.      11     14      4      1      83.33
Proposed Method      12     14      0      4      86.67

TABLE II
COMPARING µ AND σ OF THE DSC, ACC AND MD METRICS OF THE SEGMENTATION METHODS: THE PROPOSED METHOD, MARSOUSI et al.-EMBC14, NOLL et al. AND MRF-AC.

                      DSC             ACC (%)           MD
Method              µ      σ        µ       σ        µ       σ
Marsousi et al.    0.41   0.08     93.71   1.35     4.21    2.80
Noll et al.        0.34   0.07     88.07   0.02    13.99    3.25
MRF-AC             0.48   0.16     92.93   2.45     6.15    5.04
Proposed Method    0.51   0.17     94.01   1.93     3.84    2.12

Fig. 4. Kidney segmentation results for one volume in the evaluation set (Marsousi et al.: DSC = 0.48; Noll et al.: DSC = 0.47; MRF-AC: DSC = 0.62; Proposed Method: DSC = 0.50). The green, red and yellow regions show the ground truth, the automated segmentation, and their overlap, respectively.

IV. CONCLUSION

This paper introduces an automated organ segmentation method for abdominal 3D ultrasound images. The atlas database represents both shape and texture knowledge of an organ of interest. The proposed method consists of training and segmentation processes. In the training process, an organ shape model is generated from manually segmented shapes. Then, texture features are extracted using 3D Gabor filters, and SANN classifiers are trained to discriminate organ from non-organ voxels. In the segmentation process, an input volume is first registered on the reference volume using a feature-based registration method. Then, features are extracted with the 3D Gabor filters, and the trained classifiers are applied to separate organ from non-organ voxels. Next, the shape model is rigidly registered on the classified volume, which decides whether the organ exists. Finally, an elastic deformation using the region-based level-set method is applied to segment the organ. The proposed method is applied to kidney segmentation in abdominal 3D ultrasound images. The accuracy of the proposed method is compared to the state-of-the-art, and the reported results confirm the superiority of the proposed method. As a direction for future research, we will investigate the utility of learned filters.

REFERENCES

[1] S. Stergiopoulos and S. Yi-Ting, "Portable 4D ultrasound diagnostic imaging system," in Proc. IEEE UFFCS, 2011.

[2] M. Christie-Large, D. Michaelides, and S. L. J. James, "Focused assessment with sonography for trauma: the FAST scan," Trauma, vol. 10, no. 2, pp. 93–101, 2008.

[3] M. Riccabona, T. R. Nelson, and D. H. Pretorius, "Three-dimensional ultrasound: accuracy of distance and volume measurements," Ultrasound in Obstetrics & Gynecology, vol. 7, no. 6, pp. 429–434, 1996.

[4] X. Chen and U. Bagci, "3D automatic anatomy segmentation based on iterative graph-cut-ASM," Med. Phys., vol. 38, no. 8, pp. 4610–22, 2011.

[5] K. Li and B. Fei, "A new 3D model-based minimal path segmentation method for kidney MR images," in Int. Conf. Bioinfo. and Biomed. Eng. (ICBBE), 2008, pp. 2342–44.

[6] R. Cuingnet, R. Prevost, D. Lesage, L. D. Cohen, B. Mory, and R. Ardon, "Automatic detection and segmentation of kidneys in 3D CT images using random forests," in Med. Image Comp. and Comp.-Assisted Interv. (MICCAI 2012). Springer, 2012, vol. 7512, pp. 66–74.

[7] A. Belaid, D. Boukerroui, Y. Maingourd, and J. F. Lerallut, "Implicit active contours for ultrasound image segmentation driven by phase information and local maximum likelihood," in IEEE Int. Symp. on Biomed. Imag.: From Nano to Macro, 2011, pp. 630–635.

[8] M. Marsousi, K. N. Plataniotis, and S. Stergiopoulos, "Shape-based kidney detection and segmentation in three-dimensional abdominal ultrasound images," in Proc. IEEE Eng. Med. Biol. Soc., Aug. 2014, pp. 2890–94.

[9] M. Martín-Fernández and C. Alberola-López, "An approach for contour detection of human kidneys from ultrasound images using Markov random fields and active contours," Med. Image Anal., pp. 1–23, 2005.

[10] X. Yang, D. Schuster, V. Master, P. Nieh, A. Fenster, and B. Fei, "Automatic 3D segmentation of ultrasound images using atlas registration and statistical texture prior," in SPIE Med. Imag. International Society for Optics and Photonics, 2011, pp. 796432–796432.

[11] M. Noll, X. Li, and S. Wesarg, "Automated kidney detection and segmentation in 3D ultrasound," in Clinical Image-Based Proc.: Translational Research in Medical Imag. Springer, 2014, pp. 83–90.

[12] N. Forcadel, C. L. Guyader, and C. Gout, "Generalized fast marching method: applications to image segmentation," Numerical Algo., vol. 48, no. 1-3, pp. 189–211, 2008.

[13] R. Malladi, J. A. Sethian, and B. C. Vemuri, "Shape modeling with front propagation: A level set approach," IEEE Trans. Pattern Anal. and Mach. Intell., vol. 17, no. 2, pp. 158–175, 1995.

[14] Y. Wang and C.-S. Chua, "Face recognition from 2D and 3D images using 3D Gabor filters," Image and Vision Computing, vol. 23, no. 11, pp. 1018–1028, 2005.

[15] Y. Zhan and D. Shen, "Deformable segmentation of 3-D ultrasound prostate images using statistical texture matching method," IEEE Transactions on Med. Imag., vol. 25, no. 3, pp. 256–272, March 2006.

[16] B. K. Horn, "Closed-form solution of absolute orientation using unit quaternions," JOSA A, vol. 4, no. 4, pp. 629–642, 1987.

[17] G. S. Rozycki, M. G. Ochsner, D. V. Feliciano, B. D. Thomas, B. R. Boulanger, F. E. Davis, R. E. Falcone, and J. A. Schmidt, "Early detection of hemoperitoneum by ultrasound examination of the right upper quadrant: A multicenter study," The J. of Trauma: Injury, Infection, and Critical Care, vol. 45, no. 5, pp. 878–883, 1998.

[18] A. Top, G. Hamarneh, and R. Abugharbieh, "Active learning for interactive 3D image segmentation," in Med. Image Comp. and Comp.-Assisted Interv. (MICCAI), ser. LNCS, vol. 6893. Springer, 2011, pp. 603–610.


DOCUMENT CONTROL DATA (Security markings for the title, abstract and indexing annotation must be entered when the document is Classified or Designated)

1. ORIGINATOR (The name and address of the organization preparing the document. Organizations for whom the document was prepared, e.g., Centre sponsoring a contractor's report, or tasking agency, are entered in Section 8.) DRDC – Toronto Research Centre Defence Research and Development Canada 1133 Sheppard Avenue West P.O. Box 2000 Toronto, Ontario M3M 3B9 Canada

2a. SECURITY MARKING (Overall security marking of the document including special supplemental markings if applicable.)

CAN UNCLASSIFIED

2b. CONTROLLED GOODS

NON-CONTROLLED GOODS DMC A

3. TITLE (The complete document title as indicated on the title page. Its classification should be indicated by the appropriate abbreviation (S, C or U) in parentheses after the title.) Atlas-Based Segmentation of Abdominal Organs in 3D Ultrasound, and its Application in Automated Kidney Segmentation

4. AUTHORS (last name, followed by initials – ranks, titles, etc., not to be used) Marsousi, Mahdi ; Plataniotis, Konstantinos N.; Stergiopoulos, Stergios

5. DATE OF PUBLICATION (Month and year of publication of document.) January 2018

6a. NO. OF PAGES (Total containing information, including Annexes, Appendices, etc.)

5

6b. NO. OF REFS (Total cited in document.)

18

7. DESCRIPTIVE NOTES (The category of the document, e.g., technical report, technical note or memorandum. If appropriate, enter the type of report, e.g., interim, progress, summary, annual or final. Give the inclusive dates when a specific reporting period is covered.) External Literature (P)

8. SPONSORING ACTIVITY (The name of the department project office or laboratory sponsoring the research and development – include address.) DRDC – Toronto Research Centre Defence Research and Development Canada 1133 Sheppard Avenue West P.O. Box 2000 Toronto, Ontario M3M 3B9 Canada

9a. PROJECT OR GRANT NO. (If appropriate, the applicable research and development project or grant number under which the document was written. Please specify whether project or grant.)

9b. CONTRACT NO. (If appropriate, the applicable number under which the document was written.)

10a. ORIGINATOR’S DOCUMENT NUMBER (The official document number by which the document is identified by the originating activity. This number must be unique to this document.) DRDC-RDDC-2018-P005

10b. OTHER DOCUMENT NO(s). (Any other numbers which may be assigned this document either by the originator or by the sponsor.)

11a. FUTURE DISTRIBUTION (Any limitations on further dissemination of the document, other than those imposed by security classification.)

Public release

11b. FUTURE DISTRIBUTION OUTSIDE CANADA (Any limitations on further dissemination of the document, other than those imposed by security classification.)


12. ABSTRACT (A brief and factual summary of the document. It may also appear elsewhere in the body of the document itself. It is highly desirable that the abstract of classified documents be unclassified. Each paragraph of the abstract shall begin with an indication of the security classification of the information in the paragraph (unless the document itself is unclassified) represented as (S), (C), (R), or (U). It is not necessary to include here abstracts in both official languages unless the text is bilingual.)

Automated segmentation of abdominal organs in 3D ultrasound images is an important and challenging task toward computer assisted emergency diagnosis. However, speckle noise, low-contrast organ tissues, intensity-profile inhomogeneity, and partial organ visibility are ultrasound challenges which limit the utility of automated diagnosis solutions. In this paper, an atlas-based method to automatically segment an organ of interest in abdominal 3D ultrasound images is proposed. The atlas model contains texture information and shape knowledge of the organ, which facilitates an accurate discrimination of organ from non-organ voxels in input 3D ultrasound images. The proposed method offers a mechanism to automatically detect the organ, and therefore it eliminates the need for manual initialization of organ segmentation. The proposed method is applied to automatically segment the right kidney in 3D ultrasound images. The experimental results indicate that the proposed method provides higher detection and segmentation accuracy compared to the state-of-the-art. ___________________________________________________________________________

13. KEYWORDS, DESCRIPTORS or IDENTIFIERS (Technically meaningful terms or short phrases that characterize a document and could be helpful in cataloguing the document. They should be selected so that no security classification is required. Identifiers, such as equipment model designation, trade name, military project code name, geographic location may also be included. If possible keywords should be selected from a published thesaurus, e.g., Thesaurus of Engineering and Scientific Terms (TEST) and that thesaurus identified. If it is not possible to select indexing terms which are Unclassified, the classification of each should be indicated as with the title.) 3D Ultrasound Imaging, Atlas-Based Organ Segmentation, Kidney Detection

