Hi all,

These last two weeks have been very busy, mainly with finalizing the testing of the code. There have been unforeseen roadblocks during the testing, especially regarding the handling of singularities. It hasn't been easy, since this handling has to be consistent with the desired segmentation task, especially when taking into account the variety of input images that one can use. Below is an example of the test for a grey-scale image, such as a T1 image of the brain:
import numpy as np
import numpy.testing as npt
import matplotlib.pyplot as plt

from dipy.segment.mrf import (ConstantObservationModel,
                              IteratedConditionalModes)

# image, nclasses, beta and max_iter are defined at module level in the
# test script (the T1 volume, the number of tissue classes, the Ising
# weight and the number of EM/ICM iterations).

def test_greyscale_iter():

    com = ConstantObservationModel()
    icm = IteratedConditionalModes()

    # Initial parameters and maximum-likelihood segmentation
    mu, sigma = com.initialize_param_uniform(image, nclasses)
    sigmasq = sigma ** 2
    neglogl = com.negloglikelihood(image, mu, sigmasq, nclasses)
    initial_segmentation = icm.initialize_maximum_likelihood(neglogl)
    npt.assert_equal(initial_segmentation.max(), nclasses - 1)
    npt.assert_equal(initial_segmentation.min(), 0)

    # Class statistics recomputed from the initial segmentation
    mu, sigma, sigmasq = com.seg_stats(image, initial_segmentation, nclasses)
    npt.assert_equal((mu >= 0).all(), True)
    npt.assert_equal((sigmasq >= 0).all(), True)

    final_segmentation = np.empty_like(image)
    seg_init = initial_segmentation.copy()

    # EM parameter updates alternating with ICM segmentation
    for i in range(max_iter):
        print('iteration: ', i)

        PLN = com.prob_neighborhood(image, initial_segmentation, beta,
                                    nclasses)
        npt.assert_equal((PLN >= 0.0).all(), True)
        PLY = com.prob_image(image, nclasses, mu, sigmasq, PLN)
        npt.assert_equal((PLY >= 0.0).all(), True)

        mu_upd, sigmasq_upd = com.update_param(image, PLY, mu, nclasses)
        npt.assert_equal((mu_upd >= 0.0).all(), True)
        npt.assert_equal((sigmasq_upd >= 0.0).all(), True)

        negll = com.negloglikelihood(image, mu_upd, sigmasq_upd, nclasses)
        npt.assert_equal((negll >= 0.0).all(), True)

        plt.figure()
        plt.imshow(negll[..., 1, 0])
        plt.colorbar()

        final_segmentation, energy = icm.icm_ising(negll, beta,
                                                   initial_segmentation)

        initial_segmentation = final_segmentation.copy()
        mu = mu_upd.copy()
        sigmasq = sigmasq_upd.copy()

    # The segmentation must have changed from the initial one
    difference_map = np.abs(seg_init - final_segmentation)
    npt.assert_equal(np.abs(np.sum(difference_map)) != 0, True)

    return seg_init, final_segmentation, PLY
Basically, here I am checking that the input is the right one, that the outputs of the functions that compute the probabilities (PLN and PLY) lie between 0 and 1, and that the parameters (means and variances) are being updated accordingly. At the end I make sure that the final segmentation returned is different from the initial segmentation that is fed to the Expectation Maximization (EM) algorithm embedded in the for loop. As can be seen within the loop, the EM algorithm alternates with the ICM (Iterated Conditional Modes) segmentation method, and in each iteration the parameters (means and variances) get updated. Right now, the algorithm is performing well only up to a certain number of iterations; as mentioned above, this is due to the singularity handling (a small sketch of the kind of guard I mean is at the end of this post). I will keep testing the algorithm over the next few days to make it as robust as possible. I should be starting the validation of the segmentation on Friday and will give a heads-up on how it goes. If the results are as expected, I will be ready to try the algorithm not only on T1 images of the brain but also on diffusion-derived scalar maps such as the “Power-maps”.
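A note on the singularities mentioned above: when one of the classes collapses onto a nearly constant intensity, its variance goes towards zero and the Gaussian negative log-likelihood blows up. The snippet below is only a minimal, standalone sketch of the kind of guard I mean, not the DIPY code; the function name, the eps value and the array shapes are assumptions for illustration. The idea is simply to floor the variances at a small epsilon before evaluating the negative log-likelihood.

import numpy as np

def negloglikelihood_guarded(image, mu, sigmasq, nclasses, eps=1e-8):
    # Illustrative only: floor the class variances at a small epsilon so
    # that a class with (near) zero variance does not produce divisions
    # by zero or infinite values in the log-likelihood.
    mu = np.asarray(mu, dtype=float)
    sigmasq = np.maximum(np.asarray(sigmasq, dtype=float), eps)
    negll = np.zeros(image.shape + (nclasses,))
    for k in range(nclasses):
        # Gaussian negative log-likelihood of every voxel under class k
        negll[..., k] = (0.5 * np.log(2.0 * np.pi * sigmasq[k]) +
                         (image - mu[k]) ** 2 / (2.0 * sigmasq[k]))
    return negll

Applying the same kind of epsilon floor to sigmasq_upd inside the loop, before calling negloglikelihood, would be one way to keep the iterations stable beyond the point where they currently break down.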