Authors: Yong-Hwan Lee and Tony Storey
This project simulates the Expectation Maximization (EM) algorithm for removing Poisson noise from medical images. The EM algorithm is particularly useful in this context because it allows for the estimation of the underlying image by iteratively maximizing the likelihood function, which accounts for the statistical nature of Poisson noise. Given the quantum nature of particles and their discrete arrival times, Poisson noise often manifests in medical imaging, especially in modalities like X-ray Computed Tomography (CT). This noise can lead to significant artifacts in the reconstructed images, which might result in diagnostic inaccuracies.
To mitigate these issues, the EM algorithm is applied as it effectively separates the noise from the actual signal in the image data. By modeling the image acquisition process and noise characteristics, the algorithm iteratively refines the image estimate, ultimately converging on a solution that minimizes the impact of noise.
More details can be found in the report: PDF
The distribution of the detected counts is Poisson: for each measurement $i$,

$$y_i \sim \mathrm{Poisson}\Big(\sum_{j=1}^{n} a_{ij}\, p_j\Big), \qquad i = 1, \dots, m.$$

Here, $p_j$ denotes the unknown absorption coefficient of pixel $j$, and $a_{ij}$ is the weight with which pixel $j$ contributes to measurement $i$ (the system matrix $A$). Then, the observations $y_1, \dots, y_m$ are mutually independent Poisson random variables.
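To make the model concrete, noisy observations can be simulated directly in MATLAB. The sketch below is not part of the repository: the sizes, the random system matrix A, and the true coefficients p are placeholder assumptions, and poissrnd requires the Statistics and Machine Learning Toolbox.

% Minimal sketch (assumed values): draw Poisson observations from the model.
A = rand(11, 9);            % placeholder system matrix: 11 rays, 9 pixels
p = [1 2 1 2 5 2 1 2 1]';   % placeholder true absorption coefficients
y = poissrnd(A*p);          % y_i ~ Poisson( sum_j a_ij * p_j )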
Each pixel represents an absorption coefficient. Below is an example of the voxel model, shown once with the horizontal ray measurements and once with the vertical ones:
Horizontal rays (entering from the left):

          *--------------------*
  y3 ---> |  p1  |  p2  |  p3  |
          *--------------------*
  y2 ---> |  p4  |  p5  |  p6  |
          *--------------------*
  y1 ---> |  p7  |  p8  |  p9  |
          *--------------------*

Vertical rays (entering from the top):

     y9     y10    y11
     ||     ||     ||
     \/     \/     \/
  *--------------------*
  |  p1  |  p2  |  p3  |
  *--------------------*
  |  p4  |  p5  |  p6  |
  *--------------------*
  |  p7  |  p8  |  p9  |
  *--------------------*
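For the six rays drawn above, one possible system matrix is sketched below. It assumes each ray crosses its pixels with unit weight and orders the measurements as y3, y2, y1, y9, y10, y11; both choices are illustrative assumptions, not taken from the repository.

% Sketch of a system matrix for the 3x3 voxel model, assuming unit ray weights.
% Pixels are ordered p1..p9 row by row, matching the diagrams.
A_h = kron(eye(3), ones(1,3));  % horizontal rays: each sums one row of pixels
A_v = kron(ones(1,3), eye(3));  % vertical rays: each sums one column of pixels
A   = [A_h; A_v];               % 6-by-9 system matrix for these six rays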
For the observations, the likelihood function can be written as

$$L(p) = \prod_{i=1}^{m} \frac{\big(\sum_{j=1}^{n} a_{ij}\, p_j\big)^{y_i}}{y_i!}\, e^{-\sum_{j=1}^{n} a_{ij}\, p_j}.$$

Then, the log-likelihood function becomes

$$\ell(p) = \sum_{i=1}^{m} \Big[\, y_i \log\Big(\sum_{j=1}^{n} a_{ij}\, p_j\Big) - \sum_{j=1}^{n} a_{ij}\, p_j - \log(y_i!) \Big].$$

Looking at the derivative of the log-likelihood with respect to $p_j$,

$$\frac{\partial \ell(p)}{\partial p_j} = \sum_{i=1}^{m} \Big[ \frac{y_i\, a_{ij}}{\sum_{k=1}^{n} a_{ik}\, p_k} - a_{ij} \Big],$$

setting this to zero has no closed-form solution because the parameters are coupled through the sums $\sum_k a_{ik} p_k$. The EM algorithm therefore works with latent variables [1].

Lemma. Let $Z_{i1}, \dots, Z_{in}$ be independent with $Z_{ij} \sim \mathrm{Poisson}(a_{ij}\, p_j)$ and let $Y_i = \sum_{j=1}^{n} Z_{ij}$, so that $Y_i \sim \mathrm{Poisson}\big(\sum_j a_{ij}\, p_j\big)$. Then,

$$Z_{ij} \mid Y_i = y_i \;\sim\; \mathrm{Binomial}\Big(y_i,\; \frac{a_{ij}\, p_j}{\sum_{k=1}^{n} a_{ik}\, p_k}\Big).$$

By taking the expectation of the complete-data log-likelihood with respect to this conditional distribution at the current estimate $p^{(t)}$ (the E-step), each unobserved $Z_{ij}$ is replaced by its conditional mean. Because the expectation of a Binomial distribution with parameters $y_i$ and $q_{ij} = a_{ij}\, p_j^{(t)} / \sum_k a_{ik}\, p_k^{(t)}$ is $y_i\, q_{ij}$,

$$E\big[Z_{ij} \mid Y_i = y_i\big] = \frac{y_i\, a_{ij}\, p_j^{(t)}}{\sum_{k=1}^{n} a_{ik}\, p_k^{(t)}}.$$

Therefore, the function to be maximized in the M-step is, up to terms that do not depend on $p$,

$$Q\big(p \mid p^{(t)}\big) = \sum_{i=1}^{m} \sum_{j=1}^{n} \Big[ E\big[Z_{ij} \mid y_i\big] \log\big(a_{ij}\, p_j\big) - a_{ij}\, p_j \Big].$$

Setting the derivative to 0 to find a possible maximum gives

$$\frac{\partial Q}{\partial p_j} = \sum_{i=1}^{m} \Big[ \frac{E\big[Z_{ij} \mid y_i\big]}{p_j} - a_{ij} \Big] = 0.$$

Solving for all $p_j$, $j = 1, \dots, n$, yields the EM update

$$p_j^{(t+1)} = \frac{p_j^{(t)}}{\sum_{i=1}^{m} a_{ij}} \sum_{i=1}^{m} \frac{y_i\, a_{ij}}{\sum_{k=1}^{n} a_{ik}\, p_k^{(t)}},$$

which is exactly the update implemented in the code below.
function X_new = EM_algorithm(A, y, X)
% One EM iteration for the Poisson model y(i) ~ Poisson((A*X)(i))
n = length(X);
m = length(y);
X_new = zeros(size(X));
prob  = zeros(m, 1);
% Forward projection with the current estimate (denominator of the E step)
den1 = A*X;
% Starting EM Algorithm
for j = 1:n
    for i = 1:m
        % Fraction of measurement i attributed to pixel j
        prob(i) = y(i)*A(i,j)/den1(i);
    end
    % E step to compute the expectation
    expectation = sum(prob);
    den2 = sum(A(:,j));
    % M step to maximize the expected complete-data log-likelihood
    X_new(j) = X(j)/den2*expectation;
end
end
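The function above performs a single EM pass over all pixels. A minimal driver that repeats it and records the log-likelihood (whose monotone increase is the behavior reported in Figure 2) might look like the sketch below; the initialization, iteration count, and log-likelihood bookkeeping are illustrative assumptions, not repository code.

% Illustrative driver (assumed, not from the repository): iterate EM_algorithm
% and track the Poisson log-likelihood, which should increase monotonically.
X = ones(size(A,2), 1);               % flat initial estimate of the pixel values
num_iter = 100;                       % assumed iteration count
loglik = zeros(num_iter, 1);
for t = 1:num_iter
    X = EM_algorithm(A, y, X);
    mu = A*X;                         % expected counts (assumed positive here)
    loglik(t) = sum(y.*log(mu) - mu); % log-likelihood up to the constant -log(y!)
end
plot(loglik), xlabel('Iteration'), ylabel('Log-likelihood')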
The Expectation Maximization (EM) algorithm was implemented in MATLAB to reduce noise and estimate the 9 parameters from a 3x3 pixel matrix. To validate its performance, 10000 Monte Carlo simulations were conducted, and the resulting EM estimates were used to recover the underlying data.
As shown in Figure 2, the EM algorithm exhibited a monotonically increasing likelihood as it converged toward a stable solution.
The Cramér-Rao Lower Bound (CRLB) was used as a benchmark to assess the efficiency of the mean squared error (MSE) for each parameter. To further evaluate performance, the signal gain was varied from 0.1 to 10, and both the CRLB and the MSE were computed at each gain, as illustrated in Figure 3.
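For reference, the CRLB for this Poisson model follows from the Fisher information matrix. The sketch below is mine: the variable name p_true is hypothetical, and it assumes A has full column rank so the Fisher information is invertible.

% Sketch (assumed names): CRLB for the model y_i ~ Poisson((A*p)_i).
mu   = A*p_true;              % expected counts at the true parameters
FIM  = A' * diag(1./mu) * A;  % Fisher information: I_jk = sum_i a_ij*a_ik/mu_i
crlb = diag(inv(FIM));        % lower bound on the variance of each unbiased estimator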
To run the code, follow these steps:
- Clone the repository.
- Run the main.m file to run the EM algorithm and estimate the body model matrix coefficients.
[1] C. F. van Oosten, "The EM-algorithm for Poisson data," Bachelor's thesis, Mathematical Institute, Leiden University, Leiden, The Netherlands, Aug. 2014.