How to use Gaussian Mixture Model (GMM) for peak decomposition?


I have generated some data points as a linear mixture of three 1D bell-shaped Gaussian distributions with different parameters (mean and variance) using the following code:

import matplotlib.pyplot as plt
import numpy as np
from scipy import stats
import seaborn as sns
sns.set_style("darkgrid")
%matplotlib inline
from sklearn.mixture import GaussianMixture

x = np.linspace(start=-40, stop=40, num=1000)
y1 = stats.norm.pdf(x, loc=1, scale=1.5)    # First Gaussian distribution
y2 = stats.norm.pdf(x, loc=5, scale=2.5)    # Second Gaussian distribution
y3 = stats.norm.pdf(x, loc=-15, scale=10)   # Third Gaussian distribution

Y = y1 + y2 + y3


fig = plt.figure(figsize=(7, 5), dpi=300)
plt.plot(x, y1, lw=2, label='First component')
plt.plot(x, y2, lw=2, label='Second component')
plt.plot(x, y3, lw=2, label='Third component')
plt.plot(x, Y, lw=3, label='Linear Mixture')

plt.legend(loc='best', facecolor='white')
plt.show()

I tried to reverse the process and decompose these three peaks using sklearn.mixture.GaussianMixture, but it does not return the expected mean and variance of each Gaussian component.

model = GaussianMixture(n_components=3).fit(Y.reshape(-1, 1))
print(model.means_)
print(model.covariances_)
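
For comparison, here is a minimal sketch of fitting GaussianMixture on samples drawn from the same three components instead of on the PDF values Y (the sample size n, the random seed, and the equal mixing weights are arbitrary assumptions on my part):

import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)  # arbitrary seed (assumption)
n = 10_000                      # samples per component (assumption)

# Draw samples from each component rather than evaluating its PDF on a grid
s1 = rng.normal(loc=1, scale=1.5, size=n)
s2 = rng.normal(loc=5, scale=2.5, size=n)
s3 = rng.normal(loc=-15, scale=10, size=n)
samples = np.concatenate([s1, s2, s3]).reshape(-1, 1)  # shape (n_samples, 1)

gmm = GaussianMixture(n_components=3, random_state=0).fit(samples)
print(gmm.means_)                          # should be roughly near 1, 5 and -15 (up to ordering)
print(np.sqrt(gmm.covariances_).ravel())   # should be roughly near 1.5, 2.5 and 10

If I understand the estimator correctly, fit expects raw samples of shape (n_samples, n_features), so fitting on Y.reshape(-1, 1) would instead treat the 1000 PDF values themselves as one-dimensional samples.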