GMMParameterEstimation.jl Documentation
GMMParameterEstimation.jl is a package for estimating the parameters of Gaussian k-mixture models using the method of moments. It can potentially find the parameters for arbitrary k with known or unknown mixing coefficients. However, the number of possible solutions to the polynomial system that determines the first-dimension parameters and mixing coefficients is unknown for $k>4$, so in the unknown mixing coefficient case with $k>4$ the package may fail to find the parameters if an insufficient number of solutions to the system is found.
Examples
The following code snippet will use the given moments to return an estimate of the parameters using the method of moments with unknown mixing coefficients and dense covariance matrices.
using GMMParameterEstimation
d = 3
k = 2
first_moments = [1.0, -0.67, 2.44, -4.34, 17.4, -46.16, 201.67]
diagonal_moments = [-0.28 2.11 -2.46 15.29 -31.77; 0.4 4.25 3.88 54.75 59.10]
off_diag_system = Dict{Vector{Int64}, Float64}([2, 1, 0] => 1.8506, [1, 0, 1] => -0.329, [2, 0, 1] => 0.0291, [0, 2, 1] => 1.5869, [1, 1, 0] => -1.374, [0, 1, 1] => -0.333)
is_solution_found, (mixing_coefficients, means, covariances) = estimate_parameters(d, k, first_moments, diagonal_moments, off_diag_system)
Inputs:
The number of dimensions d
The number of mixture components k
Optional: A vector of mixing coefficients w with length k
A list of the first $3k+1$ moments (including moment 0) of the first dimension as first_moments
A matrix where row i contains the first $2k+1$ moments (not including moment 0) of the i-th dimension as diagonal_moments
Optional: A dictionary mapping the index of a mixed dimensional moment, given as a list of integers, to the corresponding moment, as off_diag_system (see mixedMomentSystem for clarification on which moments to include)
Outputs:
An indicator of success in finding the parameters, is_solution_found
A tuple of the parameters, (mixing_coefficients, means, covariances)
The following code snippet will generate the denoised moments necessary for parameter recovery from the given parameters.
using GMMParameterEstimation
d=3
k=2
diagonal = false
true_means = [0.83 0.24 -1.53; 0.22 0.04 -0.71]
true_covariances = [0.8828527552401668 0.27735188899130847 1.6710529671002674; 2.257873093006253 -1.644707016523332 -0.533030022431624;;; 0.27735188899130847 1.2623673813995742 3.5270452552353238; -1.644707016523332 2.577324062116896 -0.5049891831614162;;; 1.6710529671002674 3.5270452552353238 16.696895556824817; -0.533030022431624 -0.5049891831614162 1.7733508773418585]
w = [.3, .7]
if diagonal
true_first, true_diag = diagonalPerfectMoments(d, k, w, true_means, true_covariances)
else
true_first, true_diag, true_others = densePerfectMoments(d, k, w, true_means, true_covariances)
end
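The resulting perfect moments can then be passed back into estimate_parameters; for instance, continuing the dense (diagonal = false) branch above:
is_solution_found, (mixing_coefficients, means, covariances) = estimate_parameters(d, k, true_first, true_diag, true_others)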
Parameter estimation
The main functionality of this package stems from
Main.GMMParameterEstimation.estimate_parameters
— Function
estimate_parameters(d::Integer, k::Integer, first::Vector{Float64}, second::Matrix{Float64}, last::Dict{Vector{Int64}, Float64})
Compute an estimate for the parameters of a d-dimensional Gaussian k-mixture model from the moments.
For the unknown mixing coefficient dense covariance matrix case, first should be a list of moments 0 through 3k for the first dimension, second should be a matrix of moments 1 through 2k+1 for the remaining dimensions, and last should be a dictionary of the indices as lists of integers and the corresponding moments.
estimate_parameters(d::Integer, k::Integer, w::Array{Float64}, first::Vector{Float64}, second::Matrix{Float64}, last::Dict{Vector{Int64}, Float64})
Compute an estimate for the parameters of a d-dimensional Gaussian k-mixture model from the moments.
For the known mixing coefficient dense covariance matrix case, w should be a vector of the mixing coefficients, first should be a list of moments 0 through 3k for the first dimension, second should be a matrix of moments 1 through 2k+1 for the remaining dimensions, and last should be a dictionary of the indices as lists of integers and the corresponding moments.
estimate_parameters(d::Integer, k::Integer, first::Vector{Float64}, second::Matrix{Float64})
Compute an estimate for the parameters of a d-dimensional Gaussian k-mixture model from the moments.
For the unknown mixing coefficient diagonal covariance matrix case, first should be a list of moments 0 through 3k for the first dimension, and second should be a matrix of moments 1 through 2k+1 for the remaining dimensions.
estimate_parameters(d::Integer, k::Integer, w::Array{Float64}, first::Vector{Float64}, second::Matrix{Float64})
Compute an estimate for the parameters of a d-dimensional Gaussian k-mixture model from the moments.
For the known mixing coefficient diagonal covariance matrix case, w should be a vector of the mixing coefficients, first should be a list of moments 0 through 3k for the first dimension, and second should be a matrix of moments 1 through 2k+1 for the remaining dimensions.
estimate_parameters(k::Integer, shared_cov::Matrix{Float64}, first::Vector{Float64}, second::Matrix{Float64})
Compute an estimate for the means of a Gaussian k-mixture model with equal mixing coefficients and known shared covariances from the moments.
The shared covariance matrix shared_cov will determine the dimension. Then first should be a list of moments 0 through k for the first dimension, and second should be a matrix of moments $m_{je_1+e_i}$ for j in 0 to k-1 and i in 2 to d, where d is the dimension, i varies across rows, and j varies down columns.
which computes the parameter recovery using Algorithm 1 from Estimating Gaussian Mixtures Using Sparse Polynomial Moment Systems. Note that the unknown mixing coefficient cases with $k\in\{2,3,4\}$ load a set of generic moments and the corresponding solutions to the first 1-D polynomial system from sys1_k2.jld2, sys1_k3.jld2, or sys1_k4.jld2 for a slight speedup. If k is not specified, k=1 will be assumed, and the resulting polynomial system will be solved explicitly and directly.
In one dimension, for a random variable $X$ with density $f$ define the $i$th moment as $m_i=E[X^i]=\int x^if(x)dx$. For a Gaussian mixture model, this results in a polynomial in the parameters. For a sample $\{y_1,y_2,\dots,y_N\}$, define the $i$th sample moment as $\overline{m_i}=\frac{1}{N}\sum_{j=1}^N y_j^i$. The sample moments approach the true moments as $N\rightarrow\infty$, so by setting the polynomials equal to the empirical moments, we can then solve the polynomial system to recover the parameters.
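As an illustration of these one-dimensional sample moments, the following sketch computes $\overline{m_0},\dots,\overline{m_{3k}}$ for the first coordinate of a sample by hand; the helper name is ours, not part of the package, and in practice sampleMoments (documented below) handles this.
# Hand-computed sample moments of the first coordinate of a d x N sample matrix.
# This is only an illustration; sampleMoments (below) does this for you.
function first_dimension_moments(sample::Matrix{Float64}, k::Integer)
    y = sample[1, :]                        # observations of the first coordinate
    N = length(y)
    return [sum(y .^ i) / N for i in 0:3k]  # the i-th sample moment (1/N) * sum_j y_j^i, for i = 0 to 3k
end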
For a multivariate random variable $X$ with density $f_X$ define the moments as $m_{i_1,\dots,i_n} = E[X_1^{i_1}\cdots X_n^{i_n}] = \int\cdots\int x_1^{i_1}\cdots x_n^{i_n}f_X(x_1,\dots,x_n)dx_1\cdots dx_n$ and the empirical moments as $\overline{m}_{i_1,\dots,i_n} = \frac{1}{N}\sum_{j=1}^Ny_{j_1}^{i_1}\cdots y_{j_n}^{i_n}$. And again, by setting the polynomials equal to the empirical moments, we can then solve the system of polynomials to recover the parameters. However, choosing which moments to use becomes more complicated. If we know the mixing coefficients, we can use the first $2k+1$ moments of each dimension to find the means and the diagonal entries of the covariance matrices. If we do not know the mixing coefficients, we need the first $3k$ moments of the first dimension to also find the mixing coefficients. See mixedMomentSystem for which moments to include to fill in the off-diagonals of the covariance matrices if needed.
On a standard laptop we have successfully recovered parameters with unknown mixing coefficients for $k\leq 4$ and known mixing coefficients for $k\leq 5$, with $d\leq 10^5$ for the diagonal covariance case and $d\leq 50$ for the dense covariance case. Higher k values or higher d values have led to issues with running out of RAM.
One potential difficulty in estimating the mixing coefficients is the resulting dependence on higher moments in the first dimension. In sample data, if another dimension leads to more accurate moments, using that dimension to recover the mixing coefficients and then proceeding can address this difficulty. The following function was designed for this purpose. Note that it can either be provided with a d x n sample, or a d x (3k+1) array of moments 0 through 3k for each dimension, augmented by a dictionary of the off-diagonal moments if non-diagonal covariance matrices are sought.
Main.GMMParameterEstimation.dimension_cycle
— Function
dimension_cycle(k::Integer, sample::Matrix{Float64}, diagonal)
Cycle over the dimensions of the sample to find candidate mixing coefficients, then solve for the parameters based on those.
This will take longer than estimate_parameters since it makes multiple attempts: it tries each dimension in turn to find mixing coefficients, and if it finds them it tries to solve for the parameters. Returns pass = false if no dimension results in mixing coefficients that allow for a solution.
dimension_cycle(d::Integer, k::Integer, cycle_moments::Array{Float64}, indexes::Dict{Vector{Int64}, Float64})
Cycle over the dimensions of cycle_moments to find candidate mixing coefficients, then solve for the parameters based on those.
This will take longer than estimate_parameters since it makes multiple attempts: it tries each dimension in turn to find mixing coefficients, and if it finds them it tries to solve for the parameters. Returns pass = false if no dimension results in mixing coefficients that allow for a solution. cycle_moments should be an array of the 0 through 3k moments for each dimension. If no indexes is given, diagonal covariance matrices are assumed.
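A minimal usage sketch of the sample-based method, assuming sample is a d x n data matrix (for instance drawn with getSample, documented below) and k is the number of components; since the docstring only describes the failure value, the full return is captured in a single variable here:
result = dimension_cycle(k, sample, true)  # true requests diagonal covariance matrices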
Checking Input Formatting
To check you are formatting your inputs correctly, we have included
Main.GMMParameterEstimation.checkInputs
— Function
checkInputs(d::Integer, k::Integer, first::Vector{Float64}, second::Matrix{Float64}, last::Dict{Vector{Int64}, Float64})
Returns true if the inputs are the right format for estimate_parameters and an error otherwise.
checkInputs(d::Integer, k::Integer, w::Vector{Float64}, first::Vector{Float64}, second::Matrix{Float64}, last::Dict{Vector{Int64}, Float64})
Returns true if the inputs are the right format for estimate_parameters and an error otherwise.
checkInputs(d::Integer, k::Integer, first::Vector{Float64}, second::Matrix{Float64})
Returns true if the inputs are the right format for estimate_parameters and an error otherwise.
checkInputs(d::Integer, k::Integer, w::Vector{Float64}, first::Vector{Float64}, second::Matrix{Float64})
Returns true if the inputs are the right format for estimate_parameters and an error otherwise.
checkInputs(d::Integer, k::Integer, shared_cov::Matrix{Float64}, first::Vector{Float64}, second::Matrix{Float64})
Returns true if the inputs are the right format for estimate_parameters and an error otherwise.
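For example, the moment inputs from the first example on this page can be validated before estimation:
checkInputs(d, k, first_moments, diagonal_moments, off_diag_system)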
Generate and sample from Gaussian Mixture Models
Generation
Main.GMMParameterEstimation.makeCovarianceMatrix
— Function
makeCovarianceMatrix(d::Integer, diagonal::Bool)
Generate a random d x d covariance matrix.
If diagonal == true, returns a diagonal covariance matrix.
Note that the entries of the resulting covariance matrices are generated from a normal distribution centered at 0 with variance 1.
Main.GMMParameterEstimation.generateGaussians
— Function
generateGaussians(d::Integer, k::Integer, diagonal::Bool)
Generate means and covariances for k Gaussians with dimension d.
diagonal should be true for the spherical case, and false for dense covariance matrices.
The parameters are returned as a tuple, with weights in a 1D vector, means as a k x d array, and variances as a k x d x d array if diagonal is false, or as a list of Diagonal{Float64, Vector{Float64}} if diagonal is true to save memory. Note that each entry of each parameter is generated from a normal distribution centered at 0 with variance 1.
Main.GMMParameterEstimation.getSample
— Function
getSample(numb::Integer, w::Vector{Float64}, means::Matrix{Float64}, covariances::Vector)
Generate a Gaussian mixture model sample with numb entries, mixing coefficients w, means means, and covariances covariances.
getSample(numb::Integer, w::Vector{Float64}, means::Matrix{Float64}, covariances::Array{Float64, 3})
Generate a Gaussian mixture model sample with numb entries, mixing coefficients w, means means, and covariances covariances.
This relies on the Distributions package.
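Putting the generation functions together, a minimal sketch (the variable names are ours, and the tuple order follows the generateGaussians docstring above):
using GMMParameterEstimation
d, k = 3, 2
w, true_means, true_covariances = generateGaussians(d, k, false)  # dense covariances
sample = getSample(1000, w, true_means, true_covariances)         # sample with 1000 entries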
Computing moments
Main.GMMParameterEstimation.sampleMoments
— Function
sampleMoments(sample::Matrix{Float64}, k; diagonal = false)
Use the sample to compute the moments necessary for parameter estimation using the method of moments with general covariance matrices and mixing coefficients.
Returns moments 0 to 3k for the first dimension, moments 1 through 2k+1 for the other dimensions as a matrix, and a dictionary with indices and moments for the off-diagonal system if diagonal is false.
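Continuing the sampling sketch above, and assuming the three return values come back in the order listed in the docstring, the sample moments can be fed straight into estimate_parameters for the dense, unknown mixing coefficient case:
sample_first, sample_diag, sample_offdiag = sampleMoments(sample, k)
is_solution_found, estimates = estimate_parameters(d, k, sample_first, sample_diag, sample_offdiag)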
Main.GMMParameterEstimation.diagonalPerfectMoments
— Function
diagonalPerfectMoments(d, k, w, true_means, true_covariances)
Use the given parameters to compute the exact moments necessary for parameter estimation with diagonal covariance matrices.
Returns moments 0 to 3k for the first dimension, and moments 1 through 2k+1 for the other dimensions as a matrix.
Main.GMMParameterEstimation.densePerfectMoments
— Function
densePerfectMoments(d, k, w, true_means, true_covariances)
Use the given parameters to compute the exact moments necessary for parameter estimation with dense covariance matrices.
Returns moments 0 to 3k for the first dimension, moments 1 through 2k+1 for the other dimensions as a matrix, and a dictionary with indices and moments for the off-diagonal system.
Main.GMMParameterEstimation.moments_for_cycle
— Function
moments_for_cycle(d, k, w, means, covars, diagonal)
Calculate 0 through 3k+1 denoised moments for every dimension.
Used as input for cycling over the dimensions to find candidate mixing coefficients.
Main.GMMParameterEstimation.equalMixCovarianceKnown_moments
— Function
equalMixCovarianceKnown_moments(k, mean, shared_cov)
Use the given parameters to compute the exact moments necessary for parameter estimation with equal mixing coefficients and shared known covariances.
Returns moments 0 to k for the first dimension, and moments $m_{je_1+e_i}$ for j in 0 to k-1 and i in 2 to d, as a matrix where d is the dimension, i varies across rows, and j varies down columns.
equalMixCovarianceKnown_moments(k, sample)
Use the given sample to compute the sample moments necessary for parameter estimation with equal mixing coefficients and shared known covariances.
Returns moments 0 to k for the first dimension, and moments $m_{je_1+e_i}$ for j in 0 to k-1 and i in 2 to d, as a matrix where d is the dimension, i varies across rows, and j varies down columns.
These expect parameters to be given with weights in a 1D vector, means as a k x d array, and covariances as a k x d x d array for dense covariance matrices or as a list of diagonal matrices for diagonal covariance matrices.
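A hedged sketch of the equal mixing coefficient, known shared covariance workflow, combining equalMixCovarianceKnown_moments with the matching estimate_parameters method; the means and the identity covariance here are placeholder values, and the return of estimate_parameters is left unstructured:
k = 2
shared_cov = [1.0 0.0 0.0; 0.0 1.0 0.0; 0.0 0.0 1.0]  # known shared d x d covariance, d = 3
eq_means = [0.5 -1.0 2.0; 1.5 0.0 -0.5]               # k x d means
eq_first, eq_second = equalMixCovarianceKnown_moments(k, eq_means, shared_cov)
result = estimate_parameters(k, shared_cov, eq_first, eq_second)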
Build the polynomial systems
Main.GMMParameterEstimation.build1DSystem
— Function
build1DSystem(k::Integer, m::Integer)
Build the polynomial system for a mixture of 1D Gaussians where m-1 is the highest desired moment and the mixing coefficients are unknown.
build1DSystem(k::Integer, m::Integer, a::Union{Vector{Float64}, Vector{Variable}})
Build the polynomial system for a mixture of 1D Gaussians where m-1 is the highest desired moment, and a is a vector of the mixing coefficients.
This uses the usual recursive formula for moments of a univariate Gaussian in terms of the mean and variance, and then takes a convex combination with either variable mixing coefficients or the provided mixing coefficients.
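The recursion referred to here is presumably the standard one for the raw moments of a univariate Gaussian $X\sim\mathcal{N}(\mu,\sigma^2)$: with $M_0=1$ and $M_1=\mu$, we have $M_i=\mu M_{i-1}+(i-1)\sigma^2 M_{i-2}$ for $i\geq 2$, so for example $M_2=\mu^2+\sigma^2$ and $M_3=\mu^3+3\mu\sigma^2$; the mixture moment of order $i$ is then the convex combination $\sum_{j=1}^k w_j M_i(\mu_j,\sigma_j^2)$.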
Main.GMMParameterEstimation.selectSol
— Function
selectSol(k::Integer, solution::Result, polynomial::Expression, moment::Number)
Select a k-mixture solution from solution, accounting for polynomial and moment.
Sort out the statistically significant k-mixture solutions from solution, and return the one closest to moment when polynomial is evaluated at those values.
Here, statistically significant means that the variances are positive. This is used to select which solution from the parameter homotopy will be used.
Main.GMMParameterEstimation.tensorPower
— Function
tensorPower(tensor, power::Integer)
Compute the power-th tensor power of tensor.
Main.GMMParameterEstimation.convert_indexing
— Function
convert_indexing(moment_i, d)
Convert the d-dimensional multivariate moment index moment_i to the corresponding tensor moment index.
To our knowledge, the only closed form formula for the mixed dimensional moments of a multivariate Gaussian is the one provided by João M. Pereira, Joe Kileel, and Tamara G. Kolda in Tensor Moments of Gaussian Mixture Models: Theory and Applications. However, the tensor moments are indexed in a different way than the multivariate moment notation we used. Let $m_{a_1\cdots a_n}$ be a d-th order multivariate moment and let $M_{i_1\cdots i_d}^{(d)}$ be an entry of the d-th order tensor moment. Then $m_{a_1\cdots a_n}=M_{i_1\cdots i_d}^{(d)}$ where $a_j=|\{i_k=j\}|$. Note that due to symmetry, the indexing of the tensor moment is non-unique. For example, $m_{102} = M_{133}^{(3)}=M_{331}^{(3)}=M_{313}^{(3)}$.
Main.GMMParameterEstimation.mixedMomentSystem
— Function
mixedMomentSystem(d, k, mixing, ms, vs)
Build a linear system for finding the off-diagonal covariance entries.
For a d-dimensional Gaussian k-mixture model with mixing coefficients mixing, means ms, and covariances vs, where the diagonal entries have been filled in and the off-diagonal entries are variables.
The final step in our method of moments parameter recovery for non-diagonal covariance matrices is building and solving a system of $N:=\frac{k}{2}(d^2-d)$ linear equations in the same number of unknowns to fill in the off-diagonals. The polynomial for $m_{a_1\cdots a_n}$ is linear if all but two $a_i=0$ and at least one of the nonzero $a_i=1$. There are $n^2-n$ of these for each order $\geq2$, so we need these equations for up to the $\lceil \frac{k}{2}\rceil$-th order. These moments should be provided to the solver using the minimal possible degree, and if only half the possible moments for a degree are necessary (k/2 is even) provide the moments with the higher power in the earlier dimension, e.g. use [2,1,0] instead of [1,2,0].
Note: the polynomial is still linear when three of the $a_i$ equal 1 and the rest are 0, but this complicates generating the system, so we did not include those moments.
Referring back to Pereira et al. for a closed form method of generating the necessary moment polynomials, we generate the linear system using the already computed mixing coefficients, means, and diagonals of the covariances, and return it as a dictionary of index=>polynomial pairs that can then be matched with the corresponding moments.
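As a concrete check, in the d = 3, k = 2 example at the top of this page, $N = \frac{2}{2}(3^2-3) = 6$, and the six keys of off_diag_system, [1,1,0], [1,0,1], [0,1,1], [2,1,0], [2,0,1], and [0,2,1], are exactly the second and third order mixed moments of this form, with the higher power placed in the earlier dimension.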
Index
Main.GMMParameterEstimation.build1DSystem
Main.GMMParameterEstimation.checkInputs
Main.GMMParameterEstimation.convert_indexing
Main.GMMParameterEstimation.densePerfectMoments
Main.GMMParameterEstimation.diagonalPerfectMoments
Main.GMMParameterEstimation.dimension_cycle
Main.GMMParameterEstimation.equalMixCovarianceKnown_moments
Main.GMMParameterEstimation.estimate_parameters
Main.GMMParameterEstimation.generateGaussians
Main.GMMParameterEstimation.getSample
Main.GMMParameterEstimation.makeCovarianceMatrix
Main.GMMParameterEstimation.mixedMomentSystem
Main.GMMParameterEstimation.moments_for_cycle
Main.GMMParameterEstimation.sampleMoments
Main.GMMParameterEstimation.selectSol
Main.GMMParameterEstimation.tensorPower