
Class GMMScikitsLearnNode


Gaussian Mixture Model.

This node has been automatically generated by wrapping the scikits.learn.mixture.GMM class from the sklearn library. The wrapped instance can be accessed through the scikits_alg attribute.

Representation of a Gaussian mixture model probability distribution. This class allows for easy evaluation of, sampling from, and maximum-likelihood estimation of the parameters of a GMM distribution.

Initializes parameters such that every mixture component has zero mean and identity covariance.

Parameters

n_states : int, optional
Number of mixture components. Defaults to 1.
cvtype : string (read-only), optional
String describing the type of covariance parameters to use. Must be one of 'spherical', 'tied', 'diag', 'full'. Defaults to 'diag'.

Attributes

cvtype : string (read-only)
String describing the type of covariance parameters used by the GMM. Must be one of 'spherical', 'tied', 'diag', 'full'.
n_features : int
Dimensionality of the Gaussians.
n_states : int (read-only)
Number of mixture components.
weights : array, shape (n_states,)
Mixing weights for each mixture component.
means : array, shape (n_states, n_features)
Mean parameters for each mixture component.
covars : array

Covariance parameters for each mixture component. The shape depends on cvtype (see the sketch after this attribute list):

  • (n_states,) if 'spherical',
  • (n_features, n_features) if 'tied',
  • (n_states, n_features) if 'diag',
  • (n_states, n_features, n_features) if 'full'
converged_ : bool
True when convergence was reached in fit(), False otherwise.
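
As a quick check of the shapes above, a minimal sketch against the legacy scikits.learn API used on this page (not a doctest; note the class Examples below show 'diag' covars coming back as a list of per-component arrays, so the realized layout may differ from the documented one):

  import numpy as np
  from scikits.learn import mixture

  np.random.seed(0)
  X = np.random.randn(200, 3)          # n_features = 3

  for cvtype in ('spherical', 'tied', 'diag', 'full'):
      g = mixture.GMM(n_states=2, cvtype=cvtype)
      g.fit(X)
      # Documented shapes: (2,) spherical, (3, 3) tied,
      # (2, 3) diag, (2, 3, 3) full.
      print(cvtype, np.asarray(g.covars).shape)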

Methods

decode(X)
Find most likely mixture components for each point in X.
eval(X)
Compute the log likelihood of X under the model and the posterior distribution over mixture components.
fit(X)
Estimate model parameters from X using the EM algorithm.
predict(X)
Like decode, find the most likely mixture components for each observation in X.
rvs(n=1)
Generate n samples from the model.
score(X)
Compute the log likelihood of X under the model.
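
To make these one-line summaries concrete, a short sketch against the same legacy API (that eval returns the (log likelihood, posteriors) pair is inferred from its summary above, not verified against the library):

  # g is a fitted mixture.GMM, as produced in the Examples below.
  samples = g.rvs(50)                    # draw 50 samples from the mixture
  labels = g.predict(samples)            # most likely component per sample
  logprob, posteriors = g.eval(samples)  # per-sample log p(x) and
                                         # per-component posteriors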

Examples

>>> import numpy as np
>>> from scikits.learn import mixture
>>> g = mixture.GMM(n_states=2)
>>> # Generate random observations with two modes centered on 0
>>> # and 10 to use for training.
>>> np.random.seed(0)
>>> obs = np.concatenate((np.random.randn(100, 1),
...                       10 + np.random.randn(300, 1)))
>>> g.fit(obs)
GMM(cvtype='diag', n_states=2)
>>> g.weights
array([ 0.25,  0.75])
>>> g.means
array([[ 0.05980802],
       [ 9.94199467]])
>>> g.covars
[array([[ 1.01682662]]), array([[ 0.96080513]])]
>>> np.round(g.weights, 2)
array([ 0.25,  0.75])
>>> np.round(g.means, 2)
array([[ 0.06],
       [ 9.94]])
>>> np.round(g.covars, 2)
... #doctest: +NORMALIZE_WHITESPACE
array([[[ 1.02]],
       [[ 0.96]]])
>>> g.predict([[0], [2], [9], [10]])
array([0, 0, 1, 1])
>>> np.round(g.score([[0], [2], [9], [10]]), 2)
array([-2.32, -4.16, -1.65, -1.19])
>>> # Refit the model on new data (initial parameters remain the
>>> # same), this time with an even split between the two modes.
>>> g.fit(20 * [[0]] + 20 * [[10]])
GMM(cvtype='diag', n_states=2)
>>> np.round(g.weights, 2)
array([ 0.5,  0.5])
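
The doctest above drives the wrapped class directly. Through the MDP interface the same fit looks roughly as follows; an unverified sketch assembled from the train, stop_training, and execute summaries below, assuming the constructor forwards its extra keyword arguments to the wrapped GMM:

  import numpy as np
  import mdp

  np.random.seed(0)
  obs = np.concatenate((np.random.randn(100, 1),
                        10 + np.random.randn(300, 1)))

  node = mdp.nodes.GMMScikitsLearnNode(n_states=2)
  node.train(obs)          # collect the data
  node.stop_training()     # concatenate it and fit the GMM with EM
  labels = node.execute(np.array([[0.], [2.], [9.], [10.]]))

  # The fitted scikits.learn estimator stays reachable:
  weights = node.scikits_alg.weights
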
Instance Methods
 
__init__(self, input_dim=None, output_dim=None, dtype=None, **kwargs)
Gaussian Mixture Model. This node has been automatically generated by wrapping the scikits.learn.mixture.GMM class from the sklearn library.
 
_execute(self, x)
 
_get_supported_dtypes(self)
Return the list of dtypes supported by this node. The types can be specified in any format allowed by numpy.dtype.
 
_stop_training(self, **kwargs)
Concatenate the collected data in a single array.
 
execute(self, x)
Predict label for data. This node has been automatically generated by wrapping the scikits.learn.mixture.GMM class from the sklearn library. The wrapped instance can be accessed through the scikits_alg attribute.
 
stop_training(self, **kwargs)
Estimate model parameters with the expectation-maximization algorithm. This node has been automatically generated by wrapping the scikits.learn.mixture.GMM class from the sklearn library. The wrapped instance can be accessed through the scikits_alg attribute. An initialization step is performed before entering the EM algorithm. To skip this step, set the keyword argument init_params to the empty string ''. Likewise, to perform only the initialization, call this method with n_iter=0.

Inherited from unreachable.newobject: __long__, __native__, __nonzero__, __unicode__, next

Inherited from object: __delattr__, __format__, __getattribute__, __hash__, __new__, __reduce__, __reduce_ex__, __setattr__, __sizeof__, __subclasshook__

    Inherited from Cumulator
 
_train(self, *args)
Collect all input data in a list.
 
train(self, *args)
Collect all input data in a list.
    Inherited from Node
 
__add__(self, other)
 
__call__(self, x, *args, **kwargs)
Calling an instance of Node is equivalent to calling its execute method.
 
__repr__(self)
repr(x)
 
__str__(self)
str(x)
 
_check_input(self, x)
 
_check_output(self, y)
 
_check_train_args(self, x, *args, **kwargs)
 
_get_train_seq(self)
 
_if_training_stop_training(self)
 
_inverse(self, x)
 
_pre_execution_checks(self, x)
This method contains all pre-execution checks.
 
_pre_inversion_checks(self, y)
This method contains all pre-inversion checks.
 
_refcast(self, x)
Helper function to cast arrays to the internal dtype.
 
_set_dtype(self, t)
 
_set_input_dim(self, n)
 
_set_output_dim(self, n)
 
copy(self, protocol=None)
Return a deep copy of the node.
 
get_current_train_phase(self)
Return the index of the current training phase.
 
get_dtype(self)
Return dtype.
 
get_input_dim(self)
Return input dimensions.
 
get_output_dim(self)
Return output dimensions.
 
get_remaining_train_phase(self)
Return the number of training phases still to accomplish.
 
get_supported_dtypes(self)
Return dtypes supported by the node as a list of numpy.dtype objects.
 
has_multiple_training_phases(self)
Return True if the node has multiple training phases.
 
inverse(self, y, *args, **kwargs)
Invert y.
 
is_training(self)
Return True if the node is in the training phase, False otherwise.
 
save(self, filename, protocol=-1)
Save a pickled serialization of the node to filename. If filename is None, return a string.
 
set_dtype(self, t)
Set internal structures' dtype.
 
set_input_dim(self, n)
Set input dimensions.
 
set_output_dim(self, n)
Set output dimensions.
Static Methods
 
is_invertible()
Return True if the node can be inverted, False otherwise.
Returns: bool
is_trainable()
Return True if the node can be trained, False otherwise.
Returns: bool
Properties

Inherited from object: __class__

    Inherited from Node
  _train_seq
List of (train-phase, stop-training-phase) tuples.
  dtype
The node's internal dtype.
  input_dim
Input dimensions
  output_dim
Output dimensions
  supported_dtypes
Supported dtypes
Method Details

__init__(self, input_dim=None, output_dim=None, dtype=None, **kwargs)
(Constructor)

 

Identical to the class documentation above. The input_dim, output_dim, and dtype arguments are handled by the MDP Node machinery; the remaining keyword arguments (n_states, cvtype, ...) are passed on to the wrapped scikits.learn.mixture.GMM instance. See the Parameters, Attributes, Methods, and Examples sections at the top of this page.
Overrides: object.__init__

_execute(self, x)

 
Overrides: Node._execute

_get_supported_dtypes(self)

 
Return the list of dtypes supported by this node. The types can be specified in any format allowed by numpy.dtype.
Overrides: Node._get_supported_dtypes

_stop_training(self, **kwargs)

 
Concatenate the collected data in a single array.
Overrides: Node._stop_training

execute(self, x)

 

Predict label for data. This node has been automatically generated by wrapping the scikits.learn.mixture.GMM class from the sklearn library. The wrapped instance can be accessed through the scikits_alg attribute.

Parameters

X : array-like, shape = [n_samples, n_features]

Returns

C : array, shape = (n_samples,)

Overrides: Node.execute

is_invertible()
Static Method

 
Return True if the node can be inverted, False otherwise.
Overrides: Node.is_invertible
(inherited documentation)

is_trainable()
Static Method

 
Return True if the node can be trained, False otherwise.
Returns: bool
A boolean indicating whether the node can be trained.
Overrides: Node.is_trainable

stop_training(self, **kwargs)

 

Estimate model parameters with the expectation-maximization algorithm. This node has been automatically generated by wrapping the scikits.learn.mixture.GMM class from the sklearn library. The wrapped instance can be accessed through the scikits_alg attribute. An initialization step is performed before entering the EM algorithm. To skip this step, set the keyword argument init_params to the empty string ''. Likewise, to perform only the initialization, call this method with n_iter=0.

Parameters

X : array_like, shape (n, n_features)
List of n_features-dimensional data points. Each row corresponds to a single data point.
n_iter : int, optional
Number of EM iterations to perform.
min_covar : float, optional
Floor on the diagonal of the covariance matrix to prevent overfitting. Defaults to 1e-3.
thresh : float, optional
Convergence threshold.
params : string, optional
Controls which parameters are updated in the training process. Can contain any combination of 'w' for weights, 'm' for means, and 'c' for covars. Defaults to 'wmc'.
init_params : string, optional
Controls which parameters are updated in the initialization process. Can contain any combination of 'w' for weights, 'm' for means, and 'c' for covars. Defaults to 'wmc'.
Overrides: Node.stop_training
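
These keyword arguments reach the wrapped fit through stop_training. A hedged sketch of the special cases called out above, reusing obs from the earlier sketch and using fresh nodes, since an MDP node's training can be stopped only once:

  import mdp

  # Initialization only: set up the parameters but run no EM iterations.
  node_a = mdp.nodes.GMMScikitsLearnNode(n_states=2)
  node_a.train(obs)
  node_a.stop_training(n_iter=0)

  # Skip initialization and run EM from the current parameters,
  # updating only the means ('m'); weights and covars stay fixed.
  node_b = mdp.nodes.GMMScikitsLearnNode(n_states=2)
  node_b.train(obs)
  node_b.stop_training(init_params='', params='m')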