
Class ProjectedGradientNMFScikitsLearnNode



Non-Negative matrix factorization by Projected Gradient (NMF)
This node has been automatically generated by wrapping the ``scikits.learn.decomposition.nmf.ProjectedGradientNMF`` class
from the ``sklearn`` library.  The wrapped instance can be accessed
through the ``scikits_alg`` attribute.
**Parameters**

X: array, [n_samples, n_features]
    Data the model will be fit to.

n_components: int or None
    Number of components. If n_components is not set, all components
    are kept.

init: 'nndsvd' | 'nndsvda' | 'nndsvdar' | int | RandomState
    Method used to initialize the procedure.
    Default: 'nndsvdar'
    Valid options:

        - 'nndsvd': default Nonnegative Double Singular Value
          Decomposition (NNDSVD) initialization (better for sparseness)
        - 'nndsvda': NNDSVD with zeros filled with the average of X
          (better when sparsity is not desired)
        - 'nndsvdar': NNDSVD with zeros filled with small random values
          (generally faster, less accurate alternative to NNDSVDa
          for when sparsity is not desired)
        - int seed or RandomState: non-negative random matrices


sparseness: 'data' | 'components' | None
    Where to enforce sparsity in the model.
    Default: None

beta: double
    Degree of sparseness, if sparseness is not None. Larger values mean
    more sparseness.
    Default: 1

eta: double
    Degree of correctness to maintain, if sparseness is not None. Smaller
    values mean larger error.
    Default: 0.1

tol: double
    Tolerance value used in stopping conditions.
    Default: 1e-4

max_iter: int
    Number of iterations to compute.
    Default: 200

nls_max_iter: int
    Number of iterations in NLS subproblem.
    Default: 2000

**Attributes**

components_: array, [n_components, n_features]
    Non-negative components of the data
reconstruction_err_: number
    Frobenius norm of the matrix difference between the
    training data and the reconstructed data from the
    fit produced by the model, ``||X - WH||_F``

**Examples**


>>> import numpy as np
>>> X = np.array([[1,1], [2, 1], [3, 1.2], [4, 1], [5, 0.8], [6, 1]])
>>> from scikits.learn.decomposition import ProjectedGradientNMF
>>> model = ProjectedGradientNMF(n_components=2, init=0)
>>> model.fit(X) #doctest: +ELLIPSIS
ProjectedGradientNMF(nls_max_iter=2000, eta=0.1, max_iter=200,
           init=<mtrand.RandomState object at 0x...>, beta=1,
           sparseness=None, n_components=2, tol=0.0001)
>>> model.components_
array([[ 0.77032744,  0.11118662],
       [ 0.38526873,  0.38228063]])
>>> model.reconstruction_err_ #doctest: +ELLIPSIS
0.00746...
>>> model = ProjectedGradientNMF(n_components=2, init=0,
...                              sparseness='components')
>>> model.fit(X) #doctest: +ELLIPSIS
ProjectedGradientNMF(nls_max_iter=2000, eta=0.1, max_iter=200,
           init=<mtrand.RandomState object at 0x...>, beta=1,
           sparseness='components', n_components=2, tol=0.0001)
>>> model.components_
array([[ 1.67481991,  0.29614922],
       [-0.        ,  0.4681982 ]])
>>> model.reconstruction_err_ #doctest: +ELLIPSIS
0.513...
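
As a rough cross-check of the ``reconstruction_err_`` formula given above, the reported error should be in the same ballpark as the Frobenius norm of ``X - WH`` computed directly from the factors. The sketch below is illustrative only and assumes that ``transform`` returns the ``[n_samples, n_components]`` factor ``W`` (as the ``execute`` documentation further down suggests) while ``components_`` holds ``H``; the two numbers need not match exactly, since ``transform`` re-estimates ``W``:

    import numpy as np

    # Illustrative check: recompute ||X - WH||_F from the factors and
    # compare it with the error reported by the fitted model.
    W = model.transform(X)        # assumed factor, shape [n_samples, n_components]
    H = model.components_         # shape [n_components, n_features]
    manual_err = np.linalg.norm(X - np.dot(W, H))
    print(manual_err, model.reconstruction_err_)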

**Notes**

This implements C.-J. Lin, "Projected gradient methods for non-negative
matrix factorization", Neural Computation, 19 (2007), 2756-2779.
http://www.csie.ntu.edu.tw/~cjlin/nmf/

NNDSVD is introduced in C. Boutsidis and E. Gallopoulos, "SVD based
initialization: A head start for nonnegative matrix factorization",
Pattern Recognition, 2008.
http://www.cs.rpi.edu/~boutsc/files/nndsvd.pdf
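
The doctest above drives the wrapped ``scikits.learn`` class directly. Inside MDP, the same model is normally used through this node's ``train``/``stop_training``/``execute`` cycle documented below. The following is a minimal, untested sketch of that usage; the constructor arguments are illustrative and are assumed to be forwarded to the wrapped ``ProjectedGradientNMF`` instance:

    import numpy as np
    import mdp

    x = np.array([[1, 1], [2, 1], [3, 1.2], [4, 1], [5, 0.8], [6, 1]])

    # Keyword arguments are assumed to be passed on to ProjectedGradientNMF.
    node = mdp.nodes.ProjectedGradientNMFScikitsLearnNode(n_components=2, init=0)
    node.train(x)          # collects the input data (Cumulator behaviour)
    node.stop_training()   # concatenates the data and fits the wrapped model
    y = node.execute(x)    # transformed data, shape [n_samples, n_components]

    # The fitted scikits.learn estimator is exposed as ``scikits_alg``.
    print(node.scikits_alg.components_)
    print(node.scikits_alg.reconstruction_err_)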

Instance Methods
 
__init__(self, input_dim=None, output_dim=None, dtype=None, **kwargs)
Non-Negative matrix factorization by Projected Gradient (NMF). This node has been automatically generated by wrapping the ``scikits.learn.decomposition.nmf.ProjectedGradientNMF`` class from the ``sklearn`` library.
 
_execute(self, x)
 
_get_supported_dtypes(self)
Return the list of dtypes supported by this node. The types can be specified in any format allowed by numpy.dtype.
 
_stop_training(self, **kwargs)
Concatenate the collected data in a single array.
 
execute(self, x)
Transform the data X according to the fitted NMF model. This node has been automatically generated by wrapping the ``scikits.learn.decomposition.nmf.ProjectedGradientNMF`` class from the ``sklearn`` library. The wrapped instance can be accessed through the ``scikits_alg`` attribute.
 
stop_training(self, **kwargs)
Learn an NMF model for the data X. This node has been automatically generated by wrapping the ``scikits.learn.decomposition.nmf.ProjectedGradientNMF`` class from the ``sklearn`` library. The wrapped instance can be accessed through the ``scikits_alg`` attribute.

Inherited from unreachable.newobject: __long__, __native__, __nonzero__, __unicode__, next

Inherited from object: __delattr__, __format__, __getattribute__, __hash__, __new__, __reduce__, __reduce_ex__, __setattr__, __sizeof__, __subclasshook__

    Inherited from Cumulator
 
_train(self, *args)
Collect all input data in a list.
 
train(self, *args)
Collect all input data in a list.
    Inherited from Node
 
__add__(self, other)
 
__call__(self, x, *args, **kwargs)
Calling an instance of Node is equivalent to calling its execute method.
 
__repr__(self)
repr(x)
 
__str__(self)
str(x)
 
_check_input(self, x)
 
_check_output(self, y)
 
_check_train_args(self, x, *args, **kwargs)
 
_get_train_seq(self)
 
_if_training_stop_training(self)
 
_inverse(self, x)
 
_pre_execution_checks(self, x)
This method contains all pre-execution checks.
 
_pre_inversion_checks(self, y)
This method contains all pre-inversion checks.
 
_refcast(self, x)
Helper function to cast arrays to the internal dtype.
 
_set_dtype(self, t)
 
_set_input_dim(self, n)
 
_set_output_dim(self, n)
 
copy(self, protocol=None)
Return a deep copy of the node.
 
get_current_train_phase(self)
Return the index of the current training phase.
 
get_dtype(self)
Return dtype.
 
get_input_dim(self)
Return input dimensions.
 
get_output_dim(self)
Return output dimensions.
 
get_remaining_train_phase(self)
Return the number of training phases still to accomplish.
 
get_supported_dtypes(self)
Return dtypes supported by the node as a list of numpy.dtype objects.
 
has_multiple_training_phases(self)
Return True if the node has multiple training phases.
 
inverse(self, y, *args, **kwargs)
Invert y.
 
is_training(self)
Return True if the node is in the training phase, False otherwise.
 
save(self, filename, protocol=-1)
Save a pickled serialization of the node to filename. If filename is None, return a string.
 
set_dtype(self, t)
Set internal structures' dtype.
 
set_input_dim(self, n)
Set input dimensions.
 
set_output_dim(self, n)
Set output dimensions.
Static Methods
 
is_invertible()
Return True if the node can be inverted, False otherwise.
 
is_trainable()
Return True if the node can be trained, False otherwise.
Properties

Inherited from object: __class__

    Inherited from Node
  _train_seq
List of tuples:
  dtype
dtype
  input_dim
Input dimensions
  output_dim
Output dimensions
  supported_dtypes
Supported dtypes
Method Details

__init__(self, input_dim=None, output_dim=None, dtype=None, **kwargs)
(Constructor)

 

Non-Negative matrix factorization by Projected Gradient (NMF)
This node has been automatically generated by wrapping the ``scikits.learn.decomposition.nmf.ProjectedGradientNMF`` class
from the ``sklearn`` library.  The wrapped instance can be accessed
through the ``scikits_alg`` attribute.
**Parameters**

X: array, [n_samples, n_features]
    Data the model will be fit to.

n_components: int or None
    Number of components. If n_components is not set, all components
    are kept.

init: 'nndsvd' | 'nndsvda' | 'nndsvdar' | int | RandomState
    Method used to initialize the procedure.
    Default: 'nndsvdar'
    Valid options:

        - 'nndsvd': default Nonnegative Double Singular Value
          Decomposition (NNDSVD) initialization (better for sparseness)
        - 'nndsvda': NNDSVD with zeros filled with the average of X
          (better when sparsity is not desired)
        - 'nndsvdar': NNDSVD with zeros filled with small random values
          (generally faster, less accurate alternative to NNDSVDa
          for when sparsity is not desired)
        - int seed or RandomState: non-negative random matrices


sparseness: 'data' | 'components' | None
    Where to enforce sparsity in the model.
    Default: None

beta: double
    Degree of sparseness, if sparseness is not None. Larger values mean
    more sparseness.
    Default: 1

eta: double
    Degree of correctness to maintain, if sparseness is not None. Smaller
    values mean larger error.
    Default: 0.1

tol: double
    Tolerance value used in stopping conditions.
    Default: 1e-4

max_iter: int
    Number of iterations to compute.
    Default: 200

nls_max_iter: int
    Number of iterations in NLS subproblem.
    Default: 2000

**Attributes**

components_: array, [n_components, n_features]
    Non-negative components of the data
reconstruction_err_: number
    Frobenius norm of the matrix difference between the
    training data and the reconstructed data from the
    fit produced by the model, ``||X - WH||_F``

**Examples**


>>> import numpy as np
>>> X = np.array([[1,1], [2, 1], [3, 1.2], [4, 1], [5, 0.8], [6, 1]])
>>> from scikits.learn.decomposition import ProjectedGradientNMF
>>> model = ProjectedGradientNMF(n_components=2, init=0)
>>> model.fit(X) #doctest: +ELLIPSIS
ProjectedGradientNMF(nls_max_iter=2000, eta=0.1, max_iter=200,
           init=<mtrand.RandomState object at 0x...>, beta=1,
           sparseness=None, n_components=2, tol=0.0001)
>>> model.components_
array([[ 0.77032744,  0.11118662],
       [ 0.38526873,  0.38228063]])
>>> model.reconstruction_err_ #doctest: +ELLIPSIS
0.00746...
>>> model = ProjectedGradientNMF(n_components=2, init=0,
...                              sparseness='components')
>>> model.fit(X) #doctest: +ELLIPSIS
ProjectedGradientNMF(nls_max_iter=2000, eta=0.1, max_iter=200,
           init=<mtrand.RandomState object at 0x...>, beta=1,
           sparseness='components', n_components=2, tol=0.0001)
>>> model.components_
array([[ 1.67481991,  0.29614922],
       [-0.        ,  0.4681982 ]])
>>> model.reconstruction_err_ #doctest: +ELLIPSIS
0.513...

**Notes**

This implements C.-J. Lin, "Projected gradient methods for non-negative
matrix factorization", Neural Computation, 19 (2007), 2756-2779.
http://www.csie.ntu.edu.tw/~cjlin/nmf/

NNDSVD is introduced in C. Boutsidis and E. Gallopoulos, "SVD based
initialization: A head start for nonnegative matrix factorization",
Pattern Recognition, 2008.
http://www.cs.rpi.edu/~boutsc/files/nndsvd.pdf

Overrides: object.__init__

_execute(self, x)

 
Overrides: Node._execute

_get_supported_dtypes(self)

 
Return the list of dtypes supported by this node. The types can be specified in any format allowed by numpy.dtype.
Returns: list
The list of dtypes supported by this node.
Overrides: Node._get_supported_dtypes

_stop_training(self, **kwargs)

 
Concatenate the collected data in a single array.
Overrides: Node._stop_training

execute(self, x)

 

Transform the data X according to the fitted NMF model.
This node has been automatically generated by wrapping the ``scikits.learn.decomposition.nmf.ProjectedGradientNMF`` class
from the ``sklearn`` library.  The wrapped instance can be accessed
through the ``scikits_alg`` attribute.

**Parameters**

X: array, [n_samples, n_features]
    Data matrix to be transformed by the model

**Returns**

data: array, [n_samples, n_components]
    Transformed data
Overrides: Node.execute
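
As a shape sanity check for the signature above, the transformed output has one row per input sample and one column per component. A minimal sketch, reusing the hypothetical ``node`` and data ``x`` from the usage sketch in the class description (where ``n_components=2``):

    # [n_samples, n_features] in, [n_samples, n_components] out.
    y = node.execute(x)
    assert y.shape == (x.shape[0], 2)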

is_invertible()
Static Method

 
Return True if the node can be inverted, False otherwise.
Overrides: Node.is_invertible
(inherited documentation)

is_trainable()
Static Method

 
Return True if the node can be trained, False otherwise.
Returns: bool
A boolean indication whether the node can be trained.
Overrides: Node.is_trainable

stop_training(self, **kwargs)

 

Learn an NMF model for the data X.
This node has been automatically generated by wrapping the ``scikits.learn.decomposition.nmf.ProjectedGradientNMF`` class
from the ``sklearn`` library.  The wrapped instance can be accessed
through the ``scikits_alg`` attribute.

**Parameters**

X: array, [n_samples, n_features]
    Data matrix to be decomposed

**Returns**

self

Overrides: Node.stop_training
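
Because training goes through the ``Cumulator`` machinery (``train`` only collects chunks and ``_stop_training`` concatenates them), the data may be supplied in several batches before the single NMF fit happens here. A minimal sketch under that assumption, reusing the hypothetical data ``x`` from the class-level usage sketch:

    # Each train() call only stores data; the NMF fit happens once,
    # inside stop_training(), on the concatenated array.
    node = mdp.nodes.ProjectedGradientNMFScikitsLearnNode(n_components=2, init=0)
    node.train(x[:3])
    node.train(x[3:])
    node.stop_training()
    print(node.scikits_alg.reconstruction_err_)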