Kernel Principal Component Analysis (KPCA)

This node has been automatically generated by wrapping the ``scikits.learn.decomposition.pca.KernelPCA`` class from the ``sklearn`` library. The wrapped instance can be accessed through the ``scikits_alg`` attribute.

Non-linear dimensionality reduction through the use of kernels.

**Parameters**

n_components: int or None
    Number of components. If None, all non-zero components are kept.

kernel: "linear" | "poly" | "rbf" | "precomputed"
    Kernel. Default: "linear"

sigma: float
    Width of the RBF kernel. Default: 1.0

degree: int
    Degree of the polynomial kernel. Default: 3

alpha: float
    Hyperparameter of the ridge regression that learns the inverse transform (when fit_inverse_transform=True). Default: 1.0

fit_inverse_transform: bool
    Learn the inverse transform (i.e. learn to find the pre-image of a point). Default: False

**Attributes**

``lambdas_``, ``alphas_``:
    Eigenvalues and eigenvectors of the centered kernel matrix

``dual_coef_``:
    Inverse transform matrix

``X_transformed_fit_``:
    Projection of the fitted data on the kernel principal components

**Reference**

Kernel PCA was introduced in:

    Bernhard Schoelkopf, Alexander J. Smola, and Klaus-Robert Mueller. 1999.
    Kernel principal component analysis. In Advances in Kernel Methods,
    MIT Press, Cambridge, MA, USA, 327-352.
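The attributes above (eigenvalues and eigenvectors of the centered kernel matrix, and the projection of the fitted data) can be illustrated with a minimal NumPy sketch. This is not the wrapped class's implementation, only the core algorithm under the doc's ``sigma`` parameterization of the RBF kernel, K(x, y) = exp(-||x - y||^2 / (2 sigma^2)); the function names here are hypothetical:

```python
import numpy as np

def rbf_kernel(X, sigma=1.0):
    # Pairwise squared Euclidean distances, then the RBF kernel
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    return np.exp(-d2 / (2 * sigma ** 2))

def kernel_pca(X, n_components=2, sigma=1.0):
    n = X.shape[0]
    K = rbf_kernel(X, sigma)
    # Center the kernel matrix in feature space:
    # Kc = K - 1n K - K 1n + 1n K 1n, with 1n = ones(n, n) / n
    one_n = np.ones((n, n)) / n
    Kc = K - one_n @ K - K @ one_n + one_n @ K @ one_n
    # Eigendecomposition of the centered kernel matrix
    # (the lambdas_ / alphas_ attributes in the class above)
    lambdas, alphas = np.linalg.eigh(Kc)
    # Sort descending and keep the leading components
    idx = np.argsort(lambdas)[::-1][:n_components]
    lambdas, alphas = lambdas[idx], alphas[:, idx]
    # Projection of the fitted data on the kernel principal
    # components (the X_transformed_fit_ attribute)
    return alphas * np.sqrt(np.maximum(lambdas, 0.0))

X = np.random.RandomState(0).randn(20, 3)
X_new = kernel_pca(X, n_components=2)
print(X_new.shape)  # (20, 2)
```

Because the kernel matrix is n x n, the eigendecomposition scales with the number of samples rather than the number of features, which is what makes the kernel trick pay off for non-linear structure in low-dimensional inputs.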
Instance Methods

    Inherited from Cumulator

    Inherited from Node
Properties

    Inherited from Node:

    _train_seq
        List of tuples
    dtype
        dtype
    input_dim
        Input dimensions
    output_dim
        Output dimensions
    supported_dtypes
        Supported dtypes
Transform X.

**Parameters**

X: array-like, shape (n_samples, n_features)

**Returns**

X_new: array-like, shape (n_samples, n_components)
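Transforming points that were not in the training set requires centering the cross-kernel against the training kernel matrix before projecting. A minimal NumPy sketch of this out-of-sample projection (a sketch of the standard formula, not the wrapped class's code; names are hypothetical):

```python
import numpy as np

def rbf(A, B, sigma=1.0):
    # RBF kernel matrix between rows of A and rows of B
    d2 = (np.sum(A ** 2, 1)[:, None] + np.sum(B ** 2, 1)[None, :]
          - 2 * A @ B.T)
    return np.exp(-d2 / (2 * sigma ** 2))

rng = np.random.RandomState(0)
X = rng.randn(30, 3)   # training data
Z = rng.randn(5, 3)    # new points to transform

n = X.shape[0]
K = rbf(X, X)
one_n = np.ones((n, n)) / n
Kc = K - one_n @ K - K @ one_n + one_n @ K @ one_n

lambdas, alphas = np.linalg.eigh(Kc)
idx = np.argsort(lambdas)[::-1][:2]
lambdas, alphas = lambdas[idx], alphas[:, idx]

# Out-of-sample projection: center the cross-kernel against the
# training kernel, then project onto the eigenvectors scaled by
# 1/sqrt(lambda). Applying this to the training points themselves
# reproduces their fitted projections, since Kc @ alphas = alphas * lambdas.
K_new = rbf(Z, X)
ones_m = np.ones((Z.shape[0], n)) / n
K_newc = K_new - ones_m @ K - K_new @ one_n + ones_m @ K @ one_n
Z_new = K_newc @ (alphas / np.sqrt(lambdas))
print(Z_new.shape)  # (5, 2)
```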
Fit the model from data in X.
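When ``fit_inverse_transform=True``, fitting also learns a pre-image map: a kernel ridge regression (with regularization strength ``alpha``) from the projected points back to input space, whose solution corresponds to the ``dual_coef_`` attribute above. A minimal sketch of that ridge step under these assumptions, with stand-in data:

```python
import numpy as np

rng = np.random.RandomState(0)
Y = rng.randn(30, 2)   # stand-in for the projected training data
X = rng.randn(30, 3)   # stand-in for the original training inputs

# Kernel ridge regression: solve (K + alpha * I) dual_coef = X.
# Pre-images of new projections z are then k(z, Y) @ dual_coef.
sigma, alpha = 1.0, 1.0
sq = np.sum(Y ** 2, axis=1)
d2 = sq[:, None] + sq[None, :] - 2 * Y @ Y.T
K = np.exp(-d2 / (2 * sigma ** 2))
dual_coef = np.linalg.solve(K + alpha * np.eye(len(Y)), X)

X_rec = K @ dual_coef   # reconstructed training inputs
print(X_rec.shape)  # (30, 3)
```

Larger ``alpha`` shrinks the ridge solution and smooths the learned pre-image map at the cost of reconstruction accuracy on the training points.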
Generated by Epydoc 3.0.1-MDP on Mon Apr 27 21:56:19 2020. http://epydoc.sourceforge.net