sparse_encode

sklearn.decomposition.sparse_encode(X, dictionary, *, gram=None, cov=None, algorithm='lasso_lars', n_nonzero_coefs=None, alpha=None, copy_cov=True, init=None, max_iter=1000, n_jobs=None, check_input=True, verbose=0, positive=False)[source]#

Sparse coding.

Each row of the result is the solution to a sparse coding problem. The goal is to find a sparse array code such that:

    X ~= code * dictionary

Read more in the User Guide.
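
As a quick numerical check (an illustrative sketch, not part of the reference itself), the codes returned for a small X and dictionary reconstruct X when multiplied back against the dictionary:

import numpy as np
from sklearn.decomposition import sparse_encode

# Small data and dictionary, the same values used in the Examples section below.
X = np.array([[-1.0, -1.0, -1.0], [0.0, 0.0, 3.0]])
dictionary = np.array([[0., 1., 0.],
                       [-1., -1., 2.],
                       [1., 1., 1.],
                       [0., 1., 1.],
                       [0., 2., 1.]])

code = sparse_encode(X, dictionary, alpha=1e-10)
# With a near-zero penalty the codes reproduce X almost exactly.
print(np.allclose(code @ dictionary, X, atol=1e-6))  # True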

Parameters:

X : array-like of shape (n_samples, n_features)

Data matrix.

dictionary : array-like of shape (n_components, n_features)

The dictionary matrix against which to solve the sparse coding of the data. Some of the algorithms assume normalized rows for meaningful output.

gram : array-like of shape (n_components, n_components), default=None

Precomputed Gram matrix, dictionary * dictionary'.

cov : array-like of shape (n_components, n_samples), default=None

Precomputed covariance, dictionary * X' (matching the (n_components, n_samples) shape).
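
If the same dictionary is reused across many calls, both matrices can be precomputed once and passed in. A minimal sketch, assuming the covariance is dictionary @ X.T (the orientation implied by the (n_components, n_samples) shape), and continuing with the X and dictionary defined in the sketch above:

gram = dictionary @ dictionary.T   # (n_components, n_components)
cov = dictionary @ X.T             # (n_components, n_samples)
code = sparse_encode(X, dictionary, gram=gram, cov=cov, alpha=1e-10)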

algorithm : {‘lasso_lars’, ‘lasso_cd’, ‘lars’, ‘omp’, ‘threshold’}, default=’lasso_lars’

The algorithm used:

'lasso_lars': uses Lars to compute the Lasso solution;
'lasso_cd': uses the coordinate descent method to compute the Lasso solution (linear_model.Lasso); 'lasso_lars' will be faster if the estimated components are sparse;
'lars': uses the least angle regression method (linear_model.lars_path);
'omp': uses orthogonal matching pursuit to estimate the sparse solution;
'threshold': squashes to zero all coefficients less than alpha from the projection dictionary * X'.
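
For illustration (a sketch on the same toy data as above, not part of the reference): the solvers are swapped via this argument, with the l1-penalized solvers taking alpha and 'omp' taking n_nonzero_coefs:

code_lasso_lars = sparse_encode(X, dictionary, algorithm='lasso_lars', alpha=0.1)
code_lasso_cd = sparse_encode(X, dictionary, algorithm='lasso_cd', alpha=0.1)
code_omp = sparse_encode(X, dictionary, algorithm='omp', n_nonzero_coefs=2)
code_thr = sparse_encode(X, dictionary, algorithm='threshold', alpha=0.5)
# All return codes of shape (n_samples, n_components) == (2, 5).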

n_nonzero_coefs : int, default=None

Number of nonzero coefficients to target in each column of the solution. This is only used by algorithm='lars' and algorithm='omp' and is overridden by alpha in the omp case. If None, then n_nonzero_coefs=int(n_features / 10).
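
For example (illustrative, continuing with the same X and dictionary), capping the number of nonzero coefficients with 'omp':

code = sparse_encode(X, dictionary, algorithm='omp', n_nonzero_coefs=1)
print((code != 0).sum(axis=1))  # at most 1 nonzero coefficient per sample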

alpha : float, default=None

If algorithm='lasso_lars' or algorithm='lasso_cd', alpha is the penalty applied to the L1 norm. If algorithm='threshold', alpha is the absolute value of the threshold below which coefficients will be squashed to zero. If algorithm='omp', alpha is the tolerance parameter: the value of the reconstruction error targeted. In this case, it overrides n_nonzero_coefs. If None, defaults to 1.
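
As an illustration of the 'threshold' case (a sketch on the same toy data), raising alpha zeroes out more coefficients of the projection:

code_small = sparse_encode(X, dictionary, algorithm='threshold', alpha=0.1)
code_large = sparse_encode(X, dictionary, algorithm='threshold', alpha=2.0)
# The larger threshold leaves fewer nonzero coefficients.
print((code_small != 0).sum(), (code_large != 0).sum())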

copy_cov : bool, default=True

Whether to copy the precomputed covariance matrix; if False, it may be overwritten.

init : ndarray of shape (n_samples, n_components), default=None

Initialization value of the sparse codes. Only used if algorithm='lasso_cd'.
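
A hypothetical warm-start sketch (not from the reference): codes from one coordinate-descent call can seed a second call at a nearby alpha:

code_coarse = sparse_encode(X, dictionary, algorithm='lasso_cd', alpha=0.5)
code_fine = sparse_encode(X, dictionary, algorithm='lasso_cd', alpha=0.4,
                          init=code_coarse)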

max_iter : int, default=1000

Maximum number of iterations to perform if algorithm='lasso_cd' or 'lasso_lars'.

n_jobs : int, default=None

Number of parallel jobs to run. None means 1 unless in a joblib.parallel_backend context. -1 means using all processors. See Glossary for more details.

check_input : bool, default=True

If False, the input arrays X and dictionary will not be checked.

verbose : int, default=0

Controls the verbosity; the higher, the more messages.

positive : bool, default=False

Whether to enforce positivity when finding the encoding.

Added in version 0.20.
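
For example (illustrative, same toy data), a non-negative encoding:

code_pos = sparse_encode(X, dictionary, algorithm='lasso_cd', alpha=0.1,
                         positive=True)
print((code_pos >= 0).all())  # True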

Returns:

code : ndarray of shape (n_samples, n_components)

The sparse codes.

Examples

>>> import numpy as np
>>> from sklearn.decomposition import sparse_encode
>>> X = np.array([[-1, -1, -1], [0, 0, 3]])
>>> dictionary = np.array(
...     [[0, 1, 0],
...      [-1, -1, 2],
...      [1, 1, 1],
...      [0, 1, 1],
...      [0, 2, 1]],
...     dtype=np.float64
... )
>>> sparse_encode(X, dictionary, alpha=1e-10)
array([[ 0.,  0., -1.,  0.,  0.],
       [ 0.,  1.,  1.,  0.,  0.]])
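
In this example alpha=1e-10 makes the l1 penalty negligible, so the codes reconstruct X essentially exactly; larger alpha values would trade reconstruction error for sparser codes.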