
tf.keras.layers.Dense


Just your regular densely-connected NN layer.

Inherits From: Layer, Operation

tf.keras.layers.Dense(
    units,
    activation=None,
    use_bias=True,
    kernel_initializer='glorot_uniform',
    bias_initializer='zeros',
    kernel_regularizer=None,
    bias_regularizer=None,
    activity_regularizer=None,
    kernel_constraint=None,
    bias_constraint=None,
    lora_rank=None,
    **kwargs
)

Used in the guides and tutorials

Advanced automatic differentiation
Debug a TensorFlow 2 migrated training pipeline
Effective Tensorflow 2
Migration examples: Canned Estimators
Migrate early stopping
Overfit and underfit
Time series forecasting
Load a pandas DataFrame
Using DTensors with Keras
Intro to Autoencoders

Dense implements the operation output = activation(dot(input, kernel) + bias), where activation is the element-wise activation function passed as the activation argument, kernel is a weights matrix created by the layer, and bias is a bias vector created by the layer (only applicable if use_bias is True).
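For illustration, a minimal sketch of the layer in action (shapes and values here are arbitrary):

import numpy as np
import tensorflow as tf

# Project 4 input features down to 2 outputs with a ReLU activation.
layer = tf.keras.layers.Dense(units=2, activation="relu")

x = np.random.rand(3, 4).astype("float32")  # (batch_size=3, input_dim=4)
y = layer(x)                                # relu(dot(x, kernel) + bias)

print(y.shape)             # (3, 2)
print(layer.kernel.shape)  # (4, 2): built lazily from the input's last dimension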

Args
units Positive integer, dimensionality of the output space.
activation Activation function to use. If you don't specify anything, no activation is applied (i.e. "linear" activation: a(x) = x).
use_bias Boolean, whether the layer uses a bias vector.
kernel_initializer Initializer for the kernel weights matrix.
bias_initializer Initializer for the bias vector.
kernel_regularizer Regularizer function applied to the kernel weights matrix.
bias_regularizer Regularizer function applied to the bias vector.
activity_regularizer Regularizer function applied to the output of the layer (its "activation").
kernel_constraint Constraint function applied to the kernel weights matrix.
bias_constraint Constraint function applied to the bias vector.
lora_rank Optional integer. If set, the layer's forward pass will implement LoRA (Low-Rank Adaptation) with the provided rank. LoRA sets the layer's kernel to non-trainable and replaces it with a delta over the original kernel, obtained by multiplying two lower-rank trainable matrices. This can be useful to reduce the computation cost of fine-tuning large dense layers. You can also enable LoRA on an existing Dense layer by calling layer.enable_lora(rank); see the sketch below.
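As a sketch of the effect of lora_rank (weight counts reflect the Keras 3 implementation bundled with TF 2.16; treat the specific numbers as illustrative):

import tensorflow as tf

# Plain layer: kernel and bias are both trainable.
dense = tf.keras.layers.Dense(64)
dense.build((None, 128))
print(len(dense.trainable_weights))  # 2: kernel (128, 64) and bias (64,)

# With lora_rank set, build() freezes the full (128, 64) kernel and adds
# two trainable low-rank factors of shapes (128, 4) and (4, 64).
lora_dense = tf.keras.layers.Dense(64, lora_rank=4)
lora_dense.build((None, 128))
print(len(lora_dense.trainable_weights))  # 3: bias plus the two factors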
Input shape
N-D tensor with shape: (batch_size, ..., input_dim). The most common situation would be a 2D input with shape (batch_size, input_dim).
Output shape
N-D tensor with shape: (batch_size, ..., units). For instance, for a 2D input with shape (batch_size, input_dim), the output would have shape (batch_size, units).
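A quick shape check, assuming arbitrary example dimensions:

import numpy as np
import tensorflow as tf

layer = tf.keras.layers.Dense(units=8)

# 2D input: (batch_size, input_dim) -> (batch_size, units).
print(layer(np.zeros((32, 16), dtype="float32")).shape)  # (32, 8)

# Higher-rank input: the kernel acts on the last axis only, so every
# leading axis is preserved: (batch, steps, input_dim) -> (batch, steps, units).
print(layer(np.zeros((32, 10, 16), dtype="float32")).shape)  # (32, 10, 8)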
Attributes
input Retrieves the input tensor(s) of a symbolic operation. Only returns the tensor(s) corresponding to the first time the operation was called.
kernel The weights matrix created by the layer.
output Retrieves the output tensor(s) of a layer. Only returns the tensor(s) corresponding to the first time the operation was called.

Methods

enable_lora

View source

enable_lora(
    rank, a_initializer='he_uniform', b_initializer='zeros'
)
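The page gives no description for this method, but per the lora_rank note above it retrofits LoRA onto an already-built layer, freezing the kernel and adding two lower-rank trainable matrices initialized with a_initializer and b_initializer. A sketch under that reading:

import tensorflow as tf

# Fine-tuning sketch: freeze a pretrained kernel and train only the
# low-rank delta (plus the bias). The layer must be built first.
layer = tf.keras.layers.Dense(256)
layer.build((None, 1024))

layer.enable_lora(rank=8)  # kernel becomes non-trainable

for w in layer.trainable_weights:
    print(w.shape)  # (256,) bias, then (1024, 8) and (8, 256) LoRA factors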

from_config

View source

@classmethod
from_config(
    config
)

Creates a layer from its config.

This method is the reverse of get_config, capable of instantiating the same layer from the config dictionary. It does not handle layer connectivity (handled by Network), nor weights (handled by set_weights).

Args
config A Python dictionary, typically the output of get_config.
Returns
A layer instance.
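A round-trip sketch (argument values are arbitrary); note that only the configuration is recreated, not the weights:

import tensorflow as tf

layer = tf.keras.layers.Dense(32, activation="relu", use_bias=False)
config = layer.get_config()

# Rebuilds an equivalent, unbuilt layer; restore weights separately
# with set_weights if needed.
clone = tf.keras.layers.Dense.from_config(config)
print(clone.units, clone.use_bias)  # 32 False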

quantized_build

View source

quantized_build(
    input_shape, mode
)

symbolic_call

View source

symbolic_call(
    *args, **kwargs
)

Class Variables
QUANTIZATION_MODE_ERROR_TEMPLATE "Invalid quantization mode. Expected one of ('int8', 'float8'). Received: quantization_mode={mode}"