nki.language.softplus — AWS Neuron Documentation

This document is relevant for: Inf2, Trn1, Trn2

nki.language.softplus

nki.language.softplus(x, *, dtype=None, mask=None, **kwargs)

Softplus activation function on the input, element-wise.

Softplus is a smooth approximation to the ReLU activation, defined as:

softplus(x) = log(1 + exp(x))

Parameters:

x – a tile.

dtype – (optional, default: None) data type to cast the output to; if None, the output keeps the same data type as the input tile.

mask – (optional, default: None) a compile-time constant predicate that controls whether/how this instruction is executed.

Returns:

a tile with softplus applied to x element-wise.
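The minimal kernel sketch below illustrates one way to call nki.language.softplus from an NKI kernel: load an input tile from HBM, apply the activation, and store the result back. The kernel name, the output allocation, and the assumption that the input fits in a single tile are illustrative choices, not part of this reference.

```python
import neuronxcc.nki as nki
import neuronxcc.nki.language as nl


@nki.jit
def softplus_kernel(in_tensor):
    # Allocate the output tensor in HBM with the same shape and dtype as the input.
    out_tensor = nl.ndarray(in_tensor.shape, dtype=in_tensor.dtype,
                            buffer=nl.shared_hbm)

    # Load the input tile from HBM into on-chip memory.
    # Assumes the input is small enough to fit within a single tile.
    x = nl.load(in_tensor)

    # Apply the softplus activation element-wise: log(1 + exp(x)).
    y = nl.softplus(x)

    # Store the result back to HBM and return it.
    nl.store(out_tensor, value=y)
    return out_tensor
```

In this sketch the output dtype follows the input because dtype is left at its default; passing dtype explicitly would cast the result instead.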
