LogSoftmax — PyTorch 2.8 documentation

LogSoftmax
class torch.nn.LogSoftmax(dim=None)

Applies the $\log(\text{Softmax}(x))$ function to an n-dimensional input Tensor.

The LogSoftmax formulation can be simplified as:

$$\text{LogSoftmax}(x_i) = \log\left(\frac{\exp(x_i)}{\sum_j \exp(x_j)}\right)$$
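
As an informal check of this formula (not part of the original page), a naive evaluation of $\log(\exp(x_i) / \sum_j \exp(x_j))$ should agree with the module's output for small, well-scaled inputs; PyTorch's implementation is the numerically stable equivalent, so the naive version below is only for illustration.

>>> import torch
>>> import torch.nn as nn
>>> x = torch.randn(2, 3)
>>> log_sm = nn.LogSoftmax(dim=1)(x)
>>> # naive evaluation of the formula above (only reliable for well-scaled inputs)
>>> naive = torch.log(torch.exp(x) / torch.exp(x).sum(dim=1, keepdim=True))
>>> torch.allclose(log_sm, naive)
True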

Shape:
  • Input: (*), where * means any number of additional dimensions

  • Output: (*), same shape as the input

Parameters

dim (int) – A dimension along which LogSoftmax will be computed.
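
To make the role of dim concrete, here is a small illustrative sketch (not part of the original page): exponentiating the output and summing along the chosen dimension recovers a probability distribution, so dim=1 normalizes over each row of a 2-D input while dim=0 normalizes over each column.

>>> import torch
>>> import torch.nn as nn
>>> x = torch.randn(2, 3)
>>> # dim=1: each of the 2 rows becomes a distribution over its 3 entries
>>> torch.allclose(nn.LogSoftmax(dim=1)(x).exp().sum(dim=1), torch.ones(2))
True
>>> # dim=0: each of the 3 columns becomes a distribution over its 2 entries
>>> torch.allclose(nn.LogSoftmax(dim=0)(x).exp().sum(dim=0), torch.ones(3))
True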

Returns

a Tensor of the same dimension and shape as the input with values in the range [-inf, 0)

Return type

Tensor

Examples:

>>> import torch
>>> import torch.nn as nn
>>> m = nn.LogSoftmax(dim=1)
>>> input = torch.randn(2, 3)
>>> output = m(input)  # same shape as input, entries are log-probabilities
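
As a hedged continuation of the example above (not part of the original snippet), the output values fall in the documented range [-inf, 0), and exponentiating them recovers ordinary softmax probabilities that sum to 1 along dim:

>>> (output <= 0).all()  # log-probabilities are never positive
tensor(True)
>>> torch.allclose(output.exp().sum(dim=1), torch.ones(2))
True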
