Applies a softmax function.
Softmax is defined as:
\text{Softmax}(x_{i}) = \frac{\exp(x_{i})}{\sum_{j} \exp(x_{j})}
where i, j run over sparse tensor indices and unspecified entries are ignored. This is equivalent to defining unspecified entries as negative infinity, so that \exp(x_{k}) = 0 when the entry with index k is not specified.
It is applied to all slices along dim, and will re-scale them so that the elements lie in the range [0, 1] and sum to 1.
input (Tensor) – the input sparse tensor
dim (int) – A dimension along which softmax will be computed.
dtype (torch.dtype, optional) – the desired data type of the returned tensor. If specified, the input tensor is cast to dtype before the operation is performed. This is useful for preventing data type overflows. Default: None