Showing content from https://docs.pytorch.org/docs/stable/generated/torch.svd_lowrank.html below:
torch.svd_lowrank — PyTorch 2.8 documentation

Return the singular value decomposition (U, S, V) of a matrix, batches of matrices, or a sparse matrix A such that A ≈ U diag(S) Vᴴ. If M is given, then the SVD is computed for the matrix A − M.
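A minimal sketch of the call. The 100 × 60 test matrix of known rank 5 and the choice q=10 are illustrative assumptions, not part of the documentation:

```python
import torch

# Assumed toy setup: an exactly rank-5 matrix, approximated with a
# slightly oversampled q (see the note on choosing q below).
torch.manual_seed(0)                          # make the randomized method repeatable
A = torch.randn(100, 5) @ torch.randn(5, 60)  # exact rank-5, 100 x 60 matrix

U, S, V = torch.svd_lowrank(A, q=10)

# Reconstruct the approximation A ≈ U diag(S) V^H
A_approx = U @ torch.diag(S) @ V.mH
err = torch.dist(A, A_approx)                 # small, since rank(A) <= q
```

Note that V (not Vᴴ) is returned, so the reconstruction uses `V.mH`.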

Note

The implementation is based on Algorithm 5.1 from Halko et al., 2009.

Note

For an adequate approximation of a rank-k matrix A, where k is not known in advance but can be estimated, the number of columns of Q, q, can be chosen according to the following criteria: in general, k <= q <= min(2*k, m, n). For large low-rank matrices, take q = k + 5..10. If k is relatively small compared to min(m, n), choosing q = k + 0..2 may be sufficient.
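An illustrative check of the q-selection rule above. The shapes m, n and the rank k are assumed values; q is swept within the recommended range k <= q <= min(2*k, m, n):

```python
import torch

torch.manual_seed(0)
m, n, k = 200, 150, 8
A = torch.randn(m, k) @ torch.randn(k, n)  # exact rank-k matrix

errs = []
for q in (k, k + 2, 2 * k):                # all within the recommended range
    U, S, V = torch.svd_lowrank(A, q=q)
    errs.append(torch.dist(A, U @ torch.diag(S) @ V.mH).item())
# Since rank(A) = k <= q, every choice recovers A almost exactly.
```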

Note

This is a randomized method. To obtain repeatable results, set the seed of the pseudorandom number generator, e.g. with torch.manual_seed().
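For example, two runs seeded identically produce identical factors. The 50 × 40 input and q=6 are arbitrary assumptions for illustration:

```python
import torch

A = torch.randn(50, 40)

torch.manual_seed(42)
U1, S1, V1 = torch.svd_lowrank(A, q=6)

torch.manual_seed(42)                      # reset the seed before the second run
U2, S2, V2 = torch.svd_lowrank(A, q=6)

# Same seed, same random test matrix inside the algorithm, same result.
same = torch.equal(S1, S2) and torch.equal(U1, U2) and torch.equal(V1, V2)
```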

Note

In general, prefer the full-rank SVD implementation torch.linalg.svd() for dense matrices, as it is roughly 10x faster. The low-rank SVD is useful for huge sparse matrices that torch.linalg.svd() cannot handle.
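A sketch of the sparse use case. The 500 × 500 COO matrix below, with rank 4 and singular values 6, 5, 4, 3, is an assumed toy example; torch.linalg.svd() would reject a sparse input like this:

```python
import torch

# A sparse matrix with four nonzeros, each in its own row and column,
# so its singular values are just the magnitudes 6, 5, 4, 3.
i = torch.tensor([[0, 1, 2, 3],
                  [1, 0, 3, 2]])
v = torch.tensor([3.0, 4.0, 5.0, 6.0])
A = torch.sparse_coo_tensor(i, v, (500, 500)).coalesce()

U, S, V = torch.svd_lowrank(A, q=4)  # works directly on the sparse tensor
```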

