Helper functions and demonstration vignettes of increasing depth showing how to construct the Self-Attention algorithm. Based on Vaswani et al. (2017) <doi:10.48550/arXiv.1706.03762>, Dan Jurafsky and James H. Martin (2022, ISBN:978-0131873216) "Speech and Language Processing (3rd ed.)" <https://web.stanford.edu/~jurafsky/slp3/>, and Alex Graves (2020) "Attention and Memory in Deep Learning" <https://www.youtube.com/watch?v=AIiwuClvH6k>.
Version: 0.4.0
Suggests: covr, knitr, rmarkdown, testthat (≥ 3.0.0)
Published: 2023-11-10
DOI: 10.32614/CRAN.package.attention
Author: Bastiaan Quast [aut, cre]
Maintainer: Bastiaan Quast <bquast at gmail.com>
License: GPL (≥ 3)
NeedsCompilation: no
Materials: README NEWS
CRAN checks: attention results

Documentation:
Reference manual: attention.pdf
Vignettes: Complete Self-Attention from Scratch

Please use the canonical form https://CRAN.R-project.org/package=attention to link to this page.
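The description above names the scaled dot-product formulation from Vaswani et al. (2017), Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V, which the package's vignettes build up from scratch. As a rough illustration, here is a minimal sketch in base R; the function names softmax and self_attention and the toy input are assumptions made for this example, not the package's actual API.

# Minimal sketch of scaled dot-product self-attention in base R.
# Illustrative only; names here are assumptions, not the attention
# package's exported functions.

softmax <- function(x) {
  # Row-wise softmax; subtracting the row max avoids overflow in exp()
  exp_x <- exp(x - apply(x, 1, max))
  exp_x / rowSums(exp_x)
}

self_attention <- function(Q, K, V) {
  d_k <- ncol(K)
  # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V  (Vaswani et al., 2017)
  scores <- Q %*% t(K) / sqrt(d_k)
  softmax(scores) %*% V
}

# Usage: three tokens with embedding dimension four;
# in self-attention the queries, keys, and values all come from X
set.seed(42)
X <- matrix(rnorm(12), nrow = 3, ncol = 4)
self_attention(X, X, X)

In practice Q, K, and V are produced by separate learned weight matrices applied to X; the vignettes referenced above walk through that fuller construction step by step.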