
Transformers: State-of-the-Art Natural Language Processing

Thomas Wolf, Lysandre Debut, Victor Sanh, Julien Chaumond, Clement Delangue, Anthony Moi, Pierric Cistac, Tim Rault, Remi Louf, Morgan Funtowicz, Joe Davison, Sam Shleifer, Patrick von Platen, Clara Ma, Yacine Jernite, Julien Plu, Canwen Xu, Teven Le Scao, Sylvain Gugger, Mariama Drame, Quentin Lhoest, Alexander Rush

Abstract: Recent progress in natural language processing has been driven by advances in both model architecture and model pretraining. Transformer architectures have facilitated building higher-capacity models and pretraining has made it possible to effectively utilize this capacity for a wide variety of tasks. Transformers is an open-source library with the goal of opening up these advances to the wider machine learning community. The library consists of carefully engineered state-of-the-art Transformer architectures under a unified API. Backing this library is a curated collection of pretrained models made by and available for the community. Transformers is designed to be extensible by researchers, simple for practitioners, and fast and robust in industrial deployments. The library is available at https://github.com/huggingface/transformers.
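
The unified API the abstract refers to can be illustrated with a minimal sketch; this is not code from the paper. It assumes the transformers package is installed (pip install transformers) together with a PyTorch or TensorFlow backend, and it uses the library's high-level pipeline interface, which downloads a default pretrained checkpoint from the community model hub on first use.

# Minimal, illustrative use of the library's unified API.
# Assumes `transformers` is installed with a PyTorch or TensorFlow
# backend; a default sentiment checkpoint is downloaded on first use.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")
result = classifier("Transformers makes pretrained models easy to reuse.")
print(result)  # e.g. [{'label': 'POSITIVE', 'score': 0.99}]

When finer control is needed, the same pretrained checkpoints can instead be loaded explicitly through the library's AutoTokenizer and AutoModel classes.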
Anthology ID: 2020.emnlp-demos.6
Volume: Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing: System Demonstrations
Month: October
Year: 2020
Address: Online
Editors: Qun Liu, David Schlangen
Venue: EMNLP
Publisher: Association for Computational Linguistics
Pages: 38–45
URL: https://aclanthology.org/2020.emnlp-demos.6/
DOI: 10.18653/v1/2020.emnlp-demos.6
Award: Honorable Demonstration Paper
Bibkey: wolf-etal-2020-transformers
Cite (ACL): Thomas Wolf, Lysandre Debut, Victor Sanh, Julien Chaumond, Clement Delangue, Anthony Moi, Pierric Cistac, Tim Rault, Remi Louf, Morgan Funtowicz, Joe Davison, Sam Shleifer, Patrick von Platen, Clara Ma, Yacine Jernite, Julien Plu, Canwen Xu, Teven Le Scao, Sylvain Gugger, Mariama Drame, Quentin Lhoest, and Alexander Rush. 2020. Transformers: State-of-the-Art Natural Language Processing. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing: System Demonstrations, pages 38–45, Online. Association for Computational Linguistics.
Cite (Informal): Transformers: State-of-the-Art Natural Language Processing (Wolf et al., EMNLP 2020)
PDF: https://aclanthology.org/2020.emnlp-demos.6.pdf
Code: huggingface/transformers + additional community code
BibTeX:
@inproceedings{wolf-etal-2020-transformers,
    title = "Transformers: State-of-the-Art Natural Language Processing",
    author = "Wolf, Thomas  and
      Debut, Lysandre  and
      Sanh, Victor  and
      Chaumond, Julien  and
      Delangue, Clement  and
      Moi, Anthony  and
      Cistac, Pierric  and
      Rault, Tim  and
      Louf, Remi  and
      Funtowicz, Morgan  and
      Davison, Joe  and
      Shleifer, Sam  and
      von Platen, Patrick  and
      Ma, Clara  and
      Jernite, Yacine  and
      Plu, Julien  and
      Xu, Canwen  and
      Le Scao, Teven  and
      Gugger, Sylvain  and
      Drame, Mariama  and
      Lhoest, Quentin  and
      Rush, Alexander",
    editor = "Liu, Qun  and
      Schlangen, David",
    booktitle = "Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing: System Demonstrations",
    month = oct,
    year = "2020",
    address = "Online",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2020.emnlp-demos.6/",
    doi = "10.18653/v1/2020.emnlp-demos.6",
    pages = "38--45",
    abstract = "Recent progress in natural language processing has been driven by advances in both model architecture and model pretraining. Transformer architectures have facilitated building higher-capacity models and pretraining has made it possible to effectively utilize this capacity for a wide variety of tasks. Transformers is an open-source library with the goal of opening up these advances to the wider machine learning community. The library consists of carefully engineered state-of-the art Transformer architectures under a unified API. Backing this library is a curated collection of pretrained models made by and available for the community. Transformers is designed to be extensible by researchers, simple for practitioners, and fast and robust in industrial deployments. The library is available at \url{https://github.com/huggingface/transformers}."
}
MODS XML:
<?xml version="1.0" encoding="UTF-8"?>
<modsCollection xmlns="http://www.loc.gov/mods/v3">
<mods ID="wolf-etal-2020-transformers">
    <titleInfo>
        <title>Transformers: State-of-the-Art Natural Language Processing</title>
    </titleInfo>
    <name type="personal">
        <namePart type="given">Thomas</namePart>
        <namePart type="family">Wolf</namePart>
        <role>
            <roleTerm authority="marcrelator" type="text">author</roleTerm>
        </role>
    </name>
    <name type="personal">
        <namePart type="given">Lysandre</namePart>
        <namePart type="family">Debut</namePart>
        <role>
            <roleTerm authority="marcrelator" type="text">author</roleTerm>
        </role>
    </name>
    <name type="personal">
        <namePart type="given">Victor</namePart>
        <namePart type="family">Sanh</namePart>
        <role>
            <roleTerm authority="marcrelator" type="text">author</roleTerm>
        </role>
    </name>
    <name type="personal">
        <namePart type="given">Julien</namePart>
        <namePart type="family">Chaumond</namePart>
        <role>
            <roleTerm authority="marcrelator" type="text">author</roleTerm>
        </role>
    </name>
    <name type="personal">
        <namePart type="given">Clement</namePart>
        <namePart type="family">Delangue</namePart>
        <role>
            <roleTerm authority="marcrelator" type="text">author</roleTerm>
        </role>
    </name>
    <name type="personal">
        <namePart type="given">Anthony</namePart>
        <namePart type="family">Moi</namePart>
        <role>
            <roleTerm authority="marcrelator" type="text">author</roleTerm>
        </role>
    </name>
    <name type="personal">
        <namePart type="given">Pierric</namePart>
        <namePart type="family">Cistac</namePart>
        <role>
            <roleTerm authority="marcrelator" type="text">author</roleTerm>
        </role>
    </name>
    <name type="personal">
        <namePart type="given">Tim</namePart>
        <namePart type="family">Rault</namePart>
        <role>
            <roleTerm authority="marcrelator" type="text">author</roleTerm>
        </role>
    </name>
    <name type="personal">
        <namePart type="given">Remi</namePart>
        <namePart type="family">Louf</namePart>
        <role>
            <roleTerm authority="marcrelator" type="text">author</roleTerm>
        </role>
    </name>
    <name type="personal">
        <namePart type="given">Morgan</namePart>
        <namePart type="family">Funtowicz</namePart>
        <role>
            <roleTerm authority="marcrelator" type="text">author</roleTerm>
        </role>
    </name>
    <name type="personal">
        <namePart type="given">Joe</namePart>
        <namePart type="family">Davison</namePart>
        <role>
            <roleTerm authority="marcrelator" type="text">author</roleTerm>
        </role>
    </name>
    <name type="personal">
        <namePart type="given">Sam</namePart>
        <namePart type="family">Shleifer</namePart>
        <role>
            <roleTerm authority="marcrelator" type="text">author</roleTerm>
        </role>
    </name>
    <name type="personal">
        <namePart type="given">Patrick</namePart>
        <namePart type="family">von Platen</namePart>
        <role>
            <roleTerm authority="marcrelator" type="text">author</roleTerm>
        </role>
    </name>
    <name type="personal">
        <namePart type="given">Clara</namePart>
        <namePart type="family">Ma</namePart>
        <role>
            <roleTerm authority="marcrelator" type="text">author</roleTerm>
        </role>
    </name>
    <name type="personal">
        <namePart type="given">Yacine</namePart>
        <namePart type="family">Jernite</namePart>
        <role>
            <roleTerm authority="marcrelator" type="text">author</roleTerm>
        </role>
    </name>
    <name type="personal">
        <namePart type="given">Julien</namePart>
        <namePart type="family">Plu</namePart>
        <role>
            <roleTerm authority="marcrelator" type="text">author</roleTerm>
        </role>
    </name>
    <name type="personal">
        <namePart type="given">Canwen</namePart>
        <namePart type="family">Xu</namePart>
        <role>
            <roleTerm authority="marcrelator" type="text">author</roleTerm>
        </role>
    </name>
    <name type="personal">
        <namePart type="given">Teven</namePart>
        <namePart type="family">Le Scao</namePart>
        <role>
            <roleTerm authority="marcrelator" type="text">author</roleTerm>
        </role>
    </name>
    <name type="personal">
        <namePart type="given">Sylvain</namePart>
        <namePart type="family">Gugger</namePart>
        <role>
            <roleTerm authority="marcrelator" type="text">author</roleTerm>
        </role>
    </name>
    <name type="personal">
        <namePart type="given">Mariama</namePart>
        <namePart type="family">Drame</namePart>
        <role>
            <roleTerm authority="marcrelator" type="text">author</roleTerm>
        </role>
    </name>
    <name type="personal">
        <namePart type="given">Quentin</namePart>
        <namePart type="family">Lhoest</namePart>
        <role>
            <roleTerm authority="marcrelator" type="text">author</roleTerm>
        </role>
    </name>
    <name type="personal">
        <namePart type="given">Alexander</namePart>
        <namePart type="family">Rush</namePart>
        <role>
            <roleTerm authority="marcrelator" type="text">author</roleTerm>
        </role>
    </name>
    <originInfo>
        <dateIssued>2020-10</dateIssued>
    </originInfo>
    <typeOfResource>text</typeOfResource>
    <relatedItem type="host">
        <titleInfo>
            <title>Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing: System Demonstrations</title>
        </titleInfo>
        <name type="personal">
            <namePart type="given">Qun</namePart>
            <namePart type="family">Liu</namePart>
            <role>
                <roleTerm authority="marcrelator" type="text">editor</roleTerm>
            </role>
        </name>
        <name type="personal">
            <namePart type="given">David</namePart>
            <namePart type="family">Schlangen</namePart>
            <role>
                <roleTerm authority="marcrelator" type="text">editor</roleTerm>
            </role>
        </name>
        <originInfo>
            <publisher>Association for Computational Linguistics</publisher>
            <place>
                <placeTerm type="text">Online</placeTerm>
            </place>
        </originInfo>
        <genre authority="marcgt">conference publication</genre>
    </relatedItem>
    <abstract>Recent progress in natural language processing has been driven by advances in both model architecture and model pretraining. Transformer architectures have facilitated building higher-capacity models and pretraining has made it possible to effectively utilize this capacity for a wide variety of tasks. Transformers is an open-source library with the goal of opening up these advances to the wider machine learning community. The library consists of carefully engineered state-of-the-art Transformer architectures under a unified API. Backing this library is a curated collection of pretrained models made by and available for the community. Transformers is designed to be extensible by researchers, simple for practitioners, and fast and robust in industrial deployments. The library is available at https://github.com/huggingface/transformers.</abstract>
    <identifier type="citekey">wolf-etal-2020-transformers</identifier>
    <identifier type="doi">10.18653/v1/2020.emnlp-demos.6</identifier>
    <location>
        <url>https://aclanthology.org/2020.emnlp-demos.6/</url>
    </location>
    <part>
        <date>2020-10</date>
        <extent unit="page">
            <start>38</start>
            <end>45</end>
        </extent>
    </part>
</mods>
</modsCollection>
Endnote:
%0 Conference Proceedings
%T Transformers: State-of-the-Art Natural Language Processing
%A Wolf, Thomas
%A Debut, Lysandre
%A Sanh, Victor
%A Chaumond, Julien
%A Delangue, Clement
%A Moi, Anthony
%A Cistac, Pierric
%A Rault, Tim
%A Louf, Remi
%A Funtowicz, Morgan
%A Davison, Joe
%A Shleifer, Sam
%A von Platen, Patrick
%A Ma, Clara
%A Jernite, Yacine
%A Plu, Julien
%A Xu, Canwen
%A Le Scao, Teven
%A Gugger, Sylvain
%A Drame, Mariama
%A Lhoest, Quentin
%A Rush, Alexander
%Y Liu, Qun
%Y Schlangen, David
%S Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing: System Demonstrations
%D 2020
%8 October
%I Association for Computational Linguistics
%C Online
%F wolf-etal-2020-transformers
%X Recent progress in natural language processing has been driven by advances in both model architecture and model pretraining. Transformer architectures have facilitated building higher-capacity models and pretraining has made it possible to effectively utilize this capacity for a wide variety of tasks. Transformers is an open-source library with the goal of opening up these advances to the wider machine learning community. The library consists of carefully engineered state-of-the-art Transformer architectures under a unified API. Backing this library is a curated collection of pretrained models made by and available for the community. Transformers is designed to be extensible by researchers, simple for practitioners, and fast and robust in industrial deployments. The library is available at https://github.com/huggingface/transformers.
%R 10.18653/v1/2020.emnlp-demos.6
%U https://aclanthology.org/2020.emnlp-demos.6/
%U https://doi.org/10.18653/v1/2020.emnlp-demos.6
%P 38-45
Markdown (Informal)

[Transformers: State-of-the-Art Natural Language Processing](https://aclanthology.org/2020.emnlp-demos.6/) (Wolf et al., EMNLP 2020)

