dc.contributor.author | Sopyła, Krzysztof
dc.contributor.author | Sawaniewski, Łukasz
dc.date.accessioned | 2021-07-30T06:03:47Z
dc.date.available | 2021-07-30T06:03:47Z
dc.date.issued | 2021
dc.identifier.uri | http://hdl.handle.net/11321/835
dc.description | Polish RoBERTa model trained on Polish Wikipedia, Polish literature and Oscar.
dc.language.iso | pol
dc.publisher | Ermlab
dc.source.uri | https://github.com/Ermlab/PoLitBert/
dc.subject | transformers
dc.subject | word embeddings
dc.title | PoLitBert_v32k_cos1_2_50k - Polish RoBERTa model
dc.type | toolService
metashare.ResourceInfo#ContentInfo.detailedType | tool
metashare.ResourceInfo#ResourceComponentType#ToolServiceInfo.languageDependent | true
has.files | no
branding | CLARIN-PL
demo.uri | https://minio.clarin-pl.eu/ermlab/public/PoLitBert/models/PoLitBert_v32k_cos1_2_50k.zip
contact.person | Krzysztof Sopyła office@ermlab.com Ermlab
files.size | 0
files.count | 0
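The record's demo.uri points at a downloadable model archive, and the subject keywords mention transformers. The sketch below shows one way to fetch and unpack that archive in Python; the final loading step is an assumption, since the record does not state the checkpoint format inside the zip (if it is a Fairseq RoBERTa checkpoint rather than a Hugging Face one, use fairseq.models.roberta.RobertaModel.from_pretrained instead). The local directory name is hypothetical.

```python
# Sketch: download and unpack the PoLitBert_v32k_cos1_2_50k archive listed
# under demo.uri. The archive layout is an assumption; the record only
# guarantees a zip file at this URL.
import io
import zipfile
import urllib.request

MODEL_URL = (
    "https://minio.clarin-pl.eu/ermlab/public/PoLitBert/models/"
    "PoLitBert_v32k_cos1_2_50k.zip"
)
TARGET_DIR = "PoLitBert_v32k_cos1_2_50k"  # hypothetical local directory

# Download the archive and extract its contents.
with urllib.request.urlopen(MODEL_URL) as response:
    archive = zipfile.ZipFile(io.BytesIO(response.read()))
    archive.extractall(TARGET_DIR)

# Hedged loading step: only valid if the unpacked files follow the
# Hugging Face transformers layout (config.json, tokenizer files, weights).
try:
    from transformers import AutoModel, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(TARGET_DIR)
    model = AutoModel.from_pretrained(TARGET_DIR)
except Exception as err:  # illustration only
    print(f"Could not load as a transformers checkpoint: {err}")
```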