Add 'brainsteam/content/bookmarks/2022/04/02/models-official-projects-token-dropping-at-master-tensorflow-models1648877142.md'
This commit is contained in:
parent c466c5e129
commit 9b4f83e13b

@@ -0,0 +1,14 @@
---
bookmark-of: https://github.com/tensorflow/models/tree/master/official/projects/token_dropping
date: '2022-04-02T01:25:42.935585'
tags:
- nlp
title: "models/official/projects/token_dropping at master \xB7 tensorflow/models"
type: bookmark
url: /bookmarks/2022/04/02/models-official-projects-token-dropping-at-master-tensorflow-models1648877142
---
> Token dropping aims to accelerate the pretraining of transformer models such as BERT without degrading its performance on downstream tasks.

> A BERT model pretrained using this token dropping method is not different to a BERT model pretrained in the conventional way: a BERT checkpoint pretrained with token dropping can be viewed and used as a normal BERT checkpoint, for finetuning etc.
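To make the quoted idea concrete, here is a minimal sketch of what token dropping could look like in a forward pass: the middle layers of the stack only process the most "important" tokens, and the dropped tokens are merged back before the final layers so the output (and the checkpoint) looks like ordinary BERT. The importance heuristic, `keep_ratio`, and the number of full-sequence layers are illustrative assumptions; this is not the repo's actual TensorFlow implementation.

```python
# Minimal, self-contained sketch of the token-dropping idea (illustrative only,
# NOT the tensorflow/models implementation). Importance heuristic and parameters
# are assumptions made for the example.
import numpy as np

def token_dropping_forward(hidden, importance, layers, keep_ratio=0.5, n_full=2):
    """Run a layer stack, letting only the most important tokens through the middle.

    hidden:     [seq_len, dim] token representations
    importance: [seq_len] per-token importance scores (e.g. a running MLM loss)
    layers:     list of callables mapping [n, dim] -> [n, dim]
    """
    seq_len = hidden.shape[0]
    n_keep = max(1, int(seq_len * keep_ratio))
    keep_idx = np.argsort(-importance)[:n_keep]  # indices of the most important tokens

    # Early layers see every token.
    for layer in layers[:n_full]:
        hidden = layer(hidden)

    # Middle layers only process the kept tokens; this is where the speed-up comes from.
    kept = hidden[keep_idx]
    for layer in layers[n_full:-n_full]:
        kept = layer(kept)
    hidden[keep_idx] = kept  # merge the dropped tokens back in

    # Final layers again see the full sequence, so the output keeps the usual shape
    # and the resulting checkpoint can be fine-tuned like any other BERT checkpoint.
    for layer in layers[-n_full:]:
        hidden = layer(hidden)
    return hidden

if __name__ == "__main__":
    # Toy usage with identity "layers": shapes are preserved end to end.
    layers = [lambda x: x for _ in range(6)]
    out = token_dropping_forward(np.random.rand(8, 4), np.random.rand(8), layers)
    print(out.shape)  # (8, 4)
```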