---
bookmark-of:
  title: models/official/projects/token_dropping at master · tensorflow/models · GitHub
  url: https://github.com/tensorflow/models/tree/master/official/projects/token_dropping
date: '2022-04-02T01:25:42.935585'
post_meta:
- date
tags:
- nlp
title: models/official/projects/token_dropping at master · tensorflow/models
type: bookmarks
url: /bookmarks/2022/04/02/models-official-projects-token-dropping-at-master-tensorflow-models1648877142
---
> Token dropping aims to accelerate the pretraining of transformer models such as BERT without degrading their performance on downstream tasks.
> A BERT model pretrained using this token dropping method is no different from a BERT model pretrained in the conventional way: a BERT checkpoint pretrained with token dropping can be viewed and used as a normal BERT checkpoint, for finetuning, etc.
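The core idea quoted above — route only the "important" tokens through part of the encoder, then restore the full sequence so the checkpoint stays a normal BERT checkpoint — can be sketched in a few lines. This is a minimal NumPy illustration, not the repository's implementation; the importance score, the keep ratio, and the identity stand-in for the middle layers are all assumptions made for illustration.

```python
import numpy as np

def token_dropping_sketch(hidden, importance, keep_ratio=0.5):
    """Hypothetical sketch of token dropping.

    `hidden` is a (seq_len, hidden_dim) array of token states;
    `importance` stands in for a per-token score (e.g. a running
    masked-LM loss). Only the top-scoring tokens would pass through
    the middle encoder layers (identity here); dropped tokens keep
    their earlier states, so the full sequence is restored before
    the final layers.
    """
    seq_len = hidden.shape[0]
    n_keep = max(1, int(seq_len * keep_ratio))
    # Indices of the tokens with the highest importance scores,
    # re-sorted to preserve the original token order.
    keep_idx = np.sort(np.argsort(importance)[-n_keep:])
    # Middle layers would process only the kept tokens.
    processed = hidden[keep_idx]
    # Restore the full-length sequence: dropped tokens are untouched.
    restored = hidden.copy()
    restored[keep_idx] = processed
    return restored, keep_idx
```

Because the output sequence has the full length and the usual shape, nothing downstream (finetuning, checkpoint loading) needs to know that token dropping was used during pretraining — which is the point the quote makes.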