
Commit d821358

committed: update readme
1 parent 3737889 commit d821358

File tree

1 file changed: +8 −6 lines changed


README.md

Lines changed: 8 additions & 6 deletions
@@ -19,7 +19,7 @@ This implementation is provided with [Google's pre-trained models](https://githu
 
 ## Installation
 
-This repo was tested on Python 3.6+ and PyTorch 0.4.1
+This repo was tested on Python 3.5+ and PyTorch 0.4.1/1.0.0
 
 ### With pip
 
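The updated requirement (Python 3.5+, PyTorch 0.4.1/1.0.0) can be sanity-checked before installing. A minimal sketch is below; the pip package name `pytorch-pretrained-bert` is taken from the repository name and is an assumption, not the project's official install command.

```shell
# Check the interpreter meets the README's stated minimum (Python 3.5+)
# before running `pip install pytorch-pretrained-bert` (package name assumed).
python -c 'import sys; print(sys.version_info >= (3, 5))'
```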
@@ -372,9 +372,9 @@ Where `$THIS_MACHINE_INDEX` is a sequential index assigned to each of your mach
 
 We showcase several fine-tuning examples based on (and extended from) [the original implementation](https://github.com/google-research/bert/):
 
-- a sequence-level classifier on the MRPC classification corpus,
-- a token-level classifier on the question answering dataset SQuAD, and
-- a sequence-level multiple-choice classifier on the SWAG classification corpus.
+- a *sequence-level classifier* on the MRPC classification corpus,
+- a *token-level classifier* on the question answering dataset SQuAD, and
+- a *sequence-level multiple-choice classifier* on the SWAG classification corpus.
 
 #### MRPC
 
@@ -427,7 +427,7 @@ python run_classifier.py \
 
 #### SQuAD
 
-This example code fine-tunes BERT on the SQuAD dataset. It runs in 24 min (with BERT-base) or 68 min (with BERT-large) on single tesla V100 16GB.
+This example code fine-tunes BERT on the SQuAD dataset. It runs in 24 min (with BERT-base) or 68 min (with BERT-large) on a single tesla V100 16GB.
 
 The data for SQuAD can be downloaded with the following links and should be saved in a `$SQUAD_DIR` directory.
 

@@ -458,7 +458,9 @@ Training with the previous hyper-parameters gave us the following results:
 {"f1": 88.52381567990474, "exact_match": 81.22043519394512}
 ```
 
-The data for Swag can be downloaded by cloning the following [repository](https://github.com/rowanz/swagaf)
+#### SWAG
+
+The data for SWAG can be downloaded by cloning the following [repository](https://github.com/rowanz/swagaf)
 
 ```shell
 export SWAG_DIR=/path/to/SWAG
