---
language: en
tags:
- summarization
license: apache-2.0
datasets:
- cnn_dailymail
- xsum
thumbnail: https://huggingface.co/front/thumbnails/distilbart_medium.png
---

### Usage

This checkpoint should be loaded into `BartForConditionalGeneration.from_pretrained`. See the [BART docs](https://huggingface.co/transformers/model_doc/bart.html?#transformers.BartForConditionalGeneration) for more information.
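A minimal loading-and-summarization sketch is below. The repo id and the generation settings are example assumptions, not values prescribed by this card; substitute the checkpoint you actually want from the table in the next section.

```python
from transformers import BartForConditionalGeneration, BartTokenizer

# Example repo id (assumption): any of the distilbart-* checkpoints listed below works the same way.
model_name = "sshleifer/distilbart-xsum-12-6"

tokenizer = BartTokenizer.from_pretrained(model_name)
model = BartForConditionalGeneration.from_pretrained(model_name)

article = (
    "PG&E stated it scheduled the blackouts in response to forecasts for high "
    "winds amid dry conditions."
)
inputs = tokenizer(article, return_tensors="pt", truncation=True, max_length=1024)

# Beam search settings here are illustrative, not the ones used for the reported metrics.
summary_ids = model.generate(inputs["input_ids"], num_beams=4, max_length=60)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```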

### Metrics for DistilBART models

| Model Name                 | Params (millions) | Inference Time (ms) | Speedup | ROUGE-2 | ROUGE-L |
|:---------------------------|------------------:|--------------------:|--------:|--------:|--------:|
| distilbart-xsum-12-1       | 222 | 90  | 2.54 | 18.31 | 33.37 |
| distilbart-xsum-6-6        | 230 | 132 | 1.73 | 20.92 | 35.73 |
| distilbart-xsum-12-3       | 255 | 106 | 2.16 | 21.37 | 36.39 |
| distilbart-xsum-9-6        | 268 | 136 | 1.68 | 21.72 | 36.61 |
| bart-large-xsum (baseline) | 406 | 229 | 1.00 | 21.85 | 36.50 |
| distilbart-xsum-12-6       | 306 | 137 | 1.68 | 22.12 | 36.99 |
| bart-large-cnn (baseline)  | 406 | 381 | 1.00 | 21.06 | 30.63 |
| distilbart-12-3-cnn        | 255 | 214 | 1.78 | 20.57 | 30.00 |
| distilbart-12-6-cnn        | 306 | 307 | 1.24 | 21.26 | 30.59 |
| distilbart-6-6-cnn         | 230 | 182 | 2.09 | 20.17 | 29.70 |
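
The speedup column is the baseline's latency divided by each distilled model's latency for the same input. As a rough illustration only (this is not the exact benchmark harness behind the table; input, beam settings, and hardware are assumptions), wall-clock latency could be compared like this:

```python
import time
import torch
from transformers import BartForConditionalGeneration, BartTokenizer

def mean_generation_latency_ms(model_name: str, text: str, n_runs: int = 10) -> float:
    """Rough wall-clock latency (ms) for summarizing one input, averaged over n_runs."""
    device = "cuda" if torch.cuda.is_available() else "cpu"
    tokenizer = BartTokenizer.from_pretrained(model_name)
    model = BartForConditionalGeneration.from_pretrained(model_name).to(device).eval()
    inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=1024).to(device)
    with torch.no_grad():
        model.generate(**inputs, num_beams=4, max_length=60)  # warm-up run
        start = time.perf_counter()
        for _ in range(n_runs):
            model.generate(**inputs, num_beams=4, max_length=60)
    return (time.perf_counter() - start) / n_runs * 1000.0
```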