Repository: apple/ml-mkqa
Branch: main
Commit: 651b8cc85c40
Files: 18
Total size: 4.1 MB
Directory structure:
gitextract_i684binf/
├── .gitignore
├── CODE_OF_CONDUCT.md
├── CONTRIBUTING.md
├── LICENSE
├── README.md
├── mkqa_eval.py
├── mkqa_eval_all_languages.py
├── mkqa_eval_util.py
├── requirements.txt
├── sample_predictions/
│ ├── en.jsonl
│ ├── es.jsonl
│ ├── results/
│ │ ├── en_results/
│ │ │ └── metrics.json
│ │ ├── es_results/
│ │ │ └── metrics.json
│ │ ├── metrics.json
│ │ └── zh_cn_results/
│ │ └── metrics.json
│ └── zh_cn.jsonl
└── tests/
├── test_mkqa_eval.py
└── testdata/
└── en_prediction.jsonl
================================================
FILE CONTENTS
================================================
================================================
FILE: .gitignore
================================================
__pycache__
*.pyc
.DS_STORE
.idea
================================================
FILE: CODE_OF_CONDUCT.md
================================================
# Code of Conduct
## Our Pledge
In the interest of fostering an open and welcoming environment, we as
contributors and maintainers pledge to making participation in our project and
our community a harassment-free experience for everyone, regardless of age, body
size, disability, ethnicity, sex characteristics, gender identity and expression,
level of experience, education, socio-economic status, nationality, personal
appearance, race, religion, or sexual identity and orientation.
## Our Standards
Examples of behavior that contributes to creating a positive environment
include:
* Using welcoming and inclusive language
* Being respectful of differing viewpoints and experiences
* Gracefully accepting constructive criticism
* Focusing on what is best for the community
* Showing empathy towards other community members
Examples of unacceptable behavior by participants include:
* The use of sexualized language or imagery and unwelcome sexual attention or
advances
* Trolling, insulting/derogatory comments, and personal or political attacks
* Public or private harassment
* Publishing others' private information, such as a physical or electronic
address, without explicit permission
* Other conduct which could reasonably be considered inappropriate in a
professional setting
## Our Responsibilities
Project maintainers are responsible for clarifying the standards of acceptable
behavior and are expected to take appropriate and fair corrective action in
response to any instances of unacceptable behavior.
Project maintainers have the right and responsibility to remove, edit, or
reject comments, commits, code, wiki edits, issues, and other contributions
that are not aligned to this Code of Conduct, or to ban temporarily or
permanently any contributor for other behaviors that they deem inappropriate,
threatening, offensive, or harmful.
## Scope
This Code of Conduct applies within all project spaces, and it also applies when
an individual is representing the project or its community in public spaces.
Examples of representing a project or community include using an official
project e-mail address, posting via an official social media account, or acting
as an appointed representative at an online or offline event. Representation of
a project may be further defined and clarified by project maintainers.
## Enforcement
Instances of abusive, harassing, or otherwise unacceptable behavior may be
reported by contacting the open source team at [opensource-conduct@group.apple.com](mailto:opensource-conduct@group.apple.com). All
complaints will be reviewed and investigated and will result in a response that
is deemed necessary and appropriate to the circumstances. The project team is
obligated to maintain confidentiality with regard to the reporter of an incident.
Further details of specific enforcement policies may be posted separately.
Project maintainers who do not follow or enforce the Code of Conduct in good
faith may face temporary or permanent repercussions as determined by other
members of the project's leadership.
## Attribution
This Code of Conduct is adapted from the [Contributor Covenant](https://www.contributor-covenant.org), version 1.4,
available at [https://www.contributor-covenant.org/version/1/4/code-of-conduct.html](https://www.contributor-covenant.org/version/1/4/code-of-conduct.html)
================================================
FILE: CONTRIBUTING.md
================================================
# Contribution Guide
Thanks for your interest in contributing. This project was released to accompany a research paper for purposes of reproducibility, and beyond its publication there are limited plans for future development of the repository.
While we welcome new pull requests and issues, please note that our response may be limited. Forks and out-of-tree improvements are strongly encouraged.
## Before you get started
By submitting a pull request, you represent that you have the right to license your contribution to Apple and the community, and agree by submitting the patch that your contributions are licensed under the [LICENSE](LICENSE).
We ask that all community members read and observe our [Code of Conduct](CODE_OF_CONDUCT.md).
================================================
FILE: LICENSE
================================================
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
================================================
FILE: README.md
================================================
# MKQA: Multilingual Knowledge Questions & Answers
[**Tasks**](#task-description) | [**Dataset**](#dataset) | [**Leaderboard**](#leaderboard) | [**Evaluation**](#evaluation) |
[**Paper**](https://arxiv.org/abs/2007.15207) |
[**Citation**](#citation) | [**License**](#license)
We introduce MKQA, an open-domain question answering evaluation set comprising 10k question-answer pairs
aligned across 26 typologically diverse languages (260k question-answer pairs in total).
The goal of this dataset is to provide a challenging benchmark for question answering quality across a wide set of
languages. Please refer to our paper for details, [MKQA: A Linguistically Diverse Benchmark for Multilingual Open Domain Question Answering](https://arxiv.org/abs/2007.15207)
## Task Description
Given a question <code>q<sup>l</sup></code> in language `l`, the task is to produce a prediction <code>p<sup>l</sup></code> in {No Answer, Yes, No, Text Answer},
where a Text Answer is a span of tokens in the corresponding language.
<code>p<sup>l</sup></code> can be obtained by any method: extracted from a document, generated, or derived from a knowledge graph.
Wherever possible, textual answers are accompanied by Wikidata QIDs, for entity linking and evaluating knowledge graph approaches.
These QIDs also enable automatic translations for most answers into any Wikipedia language through the Wikidata knowledge graph.
## Dataset
MKQA contains 10,000 queries sampled from the [Google Natural Questions dataset](https://github.com/google-research-datasets/natural-questions).
For each query we collect new passage-independent answers.
These queries and answers are then human-translated into 25 non-English languages.
MKQA data can be downloaded from [here](dataset/mkqa.jsonl.gz).
Each example in the dataset contains the unique Natural Questions `example_id`, the original English `query`, and then `queries` and `answers` in 26 languages.
```
{
'example_id': 563260143484355911,
'queries': {
'en': "who sings i hear you knocking but you can't come in",
'ru': "кто поет i hear you knocking but you can't come in",
'ja': '「 I hear you knocking」は誰が歌っていますか',
'zh_cn': "《i hear you knocking but you can't come in》是谁演唱的",
...
},
'query': "who sings i hear you knocking but you can't come in",
'answers': {'en': [{'type': 'entity',
'entity': 'Q545186',
'text': 'Dave Edmunds',
'aliases': []}],
'ru': [{'type': 'entity',
'entity': 'Q545186',
'text': 'Эдмундс, Дэйв',
'aliases': ['Эдмундс', 'Дэйв Эдмундс', 'Эдмундс Дэйв', 'Dave Edmunds']}],
'ja': [{'type': 'entity',
'entity': 'Q545186',
'text': 'デイヴ・エドモンズ',
'aliases': ['デーブ・エドモンズ', 'デイブ・エドモンズ']}],
'zh_cn': [{'type': 'entity', 'text': '戴维·埃德蒙兹 ', 'entity': 'Q545186'}],
...
},
}
```
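Since the dataset ships as gzipped JSON Lines, it can be loaded with a few lines of standard-library Python. A minimal sketch (the `load_mkqa` helper is illustrative and not part of this repository; the commented path assumes the download location above):

```python
import gzip
import json

def load_mkqa(path):
    """Load MKQA examples from a gzipped JSON Lines file into a list of dicts."""
    examples = []
    with gzip.open(path, "rt", encoding="utf-8") as f:
        for line in f:
            examples.append(json.loads(line))
    return examples

# Usage (path assumes the download location described above):
# examples = load_mkqa("dataset/mkqa.jsonl.gz")
# print(examples[0]["query"], examples[0]["answers"]["en"])
```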
Each answer is labelled with an answer type. The breakdown is:
| Answer Type | Occurrence |
|---------------|---------------|
| `entity` | `4221` |
| `long_answer` | `1815` |
| `unanswerable` | `1427` |
| `date` | `1174` |
| `number` | `485` |
| `number_with_unit` | `394` |
| `short_phrase` | `346` |
| `binary` | `138` |
For each language, there can be more than one acceptable textual answer, in order to capture a variety of possible valid answers.
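These alternatives (the main `text` plus any `aliases`, with a `None` text standing for No Answer) can be flattened into one set of acceptable strings. A sketch of that convention, where `valid_answer_strings` is an illustrative helper rather than part of the official scripts:

```python
def valid_answer_strings(answers_for_language):
    """Collect every acceptable answer string for one example in one language."""
    strings = []
    for answer in answers_for_language:
        strings.append(answer["text"] or "")       # None text -> "" (No Answer)
        strings.extend(answer.get("aliases", []))  # aliases are also accepted
    return sorted(set(strings))

# Hypothetical input shaped like the 'answers' entries above:
example_answers = [{"type": "entity", "entity": "Q545186",
                    "text": "Dave Edmunds", "aliases": ["D. Edmunds"]}]
```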
All the supported languages are:
| Language code | Language name |
|---------------|---------------|
| `ar` | `Arabic` |
| `da` | `Danish` |
| `de` | `German` |
| `en` | `English` |
| `es` | `Spanish` |
| `fi` | `Finnish` |
| `fr` | `French` |
| `he` | `Hebrew` |
| `hu` | `Hungarian` |
| `it` | `Italian` |
| `ja` | `Japanese` |
| `ko` | `Korean` |
| `km` | `Khmer` |
| `ms` | `Malay` |
| `nl` | `Dutch` |
| `no` | `Norwegian` |
| `pl` | `Polish` |
| `pt` | `Portuguese` |
| `ru` | `Russian` |
| `sv` | `Swedish` |
| `th` | `Thai` |
| `tr` | `Turkish` |
| `vi` | `Vietnamese` |
| `zh_cn` | `Chinese (Simplified)` |
| `zh_hk` | `Chinese (Hong Kong)` |
| `zh_tw` | `Chinese (Traditional)` |
## Leaderboard
| Model name | Best Overall F1 | Best Answerable F1 | Best Unanswerable F1 | Link to paper |
|---------------|---------------|---------------|---------------|---------------|
| (Baseline) XLM-R Large Translate Train |46.0 | 27.6 |84.5 |https://arxiv.org/pdf/1911.02116.pdf |
> Submit a pull request to this repository to add yourself to the MKQA leaderboard. Scores are ordered by Best Overall F1.
## Evaluation
The official evaluation scripts provide two ways to evaluate performance on the MKQA dataset.
The evaluation script expects a JSON Lines (jsonl) prediction file with the following format:
```
{
"example_id": -7449157003522518870,
"prediction": "Hafþór Júlíus `` Thor '' Björnsson",
"binary_answer": null,
"no_answer_prob": 0.23618
}
...
```
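Such a file can be produced with a short script. A minimal sketch (the example id, answer text, and probability below are placeholders, not real model output):

```python
import json

# Hypothetical predictions; example_id must match the dataset's ids.
predictions = [
    {"example_id": 563260143484355911,
     "prediction": "Dave Edmunds",   # "" for No Answer
     "binary_answer": None,          # "yes"/"no" for binary questions, else None
     "no_answer_prob": 0.1},         # optional; used to sweep the No Answer threshold
]

with open("en.jsonl", "w", encoding="utf-8") as f:
    for p in predictions:
        f.write(json.dumps(p) + "\n")
```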
### Evaluate performance for a single language
To evaluate predictions for a single language, use the `mkqa_eval.py` script and specify the prediction language:
```
python mkqa_eval.py --annotation_file ./dataset/mkqa.jsonl.gz \
--predictions_file ./sample_predictions/en.jsonl \
--language en \
(--out_dir <optional output directory for saving metrics and pr curves>, --verbose)
```
### Evaluate performance for all languages
To evaluate predictions for all languages, use the following script. Save the prediction file for each language in the same directory, naming each prediction file by its language code, such as `en.jsonl`:
```
python mkqa_eval_all_languages.py --annotation_file ./dataset/mkqa.jsonl.gz \
--predictions_dir ./sample_predictions \
(--out_dir <optional output directory for saving metrics and pr curves>, --verbose)
```
To reproduce the zero-shot multilingual BERT baseline, use the provided prediction jsonl files [here](./sample_predictions).
Sample output:
```
+---------------+-----------+-----------+----------------------+----------------------+------------------------+---------------------+
| language | best_em | best_f1 | best_answerable_em | best_answerable_f1 | best_unanswerable_em | best_f1_threshold |
|---------------+-----------+-----------+----------------------+----------------------+------------------------+---------------------|
| en | 45.39 | 51.97 | 26.65 | 36.38 | 84.45 | -5.95 |
| es | 39.73 | 43.83 | 16.69 | 22.76 | 87.75 | -2.52 |
| zh_cn | 32.42 | 32.43 | 0 | 0.01 | 100 | -10.53 |
| Macro Average | 39.18 | 42.74 | 14.45 | 19.72 | 90.73 | -6.33 |
+---------------+-----------+-----------+----------------------+----------------------+------------------------+---------------------+
```
> Here the macro average is computed over the languages for which prediction files were supplied.
> The official macro average requires prediction files for all 26 languages.
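The macro average is simply the unweighted mean over the evaluated languages; for instance, averaging the best_f1 values from the sample output above:

```python
# Unweighted (macro) mean of per-language best_f1 from the sample output above.
per_language_best_f1 = {"en": 51.97, "es": 43.83, "zh_cn": 32.43}
macro_f1 = round(sum(per_language_best_f1.values()) / len(per_language_best_f1), 2)
# macro_f1 == 42.74, matching the "Macro Average" row
```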
To run the tests for the evaluation scripts:
```
pytest ./tests/test_mkqa_eval.py
```
## Citation
Please cite the following if you found MKQA, our [paper](https://arxiv.org/abs/2007.15207), or these resources useful.
```
@misc{mkqa,
title = {MKQA: A Linguistically Diverse Benchmark for Multilingual Open Domain Question Answering},
author = {Shayne Longpre and Yi Lu and Joachim Daiber},
year = {2020},
URL = {https://arxiv.org/pdf/2007.15207.pdf}
}
```
## License
The code in this repository is licensed according to the [LICENSE](LICENSE) file.
The Multilingual Knowledge Questions and Answers dataset is licensed under the Creative Commons Attribution-ShareAlike 3.0 Unported License. To view a copy of this license, visit http://creativecommons.org/licenses/by-sa/3.0/
## Contact Us
To contact us feel free to email the authors in the paper or create an issue in this repository.
================================================
FILE: mkqa_eval.py
================================================
import argparse
import collections
import json
import logging
import os
import sys
from gzip import GzipFile
from typing import Dict, Optional, Any
import numpy as np
import mkqa_eval_util as eval_util
MKQA_LANGUAGES = [
"ar",
"da",
"de",
"en",
"es",
"fi",
"fr",
"he",
"hu",
"it",
"ja",
"km",
"ko",
"ms",
"nl",
"no",
"pl",
"pt",
"ru",
"sv",
"th",
"tr",
"vi",
"zh_cn",
"zh_hk",
"zh_tw",
]
# A data structure for storing annotations
MKQAAnnotation = collections.namedtuple(
"MKQAAnnotation",
[
"example_id", # The unique ID for each example.
"types", # All answer types selected by inter-grader agreement
"answers", # All valid answer strings
],
)
# A data structure for storing predictions
MKQAPrediction = collections.namedtuple(
"MKQAPrediction",
[
"example_id", # The unique ID for each example.
"prediction", # The predicted answer text ("" serves as No Answer)
"binary_answer", # Is answer {"yes", "no"} (case insensitive), or `None` (indicating neither)
"no_answer_prob",
# (Optional) Score/probability that the answer is No Answer. Used to select the best threshold that maximizes F1.
],
)
def parse_args():
parser = argparse.ArgumentParser("Official evaluation script for single language MKQA.")
parser.add_argument(
"-a",
"--annotation_file",
metavar="mkqa.jsonl.gz",
required=True,
help="Input annotations MKQA JSON Lines gzip file.",
)
parser.add_argument(
"-p",
"--predictions_file",
metavar="preds.json",
required=True,
help="Model predictions json line file",
)
parser.add_argument(
"-l",
"--language",
required=True,
choices=MKQA_LANGUAGES,
help=f"Language code. Accepts any of: {MKQA_LANGUAGES}",
)
parser.add_argument(
"-o",
"--out-dir",
metavar="results/",
help="Write accuracy metrics to file (default is stdout).",
)
parser.add_argument("-v", "--verbose", action="store_true")
parser.add_argument("--print_metrics", "-m", action="store_true", default=True)
if len(sys.argv) == 1:
parser.print_help()
sys.exit(1)
return parser.parse_args()
def read_annotations(gold_path: str) -> Dict[str, Any]:
"""Read mkqa gold annotation for all languages
Args:
gold_path: path to mkqa annotation file, such as mkqa.jsonl.gz
Returns:
A mapping from example id to MKQAAnnotation for all languages
"""
assert os.path.exists(gold_path)
all_gold_annotations = collections.defaultdict(dict)
gzipped_input_file = open(gold_path, "rb")
with GzipFile(fileobj=gzipped_input_file) as input_file:
for line in input_file:
example = json.loads(line)
for language in MKQA_LANGUAGES:
valid_answers, answer_types = [], []
for answer in example["answers"][language]:
# Binary (Yes/No) answer text is always "yes" / "no"
# If answer['text'] is None, the empty string "" represents No Answer
valid_answers.append(answer["text"] or "")
valid_answers.extend(answer.get("aliases", []))
answer_types.append(answer["type"])
annotation = MKQAAnnotation(
example_id=str(example["example_id"]),
types=list(set(answer_types)),
answers=list(set(valid_answers)),
)
all_gold_annotations[language][annotation.example_id] = annotation
for lang, annotations in all_gold_annotations.items():
if len(annotations) != 10000:
logging.warning(
f"The annotations file you've provided contains {len(annotations)} examples for language {lang}, where 10000 are expected."
)
return all_gold_annotations
def read_predictions(predictions_path: str) -> Dict[str, MKQAPrediction]:
"""Read model prediction json file
Args:
predictions_path: path to prediction json line file
{
"example_id": <example id>,
"prediction": <prediction text>,
"binary_answer": <"yes", "no", "">,
"no_answer_prob": <prob>
}
Returns:
A mapping from example id to MKQAPrediction
"""
assert os.path.exists(predictions_path)
logging.info(f"Reading predictions from file: {predictions_path}")
with open(predictions_path, "r") as f:
predictions = [json.loads(l) for l in f.readlines()]
predict_labels = {}
for prediction in predictions:
binary_answer = prediction["binary_answer"].lower() if prediction["binary_answer"] else None
assert binary_answer in {
"yes",
"no",
None,
}, f"Binary prediction can only be yes, no, or None. Provided answer was {binary_answer}"
predict_label = MKQAPrediction(
example_id=str(prediction["example_id"]),
prediction=prediction["prediction"] or "",
binary_answer=binary_answer,
no_answer_prob=prediction.get("no_answer_prob", 0),
)
predict_labels[predict_label.example_id] = predict_label
return predict_labels
def compute_mkqa_scores_for_language(
predictions: Dict[str, MKQAPrediction],
gold_annotations: Dict[str, MKQAAnnotation],
language: str,
) -> (Dict[str, float], Dict[str, float]):
"""
Compute Exact Match and token overlap F1 scores per answer, similar to SQuAD
Args:
predictions: mapping from example id to MKQAPrediction
gold_annotations: mapping from example id to MKQAAnnotation
language: evaluation language in MKQA_LANGUAGES
Returns:
Squad like em and f1
"""
predict_texts, answer_texts = [], []
for example_id, gold_annotation in gold_annotations.items():
assert example_id in predictions, f"{example_id} is missing from prediction"
predict_texts.append(
predictions[example_id].binary_answer or predictions[example_id].prediction
)
answer_texts.append(gold_annotation.answers)
text_metrics = eval_util.get_text_metrics(predict_texts, answer_texts, language)
f1_scores = {example_id: text_metrics["f1"][i] for i, example_id in enumerate(gold_annotations)}
em_scores = {
example_id: text_metrics["exact_match"][i] for i, example_id in enumerate(gold_annotations)
}
return em_scores, f1_scores
def compute_best_threshold(
predictions: Dict[str, MKQAPrediction],
raw_em: Dict[str, float],
raw_f1: Dict[str, float],
no_answer_probs: Dict[str, float],
qid_is_answerable: Dict[str, bool],
) -> Dict[str, float]:
"""Compute the averaged text metrics at the best threshold chosen to maximize F1.
The threshold is varied over No Answer probabilities as provided in the predictions file.
Args:
predictions: mapping from example id to MKQAPrediction
raw_em: mapping from example id to em
raw_f1: mapping from example id to f1
no_answer_probs: mapping from example id to no_answer_probs
qid_is_answerable: mapping from example id to boolean indicator for whether the example is answerable
Returns:
text metrics at the best threshold of f1
"""
id_to_predtext = {exid: p.binary_answer or p.prediction for exid, p in predictions.items()}
ans_em_scores = {qid: raw_em[qid] for qid in raw_em if qid_is_answerable[qid]}
ans_f1_scores = {qid: raw_f1[qid] for qid in raw_f1 if qid_is_answerable[qid]}
unans_em_scores = {qid: raw_em[qid] for qid in raw_em if not qid_is_answerable[qid]}
best_scores = eval_util.compute_best_score_and_threshold(
id_to_predtext, raw_f1, no_answer_probs, qid_is_answerable
)
best_f1, f1_threshold = best_scores["best_score"], best_scores["best_threshold"]
best_em_by_id = eval_util.apply_no_answer_threshold(
raw_em, no_answer_probs, qid_is_answerable, f1_threshold
)
best_answerable_em_by_id = eval_util.apply_no_answer_threshold(
ans_em_scores, no_answer_probs, qid_is_answerable, f1_threshold
)
best_answerable_f1_by_id = eval_util.apply_no_answer_threshold(
ans_f1_scores, no_answer_probs, qid_is_answerable, f1_threshold
)
best_unanswerable_em_by_id = eval_util.apply_no_answer_threshold(
unans_em_scores, no_answer_probs, qid_is_answerable, f1_threshold
)
return {
"best_em": round(100.0 * np.mean(list(best_em_by_id.values())), 2),
"best_f1": round(best_f1, 2),
"best_answerable_em": round(100.0 * np.mean(list(best_answerable_em_by_id.values())), 2),
"best_answerable_f1": round(100.0 * np.mean(list(best_answerable_f1_by_id.values())), 2),
"best_unanswerable_em": round(
100.0 * np.mean(list(best_unanswerable_em_by_id.values())), 2
),
"best_f1_threshold": round(f1_threshold, 2),
}
def evaluate(
annotations: Dict[str, MKQAAnnotation],
predictions: Dict[str, MKQAPrediction],
language: str,
out_dir: Optional[str] = None,
verbose: bool = False,
print_metrics: bool = True,
) -> Dict[str, Any]:
"""Evaluates predictions on the gold answers for the specified `language`.
Args:
annotations: a mapping from example id to corresponding MKQAAnnotation
predictions: a mapping from example id to corresponding MKQAPrediction
language: language code in MKQA_LANGUAGES
out_dir: (Optional) Saves evaluation results into this directory.
f1_plot.png: comparing answerable, unanswerable, and overall f1 across all thresholds
metrics.json: reports best_em, best_f1, best_answerable_em, best_answerable_f1, best_unanswerable_em, and best_f1_threshold
na_prob_hist_hasAns.png: histogram of no answer probability for answerable questions
na_prob_hist_noAns.png: histogram of no answer probability for unanswerable questions
print_metrics: (Optional) Print metrics to console
Returns:
A dictionary of metrics and individual f1 and em scores
"""
# Argument validation
assert language in MKQA_LANGUAGES
raw_em_scores, raw_f1_scores = compute_mkqa_scores_for_language(
predictions, annotations, language=language
)
qid_is_answerable = {
ex_id: bool(annotation.answers != [""]) for ex_id, annotation in annotations.items()
}
# Find best thresholds
no_answer_probs = {ex_id: p.no_answer_prob for ex_id, p in predictions.items()}
metrics = compute_best_threshold(
predictions, raw_em_scores, raw_f1_scores, no_answer_probs, qid_is_answerable
)
if out_dir:
if not os.path.exists(out_dir):
os.makedirs(out_dir, exist_ok=True)
has_ans_qids = [ex_id for ex_id in qid_is_answerable if qid_is_answerable[ex_id]]
no_ans_qids = [ex_id for ex_id in qid_is_answerable if not qid_is_answerable[ex_id]]
if has_ans_qids:
eval_util.plot_na_prob_histogram(no_answer_probs, has_ans_qids, out_dir, "hasAns")
if no_ans_qids:
eval_util.plot_na_prob_histogram(no_answer_probs, no_ans_qids, out_dir, "noAns")
# plot the answerable, unanswerable and overall f1 curve
ans_f1_scores = {qid: raw_f1_scores[qid] for qid in raw_f1_scores if qid_is_answerable[qid]}
unans_em_scores = {
qid: raw_em_scores[qid] for qid in raw_em_scores if not qid_is_answerable[qid]
}
eval_util.plot_f1(
ans_f1_scores, unans_em_scores, no_answer_probs, qid_is_answerable, out_dir
)
if verbose:
default_metrics = eval_util.summarize_default_metrics(
raw_em_scores, raw_f1_scores, qid_is_answerable, metrics,
)
metrics.update(default_metrics)
if print_metrics:
print(json.dumps(metrics, indent=4))
if out_dir:
with open(os.path.join(out_dir, "metrics.json"), "w") as f:
f.write(json.dumps(metrics, indent=4))
return metrics
if __name__ == "__main__":
args = parse_args()
annotations = read_annotations(args.annotation_file)[args.language]
predictions = read_predictions(args.predictions_file)
evaluate(
annotations, predictions, args.language, args.out_dir, args.verbose,
)
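For intuition, the no-answer threshold sweep that `evaluate` runs (via `compute_best_threshold`) can be sketched self-containedly. `best_threshold` and the toy dicts below are illustrative names, not part of the repo, and the sketch assumes every emitted prediction string is non-empty (the real sweep additionally treats an empty prediction on an unanswerable question as a no-op):

```python
def best_threshold(scores, na_probs, has_answer):
    # Start at threshold 0: every example is predicted No Answer, so the
    # score equals the number of unanswerable questions.
    best = current = sum(1 for qid in has_answer if not has_answer[qid])
    best_t = 0.0
    # Walk examples from most to least confident that an answer exists.
    for qid in sorted(na_probs, key=na_probs.get):
        if has_answer[qid]:
            current += scores[qid]  # now emitting this answer: earn its EM/F1
        else:
            current -= 1            # gold is No Answer but we answer: lose 1
        if current > best:
            best, best_t = current, na_probs[qid]
    return 100.0 * best / len(scores), best_t

scores = {"a": 1.0, "b": 0.0, "c": 0.0}          # per-example F1; "c" is unanswerable
na_probs = {"a": 0.1, "b": 0.5, "c": 0.9}
has_answer = {"a": True, "b": True, "c": False}
print(best_threshold(scores, na_probs, has_answer))  # best score ~66.67 at threshold 0.1
```

Answering "a" gains 1.0, answering "b" gains nothing, and answering the unanswerable "c" costs 1, so the sweep settles on the threshold just above "a"'s no-answer probability.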
================================================
FILE: mkqa_eval_all_languages.py
================================================
import argparse
import logging
import os
import sys
from collections import defaultdict
from multiprocessing.pool import Pool
from typing import Optional, Dict
import pandas as pd
from tabulate import tabulate
import mkqa_eval
import mkqa_eval_util as eval_util
def parse_args():
parser = argparse.ArgumentParser("Official evaluation script for all MKQA languages.")
parser.add_argument(
"-a",
"--annotation_file",
metavar="mkqa.jsonl.gz",
required=True,
help="Input annotations MKQA JSON Lines gzip file.",
)
parser.add_argument(
"-p",
"--predictions_dir",
metavar="preds/",
required=True,
help="Model predictions for each language",
)
parser.add_argument(
"-o",
"--out-dir",
metavar="results/",
help="Write evaluation metrics to files (default is stdout).",
)
parser.add_argument("--verbose", "-v", action="store_true")
if len(sys.argv) == 1:
parser.print_help()
sys.exit(1)
return parser.parse_args()
def read_prediction_dir(predictions_dir: str) -> Dict[str, Dict[str, mkqa_eval.MKQAPrediction]]:
"""Read a directory that contains predictions for all languages
Args:
predictions_dir: a directory that contains predictions for all languages
Returns:
A mapping from language code to its predictions (example id -> MKQAPrediction)
"""
assert os.path.exists(predictions_dir)
all_language_predictions = {}
for language in mkqa_eval.MKQA_LANGUAGES:
prediction_file = os.path.join(predictions_dir, f"{language}.jsonl")
if not os.path.exists(prediction_file):
logging.info(
f"WARNING: Missing predictions file for language `{language}`. Expecting file `{prediction_file}`"
)
continue
all_language_predictions[language] = mkqa_eval.read_predictions(prediction_file)
return all_language_predictions
def evaluate_predictions(
annotations: Dict[str, mkqa_eval.MKQAAnnotation],
predictions: Dict[str, mkqa_eval.MKQAPrediction],
language: str,
out_dir: str = None,
verbose: bool = False,
) -> Dict[str, Dict[str, float]]:
"""Evaluate predictions for one language from its prediction file and MKQAAnnotations
Args:
annotations: a mapping from example id to corresponding MKQAAnnotation
predictions: a mapping from example id to corresponding MKQAPrediction
language: language code in MKQA_LANGUAGES
out_dir: (Optional) Saves evaluation results into this directory.
verbose: (Optional) Collect additional metrics
Returns:
A dictionary of metrics for the specified language
"""
metrics_by_language = {}
metrics = mkqa_eval.evaluate(
annotations=annotations,
predictions=predictions,
language=language,
out_dir=os.path.join(out_dir, f"{language}_results/") if out_dir else None,
verbose=verbose,
print_metrics=False,
)
metrics_by_language[language] = metrics
return metrics_by_language
def evaluate_all_languages(
annotations_path: str,
all_language_predictions: Dict[str, Dict[str, mkqa_eval.MKQAPrediction]],
out_dir: Optional[str] = None,
verbose: bool = False,
serialize: bool = False,
):
"""Evaluate all predictions for all supported languages
Args:
annotations_path: annotations MKQA JSON Lines gzip file.
all_language_predictions: a mapping from language code to its predictions (example id -> MKQAPrediction)
out_dir: (Optional) saves evaluation results into this directory.
verbose: (Optional) print out additional metrics
serialize: (Optional) run evaluation serially instead of in parallel
"""
assert os.path.exists(annotations_path), "Missing MKQA annotation file"
if out_dir and not os.path.exists(out_dir):
os.makedirs(out_dir, exist_ok=True)
all_annotations = mkqa_eval.read_annotations(annotations_path)
metrics_by_language = {}
if not serialize:
with Pool() as pool:
values = pool.starmap(
evaluate_predictions,
[
(
all_annotations[language],
all_language_predictions[language],
language,
out_dir,
verbose,
)
for language in all_language_predictions.keys()
],
)
else:
values = []
for language in all_language_predictions.keys():
values.append(
evaluate_predictions(
all_annotations[language],
all_language_predictions[language],
language,
out_dir,
verbose,
)
)
for value in values:
metrics_by_language.update(value)
# Here we compute the macro average over the languages for which prediction files were supplied.
# The official macro-average requires all 26 prediction files to be included.
metrics_by_language["Macro Average"] = eval_util.aggregate_summaries(
list(metrics_by_language.values())
)
logging.info("Metrics by languages:")
metrics_summary = defaultdict(list)
for language, metrics in metrics_by_language.items():
metrics_summary["language"].append(language)
for key, value in metrics.items():
metrics_summary[key].append(value)
metrics_df = pd.DataFrame(metrics_summary)
print(tabulate(metrics_df, headers="keys", tablefmt="psql", showindex=False))
if out_dir:
metrics_df.to_json(os.path.join(out_dir, "metrics.json"))
if __name__ == "__main__":
args = parse_args()
all_language_predictions = read_prediction_dir(args.predictions_dir)
evaluate_all_languages(
args.annotation_file, all_language_predictions, args.out_dir, args.verbose,
)
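The "Macro Average" row is a plain per-metric mean over whichever languages were evaluated. A minimal pure-Python sketch (`macro_average` is a hypothetical name; the repo's `aggregate_summaries` in mkqa_eval_util.py does the same with numpy):

```python
from collections import defaultdict

def macro_average(per_language_metrics):
    """Per-key mean across languages, rounded to two decimals."""
    values_by_key = defaultdict(list)
    for metrics in per_language_metrics:
        for key, value in metrics.items():
            values_by_key[key].append(value)
    return {key: round(sum(vals) / len(vals), 2) for key, vals in values_by_key.items()}

print(macro_average([{"f1": 60.0, "exact_match": 50.0},
                     {"f1": 40.0, "exact_match": 30.0}]))
# {'f1': 50.0, 'exact_match': 40.0}
```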
================================================
FILE: mkqa_eval_util.py
================================================
import collections
import os
import re
import string
from collections import Counter, OrderedDict
from multiprocessing import Pool
from typing import Dict, List, Optional
import matplotlib
import matplotlib.pyplot as plt
import numpy as np
matplotlib.use("Agg")
MIXED_SEGMENTATION_LANGS = ["zh_cn", "zh_hk", "zh_tw", "ja", "th", "km"]
ARTICLE_REGEX_BY_LANG = {
"en": r"\b(a|an|the)\b",
"es": r"\b(un|una|unos|unas|el|la|los|las)\b",
"vi": r"\b(của|là|cái|chiếc|những)\b",
"de": r"\b(ein|eine|einen|einem|eines|einer|der|die|das|den|dem|des)\b",
"ar": "\sال^|ال",
"nl": r"\b(de|het|een|des|der|den)\b",
"sv": r"\b(en|ett)\b",
"da": r"\b(en|et)\b",
"no": r"\b(en|et|ei)\b",
"fr": r"\b(le|la|l'|les|du|de|d'|des|un|une|des)",
"pt": r"\b(o|a|os|as|um|uma|uns|umas)\b",
"it": r"\b(il|lo|la|l'|i|gli|le|del|dello|della|dell'|dei|degli|degl'|delle|un'|uno|una|un)",
"fi": r"\b(se|yks|yksi)\b",
"hu": r"\b(a|az|egy)\b",
}
def map_em_value(prediction, gold_answers, lang):
em_value = compute_max_score_over_answers(calculate_em, prediction, gold_answers, lang)
return float(em_value)
def map_f1_value(prediction, gold_answers, lang):
f1_value = compute_max_score_over_answers(calculate_f1, prediction, gold_answers, lang)
return float(f1_value)
def get_text_metrics(
predictions: List[str], gold_answers: List[List[str]], lang: str, serial=True, workers=None
) -> Dict[str, List[float]]:
"""Compute metrics from the predicted and answer texts."""
if serial:
f1_scores = [
map_f1_value(predictions[i], gold_answers[i], lang) for i in range(len(predictions))
]
em_scores = [
map_em_value(predictions[i], gold_answers[i], lang) for i in range(len(predictions))
]
else:
with Pool(workers) as p:
f1_scores = p.starmap(
map_f1_value,
[(predictions[i], gold_answers[i], lang) for i in range(len(predictions))],
chunksize=64,
)
em_scores = p.starmap(
map_em_value,
[(predictions[i], gold_answers[i], lang) for i in range(len(predictions))],
chunksize=64,
)
return {"f1": f1_scores, "exact_match": em_scores}
def summarize_default_metrics(
em_scores, f1_scores, qid_is_answerable, metrics: Optional[Dict[str, float]] = None,
):
"""Summarize EM and F1 based on default threshold"""
assert set(em_scores.keys()) == set(f1_scores.keys()) == set(qid_is_answerable.keys())
ans_em_scores = {qid: em_scores[qid] for qid in em_scores if qid_is_answerable[qid]}
ans_f1_scores = {qid: f1_scores[qid] for qid in f1_scores if qid_is_answerable[qid]}
unans_em_scores = {qid: em_scores[qid] for qid in em_scores if not qid_is_answerable[qid]}
summary = OrderedDict(
[
("exact_match", round(100.0 * np.mean(list(em_scores.values())), 2)),
("f1", round(100.0 * np.mean(list(f1_scores.values())), 2)),
("answerable_exact_match", round(100.0 * np.mean(list(ans_em_scores.values())), 2)),
("answerable_f1", round(100.0 * np.mean(list(ans_f1_scores.values())), 2)),
("unanswerable_exact_match", round(100.0 * np.mean(list(unans_em_scores.values())), 2)),
]
)
if metrics:
metrics.update(summary)
return summary
def aggregate_summaries(dicts):
summaries = collections.defaultdict(list)
for d in dicts:
for k, v in d.items():
assert isinstance(v, float) or isinstance(v, int)
summaries[k].append(v)
results = {}
for k, v in summaries.items():
results[k] = round(float(np.mean(v)), 2)
return results
def whitespace_tokenize(text):
return text.split()
def mixed_segmentation(text):
segs_out = []
temp_str = ""
for char in text:
if temp_str != "":
ss = whitespace_tokenize(temp_str)
segs_out.extend(ss)
temp_str = ""
segs_out.append(char)
if temp_str != "":
ss = whitespace_tokenize(temp_str)
segs_out.extend(ss)
return segs_out
def normalize_answer_by_language(s, lang):
"""Lower text, remove punctuation, articles and extra whitespace.
This function is customized by language.
"""
def remove_articles(text, lang):
article_regex = ARTICLE_REGEX_BY_LANG.get(lang)
if article_regex:
return re.sub(article_regex, " ", text)
else:
return text
def white_space_fix(text, lang):
if lang in MIXED_SEGMENTATION_LANGS:
tokens = mixed_segmentation(text)
else:
tokens = whitespace_tokenize(text)
return " ".join([t for t in tokens if t.strip() != ""])
def remove_punc(text):
exclude = set(string.punctuation)
return "".join(ch for ch in text if ch not in exclude)
def lower(text):
return text.lower()
return white_space_fix(remove_articles(remove_punc(lower(s)), lang), lang)
def plot_f1(answerable_f1_by_id, unanswerable_em_by_id, na_probs_by_id, qid_to_has_ans, image_dir):
num_no_ans = sum(1 for k in qid_to_has_ans if not qid_to_has_ans[k])
qid_list = sorted(na_probs_by_id, key=lambda k: na_probs_by_id[k])
question_counts = len(qid_list)
answerable_f1 = []
overall_f1 = []
unanswerable_em = []
thresholds = []
sum_answerable_f1 = 0
sum_unanswerable_em = num_no_ans
for i, qid in enumerate(qid_list):
thresholds.append(na_probs_by_id[qid])
if qid in answerable_f1_by_id:
sum_answerable_f1 += answerable_f1_by_id[qid]
elif qid in unanswerable_em_by_id:
sum_unanswerable_em += unanswerable_em_by_id[qid] - 1
else:
raise ValueError(f"{qid} is not in either answerable or unanswerable predictions")
answerable_f1.append(sum_answerable_f1 / question_counts)
unanswerable_em.append(sum_unanswerable_em / question_counts)
overall_f1.append((sum_answerable_f1 + sum_unanswerable_em) / question_counts)
plt.plot(thresholds, answerable_f1, color="green", label="Answerable F1")
plt.plot(thresholds, unanswerable_em, color="red", label="Unanswerable F1")
plt.plot(thresholds, overall_f1, color="blue", label="Overall F1")
plt.legend()
plt.xlabel("No Answer Threshold")
plt.ylabel("F1")
plt.title("F1 plot for different answer types")
plt.savefig(os.path.join(image_dir, "f1_plot.png"))
plt.clf()
def calculate_em(prediction, gold_answer, language):
norm_pred = normalize_answer_by_language(prediction, language)
norm_answer = normalize_answer_by_language(gold_answer, language)
return int(norm_pred == norm_answer)
def calculate_f1(prediction, gold_answer, language):
gold_toks = normalize_answer_by_language(gold_answer, language).split() if gold_answer else []
pred_toks = normalize_answer_by_language(prediction, language).split() if prediction else []
common = Counter(gold_toks) & Counter(pred_toks)
num_common = sum(common.values())
if len(gold_toks) == 0 or len(pred_toks) == 0:
# If the prediction or gold_answer is No Answer, then F1 is 1 if they agree, 0 otherwise
return int(gold_toks == pred_toks)
if num_common == 0:
return 0.0
recall = 1.0 * num_common / len(gold_toks)
precision = 1.0 * num_common / len(pred_toks)
return (2.0 * precision * recall) / (precision + recall)
def compute_max_score_over_answers(metric_fn, prediction, ground_truths, language):
assert len(ground_truths) > 0, "Ground truth answers list should never be empty."
scores_by_answer = [
metric_fn(prediction, ground_truth, language) for ground_truth in ground_truths
]
return max(scores_by_answer)
def compute_best_score_and_threshold(
predictions, scores, no_answer_probs, qid_has_answer
) -> Dict[str, float]:
# Begin at threshold of 0, where all predictions are No Answer.
best_threshold = 0.0
current_score = best_score = sum(1 for k in qid_has_answer if not qid_has_answer[k])
exs_sorted_by_na_prob = sorted(no_answer_probs, key=lambda k: no_answer_probs[k])
for qid in exs_sorted_by_na_prob:
if qid_has_answer[qid]: # Gold truth is an answer, and we predict an answer
score_diff = scores[qid]
elif predictions[qid]: # Gold truth is No Answer, but we predict an answer
score_diff = -1
else: # Gold truth and prediction are both No Answer
score_diff = 0
current_score += score_diff
# Update best score and threshold if new max value
if current_score > best_score:
best_threshold = no_answer_probs[qid]
best_score = current_score
return {
"best_score": 100.0 * best_score / len(scores),
"best_threshold": best_threshold,
}
def apply_no_answer_threshold(scores, no_answer_probs, qid_has_answer, no_answer_thresh):
new_scores = {}
for qid, s in scores.items():
pred_no_answer = no_answer_probs[qid] > no_answer_thresh
new_scores[qid] = float(not qid_has_answer[qid]) if pred_no_answer else s
return new_scores
def plot_na_prob_histogram(no_answer_probs, qid_list, outdir, name):
x = [no_answer_probs[k] for k in qid_list]
weights = np.ones_like(x) / float(len(x))
plt.hist(x, weights=weights, bins=20, range=(0.0, 1.0))
plt.xlabel("No Answer Probability")
plt.ylabel("Proportion of Dataset")
plt.title(f"No Answer Probability Histogram: {name}")
plt.savefig(os.path.join(outdir, f"na_prob_histogram_{name}.png"))
plt.clf()
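For intuition, the token-overlap F1 in `calculate_f1` can be exercised in isolation. `token_f1` below is a hypothetical, self-contained sketch that skips `normalize_answer_by_language` and assumes already-normalized English input:

```python
from collections import Counter

def token_f1(prediction: str, gold: str) -> float:
    """Bag-of-tokens F1, mirroring calculate_f1 for pre-normalized text."""
    gold_toks = gold.split()
    pred_toks = prediction.split()
    common = Counter(gold_toks) & Counter(pred_toks)
    num_common = sum(common.values())
    if not gold_toks or not pred_toks:
        # Empty string means No Answer: F1 is 1 only if both sides agree.
        return float(gold_toks == pred_toks)
    if num_common == 0:
        return 0.0
    recall = num_common / len(gold_toks)
    precision = num_common / len(pred_toks)
    return 2.0 * precision * recall / (precision + recall)

print(token_f1("april 4 1973", "4 april 1973"))  # 1.0 -- order-insensitive
print(token_f1("", ""))                          # 1.0 -- both No Answer
print(token_f1("rome", "paris"))                 # 0.0 -- no overlap
```

Because the score compares token multisets, word order never matters, which is why EM and F1 can differ substantially on longer span answers.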
================================================
FILE: requirements.txt
================================================
matplotlib
numpy
pandas
pytest
tabulate
================================================
FILE: sample_predictions/en.jsonl
================================================
{"example_id": -8817357831042426028, "prediction": "365.256 days", "binary_answer": null, "no_answer_prob": -10.777592301368713}
{"example_id": 1326178481868679533, "prediction": "50 metres ( 164.0 ft ) in length", "binary_answer": null, "no_answer_prob": -4.017424821853638}
{"example_id": 5156881367671150240, "prediction": "April 4 , 1973", "binary_answer": null, "no_answer_prob": -9.927809596061707}
{"example_id": 7896399267156135965, "prediction": "Konrad Zuse", "binary_answer": null, "no_answer_prob": -10.966487884521484}
{"example_id": -1865341688150735283, "prediction": "The Matrix Reloaded and The Matrix Revolutions", "binary_answer": null, "no_answer_prob": -5.570307493209839}
{"example_id": 2544891513471080945, "prediction": "Storm botnet or Storm worm botnet", "binary_answer": null, "no_answer_prob": -8.836552143096924}
{"example_id": 5977990164505192934, "prediction": "Channel 4 in the UK", "binary_answer": null, "no_answer_prob": 1.809189796447754}
{"example_id": 4878511094358831450, "prediction": "2010", "binary_answer": null, "no_answer_prob": -0.22725772857666016}
{"example_id": -2647017312134397209, "prediction": "price discrimination between different purchasers", "binary_answer": null, "no_answer_prob": 0.5896624326705933}
{"example_id": -8408180440642663270, "prediction": "Pechoggin , Wisconsin", "binary_answer": null, "no_answer_prob": -9.765551328659058}
{"example_id": 4867617541405387846, "prediction": "", "binary_answer": null, "no_answer_prob": -3.2042319774627686}
{"example_id": 1106823340405298621, "prediction": "Simba", "binary_answer": null, "no_answer_prob": -0.5248816013336182}
{"example_id": -8073311805426057133, "prediction": "Marvin Gaye", "binary_answer": null, "no_answer_prob": -6.214727759361267}
{"example_id": 7356818995600641236, "prediction": "Commonly , 10 electrodes attached to the body are used to form 12 ECG leads , with each lead measuring a specific electrical potential difference", "binary_answer": null, "no_answer_prob": -0.3639945983886719}
{"example_id": 3536060843224553263, "prediction": "Bride and Prejudice", "binary_answer": null, "no_answer_prob": 1.6413865089416504}
{"example_id": 8744169892717093640, "prediction": "VO is measured in METs ( mL / kg / min )", "binary_answer": null, "no_answer_prob": -3.0590415000915527}
{"example_id": 1368063860101804235, "prediction": "Kanye West", "binary_answer": null, "no_answer_prob": -10.459341287612915}
{"example_id": 2595603813477692152, "prediction": "Since the 2000 United States presidential election , red states and blue states", "binary_answer": null, "no_answer_prob": -0.9092601537704468}
{"example_id": -8241537849833386136, "prediction": "nearly 2,000 metres ( 6,600 ft )", "binary_answer": null, "no_answer_prob": -6.374263525009155}
{"example_id": 3802206105778993088, "prediction": "John Delaney", "binary_answer": null, "no_answer_prob": -3.567781448364258}
{"example_id": 3824583619836343232, "prediction": "Bruce Smith", "binary_answer": null, "no_answer_prob": -7.4136704206466675}
{"example_id": -6681703664706904328, "prediction": "28 June 2018", "binary_answer": null, "no_answer_prob": -3.5337600708007812}
{"example_id": 2436049466006768227, "prediction": "Abingdon", "binary_answer": null, "no_answer_prob": -0.20542311668395996}
{"example_id": 6908191510823535629, "prediction": "Europe , the east by Asia", "binary_answer": null, "no_answer_prob": -4.517100095748901}
{"example_id": -4989821726385625117, "prediction": "14 - year terms", "binary_answer": null, "no_answer_prob": -5.156226396560669}
{"example_id": 4655668754667193182, "prediction": "`` Cups ''", "binary_answer": null, "no_answer_prob": -1.2245482206344604}
{"example_id": -4472922687466187846, "prediction": "the atmospheric circulation", "binary_answer": null, "no_answer_prob": -6.8067467212677}
{"example_id": -7336205338592575234, "prediction": "over 600", "binary_answer": null, "no_answer_prob": -7.576993823051453}
{"example_id": -6446069137791333046, "prediction": "Nancy Sinatra", "binary_answer": null, "no_answer_prob": -11.489583611488342}
{"example_id": -8094650067529620665, "prediction": "Selene", "binary_answer": null, "no_answer_prob": -13.083579659461975}
{"example_id": -3307181467072417769, "prediction": "Jackson , Georgia", "binary_answer": null, "no_answer_prob": -6.10265839099884}
{"example_id": 8783251888656509590, "prediction": "January 30 , 2000", "binary_answer": null, "no_answer_prob": -10.873160719871521}
{"example_id": -5967027131695331627, "prediction": "Badfinger", "binary_answer": null, "no_answer_prob": -6.614325523376465}
{"example_id": -7032762580920629255, "prediction": "", "binary_answer": null, "no_answer_prob": 6.1883755922317505}
{"example_id": -42399359865985476, "prediction": "Carson Palmer", "binary_answer": null, "no_answer_prob": -2.9675240516662598}
{"example_id": -2339248357324322528, "prediction": "1993", "binary_answer": null, "no_answer_prob": -5.413975715637207}
{"example_id": -3043010306828964586, "prediction": "the early days", "binary_answer": null, "no_answer_prob": -3.2599759101867676}
{"example_id": -5963759296990135581, "prediction": "Rome", "binary_answer": null, "no_answer_prob": -0.8314799070358276}
{"example_id": -2950768241264673239, "prediction": "Meg Ryan", "binary_answer": null, "no_answer_prob": -2.8856048583984375}
{"example_id": 7555434799084035943, "prediction": "0.08 percent or greater", "binary_answer": null, "no_answer_prob": -4.5200560092926025}
{"example_id": 4569540921014077677, "prediction": "1,500 miles ( 2,400 km )", "binary_answer": null, "no_answer_prob": -4.185544967651367}
{"example_id": 3260947583426165221, "prediction": "January 9 , 2000", "binary_answer": null, "no_answer_prob": -10.908682942390442}
{"example_id": 7618903265124987803, "prediction": "1939", "binary_answer": null, "no_answer_prob": -9.261886358261108}
{"example_id": 2258999864447164929, "prediction": "1964", "binary_answer": null, "no_answer_prob": -7.434452176094055}
{"example_id": 7268042447892190728, "prediction": "10 June 1982", "binary_answer": null, "no_answer_prob": -7.023532450199127}
{"example_id": 1870808988220094780, "prediction": "Emmitt Smith was the celebrity winner from this season .", "binary_answer": null, "no_answer_prob": -1.1643218994140625}
{"example_id": -3583320020709637733, "prediction": "June 15 , 1994", "binary_answer": null, "no_answer_prob": -10.562168598175049}
{"example_id": 6833767885322192424, "prediction": "`` You Ca n't Turn Me Off ( In the Middle of Turning Me On ) ''", "binary_answer": null, "no_answer_prob": -1.4581060409545898}
{"example_id": -5299283512151753462, "prediction": "Queen", "binary_answer": null, "no_answer_prob": -10.51740825176239}
{"example_id": -165974438696878500, "prediction": "five digits", "binary_answer": null, "no_answer_prob": -2.144167184829712}
{"example_id": -3401663362680488524, "prediction": "John James `` Jimbo '' Fisher Jr.", "binary_answer": null, "no_answer_prob": -11.888376116752625}
{"example_id": -2226448970074813724, "prediction": "Camila Cabello", "binary_answer": null, "no_answer_prob": -11.997050285339355}
{"example_id": -1281470309241383171, "prediction": "ecliptic", "binary_answer": null, "no_answer_prob": -8.354366779327393}
{"example_id": 512942171744482497, "prediction": "Government of India", "binary_answer": null, "no_answer_prob": -5.837420463562012}
{"example_id": -613065524168931491, "prediction": "2001", "binary_answer": null, "no_answer_prob": -10.886024117469788}
{"example_id": -2516801627420415432, "prediction": "Dick Grayson", "binary_answer": null, "no_answer_prob": 0.012103915214538574}
{"example_id": 3385484593962180700, "prediction": "", "binary_answer": null, "no_answer_prob": 5.252111911773682}
{"example_id": 1114798411474593640, "prediction": "Dodger Stadium in Los Angeles , California", "binary_answer": null, "no_answer_prob": -1.6753473281860352}
{"example_id": 3565009647830244983, "prediction": "the mujahideen", "binary_answer": null, "no_answer_prob": -2.123460292816162}
{"example_id": -2119017604689021198, "prediction": "1984 and 1998", "binary_answer": null, "no_answer_prob": -8.748164616525173}
{"example_id": 4318168127658657527, "prediction": "", "binary_answer": null, "no_answer_prob": -4.61332643032074}
{"example_id": 163812861735992154, "prediction": "Roger Cook and Sam Hogin", "binary_answer": null, "no_answer_prob": -11.74139153957367}
{"example_id": -5573357739146589469, "prediction": "Albert Studios , Sydney , Australia", "binary_answer": null, "no_answer_prob": -1.0216747522354126}
{"example_id": 5164044901055056385, "prediction": "over 200 mph ( 320 km / h )", "binary_answer": null, "no_answer_prob": -8.440218687057495}
{"example_id": 2699591821777367081, "prediction": "Dodge Charger", "binary_answer": null, "no_answer_prob": -8.828735709190369}
{"example_id": 6199305145750841239, "prediction": "Hemant Brijwasi", "binary_answer": null, "no_answer_prob": -8.343450546264648}
{"example_id": -135263648138393062, "prediction": "Jay Leno", "binary_answer": null, "no_answer_prob": -6.444971084594727}
{"example_id": -3361972665853010476, "prediction": "Star Wars : The Last Jedi", "binary_answer": null, "no_answer_prob": -3.694060802459717}
{"example_id": 585529813577875741, "prediction": "1870", "binary_answer": null, "no_answer_prob": -7.205366373062134}
{"example_id": 4904548508353963435, "prediction": "Law enforcement in Italy is an exclusive function of the State", "binary_answer": null, "no_answer_prob": -2.3438953161239624}
{"example_id": 1412542317709795470, "prediction": "Minneapolis", "binary_answer": null, "no_answer_prob": -8.552934169769287}
{"example_id": -6184574948788644454, "prediction": "Brock Lovett", "binary_answer": null, "no_answer_prob": -6.779603481292725}
{"example_id": -7842349076461411609, "prediction": "philosophes", "binary_answer": null, "no_answer_prob": -3.9286093711853027}
{"example_id": -1218306223522553377, "prediction": "During May , June , and July , the Northern Hemisphere is exposed to more direct sunlight because the hemisphere faces the Sun", "binary_answer": null, "no_answer_prob": -3.434272289276123}
{"example_id": -8397651993627578981, "prediction": "In a jury trial , a jury", "binary_answer": null, "no_answer_prob": -4.520658612251282}
{"example_id": -9081296789891648512, "prediction": "", "binary_answer": null, "no_answer_prob": 3.1034047603607178}
{"example_id": -7757189852370426684, "prediction": "New York New York", "binary_answer": null, "no_answer_prob": -6.812511444091797}
{"example_id": 8045845876003268145, "prediction": "Sean Maguire", "binary_answer": null, "no_answer_prob": -10.791131854057312}
{"example_id": 5267857117065594150, "prediction": "polyglot", "binary_answer": null, "no_answer_prob": -9.128541588783264}
{"example_id": 4497138226542293017, "prediction": "Chameleon Dmitri Anatoly Nikolayevich", "binary_answer": null, "no_answer_prob": -1.4205284118652344}
{"example_id": 3492183054587305823, "prediction": "February 10 , 2017", "binary_answer": null, "no_answer_prob": -7.411470055580139}
{"example_id": -7934440915520373726, "prediction": "", "binary_answer": null, "no_answer_prob": 3.516770839691162}
{"example_id": 6552660741986708878, "prediction": "within the Paris Basin in the north of France", "binary_answer": null, "no_answer_prob": -5.949185132980347}
{"example_id": -2586134561689458870, "prediction": "Procol Harum", "binary_answer": null, "no_answer_prob": -11.289556384086609}
{"example_id": -4509118558072041562, "prediction": "2008", "binary_answer": null, "no_answer_prob": -6.49227637052536}
{"example_id": -1204746644820116040, "prediction": "the 2004 autobiography Soul Surfer : A True Story of Faith , Family , and Fighting to Get Back on the Board by Bethany Hamilton", "binary_answer": null, "no_answer_prob": -8.461892127990723}
{"example_id": 2093732342481935035, "prediction": "48 BC", "binary_answer": null, "no_answer_prob": -9.292020320892334}
{"example_id": -6905205205636703090, "prediction": "18", "binary_answer": null, "no_answer_prob": -7.777143478393555}
{"example_id": -5435822709474983888, "prediction": "", "binary_answer": null, "no_answer_prob": 3.5837883949279785}
{"example_id": 6084451413318448247, "prediction": "Warren Beatty and Faye Dunaway", "binary_answer": null, "no_answer_prob": -7.931109070777893}
{"example_id": -8322770373387746836, "prediction": "The Serenity Prayer", "binary_answer": null, "no_answer_prob": -3.0765879154205322}
{"example_id": -1842130022579978141, "prediction": "In the seven - layer OSI model of computer networking , the session layer is layer 5 .", "binary_answer": null, "no_answer_prob": -2.315301299095154}
{"example_id": 4208264492711266333, "prediction": "Histology also microanatomy is the study of the anatomy of cells and tissues of plants and animals", "binary_answer": null, "no_answer_prob": -0.4700934886932373}
{"example_id": -2294613783191407303, "prediction": "the Eagles", "binary_answer": null, "no_answer_prob": -9.208224415779114}
{"example_id": 3821328473598857703, "prediction": "Employment contracts , pension benefits , and government entitlements such as Social Security", "binary_answer": null, "no_answer_prob": -0.7645235061645508}
{"example_id": -6147411494230562961, "prediction": "Donald Trump", "binary_answer": null, "no_answer_prob": -5.5600457191467285}
{"example_id": 3136119988434626555, "prediction": "May 1 , 1999", "binary_answer": null, "no_answer_prob": -8.95602834224701}
{"example_id": 32949967343890825, "prediction": "Paul McCartney and Wings", "binary_answer": null, "no_answer_prob": -9.309352159500122}
{"example_id": -8718461884360275859, "prediction": "January 8 , 1990", "binary_answer": null, "no_answer_prob": -7.752422571182251}
{"example_id": 5488771502904120932, "prediction": "Belle", "binary_answer": null, "no_answer_prob": -2.9361982345581055}
{"example_id": -8227218054836287119, "prediction": "Richard Bruton", "binary_answer": null, "no_answer_prob": -10.930979490280151}
{"example_id": 1711564609413586391, "prediction": "the late 19th century", "binary_answer": null, "no_answer_prob": -4.712328374385834}
{"example_id": 6257268952806502264, "prediction": "the small town of Laurel , Mississippi , to the New Orleans French Quarter", "binary_answer": null, "no_answer_prob": -4.540841460227966}
{"example_id": -4969087119135536443, "prediction": "rationalism", "binary_answer": null, "no_answer_prob": -4.685050964355469}
{"example_id": 8516947433438533152, "prediction": "Jack Gleeson", "binary_answer": null, "no_answer_prob": -12.408284187316895}
{"example_id": 7914809114380179546, "prediction": "March 7 , 1992", "binary_answer": null, "no_answer_prob": -9.11262995004654}
{"example_id": -6004071283045291202, "prediction": "severe drought", "binary_answer": null, "no_answer_prob": -8.18225884437561}
{"example_id": -1196846981135510318, "prediction": "Bede", "binary_answer": null, "no_answer_prob": -1.5640432834625244}
{"example_id": 2823945180791357197, "prediction": "inner - city African Americans", "binary_answer": null, "no_answer_prob": -6.085609972476959}
{"example_id": -8289161855062990086, "prediction": "Black Swan", "binary_answer": null, "no_answer_prob": -7.335773468017578}
{"example_id": 5006230593152811617, "prediction": "Bill Medley and Jennifer Warnes", "binary_answer": null, "no_answer_prob": -10.704128742218018}
{"example_id": -8874586848406346263, "prediction": "Warsaw Radio Mast", "binary_answer": null, "no_answer_prob": -7.309776306152344}
{"example_id": 802242686146481924, "prediction": "", "binary_answer": null, "no_answer_prob": 2.757947087287903}
{"example_id": 7940062665498064661, "prediction": "the serial number and the fore - end screw to secure the barrel to the stock", "binary_answer": null, "no_answer_prob": -3.086121082305908}
{"example_id": 7854421421102932055, "prediction": "The Forgotten Beasts of Eld is a fantasy novel", "binary_answer": null, "no_answer_prob": 2.6180339455604553}
{"example_id": 7219337640456442084, "prediction": "Roger Federer", "binary_answer": null, "no_answer_prob": -12.410157561302185}
{"example_id": 6502513170495123183, "prediction": "the legendary Fountain of Youth , near Lake Macaco ( now known as Lake Okeechobee , in Florida", "binary_answer": null, "no_answer_prob": -7.240392088890076}
{"example_id": 8037440722757115040, "prediction": "eighth season . As of April 6 , 2018 , 173 episodes of Blue Bloods have aired .", "binary_answer": null, "no_answer_prob": -6.9601356983184814}
{"example_id": 4488538952512614391, "prediction": "Pink Floyd", "binary_answer": null, "no_answer_prob": -2.024033546447754}
{"example_id": 6016553699819294544, "prediction": "A capsid is the protein shell of a virus . It consists of several oligomeric structural subunits made of protein called protomers", "binary_answer": null, "no_answer_prob": -4.331512093544006}
{"example_id": 1839258823869093465, "prediction": "May 15", "binary_answer": null, "no_answer_prob": -2.6564409732818604}
{"example_id": -8101954711358248454, "prediction": "He is perhaps best known for the remark in a letter to an Anglican bishop", "binary_answer": null, "no_answer_prob": -7.186660170555115}
{"example_id": -5517233925443928461, "prediction": "Robert Cray", "binary_answer": null, "no_answer_prob": -3.577745795249939}
{"example_id": -5965882007131819850, "prediction": "REO Speedwagon", "binary_answer": null, "no_answer_prob": -12.191870093345642}
{"example_id": -6544602658176910655, "prediction": "Detective Antonio Dawson", "binary_answer": null, "no_answer_prob": -0.3901398181915283}
{"example_id": -6786663451529216648, "prediction": "Pink", "binary_answer": null, "no_answer_prob": -8.484146237373352}
{"example_id": -2981467081011300302, "prediction": "CEO Rex Tillerson", "binary_answer": null, "no_answer_prob": -10.02569031715393}
{"example_id": -5155731306818028416, "prediction": "Heathcliff Andrew Ledger", "binary_answer": null, "no_answer_prob": -4.430732369422913}
{"example_id": 7475996979460723368, "prediction": "In variants in which the song describes a confrontation", "binary_answer": null, "no_answer_prob": 1.456092357635498}
{"example_id": -7016168820459507698, "prediction": "The N - terminus ( also known as the amino - terminus , NH - terminus , N - terminal end or amine - terminus", "binary_answer": null, "no_answer_prob": -3.6589670181274414}
{"example_id": 5819412337232935478, "prediction": "the southern United States and northern Mexico", "binary_answer": null, "no_answer_prob": -8.156259536743164}
{"example_id": -8558973673910499616, "prediction": "December 19 , 1801", "binary_answer": null, "no_answer_prob": -11.604907274246216}
{"example_id": 8315526696903279876, "prediction": "George Raymond Richard Martin", "binary_answer": null, "no_answer_prob": -11.757185459136963}
{"example_id": 2620562839483759589, "prediction": "between 330 and 435 km ( 205 and 270 mi )", "binary_answer": null, "no_answer_prob": -9.98000681400299}
{"example_id": -361906807922274081, "prediction": "", "binary_answer": null, "no_answer_prob": 6.450025513768196}
{"example_id": 2258832369743184112, "prediction": "R\u00e9jane `` Reggie '' Magloire and Rose Marie Ramsey", "binary_answer": null, "no_answer_prob": -11.893027544021606}
{"example_id": -5312549763179880778, "prediction": "3,573 ft ( 1,089 m )", "binary_answer": null, "no_answer_prob": -10.730311393737793}
{"example_id": -5611233705477236828, "prediction": "gravitational influence is such that the barycenter of the Pluto -- Charon system lies outside Pluto .", "binary_answer": null, "no_answer_prob": -3.197618007659912}
{"example_id": -6276446060731528591, "prediction": "Quincy McCall ( Epps ) and Monica Wright", "binary_answer": null, "no_answer_prob": 0.40161192417144775}
{"example_id": 7640115750041239037, "prediction": "Six other amino acids are considered conditionally essential in the human diet", "binary_answer": null, "no_answer_prob": 1.149665355682373}
{"example_id": 3787230311281172305, "prediction": "Kelly Monaco", "binary_answer": null, "no_answer_prob": -3.832118511199951}
{"example_id": -3765606054252682643, "prediction": "Matt Caldwell Nikki Fried Party Republican Democratic Incumbent Agriculture Commissioner Adam Putnam", "binary_answer": null, "no_answer_prob": -5.0727128982543945}
{"example_id": -6596494750045214057, "prediction": "Thomas Nashe", "binary_answer": null, "no_answer_prob": -6.900006055831909}
{"example_id": 1229184814518823221, "prediction": "Sri Lanka", "binary_answer": null, "no_answer_prob": -3.02645206451416}
{"example_id": -1109399901384006898, "prediction": "between the region of Lake Victoria and the Mediterranean Sea", "binary_answer": null, "no_answer_prob": -2.819030523300171}
{"example_id": 2864828716904811288, "prediction": "Theodora", "binary_answer": null, "no_answer_prob": -5.55584716796875}
{"example_id": 5518691545119288149, "prediction": "2018", "binary_answer": null, "no_answer_prob": -8.16554057598114}
{"example_id": -5583864421000089995, "prediction": "Michaela Dietz", "binary_answer": null, "no_answer_prob": -4.334614396095276}
{"example_id": 1384106640689642583, "prediction": "Paul Anka", "binary_answer": null, "no_answer_prob": -5.868202328681946}
{"example_id": 2319980475239464532, "prediction": "fourteen months", "binary_answer": null, "no_answer_prob": -5.737689018249512}
{"example_id": -8247960020494946676, "prediction": "Yuriy Norshteyn", "binary_answer": null, "no_answer_prob": -4.038188576698303}
{"example_id": -3780188924727299872, "prediction": "A personal watercraft ( PWC ) , also called water scooter , jetski , and comically a boatercycle", "binary_answer": null, "no_answer_prob": -3.4185400009155273}
{"example_id": 4330661765598157231, "prediction": "", "binary_answer": null, "no_answer_prob": 3.5348609685897827}
{"example_id": 4860550252268772892, "prediction": "from 7 to 9 June", "binary_answer": null, "no_answer_prob": -9.638000965118408}
{"example_id": -17314878580916020, "prediction": "Alice Cooper", "binary_answer": null, "no_answer_prob": -7.625054597854614}
{"example_id": -3882377811931187829, "prediction": "Chadwick Gaylord Smith", "binary_answer": null, "no_answer_prob": -13.601244032382965}
{"example_id": 2663900618010457747, "prediction": "Philadelphia , Pennsylvania .", "binary_answer": null, "no_answer_prob": 1.5479817390441895}
{"example_id": -1816845563011165745, "prediction": "Georgie Porgie , pudding and pie", "binary_answer": null, "no_answer_prob": -8.396120309829712}
{"example_id": 662548517647235170, "prediction": "", "binary_answer": null, "no_answer_prob": 0.9225081205368042}
{"example_id": 9054423847712059440, "prediction": "Dave Chappelle as George `` Noodles '' Stone", "binary_answer": null, "no_answer_prob": -4.225421190261841}
{"example_id": -8531942314563806288, "prediction": "558 minutes", "binary_answer": null, "no_answer_prob": -8.374345898628235}
{"example_id": -5459996306053015878, "prediction": "king Matthias Corvinus", "binary_answer": null, "no_answer_prob": -0.3141646385192871}
{"example_id": 1578659854961558558, "prediction": "quirky . ''", "binary_answer": null, "no_answer_prob": -0.15073025226593018}
{"example_id": 833117822520584676, "prediction": "", "binary_answer": null, "no_answer_prob": 1.8688701391220093}
{"example_id": 952823698977964583, "prediction": "Seabrook Island , South Carolina", "binary_answer": null, "no_answer_prob": -8.706898927688599}
{"example_id": 1278480802277665800, "prediction": "330 AD", "binary_answer": null, "no_answer_prob": -7.652534008026123}
{"example_id": -9122833663550948917, "prediction": "Aunt Em and Uncle Henry", "binary_answer": null, "no_answer_prob": 1.7592740058898926}
{"example_id": 8924269644784033060, "prediction": "", "binary_answer": null, "no_answer_prob": 2.9845921397209167}
{"example_id": 4702839898066253410, "prediction": "Chelsea are the defending champions , while Newcastle United , Brighton & Hove Albion and Huddersfield Town", "binary_answer": null, "no_answer_prob": -4.075741767883301}
{"example_id": -5967639920834959562, "prediction": "representatives", "binary_answer": null, "no_answer_prob": -6.613240480422974}
{"example_id": -872148709863456785, "prediction": "John the Baptist", "binary_answer": null, "no_answer_prob": -2.7220206260681152}
{"example_id": -3318122202395388302, "prediction": "Vikings from the north , Huns from the east , and Saracens", "binary_answer": null, "no_answer_prob": -3.7120277881622314}
{"example_id": 7127577961923843538, "prediction": "Tony Christie", "binary_answer": null, "no_answer_prob": -7.865835785865784}
{"example_id": 4235669204735674694, "prediction": "in honor of NFL coach Vince Lombardi", "binary_answer": null, "no_answer_prob": -7.106869220733643}
{"example_id": 5790059625390704028, "prediction": "The use of poison gas performed by all major belligerents throughout World War I constituted war crimes", "binary_answer": null, "no_answer_prob": 0.7614194601774216}
{"example_id": 3308348944106726345, "prediction": "2018", "binary_answer": null, "no_answer_prob": -1.0373274087905884}
{"example_id": -1167263495350719445, "prediction": "", "binary_answer": null, "no_answer_prob": 1.36552494764328}
{"example_id": -8591610983091296817, "prediction": "No . Title Artist Length 1 . `` Rock Star '' ( Art Alexakis", "binary_answer": null, "no_answer_prob": -2.303347587585449}
{"example_id": 1762345328813038847, "prediction": "2008", "binary_answer": null, "no_answer_prob": -6.755959868431091}
{"example_id": 1713054141029260325, "prediction": "Embeth Jean Davidtz", "binary_answer": null, "no_answer_prob": -9.100480675697327}
{"example_id": 2531516684687563287, "prediction": "Toy Caldwell", "binary_answer": null, "no_answer_prob": -11.855585217475891}
{"example_id": -3396975988439089652, "prediction": "1428", "binary_answer": null, "no_answer_prob": -7.062708497047424}
{"example_id": -6434753325781010339, "prediction": "January 1 , 2101 and ending on December 31 , 2200", "binary_answer": null, "no_answer_prob": -8.370445013046265}
{"example_id": -2535300810911583763, "prediction": "Jeanne Sagan", "binary_answer": null, "no_answer_prob": -8.573456048965454}
{"example_id": 912739715533119853, "prediction": "Hampton Roads MSA", "binary_answer": null, "no_answer_prob": -5.6131603717803955}
{"example_id": 5463039643256103598, "prediction": "2011", "binary_answer": null, "no_answer_prob": -8.720986723899841}
{"example_id": 6583630986800851715, "prediction": "Blast fishing or dynamite fishing is the practice of using explosives", "binary_answer": null, "no_answer_prob": 0.4715719223022461}
{"example_id": -8319975474139178450, "prediction": "Madras", "binary_answer": null, "no_answer_prob": -9.420405983924866}
{"example_id": 8674477261947815985, "prediction": "1810", "binary_answer": null, "no_answer_prob": -4.150386810302734}
{"example_id": 2950006076180984136, "prediction": "Harry Hadden - Paton", "binary_answer": null, "no_answer_prob": -5.33656644821167}
{"example_id": 7332569254170441425, "prediction": "the Kuomintang ( KMT ) - led government of the Republic of China and the Communist Party of China", "binary_answer": null, "no_answer_prob": -7.677340269088745}
{"example_id": -8938866779244492178, "prediction": "Horse Nonsense as `` How I Brought the Good News from Aix to Ghent ( or Vice Versa ) '' .", "binary_answer": null, "no_answer_prob": -3.8673675060272217}
{"example_id": -8640818569601107110, "prediction": "Port state control", "binary_answer": null, "no_answer_prob": -4.31926691532135}
{"example_id": -5687260239070204687, "prediction": "Mary Beth Evans", "binary_answer": null, "no_answer_prob": -13.371541380882263}
{"example_id": -7059085327774987280, "prediction": "Beijing", "binary_answer": null, "no_answer_prob": -5.3224639892578125}
{"example_id": 6887634788433337209, "prediction": "Virginia Bruce", "binary_answer": null, "no_answer_prob": -8.539737105369568}
{"example_id": 1778587505575968014, "prediction": "linear chromosomes in eukaryotes , and circular chromosomes in prokaryotes", "binary_answer": null, "no_answer_prob": -1.8205490112304688}
{"example_id": -7391449118245016117, "prediction": "15 miles ( 24 km )", "binary_answer": null, "no_answer_prob": -3.7888402938842773}
{"example_id": -2721022090667625722, "prediction": "Dylan Sprouse", "binary_answer": null, "no_answer_prob": -8.764170169830322}
{"example_id": -8583211498455578525, "prediction": "Due to the moderating influence of the Atlantic or Pacific", "binary_answer": null, "no_answer_prob": -2.8737711906433105}
{"example_id": -1005672146492412109, "prediction": "Andrew Johnson", "binary_answer": null, "no_answer_prob": -8.071446895599365}
{"example_id": -3681267594457866032, "prediction": "Every Picture Tells a Story , released in 1971 .", "binary_answer": null, "no_answer_prob": -1.5528322458267212}
{"example_id": 2836845871426149567, "prediction": "[ContextId=-1] [NoLongAnswer] [ContextId=0] [Paragraph=1]", "binary_answer": null, "no_answer_prob": 2.6273915767669678}
{"example_id": 3235145058323535822, "prediction": "Loudon Wainwright III", "binary_answer": null, "no_answer_prob": -10.415135443210602}
{"example_id": 2523114317094070929, "prediction": "Niagara", "binary_answer": null, "no_answer_prob": -6.28502631187439}
{"example_id": -2205439830862864188, "prediction": "Timberlake", "binary_answer": null, "no_answer_prob": -6.584423542022705}
{"example_id": -8109431955313655366, "prediction": "", "binary_answer": null, "no_answer_prob": 4.832530319690704}
{"example_id": -4071625595659185770, "prediction": "In gridiron football , the safety", "binary_answer": null, "no_answer_prob": -0.049560606479644775}
{"example_id": -3347415205620550250, "prediction": "Seven", "binary_answer": null, "no_answer_prob": -2.8363802433013916}
{"example_id": 7240062746594540469, "prediction": "Ken Curtis", "binary_answer": null, "no_answer_prob": -12.878517270088196}
{"example_id": -2935343613109800417, "prediction": "Christine McVie", "binary_answer": null, "no_answer_prob": -5.676991581916809}
{"example_id": -6017219146541630098, "prediction": "526", "binary_answer": null, "no_answer_prob": -4.391540050506592}
{"example_id": 7049879526021736257, "prediction": "Burl Ives", "binary_answer": null, "no_answer_prob": -8.283509492874146}
{"example_id": 5389106437312212883, "prediction": "R. Kelly", "binary_answer": null, "no_answer_prob": -10.441942572593689}
{"example_id": -8530219745420764651, "prediction": "", "binary_answer": null, "no_answer_prob": 3.706611752510071}
{"example_id": -7676627916208638811, "prediction": "Rupert Everett", "binary_answer": null, "no_answer_prob": -9.236120223999023}
{"example_id": -2829988372546807487, "prediction": "near the namesake Hawaiian Islands , in the northern Pacific Ocean", "binary_answer": null, "no_answer_prob": -7.453548192977905}
{"example_id": 2091438961808499174, "prediction": "Denver Broncos and National Football Conference ( NFC ) champion Seattle Seahawks", "binary_answer": null, "no_answer_prob": -9.326718807220459}
{"example_id": -4197310212641231952, "prediction": "", "binary_answer": null, "no_answer_prob": 3.214908182621002}
{"example_id": -255311412987158941, "prediction": "La Jeune Am\u00e9ricaine et les contes marins ( The Young American and Marine Tales )", "binary_answer": null, "no_answer_prob": -0.3816014528274536}
{"example_id": -7690718669519742777, "prediction": "Seals and Crofts", "binary_answer": null, "no_answer_prob": -6.097313761711121}
{"example_id": -6514095796654523284, "prediction": "John Stamos", "binary_answer": null, "no_answer_prob": -5.991351366043091}
{"example_id": 7818112641371223126, "prediction": "State Farm Stadium", "binary_answer": null, "no_answer_prob": -10.786849975585938}
{"example_id": -7759303418807088024, "prediction": "His first paper in 1915 demonstrated genetic linkage in mammals while subsequent works helped to create population genetics", "binary_answer": null, "no_answer_prob": 4.05146849155426}
{"example_id": -4785701522801361688, "prediction": "Europe , Asia , northwest Africa , and southeastern Australia", "binary_answer": null, "no_answer_prob": -4.686972141265869}
{"example_id": -3399227448895739738, "prediction": "1908", "binary_answer": null, "no_answer_prob": -8.955570459365845}
{"example_id": -2991000124654590247, "prediction": "Jonathan Pangborn", "binary_answer": null, "no_answer_prob": -2.116161346435547}
{"example_id": 8698224606927607838, "prediction": "a means of sending and receiving information", "binary_answer": null, "no_answer_prob": 2.271271228790283}
{"example_id": -8164561486411707609, "prediction": "`` Friend Like Me '' is a song from the 1992 Disney film Aladdin", "binary_answer": null, "no_answer_prob": 2.509090542793274}
{"example_id": 1605762510932027661, "prediction": "1994", "binary_answer": null, "no_answer_prob": -10.425333678722382}
{"example_id": -3267642790590927032, "prediction": "", "binary_answer": null, "no_answer_prob": 1.5024604201316833}
{"example_id": -1738867496255288805, "prediction": "Kathajodi River and the Mahanadi River", "binary_answer": null, "no_answer_prob": -8.426782608032227}
{"example_id": -8083772678784794712, "prediction": "9 7", "binary_answer": null, "no_answer_prob": -5.367637395858765}
{"example_id": -8646655647332839419, "prediction": "samizdat", "binary_answer": null, "no_answer_prob": 1.6986680030822754}
{"example_id": -2430892891443078620, "prediction": "the Psychopannychia", "binary_answer": null, "no_answer_prob": -5.261099457740784}
{"example_id": 5516731406225529709, "prediction": "provide banking services to cater to the needs of agricultural and industrial enterprises", "binary_answer": null, "no_answer_prob": -6.040560007095337}
{"example_id": 3453182683045423186, "prediction": "Day trippers are people who go on a day trip , right ? Usually on a ferryboat or something", "binary_answer": null, "no_answer_prob": -1.8898699283599854}
{"example_id": 5485836099821146604, "prediction": "84 / 56", "binary_answer": null, "no_answer_prob": -4.558686017990112}
{"example_id": 4472987726489577416, "prediction": "1955 1959 1963 1965 1981 1988", "binary_answer": null, "no_answer_prob": -8.079099535942078}
{"example_id": -2143259971354570447, "prediction": "Romanticism characterized by enthusiasm , emotion , and an appeal to the super-natural", "binary_answer": null, "no_answer_prob": -1.921602487564087}
{"example_id": -8432129556778587735, "prediction": "the sacraments of initiation", "binary_answer": null, "no_answer_prob": -4.596418380737305}
{"example_id": -7504397554360672876, "prediction": "Casinos now operate in Goa , Daman and Sikkim .", "binary_answer": null, "no_answer_prob": -1.414887547492981}
{"example_id": -1679520392051387388, "prediction": "Glen Campbell", "binary_answer": null, "no_answer_prob": -12.905067086219788}
{"example_id": -1594127086659780245, "prediction": "Qualcomm Stadium", "binary_answer": null, "no_answer_prob": -6.747988224029541}
{"example_id": -1448524641308572783, "prediction": "The Fellowship of the Ring ( 2001 ) , The Two Towers ( 2002 ) and The Return of the King", "binary_answer": null, "no_answer_prob": -5.066574692726135}
{"example_id": 1538521268068601128, "prediction": "26 March 2015", "binary_answer": null, "no_answer_prob": -11.703308463096619}
{"example_id": 8033165278091535801, "prediction": "1987", "binary_answer": null, "no_answer_prob": -9.976036787033081}
{"example_id": -6595353527328528997, "prediction": "Curtis", "binary_answer": null, "no_answer_prob": -5.453083515167236}
{"example_id": -4384412567296328220, "prediction": "The main responsibilities of CPS are to provide legal advice to the police and other investigative agencies during the course of criminal investigations", "binary_answer": null, "no_answer_prob": -1.5158569812774658}
{"example_id": -2791031496653886371, "prediction": "Mowgli ( \u092e\u094b\u0917\u0932\u0940 Maogal\u012b ; feral child ) - Also referred to as `` Man Cub '' .", "binary_answer": null, "no_answer_prob": -0.8148130178451538}
{"example_id": 1318181099423578694, "prediction": "", "binary_answer": null, "no_answer_prob": 6.877087771892548}
{"example_id": 6291700309486366112, "prediction": "Frank Sinatra", "binary_answer": null, "no_answer_prob": -8.079230070114136}
{"example_id": -2216288005540330811, "prediction": "The Foundation for Environmental Education", "binary_answer": null, "no_answer_prob": -3.4223718643188477}
{"example_id": -9159775735515348520, "prediction": "approximately one in five", "binary_answer": null, "no_answer_prob": -6.567391633987427}
{"example_id": 7423675480415732989, "prediction": "1977", "binary_answer": null, "no_answer_prob": -10.824138224124908}
{"example_id": -7582745877240343576, "prediction": "the late 1970s", "binary_answer": null, "no_answer_prob": 0.9014039039611816}
{"example_id": -6308952806760794704, "prediction": "May 12 , 2010", "binary_answer": null, "no_answer_prob": -9.026169300079346}
{"example_id": -8819224356617100445, "prediction": "1975", "binary_answer": null, "no_answer_prob": -5.362198829650879}
{"example_id": -4324999623175716840, "prediction": "Alex Vause Orange Is the New Black character Laura Prepon as Alex Vause First appearance `` I Was n't Ready", "binary_answer": null, "no_answer_prob": 1.2328563034534454}
{"example_id": -7946914703140878316, "prediction": "`` Hit Me with Your Best Shot '' is a song by American rock singer Pat Benatar", "binary_answer": null, "no_answer_prob": 3.551857963204384}
{"example_id": -8846954740518018553, "prediction": "The Government of the Commonwealth of Australia", "binary_answer": null, "no_answer_prob": 4.050994820892811}
{"example_id": 2403539739622541523, "prediction": "September 28 , 2017", "binary_answer": null, "no_answer_prob": -11.889234066009521}
{"example_id": -6593786172448058094, "prediction": "", "binary_answer": null, "no_answer_prob": 4.437323987483978}
{"example_id": -2928349628862605636, "prediction": "Gross domestic product", "binary_answer": null, "no_answer_prob": 3.734221011400223}
{"example_id": 3697238135623150616, "prediction": "2004", "binary_answer": null, "no_answer_prob": -8.834844827651978}
{"example_id": -4505429291678080448, "prediction": "contributes to immune surveillance and helps to prevent the entrance of microbes into the eye .", "binary_answer": null, "no_answer_prob": -3.4989938735961914}
{"example_id": 3411765644140833034, "prediction": "", "binary_answer": null, "no_answer_prob": 3.842372477054596}
{"example_id": -6367057330647505746, "prediction": "16 years of age", "binary_answer": null, "no_answer_prob": -2.8818280696868896}
{"example_id": 2642082376424878345, "prediction": "September 1985 and finished in November 1985", "binary_answer": null, "no_answer_prob": -7.1830116510391235}
{"example_id": -849489452133706819, "prediction": "9", "binary_answer": null, "no_answer_prob": -4.085175573825836}
{"example_id": -8817675850857402488, "prediction": "not less than 25 years of age to be a member of the Legislative Assembly and not less than 30 years", "binary_answer": null, "no_answer_prob": -3.999040126800537}
{"example_id": 1578000675191134268, "prediction": "Mike Wheeler in the Netflix series Stranger Things , and Richie Tozier", "binary_answer": null, "no_answer_prob": -10.24436068534851}
{"example_id": -2206083377077463872, "prediction": "roughly five hundred experts across the world . Those selected to vote include academics , journalists , producers , and others with music industry experience", "binary_answer": null, "no_answer_prob": -0.712547779083252}
{"example_id": 3061118309240822442, "prediction": "France", "binary_answer": null, "no_answer_prob": -7.057622075080872}
{"example_id": -3371890911605392308, "prediction": "Total U.S. Deaths by War Civil War ( both U.S. and Confederate ) 750,000", "binary_answer": null, "no_answer_prob": 1.308966338634491}
{"example_id": 4708993485640470955, "prediction": "108", "binary_answer": null, "no_answer_prob": -6.246570467948914}
{"example_id": 2415035688479885844, "prediction": "", "binary_answer": null, "no_answer_prob": 3.001192092895508}
{"example_id": 9150955474788261425, "prediction": "Chuck Lorre Dennis C. Brown", "binary_answer": null, "no_answer_prob": -6.846186518669128}
{"example_id": -4480182871594501325, "prediction": "Todd Pipes", "binary_answer": null, "no_answer_prob": -4.833025932312012}
{"example_id": 6665054037218154779, "prediction": "Multiple inheritance", "binary_answer": null, "no_answer_prob": -0.12324726581573486}
{"example_id": -3207917703224559863, "prediction": "Jeffrey Dean Morgan", "binary_answer": null, "no_answer_prob": -13.922797560691833}
{"example_id": -7619831388087664305, "prediction": "Brandon Flynn", "binary_answer": null, "no_answer_prob": -13.184010624885559}
{"example_id": -5001999187490255083, "prediction": "Bill Cullen", "binary_answer": null, "no_answer_prob": -3.6703041791915894}
{"example_id": -2597595626409302649, "prediction": "", "binary_answer": null, "no_answer_prob": 1.841558575630188}
{"example_id": -4274425064083032411, "prediction": "174.0 cm ( 5 ft 8 \u2044 in )", "binary_answer": null, "no_answer_prob": 0.4602314233779907}
{"example_id": 4725591197712833258, "prediction": "Kareem Abdul - Jabbar", "binary_answer": null, "no_answer_prob": -9.456704378128052}
{"example_id": 2214506083332809384, "prediction": "Rhames", "binary_answer": null, "no_answer_prob": -5.241325855255127}
{"example_id": -2088638260702961294, "prediction": "George III", "binary_answer": null, "no_answer_prob": -10.770947337150574}
{"example_id": 1521108568561999394, "prediction": "Don Williams", "binary_answer": null, "no_answer_prob": -11.789602994918823}
{"example_id": -6039277764340706043, "prediction": "Christmas in July Promotional material for 2007 Christmas in Winter festival in Tulbagh , South Africa", "binary_answer": null, "no_answer_prob": 3.5276925563812256}
{"example_id": 2788150498848569248, "prediction": "Harry C. Meek", "binary_answer": null, "no_answer_prob": -8.591535568237305}
{"example_id": 2775840251067771416, "prediction": "Jess Walton", "binary_answer": null, "no_answer_prob": -12.315308928489685}
{"example_id": -5655551859270743763, "prediction": "loose connective tissue and dense connective tissue ( which is further subdivided into dense regular and dense irregular connective tissues", "binary_answer": null, "no_answer_prob": -6.406661748886108}
{"example_id": 5062857859256710776, "prediction": "0.35 miles ( 0.56 km )", "binary_answer": null, "no_answer_prob": -6.1931763887405396}
{"example_id": 8316699342584548454, "prediction": "25,000 yen ( $210 )", "binary_answer": null, "no_answer_prob": -6.642134428024292}
{"example_id": 4751475814131578832, "prediction": "2 fluid ounces or more .", "binary_answer": null, "no_answer_prob": -2.347424268722534}
{"example_id": 1096497891432710177, "prediction": "", "binary_answer": null, "no_answer_prob": 0.07885462045669556}
{"example_id": -3678606089626481357, "prediction": "\u00a3 231,843", "binary_answer": null, "no_answer_prob": -7.175175666809082}
{"example_id": 3051930912491995402, "prediction": "between 1975 and 1985", "binary_answer": null, "no_answer_prob": -8.25346064567566}
{"example_id": 3947524562237613038, "prediction": "Star Wars", "binary_answer": null, "no_answer_prob": -8.32945728302002}
{"example_id": 8317331295035673089, "prediction": "120,762", "binary_answer": null, "no_answer_prob": -11.715097784996033}
{"example_id": -1355322236662361701, "prediction": "Jeanne M. Ives", "binary_answer": null, "no_answer_prob": -11.267970323562622}
{"example_id": 8629904539305064115, "prediction": "Mufasa", "binary_answer": null, "no_answer_prob": -2.7314157485961914}
{"example_id": -2307288734161784915, "prediction": "Aloe vera ( / \u02c8\u00e6lo\u028ai\u02d0 / or / \u02c8\u00e6lo\u028a / ) is a plant", "binary_answer": null, "no_answer_prob": -4.266112804412842}
{"example_id": 1314972742570314327, "prediction": "Australia", "binary_answer": null, "no_answer_prob": -6.814965784549713}
{"example_id": -2483846273134722603, "prediction": "", "binary_answer": null, "no_answer_prob": 3.9663662910461426}
{"example_id": 7809715927293504990, "prediction": "Jeff Silbar and Larry Henley", "binary_answer": null, "no_answer_prob": -12.34669029712677}
{"example_id": 2842924527220481974, "prediction": "the Supreme Court", "binary_answer": null, "no_answer_prob": -4.893783330917358}
{"example_id": -1371894834665056317, "prediction": "August 8 , 2014", "binary_answer": null, "no_answer_prob": -4.040229797363281}
{"example_id": 968452373141674317, "prediction": "Things Goin ' On", "binary_answer": null, "no_answer_prob": 2.2522624135017395}
{"example_id": 9041546606351501546, "prediction": "Dashing through the snow In a one - horse open sleigh O'er the fields we go", "binary_answer": null, "no_answer_prob": -0.1724299192428589}
{"example_id": 3670471646375572127, "prediction": "not a part of any state", "binary_answer": null, "no_answer_prob": -2.48491370677948}
{"example_id": 4931818985411843949, "prediction": "Malcolm Brogdon", "binary_answer": null, "no_answer_prob": -7.6849623918533325}
{"example_id": 439103968035683784, "prediction": "22.6", "binary_answer": null, "no_answer_prob": -5.619459748268127}
{"example_id": -2061400266072030283, "prediction": "1 July and 18 November 1916", "binary_answer": null, "no_answer_prob": -10.647565126419067}
{"example_id": 4238469957250419529, "prediction": "The name `` Uruk - hai '' has the element Uruk , which is a Black Speech word related to Orc", "binary_answer": null, "no_answer_prob": -3.9398739337921143}
{"example_id": 1593604885185133296, "prediction": "50 points", "binary_answer": null, "no_answer_prob": -8.64196515083313}
{"example_id": 1810540537323127079, "prediction": "December 1941", "binary_answer": null, "no_answer_prob": -7.124344110488892}
{"example_id": -975348388226150640, "prediction": "103,190", "binary_answer": null, "no_answer_prob": -11.702885508537292}
{"example_id": 2484379349852541868, "prediction": "Bee Gees", "binary_answer": null, "no_answer_prob": -9.851103901863098}
{"example_id": 2592868951170471293, "prediction": "Pan", "binary_answer": null, "no_answer_prob": -4.7373881340026855}
{"example_id": -255536896297005191, "prediction": "the Ming dynasty", "binary_answer": null, "no_answer_prob": -1.5966918468475342}
{"example_id": -4502651912548446154, "prediction": "Sam Houston Tollway", "binary_answer": null, "no_answer_prob": -2.927659034729004}
{"example_id": 2857129857091778354, "prediction": "Sara Canning", "binary_answer": null, "no_answer_prob": -8.689361572265625}
{"example_id": 7611521031230728931, "prediction": "", "binary_answer": null, "no_answer_prob": 2.0748332142829895}
{"example_id": -5982266363084360835, "prediction": "Japan -- Pakistan relations refer to foreign relations between Japan and Pakistan .", "binary_answer": null, "no_answer_prob": 0.9064103364944458}
{"example_id": -610603108380732474, "prediction": "Women in Vatican City are those who live in or are from Vatican City", "binary_answer": null, "no_answer_prob": 1.3132846355438232}
{"example_id": -5476378947514613767, "prediction": "Ernie Birchill", "binary_answer": null, "no_answer_prob": -2.0657180547714233}
{"example_id": 2237156442325774327, "prediction": "NBC television network", "binary_answer": null, "no_answer_prob": -4.542241454124451}
{"example_id": 1333561422349553590, "prediction": "54", "binary_answer": null, "no_answer_prob": -9.086074113845825}
{"example_id": 4300920285715516714, "prediction": "James A. Garfield", "binary_answer": null, "no_answer_prob": -9.118098139762878}
{"example_id": 8863387829953462015, "prediction": "Albert Lawrence Brooks", "binary_answer": null, "no_answer_prob": -12.413699984550476}
{"example_id": -7311058425357650150, "prediction": "January 26 , 2018", "binary_answer": null, "no_answer_prob": -3.2482056617736816}
{"example_id": 5377124952027756432, "prediction": "Mark Charles Teixeira", "binary_answer": null, "no_answer_prob": -11.557354807853699}
{"example_id": -5539337897378937946, "prediction": "The Wright brothers , Orville", "binary_answer": null, "no_answer_prob": -8.061206340789795}
{"example_id": -204511291662634757, "prediction": "The queen ( \u2655 , \u265b ) is the most powerful piece", "binary_answer": null, "no_answer_prob": 1.190840482711792}
{"example_id": 4361212919138879970, "prediction": "Edwyn Stephen Collins", "binary_answer": null, "no_answer_prob": -4.585337162017822}
{"example_id": 5414217009975450483, "prediction": "104", "binary_answer": null, "no_answer_prob": -10.576447069644928}
{"example_id": -2157579152859231028, "prediction": "American professional baseball outfielder", "binary_answer": null, "no_answer_prob": -2.8315184116363525}
{"example_id": 816418660000550176, "prediction": "Lectures on Government and Binding : The Pisa Lectures ( LGB ) is a book by American linguist Noam Chomsky", "binary_answer": null, "no_answer_prob": 2.091919422149658}
{"example_id": 2051141609383316115, "prediction": "Leigh Francis", "binary_answer": null, "no_answer_prob": -10.743029832839966}
{"example_id": -113932269959780342, "prediction": "the cycle of life", "binary_answer": null, "no_answer_prob": -5.519747734069824}
{"example_id": 3914831453130343952, "prediction": "", "binary_answer": null, "no_answer_prob": 3.133393406867981}
{"example_id": -6080455760651479514, "prediction": "present - day Vincennes , Indiana", "binary_answer": null, "no_answer_prob": -11.379263758659363}
{"example_id": -5332415273865797138, "prediction": "2016", "binary_answer": null, "no_answer_prob": -11.332675457000732}
{"example_id": 1887982148086162345, "prediction": "EuroMillions is a transnational lottery requiring 7 correct numbers to win the jackpot", "binary_answer": null, "no_answer_prob": -2.4974347352981567}
{"example_id": 6442806862613309751, "prediction": "25 June 1950", "binary_answer": null, "no_answer_prob": -9.204012155532837}
{"example_id": 8913105154530829141, "prediction": "3 miles ( 4.8 km )", "binary_answer": null, "no_answer_prob": -1.0106706619262695}
{"example_id": 5214492005049085177, "prediction": "Kavan Smith", "binary_answer": null, "no_answer_prob": -7.544864177703857}
{"example_id": 2114543821065231555, "prediction": "Wild horses were known since prehistory from central Asia to Europe", "binary_answer": null, "no_answer_prob": -5.950834274291992}
{"example_id": 4690511621222533149, "prediction": "the Rolling Stones", "binary_answer": null, "no_answer_prob": -8.903651118278503}
{"example_id": -783096745406898875, "prediction": "", "binary_answer": null, "no_answer_prob": -3.6364471912384033}
{"example_id": 5690381869447232848, "prediction": "LaVern Baker", "binary_answer": null, "no_answer_prob": -12.797051191329956}
{"example_id": -7398902543044764111, "prediction": "1992", "binary_answer": null, "no_answer_prob": -11.444614171981812}
{"example_id": -2669642407853091086, "prediction": "Styx", "binary_answer": null, "no_answer_prob": -11.080378293991089}
{"example_id": 4776225375457532582, "prediction": "1937", "binary_answer": null, "no_answer_prob": -4.144308805465698}
{"example_id": 4117310771229415953, "prediction": "79.6 billion percent", "binary_answer": null, "no_answer_prob": -9.047676801681519}
{"example_id": 2602863819101792398, "prediction": "Steve Goodman", "binary_answer": null, "no_answer_prob": -4.414814710617065}
{"example_id": -5240355935747123550, "prediction": "Latin and Greek were the official languages of the Roman Empire", "binary_answer": null, "no_answer_prob": 1.8207266926765442}
{"example_id": 5050934905017493363, "prediction": "The Security Log", "binary_answer": null, "no_answer_prob": -3.951174020767212}
{"example_id": -4905586577653408328, "prediction": "January 22 and 31 , 1818", "binary_answer": null, "no_answer_prob": -9.698065996170044}
{"example_id": 7707548579271785778, "prediction": "along the southern shore of the island", "binary_answer": null, "no_answer_prob": -4.703914999961853}
{"example_id": 3515151414386408906, "prediction": "June 16 , 2017", "binary_answer": null, "no_answer_prob": -7.663393139839172}
{"example_id": 8434241301442786522, "prediction": "Natalie Portman", "binary_answer": null, "no_answer_prob": -11.738919615745544}
{"example_id": 1974334197476355529, "prediction": "School Mom", "binary_answer": null, "no_answer_prob": -4.300111293792725}
{"example_id": -5900680354277649426, "prediction": "", "binary_answer": null, "no_answer_prob": -0.5485774278640747}
{"example_id": 767737324563716850, "prediction": "Hank Aaron", "binary_answer": null, "no_answer_prob": -9.85917854309082}
{"example_id": 1480969524269694012, "prediction": "Emma immediately does some research and uncovers the truth that the woman , Lilith Page , is not only Maleficent 's daughter", "binary_answer": null, "no_answer_prob": -5.265490531921387}
{"example_id": -2890293044247175459, "prediction": "Winwood", "binary_answer": null, "no_answer_prob": -10.071904182434082}
{"example_id": 902421138395947064, "prediction": "John Roberts", "binary_answer": null, "no_answer_prob": -10.102174520492554}
{"example_id": -7704253508178510013, "prediction": "Stand - your - ground laws eliminate the retreat requirement at any location the defendant has a legal right to be .", "binary_answer": null, "no_answer_prob": 0.2360224723815918}
{"example_id": 6714980332279571793, "prediction": "38,502", "binary_answer": null, "no_answer_prob": -7.7003138065338135}
{"example_id": -107512131368495195, "prediction": "121", "binary_answer": null, "no_answer_prob": -9.606509566307068}
{"example_id": -4205700322029949198, "prediction": "124", "binary_answer": null, "no_answer_prob": -6.745464324951172}
{"example_id": -5569779348654685882, "prediction": "Linda Lavin", "binary_answer": null, "no_answer_prob": -8.563175201416016}
{"example_id": -2664875860902576660, "prediction": "social control , deterring and mitigating crime , or sanctioning those who violate laws with criminal penalties and rehabilitation efforts", "binary_answer": null, "no_answer_prob": -5.28389835357666}
{"example_id": 4612831453436396417, "prediction": "right - hand side , facing forward", "binary_answer": null, "no_answer_prob": -8.921236157417297}
{"example_id": 3950383745144718481, "prediction": "", "binary_answer": null, "no_answer_prob": 1.9845256358385086}
{"example_id": 4402349211619821774, "prediction": "67", "binary_answer": null, "no_answer_prob": -8.5060555934906}
{"example_id": -8332536135894660017, "prediction": "sucking is sufficient to burst small superficial blood vessels under the skin", "binary_answer": null, "no_answer_prob": -4.839614033699036}
{"example_id": 4467124663752101822, "prediction": "Paul McCartney", "binary_answer": null, "no_answer_prob": -10.345021963119507}
{"example_id": -8102922300054732873, "prediction": "", "binary_answer": null, "no_answer_prob": 3.937212586402893}
{"example_id": 565682737207179323, "prediction": "1,200", "binary_answer": null, "no_answer_prob": -3.9558730125427246}
{"example_id": -2393437492692142124, "prediction": "Samantha , Kirsten and Molly", "binary_answer": null, "no_answer_prob": -4.688422203063965}
{"example_id": 1111434664120048441, "prediction": "Millennium", "binary_answer": null, "no_answer_prob": -3.424503803253174}
{"example_id": 4791591886298092096, "prediction": "Marvel 's The Avengers May 4 , 2012 ( 2012 - 05 - 04 ) Joss Whedon", "binary_answer": null, "no_answer_prob": -1.1564245223999023}
{"example_id": 748591396750513100, "prediction": "a home video", "binary_answer": null, "no_answer_prob": 1.1535504460334778}
{"example_id": 3119531755108222951, "prediction": "2017 - 18", "binary_answer": null, "no_answer_prob": -2.9005017280578613}
{"example_id": -783242176789132678, "prediction": "Magnetic tape is a medium for magnetic recording", "binary_answer": null, "no_answer_prob": 0.9890540540218353}
{"example_id": -5194108449751231910, "prediction": "21 July 2007", "binary_answer": null, "no_answer_prob": -12.171864867210388}
{"example_id": 4067701251562760225, "prediction": "You Could Have Been with Me is Sheena Easton 's second album , released in 1981 on EMI .", "binary_answer": null, "no_answer_prob": 0.06104123592376709}
{"example_id": -6286972300938607713, "prediction": "3150 BC", "binary_answer": null, "no_answer_prob": -7.381326794624329}
{"example_id": -913619980878273000, "prediction": "A website is a collection of related web pages , including multimedia content , typically identified with a common domain name , and published on at least one web server", "binary_answer": null, "no_answer_prob": -1.6496909856796265}
{"example_id": 3449079148029779691, "prediction": "13", "binary_answer": null, "no_answer_prob": -7.222597599029541}
{"example_id": -4623820609625611622, "prediction": "247", "binary_answer": null, "no_answer_prob": -8.223783493041992}
{"example_id": -6844718002288756327, "prediction": "The Flys", "binary_answer": null, "no_answer_prob": -4.152034163475037}
{"example_id": -3604689468797734023, "prediction": "the non-canonical Gospel of Thomas", "binary_answer": null, "no_answer_prob": -7.633214473724365}
{"example_id": -184514849147804399, "prediction": "The British Empire", "binary_answer": null, "no_answer_prob": -5.328910708427429}
{"example_id": -1067733361741448875, "prediction": "the forces of the deposed King James II of England , and those of Dutch Prince William of Orange", "binary_answer": null, "no_answer_prob": -5.913164973258972}
{"example_id": -4555607120929378828, "prediction": "18", "binary_answer": null, "no_answer_prob": -4.696760416030884}
{"example_id": 3571249258802836985, "prediction": "2010 -- 11", "binary_answer": null, "no_answer_prob": -6.504569202661514}
{"example_id": 8406403268461972798, "prediction": "Dallas Cowboys", "binary_answer": null, "no_answer_prob": -7.302894592285156}
{"example_id": -1021449407380886303, "prediction": "Seattle Grace Hospital", "binary_answer": null, "no_answer_prob": -5.586153745651245}
{"example_id": 1361489657346379965, "prediction": "up to 9 cm ( 3.5 in ) in length", "binary_answer": null, "no_answer_prob": -2.170067071914673}
{"example_id": -8828918796186164857, "prediction": "`` Who 's Gonna Ride Your Wild Horses ''", "binary_answer": null, "no_answer_prob": 5.9901039600372314}
{"example_id": -6199566045645137334, "prediction": "Brad William Henke", "binary_answer": null, "no_answer_prob": -14.30839717388153}
{"example_id": 8775046812845610128, "prediction": "Iain Andrew Stirling", "binary_answer": null, "no_answer_prob": -7.440453052520752}
{"example_id": 6329885226176742212, "prediction": "The Russian Revolution", "binary_answer": null, "no_answer_prob": -2.5145318508148193}
{"example_id": 3036072595999208897, "prediction": "Penguin Random House", "binary_answer": null, "no_answer_prob": -4.803218483924866}
{"example_id": -9063078585701991493, "prediction": "Staying Alive", "binary_answer": null, "no_answer_prob": -11.652254581451416}
{"example_id": 969324601901312657, "prediction": "Robert De Niro", "binary_answer": null, "no_answer_prob": -2.2309625148773193}
{"example_id": -3562092540401324912, "prediction": "Jill Sobule", "binary_answer": null, "no_answer_prob": -10.13913893699646}
{"example_id": 3337072338727261065, "prediction": "United Arab Emirates", "binary_answer": null, "no_answer_prob": -6.648795008659363}
{"example_id": 2725249733204059116, "prediction": "The waiter turns to the relevant entry in the manual and , sure enough , finds an explanation . `` Panda", "binary_answer": null, "no_answer_prob": -6.660064220428467}
{"example_id": 2243428794543876135, "prediction": "Utah , Spain , Italy , the United Kingdom , Turkey and Jordan", "binary_answer": null, "no_answer_prob": -9.172279357910156}
{"example_id": 7825968315857673266, "prediction": "Scarlett", "binary_answer": null, "no_answer_prob": -2.3210490942001343}
{"example_id": 1682705669236106442, "prediction": "Debbie Harry", "binary_answer": null, "no_answer_prob": -7.706185817718506}
{"example_id": -3968902118559396829, "prediction": "Franklin Delano Roosevelt", "binary_answer": null, "no_answer_prob": -7.359335541725159}
{"example_id": -398695217292310666, "prediction": "the United States federal government", "binary_answer": null, "no_answer_prob": -5.466696381568909}
{"example_id": 1518801672705404588, "prediction": "Teddy Hill & the Southern Soul", "binary_answer": null, "no_answer_prob": -6.446084141731262}
{"example_id": 3300336047125027433, "prediction": "the Wild Ones", "binary_answer": null, "no_answer_prob": -8.587031245231628}
{"example_id": 4461224924498578327, "prediction": "The Wrong Side of Heaven and the Righteous Side of Hell , Volume 1", "binary_answer": null, "no_answer_prob": -3.84634792804718}
{"example_id": 4929629774274616425, "prediction": "Interstate 35", "binary_answer": null, "no_answer_prob": -0.07646632194519043}
{"example_id": 8351660399007917209, "prediction": "The Delfonics", "binary_answer": null, "no_answer_prob": -11.376391172409058}
{"example_id": 5330941236576658337, "prediction": "around 283", "binary_answer": null, "no_answer_prob": -10.745162010192871}
{"example_id": 5114918023729260444, "prediction": "Giancarlo Stanton", "binary_answer": null, "no_answer_prob": -8.367509365081787}
{"example_id": 2148321916995355755, "prediction": "The music video for LMFAO 's song `` Party Rock Anthem", "binary_answer": null, "no_answer_prob": -7.835792541503906}
{"example_id": -8469790213690751647, "prediction": "March 1899", "binary_answer": null, "no_answer_prob": -9.047305822372437}
{"example_id": 8852644387182123778, "prediction": "Hebrew", "binary_answer": null, "no_answer_prob": -7.523178935050964}
{"example_id": -2076336102858163858, "prediction": "Scranton , Pennsylvania", "binary_answer": null, "no_answer_prob": -10.042353510856628}
{"example_id": 8450959737287541026, "prediction": "Mumbai Indians", "binary_answer": null, "no_answer_prob": -7.983122229576111}
{"example_id": -3059082701489355608, "prediction": "France and Croatia", "binary_answer": null, "no_answer_prob": -1.7123534679412842}
{"example_id": 103814493284162534, "prediction": "", "binary_answer": null, "no_answer_prob": 0.1657886505126953}
{"example_id": -5824018452742426380, "prediction": "Sean Patrick Astin", "binary_answer": null, "no_answer_prob": -11.956900358200073}
{"example_id": 6080089536040218777, "prediction": "", "binary_answer": null, "no_answer_prob": 1.757475197315216}
{"example_id": -7902280161156543824, "prediction": "Hon . Idris Bolaji Muse Ariyoh", "binary_answer": null, "no_answer_prob": -9.410988211631775}
{"example_id": 89280302097645121, "prediction": "", "binary_answer": null, "no_answer_prob": 3.0539930760860443}
{"example_id": 6366045689787365015, "prediction": "April 1972", "binary_answer": null, "no_answer_prob": -12.497735142707825}
{"example_id": 2643756372524412396, "prediction": "Tevye", "binary_answer": null, "no_answer_prob": -11.882394850254059}
{"example_id": 8875295946305828186, "prediction": "Diadophis punctatus", "binary_answer": null, "no_answer_prob": -6.918700814247131}
{"example_id": -7447862485650684321, "prediction": "John Robert Magaro", "binary_answer": null, "no_answer_prob": -1.294211506843567}
{"example_id": 4671936497246425470, "prediction": "The Dead Sea ( Hebrew : \u05d9\u05b8\u05dd \u05d4\u05b7\u05de\u05b6\u05bc\u05dc\u05b7\u05d7 \u202c Yam ha - Melah lit . Sea of Salt", "binary_answer": null, "no_answer_prob": -2.6880736351013184}
{"example_id": -88720005483927913, "prediction": "Stealers Wheel", "binary_answer": null, "no_answer_prob": -5.467040538787842}
{"example_id": -327395595562778531, "prediction": "Lake Superior", "binary_answer": null, "no_answer_prob": -9.77630341053009}
{"example_id": -4416600828683164615, "prediction": "Yuzuru Hanyu", "binary_answer": null, "no_answer_prob": -6.623175382614136}
{"example_id": 8932861924167756786, "prediction": "5 23 January 3 , 2017 ( 2017 - 01 - 03 ) September 12 , 2017 ( 2017 - 09 - 12 )", "binary_answer": null, "no_answer_prob": -0.8970072269439697}
{"example_id": -37658862816410823, "prediction": "Eddie Vedder", "binary_answer": null, "no_answer_prob": -10.49584186077118}
{"example_id": -8519611006086625660, "prediction": "Elton John", "binary_answer": null, "no_answer_prob": -4.192078590393066}
{"example_id": 8883856247336819625, "prediction": "The viridans streptococci", "binary_answer": null, "no_answer_prob": -1.0237740278244019}
{"example_id": -3380649337512165552, "prediction": "swamp snake", "binary_answer": null, "no_answer_prob": -5.505212187767029}
{"example_id": 2008527746692036960, "prediction": "", "binary_answer": null, "no_answer_prob": 5.877924084663391}
{"example_id": -5506786185759375604, "prediction": "Hebrew Bible", "binary_answer": null, "no_answer_prob": -0.7414138317108154}
{"example_id": -5548701788586753156, "prediction": "Phoenix Sky Harbor International Airport", "binary_answer": null, "no_answer_prob": -10.33933162689209}
{"example_id": 6669713412571790299, "prediction": "In probability theory , the birthday problem or birthday paradox", "binary_answer": null, "no_answer_prob": 0.8041751384735107}
{"example_id": -4967731513464486859, "prediction": "level of its consistency", "binary_answer": null, "no_answer_prob": -4.261715054512024}
{"example_id": -8136449854028693199, "prediction": "John Cynn", "binary_answer": null, "no_answer_prob": -6.859480381011963}
{"example_id": 4911310878658451190, "prediction": "Pierce Brosnan", "binary_answer": null, "no_answer_prob": -7.01527214050293}
{"example_id": -1445279784379797908, "prediction": "`` A Horse with No Name ''", "binary_answer": null, "no_answer_prob": 2.974949359893799}
{"example_id": -6362765075229966631, "prediction": "Patrick Henry", "binary_answer": null, "no_answer_prob": -12.500367164611816}
{"example_id": -49340226206283816, "prediction": "New York Yankees", "binary_answer": null, "no_answer_prob": -5.655797958374023}
{"example_id": 5901832122162705976, "prediction": "1957", "binary_answer": null, "no_answer_prob": -8.509245753288269}
{"example_id": 7663793374867451272, "prediction": "Sarafina", "binary_answer": null, "no_answer_prob": -1.9569573402404785}
{"example_id": -3299092586780664808, "prediction": "Katherine Kelly Lang", "binary_answer": null, "no_answer_prob": -13.298496723175049}
{"example_id": -1016307698495484725, "prediction": "Donna Douglas", "binary_answer": null, "no_answer_prob": -11.860349178314209}
{"example_id": 1358197395788363320, "prediction": "Data collection", "binary_answer": null, "no_answer_prob": 2.2732822448015213}
{"example_id": 5464968909896383642, "prediction": "Stephen Curry", "binary_answer": null, "no_answer_prob": -8.750447750091553}
{"example_id": -8415167338348863321, "prediction": ".", "binary_answer": null, "no_answer_prob": 0.8507921695709229}
{"example_id": 3812990408226508984, "prediction": "reflecting Houston 's role as the control center of the U.S. space program", "binary_answer": null, "no_answer_prob": -4.307153940200806}
{"example_id": 4018202592913541691, "prediction": "if its imports exceed its exports .", "binary_answer": null, "no_answer_prob": 0.23624169826507568}
{"example_id": 5826112080754115258, "prediction": "Tears for Fears", "binary_answer": null, "no_answer_prob": -12.844269633293152}
{"example_id": 7561758693801794012, "prediction": "", "binary_answer": null, "no_answer_prob": 2.3339082449674606}
{"example_id": 3143257260196765132, "prediction": "George V", "binary_answer": null, "no_answer_prob": -9.270832180976868}
{"example_id": -6640515572411652622, "prediction": "March 1999", "binary_answer": null, "no_answer_prob": -10.08552873134613}
{"example_id": 4145654931447908939, "prediction": "January 12 , 2017", "binary_answer": null, "no_answer_prob": -10.527859330177307}
{"example_id": 6695790733375613327, "prediction": "season six", "binary_answer": null, "no_answer_prob": -6.526474714279175}
{"example_id": 8207458569785267153, "prediction": "Metcalf", "binary_answer": null, "no_answer_prob": -2.7406864166259766}
{"example_id": -9005114087681438911, "prediction": "2015", "binary_answer": null, "no_answer_prob": -8.27813732624054}
{"example_id": 7204103577815612274, "prediction": "Hakeem Olajuwon", "binary_answer": null, "no_answer_prob": -9.792242527008057}
{"example_id": -1339157961853840172, "prediction": "13", "binary_answer": null, "no_answer_prob": -1.4862041473388672}
{"example_id": -601343259055032720, "prediction": "Stephen Curry", "binary_answer": null, "no_answer_prob": -9.132130861282349}
{"example_id": 3021993316353753890, "prediction": "Lok Sabha", "binary_answer": null, "no_answer_prob": -7.322610855102539}
{"example_id": 1529347612060466891, "prediction": "The Pawtucket Red Sox and the Rochester Red Wings", "binary_answer": null, "no_answer_prob": -6.985953629016876}
{"example_id": -8364535714252434107, "prediction": "21", "binary_answer": null, "no_answer_prob": -2.736011505126953}
{"example_id": 5094712210422893446, "prediction": "", "binary_answer": null, "no_answer_prob": 0.7187744379043579}
{"example_id": -5115818207031283048, "prediction": "[ContextId=-1] [NoLongAnswer] [ContextId=0] [Paragraph=1] The king", "binary_answer": null, "no_answer_prob": -0.310003399848938}
{"example_id": -1283343564030787069, "prediction": "Kannauj", "binary_answer": null, "no_answer_prob": -8.528096795082092}
{"example_id": -3806684492746395313, "prediction": "May 21 , 1980", "binary_answer": null, "no_answer_prob": -10.994740962982178}
{"example_id": 1817648432958914754, "prediction": "Guinea Highlands in southeastern Guinea", "binary_answer": null, "no_answer_prob": -3.8700755834579468}
{"example_id": 5594269509949867805, "prediction": "September 25 , 1789 May 5 , 1992", "binary_answer": null, "no_answer_prob": -4.28798770904541}
{"example_id": 6551710252690197340, "prediction": ".", "binary_answer": null, "no_answer_prob": -1.9423907101154327}
{"example_id": -8540808085059622094, "prediction": "Simba", "binary_answer": null, "no_answer_prob": -5.328455090522766}
{"example_id": -88437718346179973, "prediction": "Alan Sidney Patrick Rickman", "binary_answer": null, "no_answer_prob": -9.730681657791138}
{"example_id": 280865266118415051, "prediction": "Brunei", "binary_answer": null, "no_answer_prob": -3.4364962577819824}
{"example_id": 601662836705223415, "prediction": "sodium , magnesium , aluminium , silicon , phosphorus , sulfur , chlorine , and argon", "binary_answer": null, "no_answer_prob": -1.5053467750549316}
{"example_id": -2952789772011025865, "prediction": "366", "binary_answer": null, "no_answer_prob": -8.312986373901367}
{"example_id": 8259032885448966485, "prediction": "Jiroemon Kimura", "binary_answer": null, "no_answer_prob": -11.143114805221558}
{"example_id": 7899922391870951882, "prediction": "Boston", "binary_answer": null, "no_answer_prob": -2.3435484170913696}
{"example_id": 3132167100151831390, "prediction": "Since its broadcast , the special has received generally positive reviews from critics .", "binary_answer": null, "no_answer_prob": -2.499577045440674}
{"example_id": 6282083539122211095, "prediction": "Big Boi", "binary_answer": null, "no_answer_prob": -9.482038974761963}
{"example_id": 4497024653000501374, "prediction": "Sutter 's Mill in Coloma , California", "binary_answer": null, "no_answer_prob": -8.914123713970184}
{"example_id": -664478966980538828, "prediction": "Denver Broncos", "binary_answer": null, "no_answer_prob": -6.523876190185547}
{"example_id": 8796796989753007555, "prediction": "If the jury fails to reach either a unanimous or majority verdict after a reasonable time", "binary_answer": null, "no_answer_prob": -1.2039514780044556}
{"example_id": 9193607661159717976, "prediction": "up to ten", "binary_answer": null, "no_answer_prob": -5.978881001472473}
{"example_id": 750194787168214430, "prediction": "John of Patmos", "binary_answer": null, "no_answer_prob": -6.243020057678223}
{"example_id": 7975271076117517529, "prediction": "The Thirteen Colonies", "binary_answer": null, "no_answer_prob": -6.3947060108184814}
{"example_id": 7586230684004680498, "prediction": "the Rolling Stones", "binary_answer": null, "no_answer_prob": -10.592331767082214}
{"example_id": -8903207774639964736, "prediction": "108 miles ( 174 km ) southeast of Phoenix and 60 mi ( 97 km ) north of the U.S. -- Mexico border", "binary_answer": null, "no_answer_prob": -4.48765230178833}
{"example_id": -5379350041754676016, "prediction": "2010 -- 11", "binary_answer": null, "no_answer_prob": -8.058620512485504}
{"example_id": -2574040402255032466, "prediction": "Integrated information theory ( IIT ) attempts to explain what consciousness is and why it might be associated with certain physical systems", "binary_answer": null, "no_answer_prob": 0.3228367567062378}
{"example_id": -6384677143287666495, "prediction": "Kevin Sussman", "binary_answer": null, "no_answer_prob": -1.5553271770477295}
{"example_id": 7135753806440864350, "prediction": "", "binary_answer": null, "no_answer_prob": 2.697096526622772}
{"example_id": -5724562506561055624, "prediction": "New Jersey", "binary_answer": null, "no_answer_prob": 2.9348260164260864}
{"example_id": 7655642239586343302, "prediction": "Catherine Elise Blanchett", "binary_answer": null, "no_answer_prob": -3.7120654582977295}
{"example_id": 7998464299329658776, "prediction": "September 9 , 1996", "binary_answer": null, "no_answer_prob": -7.43175482749939}
{"example_id": -4139156079597237577, "prediction": "neural -- Information sciences and social -- Relational sciences", "binary_answer": null, "no_answer_prob": -2.9676536321640015}
{"example_id": 5812955432648909315, "prediction": "Dragon Ball Z : Battle of Gods", "binary_answer": null, "no_answer_prob": -6.052956700325012}
{"example_id": -7916483423266603230, "prediction": "", "binary_answer": null, "no_answer_prob": 2.84733909368515}
{"example_id": -1662778136463601808, "prediction": "2016", "binary_answer": null, "no_answer_prob": -6.4529478549957275}
{"example_id": -2428388592339417815, "prediction": "", "binary_answer": null, "no_answer_prob": 4.456361889839172}
{"example_id": 3777090832918264756, "prediction": "October 20 , 2016", "binary_answer": null, "no_answer_prob": -6.481086254119873}
{"example_id": -1295862734292663439, "prediction": "Lori Lieberman", "binary_answer": null, "no_answer_prob": -6.014932155609131}
{"example_id": -5211591889664230368, "prediction": "", "binary_answer": null, "no_answer_prob": 1.8993277102708817}
{"example_id": 526965982792109965, "prediction": "Clint Eastwood", "binary_answer": null, "no_answer_prob": -4.96638560295105}
{"example_id": 5146059878319515436, "prediction": "slaves , cash crops , and manufactured goods", "binary_answer": null, "no_answer_prob": -4.465969800949097}
{"example_id": 5224214262209254277, "prediction": "Green Lantern Cover of Green Lantern : Rebirth # 6", "binary_answer": null, "no_answer_prob": 3.675581693649292}
{"example_id": -411215153585483702, "prediction": "coastal habitats", "binary_answer": null, "no_answer_prob": -1.7466942071914673}
{"example_id": 3495933731379934865, "prediction": "Dolby Surround 7.1", "binary_answer": null, "no_answer_prob": 3.8743046168237925}
{"example_id": -506060975415377315, "prediction": "29", "binary_answer": null, "no_answer_prob": -4.918242335319519}
{"example_id": 5066766694112642360, "prediction": "Chelsea", "binary_answer": null, "no_answer_prob": -5.170440196990967}
{"example_id": 4319601600596588048, "prediction": "The Cranberries", "binary_answer": null, "no_answer_prob": -8.088529109954834}
{"example_id": -6590226321691524248, "prediction": "Jason Scott Lee , Lauren Holly and Robert Wagner", "binary_answer": null, "no_answer_prob": -6.537307262420654}
{"example_id": 4277370647950785243, "prediction": "Salem , Oregon , and the surrounding area , as well as on the Oregon coast .", "binary_answer": null, "no_answer_prob": -5.371793270111084}
{"example_id": -6124671108556740184, "prediction": "October 6 , 2016", "binary_answer": null, "no_answer_prob": -3.550928473472595}
{"example_id": 1616638867420802596, "prediction": "Crowdfunding", "binary_answer": null, "no_answer_prob": -0.1292414665222168}
{"example_id": 3872441116308992558, "prediction": "Schenectady , New York", "binary_answer": null, "no_answer_prob": -8.473101615905762}
{"example_id": -6449180247876167971, "prediction": "In ice hockey , a team is said to be on a power play when at least one opposing player is serving a penalty", "binary_answer": null, "no_answer_prob": -2.648788094520569}
{"example_id": 4992973967728216447, "prediction": "6", "binary_answer": null, "no_answer_prob": -6.543338298797607}
{"example_id": -6673013913103106465, "prediction": "Sam Snead", "binary_answer": null, "no_answer_prob": -7.504499197006226}
{"example_id": -3587139597075780529, "prediction": "the beginning of the 8th century CE", "binary_answer": null, "no_answer_prob": -8.272294402122498}
{"example_id": -2099854288536910233, "prediction": "Hansa Atlantic Sony", "binary_answer": null, "no_answer_prob": -2.5145167112350464}
{"example_id": -1727338615562196801, "prediction": "Accuser", "binary_answer": null, "no_answer_prob": -2.621916890144348}
{"example_id": 8429459117370155423, "prediction": "A heart valve normally allows blood to flow in only one direction through the heart", "binary_answer": null, "no_answer_prob": -7.40146267414093}
{"example_id": -1343175721933797206, "prediction": "", "binary_answer": null, "no_answer_prob": 3.684069573879242}
{"example_id": -6103669093953061571, "prediction": "a remote mountain range in the Urals", "binary_answer": null, "no_answer_prob": -7.1156245470047}
{"example_id": -5741532061363371071, "prediction": "Tiana", "binary_answer": null, "no_answer_prob": -6.399246454238892}
{"example_id": 8198686008114485545, "prediction": "the radiogenic heat produced by the radioactive decay of isotopes in the mantle and crust", "binary_answer": null, "no_answer_prob": -7.096644163131714}
{"example_id": 9190891431095873447, "prediction": "sixth - largest economy", "binary_answer": null, "no_answer_prob": -5.29314649105072}
{"example_id": -2880516450702807150, "prediction": "the Latin term for long - beaked .", "binary_answer": null, "no_answer_prob": -8.829224109649658}
{"example_id": 2689099986364480143, "prediction": "Bob Dylan", "binary_answer": null, "no_answer_prob": -6.647717714309692}
{"example_id": -4915659449158888905, "prediction": "90", "binary_answer": null, "no_answer_prob": -1.853302240371704}
{"example_id": 5278727362062722246, "prediction": "Nipper", "binary_answer": null, "no_answer_prob": -7.560700535774231}
{"example_id": -1950693760324928968, "prediction": "Manchester United", "binary_answer": null, "no_answer_prob": -1.7448766231536865}
{"example_id": -1921691318227357795, "prediction": "CHORUS", "binary_answer": null, "no_answer_prob": 0.7898556292057037}
{"example_id": -8950073014553165865, "prediction": "2018", "binary_answer": null, "no_answer_prob": -9.043438911437988}
{"example_id": -324100760681938275, "prediction": "Narendra Modi", "binary_answer": null, "no_answer_prob": -4.7839741706848145}
{"example_id": 3899419894917181346, "prediction": "2008", "binary_answer": null, "no_answer_prob": -10.057702839374542}
{"example_id": -4525915450131122051, "prediction": "There are nerve cells , also known as neurons , present in our human body", "binary_answer": null, "no_answer_prob": -0.11678004264831543}
{"example_id": 4262949888553654261, "prediction": "bargad , vatavriksh , and barh", "binary_answer": null, "no_answer_prob": -6.472209930419922}
{"example_id": -1316683382741344491, "prediction": "Craig Lowndes", "binary_answer": null, "no_answer_prob": -5.826645731925964}
{"example_id": 2288301848952024551, "prediction": "Tedros Adhanom", "binary_answer": null, "no_answer_prob": -11.578734874725342}
{"example_id": -4265116221276097551, "prediction": "Simeon , Dan , Naphtali , Gad , Asher , Issachar , Zebulun , Manasseh and Ephraim", "binary_answer": null, "no_answer_prob": -3.4221959114074707}
{"example_id": -2230102276063929932, "prediction": "1970 , 1973", "binary_answer": null, "no_answer_prob": -9.658970147371292}
{"example_id": -6730057309886370465, "prediction": "6,931,071", "binary_answer": null, "no_answer_prob": -8.456357836723328}
{"example_id": 7510529741389250321, "prediction": "Jared S. Gilmore", "binary_answer": null, "no_answer_prob": -8.005347728729248}
{"example_id": -4347567190099337423, "prediction": "The Red Arrows", "binary_answer": null, "no_answer_prob": -4.223541855812073}
{"example_id": 5907766322085943901, "prediction": "Message in a Bottle", "binary_answer": null, "no_answer_prob": 2.063242055475712}
{"example_id": -7720922104700456393, "prediction": "Neil Gorsuch", "binary_answer": null, "no_answer_prob": -6.407964468002319}
{"example_id": -8716379885610595960, "prediction": "4.7 and 5.5 inches ( 120 and 140 mm )", "binary_answer": null, "no_answer_prob": -7.926453113555908}
{"example_id": 676229618573714875, "prediction": "Macdonald", "binary_answer": null, "no_answer_prob": -4.781169772148132}
{"example_id": -3493453017718850390, "prediction": "Carnotaurus", "binary_answer": null, "no_answer_prob": -5.01554936170578}
{"example_id": 3478562218510210965, "prediction": "David Imelda `` Dave '' Lamb", "binary_answer": null, "no_answer_prob": -12.416845679283142}
{"example_id": -5875349907160844545, "prediction": "during the fall season", "binary_answer": null, "no_answer_prob": -0.28604865074157715}
{"example_id": 2134938192518032811, "prediction": "Justin Timberlake", "binary_answer": null, "no_answer_prob": -11.650144219398499}
{"example_id": 8268412012395772634, "prediction": "August 7 , 1881", "binary_answer": null, "no_answer_prob": -6.756246328353882}
{"example_id": -1566299948338411564, "prediction": "Bill Murray", "binary_answer": null, "no_answer_prob": -8.805099964141846}
{"example_id": -6919281563448758695, "prediction": "Pyeongchang Halfpipe", "binary_answer": null, "no_answer_prob": -1.5940711498260498}
{"example_id": -300398662391023444, "prediction": "A film adaptation of The Natural starring Robert Redford as Roy Hobbs was released in 1984 .", "binary_answer": null, "no_answer_prob": -6.029683470726013}
{"example_id": 6933028138973193669, "prediction": "the southwestern coast of the island", "binary_answer": null, "no_answer_prob": -8.376974821090698}
{"example_id": 6847483168682461285, "prediction": "8th September 2017", "binary_answer": null, "no_answer_prob": -9.27732527256012}
{"example_id": 6451287467374052151, "prediction": "Audrey Hepburn and George Peppard", "binary_answer": null, "no_answer_prob": -9.0536470413208}
{"example_id": 6059882643797764017, "prediction": "The Miracles", "binary_answer": null, "no_answer_prob": -7.534870743751526}
{"example_id": -4375882542141446572, "prediction": "the sun", "binary_answer": null, "no_answer_prob": -7.194590330123901}
{"example_id": -3334913234103339998, "prediction": "1988", "binary_answer": null, "no_answer_prob": -8.36987592279911}
{"example_id": 6794927844642561619, "prediction": "Los Angeles , California", "binary_answer": null, "no_answer_prob": -4.463745594024658}
{"example_id": -671343286763224953, "prediction": "Daisy", "binary_answer": null, "no_answer_prob": -7.262601375579834}
{"example_id": -8570741260785254801, "prediction": "37 -- 73 minutes", "binary_answer": null, "no_answer_prob": -2.9053722620010376}
{"example_id": -472916277033969389, "prediction": "`` solve a specific age - appropriate problem utilizing basic math skills", "binary_answer": null, "no_answer_prob": 2.0617269836366177}
{"example_id": -4396446690626410751, "prediction": "The MLIS or MLS degree is usually acquired", "binary_answer": null, "no_answer_prob": -2.262656092643738}
{"example_id": -8723007522114470742, "prediction": "Myanmar , northern Thailand , northern Vietnam , and northern Luzon of the Philippines", "binary_answer": null, "no_answer_prob": -1.8405780792236328}
{"example_id": -8561418966099492869, "prediction": "Alamodome in San Antonio , Texas", "binary_answer": null, "no_answer_prob": -6.253152370452881}
{"example_id": 8845144145742962785, "prediction": "Zoe Saldana - Perego", "binary_answer": null, "no_answer_prob": -7.088999807834625}
{"example_id": -1953038377290516285, "prediction": "John Kander", "binary_answer": null, "no_answer_prob": -12.536253929138184}
{"example_id": -18907135176901347, "prediction": "Vince Lombardi", "binary_answer": null, "no_answer_prob": -0.6972184181213379}
{"example_id": -5863393738946188631, "prediction": "rapping , djaying , beatboxing and breaking", "binary_answer": null, "no_answer_prob": -7.3828572034835815}
{"example_id": 3224239243235222174, "prediction": "Santonio Holmes", "binary_answer": null, "no_answer_prob": -5.400698184967041}
{"example_id": -2396453956760763463, "prediction": "Theresa May", "binary_answer": null, "no_answer_prob": -8.915021181106567}
{"example_id": -5562694927741210554, "prediction": "10 May 1940", "binary_answer": null, "no_answer_prob": -10.086426258087158}
{"example_id": -2959935050988596176, "prediction": "Pino , California", "binary_answer": null, "no_answer_prob": 2.3684166371822357}
{"example_id": -3578394134252581, "prediction": "The Himalayan range has many of the Earth 's highest peaks , including the highest , Mount Everest", "binary_answer": null, "no_answer_prob": -0.9360351450741291}
{"example_id": -6786420307215595113, "prediction": "Endless Love", "binary_answer": null, "no_answer_prob": -6.573446989059448}
{"example_id": -4160515052735027671, "prediction": "where m and b designate constants ( parameters )", "binary_answer": null, "no_answer_prob": 0.8522770404815674}
{"example_id": -3422681705334541693, "prediction": "northeastern United States", "binary_answer": null, "no_answer_prob": -5.050545394420624}
{"example_id": 3630994325243827850, "prediction": "1957", "binary_answer": null, "no_answer_prob": -9.786915898323059}
{"example_id": -550870475070866888, "prediction": "Not Afraid", "binary_answer": null, "no_answer_prob": -6.199352264404297}
{"example_id": 773838079842099403, "prediction": "July 20 , 1969", "binary_answer": null, "no_answer_prob": -10.157881021499634}
{"example_id": -6755191382094968028, "prediction": "", "binary_answer": null, "no_answer_prob": 2.3863560929894447}
{"example_id": 416035876866880719, "prediction": "John Hancock", "binary_answer": null, "no_answer_prob": -5.835405349731445}
{"example_id": -4565960426615421529, "prediction": "An acting company prepares to rehearse the play The Rules of the Game", "binary_answer": null, "no_answer_prob": 1.4598520398139954}
{"example_id": -2750191549338820772, "prediction": "nasopharynx", "binary_answer": null, "no_answer_prob": -9.831434488296509}
{"example_id": -6036754827460950862, "prediction": "Donna Summer", "binary_answer": null, "no_answer_prob": -12.109759449958801}
{"example_id": -4673918127825193912, "prediction": "1789", "binary_answer": null, "no_answer_prob": -2.4576486349105835}
{"example_id": 2821997138793050066, "prediction": "Kenneth Howard Norton", "binary_answer": null, "no_answer_prob": -5.666471362113953}
{"example_id": 2880763444157938261, "prediction": "Preslav Literary School in the First Bulgarian Empire", "binary_answer": null, "no_answer_prob": -7.304695963859558}
{"example_id": -7121242283645893242, "prediction": "the `` Little Creamery '' in Brenham , Texas", "binary_answer": null, "no_answer_prob": -8.454232335090637}
{"example_id": 908684624871162279, "prediction": "The site of the ancient city contains the ruins of a Bronze Age fortified city", "binary_answer": null, "no_answer_prob": 0.5744946002960205}
{"example_id": -3938418128963750084, "prediction": "22 January 1901", "binary_answer": null, "no_answer_prob": -5.55946671962738}
{"example_id": -5658908063988599530, "prediction": "curtailing the rights of former Confederates , such as through the provisions of the Wade -- Davis Bill", "binary_answer": null, "no_answer_prob": -0.25870585441589355}
{"example_id": 5320523448309021485, "prediction": "4,226 feet ( 1288 m )", "binary_answer": null, "no_answer_prob": -2.7375524044036865}
{"example_id": 878819802175317138, "prediction": "The causes of the French Revolution can be attributed to several intertwining factors :", "binary_answer": null, "no_answer_prob": -1.410703182220459}
{"example_id": 4874743856997153275, "prediction": "Camden County", "binary_answer": null, "no_answer_prob": -12.309858322143555}
{"example_id": -2806574866301317815, "prediction": "Somatic nervous system 1 . ( Brain ) Precentral gyrus : the origin of nerve signals initiating movement", "binary_answer": null, "no_answer_prob": 4.719745606184006}
{"example_id": 2706772051202747690, "prediction": "an onomatopoeia of the sound made by the sandals when walking in them", "binary_answer": null, "no_answer_prob": -6.769906520843506}
{"example_id": -3626536403950874804, "prediction": "fourth season", "binary_answer": null, "no_answer_prob": -1.933289885520935}
{"example_id": 7680250653225458398, "prediction": "East High stage", "binary_answer": null, "no_answer_prob": -2.219693660736084}
{"example_id": -4188033072191712006, "prediction": "November 9", "binary_answer": null, "no_answer_prob": -1.4384126663208008}
{"example_id": 1203799362752889541, "prediction": "decisions regarding investment , production , and distribution are based on the interplay of supply and demand , which determines the prices of goods and services", "binary_answer": null, "no_answer_prob": -0.984046220779419}
{"example_id": -7612024550116089321, "prediction": "Brad Rutter", "binary_answer": null, "no_answer_prob": -8.360599756240845}
{"example_id": -593737677179785327, "prediction": "The river rises in northwestern Wyoming", "binary_answer": null, "no_answer_prob": -1.9565153121948242}
{"example_id": -185822075714062550, "prediction": "", "binary_answer": null, "no_answer_prob": -4.224567532539368}
{"example_id": 2201445608240998715, "prediction": "22", "binary_answer": null, "no_answer_prob": -7.386041164398193}
{"example_id": 6704097382281610906, "prediction": "Monopolies can be established by a government , form naturally , or form by integration .", "binary_answer": null, "no_answer_prob": -0.43719005584716797}
{"example_id": 7845678000213926561, "prediction": "Selena Gomez & the Scene", "binary_answer": null, "no_answer_prob": -9.645416140556335}
{"example_id": 4313852159449213029, "prediction": "Nicholas II or Nikolai II", "binary_answer": null, "no_answer_prob": -10.099276423454285}
{"example_id": -5333049627570569397, "prediction": "1990", "binary_answer": null, "no_answer_prob": -8.682212710380554}
{"example_id": 648107110992565869, "prediction": "Apollo 17", "binary_answer": null, "no_answer_prob": -6.645974516868591}
{"example_id": -4466814547263499533, "prediction": "Luton Airport", "binary_answer": null, "no_answer_prob": 0.7643735408782959}
{"example_id": -7243407486979574406, "prediction": "Sigmund Freud", "binary_answer": null, "no_answer_prob": -7.598135709762573}
{"example_id": -8296135311678816131, "prediction": "June 18 , 2018 ( 2018 - 06 - 18 ) TBA", "binary_answer": null, "no_answer_prob": -2.7301536202430725}
{"example_id": 2931333912169043178, "prediction": "John Bercow", "binary_answer": null, "no_answer_prob": -10.801064252853394}
{"example_id": 5615520588656931873, "prediction": "Victoria Leigh Blum", "binary_answer": null, "no_answer_prob": -10.118081212043762}
{"example_id": -6041295842342855092, "prediction": "Kimberly Brubaker Bradley", "binary_answer": null, "no_answer_prob": -12.65705156326294}
{"example_id": 5981180143912321829, "prediction": "", "binary_answer": null, "no_answer_prob": 3.6136667132377625}
{"example_id": -8299327714934748282, "prediction": "Snowdon", "binary_answer": null, "no_answer_prob": -6.000926494598389}
{"example_id": -7501795688741775956, "prediction": "Eagles", "binary_answer": null, "no_answer_prob": -6.886950433254242}
{"example_id": 6535327901184619009, "prediction": "small intestine", "binary_answer": null, "no_answer_prob": -0.2789773941040039}
{"example_id": 5053530425008567999, "prediction": "Io , Europa , Ganymede , and Callisto", "binary_answer": null, "no_answer_prob": -7.628088116645813}
{"example_id": 8674362045981682121, "prediction": "Christopher Cross", "binary_answer": null, "no_answer_prob": -9.549935817718506}
{"example_id": 1449590311317003611, "prediction": "a similar chord structure .", "binary_answer": null, "no_answer_prob": -2.0517685413360596}
{"example_id": -3014439715701017818, "prediction": "late April and early May 1945", "binary_answer": null, "no_answer_prob": -6.937013626098633}
{"example_id": 7703160512514943325, "prediction": "Delaware", "binary_answer": null, "no_answer_prob": -6.481381416320801}
{"example_id": 1093277294874038598, "prediction": "Scott Wilk", "binary_answer": null, "no_answer_prob": -4.179266095161438}
{"example_id": 2398620722121614226, "prediction": "Sammy Fain", "binary_answer": null, "no_answer_prob": -10.022850632667542}
{"example_id": -4361842955899466716, "prediction": "the five participating regions : Africa , Asia , America , Oceania and Europe", "binary_answer": null, "no_answer_prob": -5.8397910594940186}
{"example_id": -9035534225130269976, "prediction": "William Shakespeare", "binary_answer": null, "no_answer_prob": -12.921867966651917}
{"example_id": 4104206487285547804, "prediction": "in the Western Mediterranean , to the immediate south of the French island of Corsica .", "binary_answer": null, "no_answer_prob": -7.612825870513916}
{"example_id": -8784014532005017657, "prediction": "Sean Connery", "binary_answer": null, "no_answer_prob": -1.5831629037857056}
{"example_id": 7538105252315279291, "prediction": "Brahma the Creator , Vishnu the Preserver , and Shiva the Destroyer", "binary_answer": null, "no_answer_prob": -3.49761700630188}
{"example_id": -8673571620754144399, "prediction": "In population genetics , gene flow", "binary_answer": null, "no_answer_prob": 0.5751042366027832}
{"example_id": 5973430427289461596, "prediction": "to obscure the relationship between the key and the ciphertext -- Shannon 's property of confusion .", "binary_answer": null, "no_answer_prob": -5.927834510803223}
{"example_id": -5262266930523233451, "prediction": "November 15 , 2013", "binary_answer": null, "no_answer_prob": -8.530167818069458}
{"example_id": 1681263313889159963, "prediction": "Noble gas compounds", "binary_answer": null, "no_answer_prob": -8.976337909698486}
{"example_id": 186181115175340669, "prediction": "Green Bay Packers", "binary_answer": null, "no_answer_prob": -6.083116054534912}
{"example_id": 979017841265121724, "prediction": "2016", "binary_answer": null, "no_answer_prob": -8.185995519161224}
{"example_id": -4358039032953240899, "prediction": "1", "binary_answer": null, "no_answer_prob": -8.244933247566223}
{"example_id": -4875625407842699318, "prediction": "Rob McElhenney who developed it with Glenn Howerton", "binary_answer": null, "no_answer_prob": -7.2547725439071655}
{"example_id": -922069763848898980, "prediction": "`` Pit Stops", "binary_answer": null, "no_answer_prob": 0.6103405952453613}
{"example_id": 3553897128964761965, "prediction": "", "binary_answer": null, "no_answer_prob": 1.7481865882873535}
{"example_id": -1646821937415717394, "prediction": "Robert Andrew McGladdery", "binary_answer": null, "no_answer_prob": -12.002115607261658}
{"example_id": 3053384651214436293, "prediction": "The 1986 Chernobyl disaster triggered the release of substantial amounts of radioactivity", "binary_answer": null, "no_answer_prob": -2.4727041721343994}
{"example_id": 5004366395871527947, "prediction": "university marching bands", "binary_answer": null, "no_answer_prob": -0.4960888624191284}
{"example_id": 5142884761317633335, "prediction": "Megan Mullally", "binary_answer": null, "no_answer_prob": -8.058491230010986}
{"example_id": -6121678671414277509, "prediction": "18", "binary_answer": null, "no_answer_prob": -8.82707965373993}
{"example_id": -4564644739544350083, "prediction": "[ContextId=-1] [NoLongAnswer] [ContextId=0] [Table=1]", "binary_answer": null, "no_answer_prob": 1.9894659519195557}
{"example_id": 7631360939705650347, "prediction": "Ridley Scott", "binary_answer": null, "no_answer_prob": -11.557607769966125}
{"example_id": 5557550006522357115, "prediction": "around 1600", "binary_answer": null, "no_answer_prob": -7.8933950662612915}
{"example_id": -5741181354494823999, "prediction": "`` Delicate", "binary_answer": null, "no_answer_prob": -4.362773895263672}
{"example_id": 4435186283977609746, "prediction": "Russia", "binary_answer": null, "no_answer_prob": -10.571397364139557}
{"example_id": 5146730298316782239, "prediction": "Yvonne Marianne Elliman", "binary_answer": null, "no_answer_prob": -7.5414958000183105}
{"example_id": 7206472777975261414, "prediction": "Artax", "binary_answer": null, "no_answer_prob": -6.983666896820068}
{"example_id": -5974603066256812381, "prediction": "1931", "binary_answer": null, "no_answer_prob": -4.000918865203857}
{"example_id": 8888820056108909829, "prediction": "1 January 1904", "binary_answer": null, "no_answer_prob": -6.543839931488037}
{"example_id": -584014262632318057, "prediction": "Article Five of the United States Constitution describes the process whereby the Constitution , the nation 's frame of government , may be altered", "binary_answer": null, "no_answer_prob": -7.995413184165955}
{"example_id": -1913782093169433455, "prediction": "", "binary_answer": null, "no_answer_prob": -6.01975405216217}
{"example_id": 2510334644407695934, "prediction": "DeMarco Murray", "binary_answer": null, "no_answer_prob": -11.79198032617569}
{"example_id": -6752982719531275085, "prediction": "Portugal under Henry the Navigator", "binary_answer": null, "no_answer_prob": -1.5921496152877808}
{"example_id": -1507477232221481184, "prediction": "18 years or older", "binary_answer": null, "no_answer_prob": -6.929661989212036}
{"example_id": -3319273542838643375, "prediction": "[ContextId=-1] [NoLongAnswer] [ContextId=0]", "binary_answer": null, "no_answer_prob": 4.038027703762054}
{"example_id": 961960731165923057, "prediction": "Charlton Heston", "binary_answer": null, "no_answer_prob": -6.720831632614136}
{"example_id": 3165390421169765070, "prediction": "Heathcliff Andrew Ledger", "binary_answer": null, "no_answer_prob": -5.017813563346863}
{"example_id": 3662525361480473895, "prediction": "1994", "binary_answer": null, "no_answer_prob": -7.087378263473511}
{"example_id": -1811119010021354688, "prediction": "Simone Arianne Biles", "binary_answer": null, "no_answer_prob": -7.877274751663208}
{"example_id": 3728096353136074107, "prediction": "herbs and sugar", "binary_answer": null, "no_answer_prob": -4.647509455680847}
{"example_id": 442673897495140695, "prediction": "Iago", "binary_answer": null, "no_answer_prob": -6.663148880004883}
{"example_id": 2725480316434557991, "prediction": "Wolf v. Colorado", "binary_answer": null, "no_answer_prob": -3.745759963989258}
{"example_id": 1736622075043572243, "prediction": "Edd Kimber", "binary_answer": null, "no_answer_prob": -6.965868234634399}
{"example_id": 7430959133297391237, "prediction": "the third episode of the fifth season , titled `` The Quarterback ''", "binary_answer": null, "no_answer_prob": -4.8873032331466675}
{"example_id": 7344887121662171588, "prediction": "2007", "binary_answer": null, "no_answer_prob": -9.053587943315506}
{"example_id": -7201506064142607984, "prediction": "Kangchenjunga", "binary_answer": null, "no_answer_prob": -4.986169815063477}
{"example_id": -4184192392780326541, "prediction": "A music video was filmed to promote the single . In America , it received active rotation on MTV .", "binary_answer": null, "no_answer_prob": -4.047701835632324}
{"example_id": 3386507892752554052, "prediction": "prohibited in any school , except for authorized security personnel or armed marshals .", "binary_answer": null, "no_answer_prob": 1.915893018245697}
{"example_id": -1886827504821971176, "prediction": "President Franklin D. Roosevelt", "binary_answer": null, "no_answer_prob": -3.3716598749160767}
{"example_id": -3058681202143997361, "prediction": "", "binary_answer": null, "no_answer_prob": 2.4321926832199097}
{"example_id": -4262416515038237866, "prediction": "Kim Carnes", "binary_answer": null, "no_answer_prob": -9.028224289417267}
{"example_id": 4599818683267685144, "prediction": "Gary Clark", "binary_answer": null, "no_answer_prob": -7.114534139633179}
{"example_id": 6178525411886667879, "prediction": "Martin John Christopher Freeman", "binary_answer": null, "no_answer_prob": -13.399835228919983}
{"example_id": -6162333554866458614, "prediction": "Caesar", "binary_answer": null, "no_answer_prob": 1.8973127603530884}
{"example_id": 3962652752183366973, "prediction": "September 19 , 2016", "binary_answer": null, "no_answer_prob": -3.9588416814804077}
{"example_id": -152205142552387889, "prediction": "T\u014dru `` TK '' Kitajima", "binary_answer": null, "no_answer_prob": -10.027960300445557}
{"example_id": 42487324306043315, "prediction": "Puducherry", "binary_answer": null, "no_answer_prob": -2.052765369415283}
{"example_id": -3017048041506678277, "prediction": "Katy Perry", "binary_answer": null, "no_answer_prob": -6.709493160247803}
{"example_id": 2000594543803840275, "prediction": "two", "binary_answer": null, "no_answer_prob": -5.893071174621582}
{"example_id": -1526770203553756272, "prediction": "Jonathan Drew Groff", "binary_answer": null, "no_answer_prob": -4.4989423751831055}
{"example_id": 1980292495431083212, "prediction": "Southern Hemisphere", "binary_answer": null, "no_answer_prob": -5.163057446479797}
{"example_id": 9100851838980618940, "prediction": "", "binary_answer": null, "no_answer_prob": 2.1316616535186768}
{"example_id": -891066635679048995, "prediction": "2010", "binary_answer": null, "no_answer_prob": -9.741121411323547}
{"example_id": -1929978794344186382, "prediction": "November 4 , 2016", "binary_answer": null, "no_answer_prob": -12.032678127288818}
{"example_id": 2062734965036424156, "prediction": "April 24 , 2018", "binary_answer": null, "no_answer_prob": -5.407985210418701}
{"example_id": -5949783439292862722, "prediction": "2002", "binary_answer": null, "no_answer_prob": -5.384233355522156}
{"example_id": -406531455840864444, "prediction": "Eddie Murphy", "binary_answer": null, "no_answer_prob": -10.057289600372314}
{"example_id": 7954642635917102763, "prediction": "The Voyageurs", "binary_answer": null, "no_answer_prob": -1.074662208557129}
{"example_id": -1325146971563258681, "prediction": "Bon Jovi guitarist Richie Sambora", "binary_answer": null, "no_answer_prob": -0.6774842739105225}
{"example_id": -609161534118796334, "prediction": "June 15 , 1994", "binary_answer": null, "no_answer_prob": -9.89072322845459}
{"example_id": 8439040944638176449, "prediction": "The usual English proper name for Earth 's natural satellite is `` the Moon '' . The noun moon is derived from moone", "binary_answer": null, "no_answer_prob": -3.087273120880127}
{"example_id": 2987825150735046357, "prediction": "3 September 2013", "binary_answer": null, "no_answer_prob": -6.993158280849457}
{"example_id": -7142245749745776633, "prediction": "The flag of the United States of America , often referred to as the American flag , is the national flag of the United States", "binary_answer": null, "no_answer_prob": -6.534769415855408}
{"example_id": -7127869736743610405, "prediction": "September 11 , 1974", "binary_answer": null, "no_answer_prob": -4.99444454908371}
{"example_id": -9090848222765992947, "prediction": "", "binary_answer": null, "no_answer_prob": -3.6502312421798706}
{"example_id": 6181794695224567324, "prediction": "It is available via its traditional derivation ( a mechanical equation of state ) , or via a derivation based in statistical thermodynamics", "binary_answer": null, "no_answer_prob": 1.6379770338535309}
{"example_id": -8534837945449659988, "prediction": "1993", "binary_answer": null, "no_answer_prob": -9.579369187355042}
{"example_id": -7840649730039677746, "prediction": "January 1 , 2018", "binary_answer": null, "no_answer_prob": -6.541943073272705}
{"example_id": 4374065344468420055, "prediction": "Wargrave", "binary_answer": null, "no_answer_prob": -7.127127647399902}
{"example_id": 9196950051804493286, "prediction": "", "binary_answer": null, "no_answer_prob": -4.6658161878585815}
{"example_id": 4032723618566492898, "prediction": "United States", "binary_answer": null, "no_answer_prob": -8.82546615600586}
{"example_id": -7219970968305775032, "prediction": "8", "binary_answer": null, "no_answer_prob": -8.89167034626007}
{"example_id": -4166659670216895268, "prediction": "D\u00eda de Muertos", "binary_answer": null, "no_answer_prob": -6.727237939834595}
{"example_id": -2112269171028726044, "prediction": "", "binary_answer": null, "no_answer_prob": -0.18889343738555908}
{"example_id": -3277425061797905582, "prediction": "William Tell 's feat .", "binary_answer": null, "no_answer_prob": -1.5097975134849548}
{"example_id": -7719864220311874429, "prediction": "Indian Ocean , Southern ( Antarctic ) Ocean , and Arctic Ocean", "binary_answer": null, "no_answer_prob": 2.4131808038800955}
{"example_id": 2157780501938755330, "prediction": "January 2006", "binary_answer": null, "no_answer_prob": -9.763270378112793}
{"example_id": 3570201737329114232, "prediction": "near the ulna bone", "binary_answer": null, "no_answer_prob": -6.628689408302307}
{"example_id": -6218634509249591890, "prediction": "6", "binary_answer": null, "no_answer_prob": 0.3085676431655884}
{"example_id": -5974448695270116115, "prediction": "480 ft ( 146 m )", "binary_answer": null, "no_answer_prob": -3.0956153869628906}
{"example_id": 5687139768598521226, "prediction": "General Theory of Employment , Interest and Money", "binary_answer": null, "no_answer_prob": 1.430589199066162}
{"example_id": -4670358332103076641, "prediction": "Donald Bradman", "binary_answer": null, "no_answer_prob": -7.724050283432007}
{"example_id": -8981654012514929728, "prediction": "Elias Koteas", "binary_answer": null, "no_answer_prob": -11.337560176849365}
{"example_id": 4893771522777365152, "prediction": "George Harrison", "binary_answer": null, "no_answer_prob": -12.574903845787048}
{"example_id": -6224694729879812527, "prediction": "1986", "binary_answer": null, "no_answer_prob": 0.6564458608627319}
{"example_id": 6819270051898582650, "prediction": "Winnie - the - Pooh", "binary_answer": null, "no_answer_prob": 2.740241229534149}
{"example_id": -916779827337808815, "prediction": "Michael , Gabriel , Raphael , Uriel , Camael , Jophiel , and Zadkiel .", "binary_answer": null, "no_answer_prob": -3.9292235374450684}
{"example_id": 9184423694390444844, "prediction": "186", "binary_answer": null, "no_answer_prob": -7.90020751953125}
{"example_id": 6425479750329143813, "prediction": "Flower Drum Song", "binary_answer": null, "no_answer_prob": -9.89834988117218}
{"example_id": -8578146909499274860, "prediction": "In 2012 , Skelton became the first person to reach the South Pole using a bicycle", "binary_answer": null, "no_answer_prob": -1.575684905052185}
{"example_id": 4570336723049550086, "prediction": ".", "binary_answer": null, "no_answer_prob": -1.342684030532837}
{"example_id": -8340633478893030582, "prediction": "Trey Parker , Robert Lopez and Matt Stone", "binary_answer": null, "no_answer_prob": -3.0582879781723022}
{"example_id": -7247685073590147464, "prediction": "Operation Safed Sagar", "binary_answer": null, "no_answer_prob": -9.166534185409546}
{"example_id": -3342236790058242656, "prediction": "", "binary_answer": null, "no_answer_prob": -3.1663841009140015}
{"example_id": -2046476599353137664, "prediction": "Kelly Gordon", "binary_answer": null, "no_answer_prob": -10.982741594314575}
{"example_id": 5030093591499374068, "prediction": "allows users to store files in the cloud , synchronize files across devices , and share files", "binary_answer": null, "no_answer_prob": -1.1372743844985962}
{"example_id": -179985532028050959, "prediction": "Frederick Hubbard Gwynne", "binary_answer": null, "no_answer_prob": -9.903345704078674}
{"example_id": -6273820990720866885, "prediction": "New developments took place after India achieved independence", "binary_answer": null, "no_answer_prob": 0.10926973819732666}
{"example_id": 7662094105996254842, "prediction": "December 1917", "binary_answer": null, "no_answer_prob": -8.455934524536133}
{"example_id": -7310092817258812068, "prediction": "Dionne Warwick", "binary_answer": null, "no_answer_prob": -8.902464985847473}
{"example_id": 4428477063386338964, "prediction": "Janet Gaynor", "binary_answer": null, "no_answer_prob": -5.913517117500305}
{"example_id": -5110677473958992001, "prediction": "17 miles ( 27 km )", "binary_answer": null, "no_answer_prob": -3.1221200227737427}
{"example_id": 8485594859075875751, "prediction": "a subtext of celebration about that death and of the slaves having contributed to it through deliberate negligence or even deniable action .", "binary_answer": null, "no_answer_prob": -3.6285274028778076}
{"example_id": -610789312044350118, "prediction": "The Incredible Burt Wonderstone", "binary_answer": null, "no_answer_prob": -6.919649124145508}
{"example_id": -5103942755635115713, "prediction": "\u03b1 - tocopherol", "binary_answer": null, "no_answer_prob": -5.793624520301819}
{"example_id": -3496494429125792878, "prediction": "the 20th century", "binary_answer": null, "no_answer_prob": -7.205172598361969}
{"example_id": 5331675519545383492, "prediction": "1944", "binary_answer": null, "no_answer_prob": 0.5616582632064819}
{"example_id": -8377555240087691312, "prediction": "the people of France", "binary_answer": null, "no_answer_prob": -11.428722977638245}
{"example_id": 7683050486467741935, "prediction": "every four years", "binary_answer": null, "no_answer_prob": -2.0938310623168945}
{"example_id": 8166336796034684596, "prediction": "The Tribe of Levi served particular religious duties for the Israelites and had political and educational responsibilities as well", "binary_answer": null, "no_answer_prob": 0.9465841054916382}
{"example_id": -2844708009105670382, "prediction": "The increased availability of nutrients", "binary_answer": null, "no_answer_prob": -1.0975669622421265}
{"example_id": -1066867686550304382, "prediction": "Stanford University", "binary_answer": null, "no_answer_prob": -2.264001965522766}
{"example_id": -6603844747009996534, "prediction": "2000", "binary_answer": null, "no_answer_prob": -9.652326941490173}
{"example_id": -8645060630087450787, "prediction": "In computer science , a high - level programming language is a programming language with strong abstraction from the details of the computer", "binary_answer": null, "no_answer_prob": -3.1894264221191406}
{"example_id": -1973367318241141257, "prediction": "defends the strategy of nonviolent resistance to racism", "binary_answer": null, "no_answer_prob": -7.778748631477356}
{"example_id": 433860295881558835, "prediction": "10 December 1948", "binary_answer": null, "no_answer_prob": -11.219697773456573}
{"example_id": 3124593057061980254, "prediction": "Homer", "binary_answer": null, "no_answer_prob": -10.183884978294373}
{"example_id": -5264683755889871760, "prediction": "June 23 , 2017", "binary_answer": null, "no_answer_prob": -4.595182418823242}
{"example_id": 1898408217132572194, "prediction": "26 June 1997", "binary_answer": null, "no_answer_prob": -12.129119992256165}
{"example_id": -1571031626000266279, "prediction": "Tuesday , 6 June 1944", "binary_answer": null, "no_answer_prob": -9.34253740310669}
{"example_id": -7510598543762331407, "prediction": "Health is the ability of a biological system to acquire , convert , allocate , distribute , and utilize energy with maximum efficiency", "binary_answer": null, "no_answer_prob": 4.961209148168564}
{"example_id": -3864774070623381201, "prediction": "Livin ' on a Prayer", "binary_answer": null, "no_answer_prob": -5.531340599060059}
{"example_id": -1107953808744692072, "prediction": "[ContextId=-1] [NoLongAnswer] [ContextId=0]", "binary_answer": null, "no_answer_prob": 5.715351223945618}
{"example_id": -2063915997175470045, "prediction": "Scottish / Hispanic origin", "binary_answer": null, "no_answer_prob": -8.313678979873657}
{"example_id": 1815743836097284535, "prediction": "Cerebral lobes Lateral surface of cerebrum. 4 lobes are shown . Medial surface of cerebrum", "binary_answer": null, "no_answer_prob": -3.6860222816467285}
{"example_id": 7901822206564032502, "prediction": "Grover Washington Jr. and Bill Withers", "binary_answer": null, "no_answer_prob": -9.386493921279907}
{"example_id": 2743465127030993088, "prediction": "1984", "binary_answer": null, "no_answer_prob": -10.618325591087341}
{"example_id": 8952596042745123728, "prediction": "A Song for Young Love", "binary_answer": null, "no_answer_prob": -2.869759440422058}
{"example_id": -6781927726630191682, "prediction": "1986", "binary_answer": null, "no_answer_prob": -9.115359634160995}
{"example_id": -8602721944840980647, "prediction": "Daniel J. Travanti", "binary_answer": null, "no_answer_prob": -9.709693312644958}
{"example_id": 8035634299863591515, "prediction": "The Posse Comitatus Act is a United States federal law ( 18 U.S.C. \u00a7 1385 , original at 20 Stat. 152", "binary_answer": null, "no_answer_prob": 0.3699793294072151}
{"example_id": -9115776326354221377, "prediction": "Emma Watson and Dan Stevens", "binary_answer": null, "no_answer_prob": -6.027971386909485}
{"example_id": -6533471219865480346, "prediction": "1776", "binary_answer": null, "no_answer_prob": -5.407170653343201}
{"example_id": -6852574981544078973, "prediction": "Salt - N - Pepa Hip - hop 11 ! WrestleMania XI", "binary_answer": null, "no_answer_prob": -3.813612461090088}
{"example_id": 1803103670371471654, "prediction": "In the album version of The Wall , `` Another Brick in the Wall ( Part 2 )", "binary_answer": null, "no_answer_prob": 4.047217130661011}
{"example_id": 4372135894747578717, "prediction": "Clint Eastwood", "binary_answer": null, "no_answer_prob": -6.21187698841095}
{"example_id": 5700508208773656117, "prediction": "Prince", "binary_answer": null, "no_answer_prob": -8.017743349075317}
{"example_id": -6565455394278301367, "prediction": "Cause - and - effect diagram", "binary_answer": null, "no_answer_prob": -0.47419965267181396}
{"example_id": 5415390249197251598, "prediction": "Lonnie Donegan", "binary_answer": null, "no_answer_prob": -12.616082310676575}
{"example_id": -7826444086913016557, "prediction": "Empedocles", "binary_answer": null, "no_answer_prob": -7.355418682098389}
{"example_id": -3883260061895046430, "prediction": "futhorc runic alphabet", "binary_answer": null, "no_answer_prob": -5.05308985710144}
{"example_id": -5717578118321923724, "prediction": "1865", "binary_answer": null, "no_answer_prob": -4.922234416007996}
{"example_id": 8643312430506346423, "prediction": "Tim Nolan", "binary_answer": null, "no_answer_prob": 0.4812356233596802}
{"example_id": 9197697957976007899, "prediction": "Heart", "binary_answer": null, "no_answer_prob": -2.00098192691803}
{"example_id": -2286222794759361450, "prediction": "to describe a situation in a shared - resource system where individual users acting independently according to their own self - interest behave", "binary_answer": null, "no_answer_prob": -4.115481495857239}
{"example_id": -3553235339366934090, "prediction": "of the South", "binary_answer": null, "no_answer_prob": -10.48246717453003}
{"example_id": -3672564080898419789, "prediction": "Bahrain China Indonesia Kuwait Oman Qatar South Africa Turkey United Arab Emirates Vietnam", "binary_answer": null, "no_answer_prob": 0.2534966468811035}
{"example_id": -31266649331466609, "prediction": "salinization", "binary_answer": null, "no_answer_prob": -6.261650443077087}
{"example_id": 6124564607389583163, "prediction": "Agni Varsha", "binary_answer": null, "no_answer_prob": 1.9312422461807728}
{"example_id": -1456243076459900802, "prediction": "Michael Jordan", "binary_answer": null, "no_answer_prob": -7.500293135643005}
{"example_id": 7208198842462801612, "prediction": "October 22 , 2012", "binary_answer": null, "no_answer_prob": -12.61325979232788}
{"example_id": 6862030151279084629, "prediction": "the coast of Northern California", "binary_answer": null, "no_answer_prob": -10.086568832397461}
{"example_id": -82488502400960368, "prediction": "Luka Modri\u0107", "binary_answer": null, "no_answer_prob": -9.138492465019226}
{"example_id": 2941278300029878738, "prediction": "", "binary_answer": null, "no_answer_prob": 3.2255402505397797}
{"example_id": -2447168067966798861, "prediction": "Sonata for Piano and Violin No. 21 in E minor , K. 304 ( K300c )", "binary_answer": null, "no_answer_prob": 3.314157545566559}
{"example_id": -1622898892596900140, "prediction": "UCLA", "binary_answer": null, "no_answer_prob": 0.9184796810150146}
{"example_id": 4298264361651415013, "prediction": "Beth Maitland", "binary_answer": null, "no_answer_prob": -11.96726942062378}
{"example_id": -8295818356153818250, "prediction": "January 1 , 2018", "binary_answer": null, "no_answer_prob": -11.75545847415924}
{"example_id": -5779087988204512933, "prediction": "Jeppe Tranholm - Mikkelsen", "binary_answer": null, "no_answer_prob": -3.757291555404663}
{"example_id": 1577628133359615175, "prediction": "Manitoulin Island", "binary_answer": null, "no_answer_prob": -11.799437522888184}
{"example_id": 269809616300895850, "prediction": "`` I Will Remember You '' is the seventh and final single from Amy Grant 's twelfth album , Heart in Motion", "binary_answer": null, "no_answer_prob": 2.7096784114837646}
{"example_id": -1091459330352686202, "prediction": "Malta , and Cyprus", "binary_answer": null, "no_answer_prob": -1.4288113117218018}
{"example_id": -4767536963343233422, "prediction": "Instagram 's own account , with over 232 million followers . Selena Gomez", "binary_answer": null, "no_answer_prob": -7.133730530738831}
{"example_id": 5361571041646823205, "prediction": "Charles Maurice de Talleyrand - P\u00e9rigord", "binary_answer": null, "no_answer_prob": -7.102445363998413}
{"example_id": -7977769336206568086, "prediction": "Bryan Adams", "binary_answer": null, "no_answer_prob": -11.204246878623962}
{"example_id": 8011778858030016852, "prediction": "1980 2008", "binary_answer": null, "no_answer_prob": -10.263377100229263}
{"example_id": 8262980487954113263, "prediction": "Long - term endurance training induces many physiological adaptations both centrally and peripherally mediated", "binary_answer": null, "no_answer_prob": -1.808613896369934}
{"example_id": -3665183353461528995, "prediction": "Tahrs", "binary_answer": null, "no_answer_prob": 1.7763415575027466}
{"example_id": 5060079643051950049, "prediction": "October 1956", "binary_answer": null, "no_answer_prob": -7.789636969566345}
{"example_id": -6471777462830423450, "prediction": "capitalism", "binary_answer": null, "no_answer_prob": -2.4429049491882324}
{"example_id": -1220410632514721566, "prediction": "Ashley Johnson", "binary_answer": null, "no_answer_prob": -10.70067298412323}
{"example_id": 1168611404834933720, "prediction": "ace Cristiano Ronaldo", "binary_answer": null, "no_answer_prob": -5.537628412246704}
{"example_id": 1628743276422317151, "prediction": "Topper", "binary_answer": null, "no_answer_prob": -6.481967091560364}
{"example_id": -4918552545119960975, "prediction": "North America", "binary_answer": null, "no_answer_prob": -7.510300159454346}
{"example_id": -5006003472085768921, "prediction": "Hematidrosis", "binary_answer": null, "no_answer_prob": -11.399617552757263}
{"example_id": -2661036796058259102, "prediction": "", "binary_answer": null, "no_answer_prob": 5.527945876121521}
{"example_id": 2353915828397382515, "prediction": "April 17 , 2017", "binary_answer": null, "no_answer_prob": -9.499538898468018}
{"example_id": 7380215464653213300, "prediction": "Nackawic", "binary_answer": null, "no_answer_prob": -4.9351619482040405}
{"example_id": -6595869810643409975, "prediction": "Bill Medley and Jennifer Warnes", "binary_answer": null, "no_answer_prob": -11.117063403129578}
{"example_id": 3343528535701725949, "prediction": "England", "binary_answer": null, "no_answer_prob": -6.256663799285889}
{"example_id": 37656685629962290, "prediction": "1980s Italy", "binary_answer": null, "no_answer_prob": -7.103292524814606}
{"example_id": 2238955366807560237, "prediction": "Lisa Edelstein", "binary_answer": null, "no_answer_prob": -11.163975834846497}
{"example_id": -6678480146006657324, "prediction": "Wilt Chamberlain", "binary_answer": null, "no_answer_prob": -7.451950669288635}
{"example_id": 1389884237545055223, "prediction": "African Americans", "binary_answer": null, "no_answer_prob": -6.692203521728516}
{"example_id": 1875588575642040266, "prediction": "the 5th to the 15th century", "binary_answer": null, "no_answer_prob": -5.102661728858948}
{"example_id": 5725856502455649226, "prediction": "The Lake Pontchartrain Causeway", "binary_answer": null, "no_answer_prob": -6.884024143218994}
{"example_id": 7351468666722440929, "prediction": "Due to their working nature , Jack Russell terriers remain much as they were some 200 years ago", "binary_answer": null, "no_answer_prob": -1.4160261154174805}
{"example_id": -8797520685185536485, "prediction": "Country # Years Best result Cameroon 7 1982 , 1990 , 1994 , 1998 , 2002 , 2010 , 2014 QF Nigeria", "binary_answer": null, "no_answer_prob": -3.187901496887207}
{"example_id": 909197700839615718, "prediction": "1963", "binary_answer": null, "no_answer_prob": -4.385527729988098}
{"example_id": -771122776530708638, "prediction": "Interphase is the phase of the cell cycle", "binary_answer": null, "no_answer_prob": -0.4520183801651001}
{"example_id": -107774371678004331, "prediction": "September 7 , 2017", "binary_answer": null, "no_answer_prob": -11.227819561958313}
{"example_id": -5248515246438650449, "prediction": "sacrospinous ligament changes this notch into an opening , the greater sciatic foramen", "binary_answer": null, "no_answer_prob": -1.5290477275848389}
{"example_id": -5997422494186860203, "prediction": "Ferlin Husky", "binary_answer": null, "no_answer_prob": -7.530698180198669}
{"example_id": -2051808296560493693, "prediction": "5", "binary_answer": null, "no_answer_prob": -8.223999381065369}
{"example_id": 9110447063539355271, "prediction": "Ivey Pramis", "binary_answer": null, "no_answer_prob": -2.942839503288269}
{"example_id": 1558952782569915800, "prediction": "1920s and 1930s", "binary_answer": null, "no_answer_prob": -2.758533000946045}
{"example_id": 6692214664647586967, "prediction": "Mexico", "binary_answer": null, "no_answer_prob": -4.3159027099609375}
{"example_id": -2071539794636121643, "prediction": "1972 season", "binary_answer": null, "no_answer_prob": -6.245344877243042}
{"example_id": -2857599565985997857, "prediction": "$8.50", "binary_answer": null, "no_answer_prob": -9.438984394073486}
{"example_id": -3430677623102384649, "prediction": "September 7 , 1931", "binary_answer": null, "no_answer_prob": -6.696255683898926}
{"example_id": 3475861671182753717, "prediction": "The abbreviations MC and MCC are both used to mean `` motorcycle club", "binary_answer": null, "no_answer_prob": -7.792941093444824}
{"example_id": 8349683213960510021, "prediction": "Theresa Hak Kyung Cha Dianne Feinstein Stacey Bendet Melissa de la Cruz", "binary_answer": null, "no_answer_prob": -0.00985097885131836}
{"example_id": 4105429651494328420, "prediction": "the Beatles", "binary_answer": null, "no_answer_prob": -7.5552356243133545}
{"example_id": -6325456239012977754, "prediction": "the future of the Sudetenland in the face of demands made by Adolf Hitler", "binary_answer": null, "no_answer_prob": -2.9534331560134888}
{"example_id": -3788148688750665049, "prediction": "Chicago Cubs", "binary_answer": null, "no_answer_prob": -8.726096987724304}
{"example_id": -4450594275822195392, "prediction": "Katniss Everdeen", "binary_answer": null, "no_answer_prob": -11.05947470664978}
{"example_id": -5457985114179249183, "prediction": "1979", "binary_answer": null, "no_answer_prob": -8.02764880657196}
{"example_id": -7422446586370875611, "prediction": "February 16 , 1885", "binary_answer": null, "no_answer_prob": -2.0275150537490845}
{"example_id": -7285819617816843101, "prediction": "2014", "binary_answer": null, "no_answer_prob": -8.883707895874977}
{"example_id": 2651073537500132323, "prediction": "245 members", "binary_answer": null, "no_answer_prob": -4.4986478090286255}
{"example_id": 9071457522738240882, "prediction": "precipitation", "binary_answer": null, "no_answer_prob": -9.212638854980469}
{"example_id": -8596127252162838467, "prediction": "Republican Lincoln", "binary_answer": null, "no_answer_prob": -3.2834222316741943}
{"example_id": -9078306006144333844, "prediction": "1848", "binary_answer": null, "no_answer_prob": -4.7800798416137695}
{"example_id": -4725794279514668038, "prediction": "[ContextId=-1] [NoLongAnswer] [ContextId=0]", "binary_answer": null, "no_answer_prob": 2.519199386239052}
{"example_id": -9183044269893269650, "prediction": "26 November 1949", "binary_answer": null, "no_answer_prob": -12.041327476501465}
{"example_id": -792092847943063236, "prediction": "innate ability", "binary_answer": null, "no_answer_prob": -0.30496835708618164}
{"example_id": 1506231216119364910, "prediction": "an unknown city alderman", "binary_answer": null, "no_answer_prob": -7.055718898773193}
{"example_id": 4163255986166529598, "prediction": "Harry Shearer", "binary_answer": null, "no_answer_prob": -10.674579620361328}
{"example_id": 7912016739218368308, "prediction": "Brian May", "binary_answer": null, "no_answer_prob": -11.510008573532104}
{"example_id": 20892934614192667, "prediction": "Tom Brady", "binary_answer": null, "no_answer_prob": -11.728010773658752}
{"example_id": 538663978508954824, "prediction": "Isaac A. Van Amburgh", "binary_answer": null, "no_answer_prob": 0.47814393043518066}
{"example_id": -3565362273059789154, "prediction": "Drama", "binary_answer": null, "no_answer_prob": -1.205337017774582}
{"example_id": -3620760146801685762, "prediction": "the sixth episode of the second season", "binary_answer": null, "no_answer_prob": -5.412491083145142}
{"example_id": -2674399045265674923, "prediction": "Doug Davidson", "binary_answer": null, "no_answer_prob": -9.674018859863281}
{"example_id": 3204727903014000783, "prediction": "Dry January", "binary_answer": null, "no_answer_prob": 0.7714805603027344}
{"example_id": -6449208412503638211, "prediction": "William Ray Guy", "binary_answer": null, "no_answer_prob": -4.65323007106781}
{"example_id": 1382001459812186727, "prediction": "Jordan -- Hare Stadium in Auburn every odd - numbered year and at Bryant -- Denny Stadium in Tuscaloosa", "binary_answer": null, "no_answer_prob": -5.4925758838653564}
{"example_id": -6137096914640534688, "prediction": "Dead or Alive", "binary_answer": null, "no_answer_prob": -10.000979900360107}
{"example_id": 6317665932229681250, "prediction": "In mathematical analysis , the maxima and minima ( the respective plurals of maximum and minimum ) of a function", "binary_answer": null, "no_answer_prob": 1.3123546838760376}
{"example_id": 8512969563249507975, "prediction": "38 matches each ( playing each team in the league twice , home and away ) , totalling 380", "binary_answer": null, "no_answer_prob": -8.63896632194519}
{"example_id": -559698274556631429, "prediction": "Lou Ferrigno", "binary_answer": null, "no_answer_prob": -10.57030189037323}
{"example_id": -9048525402428403724, "prediction": "an estimated 1.2 million", "binary_answer": null, "no_answer_prob": -4.522862911224365}
{"example_id": -6624144040864748331, "prediction": "Key Indicates single release Indicates promotional single release Indicates song written solely by Bruno Mars", "binary_answer": null, "no_answer_prob": 3.4054765105247498}
{"example_id": 2625664724283116392, "prediction": "Geena Davis , Tom Hanks , Madonna , and Lori Petty", "binary_answer": null, "no_answer_prob": -0.2163541316986084}
{"example_id": 2964949885548380473, "prediction": "2001", "binary_answer": null, "no_answer_prob": -6.531445503234863}
{"example_id": 9014191735442103133, "prediction": "July 4 , 1776", "binary_answer": null, "no_answer_prob": -6.095571160316467}
{"example_id": -7517400459684508950, "prediction": "the legendary Polo Grounds", "binary_answer": null, "no_answer_prob": -3.5843738317489624}
{"example_id": -937747653413672681, "prediction": "160 \u00b0 F ( 71 \u00b0 C )", "binary_answer": null, "no_answer_prob": 1.9407835006713867}
{"example_id": -2732065960785690243, "prediction": "The Three Degrees", "binary_answer": null, "no_answer_prob": -10.26429271697998}
{"example_id": -9063281357515829584, "prediction": "$3,515 . For employers , their contribution was $4,247 in 1999 and $9,860 in 2009 .", "binary_answer": null, "no_answer_prob": -2.1602526903152466}
{"example_id": 6967912443257526318, "prediction": "The Hollies", "binary_answer": null, "no_answer_prob": -12.866644382476807}
{"example_id": 3678497416144446173, "prediction": "The Virginia Resolves were a series of resolutions", "binary_answer": null, "no_answer_prob": 3.5983203984797}
{"example_id": -9093223812193834272, "prediction": "Royal Mail , G4S , Tesco and Compass Group", "binary_answer": null, "no_answer_prob": -6.274266481399536}
{"example_id": 4610728675765610071, "prediction": "Takeru Kobayashi", "binary_answer": null, "no_answer_prob": -12.198432803153992}
{"example_id": -5467568048068622040, "prediction": "Eglin Air Force Base", "binary_answer": null, "no_answer_prob": -9.135667085647583}
{"example_id": -5953310106477616873, "prediction": "Up above the world so high , Like a diamond in the sky", "binary_answer": null, "no_answer_prob": 2.5031756162643433}
{"example_id": -2956046135428289049, "prediction": "Sally Anne Struthers", "binary_answer": null, "no_answer_prob": -13.475054025650024}
{"example_id": 6448746358073143708, "prediction": "experimental physicist", "binary_answer": null, "no_answer_prob": -6.04457950592041}
{"example_id": -1827954270671149490, "prediction": "Bob Marley", "binary_answer": null, "no_answer_prob": -12.084923148155212}
{"example_id": -5803617122969434646, "prediction": "Tobey Maguire , Carey Mulligan , Joel Edgerton and Elizabeth Debicki", "binary_answer": null, "no_answer_prob": -0.899023711681366}
{"example_id": 9194054088793021849, "prediction": "Dame Lynley Dodd", "binary_answer": null, "no_answer_prob": -7.439472794532776}
{"example_id": -4156870232683209747, "prediction": "", "binary_answer": null, "no_answer_prob": 3.074731469154358}
{"example_id": 5979914142234156056, "prediction": "June 15 , 1787", "binary_answer": null, "no_answer_prob": -8.953064799308777}
{"example_id": -8829984544160063797, "prediction": "Palos Verdes , Saint Vincent and the Grenadines , Dominica , and The Bahamas", "binary_answer": null, "no_answer_prob": -9.457995414733887}
{"example_id": -6962948485363607069, "prediction": "2017", "binary_answer": null, "no_answer_prob": -0.961489200592041}
{"example_id": -3007091317310168067, "prediction": "", "binary_answer": null, "no_answer_prob": -1.890657901763916}
{"example_id": 3083649704446766556, "prediction": "Jennifer Garner", "binary_answer": null, "no_answer_prob": -10.01811957359314}
{"example_id": -8791735237243353128, "prediction": "John Michael Montgomery", "binary_answer": null, "no_answer_prob": -12.369332432746887}
{"example_id": 5800851554581760435, "prediction": "Pittsburgh Penguins", "binary_answer": null, "no_answer_prob": -7.459520936012268}
{"example_id": 8194768658000869141, "prediction": "Mack Sennett", "binary_answer": null, "no_answer_prob": -7.1440699100494385}
{"example_id": 4939784197471510957, "prediction": "Look Who 's Talking Now", "binary_answer": null, "no_answer_prob": -8.221819400787354}
{"example_id": -4763157148860949286, "prediction": "The median is a commonly used measure of the properties of a data set in statistics and probability theory", "binary_answer": null, "no_answer_prob": -5.523587584495544}
{"example_id": 6747693054679197435, "prediction": "", "binary_answer": null, "no_answer_prob": -3.91857647895813}
{"example_id": -3979738176844987327, "prediction": "September 6 , 2018", "binary_answer": null, "no_answer_prob": -12.782790780067444}
{"example_id": -3735409163458838950, "prediction": "Second World War", "binary_answer": null, "no_answer_prob": -0.07049751281738281}
{"example_id": -4020206287264197422, "prediction": "PDCA ( plan -- do -- check -- act or plan -- do -- check -- adjust", "binary_answer": null, "no_answer_prob": -0.7984412908554077}
{"example_id": 4497104537102220856, "prediction": "electric oscillations", "binary_answer": null, "no_answer_prob": -4.742270112037659}
{"example_id": 2618498254651279342, "prediction": "`` They Ca n't Take That Away from Me ''", "binary_answer": null, "no_answer_prob": 0.8120230436325073}
{"example_id": -4419975866641434841, "prediction": "Stephen Collins Foster", "binary_answer": null, "no_answer_prob": -10.689613342285156}
{"example_id": 4367743777191326308, "prediction": "Drake", "binary_answer": null, "no_answer_prob": -9.561021566390991}
{"example_id": 2137982139849145838, "prediction": "February 22 , 2002", "binary_answer": null, "no_answer_prob": -11.715377449989319}
{"example_id": -4457993161912433368, "prediction": "Andean civilizations", "binary_answer": null, "no_answer_prob": -6.085400462150574}
{"example_id": 5373171442629886216, "prediction": "1920s New Orleans", "binary_answer": null, "no_answer_prob": -11.21215307712555}
{"example_id": 7289436044211985719, "prediction": "Central to this trend was a folk roots revival", "binary_answer": null, "no_answer_prob": 0.12164980173110962}
{"example_id": -7409274561056244488, "prediction": "Practical Magic was partially filmed on an artificial set in California", "binary_answer": null, "no_answer_prob": -7.217319488525391}
{"example_id": 6765454316883171080, "prediction": "The Precambrian", "binary_answer": null, "no_answer_prob": -4.968926429748535}
{"example_id": -7287494750640645348, "prediction": "Character Current copyright holder In Creator Date Location Reason Big Bird Sesame Workshop Jim Henson", "binary_answer": null, "no_answer_prob": 0.8570417687296867}
{"example_id": -1198227656095162134, "prediction": "fourth season , and the series finale aired on June 20 , 2011 .", "binary_answer": null, "no_answer_prob": -5.925725936889648}
{"example_id": -3333264597997019941, "prediction": "No . overall No. in season Title Directed by Written by Original release date 14 `` The Land Before Tim", "binary_answer": null, "no_answer_prob": 3.115536332130432}
{"example_id": -1955635597397014775, "prediction": "Fusarium spp .", "binary_answer": null, "no_answer_prob": 0.39493417739868164}
{"example_id": -6676990950364246265, "prediction": "southwestern Tennessee", "binary_answer": null, "no_answer_prob": -10.189952373504639}
{"example_id": 8827559639258790371, "prediction": "New York has two", "binary_answer": null, "no_answer_prob": -0.7444369792938232}
{"example_id": -562815493883482506, "prediction": "Directories containing all of the names are located on nearby podiums at both ends of the monument where visitors may locate specific names .", "binary_answer": null, "no_answer_prob": -1.7631592750549316}
{"example_id": 7306754908990268475, "prediction": "Lieutenant Governor of Texas", "binary_answer": null, "no_answer_prob": -7.260640025138855}
{"example_id": 7769420646768824672, "prediction": "Tom Coyne", "binary_answer": null, "no_answer_prob": -8.404091954231262}
{"example_id": 5599669105153346278, "prediction": "Ian David McShane", "binary_answer": null, "no_answer_prob": -7.8276636600494385}
{"example_id": -3845254214087419206, "prediction": "6 years", "binary_answer": null, "no_answer_prob": -8.043965339660645}
{"example_id": -4758780141297467760, "prediction": "Carpe diem", "binary_answer": null, "no_answer_prob": -11.935441136360168}
{"example_id": 5307697781908168437, "prediction": "Screamin ' Jay Hawkins", "binary_answer": null, "no_answer_prob": -5.710612416267395}
{"example_id": -8080892130396390822, "prediction": "Frederick the Great of Prussia", "binary_answer": null, "no_answer_prob": -8.44608736038208}
{"example_id": -8690253199564830255, "prediction": "Kareem Abdul - Jabbar", "binary_answer": null, "no_answer_prob": -8.549284100532532}
{"example_id": -8482066971437990217, "prediction": "Bowling For Soup", "binary_answer": null, "no_answer_prob": -9.876266241073608}
{"example_id": -1432339570613043528, "prediction": "Somerset County , Pennsylvania", "binary_answer": null, "no_answer_prob": -6.518078804016113}
{"example_id": 6411002291181104591, "prediction": "Filming commenced October 13 , 1938 , on the MGM lot in Culver City , California", "binary_answer": null, "no_answer_prob": -6.398234844207764}
{"example_id": 6524714690813938003, "prediction": "searching for drugs and explosives , locating missing people , finding crime scene evidence , and protecting their handlers", "binary_answer": null, "no_answer_prob": -2.814992070198059}
{"example_id": -7537969246928822061, "prediction": "`` All I Want for Christmas Is You ''", "binary_answer": null, "no_answer_prob": -1.19819974899292}
{"example_id": 2320733481589738782, "prediction": "", "binary_answer": null, "no_answer_prob": 1.9728039652109146}
{"example_id": -7235383409711791885, "prediction": "Chicago Bears", "binary_answer": null, "no_answer_prob": -2.6146183013916016}
{"example_id": 6791572512890775191, "prediction": "At 7 feet tall , Sligh", "binary_answer": null, "no_answer_prob": -5.907264351844788}
{"example_id": -4233803687136985192, "prediction": "United States Department of Health and Human Services", "binary_answer": null, "no_answer_prob": -2.581740140914917}
{"example_id": 3150540880765930232, "prediction": "Jerry Rice", "binary_answer": null, "no_answer_prob": -11.112614512443542}
{"example_id": 4964381607968119645, "prediction": "South Pacific", "binary_answer": null, "no_answer_prob": -3.080893874168396}
{"example_id": 1912075917808114714, "prediction": "9,677 ft ( 2,950 m ) to 8,363 ft ( 2,549 m )", "binary_answer": null, "no_answer_prob": -3.3095542192459106}
{"example_id": 5225600715739431112, "prediction": "Area is the quantity that expresses the extent of a two - dimensional figure or shape , or planar lamina , in the plane", "binary_answer": null, "no_answer_prob": -8.19085431098938}
{"example_id": 6361671696133085371, "prediction": "J. Neilson ( Jason Knight during portions of season 3 and 4 ) , David Baker , and Doug Marcaida", "binary_answer": null, "no_answer_prob": -4.406253814697266}
{"example_id": -819718931031629224, "prediction": "red , white , or blue", "binary_answer": null, "no_answer_prob": -4.592070460319519}
{"example_id": -3424807532081918274, "prediction": "Alfred Thomas `` Freddie '' Highmore", "binary_answer": null, "no_answer_prob": -9.411845564842224}
{"example_id": 8268823516550377360, "prediction": "trapezius muscle", "binary_answer": null, "no_answer_prob": -5.330597877502441}
{"example_id": 5282280587374153152, "prediction": "Brazil", "binary_answer": null, "no_answer_prob": -3.708507776260376}
{"example_id": -2849268148654144394, "prediction": "in the right upper quadrant of the abdomen", "binary_answer": null, "no_answer_prob": -9.834796667098999}
{"example_id": -358830793947444696, "prediction": "Sine wave", "binary_answer": null, "no_answer_prob": 1.6600396633148193}
{"example_id": 8585212527779664827, "prediction": "Margaret Brainard Hamilton", "binary_answer": null, "no_answer_prob": -11.870900869369507}
{"example_id": 5127090235680483100, "prediction": "The Atlantic standard time zone The Eastern standard time zone The Central standard time zone The Mountain standard time zone", "binary_answer": null, "no_answer_prob": -1.734656810760498}
{"example_id": -3546232832573849648, "prediction": "Shake Me Loose", "binary_answer": null, "no_answer_prob": -8.99948501586914}
{"example_id": -5468287088726908682, "prediction": "Amartya Kumar Sen", "binary_answer": null, "no_answer_prob": -6.6029558181762695}
{"example_id": 3390994259504658173, "prediction": "Anne - Marie and Ryan McMullan", "binary_answer": null, "no_answer_prob": -7.355270743370056}
{"example_id": -4389957759313357248, "prediction": "The Battle of Moscow", "binary_answer": null, "no_answer_prob": -0.03272807598114014}
{"example_id": 2158798004485737458, "prediction": "Allman Brothers Band", "binary_answer": null, "no_answer_prob": -10.122450709342957}
{"example_id": -6915859372604977461, "prediction": "", "binary_answer": null, "no_answer_prob": 0.2549188733100891}
{"example_id": -3480941453199535714, "prediction": "Norberto Barba", "binary_answer": null, "no_answer_prob": -0.3709430694580078}
{"example_id": 1905131062351526241, "prediction": "Regina Spektor", "binary_answer": null, "no_answer_prob": -11.841021656990051}
{"example_id": 7064770311616051114, "prediction": "Joyce DeWitt", "binary_answer": null, "no_answer_prob": -4.443147897720337}
{"example_id": 819207418696106881, "prediction": "1989", "binary_answer": null, "no_answer_prob": -10.647515416145325}
{"example_id": 680308234395099133, "prediction": "", "binary_answer": null, "no_answer_prob": 0.12921763211488724}
{"example_id": -8455580571333479494, "prediction": "Collusion", "binary_answer": null, "no_answer_prob": -0.8745667934417725}
{"example_id": -5160001114356640544, "prediction": "Marks ' brother - in - law , Robert L. May", "binary_answer": null, "no_answer_prob": -1.1564393043518066}
{"example_id": 3458961565721598188, "prediction": "June Hershey", "binary_answer": null, "no_answer_prob": -9.566551804542542}
{"example_id": -4606467240167473800, "prediction": "legislative Powers herein granted shall be vested in a Congress of the United States , which shall consist of a Senate and House of Representatives", "binary_answer": null, "no_answer_prob": -0.701094388961792}
{"example_id": -1265869086580830715, "prediction": "Beaumont , Texas", "binary_answer": null, "no_answer_prob": -7.3481364250183105}
{"example_id": 5710637710172357311, "prediction": "Canals", "binary_answer": null, "no_answer_prob": -1.1834843158721924}
{"example_id": 6036983470086931583, "prediction": "20,871 weeks including 97", "binary_answer": null, "no_answer_prob": -2.7027840614318848}
{"example_id": 1636473245041621096, "prediction": "Hartsfield -- Jackson Atlanta International Airport", "binary_answer": null, "no_answer_prob": -0.47652435302734375}
{"example_id": 7882377647132382238, "prediction": "Clemson Tigers", "binary_answer": null, "no_answer_prob": -7.19428563117981}
{"example_id": 7009888309090765284, "prediction": "Brad Gillis", "binary_answer": null, "no_answer_prob": -3.6918416023254395}
{"example_id": 1849503016710677308, "prediction": "Mikhail Lomonosov", "binary_answer": null, "no_answer_prob": -8.066902875900269}
{"example_id": 6442818086505478469, "prediction": "Christopher Allen Lloyd", "binary_answer": null, "no_answer_prob": -12.74228048324585}
{"example_id": 7305314114844262069, "prediction": "eleven", "binary_answer": null, "no_answer_prob": -5.2612714767456055}
{"example_id": -6047775712481463623, "prediction": "`` Because I could not stop for Death '' is a lyrical poem by Emily Dickinson", "binary_answer": null, "no_answer_prob": 3.6492156982421875}
{"example_id": 6213045939017576916, "prediction": "Richard Madden", "binary_answer": null, "no_answer_prob": -13.107998847961426}
{"example_id": -7607636557088944531, "prediction": "Francis Ouimet", "binary_answer": null, "no_answer_prob": -11.29207968711853}
{"example_id": 2697009989772615027, "prediction": "The Einstein field equations", "binary_answer": null, "no_answer_prob": -3.0579882860183716}
{"example_id": -3232415045124682326, "prediction": "Ahmet Ertegun", "binary_answer": null, "no_answer_prob": -9.203169465065002}
{"example_id": -5158070734256921008, "prediction": "23.1", "binary_answer": null, "no_answer_prob": -5.236285924911499}
{"example_id": -8314105431291602582, "prediction": "Skeeter Davis", "binary_answer": null, "no_answer_prob": -5.529036641120911}
{"example_id": -6113209455303237524, "prediction": "The economic impacts of NAFTA have been modest", "binary_answer": null, "no_answer_prob": -2.4513343572616577}
{"example_id": -5322872077213434192, "prediction": "Grace VanderWaal", "binary_answer": null, "no_answer_prob": -10.345922231674194}
{"example_id": 8026875231534990860, "prediction": "The autoignition temperature", "binary_answer": null, "no_answer_prob": -2.473456561565399}
{"example_id": -6267988425840311468, "prediction": "cutaneous receptors", "binary_answer": null, "no_answer_prob": -5.229246497154236}
{"example_id": 4864611424904346120, "prediction": "the planet Mars", "binary_answer": null, "no_answer_prob": 1.6747967600822449}
{"example_id": 696430550631107443, "prediction": "Burbank , California", "binary_answer": null, "no_answer_prob": 0.6161127090454102}
{"example_id": 8574416704593075001, "prediction": "Adams", "binary_answer": null, "no_answer_prob": -6.361627578735352}
{"example_id": 5281427415896915594, "prediction": "Benjamin Franklin", "binary_answer": null, "no_answer_prob": -7.676265001296997}
{"example_id": -4939620613803238600, "prediction": "Alicia Keys", "binary_answer": null, "no_answer_prob": -11.266140580177307}
{"example_id": -2575291953817347411, "prediction": "Finnerty Steeves", "binary_answer": null, "no_answer_prob": -7.573302268981934}
{"example_id": 5718957752827445186, "prediction": "the Reconstruction Era", "binary_answer": null, "no_answer_prob": -1.641709327697754}
{"example_id": -1935426877361531992, "prediction": "Roger Miller", "binary_answer": null, "no_answer_prob": -11.892120003700256}
{"example_id": -5247200363195004684, "prediction": "Rosemary Almond", "binary_answer": null, "no_answer_prob": -7.886786937713623}
{"example_id": -2399190467145139348, "prediction": "1963", "binary_answer": null, "no_answer_prob": -7.392428040504456}
{"example_id": 4477300473899301723, "prediction": "1538", "binary_answer": null, "no_answer_prob": -3.829354763031006}
{"example_id": -414366433822264308, "prediction": "30 days or 90 days", "binary_answer": null, "no_answer_prob": -7.092146277427673}
{"example_id": 6016187989170734632, "prediction": "State Route 85 ( SR 85 ) is a freeway which connects the cities of Mountain View and southern San Jose", "binary_answer": null, "no_answer_prob": 1.0789812803268433}
{"example_id": 9140144440547621820, "prediction": "", "binary_answer": null, "no_answer_prob": 3.770084500312805}
{"example_id": -6555500747537564434, "prediction": "Emma Caulfield", "binary_answer": null, "no_answer_prob": -0.7570962905883789}
{"example_id": 8886367537984761778, "prediction": "`` What 's It Gonna Be ? ! ''", "binary_answer": null, "no_answer_prob": -3.72695791721344}
{"example_id": 1595705604057241798, "prediction": "The Lettermen", "binary_answer": null, "no_answer_prob": -2.5494786500930786}
{"example_id": -5070110728554912727, "prediction": "Fran\u00e7ois Coignet", "binary_answer": null, "no_answer_prob": 2.0219218730926514}
{"example_id": -8142222187570927809, "prediction": "
SYMBOL INDEX (28 symbols across 4 files)

FILE: mkqa_eval.py
function parse_args (line 66) | def parse_args():
function read_annotations (line 105) | def read_annotations(gold_path: str) -> Dict[str, Any]:
function read_predictions (line 147) | def read_predictions(predictions_path: str) -> Dict[str, MKQAPrediction]:
function compute_mkqa_scores_for_language (line 188) | def compute_mkqa_scores_for_language(
function compute_best_threshold (line 220) | def compute_best_threshold(
function evaluate (line 278) | def evaluate(

FILE: mkqa_eval_all_languages.py
function parse_args (line 16) | def parse_args():
function read_prediction_dir (line 46) | def read_prediction_dir(predictions_dir: str) -> Dict[str, Dict[str, mkq...
function evaluate_predictions (line 71) | def evaluate_predictions(
function evaluate_all_languages (line 104) | def evaluate_all_languages(

FILE: mkqa_eval_util.py
function map_em_value (line 35) | def map_em_value(prediction, gold_answers, lang):
function map_f1_value (line 40) | def map_f1_value(prediction, gold_answers, lang):
function get_text_metrics (line 45) | def get_text_metrics(
function summarize_default_metrics (line 72) | def summarize_default_metrics(
function aggregate_summaries (line 96) | def aggregate_summaries(dicts):
function whitespace_tokenize (line 109) | def whitespace_tokenize(text):
function mixed_segmentation (line 113) | def mixed_segmentation(text):
function normalize_answer_by_language (line 130) | def normalize_answer_by_language(s, lang):
function plot_f1 (line 160) | def plot_f1(answerable_f1_by_id, unanswerable_em_by_id, na_probs_by_id, ...
function calculate_em (line 196) | def calculate_em(prediction, gold_answer, language):
function calculate_f1 (line 202) | def calculate_f1(prediction, gold_answer, language):
function compute_max_score_over_answers (line 219) | def compute_max_score_over_answers(metric_fn, prediction, ground_truths,...
function compute_best_score_and_threshold (line 227) | def compute_best_score_and_threshold(
function apply_no_answer_threshold (line 256) | def apply_no_answer_threshold(scores, no_answer_probs, qid_has_answer, n...
function plot_na_prob_histogram (line 264) | def plot_na_prob_histogram(no_answer_probs, qid_list, outdir, name):

FILE: tests/test_mkqa_eval.py
function test_compute_mkqa_scores (line 16) | def test_compute_mkqa_scores():
function test_compute_mkqa_scores_in_different_languages (line 150) | def test_compute_mkqa_scores_in_different_languages():
function test_compute_metrics_end_2_end (line 242) | def test_compute_metrics_end_2_end():
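The prediction files above (e.g. sample_predictions/en.jsonl) hold one JSON object per line with the fields `example_id`, `prediction`, `binary_answer`, and `no_answer_prob`. A minimal loader for that schema can be sketched as follows; the repo's own `read_predictions` in mkqa_eval.py is the reference implementation, so this standalone helper is an illustrative assumption, not that code.

```python
import json

def load_prediction_lines(path):
    """Sketch: parse MKQA-style prediction JSONL into a dict keyed by example_id.

    Each line looks like:
    {"example_id": 6213045939017576916, "prediction": "Richard Madden",
     "binary_answer": null, "no_answer_prob": -13.107998847961426}
    """
    predictions = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            if not line.strip():
                continue  # tolerate blank lines
            record = json.loads(line)
            # binary_answer is null for non-yes/no questions; no_answer_prob
            # is an unnormalized score thresholded during evaluation.
            predictions[record["example_id"]] = record
    return predictions
```

Keying by `example_id` mirrors how the evaluation scripts join predictions against gold annotations per language.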