Full Code of openvax/mhcflurry for AI

Repository: openvax/mhcflurry
Branch: master
Commit: f011145590f0
Files: 238
Total size: 2.4 MB

Directory structure:
gitextract_be3gthnp/
├── .dockerignore
├── .github/
│   └── workflows/
│       ├── build.yml
│       ├── ci.yml
│       ├── release.yml
│       └── release_testpypi.yml
├── .gitignore
├── AGENTS.md
├── CONTRIBUTING.md
├── Dockerfile
├── LICENSE
├── NOTES.md
├── README.md
├── TODO.md
├── code-of-conduct.md
├── compatibility_check/
│   └── figures/
│       └── summary.csv
├── develop.sh
├── docs/
│   ├── Makefile
│   ├── README.md
│   ├── api.rst
│   ├── commandline_tools.rst
│   ├── commandline_tutorial.rst
│   ├── conf.py
│   ├── doctest.sh
│   ├── example.fasta
│   ├── index.rst
│   ├── intro.rst
│   ├── python_tutorial.rst
│   └── requirements.txt
├── downloads-generation/
│   ├── README.md
│   ├── allele_sequences/
│   │   ├── GENERATE.sh
│   │   ├── class1_pseudosequences.csv
│   │   ├── filter_sequences.py
│   │   ├── make_allele_sequences.py
│   │   └── select_alleles_to_disambiguate.py
│   ├── analysis_predictor_info/
│   │   ├── GENERATE.sh
│   │   ├── cluster_submit_script_header.mssm_hpc.lsf
│   │   ├── generate_artifacts.py
│   │   ├── generate_model_selection_with_decoys.py
│   │   ├── predict_on_model_selection_data.py
│   │   └── requirements.txt
│   ├── data_curated/
│   │   ├── GENERATE.sh
│   │   ├── README.md
│   │   ├── curate.py
│   │   ├── curate_ms_by_pmid.py
│   │   └── requirements.txt
│   ├── data_evaluation/
│   │   ├── GENERATE.sh
│   │   ├── cluster_submit_script_header.mssm_hpc.lsf
│   │   ├── join_with_precomputed.py
│   │   ├── make_benchmark.py
│   │   └── split_by_sample.py
│   ├── data_iedb/
│   │   ├── GENERATE.sh
│   │   └── README.md
│   ├── data_mass_spec_annotated/
│   │   ├── GENERATE.sh
│   │   ├── annotate.py
│   │   └── requirements.txt
│   ├── data_predictions/
│   │   ├── GENERATE.WITH_HPC_CLUSTER.sh
│   │   ├── GENERATE.sh
│   │   ├── cluster_submit_script_header.mssm_hpc.gpu.lsf
│   │   ├── cluster_submit_script_header.mssm_hpc.nogpu.lsf
│   │   ├── requirements.txt
│   │   ├── run_predictors.py
│   │   ├── write_allele_list.py
│   │   └── write_proteome_peptides.py
│   ├── data_published/
│   │   ├── GENERATE.sh
│   │   └── README.md
│   ├── data_references/
│   │   ├── GENERATE.sh
│   │   ├── README.md
│   │   ├── process.py
│   │   └── requirements.txt
│   ├── data_systemhcatlas/
│   │   ├── GENERATE.sh
│   │   └── README.md
│   ├── models_class1/
│   │   ├── GENERATE.sh
│   │   ├── README.md
│   │   └── write_validation_data.py
│   ├── models_class1_kim_benchmark/
│   │   ├── GENERATE.sh
│   │   ├── README.md
│   │   ├── class1_pseudosequences.csv
│   │   ├── curate.py
│   │   ├── generate_hyperparameters.py
│   │   └── write_validation_data.py
│   ├── models_class1_minimal/
│   │   ├── GENERATE.sh
│   │   └── README.md
│   ├── models_class1_pan/
│   │   ├── GENERATE.WITH_HPC_CLUSTER.sh
│   │   ├── GENERATE.sh
│   │   ├── README.md
│   │   ├── additional_alleles.txt
│   │   ├── cluster_submit_script_header.mssm_hpc.lsf
│   │   ├── generate_hyperparameters.py
│   │   └── reassign_mass_spec_training_data.py
│   ├── models_class1_pan_variants/
│   │   ├── GENERATE.WITH_HPC_CLUSTER.sh
│   │   ├── GENERATE.sh
│   │   ├── cluster_submit_script_header.mssm_hpc.gpu.lsf
│   │   ├── exclude_data_from_training.py
│   │   └── generate_hyperparameters.py
│   ├── models_class1_presentation/
│   │   ├── GENERATE.sh
│   │   ├── cluster_submit_script_header.mssm_hpc.lsf
│   │   └── make_train_data.py
│   ├── models_class1_processing/
│   │   ├── GENERATE.WITH_HPC_CLUSTER.sh
│   │   ├── GENERATE.sh
│   │   ├── annotate_hits_with_expression.py
│   │   ├── cluster_submit_script_header.mssm_hpc.lsf
│   │   ├── generate_hyperparameters.base.py
│   │   ├── generate_hyperparameters.variants.py
│   │   └── make_train_data.py
│   ├── models_class1_selected_no_mass_spec/
│   │   └── GENERATE.sh
│   ├── models_class1_trained_with_mass_spec/
│   │   └── GENERATE.sh
│   ├── models_class1_unselected/
│   │   ├── GENERATE.sh
│   │   ├── README.md
│   │   ├── class1_pseudosequences.csv
│   │   └── generate_hyperparameters.py
│   ├── models_class1_unselected_with_mass_spec/
│   │   ├── GENERATE.sh
│   │   ├── README.md
│   │   ├── class1_pseudosequences.csv
│   │   └── generate_hyperparameters.py
│   └── random_peptide_predictions/
│       ├── GENERATE.sh
│       └── random_predictions.py
├── lint.sh
├── mhcflurry/
│   ├── __init__.py
│   ├── allele_encoding.py
│   ├── amino_acid.py
│   ├── calibrate_percentile_ranks_command.py
│   ├── class1_affinity_predictor.py
│   ├── class1_neural_network.py
│   ├── class1_presentation_predictor.py
│   ├── class1_processing_neural_network.py
│   ├── class1_processing_predictor.py
│   ├── cluster_parallelism.py
│   ├── cluster_worker_entry_point.py
│   ├── common.py
│   ├── custom_loss.py
│   ├── data_dependent_weights_initialization.py
│   ├── downloads.py
│   ├── downloads.yml
│   ├── downloads_command.py
│   ├── encodable_sequences.py
│   ├── ensemble_centrality.py
│   ├── fasta.py
│   ├── flanking_encoding.py
│   ├── hyperparameters.py
│   ├── local_parallelism.py
│   ├── percent_rank_transform.py
│   ├── predict_command.py
│   ├── predict_scan_command.py
│   ├── pytorch_layers.py
│   ├── pytorch_losses.py
│   ├── random_negative_peptides.py
│   ├── regression_target.py
│   ├── scoring.py
│   ├── select_allele_specific_models_command.py
│   ├── select_pan_allele_models_command.py
│   ├── select_processing_models_command.py
│   ├── testing_utils.py
│   ├── train_allele_specific_models_command.py
│   ├── train_pan_allele_models_command.py
│   ├── train_presentation_models_command.py
│   ├── train_processing_models_command.py
│   └── version.py
├── notebooks/
│   ├── example1.ipynb
│   └── mhcflurry-colab.ipynb
├── pylintrc
├── readthedocs.yml
├── requirements.txt
├── scripts/
│   ├── compare_tf_pytorch_random_outputs.py
│   ├── cross_allele_parity_analysis.py
│   ├── extract_high_presentation_fixture.py
│   ├── generate_fixture_error_report.py
│   ├── modal_train_mhcflurry.py
│   ├── plot_fixture_diffs.py
│   └── validate_allele_sequences.py
├── selected-peptides.csv
├── setup.py
├── setup_local_env.sh
├── test/
│   ├── __init__.py
│   ├── conftest.py
│   ├── data/
│   │   ├── data_10mer.csv
│   │   ├── data_8mer.csv
│   │   ├── data_9mer.csv
│   │   ├── example.fasta
│   │   ├── hpv_predictions.csv
│   │   ├── master_affinity_fixture_config.json
│   │   ├── master_affinity_fixture_predictions.json
│   │   ├── master_affinity_fixture_weights.npz
│   │   ├── master_densenet_fixture_config.json
│   │   ├── master_densenet_fixture_predictions.json
│   │   ├── master_densenet_fixture_weights.npz
│   │   ├── master_multi_output_fixture_config.json
│   │   ├── master_multi_output_fixture_predictions.json
│   │   ├── master_multi_output_fixture_weights.npz
│   │   ├── master_pan_concat_fixture_config.json
│   │   ├── master_pan_concat_fixture_predictions.json
│   │   ├── master_pan_concat_fixture_weights.npz
│   │   ├── master_pan_multiply_fixture_config.json
│   │   ├── master_pan_multiply_fixture_predictions.json
│   │   ├── master_pan_multiply_fixture_weights.npz
│   │   ├── master_released_class1_affinity_predictions.json
│   │   ├── master_released_class1_presentation_highscore_rows_metadata.json
│   │   ├── multiallelic.benchmark.small.csv.bz2
│   │   └── multiallelic_ms.benchmark1.csv.bz2
│   ├── expensive_verify_pretrain_optimizable.py
│   ├── pytest_helpers.py
│   ├── test_allele_encoding.py
│   ├── test_amino_acid.py
│   ├── test_api_compat_shims.py
│   ├── test_calibrate_percentile_ranks_command.py
│   ├── test_changing_allele_representations.py
│   ├── test_class1_affinity_predictor.py
│   ├── test_class1_neural_network.py
│   ├── test_class1_pan.py
│   ├── test_class1_presentation_predictor.py
│   ├── test_class1_processing_neural_network.py
│   ├── test_class1_processing_predictor.py
│   ├── test_custom_loss.py
│   ├── test_doctest.py
│   ├── test_download_models_class1.py
│   ├── test_ensemble_centrality.py
│   ├── test_hyperparameters.py
│   ├── test_local_parallelism.py
│   ├── test_master_compat_predictions.py
│   ├── test_multi_output.py
│   ├── test_network_merging.py
│   ├── test_percent_rank_transform.py
│   ├── test_predict_command.py
│   ├── test_predict_scan_command.py
│   ├── test_pytorch_coverage.py
│   ├── test_pytorch_regressions.py
│   ├── test_random_negative_peptides.py
│   ├── test_regression_target.py
│   ├── test_released_master_predictions.py
│   ├── test_released_predictors_on_hpv_dataset.py
│   ├── test_released_predictors_well_correlated.py
│   ├── test_released_presentation_highscore_rows.py
│   ├── test_selected_peptides_csv.py
│   ├── test_speed.py
│   ├── test_train_and_related_commands.py
│   ├── test_train_pan_allele_models_command.py
│   ├── test_train_processing_models_command.py
│   └── test_training_variants.py
└── test-environment.yml

================================================
FILE CONTENTS
================================================

================================================
FILE: .dockerignore
================================================
.git
.gitignore
LICENSE
*.zip
*.swp
experiments
mhc_ligand_full*
training/class1_allele_specific/data/mhc_ligand_full.*


================================================
FILE: .github/workflows/build.yml
================================================
name: build

on:
  workflow_dispatch: {}
  workflow_call: {}


jobs:
  build:
    runs-on: ubuntu-latest

    steps:
      - name: Checkout
        uses: actions/checkout@v3
      - name: Set up Python
        uses: actions/setup-python@v3
        with:
          python-version: "3.11"
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install build
      - name: Build package
        run: |
          python -m build -sw
          ls -lh dist/
      - name: Upload artifact
        uses: actions/upload-artifact@v4
        with:
          name: dist
          path: dist/


================================================
FILE: .github/workflows/ci.yml
================================================
name: CI

on:
  push:
    branches: ["master"]
  pull_request:
    branches: ["master"]

jobs:
  build:
    runs-on: ubuntu-latest
    defaults:
      run:
        shell: bash -el {0}
    strategy:
      fail-fast: false
      matrix:
        python-version: ["3.10", "3.11", "3.12"]

    steps:
      - name: Checkout code
        uses: actions/checkout@v3

      - name: Set up Python ${{ matrix.python-version }}
        uses: actions/setup-python@v3
        with:
          python-version: ${{ matrix.python-version }}

      - name: Install system dependencies
        run: |
          sudo apt-get update
          sudo apt-get install -y pandoc
      - name: Build Conda environment
        uses: conda-incubator/setup-miniconda@v2
        with:
          activate-environment: test-environment
          environment-file: test-environment.yml
          python-version: ${{ matrix.python-version }}
          auto-activate-base: false

      - name: Install python dependencies
        run: |
          pip install --upgrade pip
          pip install flake8 nose-py3 pytest pytest-cov coveralls
          pip install -r requirements.txt
          pip install -r docs/requirements.txt
          pip install .
      #- name: Lint with flake8
      #  run: |
      #    # stop the build if there are Python syntax errors or undefined names
      #    flake8 . --count --select=E9,F63,F7,F82 --show-source --statistics
      #    # exit-zero treats all errors as warnings. The GitHub editor is 127 chars wide
      #    # flake8 . --count --exit-zero --max-complexity=10 --max-line-length=127 --statistics

      - name: Download data and models
        run: |
          mhcflurry-downloads fetch data_curated data_mass_spec_annotated models_class1 models_class1_presentation models_class1_processing models_class1_pan allele_sequences

      - name: Test with pytest
        run: |
          pytest --tb=long --cov=mhcflurry/ --cov-report=term-missing ./test

      # - name: Publish coverage to Coveralls
      #  uses: coverallsapp/github-action@v2.2.3
      #  with:
      #    parallel: true


================================================
FILE: .github/workflows/release.yml
================================================
# Based on https://docs.pypi.org/trusted-publishers/using-a-publisher/

name: release

on:
  release:
    types: [published]

jobs:
  build:
    uses: ./.github/workflows/build.yml
  pypi-publish:
    name: upload release to PyPI
    needs: build
    runs-on: ubuntu-latest
    environment: release
    permissions:
      id-token: write  # IMPORTANT: mandatory for trusted publishing
    steps:
      - name: Download build artifacts
        uses: actions/download-artifact@v4
        with:
          name: dist
          path: dist
      - name: Publish package distributions to PyPI
        uses: pypa/gh-action-pypi-publish@release/v1



================================================
FILE: .github/workflows/release_testpypi.yml
================================================
# Based on https://docs.pypi.org/trusted-publishers/using-a-publisher/

name: release_testpypi

on:
  workflow_dispatch: {}

jobs:
  build:
    uses: ./.github/workflows/build.yml
  publish-to-testpypi:
    name: upload release to TestPyPI
    needs: build
    runs-on: ubuntu-latest
    environment: release_testpypi
    permissions:
      id-token: write  # IMPORTANT: mandatory for trusted publishing
    steps:
      - name: Download build artifacts
        uses: actions/download-artifact@v4
        with:
          name: dist
          path: dist
      - name: Publish distribution to TestPyPI
        uses: pypa/gh-action-pypi-publish@release/v1
        with:
          repository-url: https://test.pypi.org/legacy/


================================================
FILE: .gitignore
================================================
# Swap files
*.swp

# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]

# C extensions
*.so

# Distribution / packaging
.Python
env/
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
*.egg-info/
.installed.cfg
*.egg

# IDE
.idea/

# PyInstaller
#  Usually these files are written by a python script from a template
#  before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec

# Installer logs
pip-log.txt
pip-delete-this-directory.txt

# Unit test / coverage reports
htmlcov/
.tox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*,cover

# Translations
*.mo
*.pot

# Django stuff:
*.log

# Sphinx documentation
docs/_build/
docs/_static
docs/_templates

# PyBuilder
target/

# Data files
*.zip

# Dask distributed clutter
.dask-web-ui.json

# ipython checkpoints
.ipynb_checkpoints

# OS X extra files
.DS_Store


================================================
FILE: AGENTS.md
================================================
# AGENTS.md — mhcflurry

Guide for coding agents working in this repo. Read this before touching code.

---

## Golden Rules

1. **Never commit to `master`.** Always `git checkout -b <feature-branch>` before editing. Land via PR.
2. **Every PR bumps the version.** Even doc-only PRs — at minimum a patch bump in the package's `__init__.py` / `_version.py`.
3. **"Done" means merged AND released** — never stop at merge. mhcflurry doesn't (yet) have `deploy.sh`; follow the release recipe in `CONTRIBUTING.md` / `NOTES.md` and push the tag so PyPI gets the new version. Skipping release = task not done.
4. **File problems as issues, don't silently work around them.** If you hit a bug here or in a sibling openvax/pirl-unc repo, open a GitHub issue on the correct repo and link it from the PR.
5. **After a PR ships, look for the next block of work.** Read open issues across the relevant openvax repos, group by dependency + urgency. Prefer *foundational* changes that unblock multiple downstream improvements; otherwise chain the smallest independent improvements.

---

## Repo Shape (read before scripting)

Unlike its siblings, mhcflurry does **not** have `test.sh`, `deploy.sh`, or `format.sh`. It has:

- `develop.sh` — **source** this (`source develop.sh`) to create/activate `.venv` and editable-install. Do not `./develop.sh` (its venv activation won't persist).
- `lint.sh` — `ruff check mhcflurry/ test/` (note: tests live in `test/`, singular).
- `setup.py` — packaging.
- No `pyproject.toml`.

If you want to add `test.sh` / `deploy.sh` / `format.sh` to match the other openvax repos, that's a welcome foundational PR — discuss with Alex first.

## Before Completing Any Task

Before telling the user a change is "complete":

1. **`./lint.sh`** — must pass (ruff check)
2. **Run tests**: `pytest test/` (no `test.sh` wrapper). The slow ML suite may need downloaded models; see the `Dockerfile` or `test-environment.yml`.
3. For a PR: **CI must be green on GitHub**, then merge, then release (see Golden Rule 3).

## Code Style

- Python 3.10+
- Lint: ruff (concise output)
- Docstrings: numpy style
- Bugfixes include a regression test where feasible
- mhcflurry is a trained ML model system — be extremely cautious about changes that could alter predictions without a clear reason. Prediction-affecting changes need empirical validation, not just green tests.

---

## Workflow Orchestration

### 1. Upfront Planning
- For any non-trivial task (3+ steps or architectural): write a short spec first. If something goes sideways, STOP and re-plan — don't keep pushing.

### 2. Verification Before Done
- Never claim complete without proof: tests green, CI green, release tagged.
- For model or training changes: include before/after metrics on a held-out set.
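A before/after comparison can be as simple as measuring prediction drift in log space. The sketch below is a hypothetical illustration: the `old_preds`/`new_preds` values are made up, and in practice they would come from running the released and modified predictors on the same held-out peptides.

```python
# Hypothetical sketch: quantify prediction drift before declaring an ML change done.
import math

old_preds = [120.0, 3500.0, 45.0, 900.0]   # nM affinities from the released model
new_preds = [118.0, 3620.0, 44.0, 905.0]   # nM affinities after the change

# Compare in log10 space, since affinities span orders of magnitude.
diffs = [abs(math.log10(a) - math.log10(b)) for a, b in zip(old_preds, new_preds)]
max_drift = max(diffs)
print(f"max log10 drift: {max_drift:.4f}")
```

A small drift on held-out data does not by itself justify a change, but a large one is a clear signal to stop and investigate.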

### 3. Autonomous Bug Fixing
- Given a bug report: just fix it. Point at logs/errors/failing tests and resolve them without hand-holding.

### 4. Demand Elegance (Balanced)
- For non-trivial changes pause and ask "is there a more elegant way?" — skip for trivial fixes.
- Treat workarounds as bugs, not new abstractions. Rip out legacy paths decisively rather than accumulating special cases.

### 5. Issue Triage After Each Ship
- Close superseded/outdated issues as you notice them.
- New problems mid-task → file as issues (on the right repo, even if it's not this one), don't bury.

---

## Core Principles

- **Simplicity first.** Minimal diffs, minimal abstractions.
- **No laziness.** Find root causes; no temporary fixes, no empty-category fudges.
- **Minimal blast radius.** Touch only what the task requires.

## Scientific Domain Knowledge

- If a change touches immunology/genomics semantics, check primary sources (papers, UniProt, GenBank) before edits.
- If the code expresses a scientific model at odds with your understanding, flag it — don't silently "fix" it into something wrong.
- Use `mhcgnomes` for MHC allele parsing. Never `startswith("HLA-")` or other string hacks — alleles aren't always human.
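The allele-parsing rule above can be seen with a tiny pure-string illustration. The allele names below are real nomenclature examples chosen for variety; real code should go through `mhcgnomes.parse`, which this dependency-free sketch deliberately does not call.

```python
# Why startswith("HLA-") is a fragile allele filter: alleles aren't always human,
# and human alleles aren't always written with the species prefix.
alleles = [
    "HLA-A*02:01",     # human, prefixed
    "A*02:01",         # human, no species prefix
    "H2-Kb",           # mouse
    "Mamu-A1*001:01",  # rhesus macaque
]

naive_filter = [a for a in alleles if a.startswith("HLA-")]
print(naive_filter)  # keeps only 1 of 4 valid alleles
```

The string hack silently drops three valid alleles; a real parser normalizes all four.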


================================================
FILE: CONTRIBUTING.md
================================================
# Contributing to MHCflurry

We would love your help in making MHCflurry a useful resource for the community. No contribution is too small, and we especially appreciate usability improvements like better documentation, tutorials, tests, or code cleanup.

## Project scope
We hope MHCflurry will grow to include **reference implementations for state-of-the-art approaches for T cell epitope prediction**. This includes pan-allele MHC I and II prediction and closely related tasks such as prediction of antigen processing and immunogenicity. It does not include tasks such as B cell (antibody) epitope prediction, prediction of TCR/pMHC interactions, or downstream tasks such as cancer vaccine design. All code committed to MHCflurry should be suitable for regular research use by practitioners. This likely means that new models will require a benchmark evaluation with a publication or preprint before they can be accepted.

If you are contemplating a large contribution, such as the addition of a new predictive model, it probably makes sense to reach out on the GitHub issue tracker (or email us at hello@openvax.org) to discuss and coordinate the work.

## Making a contribution
All contributions can be made as pull requests on GitHub. One of the core developers will review your contribution. As needed, the core developers will also make releases and submit to PyPI.

A few other guidelines:

 * Any generated resource, such as trained models, must be associated with a `GENERATE.sh` script in [downloads-generation](https://github.com/openvax/mhcflurry/tree/master/downloads-generation). Running this script with no arguments should fully reproduce the generated result. Reproducibility of MHCflurry trained models and related data (such as curated training data, allele sequences, etc.) is key to allowing others to build upon and improve our work.
 * MHCflurry supports Python 3.10+ on Linux and OS X. We can't guarantee support for Windows. If you are having trouble running MHCflurry on Windows we would appreciate contributions that help us address this.
 * All functions should be documented using [numpy-style docstrings](https://numpydoc.readthedocs.io/en/latest/format.html) and associated with unit tests.
 * Bugfixes should be accompanied by a test that illustrates the bug when feasible.
 * Contributions are licensed under Apache 2.0
 * Please adhere to our [code of conduct](https://github.com/openvax/mhcflurry/blob/master/code-of-conduct.md).

Working on your first Pull Request? One resource that may be helpful is [How to Contribute to an Open Source Project on GitHub](https://egghead.io/series/how-to-contribute-to-an-open-source-project-on-github).


================================================
FILE: Dockerfile
================================================
FROM continuumio/miniconda3:latest

LABEL maintainer="Tim O'Donnell timodonnell@gmail.com"

WORKDIR /root

# Install system dependencies
RUN apt-get update -y && apt-get install -y gcc && \
    apt-get clean && rm -rf /var/lib/apt/lists/*

# Create a lightweight conda env with Python 3.10
RUN conda create -n mhcflurry python=3.10 -y && \
    conda clean -afy

# Activate the env by modifying PATH
ENV PATH /opt/conda/envs/mhcflurry/bin:$PATH

# Install pip packages in the env
RUN pip install --no-cache-dir --upgrade pip && \
    pip install --no-cache-dir jupyter seaborn

# Install dependencies (doing this first to have them cached)
COPY requirements.txt /tmp/mhcflurry-requirements.txt
RUN pip install --no-cache-dir -r /tmp/mhcflurry-requirements.txt

# Pre-download resources for mhcflurry
RUN mkdir /tmp/mhcflurry-downloads
COPY mhcflurry/downloads.yml /tmp/mhcflurry-downloads
RUN python -c '\
import yaml, subprocess; \
d = yaml.safe_load(open("/tmp/mhcflurry-downloads/downloads.yml")); \
downloads = d["releases"][d["current-release"]]["downloads"]; \
urls = [item["url"] for item in downloads if item["default"]]; \
[subprocess.run(["wget", "-P", "/tmp/mhcflurry-downloads", url]) for url in urls]'

# Copy example notebooks
COPY notebooks/* ./

# Copy source code and install mhcflurry in editable mode
COPY . mhcflurry
RUN pip install -e mhcflurry/

# Fetch resources from pre-downloaded data
RUN mhcflurry-downloads fetch --already-downloaded-dir /tmp/mhcflurry-downloads

EXPOSE 9999
CMD ["jupyter", "notebook", "--port=9999", "--no-browser", "--ip=0.0.0.0", "--allow-root", "--NotebookApp.token=''", "--NotebookApp.password=''"]
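The escaped `python -c` one-liner in the pre-download step above is hard to read. The sketch below shows the same selection logic over an inline dict standing in for `mhcflurry/downloads.yml`; the release name and URLs here are made up for illustration, and the real file is YAML with this key structure.

```python
# Same logic as the Dockerfile's pre-download step: pick the URLs of all
# downloads flagged as default in the current release.
downloads_yml = {  # stand-in for yaml.safe_load(open(".../downloads.yml"))
    "current-release": "2.1.0",
    "releases": {
        "2.1.0": {
            "downloads": [
                {"name": "models_class1", "url": "https://example.org/a.tar.bz2", "default": True},
                {"name": "data_iedb", "url": "https://example.org/b.tar.bz2", "default": False},
            ]
        }
    },
}

release = downloads_yml["current-release"]
downloads = downloads_yml["releases"][release]["downloads"]
default_urls = [item["url"] for item in downloads if item["default"]]
print(default_urls)  # only default downloads are pre-fetched into the image
```

The Dockerfile then passes each selected URL to `wget` so the image ships with the default models already cached.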



================================================
FILE: LICENSE
================================================
                                 Apache License
                           Version 2.0, January 2004
                        http://www.apache.org/licenses/

   TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION

   1. Definitions.

      "License" shall mean the terms and conditions for use, reproduction,
      and distribution as defined by Sections 1 through 9 of this document.

      "Licensor" shall mean the copyright owner or entity authorized by
      the copyright owner that is granting the License.

      "Legal Entity" shall mean the union of the acting entity and all
      other entities that control, are controlled by, or are under common
      control with that entity. For the purposes of this definition,
      "control" means (i) the power, direct or indirect, to cause the
      direction or management of such entity, whether by contract or
      otherwise, or (ii) ownership of fifty percent (50%) or more of the
      outstanding shares, or (iii) beneficial ownership of such entity.

      "You" (or "Your") shall mean an individual or Legal Entity
      exercising permissions granted by this License.

      "Source" form shall mean the preferred form for making modifications,
      including but not limited to software source code, documentation
      source, and configuration files.

      "Object" form shall mean any form resulting from mechanical
      transformation or translation of a Source form, including but
      not limited to compiled object code, generated documentation,
      and conversions to other media types.

      "Work" shall mean the work of authorship, whether in Source or
      Object form, made available under the License, as indicated by a
      copyright notice that is included in or attached to the work
      (an example is provided in the Appendix below).

      "Derivative Works" shall mean any work, whether in Source or Object
      form, that is based on (or derived from) the Work and for which the
      editorial revisions, annotations, elaborations, or other modifications
      represent, as a whole, an original work of authorship. For the purposes
      of this License, Derivative Works shall not include works that remain
      separable from, or merely link (or bind by name) to the interfaces of,
      the Work and Derivative Works thereof.

      "Contribution" shall mean any work of authorship, including
      the original version of the Work and any modifications or additions
      to that Work or Derivative Works thereof, that is intentionally
      submitted to Licensor for inclusion in the Work by the copyright owner
      or by an individual or Legal Entity authorized to submit on behalf of
      the copyright owner. For the purposes of this definition, "submitted"
      means any form of electronic, verbal, or written communication sent
      to the Licensor or its representatives, including but not limited to
      communication on electronic mailing lists, source code control systems,
      and issue tracking systems that are managed by, or on behalf of, the
      Licensor for the purpose of discussing and improving the Work, but
      excluding communication that is conspicuously marked or otherwise
      designated in writing by the copyright owner as "Not a Contribution."

      "Contributor" shall mean Licensor and any individual or Legal Entity
      on behalf of whom a Contribution has been received by Licensor and
      subsequently incorporated within the Work.

   2. Grant of Copyright License. Subject to the terms and conditions of
      this License, each Contributor hereby grants to You a perpetual,
      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
      copyright license to reproduce, prepare Derivative Works of,
      publicly display, publicly perform, sublicense, and distribute the
      Work and such Derivative Works in Source or Object form.

   3. Grant of Patent License. Subject to the terms and conditions of
      this License, each Contributor hereby grants to You a perpetual,
      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
      (except as stated in this section) patent license to make, have made,
      use, offer to sell, sell, import, and otherwise transfer the Work,
      where such license applies only to those patent claims licensable
      by such Contributor that are necessarily infringed by their
      Contribution(s) alone or by combination of their Contribution(s)
      with the Work to which such Contribution(s) was submitted. If You
      institute patent litigation against any entity (including a
      cross-claim or counterclaim in a lawsuit) alleging that the Work
      or a Contribution incorporated within the Work constitutes direct
      or contributory patent infringement, then any patent licenses
      granted to You under this License for that Work shall terminate
      as of the date such litigation is filed.

   4. Redistribution. You may reproduce and distribute copies of the
      Work or Derivative Works thereof in any medium, with or without
      modifications, and in Source or Object form, provided that You
      meet the following conditions:

      (a) You must give any other recipients of the Work or
          Derivative Works a copy of this License; and

      (b) You must cause any modified files to carry prominent notices
          stating that You changed the files; and

      (c) You must retain, in the Source form of any Derivative Works
          that You distribute, all copyright, patent, trademark, and
          attribution notices from the Source form of the Work,
          excluding those notices that do not pertain to any part of
          the Derivative Works; and

      (d) If the Work includes a "NOTICE" text file as part of its
          distribution, then any Derivative Works that You distribute must
          include a readable copy of the attribution notices contained
          within such NOTICE file, excluding those notices that do not
          pertain to any part of the Derivative Works, in at least one
          of the following places: within a NOTICE text file distributed
          as part of the Derivative Works; within the Source form or
          documentation, if provided along with the Derivative Works; or,
          within a display generated by the Derivative Works, if and
          wherever such third-party notices normally appear. The contents
          of the NOTICE file are for informational purposes only and
          do not modify the License. You may add Your own attribution
          notices within Derivative Works that You distribute, alongside
          or as an addendum to the NOTICE text from the Work, provided
          that such additional attribution notices cannot be construed
          as modifying the License.

      You may add Your own copyright statement to Your modifications and
      may provide additional or different license terms and conditions
      for use, reproduction, or distribution of Your modifications, or
      for any such Derivative Works as a whole, provided Your use,
      reproduction, and distribution of the Work otherwise complies with
      the conditions stated in this License.

   5. Submission of Contributions. Unless You explicitly state otherwise,
      any Contribution intentionally submitted for inclusion in the Work
      by You to the Licensor shall be under the terms and conditions of
      this License, without any additional terms or conditions.
      Notwithstanding the above, nothing herein shall supersede or modify
      the terms of any separate license agreement you may have executed
      with Licensor regarding such Contributions.

   6. Trademarks. This License does not grant permission to use the trade
      names, trademarks, service marks, or product names of the Licensor,
      except as required for reasonable and customary use in describing the
      origin of the Work and reproducing the content of the NOTICE file.

   7. Disclaimer of Warranty. Unless required by applicable law or
      agreed to in writing, Licensor provides the Work (and each
      Contributor provides its Contributions) on an "AS IS" BASIS,
      WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
      implied, including, without limitation, any warranties or conditions
      of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
      PARTICULAR PURPOSE. You are solely responsible for determining the
      appropriateness of using or redistributing the Work and assume any
      risks associated with Your exercise of permissions under this License.

   8. Limitation of Liability. In no event and under no legal theory,
      whether in tort (including negligence), contract, or otherwise,
      unless required by applicable law (such as deliberate and grossly
      negligent acts) or agreed to in writing, shall any Contributor be
      liable to You for damages, including any direct, indirect, special,
      incidental, or consequential damages of any character arising as a
      result of this License or out of the use or inability to use the
      Work (including but not limited to damages for loss of goodwill,
      work stoppage, computer failure or malfunction, or any and all
      other commercial damages or losses), even if such Contributor
      has been advised of the possibility of such damages.

   9. Accepting Warranty or Additional Liability. While redistributing
      the Work or Derivative Works thereof, You may choose to offer,
      and charge a fee for, acceptance of support, warranty, indemnity,
      or other liability obligations and/or rights consistent with this
      License. However, in accepting such obligations, You may act only
      on Your own behalf and on Your sole responsibility, not on behalf
      of any other Contributor, and only if You agree to indemnify,
      defend, and hold each Contributor harmless for any liability
      incurred by, or claims asserted against, such Contributor by reason
      of your accepting any such warranty or additional liability.

   END OF TERMS AND CONDITIONS

   APPENDIX: How to apply the Apache License to your work.

      To apply the Apache License to your work, attach the following
      boilerplate notice, with the fields enclosed by brackets "{}"
      replaced with your own identifying information. (Don't include
      the brackets!)  The text should be enclosed in the appropriate
      comment syntax for the file format. We also recommend that a
      file or class name and description of purpose be included on the
      same "printed page" as the copyright notice for easier
      identification within third-party archives.

   Copyright {yyyy} {name of copyright owner}

   Licensed under the Apache License, Version 2.0 (the "License");
   you may not use this file except in compliance with the License.
   You may obtain a copy of the License at

       http://www.apache.org/licenses/LICENSE-2.0

   Unless required by applicable law or agreed to in writing, software
   distributed under the License is distributed on an "AS IS" BASIS,
   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
   See the License for the specific language governing permissions and
   limitations under the License.



================================================
FILE: NOTES.md
================================================
# Notes

## 2026-02-10

- Goal: match PyTorch branch behavior to TensorFlow master for class I presentation prediction.
- Confirmed mismatch is isolated to `processing_predictor_with_flanks` path:
  - Affinity outputs match TF nearly exactly.
  - Processing without flanks matches TF nearly exactly.
  - Processing with flanks differs materially.
- Intermediate feature comparison (single processing model) shows:
  - `n_flank_cleaved`, `n_flank_internal_cleaved`, `c_flank_cleaved`, `c_flank_internal_cleaved` match TF.
  - Only `n_flank_avg_dense` and `c_flank_avg_dense` inputs differ.
- Root cause identified:
  - TF computes masked flank averages with `reduce_mean(..., axis=1)` over full sequence length.
  - Current PyTorch computes average over flank positions only.
  - This changes the two flank-average scalar features and can change top peptide ranking in presentation mode.
- Fix implemented:
  - Updated `Class1ProcessingModel` N/C flank-average pooling math to mirror TF exactly:
    - `mean((x + 1) * mask, axis=sequence_axis) - 1`
    - denominator is full sequence length.
- Validation after fix:
  - Single-model intermediate features now match TF to float noise.
  - With-flanks processing predictions now match TF to float noise.
  - End-to-end presentation predictions for test sequences now match TF best-peptide selection.
  - `test/test_class1_presentation_predictor.py::test_downloaded_predictor` passes.
  - Parity test subset passes:
    - `test/test_master_compat_predictions.py`
    - `test/test_released_master_predictions.py`
    - `test/test_pytorch_regressions.py`
- Regression coverage:
  - Added `test_processing_flank_averages_use_tf_masked_mean_semantics` in
    `test/test_pytorch_regressions.py`.
- Tooling add-on:
  - Added `scripts/modal_train_mhcflurry.py` for running parallel training jobs on Modal.
- Random TF-vs-PyTorch comparison harness improvements:
  - Added curated default allele panel in `scripts/compare_tf_pytorch_random_outputs.py`:
    - ~30 common HLA alleles plus a few animal alleles (`--allele-panel iedb_plus_animals`).
  - Reduced duplicate work in backend prediction:
    - Reused `Class1PresentationPredictor.predict(...)` processing outputs for
      `processing_with_score` and `processing_without_score` columns.
    - Removed separate direct processing predictor passes.
  - Runtime sanity:
    - Full `run --num-examples 5000` dropped from ~142s to ~80s on this machine.
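The pooling-semantics difference behind the fix can be illustrated with a small NumPy sketch (toy values and shapes; these are not the model's actual tensors):

```python
import numpy as np

# Toy per-position activations for one example (seq_len = 4) and a 0/1
# mask marking which positions belong to the flank. Values illustrative.
x = np.array([[0.5, 0.25, 0.0, 0.0]])
mask = np.array([[1.0, 1.0, 0.0, 0.0]])

# TF semantics (what the fix reproduces): shift by +1, mask, then take
# reduce_mean over the FULL sequence length, and shift back.
tf_style = ((x + 1.0) * mask).mean(axis=1) - 1.0   # denominator = 4

# Previous PyTorch behavior: average over flank positions only.
flank_only = (x * mask).sum(axis=1) / mask.sum(axis=1)  # denominator = 2

print(tf_style[0], flank_only[0])  # -0.3125 vs 0.375
```

Because the two conventions disagree whenever the mask does not cover the full sequence, the two flank-average scalar features differ materially, which is what shifted top-peptide ranking in presentation mode.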

- Added cross-product parity analysis workflow:
  - New script: `scripts/cross_allele_parity_analysis.py`
  - Generates random peptides uniformly across supported lengths (requested 7-15).
  - Crosses peptides against curated allele panel and predicts PT vs TF.
  - Produces:
    - prediction tables
    - numeric parity summaries
    - break analysis tables/report
    - plots under `plots/`
- Executed full run:
  - `1000` peptides x `35` alleles = `35000` pMHC rows
  - lengths: `7..15`
  - key result: no thresholded break events observed; differences remained at
    expected floating-point noise scale for score outputs and tiny absolute nM
    differences for affinity outputs.
- Follow-up experiment with random flanks:
  - Updated `scripts/cross_allele_parity_analysis.py` to:
    - generate random N/C flanks per peptide (length 5/5 from model support),
    - enforce pre-run uniqueness checks on peptide entries:
      - no repeated `peptide`, `n_flank`, or `c_flank`,
      - no duplicate `(peptide, n_flank, c_flank)` rows,
      - no duplicate `(peptide, allele, n_flank, c_flank)` in full dataset,
    - enforce post-run presentation sanity checks on both PT and TF:
      - at least 1% rows with score > 0.2,
      - at least one row with score > 0.9.
  - Run output dir: `/tmp/mhcflurry-cross-allele-1000-randflanks`
    - `1000` peptides x `35` alleles = `35000` rows.
  - Sanity thresholds passed:
    - PT with-flanks: 1.28% > 0.2, max 0.973
    - TF with-flanks: 1.28% > 0.2, max 0.973
    - PT without-flanks: 1.32% > 0.2, max 0.970
    - TF without-flanks: 1.32% > 0.2, max 0.970
- High-score fixture extraction for unit tests:
  - Added `scripts/extract_high_presentation_fixture.py`.
  - Extracted TF fixture rows from
    `/tmp/mhcflurry-cross-allele-1000-randflanks/tf_predictions.csv.gz`:
    - selected peptide+flank contexts where any allele had presentation score > 0.9,
    - retained all alleles for each selected context (including low scorers),
    - produced `315` rows (`9` contexts x `35` alleles).
  - Added fixture files:
    - `test/data/master_released_class1_presentation_highscore_rows.csv.gz`
    - `test/data/master_released_class1_presentation_highscore_rows_metadata.json`
  - Added regression test:
    - `test/test_released_presentation_highscore_rows.py`
    - validates fixture high/low context properties and compares released
      PyTorch predictions against TF fixture outputs.
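The peptide-panel generation described above can be sketched as follows (the helper name and signature are illustrative, not the script's actual API):

```python
import random

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"  # the 20 standard residues

def random_peptides(num, min_length=7, max_length=15, seed=0):
    """Random unique peptides with lengths drawn uniformly from the range.

    Illustrative sketch; the actual function in
    scripts/cross_allele_parity_analysis.py may differ.
    """
    rng = random.Random(seed)
    peptides = set()
    while len(peptides) < num:  # regenerate on collision to keep entries unique
        length = rng.randint(min_length, max_length)
        peptides.add("".join(rng.choice(AMINO_ACIDS) for _ in range(length)))
    return sorted(peptides)

panel = random_peptides(1000)
```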

## 2026-02-12

- Packaging / Torch readiness checks:
  - Verified `setup.py` publishes `torch>=2.0.0` in metadata and wheel:
    - `python setup.py egg_info`
    - `python -m pip wheel --no-deps .`
  - Verified generated metadata includes `Requires-Dist: torch>=2.0.0`.
  - Editable install attempt failed in this sandbox due to permissions in the
    shared virtualenv `bin/` path, not due to packaging metadata.

- Warning triage:
  - Important forward-compat warnings fixed:
    - `class1_presentation_predictor.py`: avoid `idxmin` on all-NA rows.
    - `random_negative_peptides.py`: avoid assigning `NaN` into int-typed frame.
  - Test warning cleanup:
    - `test_class1_processing_neural_network.py`: avoid `SettingWithCopyWarning`
      by copying train/test subsets before assignment.
  - Deprecated imports cleanup:
    - `downloads.py`: replaced `pipes.quote` with `shlex.quote`.
    - `downloads.py`: replaced `pkg_resources.resource_string` with
      `importlib.resources.files(...).read_text()`.

- Targeted validation after fixes:
  - `pytest -q test/test_class1_presentation_predictor.py::test_downloaded_predictor_invalid_peptides`
  - `pytest -q test/test_random_negative_peptides.py::test_random_negative_peptides_by_allele`
  - `pytest -q test/test_class1_processing_neural_network.py::test_small`
  - Result: all pass; only isolated `pytest.mark.slow` registration warning remains
    when running that single test file directly.
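The `pipes.quote` to `shlex.quote` swap noted above is a drop-in replacement; a minimal check of the quoting behavior:

```python
import shlex

# shlex.quote is the documented replacement for the removed pipes.quote:
# strings containing shell metacharacters get wrapped in single quotes,
# already-safe strings pass through unchanged.
path = "downloads/models class1.tar.bz2"   # contains a space
print(shlex.quote(path))                   # 'downloads/models class1.tar.bz2'
print(shlex.quote("safe_name"))            # safe_name
```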


================================================
FILE: README.md
================================================
[![Build Status](https://github.com/openvax/mhcflurry/actions/workflows/ci.yml/badge.svg)](https://github.com/openvax/mhcflurry/actions/workflows/ci.yml)
[![Coverage Status](https://coveralls.io/repos/github/openvax/mhcflurry/badge.svg?branch=master)](https://coveralls.io/github/openvax/mhcflurry?branch=master)
[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/openvax/mhcflurry/blob/master/notebooks/mhcflurry-colab.ipynb)

# mhcflurry
[MHC I](https://en.wikipedia.org/wiki/MHC_class_I) ligand
prediction package with competitive accuracy and a fast and
[documented](http://openvax.github.io/mhcflurry/) implementation.

> [!IMPORTANT]
> **Version 2.2.0** is the first release to use [PyTorch](https://pytorch.org/) as its neural network backend, replacing TensorFlow/Keras used in previous versions. It loads the same published weights and produces equivalent predictions, so existing workflows should continue to work with no changes.
>
> Key changes in 2.2.0:
> - **Backend**: TensorFlow/Keras replaced by PyTorch (>= 2.0)
> - **Python**: Requires Python 3.10+ (previously 3.9+)
> - **Dependencies**: `pandas >= 2.0` is now required; `tensorflow` and `keras` are no longer needed
> - **Hardware**: Automatic GPU detection; Apple Silicon (MPS) is now supported
>
> If you are upgrading from 2.1.x, simply `pip install --upgrade mhcflurry`. The published pre-trained models are unchanged and will be loaded and converted automatically.

MHCflurry implements class I peptide/MHC binding affinity prediction.
The current version provides pan-MHC I predictors supporting any MHC
allele of known sequence. MHCflurry runs on Python 3.10+ using the
[PyTorch](https://pytorch.org/) neural network library.
It exposes [command-line](http://openvax.github.io/mhcflurry/commandline_tutorial.html)
and [Python library](http://openvax.github.io/mhcflurry/python_tutorial.html)
interfaces.

MHCflurry also includes two experimental predictors,
an "antigen processing" predictor that attempts to model MHC allele-independent
effects such as proteasomal cleavage, and a "presentation" predictor that
integrates processing predictions with binding affinity predictions to give a
composite "presentation score." Both models are trained on mass spec-identified
MHC ligands.

If you find MHCflurry useful in your research please cite:

> T. O'Donnell, A. Rubinsteyn, U. Laserson. "MHCflurry 2.0: Improved pan-allele prediction of MHC I-presented peptides by incorporating antigen processing," *Cell Systems*, 2020. https://doi.org/10.1016/j.cels.2020.06.010

> T. O'Donnell, A. Rubinsteyn, M. Bonsack, A. B. Riemer, U. Laserson, and J. Hammerbacher, "MHCflurry: Open-Source Class I MHC Binding Affinity Prediction," *Cell Systems*, 2018. https://doi.org/10.1016/j.cels.2018.05.014

Please file an issue if you have questions or encounter problems.

Have a bugfix or other contribution? We would love your help. See our [contributing guidelines](CONTRIBUTING.md).

## Try it now

You can generate MHCflurry predictions without any setup by running our Google colaboratory [notebook](https://colab.research.google.com/github/openvax/mhcflurry/blob/master/notebooks/mhcflurry-colab.ipynb).

## Installation (pip)

Install the package:

```
$ pip install mhcflurry
```

Download our datasets and trained models:

```
$ mhcflurry-downloads fetch
```

You can now generate predictions:

```
$ mhcflurry-predict \
       --alleles HLA-A0201 HLA-A0301 \
       --peptides SIINFEKL SIINFEKD SIINFEKQ \
       --out /tmp/predictions.csv

Wrote: /tmp/predictions.csv
```

Or scan protein sequences for potential epitopes:

```
$ mhcflurry-predict-scan \
        --sequences MFVFLVLLPLVSSQCVNLTTRTQLPPAYTNSFTRGVYYPDKVFRSSVLHS \
        --alleles HLA-A*02:01 \
        --out /tmp/predictions.csv

Wrote: /tmp/predictions.csv
```


See the [documentation](http://openvax.github.io/mhcflurry/) for more details.


## Docker
You can also try the latest (GitHub master) version of MHCflurry using the Docker
image hosted on [Dockerhub](https://hub.docker.com/r/openvax/mhcflurry) by
running:

```
$ docker run -p 9999:9999 --rm openvax/mhcflurry:latest
```

This will start a [jupyter](https://jupyter.org/) notebook server in an
environment that has MHCflurry installed. Go to `http://localhost:9999` in a
browser to use it.

To build the Docker image yourself, from a checkout run:

```
$ docker build -t mhcflurry:latest .
$ docker run -p 9999:9999 --rm mhcflurry:latest
```
## Predicted sequence motifs
Sequence logos for the binding motifs learned by MHCflurry BA are available [here](https://openvax.github.io/mhcflurry-motifs/).

## Common issues and fixes

### Problems downloading data and models
Some users have reported HTTP connection issues when using `mhcflurry-downloads fetch`. As a workaround, you can download the data manually (e.g. using `wget`) and then use `mhcflurry-downloads` just to copy the data to the right place.

To do this, first get the URL(s) of the downloads you need using `mhcflurry-downloads url`:

```
$ mhcflurry-downloads url models_class1_presentation
https://github.com/openvax/mhcflurry/releases/download/1.6.0/models_class1_presentation.20200205.tar.bz2
```

Then make a directory and download the needed files to this directory:

```
$ mkdir downloads
$ wget --directory-prefix downloads https://github.com/openvax/mhcflurry/releases/download/1.6.0/models_class1_presentation.20200205.tar.bz2

HTTP request sent, awaiting response... 200 OK
Length: 72616448 (69M) [application/octet-stream]
Saving to: 'downloads/models_class1_presentation.20200205.tar.bz2'
```

Now call `mhcflurry-downloads fetch` with the `--already-downloaded-dir` option to indicate that the downloads should be retrieved from the specified directory:

```
$ mhcflurry-downloads fetch models_class1_presentation --already-downloaded-dir downloads
```


================================================
FILE: TODO.md
================================================
# TODO

## DONE

- [x] Run broader/full test suite before merge.
  - `pytest -q` passed: 100 tests.

- [x] Localize parity mismatch component.
  - Affinity and processing-without-flanks parity confirmed.
  - Processing-with-flanks identified as source of presentation divergence.

- [x] Create development tracking docs.
  - Added `NOTES.md` and `TODO.md`.

- [x] Fix with-flanks processing parity vs TF in `mhcflurry/class1_processing_neural_network.py`.
  - Changed N/C flank-average pooling to match TF masked `reduce_mean` semantics.
  - Verified by comparing intermediate feature vectors and outputs against TF.

- [x] Validate end-to-end parity after fix.
  - Targeted TF-vs-PyTorch comparisons now match to near float precision.
  - `test/test_class1_presentation_predictor.py::test_downloaded_predictor` now passes.
  - Parity-focused tests pass:
    - `test/test_master_compat_predictions.py`
    - `test/test_released_master_predictions.py`
    - `test/test_pytorch_regressions.py`

- [x] Add regression coverage for with-flanks average behavior.
  - Added `test_processing_flank_averages_use_tf_masked_mean_semantics`.

- [x] Add Modal training script for larger jobs.
  - Added `scripts/modal_train_mhcflurry.py` with:
    - GPU worker function
    - shared artifacts volume
    - command-template based parallel launch

- [x] Speed up TF-vs-PyTorch random comparison harness.
  - Added curated default allele panel (`iedb_plus_animals`) to reduce per-run
    affinity-group fragmentation.
  - Removed redundant direct processing passes in `predict-backend`; processing
    outputs now reused from presentation predictions.
  - Verified end-to-end run succeeds with expected parity metrics and faster runtime.

- [x] Add cross-product parity analysis + plots for fixed peptide panel across alleles.
  - Added `scripts/cross_allele_parity_analysis.py`.
  - Ran `1000` random peptides (uniform lengths `7-15`) across curated panel (`35` alleles).
  - Generated summaries and plots in `/tmp/mhcflurry-cross-allele-1000-panel`.

- [x] Extend cross-product analysis to random flanks + strict sanity requirements.
  - Added unique random flank generation per peptide.
  - Added pre-run duplicate checks for peptide/flank fields.
  - Added post-run presentation score checks:
    - >=1% rows with score >0.2
    - at least one row with score >0.9
  - Ran and validated in `/tmp/mhcflurry-cross-allele-1000-randflanks`.
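The pre- and post-run checks in this item can be sketched with pandas (the frame here is a toy stand-in; column names follow the notes):

```python
import pandas as pd

# Toy stand-in for the peptide panel: one row per peptide+flank context.
peptides = pd.DataFrame({
    "peptide": ["SIINFEKL", "KLGGALQAK"],
    "n_flank": ["AAAAA", "CDEFG"],
    "c_flank": ["NPQRS", "TVWYA"],
})

# Pre-run uniqueness checks on peptide/flank fields.
for col in ["peptide", "n_flank", "c_flank"]:
    assert not peptides[col].duplicated().any(), col
assert not peptides.duplicated(["peptide", "n_flank", "c_flank"]).any()

# Cross against the allele panel and check the full dataset.
alleles = pd.DataFrame({"allele": ["HLA-A*02:01", "HLA-A*03:01"]})
full = peptides.merge(alleles, how="cross")
assert not full.duplicated(["peptide", "allele", "n_flank", "c_flank"]).any()

# Post-run presentation score sanity checks (toy scores).
full["presentation_score"] = [0.95, 0.30, 0.01, 0.05]
assert (full["presentation_score"] > 0.2).mean() >= 0.01  # >=1% above 0.2
assert (full["presentation_score"] > 0.9).any()           # one high scorer
```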

- [x] Build a high-score TF fixture for presentation regression tests.
  - Added `scripts/extract_high_presentation_fixture.py`.
  - Extracted contexts with any presentation score > 0.9 and retained all
    alleles per context (including low-score alleles).
  - Added fixture files under `test/data/` and new test
    `test/test_released_presentation_highscore_rows.py`.

- [x] Triage and fix important warnings.
  - Fixed future pandas warning in `Class1PresentationPredictor` (`idxmin` on all-NA rows).
  - Fixed future pandas warning in `random_negative_peptides` (assigning NaN into int dtype).
  - Fixed test `SettingWithCopyWarning` in processing NN tests.
  - Removed deprecated `pipes` and `pkg_resources` usage from `downloads.py`.


================================================
FILE: code-of-conduct.md
================================================
# Contributor Covenant Code of Conduct

## Our Pledge

In the interest of fostering an open and welcoming environment, we as
contributors and maintainers pledge to making participation in our project and
our community a harassment-free experience for everyone, regardless of age, body
size, disability, ethnicity, sex characteristics, gender identity and expression,
level of experience, education, socio-economic status, nationality, personal
appearance, race, religion, or sexual identity and orientation.

## Our Standards

Examples of behavior that contributes to creating a positive environment
include:

* Using welcoming and inclusive language
* Being respectful of differing viewpoints and experiences
* Gracefully accepting constructive criticism
* Focusing on what is best for the community
* Showing empathy towards other community members

Examples of unacceptable behavior by participants include:

* The use of sexualized language or imagery and unwelcome sexual attention or
  advances
* Trolling, insulting/derogatory comments, and personal or political attacks
* Public or private harassment
* Publishing others' private information, such as a physical or electronic
  address, without explicit permission
* Other conduct which could reasonably be considered inappropriate in a
  professional setting

## Our Responsibilities

Project maintainers are responsible for clarifying the standards of acceptable
behavior and are expected to take appropriate and fair corrective action in
response to any instances of unacceptable behavior.

Project maintainers have the right and responsibility to remove, edit, or
reject comments, commits, code, wiki edits, issues, and other contributions
that are not aligned to this Code of Conduct, or to ban temporarily or
permanently any contributor for other behaviors that they deem inappropriate,
threatening, offensive, or harmful.

## Scope

This Code of Conduct applies within all project spaces, and it also applies when
an individual is representing the project or its community in public spaces.
Examples of representing a project or community include using an official
project e-mail address, posting via an official social media account, or acting
as an appointed representative at an online or offline event. Representation of
a project may be further defined and clarified by project maintainers.

## Enforcement

Instances of abusive, harassing, or otherwise unacceptable behavior may be
reported by contacting the project team at hello@openvax.org. All
complaints will be reviewed and investigated and will result in a response that
is deemed necessary and appropriate to the circumstances. The project team is
obligated to maintain confidentiality with regard to the reporter of an incident.
Further details of specific enforcement policies may be posted separately.

Project maintainers who do not follow or enforce the Code of Conduct in good
faith may face temporary or permanent repercussions as determined by other
members of the project's leadership.

## Attribution

This Code of Conduct is adapted from the [Contributor Covenant][homepage], version 1.4,
available at https://www.contributor-covenant.org/version/1/4/code-of-conduct.html

[homepage]: https://www.contributor-covenant.org

For answers to common questions about this code of conduct, see
https://www.contributor-covenant.org/faq


================================================
FILE: compatibility_check/figures/summary.csv
================================================
column,mean_abs_diff,max_abs_diff,median_pct_diff,p99_pct_diff
affinity_prediction,0.000855926090815266,0.005215815061092144,6.4490902031258496e-06,3.179401059073912e-05
affinity_prediction_high,0.0008876603356824831,0.007624668971402571,1.3224372910328715e-14,5.7364675654514835e-05
affinity_prediction_low,0.001672594107855598,0.011930973687412916,9.06903376028833e-06,6.726403554922356e-05
affinity_prediction_percentile,0.0,0.0,0.0,0.0
pres_with_affinity,0.0012569356975690076,0.006232064377400093,9.673636122167752e-06,7.026281525141103e-05
pres_with_affinity_percentile,0.0,0.0,0.0,0.0
pres_with_presentation_percentile,6.398100744094776e-17,1.7763568394002505e-15,0.0,1.1063297085880059e-13
pres_with_presentation_score,1.8840357408035613e-08,1.6878932457276008e-07,9.296827030315175e-06,5.1157195465321214e-05
pres_with_processing_score,2.2056083823879512e-08,6.332993507385254e-08,3.609134553534807e-06,8.415549260993538e-06
pres_without_affinity,0.0012569356975690076,0.006232064377400093,9.673636122167752e-06,7.026281525141103e-05
pres_without_affinity_percentile,0.0,0.0,0.0,0.0
pres_without_presentation_percentile,1.2749942195496836e-16,1.7763568394002505e-15,0.0,1.7011266438791655e-13
pres_without_presentation_score,1.668495578066763e-08,1.7208790847877964e-07,6.2876170033895e-06,4.9695389639194476e-05
pres_without_processing_score,2.8383164760302833e-10,1.4901161138336505e-08,0.0,1.0314542159027186e-06
processing_with_score,2.2056083823879512e-08,6.332993507385254e-08,3.609134553534807e-06,8.415549260993538e-06
processing_without_score,2.8383164760302833e-10,1.4901161138336505e-08,0.0,1.0314542159027186e-06


================================================
FILE: develop.sh
================================================
#!/bin/bash
# Development environment setup script
# Source this script to activate the venv: source develop.sh

SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
VENV_DIR="$SCRIPT_DIR/.venv"

# Check if already in the venv
if [[ "$VIRTUAL_ENV" == "$VENV_DIR" ]]; then
    echo "Virtual environment already active."
else
    if [[ -d "$VENV_DIR" ]]; then
        source "$VENV_DIR/bin/activate"
        echo "Activated virtual environment: $VENV_DIR"
    else
        echo "Virtual environment not found. Creating and installing..."
        python -m venv "$VENV_DIR"
        source "$VENV_DIR/bin/activate"
        pip install -e .
        echo "Activated virtual environment: $VENV_DIR"
    fi
fi


================================================
FILE: docs/Makefile
================================================
# Makefile for Sphinx documentation
#

# You can set these variables from the command line.
SPHINXOPTS    =
SPHINXBUILD   = sphinx-build
PAPER         =
BUILDDIR      = _build

# Internal variables.
PAPEROPT_a4     = -D latex_paper_size=a4
PAPEROPT_letter = -D latex_paper_size=letter
ALLSPHINXOPTS   = -v -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .
# the i18n builder cannot share the environment and doctrees with the others
I18NSPHINXOPTS  = $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .

.PHONY: help
help:
	@echo "Please use \`make <target>' where <target> is one of"
	@echo "  html       to make standalone HTML files"
	@echo "  dirhtml    to make HTML files named index.html in directories"
	@echo "  singlehtml to make a single large HTML file"
	@echo "  pickle     to make pickle files"
	@echo "  json       to make JSON files"
	@echo "  htmlhelp   to make HTML files and a HTML help project"
	@echo "  qthelp     to make HTML files and a qthelp project"
	@echo "  applehelp  to make an Apple Help Book"
	@echo "  devhelp    to make HTML files and a Devhelp project"
	@echo "  epub       to make an epub"
	@echo "  epub3      to make an epub3"
	@echo "  latex      to make LaTeX files, you can set PAPER=a4 or PAPER=letter"
	@echo "  latexpdf   to make LaTeX files and run them through pdflatex"
	@echo "  latexpdfja to make LaTeX files and run them through platex/dvipdfmx"
	@echo "  text       to make text files"
	@echo "  man        to make manual pages"
	@echo "  texinfo    to make Texinfo files"
	@echo "  info       to make Texinfo files and run them through makeinfo"
	@echo "  gettext    to make PO message catalogs"
	@echo "  changes    to make an overview of all changed/added/deprecated items"
	@echo "  xml        to make Docutils-native XML files"
	@echo "  pseudoxml  to make pseudoxml-XML files for display purposes"
	@echo "  linkcheck  to check all external links for integrity"
	@echo "  doctest    to run all doctests embedded in the documentation (if enabled)"
	@echo "  coverage   to run coverage check of the documentation (if enabled)"
	@echo "  dummy      to check syntax errors of document sources"

# Added by Tim
.PHONY: generate
generate:
	sphinx-apidoc -M -f -o _build/ ../mhcflurry

.PHONY: clean
clean:
	# Added by tim: preserve html/.git
	rm -rf $(BUILDDIR)/html/*
	mv $(BUILDDIR)/html /tmp/html-bk
	rm -rf $(BUILDDIR)/*
	mv /tmp/html-bk $(BUILDDIR)/html

.PHONY: html
html:
	$(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html
	@echo
	@echo "Build finished. The HTML pages are in $(BUILDDIR)/html."

.PHONY: dirhtml
dirhtml:
	$(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml
	@echo
	@echo "Build finished. The HTML pages are in $(BUILDDIR)/dirhtml."

.PHONY: singlehtml
singlehtml:
	$(SPHINXBUILD) -b singlehtml $(ALLSPHINXOPTS) $(BUILDDIR)/singlehtml
	@echo
	@echo "Build finished. The HTML page is in $(BUILDDIR)/singlehtml."

.PHONY: pickle
pickle:
	$(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle
	@echo
	@echo "Build finished; now you can process the pickle files."

.PHONY: json
json:
	$(SPHINXBUILD) -b json $(ALLSPHINXOPTS) $(BUILDDIR)/json
	@echo
	@echo "Build finished; now you can process the JSON files."

.PHONY: htmlhelp
htmlhelp:
	$(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) $(BUILDDIR)/htmlhelp
	@echo
	@echo "Build finished; now you can run HTML Help Workshop with the" \
	      ".hhp project file in $(BUILDDIR)/htmlhelp."

.PHONY: qthelp
qthelp:
	$(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) $(BUILDDIR)/qthelp
	@echo
	@echo "Build finished; now you can run "qcollectiongenerator" with the" \
	      ".qhcp project file in $(BUILDDIR)/qthelp, like this:"
	@echo "# qcollectiongenerator $(BUILDDIR)/qthelp/MHCflurry.qhcp"
	@echo "To view the help file:"
	@echo "# assistant -collectionFile $(BUILDDIR)/qthelp/MHCflurry.qhc"

.PHONY: applehelp
applehelp:
	$(SPHINXBUILD) -b applehelp $(ALLSPHINXOPTS) $(BUILDDIR)/applehelp
	@echo
	@echo "Build finished. The help book is in $(BUILDDIR)/applehelp."
	@echo "N.B. You won't be able to view it unless you put it in" \
	      "~/Library/Documentation/Help or install it in your application" \
	      "bundle."

.PHONY: devhelp
devhelp:
	$(SPHINXBUILD) -b devhelp $(ALLSPHINXOPTS) $(BUILDDIR)/devhelp
	@echo
	@echo "Build finished."
	@echo "To view the help file:"
	@echo "# mkdir -p $$HOME/.local/share/devhelp/MHCflurry"
	@echo "# ln -s $(BUILDDIR)/devhelp $$HOME/.local/share/devhelp/MHCflurry"
	@echo "# devhelp"

.PHONY: epub
epub:
	$(SPHINXBUILD) -b epub $(ALLSPHINXOPTS) $(BUILDDIR)/epub
	@echo
	@echo "Build finished. The epub file is in $(BUILDDIR)/epub."

.PHONY: epub3
epub3:
	$(SPHINXBUILD) -b epub3 $(ALLSPHINXOPTS) $(BUILDDIR)/epub3
	@echo
	@echo "Build finished. The epub3 file is in $(BUILDDIR)/epub3."

.PHONY: latex
latex:
	$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
	@echo
	@echo "Build finished; the LaTeX files are in $(BUILDDIR)/latex."
	@echo "Run \`make' in that directory to run these through (pdf)latex" \
	      "(use \`make latexpdf' here to do that automatically)."

.PHONY: latexpdf
latexpdf:
	$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
	@echo "Running LaTeX files through pdflatex..."
	$(MAKE) -C $(BUILDDIR)/latex all-pdf
	@echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."

.PHONY: latexpdfja
latexpdfja:
	$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
	@echo "Running LaTeX files through platex and dvipdfmx..."
	$(MAKE) -C $(BUILDDIR)/latex all-pdf-ja
	@echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."

.PHONY: text
text:
	$(SPHINXBUILD) -b text $(ALLSPHINXOPTS) $(BUILDDIR)/text
	@echo
	@echo "Build finished. The text files are in $(BUILDDIR)/text."

.PHONY: man
man:
	$(SPHINXBUILD) -b man $(ALLSPHINXOPTS) $(BUILDDIR)/man
	@echo
	@echo "Build finished. The manual pages are in $(BUILDDIR)/man."

.PHONY: texinfo
texinfo:
	$(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
	@echo
	@echo "Build finished. The Texinfo files are in $(BUILDDIR)/texinfo."
	@echo "Run \`make' in that directory to run these through makeinfo" \
	      "(use \`make info' here to do that automatically)."

.PHONY: info
info:
	$(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
	@echo "Running Texinfo files through makeinfo..."
	make -C $(BUILDDIR)/texinfo info
	@echo "makeinfo finished; the Info files are in $(BUILDDIR)/texinfo."

.PHONY: gettext
gettext:
	$(SPHINXBUILD) -b gettext $(I18NSPHINXOPTS) $(BUILDDIR)/locale
	@echo
	@echo "Build finished. The message catalogs are in $(BUILDDIR)/locale."

.PHONY: changes
changes:
	$(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes
	@echo
	@echo "The overview file is in $(BUILDDIR)/changes."

.PHONY: linkcheck
linkcheck:
	$(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck
	@echo
	@echo "Link check complete; look for any errors in the above output " \
	      "or in $(BUILDDIR)/linkcheck/output.txt."

.PHONY: doctest
doctest:
	$(SPHINXBUILD) -b doctest $(ALLSPHINXOPTS) $(BUILDDIR)/doctest
	@echo "Testing of doctests in the sources finished, look at the " \
	      "results in $(BUILDDIR)/doctest/output.txt."

.PHONY: coverage
coverage:
	$(SPHINXBUILD) -b coverage $(ALLSPHINXOPTS) $(BUILDDIR)/coverage
	@echo "Testing of coverage in the sources finished, look at the " \
	      "results in $(BUILDDIR)/coverage/python.txt."

.PHONY: xml
xml:
	$(SPHINXBUILD) -b xml $(ALLSPHINXOPTS) $(BUILDDIR)/xml
	@echo
	@echo "Build finished. The XML files are in $(BUILDDIR)/xml."

.PHONY: pseudoxml
pseudoxml:
	$(SPHINXBUILD) -b pseudoxml $(ALLSPHINXOPTS) $(BUILDDIR)/pseudoxml
	@echo
	@echo "Build finished. The pseudo-XML files are in $(BUILDDIR)/pseudoxml."

.PHONY: dummy
dummy:
	$(SPHINXBUILD) -b dummy $(ALLSPHINXOPTS) $(BUILDDIR)/dummy
	@echo
	@echo "Build finished. Dummy builder generates no files."


================================================
FILE: docs/README.md
================================================
# MHCflurry documentation

To generate Sphinx documentation, from this directory run:

```
$ pip install -r requirements.txt  # only needed the first time you generate docs
$ make generate html
```

Documentation is written to the _build/ directory. These files should not be
checked into the repo.

To test example code:
```
$ make doctest 
```

Then take a look at _build/doctest for detailed output.



================================================
FILE: docs/api.rst
================================================
.. _api-documentation:

API Documentation
=================

.. include:: _build/mhcflurry.rst
    :start-line: 2

================================================
FILE: docs/commandline_tools.rst
================================================
Command-line reference
============================

See also the :ref:`tutorial <commandline_tutorial>`.

.. _mhcflurry-predict:

.. autoprogram:: mhcflurry.predict_command:parser
    :prog: mhcflurry-predict

.. _mhcflurry-predict-scan:

.. autoprogram:: mhcflurry.predict_scan_command:parser
    :prog: mhcflurry-predict-scan

.. _mhcflurry-downloads:

.. autoprogram:: mhcflurry.downloads_command:parser
    :prog: mhcflurry-downloads

.. _mhcflurry-class1-train-allele-specific-models:

.. autoprogram:: mhcflurry.train_allele_specific_models_command:parser
    :prog: mhcflurry-class1-train-allele-specific-models

.. _mhcflurry-class1-select-allele-specific-models:

.. autoprogram:: mhcflurry.select_allele_specific_models_command:parser
    :prog: mhcflurry-class1-select-allele-specific-models

.. _mhcflurry-class1-train-pan-allele-models:

.. autoprogram:: mhcflurry.train_pan_allele_models_command:parser
    :prog: mhcflurry-class1-train-pan-allele-models

.. _mhcflurry-class1-select-pan-allele-models:

.. autoprogram:: mhcflurry.select_pan_allele_models_command:parser
    :prog: mhcflurry-class1-select-pan-allele-models

.. _mhcflurry-class1-train-processing-models:

.. autoprogram:: mhcflurry.train_processing_models_command:parser
    :prog: mhcflurry-class1-train-processing-models

.. _mhcflurry-class1-select-processing-models:

.. autoprogram:: mhcflurry.select_processing_models_command:parser
    :prog: mhcflurry-class1-select-processing-models

.. _mhcflurry-class1-train-presentation-models:

.. autoprogram:: mhcflurry.train_presentation_models_command:parser
    :prog: mhcflurry-class1-train-presentation-models



================================================
FILE: docs/commandline_tutorial.rst
================================================
.. _commandline_tutorial:

Command-line tutorial
=====================

.. _downloading:

Downloading models
------------------

Most users will use pre-trained MHCflurry models that we release. These models
are distributed separately from the pip package and may be downloaded with the
:ref:`mhcflurry-downloads` tool:

.. code-block:: shell

    $ mhcflurry-downloads fetch models_class1_presentation

Files downloaded with :ref:`mhcflurry-downloads` are stored in a platform-specific
directory. To get the path to downloaded data, you can use:

.. command-output:: mhcflurry-downloads path models_class1_presentation
    :nostderr:

We also release a number of other "downloads," such as curated training data and some
experimental models. To see what's available and what you have downloaded, run
``mhcflurry-downloads info``.

Most users will only need ``models_class1_presentation``, however, as the
presentation predictor includes a peptide / MHC I binding affinity (BA) predictor
as well as an antigen processing (AP) predictor.

.. note::

    The code we use for *generating* the downloads is in the
    ``downloads-generation`` directory in the repository (https://github.com/openvax/mhcflurry/tree/master/downloads-generation).


Generating predictions
----------------------

The :ref:`mhcflurry-predict` command generates predictions for individual peptides
(see the next section for how to scan protein sequences for epitopes). By
default it will use the pre-trained models you downloaded above. Other
models can be used by specifying the ``--models`` argument.

Running:

.. command-output::
    mhcflurry-predict
        --alleles HLA-A0201 HLA-A0301
        --peptides SIINFEKL SIINFEKD SIINFEKQ
        --out /tmp/predictions.csv
    :nostderr:

results in a file like this:

.. command-output::
    cat /tmp/predictions.csv

The binding affinity predictions are given as affinities (KD) in nM in the
``mhcflurry_affinity`` column. Lower values indicate stronger binders. A commonly-used
threshold for peptides with a reasonable chance of being immunogenic is 500 nM.

The ``mhcflurry_affinity_percentile`` gives the percentile of the affinity
prediction among a large number of random peptides tested on that allele (range
0 - 100). Lower is stronger. Two percent is a commonly-used threshold.

The last two columns give the antigen processing and presentation scores,
respectively. These range from 0 to 1 with higher values indicating more
favorable processing or presentation.
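
As a sketch of how these columns might be used downstream, here is a short
pandas example applying the 500 nM affinity and 2% percentile thresholds
mentioned above. The values are made up for illustration, not real model
output; only the column names follow the ``mhcflurry-predict`` output
described here.

```python
import pandas as pd

# Illustrative values in the columns mhcflurry-predict writes;
# these numbers are invented, not real predictions.
df = pd.DataFrame({
    "peptide": ["SIINFEKL", "SIINFEKD", "SIINFEKQ"],
    "mhcflurry_affinity": [12000.0, 300.0, 9500.0],
    "mhcflurry_affinity_percentile": [8.5, 0.4, 6.1],
})

# Keep peptides under the commonly used 500 nM / 2% cutoffs.
binders = df[
    (df.mhcflurry_affinity < 500)
    & (df.mhcflurry_affinity_percentile < 2.0)
]
print(binders.peptide.tolist())
```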

.. note::

    The processing predictor is experimental. It models allele-independent
    effects that influence whether a
    peptide will be detected in a mass spec experiment. The presentation score is
    a simple logistic regression model that combines the (log) binding affinity
    prediction with the processing score to give a composite prediction. The resulting
    prediction may be useful for prioritizing potential epitopes, but no
    thresholds have been established for what constitutes a "high enough"
    presentation score.

In most cases you'll want to specify the input as a CSV file instead of passing
peptides and alleles as commandline arguments. If you're relying on the
processing or presentation scores, you may also want to pass the upstream and
downstream sequences of the peptides from their source proteins for potentially more
accurate cleavage prediction. See the :ref:`mhcflurry-predict` docs.
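
As a sketch, an input CSV can be assembled with the standard library. The
column names used below (``allele``, ``peptide``, and optional ``n_flank`` /
``c_flank``) are assumptions for illustration; consult the
``mhcflurry-predict --help`` output for the authoritative input format.

```python
import csv
import io

# Hypothetical rows; column names are assumptions, see the CLI docs.
rows = [
    {"allele": "HLA-A*02:01", "peptide": "SIINFEKL",
     "n_flank": "AAA", "c_flank": "GGG"},
    {"allele": "HLA-A*03:01", "peptide": "NLVPMVATV",
     "n_flank": "MSS", "c_flank": "TPV"},
]

# Write to an in-memory buffer; in practice, open a file instead.
buf = io.StringIO()
writer = csv.DictWriter(
    buf, fieldnames=["allele", "peptide", "n_flank", "c_flank"])
writer.writeheader()
writer.writerows(rows)
print(buf.getvalue())
```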


Using the older, allele-specific models
-------------------------------------------

Previous versions of MHCflurry (described in the 2018 paper) used models
trained on affinity measurements, one allele per model (i.e. allele-specific).
Mass spec datasets were incorporated in the model selection step.

These models are still available to use with the latest version of MHCflurry.
To download these predictors, run:

.. code-block:: shell

    $ mhcflurry-downloads fetch models_class1

and specify ``--models`` when you call ``mhcflurry-predict``:


.. code-block:: shell

    $ mhcflurry-predict \
        --alleles HLA-A0201 HLA-A0301 \
        --peptides SIINFEKL SIINFEKD SIINFEKQ \
        --models "$(mhcflurry-downloads path models_class1)/models" \
        --out /tmp/predictions.csv


Scanning protein sequences for predicted MHC I ligands
--------------------------------------------------------

Starting in version 1.6.0, MHCflurry supports scanning proteins for MHC-binding
peptides using the ``mhcflurry-predict-scan`` command.

We'll generate predictions across ``example.fasta``, a FASTA file with two short
sequences:

.. literalinclude:: /example.fasta

Here's the ``mhcflurry-predict-scan`` invocation to scan the proteins for
binders to either of two MHC I genotypes (using a 100 nM threshold):

.. command-output::
    mhcflurry-predict-scan
        example.fasta
        --alleles
            HLA-A*02:01,HLA-A*03:01,HLA-B*57:01,HLA-B*45:01,HLA-C*02:02,HLA-C*07:02
            HLA-A*01:01,HLA-A*02:06,HLA-B*44:02,HLA-B*07:02,HLA-C*01:02,HLA-C*03:01
        --threshold-affinity 100
    :nostderr:

See the :ref:`mhcflurry-predict-scan` docs for more options.
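
Conceptually, scanning enumerates every candidate peptide of the considered
lengths from each protein and predicts on all of them. The sketch below
illustrates the enumeration step only, using lengths 8 through 11 as an
assumed default; check the ``--peptide-lengths`` option for the actual
setting.

```python
def candidate_peptides(sequence, lengths=(8, 9, 10, 11)):
    """Enumerate all subsequences of the given lengths, as a scanner would."""
    return [
        sequence[i:i + n]
        for n in lengths
        for i in range(len(sequence) - n + 1)
    ]

protein1 = "MSSSSTPVCPNGPGNCQV"  # first record in example.fasta
peptides = candidate_peptides(protein1)
print(len(peptides), peptides[0])
```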


Fitting your own models
-----------------------

If you have your own data and want to fit your own MHCflurry models, you have
a few options. If you have data for only one or a few MHC I alleles, the best
approach is to use the
:ref:`mhcflurry-class1-train-allele-specific-models` command to fit an
"allele-specific" predictor, in which separate neural networks are used for
each allele.

To call :ref:`mhcflurry-class1-train-allele-specific-models` you'll need some
training data. The data we use for our released predictors can be downloaded with
:ref:`mhcflurry-downloads`:

.. code-block:: shell

    $ mhcflurry-downloads fetch data_curated

It looks like this:

.. command-output::
    bzcat "$(mhcflurry-downloads path data_curated)/curated_training_data.csv.bz2" | head -n 3
    :shell:
    :nostderr:

Here's an example invocation to fit a predictor:

.. code-block:: shell

    $ mhcflurry-class1-train-allele-specific-models \
        --data curated_training_data.csv.bz2 \
        --hyperparameters hyperparameters.yaml \
        --min-measurements-per-allele 75 \
        --out-models-dir models

The ``hyperparameters.yaml`` file gives the list of neural network architectures
to train models for. Here's an example specifying a single architecture:

.. code-block:: yaml

    - activation: tanh
      dense_layer_l1_regularization: 0.0
      dropout_probability: 0.0
      early_stopping: true
      layer_sizes: [8]
      locally_connected_layers: []
      loss: custom:mse_with_inequalities
      max_epochs: 500
      minibatch_size: 128
      n_models: 4
      output_activation: sigmoid
      patience: 20
      peptide_amino_acid_encoding: BLOSUM62
      random_negative_affinity_max: 50000.0
      random_negative_affinity_min: 20000.0
      random_negative_constant: 25
      random_negative_rate: 0.0
      validation_split: 0.1

The available hyperparameters for binding predictors are defined in
`~mhcflurry.Class1NeuralNetwork`. To see exactly how
these are used you will need to read the source code.
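
Before a long training run, it can be worth sanity-checking the
hyperparameters file by loading it and inspecting each architecture. This is
a minimal sketch using PyYAML with a shortened version of the example above;
the field names follow that example.

```python
import yaml

# Abbreviated version of the hyperparameters.yaml example shown above.
hyperparameters_yaml = """
- activation: tanh
  layer_sizes: [8]
  max_epochs: 500
  minibatch_size: 128
  n_models: 4
"""

# The file is a YAML list: one entry per architecture to train.
architectures = yaml.safe_load(hyperparameters_yaml)
for arch in architectures:
    print(arch["layer_sizes"], arch["n_models"])
```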

.. note::

    MHCflurry predictors are serialized to disk as many files in a directory. The
    model training command above will write the models to the output directory specified by the
    ``--out-models-dir`` argument. This directory has files like:

    .. program-output::
        ls "$(mhcflurry-downloads path models_class1)/models"
        :shell:
        :nostderr:
        :ellipsis: 4,-4

    The ``manifest.csv`` file gives metadata for all the models used in the predictor.
    There will be a ``weights_...`` file for each model giving its weights
    (the parameters for the neural network). The ``percent_ranks.csv`` stores a
    histogram of model predictions for each allele over a large number of random
    peptides. It is used for generating the percent ranks at prediction time.
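
The percent-rank idea can be sketched as follows: given a stored distribution
of predictions over random peptides for an allele, the rank of a new
prediction is the percentage of random peptides predicted to bind at least as
strongly. This is a simplified illustration of the concept with invented
numbers, not MHCflurry's exact implementation.

```python
import numpy as np

# Hypothetical sorted affinities (nM) for random peptides on one allele.
random_peptide_affinities = np.sort(
    np.array([50.0, 200.0, 1500.0, 8000.0, 30000.0]))

def percent_rank(affinity, distribution):
    """Percent of random peptides with affinity <= this one (lower nM = stronger)."""
    position = np.searchsorted(distribution, affinity, side="right")
    return 100.0 * position / len(distribution)

print(percent_rank(100.0, random_peptide_affinities))
```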

To fit pan-allele models like the ones released with MHCflurry, you can use
a similar tool, :ref:`mhcflurry-class1-train-pan-allele-models`. You'll probably
also want to take a look at the scripts used to generate the production models,
which are available in the *downloads-generation* directory in the MHCflurry
repository. See the scripts in the *models_class1_pan* subdirectory to see how the
fitting and model selection was done for models currently distributed with MHCflurry.

.. note::

    The production MHCflurry models were fit using a cluster with several
    dozen GPUs over a period of about two days. If you perform model selection
    over fewer architectures, however, it should be possible to fit a predictor
    with fewer resources.


Environment variables
-------------------------------------------------

MHCflurry behavior can be modified using these environment variables:

``MHCFLURRY_DEFAULT_CLASS1_MODELS``
    Path to models directory. If you call ``Class1AffinityPredictor.load()``
    with no arguments, the models specified in this environment variable will be
    used. If this environment variable is undefined, the downloaded models for
    the current MHCflurry release are used.

``MHCFLURRY_OPTIMIZATION_LEVEL``
    The pan-allele models can be somewhat slow. As an optimization, when this
    variable is greater than 0 (default is 1), we merge the pan-allele models in
    the ensemble into a single combined network. In our experiments
    it gives about a 30% speed improvement. It has no effect on allele-specific
    models. Set this variable to 0 to disable this behavior. This may be helpful
    if you are running out of memory using the pan-allele models.


``MHCFLURRY_DEFAULT_PREDICT_BATCH_SIZE``
    For large prediction tasks, it can be helpful to increase the prediction batch
    size, which is set by this environment variable (default is 4096). This
    affects both allele-specific and pan-allele predictors. It can have large
    effects on performance. Alternatively, if you are running out of memory,
    you can try decreasing the batch size.
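
These variables can also be set from Python. The values below are
illustrative; setting them before importing ``mhcflurry`` is the safe
ordering, since they may be read when the library or models are loaded.

```python
import os

# Set before importing mhcflurry so the settings take effect.
os.environ["MHCFLURRY_DEFAULT_PREDICT_BATCH_SIZE"] = "16384"
os.environ["MHCFLURRY_OPTIMIZATION_LEVEL"] = "0"

# import mhcflurry  # would now pick up the settings above
```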




================================================
FILE: docs/conf.py
================================================
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
#
# MHCflurry documentation build configuration file, created by
# sphinx-quickstart on Sun Dec 10 20:25:16 2017.
#
# This file is execfile()d with the current directory set to its
# containing dir.
#
# Note that not all possible configuration values are present in this
# autogenerated file.
#
# All configuration values have a default; values that are commented out
# serve to show the default.

import sys
import os
import re
import textwrap
import logging
import subprocess

if os.environ.get("READTHEDOCS"):
    # For rtd builds, call "make generate" first.
    subprocess.check_call("make generate", shell=True)

# Hack added by tim for bug in autoprogram extension under Python 2.
# sphinx.util.pycompat was removed in newer Sphinx releases, so guard the
# import to stay compatible with both old and new Sphinx.
try:
    from sphinx.util.pycompat import indent  # pylint: disable=import-error
    textwrap.indent = indent
except ImportError:
    pass

# Disable logging (added by tim)
logging.disable(logging.ERROR)

# If extensions (or modules to document with autodoc) are in another directory,
# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
sys.path.insert(0, os.path.abspath('.'))

# -- General configuration ------------------------------------------------

# If your documentation needs a minimal Sphinx version, state it here.
#needs_sphinx = '1.0'

# Add any Sphinx extension module names here, as strings. They can be
# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
# ones.
extensions = [
    'sphinx.ext.autodoc',
    'sphinx.ext.doctest',
    'sphinx.ext.coverage',
    'sphinx.ext.ifconfig',
    'sphinx.ext.viewcode',
    'sphinx.ext.githubpages',
    'numpydoc',
    'sphinxcontrib.programoutput',
    'sphinxcontrib.autoprogram',
    'sphinx.ext.githubpages',
]

doctest_global_setup = '''
import logging
logging.getLogger('matplotlib').disabled = True
import numpy
import pandas
import mhcflurry
pandas.set_option('display.max_columns', 20)
pandas.set_option('display.expand_frame_repr', False)
'''

doctest_test_doctest_blocks = ''

# Add any paths that contain templates here, relative to this directory.
templates_path = ['_templates']

# The suffix(es) of source filenames.
# You can specify multiple suffix as a list of string:
# source_suffix = ['.rst', '.md']
source_suffix = '.rst'

# The encoding of source files.
#source_encoding = 'utf-8-sig'

# The master toctree document.
master_doc = 'index'

# General information about the project.
project = 'MHCflurry'
copyright = 'Timothy O\'Donnell'
author = 'Timothy O\'Donnell'

# The version info for the project you're documenting, acts as replacement for
# |version| and |release|, also used in various other places throughout the
# built documents.
#

# The short X.Y version.
# Added by Tim: reading version from mhcflurry __init__.py as in setup.py
with open('../mhcflurry/version.py', 'r') as f:
    version = re.search(
        r'^__version__\s*=\s*[\'"]([^\'"]*)[\'"]',
        f.read(),
        re.MULTILINE).group(1)

# The full version, including alpha/beta/rc tags.
release = version

# Added by tim
autodoc_member_order = 'bysource'
autoclass_content = 'both'

# Added by tim
suppress_warnings = ['image.nonlocal_uri']

# The language for content autogenerated by Sphinx. Refer to documentation
# for a list of supported languages.
#
# This is also used if you do content translation via gettext catalogs.
# Usually you set "language" from the command line for these cases.
language = 'en'

# There are two options for replacing |today|: either, you set today to some
# non-false value, then it is used:
#today = ''
# Else, today_fmt is used as the format for a strftime call.
#today_fmt = '%B %d, %Y'

# List of patterns, relative to source directory, that match files and
# directories to ignore when looking for source files.
# This patterns also effect to html_static_path and html_extra_path
exclude_patterns = ['_build', 'Thumbs.db', '.DS_Store']

# The reST default role (used for this markup: `text`) to use for all
# documents.
default_role = 'py:obj'

# If true, '()' will be appended to :func: etc. cross-reference text.
#add_function_parentheses = True

# If true, the current module name will be prepended to all description
# unit titles (such as .. function::).
#add_module_names = True

# If true, sectionauthor and moduleauthor directives will be shown in the
# output. They are ignored by default.
#show_authors = False

# The name of the Pygments (syntax highlighting) style to use.
pygments_style = 'sphinx'

# A list of ignored prefixes for module index sorting.
#modindex_common_prefix = []

# If true, keep warnings as "system message" paragraphs in the built documents.
#keep_warnings = False

# If true, `todo` and `todoList` produce output, else they produce nothing.
todo_include_todos = False

# Added by Tim
# http://stackoverflow.com/questions/12206334/sphinx-autosummary-toctree-contains-reference-to-nonexisting-document-warnings
numpydoc_show_class_members = False

# -- Options for HTML output ----------------------------------------------

# The theme to use for HTML and HTML Help pages.  See the documentation for
# a list of builtin themes.
html_theme = 'sphinx_rtd_theme'

# Theme options are theme-specific and customize the look and feel of a theme
# further.  For a list of options available for each theme, see the
# documentation.
#html_theme_options = {}

# Add any paths that contain custom themes here, relative to this directory.
#html_theme_path = []

# The name for this set of Sphinx documents.
# "<project> v<release> documentation" by default.
#html_title = 'MHCflurry v1.0.0'

# A shorter title for the navigation bar.  Default is the same as html_title.
#html_short_title = None

# The name of an image file (relative to this directory) to place at the top
# of the sidebar.
#html_logo = None

# The name of an image file (relative to this directory) to use as a favicon of
# the docs.  This file should be a Windows icon file (.ico) being 16x16 or 32x32
# pixels large.
#html_favicon = None

# Add any paths that contain custom static files (such as style sheets) here,
# relative to this directory. They are copied after the builtin static files,
# so a file named "default.css" will overwrite the builtin "default.css".
# html_static_path = ['_static']

# Add any extra paths that contain custom files (such as robots.txt or
# .htaccess) here, relative to this directory. These files are copied
# directly to the root of the documentation.
#html_extra_path = []

# If not None, a 'Last updated on:' timestamp is inserted at every page
# bottom, using the given strftime format.
# The empty string is equivalent to '%b %d, %Y'.
html_last_updated_fmt = ""

# If true, SmartyPants will be used to convert quotes and dashes to
# typographically correct entities.
#html_use_smartypants = True

# Custom sidebar templates, maps document names to template names.
#html_sidebars = {}

# Additional templates that should be rendered to pages, maps page names to
# template names.
#html_additional_pages = {}

# If false, no module index is generated.
html_domain_indices = False

# If false, no index is generated.
html_use_index = False

# If true, the index is split into individual pages for each letter.
#html_split_index = False

# If true, links to the reST sources are added to the pages.
#html_show_sourcelink = True

# If true, "Created using Sphinx" is shown in the HTML footer. Default is True.
#html_show_sphinx = True

# If true, "(C) Copyright ..." is shown in the HTML footer. Default is True.
#html_show_copyright = True

# If true, an OpenSearch description file will be output, and all pages will
# contain a <link> tag referring to it.  The value of this option must be the
# base URL from which the finished HTML is served.
#html_use_opensearch = ''

# This is the file name suffix for HTML files (e.g. ".xhtml").
#html_file_suffix = None

# Language to be used for generating the HTML full-text search index.
# Sphinx supports the following languages:
#   'da', 'de', 'en', 'es', 'fi', 'fr', 'h', 'it', 'ja'
#   'nl', 'no', 'pt', 'ro', 'r', 'sv', 'tr', 'zh'
#html_search_language = 'en'

# A dictionary with options for the search language support, empty by default.
# 'ja' uses this config value.
# 'zh' user can custom change `jieba` dictionary path.
#html_search_options = {'type': 'default'}

# The name of a javascript file (relative to the configuration directory) that
# implements a search results scorer. If empty, the default will be used.
#html_search_scorer = 'scorer.js'

# Output file base name for HTML help builder.
htmlhelp_basename = 'MHCflurrydoc'

# -- Options for LaTeX output ---------------------------------------------

latex_elements = {
# The paper size ('letterpaper' or 'a4paper').
#'papersize': 'letterpaper',

# The font size ('10pt', '11pt' or '12pt').
#'pointsize': '10pt',

# Additional stuff for the LaTeX preamble.
#'preamble': '',

# Latex figure (float) alignment
#'figure_align': 'htbp',
}

# Grouping the document tree into LaTeX files. List of tuples
# (source start file, target name, title,
#  author, documentclass [howto, manual, or own class]).
latex_documents = [
    (master_doc, 'MHCflurry.tex', 'MHCflurry Documentation',
     'Timothy O\'Donnell', 'manual'),
]

# The name of an image file (relative to this directory) to place at the top of
# the title page.
#latex_logo = None

# For "manual" documents, if this is true, then toplevel headings are parts,
# not chapters.
#latex_use_parts = False

# If true, show page references after internal links.
#latex_show_pagerefs = False

# If true, show URL addresses after external links.
#latex_show_urls = False

# Documents to append as an appendix to all manuals.
#latex_appendices = []

# If false, no module index is generated.
#latex_domain_indices = True


# -- Options for manual page output ---------------------------------------

# One entry per manual page. List of tuples
# (source start file, name, description, authors, manual section).
man_pages = [
    (master_doc, 'mhcflurry', 'MHCflurry Documentation',
     [author], 1)
]

# If true, show URL addresses after external links.
#man_show_urls = False


# -- Options for Texinfo output -------------------------------------------

# Grouping the document tree into Texinfo files. List of tuples
# (source start file, target name, title, author,
#  dir menu entry, description, category)
texinfo_documents = [
    (master_doc, 'MHCflurry', 'MHCflurry Documentation',
     author, 'MHCflurry', 'One line description of project.',
     'Miscellaneous'),
]

# Documents to append as an appendix to all manuals.
#texinfo_appendices = []

# If false, no module index is generated.
#texinfo_domain_indices = True

# How to display URL addresses: 'footnote', 'no', or 'inline'.
#texinfo_show_urls = 'footnote'

# If true, do not generate a @detailmenu in the "Top" node's menu.
#texinfo_no_detailmenu = False


================================================
FILE: docs/doctest.sh
================================================
#!/bin/bash

sphinx-build -b doctest -d _build/doctrees . _build/doctest
RETVAL=$?
echo doctest returned $RETVAL
cat _build/doctest/output.txt
exit $RETVAL


================================================
FILE: docs/example.fasta
================================================
>protein1
MSSSSTPVCPNGPGNCQV
>protein2
MVENKRLLEGMEMIFGQVIPGA


================================================
FILE: docs/index.rst
================================================
MHCflurry documentation
=====================================

.. toctree::
   :maxdepth: 3

   intro
   commandline_tutorial
   python_tutorial
   model-info/allele_motifs
   commandline_tools
   api



================================================
FILE: docs/intro.rst
================================================
Introduction and setup
=======================

MHCflurry is an open source package for peptide/MHC I binding affinity prediction. It
aims to provide competitive accuracy with a fast and documented implementation.

You can download pre-trained MHCflurry models fit to mass spec-identified MHC I
ligands and peptide/MHC affinity measurements deposited in IEDB (plus a few other
sources) or train a MHCflurry predictor on your own data.

Starting in version 1.6.0, the default MHCflurry binding affinity predictors
are "pan-allele" models that support most sequenced MHC I alleles across humans
and a few other species (about 14,000 alleles in total). This version also
introduces two experimental predictors, an "antigen processing" predictor
that attempts to model MHC allele-independent effects such as proteasomal
cleavage and a "presentation" predictor that integrates processing predictions
with binding affinity predictions to give a composite "presentation score." Both
models are trained on mass spec-identified MHC ligands.

MHCflurry supports Python 3.10+. It uses the `PyTorch <https://pytorch.org/>`__
neural network library. GPUs and Apple Silicon (MPS) may optionally be used for
a speed improvement and are auto-detected.

If you find MHCflurry useful in your research, please cite:

    T. J. O'Donnell, et al. "MHCflurry 2.0: Improved pan-allele prediction of MHC
    I-presented peptides by incorporating antigen processing,"
    *Cell Systems*, 2020. https://doi.org/10.1016/j.cels.2020.06.010

    T. J. O'Donnell, et al., "MHCflurry: Open-Source Class I MHC Binding Affinity
    Prediction," *Cell Systems*, 2018. https://doi.org/10.1016/j.cels.2018.05.014

If you have questions or encounter problems, please file an issue at the
MHCflurry github repo: https://github.com/openvax/mhcflurry


Installation (pip)
-------------------

Install the package:

.. code-block:: shell

    $ pip install mhcflurry

Then download our datasets and trained models:

.. code-block:: shell

    $ mhcflurry-downloads fetch

From a checkout you can run the unit tests with:

.. code-block:: shell

    $ pip install pytest
    $ pytest


Using conda
-------------

You can alternatively get up and running with a `conda <https://conda.io/docs/>`__
environment as follows.

.. code-block:: shell

    $ conda create -q -n mhcflurry-env python=3.10
    $ source activate mhcflurry-env

Then continue as above:

.. code-block:: shell

    $ pip install mhcflurry
    $ mhcflurry-downloads fetch


================================================
FILE: docs/python_tutorial.rst
================================================
Python library tutorial
=======================

The MHCflurry Python API exposes additional options and features beyond those
supported by the commandline tools and can be more convenient for interactive
analyses and bioinformatic pipelines. This tutorial gives a basic overview
of the most important functionality. See the :ref:`API-documentation` for further
details.

Loading a predictor
----------------------------------

Most prediction tasks can be performed using the
`~mhcflurry.Class1PresentationPredictor` class, which provides a programmatic API
to the functionality in the :ref:`mhcflurry-predict` and
:ref:`mhcflurry-predict-scan` commands.

Instances of `~mhcflurry.Class1PresentationPredictor` wrap a
`~mhcflurry.Class1AffinityPredictor` to generate binding affinity predictions
and a `~mhcflurry.Class1ProcessingPredictor` to generate antigen processing
predictions. The presentation score is computed using a logistic regression
model over binding affinity and processing predictions.

Use the `~mhcflurry.Class1PresentationPredictor.load` static method to load a
trained predictor from disk. With no arguments this method will load the predictor
released with MHCflurry (see :ref:`downloading`\ ). If you pass a path to a
models directory, then it will load that predictor instead.

.. doctest::

    >>> from mhcflurry import Class1PresentationPredictor
    >>> predictor = Class1PresentationPredictor.load()
    >>> predictor.supported_alleles[:5]
    ['Atbe-B*01:01', 'Atbe-E*03:01', 'Atbe-G*03:01', 'Atbe-G*03:02', 'Atbe-G*06:01']

Predicting for individual peptides
----------------------------------

To generate predictions for individual peptides, we can use the
`~mhcflurry.Class1PresentationPredictor.predict` method of the `~mhcflurry.Class1PresentationPredictor`,
loaded above. This method returns a `pandas.DataFrame` with binding affinity, processing, and presentation
predictions:

.. doctest::

    >>> predictor.predict(
    ...     peptides=["SIINFEKL", "NLVPMVATV"],
    ...     alleles=["HLA-A0201", "HLA-A0301"],
    ...     verbose=0)
         peptide  peptide_num sample_name      affinity best_allele  processing_score  presentation_score
    0   SIINFEKL            0     sample1  12906.786173   HLA-A0201          0.101473            0.012503
    1  NLVPMVATV            1     sample1     15.038358   HLA-A0201          0.676289            0.975463

Here, the list of alleles is taken to be an individual's MHC I genotype (i.e. up
to 6 alleles), and the strongest binder across alleles for each peptide is
reported.
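
The "strongest binder across alleles" logic can be sketched in pandas:
predict per allele, then keep the row with the lowest affinity for each
peptide. The numbers below are invented for illustration; this is the idea,
not MHCflurry's internal code.

```python
import pandas as pd

# Made-up per-allele affinities for two peptides against a genotype.
per_allele = pd.DataFrame({
    "peptide": ["SIINFEKL", "SIINFEKL", "NLVPMVATV", "NLVPMVATV"],
    "allele": ["HLA-A*02:01", "HLA-A*03:01", "HLA-A*02:01", "HLA-A*03:01"],
    "affinity": [12906.8, 28000.0, 15.0, 9000.0],
})

# Lower nM means stronger binding, so take the per-peptide minimum.
best = per_allele.loc[per_allele.groupby("peptide").affinity.idxmin()]
print(best[["peptide", "allele"]].to_string(index=False))
```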

.. note::

    MHCflurry normalizes allele names using the `mhcgnomes <https://github.com/til-unc/mhcgnomes>`__
    package. Names like ``HLA-A0201`` or ``A*02:01`` will be
    normalized to ``HLA-A*02:01``, so most naming conventions can be used
    with methods such as `~mhcflurry.Class1PresentationPredictor.predict`.

If you have multiple sample genotypes, you can pass a dict, where the
keys are arbitrary sample names:

.. doctest::

    >>> predictor.predict(
    ...     peptides=["KSEYMTSWFY", "NLVPMVATV"],
    ...     alleles={
    ...        "sample1": ["A0201", "A0301", "B0702", "B4402", "C0201", "C0702"],
    ...        "sample2": ["A0101", "A0206", "B5701", "C0202"],
    ...     },
    ...     verbose=0)
          peptide  peptide_num sample_name      affinity best_allele  processing_score  presentation_score
    0  KSEYMTSWFY            0     sample1  16737.745268       A0301          0.381632            0.026550
    1   NLVPMVATV            1     sample1     15.038358       A0201          0.676289            0.975463
    2  KSEYMTSWFY            0     sample2     62.540779       A0101          0.381632            0.796731
    3   NLVPMVATV            1     sample2     15.765500       A0206          0.676289            0.974439

Here the strongest binder for each sample / peptide pair is returned.
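The selection logic can be sketched in plain Python. The affinity values below
are invented for illustration (lower nM means tighter binding); this is not
mhcflurry's internal code:

```python
# Hypothetical per-allele binding affinities (nM) for two sample/peptide
# pairs; lower values indicate tighter binding. Values are made up.
affinities = {
    ("sample1", "NLVPMVATV"): {"A0201": 15.0, "A0301": 24000.0, "B0702": 31000.0},
    ("sample2", "NLVPMVATV"): {"A0101": 28000.0, "A0206": 15.8},
}

# For each sample/peptide pair, report the allele with the lowest
# (i.e. strongest) predicted affinity.
best = {
    key: min(per_allele.items(), key=lambda item: item[1])
    for key, per_allele in affinities.items()
}
print(best[("sample1", "NLVPMVATV")])  # → ('A0201', 15.0)
```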

Many users will focus on the binding affinity predictions, as the
processing and presentation predictions are experimental. If you do use the latter
scores, however, you should provide the upstream (N-flank)
and downstream (C-flank) sequences from the source proteins of the peptides,
when available, for a small boost in accuracy. To do so, specify the ``n_flanks``
and ``c_flanks`` arguments, which give the flanking sequences for the
corresponding peptides:

.. doctest::

    >>> predictor.predict(
    ...     peptides=["KSEYMTSWFY", "NLVPMVATV"],
    ...     n_flanks=["NNNNNNN", "SSSSSSSS"],
    ...     c_flanks=["CCCCCCCC", "YYYAAAA"],
    ...     alleles={
    ...        "sample1": ["A0201", "A0301", "B0702", "B4402", "C0201", "C0702"],
    ...        "sample2": ["A0101", "A0206", "B5701", "C0202"],
    ...     },
    ...     verbose=0)
          peptide   n_flank   c_flank  peptide_num sample_name      affinity best_allele  processing_score  presentation_score
    0  KSEYMTSWFY   NNNNNNN  CCCCCCCC            0     sample1  16737.745268       A0301          0.605816            0.056190
    1   NLVPMVATV  SSSSSSSS   YYYAAAA            1     sample1     15.038358       A0201          0.824994            0.986719
    2  KSEYMTSWFY   NNNNNNN  CCCCCCCC            0     sample2     62.540779       A0101          0.605816            0.897493
    3   NLVPMVATV  SSSSSSSS   YYYAAAA            1     sample2     15.765500       A0206          0.824994            0.986155

Scanning protein sequences
--------------------------

The `~mhcflurry.Class1PresentationPredictor.predict_sequences` method supports
scanning protein sequences for MHC ligands. Here's an example that identifies all
peptides with a predicted binding affinity of 500 nM or tighter to any allele,
across two sample genotypes and two short protein sequences.

.. doctest::

    >>> predictor.predict_sequences(
    ...    sequences={
    ...        'protein1': "MDSKGSSQKGSRLLLLLVVSNLL",
    ...        'protein2': "SSLPTPEDKEQAQQTHH",
    ...    },
    ...    alleles={
    ...        "sample1": ["A0201", "A0301", "B0702"],
    ...        "sample2": ["A0101", "C0202"],
    ...    },
    ...    result="filtered",
    ...    comparison_quantity="affinity",
    ...    filter_value=500,
    ...    verbose=0)
      sequence_name  pos     peptide         n_flank     c_flank sample_name    affinity best_allele  affinity_percentile  processing_score  presentation_score
    0      protein1   13   LLLLVVSNL   MDSKGSSQKGSRL           L     sample1   38.206225       A0201             0.380125          0.017644            0.571060
    1      protein1   14   LLLVVSNLL  MDSKGSSQKGSRLL                 sample1   42.243472       A0201             0.420250          0.090984            0.619213
    2      protein1    5   SSQKGSRLL           MDSKG   LLLVVSNLL     sample2   66.749223       C0202             0.803375          0.383608            0.774468
    3      protein1    6   SQKGSRLLL          MDSKGS    LLVVSNLL     sample2  178.033467       C0202             1.820000          0.275019            0.482206
    4      protein1   13  LLLLVVSNLL   MDSKGSSQKGSRL                 sample1  202.208167       A0201             1.112500          0.058782            0.261320
    5      protein1   12  LLLLLVVSNL    MDSKGSSQKGSR           L     sample1  202.506582       A0201             1.112500          0.010025            0.225648
    6      protein2    0   SSLPTPEDK                    EQAQQTHH     sample1  335.529377       A0301             1.011750          0.010443            0.156798
    7      protein2    0   SSLPTPEDK                    EQAQQTHH     sample2  353.451759       C0202             2.674250          0.010443            0.150753
    8      protein1    8   KGSRLLLLL        MDSKGSSQ      VVSNLL     sample2  410.327286       C0202             2.887000          0.121374            0.194081
    9      protein1    5    SSQKGSRL           MDSKG  LLLLVVSNLL     sample2  477.285937       C0202             3.107375          0.111982            0.168572

When using ``predict_sequences``, the flanking sequences for each peptide are
automatically included in the processing and presentation predictions.
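Conceptually, the scan enumerates every subpeptide of each length together with
its flanking sequences. A minimal sketch of that enumeration follows; the
``enumerate_peptides`` helper is illustrative and is not mhcflurry's
implementation:

```python
def enumerate_peptides(sequence, lengths=(8, 9, 10, 11)):
    # Yield (pos, peptide, n_flank, c_flank) for every subsequence of
    # `sequence` whose length is in `lengths`. Illustrative only.
    for length in lengths:
        for pos in range(len(sequence) - length + 1):
            peptide = sequence[pos:pos + length]
            yield pos, peptide, sequence[:pos], sequence[pos + length:]

nine_mers = list(enumerate_peptides("SSLPTPEDKEQAQQTHH", lengths=(9,)))
print(nine_mers[0])  # → (0, 'SSLPTPEDK', '', 'EQAQQTHH')
```

The first 9-mer matches the ``SSLPTPEDK`` row in the table above, including its
empty N-flank.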

See the documentation for `~mhcflurry.Class1PresentationPredictor` for other
useful methods.


Lower-level interfaces
----------------------

The `~mhcflurry.Class1PresentationPredictor` delegates to a
`~mhcflurry.Class1AffinityPredictor` instance for binding affinity predictions.
If all you need are binding affinities, you can use this instance directly.

Here's an example:

.. doctest::

    >>> from mhcflurry import Class1AffinityPredictor
    >>> predictor = Class1AffinityPredictor.load()
    >>> predictor.predict_to_dataframe(allele="HLA-A0201", peptides=["SIINFEKL", "SIINFEQL"])
        peptide     allele    prediction  prediction_low  prediction_high  prediction_percentile
    0  SIINFEKL  HLA-A0201  12906.786173     8829.460289     18029.923061               6.566375
    1  SIINFEQL  HLA-A0201  13025.300796     9050.056312     18338.004869               6.623625

The ``prediction_low`` and ``prediction_high`` fields give the 5th-95th
percentile range of predictions across the models in the ensemble. This detailed
information is not available through the higher-level
`~mhcflurry.Class1PresentationPredictor` interface.
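As a sketch of what a 5-95 percentile range over an ensemble means, the helper
below applies linear-interpolation percentiles to a handful of made-up per-model
predictions. It does not reproduce mhcflurry's exact aggregation:

```python
def percentile(values, q):
    # Linear-interpolation percentile for q in [0, 100].
    xs = sorted(values)
    pos = (q / 100) * (len(xs) - 1)
    lo = int(pos)
    hi = min(lo + 1, len(xs) - 1)
    return xs[lo] + (xs[hi] - xs[lo]) * (pos - lo)

# Hypothetical per-model affinity predictions (nM) for one peptide/allele.
ensemble = [9000, 10500, 11800, 12900, 13600, 15100, 16800, 18000]
low, high = percentile(ensemble, 5), percentile(ensemble, 95)
print(round(low), round(high))  # → 9525 17580
```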

Under the hood, `Class1AffinityPredictor` itself delegates to an ensemble
of `~mhcflurry.Class1NeuralNetwork` instances, which implement the neural network
models used for prediction. To fit your own affinity prediction models, call
`~mhcflurry.Class1NeuralNetwork.fit`.

You can similarly use `~mhcflurry.Class1ProcessingPredictor` directly for
antigen processing prediction, and there is a low-level
`~mhcflurry.Class1ProcessingNeuralNetwork` with a `~mhcflurry.Class1ProcessingNeuralNetwork.fit` method.

See the API documentation of these classes for details.

================================================
FILE: docs/requirements.txt
================================================
sphinx
sphinxcontrib-programoutput
sphinxcontrib-autoprogram
sphinx-rtd-theme
numpydoc
pypandoc
pydot
tabulate
logomaker
tqdm


================================================
FILE: downloads-generation/README.md
================================================
# Downloads generation

This directory contains code and instructions needed to *generate* the datasets and trained models published with MHCflurry.

If you are only looking to download datasets and trained models, you do not need to use any of this. Just run `mhcflurry-downloads fetch` to download the standard models and datasets.

================================================
FILE: downloads-generation/allele_sequences/GENERATE.sh
================================================
#!/bin/bash
#
# Create allele sequences (sometimes referred to as pseudosequences) by
# performing a global alignment across all MHC amino acid sequences we can get
# our hands on.
#
# Requires: clustalo, wget
#
set -e
set -x

DOWNLOAD_NAME=allele_sequences
SCRATCH_DIR=${TMPDIR-/tmp}/mhcflurry-downloads-generation
SCRIPT_ABSOLUTE_PATH="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)/$(basename "${BASH_SOURCE[0]}")"
SCRIPT_DIR=$(dirname "$SCRIPT_ABSOLUTE_PATH")
export PYTHONUNBUFFERED=1

mkdir -p "$SCRATCH_DIR"
rm -rf "$SCRATCH_DIR/$DOWNLOAD_NAME"
mkdir "$SCRATCH_DIR/$DOWNLOAD_NAME"

# Send stdout and stderr to a logfile included with the archive.
exec >  >(tee -ia "$SCRATCH_DIR/$DOWNLOAD_NAME/LOG.txt")
exec 2> >(tee -ia "$SCRATCH_DIR/$DOWNLOAD_NAME/LOG.txt" >&2)

# Log some environment info
date
pip freeze
git status
which clustalo
clustalo --version

cd $SCRATCH_DIR/$DOWNLOAD_NAME
cp $SCRIPT_DIR/make_allele_sequences.py .
cp $SCRIPT_DIR/select_alleles_to_disambiguate.py .
cp $SCRIPT_DIR/filter_sequences.py .

cp $SCRIPT_DIR/class1_pseudosequences.csv .

cp $SCRIPT_ABSOLUTE_PATH .

# Generate sequences
# Training data is used to decide which additional positions to include in the
# allele sequences to differentiate alleles that have identical traditional
# pseudosequences but have associated training data
TRAINING_DATA="$(mhcflurry-downloads path data_curated)/curated_training_data.csv.bz2"

python select_alleles_to_disambiguate.py \
    "$TRAINING_DATA" \
    --min-count 1000 \
    --out training_data.alleles.txt

# Human
wget -q ftp://ftp.ebi.ac.uk/pub/databases/ipd/imgt/hla/fasta/A_prot.fasta
wget -q ftp://ftp.ebi.ac.uk/pub/databases/ipd/imgt/hla/fasta/B_prot.fasta
wget -q ftp://ftp.ebi.ac.uk/pub/databases/ipd/imgt/hla/fasta/C_prot.fasta
wget -q ftp://ftp.ebi.ac.uk/pub/databases/ipd/imgt/hla/fasta/E_prot.fasta
wget -q ftp://ftp.ebi.ac.uk/pub/databases/ipd/imgt/hla/fasta/F_prot.fasta
wget -q ftp://ftp.ebi.ac.uk/pub/databases/ipd/imgt/hla/fasta/G_prot.fasta

# Mouse
wget -q https://www.uniprot.org/uniprot/P01899.fasta  # H-2 Db
wget -q https://www.uniprot.org/uniprot/P01900.fasta  # H-2 Dd
wget -q https://www.uniprot.org/uniprot/P14427.fasta  # H-2 Dp
wget -q https://www.uniprot.org/uniprot/P14426.fasta  # H-2 Dk
wget -q https://www.uniprot.org/uniprot/Q31145.fasta  # H-2 Dq

wget -q https://www.uniprot.org/uniprot/P01901.fasta  # H-2 Kb
wget -q https://www.uniprot.org/uniprot/P01902.fasta  # H-2 Kd
wget -q https://www.uniprot.org/uniprot/P04223.fasta  # H-2 Kk
wget -q https://www.uniprot.org/uniprot/P14428.fasta  # H-2 Kq

wget -q https://www.uniprot.org/uniprot/P01897.fasta  # H-2 Ld
wget -q https://www.uniprot.org/uniprot/Q31151.fasta  # H-2 Lq

# Various
wget -q ftp://ftp.ebi.ac.uk/pub/databases/ipd/mhc/MHC_prot.fasta

python filter_sequences.py *.fasta --out class1.fasta

time clustalo -i class1.fasta -o class1.aligned.fasta

time python make_allele_sequences.py \
    class1.aligned.fasta \
    --recapitulate-sequences class1_pseudosequences.csv \
    --differentiate-alleles training_data.alleles.txt \
    --out-csv allele_sequences.csv

time python make_allele_sequences.py \
    class1.aligned.fasta \
    --recapitulate-sequences class1_pseudosequences.csv \
    --out-csv allele_sequences.no_differentiation.csv

# Cleanup
gzip -f class1.fasta
gzip -f class1.aligned.fasta
rm *.fasta

cp $SCRIPT_ABSOLUTE_PATH .
bzip2 LOG.txt
RESULT="$SCRATCH_DIR/${DOWNLOAD_NAME}.$(date +%Y%m%d).tar.bz2"
tar -cjf "$RESULT" *
echo "Created archive: $RESULT"


================================================
FILE: downloads-generation/allele_sequences/class1_pseudosequences.csv
================================================
allele,pseudosequence
BoLA-100901,YYSMYREISENVYGSNLYLLYRDYTWEYLNYRWY
BoLA-100902,YYSEYREISENVYESNLYLLYRDYTWEYLNYRWY
BoLA-101901,YHTKYREISENVYGSNLYYDYDYYTWAVFNYRGY
BoLA-102001,YHTKYREISENVYGSNLYFLYMDYTWAVFNYRGY
BoLA-102101,YYTKYREISENVYGSNLYFQFRYYTWADFNYEGY
BoLA-102301,YYSEYREISENVYESNLYIAYSDYTWEYLNYRWY
BoLA-102801,YYTKYREISEKLYENTLYLQFRYYTWADFNYEWY
BoLA-102901,YYTRYREISENLYKNTAYITFMYYTWANENYRGY
BoLA-103101,YYTKYDEISENLYKNTLYIAFRDYTWAYLNYTWY
BoLA-103102,YYTKYDEISENLYKDTLYIAFRDYTWAYLNYTWY
BoLA-104201,YHTKYDEISENLYKDTLYIAYRDYTWEYLNYRGY
BoLA-104901,YYAEYREISDTSFVGTLYIEYEYYTWAYLNYEGY
BoLA-106101,YYTIYREISENVYESNLYFRYDFYTWADFNYRWY
BoLA-106701,YYAMYEMDAEDRSLCTLYFQFTFYTWAAFNYTWY
BoLA-107401,YYTKYREISENLYKNTAYLRFSFYTWAAENYRGY
BoLA-1:00901,YYSMYREISENVYGSNLYLLYRDYTWEYLNYRWY
BoLA-1:00902,YYSEYREISENVYESNLYLLYRDYTWEYLNYRWY
BoLA-1:01901,YHTKYREISENVYGSNLYYDYDYYTWAVFNYRGY
BoLA-1:02001,YHTKYREISENVYGSNLYFLYMDYTWAVFNYRGY
BoLA-1:02101,YYTKYREISENVYGSNLYFQFRYYTWADFNYEGY
BoLA-1:02301,YYSEYREISENVYESNLYIAYSDYTWEYLNYRWY
BoLA-1:02801,YYTKYREISEKLYENTLYLQFRYYTWADFNYEWY
BoLA-1:02901,YYTRYREISENLYKNTAYITFMYYTWANENYRGY
BoLA-1:03101,YYTKYDEISENLYKNTLYIAFRDYTWAYLNYTWY
BoLA-1:03102,YYTKYDEISENLYKDTLYIAFRDYTWAYLNYTWY
BoLA-1:04201,YHTKYDEISENLYKDTLYIAYRDYTWEYLNYRGY
BoLA-1:04901,YYAEYREISDTSFVGTLYIEYEYYTWAYLNYEGY
BoLA-1:06101,YYTIYREISENVYESNLYFRYDFYTWADFNYRWY
BoLA-1:06701,YYAMYEMDAEDRSLCTLYFQFTFYTWAAFNYTWY
BoLA-1:07401,YYTKYREISENLYKNTAYLRFSFYTWAAENYRGY
BoLA-200501,YYAEYRNIYDTIFVDTLYIAYWFYTWAAWNYEWY
BoLA-200601,YSAEYRNIYDTTFVYTLYLWSMFYTWANENYEWY
BoLA-200602,YSAEYRNIYDTTFVYTLYLWSMFYTWANENYEWH
BoLA-200801,YLIMYRENSETTFANTAYVEYMDYTWADWNYRWY
BoLA-200802,YLIMYRENSETTFANTAYVEYMDYTWADWNYRGY
BoLA-201201,YYATYRENFDTTFVDTLYIAYRDYTWAEHNYTWY
BoLA-201601,YSAEYRNIYDTTFVYTLYLWSMFYTWANENYEGY
BoLA-201602,YSAEYRNIYDTTFVYTLYLWSMFYTWANENYEWY
BoLA-201801,YYADYRNIYDTIFANTAYFEYMFYTWAEQNYRGY
BoLA-201802,YYADYRNIYDTIFANTAYFEYMFYTWAEQNYRGY
BoLA-202201,YHSEYEQIVDTSFVGTLYLLYEDYTRAALNYTGY
BoLA-202501,YSAEYRNIYDTTFVYALYLWSWFYTWAAENYRGY
BoLA-202601,YYAEYREISETTFVDTLYIEYEYYTWAYLNYRGY
BoLA-202602,YYAEYREISETTFVDTLYIEYEYYTWAYLNYRGY
BoLA-202603,YYAEYREISETTFVDTLYIEYEYYTWAYLNYRGY
BoLA-203001,YYSEYRNIYDTNFVSNLYLWSWFYTWANENYEWY
BoLA-203202,YYATYRENLGATFVDTLYIEYRDYTWAYLNYTWY
BoLA-204301,YSEMYRERAGNTFVNTLYIWYRDYTWAVFNYLGY
BoLA-204401,YYAMYEEKADTTFVDTLYIAYRDYTWAVFNYLGY
BoLA-204402,YYAMYEEKADTTFVDTLYIWYRDYTWAVFNYLGY
BoLA-204501,YYATYRENLDTTFVDTLYIEYRDYTWAEFNYLGY
BoLA-204601,YSEMYRERAGNTFVNTLYIWYRDYTWAEQNYTWY
BoLA-204701,YSEMYQERAGNTFVDTLYLWYMDYTWAEQNYTWY
BoLA-204801,YYSEYEQIVDTSFVGTLYLLYMDYTRAAQNYRGY
BoLA-205401,YYIMYQENSGATFANTLYFWYWFYTWANENYRGY
BoLA-205501,YYAEYREISETTFVDSLYIAYRDYTWAYLNYRGY
BoLA-205601,YYATYQENFDATFANTLYFLSTYYTWEAHNYRGY
BoLA-205701,YYIMYREISETTFVDTLYIEYDFYTWEYLNYRGY
BoLA-206001,YSAEYRNIYDTTFVYTLYLWSWFYTWANGNYEGY
BoLA-206201,YYATYQEIQENTFANTLYIEYRDYTWAYFNYRWY
BoLA-206901,YYSEYEQIVDTSFVNTLYLWYRDYTWEAENYRWY
BoLA-207001,YYATYRENLDATFVNTLYLWYRDYTWAERNYRWY
BoLA-207101,YYATYRENLGATFVDTLYIAYSDYTWAEFNYRGY
BoLA-2:00501,YYAEYRNIYDTIFVDTLYIAYWFYTWAAWNYEWY
BoLA-2:00601,YSAEYRNIYDTTFVYTLYLWSMFYTWANENYEWY
BoLA-2:00602,YSAEYRNIYDTTFVYTLYLWSMFYTWANENYEWH
BoLA-2:00801,YLIMYRENSETTFANTAYVEYMDYTWADWNYRWY
BoLA-2:00802,YLIMYRENSETTFANTAYVEYMDYTWADWNYRGY
BoLA-2:01201,YYATYRENFDTTFVDTLYIAYRDYTWAEHNYTWY
BoLA-2:01601,YSAEYRNIYDTTFVYTLYLWSMFYTWANENYEGY
BoLA-2:01602,YSAEYRNIYDTTFVYTLYLWSMFYTWANENYEWY
BoLA-2:01801,YYADYRNIYDTIFANTAYFEYMFYTWAEQNYRGY
BoLA-2:01802,YYADYRNIYDTIFANTAYFEYMFYTWAEQNYRGY
BoLA-2:02201,YHSEYEQIVDTSFVGTLYLLYEDYTRAALNYTGY
BoLA-2:02501,YSAEYRNIYDTTFVYALYLWSWFYTWAAENYRGY
BoLA-2:02601,YYAEYREISETTFVDTLYIEYEYYTWAYLNYRGY
BoLA-2:02602,YYAEYREISETTFVDTLYIEYEYYTWAYLNYRGY
BoLA-2:02603,YYAEYREISETTFVDTLYIEYEYYTWAYLNYRGY
BoLA-2:03001,YYSEYRNIYDTNFVSNLYLWSWFYTWANENYEWY
BoLA-2:03202,YYATYRENLGATFVDTLYIEYRDYTWAYLNYTWY
BoLA-2:04301,YSEMYRERAGNTFVNTLYIWYRDYTWAVFNYLGY
BoLA-2:04401,YYAMYEEKADTTFVDTLYIAYRDYTWAVFNYLGY
BoLA-2:04402,YYAMYEEKADTTFVDTLYIWYRDYTWAVFNYLGY
BoLA-2:04501,YYATYRENLDTTFVDTLYIEYRDYTWAEFNYLGY
BoLA-2:04601,YSEMYRERAGNTFVNTLYIWYRDYTWAEQNYTWY
BoLA-2:04701,YSEMYQERAGNTFVDTLYLWYMDYTWAEQNYTWY
BoLA-2:04801,YYSEYEQIVDTSFVGTLYLLYMDYTRAAQNYRGY
BoLA-2:05401,YYIMYQENSGATFANTLYFWYWFYTWANENYRGY
BoLA-2:05501,YYAEYREISETTFVDSLYIAYRDYTWAYLNYRGY
BoLA-2:05601,YYATYQENFDATFANTLYFLSTYYTWEAHNYRGY
BoLA-2:05701,YYIMYREISETTFVDTLYIEYDFYTWEYLNYRGY
BoLA-2:06001,YSAEYRNIYDTTFVYTLYLWSWFYTWANGNYEGY
BoLA-2:06201,YYATYQEIQENTFANTLYIEYRDYTWAYFNYRWY
BoLA-2:06901,YYSEYEQIVDTSFVNTLYLWYRDYTWEAENYRWY
BoLA-2:07001,YYATYRENLDATFVNTLYLWYRDYTWAERNYRWY
BoLA-2:07101,YYATYRENLGATFVDTLYIAYSDYTWAEFNYRGY
BoLA-300101,YSEMYRERAGNFFVSNLYLWSMFYSMAEQNYRWY
BoLA-300102,YSSMYRERAGNFFVSNLYLWSMFYSMAEQNYRWY
BoLA-300103,YSIMYRERAGNFFVSNLYLWSMFYSMAEQNYRWY
BoLA-300201,YLEMYQEKAGNFFVSNLYLLSMFYSMAEQNYRWY
BoLA-300401,YSEMYRNNAGNSFVNTLYLWSMYYTWAYQNYEWY
BoLA-300402,YSEMYRNNAGNSFVNTLYLWSMYYTWAYQNYEWY
BoLA-300403,YSEMYRNNAGNSFVNTLYLWSMYYTWAYQNYEWY
BoLA-301001,YYSEYRNIYDTTFVDTLYLEYEYYSVAEFNYRGY
BoLA-301101,YSEMYQEKAGTTFANIAYFWYMYYTWAEQNYTWY
BoLA-301701,YSEMYRERAGNIFVSNLYFWYEYYTWAAQNYRWY
BoLA-301702,YSEMYRERAGNIFVSNLYFWYMYYTWAAQNYRWY
BoLA-301703,YSEMYRERAGNIFVSNLYFWYMYYTWAEQNYRWY
BoLA-302701,YSEMYRNNAGNSFVGTLYLWSMYYTWEYQNYEWH
BoLA-302702,YSEMYRNNAGNSFVGTLYLWSMYYTWEYQNYEWH
BoLA-303501,YYNMYQENAGNTFVGTLYLWSEFYTWAAHNYTWY
BoLA-303601,YYAMYRNNADATFVNTLYFLYEYYTVADHNYRWY
BoLA-303701,YSEMYRNNAGNSFVGTLYLLYMDYSRAVQNYRWY
BoLA-303801,YNEMYRNNAGNDSVGTLYLWYMYYSMAVQNYTWY
BoLA-305001,YSEMYRNNAGNTFGSNLYFLYTYYTWAEWNYTWH
BoLA-305002,YSEMYRNNAGNTFGSNLYFWYMYYTWAEQNYTWH
BoLA-305101,YSEMYRERAGNTFVNTLYIWYRDYTWAAENYTWY
BoLA-305201,YYSMYRENSDTGFVDTLYLLYTYYSVAVQNYRWY
BoLA-305301,YSEMYRNNAGNSFVNTLYLWSMYYTWAYQNYEWY
BoLA-305801,YSEMYRERAGNTFVGTLYLWYMDYSRAVQNYRWY
BoLA-305901,YSEMYRNNAGNSFVGTLYLWSMFYTWEYQNYRWH
BoLA-306501,YSEMYQEKAGTSSVGTLYLAYMFYSMAVQNYEWY
BoLA-306601,YYEMYQEKADTTFVDTLYLLYTYYSMAEFNYTWY
BoLA-306602,YYEMYQEKADTTFVDTLYLLYTFYSMAEFNYTWY
BoLA-306801,YSIVYQNNAGTTFANTLYLLYMYYTWAAHNYEWY
BoLA-307301,YYIIYQEISDTSFVSNLYLWYTYYSMAVQNYEWY
BoLA-3:00101,YSEMYRERAGNFFVSNLYLWSMFYSMAEQNYRWY
BoLA-3:00102,YSSMYRERAGNFFVSNLYLWSMFYSMAEQNYRWY
BoLA-3:00103,YSIMYRERAGNFFVSNLYLWSMFYSMAEQNYRWY
BoLA-3:00201,YLEMYQEKAGNFFVSNLYLLSMFYSMAEQNYRWY
BoLA-3:00401,YSEMYRNNAGNSFVNTLYLWSMYYTWAYQNYEWY
BoLA-3:00402,YSEMYRNNAGNSFVNTLYLWSMYYTWAYQNYEWY
BoLA-3:00403,YSEMYRNNAGNSFVNTLYLWSMYYTWAYQNYEWY
BoLA-3:01001,YYSEYRNIYDTTFVDTLYLEYEYYSVAEFNYRGY
BoLA-3:01101,YSEMYQEKAGTTFANIAYFWYMYYTWAEQNYTWY
BoLA-3:01701,YSEMYRERAGNIFVSNLYFWYEYYTWAAQNYRWY
BoLA-3:01702,YSEMYRERAGNIFVSNLYFWYMYYTWAAQNYRWY
BoLA-3:01703,YSEMYRERAGNIFVSNLYFWYMYYTWAEQNYRWY
BoLA-3:02701,YSEMYRNNAGNSFVGTLYLWSMYYTWEYQNYEWH
BoLA-3:02702,YSEMYRNNAGNSFVGTLYLWSMYYTWEYQNYEWH
BoLA-3:03501,YYNMYQENAGNTFVGTLYLWSEFYTWAAHNYTWY
BoLA-3:03601,YYAMYRNNADATFVNTLYFLYEYYTVADHNYRWY
BoLA-3:03701,YSEMYRNNAGNSFVGTLYLLYMDYSRAVQNYRWY
BoLA-3:03801,YNEMYRNNAGNDSVGTLYLWYMYYSMAVQNYTWY
BoLA-3:05001,YSEMYRNNAGNTFGSNLYFLYTYYTWAEWNYTWH
BoLA-3:05002,YSEMYRNNAGNTFGSNLYFWYMYYTWAEQNYTWH
BoLA-3:05101,YSEMYRERAGNTFVNTLYIWYRDYTWAAENYTWY
BoLA-3:05201,YYSMYRENSDTGFVDTLYLLYTYYSVAVQNYRWY
BoLA-3:05301,YSEMYRNNAGNSFVNTLYLWSMYYTWAYQNYEWY
BoLA-3:05801,YSEMYRERAGNTFVGTLYLWYMDYSRAVQNYRWY
BoLA-3:05901,YSEMYRNNAGNSFVGTLYLWSMFYTWEYQNYRWH
BoLA-3:06501,YSEMYQEKAGTSSVGTLYLAYMFYSMAVQNYEWY
BoLA-3:06601,YYEMYQEKADTTFVDTLYLLYTYYSMAEFNYTWY
BoLA-3:06602,YYEMYQEKADTTFVDTLYLLYTFYSMAEFNYTWY
BoLA-3:06801,YSIVYQNNAGTTFANTLYLLYMYYTWAAHNYEWY
BoLA-3:07301,YYIIYQEISDTSFVSNLYLWYTYYSMAVQNYEWY
BoLA-402401,YSIAYEQIVDTTFANTAYIAYSDYTWEYLNYTWY
BoLA-402402,YSIAYEEIVDTTFANTAYLPYSDYTWTYLNYTWY
BoLA-406301,YYSTYRENFETTFVNTLYILYTFYSRAALNYRGY
BoLA-4:02401,YSIAYEQIVDTTFANTAYIAYSDYTWEYLNYTWY
BoLA-4:02402,YSIAYEEIVDTTFANTAYLPYSDYTWTYLNYTWY
BoLA-4:06301,YYSTYRENFETTFVNTLYILYTFYSRAALNYRGY
BoLA-500301,YLIVYEERADHFFRGALYFEYEFYSWASYNYEWY
BoLA-503901,YYIVYQEKADTFFLGTLYLWCWFYTWANENYEWY
BoLA-506401,YYIVYQEKADHTFANTLYLWHWFYTWANENYEWY
BoLA-507201,YYIVYQEKADHFFLGTLYLWYWFYSWAVQNYTWY
BoLA-5:00301,YLIVYEERADHFFRGALYFEYEFYSWASYNYEWY
BoLA-5:03901,YYIVYQEKADTFFLGTLYLWCWFYTWANENYEWY
BoLA-5:06401,YYIVYQEKADHTFANTLYLWHWFYTWANENYEWY
BoLA-5:07201,YYIVYQEKADHFFLGTLYLWYWFYSWAVQNYTWY
BoLA-601301,YHTTYREISENWYEANLYLEYEYYSMAAFNYTWY
BoLA-601302,YHTTYREISENWYEANLYLLYEYYSMAAFNYTWY
BoLA-601401,YHTKYREISENWYEANLYYRYTFYTWAEFNYRGY
BoLA-601402,YHTKYREISENKYEAILYYRYTFYTWAEFNYRWY
BoLA-601501,YYTKYREISENWYEANLYLLYTFYSMADQNYRGY
BoLA-601502,YYTKYREISENWYEANLYLQFTFYSMADQNYRGY
BoLA-603401,YHTKYREISENVYGSNLYLLYTFYSMADRNYRGY
BoLA-604001,YSEMYEERAGIVFVNTLYLWCWFYSMAAGKYTWY
BoLA-604101,YHTKYREISENWYEATLYLEYEYYSMAAFNYRSY
BoLA-6:01301,YHTTYREISENWYEANLYLEYEYYSMAAFNYTWY
BoLA-6:01302,YHTTYREISENWYEANLYLLYEYYSMAAFNYTWY
BoLA-6:01401,YHTKYREISENWYEANLYYRYTFYTWAEFNYRGY
BoLA-6:01402,YHTKYREISENKYEAILYYRYTFYTWAEFNYRWY
BoLA-6:01501,YYTKYREISENWYEANLYLLYTFYSMADQNYRGY
BoLA-6:01502,YYTKYREISENWYEANLYLQFTFYSMADQNYRGY
BoLA-6:03401,YHTKYREISENVYGSNLYLLYTFYSMADRNYRGY
BoLA-6:04001,YSEMYEERAGIVFVNTLYLWCWFYSMAAGKYTWY
BoLA-6:04101,YHTKYREISENWYEATLYLEYEYYSMAAFNYRSY
BoLA-AW10,YSEMYRERAGNFFVSNLYLWSMFYSMAEQNYRWY
BoLA-D18.4,YYSEYREISENVYESNLYIAYSDYTWEYLNYRWY
BoLA-HD6,YHTTYREISENWYEANLYLEYEYYSMAAFNYTWY
BoLA-JSP.1,YLEMYQEKAGNFFVSNLYLLSMFYSMAEQNYRWY
BoLA-N:00101,YSEMYRERAGNFFVSNLYLWSMFYSMAEQNYRWY
BoLA-N:00102,YSSMYRERAGNFFVSNLYLWSMFYSMAEQNYRWY
BoLA-N:00103,YSIMYRERAGNFFVSNLYLWSMFYSMAEQNYRWY
BoLA-N:00201,YLEMYQEKAGNFFVSNLYLLSMFYSMAEQNYRWY
BoLA-N:00301,YLIVYEERADHFFRGALYFEYEFYSWASYNYEWY
BoLA-N:00401,YSEMYRNNAGNSFVNTLYLWSMYYTWAYQNYEWY
BoLA-N:00402,YSEMYRNNAGNSFVNTLYLWSMYYTWAYQNYEWY
BoLA-N:00501,YYAEYRNIYDTIFVDTLYIAYWFYTWAAWNYEWY
BoLA-N:00601,YSAEYRNIYDTTFVYTLYLWSMFYTWANENYEWY
BoLA-N:00602,YSAEYRNIYDTTFVYTLYLWSMFYTWANENYEWH
BoLA-N:00801,YLIMYRENSETTFANTAYVEYMDYTWADWNYRWY
BoLA-N:00802,YLIMYRENSETTFANTAYVEYMDYTWADWNYRGY
BoLA-N:00901,YYSMYREISENVYGSNLYLLYRDYTWEYLNYRWY
BoLA-N:00902,YYSEYREISENVYESNLYLLYRDYTWEYLNYRWY
BoLA-N:01001,YYSEYRNIYDTTFVDTLYLEYEYYSVAEFNYRGY
BoLA-N:01101,YSEMYQEKAGTTFANIAYFWYMYYTWAEQNYTWY
BoLA-N:01201,YYATYRENFDTTFVDTLYIAYRDYTWAEHNYTWY
BoLA-N:01301,YHTTYREISENWYEANLYLEYEYYSMAAFNYTWY
BoLA-N:01302,YHTTYREISENWYEANLYLLYEYYSMAAFNYTWY
BoLA-N:01401,YHTKYREISENWYEANLYYRYTFYTWAEFNYRGY
BoLA-N:01402,YHTKYREISENKYEAILYYRYTFYTWAEFNYRWY
BoLA-N:01501,YYTKYREISENWYEANLYLLYTFYSMADQNYRGY
BoLA-N:01502,YYTKYREISENWYEANLYLQFTFYSMADQNYRGY
BoLA-N:01601,YSAEYRNIYDTTFVYTLYLWSMFYTWANENYEGY
BoLA-N:01602,YSAEYRNIYDTTFVYTLYLWSMFYTWANENYEWY
BoLA-N:01701,YSEMYRERAGNIFVSNLYFWYEYYTWAAQNYRWY
BoLA-N:01702,YSEMYRERAGNIFVSNLYFWYMYYTWAAQNYRWY
BoLA-N:01801,YYADYRNIYDTIFANTAYFEYMFYTWAEQNYRGY
BoLA-N:01802,YYADYRNIYDTIFANTAYFEYMFYTWAEQNYRGY
BoLA-N:01901,YHTKYREISENVYGSNLYYDYDYYTWAVFNYRGY
BoLA-N:02001,YHTKYREISENVYGSNLYFLYMDYTWAVFNYRGY
BoLA-N:02101,YYTKYREISENVYGSNLYFQFRYYTWADFNYEGY
BoLA-N:02201,YHSEYEQIVDTSFVGTLYLLYEDYTRAALNYTGY
BoLA-N:02301,YYSEYREISENVYESNLYIAYSDYTWEYLNYRWY
BoLA-N:02401,YSIAYEQIVDTTFANTAYIAYSDYTWEYLNYTWY
BoLA-N:02402,YSIAYEEIVDTTFANTAYLPYSDYTWTYLNYTWY
BoLA-N:02501,YSAEYRNIYDTTFVYALYLWSWFYTWAAENYRGY
BoLA-N:02601,YYAEYREISETTFVDTLYIEYEYYTWAYLNYRGY
BoLA-N:02602,YYAEYREISETTFVDTLYIEYEYYTWAYLNYRGY
BoLA-N:02701,YSEMYRNNAGNSFVGTLYLWSMYYTWEYQNYEWH
BoLA-N:02702,YSEMYRNNAGNSFVGTLYLWSMYYTWEYQNYEWH
BoLA-N:02801,YYTKYREISEKLYENTLYLQFRYYTWADFNYEWY
BoLA-N:02901,YYTRYREISENLYKNTAYITFMYYTWANENYRGY
BoLA-N:03001,YYSEYRNIYDTNFVSNLYLWSWFYTWANENYEWY
BoLA-N:03101,YYTKYDEISENLYKNTLYIAFRDYTWAYLNYTWY
BoLA-N:03401,YHTKYREISENVYGSNLYLLYTFYSMADRNYRGY
BoLA-N:03501,YYNMYQENAGNTFVGTLYLWSEFYTWAAHNYTWY
BoLA-N:03601,YYAMYRNNADATFVNTLYFLYEYYTVADHNYRWY
BoLA-N:03701,YSEMYRNNAGNSFVGTLYLLYMDYSRAVQNYRWY
BoLA-N:03801,YNEMYRNNAGNDSVGTLYLWYMYYSMAVQNYTWY
BoLA-N:03901,YYIVYQEKADTFFLGTLYLWCWFYTWANENYEWY
BoLA-N:04001,YSEMYEERAGIVFVNTLYLWCWFYSMAAGKYTWY
BoLA-N:04101,YHTKYREISENWYEATLYLEYEYYSMAAFNYRSY
BoLA-N:04201,YHTKYDEISENLYKDTLYIAYRDYTWEYLNYRGY
BoLA-N:04301,YSEMYRERAGNTFVNTLYIWYRDYTWAVFNYLGY
BoLA-N:04401,YYAMYEEKADTTFVDTLYIAYRDYTWAVFNYLGY
BoLA-N:04501,YYATYRENLDTTFVDTLYIEYRDYTWAEFNYLGY
BoLA-N:04601,YSEMYRERAGNTFVNTLYIWYRDYTWAEQNYTWY
BoLA-N:04701,YSEMYQERAGNTFVDTLYLWYMDYTWAEQNYTWY
BoLA-N:04801,YYSEYEQIVDTSFVGTLYLLYMDYTRAAQNYRGY
BoLA-N:04901,YYAEYREISDTSFVGTLYIEYEYYTWAYLNYEGY
BoLA-N:05001,YSEMYRNNAGNTFGSNLYFLYTYYTWAEWNYTWH
BoLA-N:05101,YSEMYRERAGNTFVNTLYIWYRDYTWAAENYTWY
BoLA-N:05201,YYSMYRENSDTGFVDTLYLLYTYYSVAVQNYRWY
BoLA-N:05301,YSEMYRNNAGNSFVNTLYLWSMYYTWAYQNYEWY
BoLA-N:05401,YYIMYQENSGATFANTLYFWYWFYTWANENYRGY
BoLA-N:05501,YYAEYREISETTFVDSLYIAYRDYTWAYLNYRGY
BoLA-N:05601,YYATYQENFDATFANTLYFLSTYYTWEAHNYRGY
BoLA-T2C,YYIIYRNISDTSFVSNLYLLYTYYSMAVQNYEWH
BoLA-T2a,YYATYRENFDTTFVDTLYIAYRDYTWAEHNYTWY
BoLA-T2b,YHTKYREISENWYEATLYLEYEYYSMAAFNYRSY
BoLA-T2c,YYIIYRNISDTSFVSNLYLLYTYYSMAVQNYEWH
BoLA-T5,YYSEYREISENVYESNLYLLYRDYTWEYLNYRWY
BoLA-T7,YLAMYRNNANTTFVNNLYVEHMYYSMAEQNYTWY
BoLA-amani.1,YYATYRENLDATFVNTAYIAYMDYTWEYQNYEWY
BoLA-gb1.7,YSEMYRNNAGNSFVNTLYLWSMYYTWAYQNYEWY
Chi-B0401,YRTYYGQIGLNINENIRRVWFRSYTWEEWNYTWY
Chi-B1201,YRDYYGQIGGNIDENILRVWYYMYTWGYLQYTWY
Chi-B1501,YSDAYSETSRTIDDGTLRVLYSDYTWGYLQYTWY
DLA-8803401,YYAMYGEKVETLYVDTLYITYSDYTRADLNYTWY
DLA-8850101,YYAMYPQTIETTFVDTLYRTYRDYTWAVWNYTWY
DLA-8850801,YYATYGEKVETVYVDTLYITYRDYTWAVWNYTWY
Eqca-100101,YKSMYEETAGHTFGNIAYFWSSFYTWAEHNYRWY
Eqca-1600101,YYTMYRESVGHTFVNTLYLLYFYYTWAAFNYRSY
Eqca-16:00101,YYTMYRESVGHTFVNTLYLLYFYYTWAAFNYRSY
Eqca-1:00101,YKSMYEETAGHTFGNIAYFWSSFYTWAEHNYRWY
Gogo-B0101,YDTMYRETSAQTDENIAYIRFSSYTWAELAYTWY
H-2-Db,YESYYREKAGQWFVSNLYLQSLFYTWSAYAYEWY
H-2-Dd,YVEYYRERAGNSFVDTAYLWAWFYTWAADAYEWY
H-2-Dq,YESYYRIIADNWFVSTAYIRYEFYTWGAYAYEWY
H-2-Kb,YVEYYREKAGNSFVDTLYIVSQYYTWAELAYTWY
H-2-Kd,YVAFYEQRASDWFVSTAYFRFQFYTWADYAYEWY
H-2-Kk,YHSYYRNIAGNIFVNTAYFRYEYYTWADDAYTWY
H-2-Kq,YHSYYRNIADNSSVDTLYIRYEVYTWAARAYAWH
H-2-Ld,YESYYRIIAGQWFVNTLYLWYEFYTWAAYAYEWY
H-2-Lq,YESYYRIIAGQWFVNTLYIRYEYYTWAAYAYEWY
H-2-Qa1,YHIMYREKADMNFVNTLYLWYCEYSSVEQAYPWY
H-2-Qa2,YHSMYREIAGHSFGSTAYLWYLFYTWAIDAYTSY
H2-Db,YESYYREKAGQWFVSNLYLQSLFYTWSAYAYEWY
H2-Dd,YVEYYRERAGNSFVDTAYLWAWFYTWAADAYEWY
H2-Dq,YESYYRIIADNWFVSTAYIRYEFYTWGAYAYEWY
H2-Kb,YVEYYREKAGNSFVDTLYIVSQYYTWAELAYTWY
H2-Kd,YVAFYEQRASDWFVSTAYFRFQFYTWADYAYEWY
H2-Kk,YHSYYRNIAGNIFVNTAYFRYEYYTWADDAYTWY
H2-Kq,YHSYYRNIADNSSVDTLYIRYEVYTWAARAYAWH
H2-Ld,YESYYRIIAGQWFVNTLYLWYEFYTWAAYAYEWY
H2-Lq,YESYYRIIAGQWFVNTLYIRYEYYTWAAYAYEWY
H2-Qa1,YHIMYREKADMNFVNTLYLWYCEYSSVEQAYPWY
H2-Qa2,YHSMYREIAGHSFGSTAYLWYLFYTWAIDAYTSY
HLA-A0101,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A0102,YSAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A0103,YFAMYQENMAHTDANTLYIMYRDYTWVARVYRGY
HLA-A0104,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A0106,YFAMYQENMAHTDANTLYIIYRDYTWVALAYRGY
HLA-A0107,YFAMYQENVAHTDENTLYIIYRDYTWVARVYRGY
HLA-A0108,YFAMYQENMAHTDANTLYIIYRDYTWVARVYWGY
HLA-A0109,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A0110,YFAMYQENMAHTDANTLYIIYRDYTWARRVYRGY
HLA-A0111,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A0112,YFAMYQENMAHTDANTLYIIYRDYTWAVQAYTGY
HLA-A0113,YFAMYQENMAQTDVDTLYIIYRDYTWVARVYRGY
HLA-A0114,YFAMYQENMAHTDANTLYIIYRDYTWVARVYTGY
HLA-A0115,YFAMYQENMAHTDANTLYIIYRDYTWVARVYGGT
HLA-A0117,YFAMYQENMAQTDANTLYIIYRDYTWVARVYRGY
HLA-A0118,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A0119,YFAMYQENMAHTDANTLYIIYRDYTWAVQAYTGY
HLA-A0120,YSAMYQENMAHTDANTLYVRYRDYTWVARVYRGY
HLA-A0121,YFAMYQENMAHTDANTLYIIYRDYTWAVRVYRGY
HLA-A0122,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A0123,YFAMYQENVAHTDANTLYIIYRDYTWVARVYRGY
HLA-A0124,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A0125,YFAMYQENMAHTDANTLYIIYRDYTWVAQVYRGY
HLA-A0126,YFAMYQENMAHTDANTLYIIYRDYTWAARVYRGY
HLA-A01:01,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:02,YSAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:03,YFAMYQENMAHTDANTLYIMYRDYTWVARVYRGY
HLA-A01:04,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:06,YFAMYQENMAHTDANTLYIIYRDYTWVALAYRGY
HLA-A01:07,YFAMYQENVAHTDENTLYIIYRDYTWVARVYRGY
HLA-A01:08,YFAMYQENMAHTDANTLYIIYRDYTWVARVYWGY
HLA-A01:09,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:10,YFAMYQENMAHTDANTLYIIYRDYTWARRVYRGY
HLA-A01:100,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:101,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:102,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:103,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:104,YFAMYQENMAHTHANTLYIIYRDYTWVARVYRGY
HLA-A01:105,YFAMYQENIAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:106,YFAMYQENMAHTDANTLYIIYRDYSWVARVYRGY
HLA-A01:107,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:108,YFAMYQENMAHTNANTLYIIYRDYTWVARVYRGY
HLA-A01:109,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:110,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:111,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:112,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:113,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:114,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:115,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:116,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:117,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:118,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:119,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:12,YFAMYQENMAHTDANTLYIIYRDYTWAVQAYTGY
HLA-A01:120,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:121,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:122,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:124,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:125,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:126,YFAMYQENMAHTDANTLYIIYRDYTWVVRVYRGY
HLA-A01:127,YFAMYQENMAHTDANTLYIIYRDYTWVAQAYRGY
HLA-A01:128,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:129,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:13,YFAMYQENMAQTDVDTLYIIYRDYTWVARVYRGY
HLA-A01:130,YFAMYQENMAHTDANTLYVRCRDYTWVARVYRGY
HLA-A01:131,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:132,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:133,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:134,YFAMYQENMAHTHVNTLYIIYRDYTWVARVYRGY
HLA-A01:135,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:136,YFAMYQENMAHTDANTLYIIYRDYTWAAQAYRGY
HLA-A01:137,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:138,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:139,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:14,YFAMYQENMAHTDANTLYIIYRDYTWVARVYTGY
HLA-A01:140,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:141,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:142,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:143,YTAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:144,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:145,YFAMYQENMAHTDANTLYIIYQDYTWVARVYRGY
HLA-A01:146,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:148,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:149,YFAMYQENMAHTDANTLYIIYRDYTWVARVYGGY
HLA-A01:150,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:151,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:152,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:153,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:154,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:155,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:156,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:157,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:158,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:159,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:161,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:163,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:164,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:165,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:166,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:167,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:168,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:169,YFAMCQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:17,YFAMYQENMAQTDANTLYIIYRDYTWVARVYRGY
HLA-A01:170,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:171,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:172,YFAMYQENMAHTDANTQYIIYRDYTWVARVYRGY
HLA-A01:173,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:174,YFAMYQENMAHTDANTLYIIYRDHTWVARVYRGY
HLA-A01:175,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:176,YFAMYQENVAQTDVDTLYIIYRDYTWVARVYRGY
HLA-A01:177,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:180,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:181,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:182,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:183,YFAMYQENMAHTDANILYIIYRDYTWVARVYRGY
HLA-A01:184,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:185,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:187,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:188,YSAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:189,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:19,YFAMYQENMAHTDANTLYIIYRDYTWAVQAYTGY
HLA-A01:190,YSAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:191,YFAMYQEKVAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:192,YFAMYQENMAHTDANTLYIMYRDYTWAARVYRGY
HLA-A01:193,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:194,YFAMYQENMAQTDVDTLYIIYRDYTWVARVYRGY
HLA-A01:195,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:196,YFAMYQENMAHTDANTLYIIYRDYTWVERVYRGY
HLA-A01:197,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:198,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:199,YFSMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:20,YSAMYQENMAHTDANTLYVRYRDYTWVARVYRGY
HLA-A01:200,YFAMYQENMAHTDANTLYIIYRDYTWAVLAYTWY
HLA-A01:201,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:202,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:203,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:204,YFAMYQENMTHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:205,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:206,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:207,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:209,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:21,YFAMYQENMAHTDANTLYIIYRDYTWAVRVYRGY
HLA-A01:210,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:211,YFAMYQENMAHSDANTLYIIYRDYTWVARVYRGY
HLA-A01:212,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:213,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:214,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:215,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:216,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:217,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:218,YFAMYQENMAHTDANTLYIIYRGYTWVARVYRGY
HLA-A01:219,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:220,HFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:221,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:222,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:223,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:224,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:225,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:226,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:227,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:229,YFAMYQENMAHTHVDTLYIIYRDYTWVARVYRGY
HLA-A01:23,YFAMYQENVAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:230,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:231,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:232,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:233,YFAMYQENMAHTDANTLYIIYRDYTWVARIYRGY
HLA-A01:234,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:235,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:236,YFAMYQENMAHTDANTLYIIYHYYTWVARVYRGY
HLA-A01:237,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:238,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:239,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:24,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:241,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:242,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:243,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:244,YFAMYQENMAHTDANTLYIIYRDYTWAVLAYTWY
HLA-A01:245,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:246,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:249,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:25,YFAMYQENMAHTDANTLYIIYRDYTWVAQVYRGY
HLA-A01:251,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:252,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:253,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:254,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:255,YFAMYQENMAHTDANTLYITYRDYTWVARVYRGY
HLA-A01:256,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:257,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:259,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:26,YFAMYQENMAHTDANTLYIIYRDYTWAARVYRGY
HLA-A01:260,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:261,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:262,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:263,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:264,YFAMYQENMAHTDANTLYIIYRDYTWFARVYRGY
HLA-A01:265,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:266,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:267,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:268,YFAMYQENMAHTDANTLYIIYRDQTWVARVYRGY
HLA-A01:270,YFAMYQENMAHTGANTLYIIYRDYTWVARVYRGY
HLA-A01:271,YFAMYQENMAHTDANTLYIIYWDYTWVARVYRGY
HLA-A01:272,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:273,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:274,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:275,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:276,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:277,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:278,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:279,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:28,YFAMYQENMAHTDVDTLYIIYRDYTWVARVYRGY
HLA-A01:280,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:281,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:282,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:283,YFTMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:284,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRRY
HLA-A01:286,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:288,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:289,YFAMYQENMAHTDENIAYIIYRDYTWVARVYRGY
HLA-A01:29,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:291,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:292,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:294,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:295,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:296,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:297,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:30,YFAMYQENMAHTDANTLYIIYHYYTWVARVYRGY
HLA-A01:32,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:33,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:35,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:36,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:37,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:38,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:39,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:40,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:41,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:42,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:43,YYAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:44,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:45,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:46,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:47,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:48,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:49,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:50,YFAMYQENMAHTDANTLYIIYREYTWVARVYRGY
HLA-A01:51,YFAMYRNNVAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:54,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:55,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:58,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:59,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:60,YFAMYPENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:61,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:62,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:63,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:64,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:65,YFAMYQENMAHTDANTLYIIYRDYTWVARVCRGY
HLA-A01:66,YFAMYQENMAHTDANTLYVRYRDYTWVARVYRGY
HLA-A01:67,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:68,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:69,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:70,YFAMYQENMAHTDANTLYIIYRDYTCVARVYRGY
HLA-A01:71,YFAMYQDNMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:72,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRWY
HLA-A01:73,YFAMYQENMAHTDANTLYLRYRDYTWVARVYRGY
HLA-A01:74,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:75,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:76,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:77,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:78,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:79,YFAMYQENMAHTDANTLYIIYPDYTWVARVYRGY
HLA-A01:80,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:81,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:82,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:83,YFAMYGEKVAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:84,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:85,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:86,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:88,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:89,YFAMYQENMAHTDANTLYLIYRDYTWVARVYRGY
HLA-A01:90,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:91,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:92,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:93,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:94,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:95,YFAMYQENMAHTDENIAYIIYRDYTWVARVYRGY
HLA-A01:96,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:97,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A01:98,YFAMYQENMAHTDANTLYIIYRDYTWVARAYRGY
HLA-A01:99,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY
HLA-A0201,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A0202,YFAMYGEKVAHTHVDTLYLRYHYYTWAVWAYTWY
HLA-A0203,YFAMYGEKVAHTHVDTLYVRYHYYTWAEWAYTWY
HLA-A0204,YFAMYGEKVAHTHVDTLYVMYHYYTWAVLAYTWY
HLA-A0205,YYAMYGEKVAHTHVDTLYLRYHYYTWAVWAYTWY
HLA-A0206,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A0207,YFAMYGEKVAHTHVDTLYVRCHYYTWAVLAYTWY
HLA-A0208,YYAMYGENVAHTHVDTLYLRYHYYTWAVWAYTWY
HLA-A0209,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A0210,YYAMYGEKVAHTHVDTLYVRFHYYTWAVLAYTWY
HLA-A0211,YFAMYGEKVAHIDVDTLYVRYHYYTWAVLAYTWY
HLA-A0212,YFAMYGEKVAHTHVDTLYVRYHYYTWAVQAYTWY
HLA-A0213,YFAMYGEKVAHTHVDTLYVRYHYYTWAEQAYTWY
HLA-A0214,YYAMYGEKVAHTHVDTLYLRYHYYTWAVLAYTWY
HLA-A0215,YFAMYGEKVAHTHVDTLYVRCHYYTWAVLAYTWY
HLA-A0216,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYEWY
HLA-A0217,YFAMYGEKVAHTHVDTLYLMFHYYTWAVLAYTWY
HLA-A0218,YFAMYGEKVAHTHVDTLYVRCHYYTWAVLAYTWY
HLA-A0219,YFAMYGEKVAHTHVDTLYVRYHYYTWAVQAYTGY
HLA-A0220,YFAMYGENVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A0221,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A0222,YFAMYGEKVAHTHVDTLYVRYHYYTWAVWAYTWY
HLA-A0224,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A0225,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A0226,YFAMYGEKVAHTHVDTLYVRYHYYTWAELAYTWY
HLA-A0227,YFAMYGEKVAHTHVDTLYVRYHYYTWAAQAYTWY
HLA-A0228,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A0229,YFAMYGEQVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A0230,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A0231,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A0233,YFAMYGEKVAHTHVDTLYVRSHYYTWAVLAYTWY
HLA-A0234,YFAMYGEKVAQTHVDTLYVRYHYYTWAVLAYTWY
HLA-A0235,YFAMYGEKVAQTDVDTLYVRYHYYTWAVLAYTWY
HLA-A0236,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTGY
HLA-A0237,YFAMYGEKVAHTHVDTLYVRYHYYTWAVQAYTGY
HLA-A0238,YFAMYGEKVAHTHVDTLYVRYHYYTWAEQAYRWY
HLA-A0239,YFAMYGEKVAHTHVDTLYVRFHYYTWAVLAYTWY
HLA-A0240,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A0241,YYAMYGEKVAHTHVDTLYVRYQYYTWAVLAYTWY
HLA-A0242,YFSMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A0243,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A0244,YYAMYGEKVAHTHVDTLYVRYHYYTWAVQAYTWY
HLA-A0245,YFAMYQEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A0246,YFAMYEEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A0247,YFAMYGEKVAHSHVDTLYLRYHYYTWAVWAYTWY
HLA-A0248,YFAMYEEKVAHTDVDTLYVRYHYYTWAVLAYTWY
HLA-A0249,YFAMYGEKVAHTHVDTLYVRYHYYTWAVRAYTWY
HLA-A0250,YFAMYGEKVAHTHVDTLYIRYHYYTWAVWAYTWY
HLA-A0251,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A0252,YFAMYGEKVAHTHVDTLYVRYEHYTWAVLAYTWY
HLA-A0254,YYAMYGEKVAHTHVDTLYVRYHYYTWAVQAYTGY
HLA-A0255,YFAMYRNNVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A0256,YFAMYQENVAQTHVDTLYVRYHYYTWAVLAYTWY
HLA-A0257,YYAMYGEKVAHTHVDTLYLMYHYYTWAVLAYTWY
HLA-A0258,YFAMYGEKVAHTHVDTLYLRYHYYTWAVLAYTWY
HLA-A0259,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A0260,YFAMYGEKVAHTHVDTLYVRYHFYTWAVLAYTWY
HLA-A0261,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A0262,YFAMYGENVAQTHVDTLYVRYHYYTWAVLAYTWY
HLA-A0263,YFAMYGEKVAHTHVDTLYLRYHYYTWAVWAYTWY
HLA-A0264,YFAMYGEKVAHTHVDTLYVRYHSYTWAVLAYTWY
HLA-A0265,YFAMYGEKVAHTHVDTLYIMYQDYTWAVLAYTWY
HLA-A0266,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A0267,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A0268,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A0269,YFAMYGEKVAHIDVDTLYVRYHYYTWAVLAYTWY
HLA-A0270,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A0271,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A0272,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A0273,YFAMYGEKVAHTHVDTLYIRYHYYTWAVLAYTWY
HLA-A0274,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A0275,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A0276,YSAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A0277,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A0278,YYAMYQENVAQTDVDTLYVRYHYYTWAVLAYTWY
HLA-A0279,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A0280,YFAMYGEKVAHTHVDTLYVRYQDYTWAVLAYTWY
HLA-A0281,YFAMYGEKVAHTDESIAYVRYHYYTWAVLAYTWY
HLA-A0283,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A0284,YYAMYGEKVAHTHVDTLYFRYHYYTWAVLAYTWY
HLA-A0285,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A0286,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A0287,YFAMYGEKVAHTDENIAYVRYHYYTWAVLAYTWY
HLA-A0289,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A0290,YFAMYGEKVAHTDVDTLYVRYHYYTWAVLAYTWY
HLA-A0291,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A0292,YFAMYEEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A0293,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A0295,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A0296,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A0297,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A0299,YYAMYGEKVAHTHVDTLYVRYHYYTWAELAYTWY
HLA-A02:01,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:02,YFAMYGEKVAHTHVDTLYLRYHYYTWAVWAYTWY
HLA-A02:03,YFAMYGEKVAHTHVDTLYVRYHYYTWAEWAYTWY
HLA-A02:04,YFAMYGEKVAHTHVDTLYVMYHYYTWAVLAYTWY
HLA-A02:05,YYAMYGEKVAHTHVDTLYLRYHYYTWAVWAYTWY
HLA-A02:06,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:07,YFAMYGEKVAHTHVDTLYVRCHYYTWAVLAYTWY
HLA-A02:08,YYAMYGENVAHTHVDTLYLRYHYYTWAVWAYTWY
HLA-A02:09,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:10,YYAMYGEKVAHTHVDTLYVRFHYYTWAVLAYTWY
HLA-A02:101,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYRWY
HLA-A02:102,YFAMYGEKVAHTHVDTLYLRYHYYTWAVWAYTWY
HLA-A02:103,YFAMYQENVAQTHVDTLYVRCHYYTWAVLAYTWY
HLA-A02:104,YFAMYGEKVAHTHVDTLYVRYHYYTWAVWAYTWY
HLA-A02:105,YFAMYGEKVAHTHVDTLYVRYEYYTWAVLAYTWY
HLA-A02:106,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:107,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:108,YYAMYGEKVAHTHVDTLYLMFHYYTWAVLAYTWY
HLA-A02:109,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:11,YFAMYGEKVAHIDVDTLYVRYHYYTWAVLAYTWY
HLA-A02:110,YFAMYGEKVAHTHVDTLYLMFHYYTWAVLAYTWY
HLA-A02:111,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:112,YFAMYGEKVAHTDENIAYVRCHYYTWAVLAYTWY
HLA-A02:114,YFAMYGEKVAHTHVDTLYVRYRDYTWAVLAYTWY
HLA-A02:115,YFAMYGEKVAHTHVDTLYLRYHYYTWAVWAYTWY
HLA-A02:116,YFAMYGEKVAHTHLDTLYVRYHYYTWAVLAYTWY
HLA-A02:117,YFAMYGEKVAHTHVDTLYVRYQDYTWAEWAYTWY
HLA-A02:118,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:119,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:12,YFAMYGEKVAHTHVDTLYVRYHYYTWAVQAYTWY
HLA-A02:120,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:121,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:122,YYAMYGEKVAHTHVDTLYIRYHYYTWAVWAYTWY
HLA-A02:123,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:124,YFAMYGEKVAHTDESIAYVRYHYYTWAVLAYTWY
HLA-A02:126,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:127,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYKWY
HLA-A02:128,YFAMYGENVAHIDVDTLYVRYHYYTWAVLAYTWY
HLA-A02:129,YYAMYEEKVAHTDENIAYVRYHYYTWAVLAYTWY
HLA-A02:13,YFAMYGEKVAHTHVDTLYVRYHYYTWAEQAYTWY
HLA-A02:130,YFAMYGEKVAHTHVDTLYVRCHYYTWAVLAYTWY
HLA-A02:131,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYEWY
HLA-A02:132,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:133,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:134,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:135,YFAMYGEKVAHTHVDTLYIRYQDYTWAEWAYRWY
HLA-A02:136,YFAMYGEKVAHTDENIAYVRYHYYTWAVWAYTWY
HLA-A02:137,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:138,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:139,YFAMYGEKVTHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:14,YYAMYGEKVAHTHVDTLYLRYHYYTWAVLAYTWY
HLA-A02:140,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:141,YFVMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:142,YYAMYGEKVAHTHVDTLYVRYHYYTWAVQAYTWY
HLA-A02:143,YYAMYREKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:144,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:145,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:146,YFAMYGEKVAHTDANTLYVRYHYYTWAVLAYTWY
HLA-A02:147,YFAMYGEKVAHTHVDTLYVRYDYYTWAVLAYTWY
HLA-A02:148,YFAMYGEKVAHTHVDTLYVRFHYYTWAEWAYTWY
HLA-A02:149,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:150,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:151,YFAMYGEKVAHTHVDTLYVRYDYYTWAVLAYTWY
HLA-A02:152,YFAMYGEKVAHTHVDTLYIMYQDYTWAVLAYTWY
HLA-A02:153,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:154,YYAMYGEKVAHTHVDTLYLRYHYYTWAVWAYRWY
HLA-A02:155,YFAMYGEKVAHTHVDTLYLRYHYYTWAVWAYTWY
HLA-A02:156,YFAMYGEKVAHTHVDTLYIIYHYYTWAVLAYTWY
HLA-A02:157,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:158,YFAMYGEKVAHAHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:159,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:16,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYEWY
HLA-A02:160,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:161,YFAVYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:162,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:163,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:164,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:165,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:166,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:167,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYKWY
HLA-A02:168,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:169,YYAMYQENVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:17,YFAMYGEKVAHTHVDTLYLMFHYYTWAVLAYTWY
HLA-A02:170,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:171,YFAMYGEKVAHTHVDTLYVRYHYYTWAELAYTWY
HLA-A02:172,YYAMYGEKVAHTHVDTLYLRYHYYTWAVWAYTWY
HLA-A02:173,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:174,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:175,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:176,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:177,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:178,YYAMYGEKVAHTHVDTLYVRYHSYTWAVLAYTWY
HLA-A02:179,YYAMYGEKVAHTHVDTLYLRYHYYTWAVWAYTWY
HLA-A02:18,YFAMYGEKVAHTHVDTLYVRCHYYTWAVLAYTWY
HLA-A02:180,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:181,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:182,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:183,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:184,YFAMYGEKVAHTHEDTLYVRYHYYTWAVLAYTWY
HLA-A02:185,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:186,YFAMYGEKVAHTHVDTLYLRYHYYTWAVWAYTWY
HLA-A02:187,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:188,YFAMYGEKVAHTHVDTLYVRYDSYTWAVLAYTWY
HLA-A02:189,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:19,YFAMYGEKVAHTHVDTLYVRYHYYTWAVQAYTGY
HLA-A02:190,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:191,YFAMYGEKVAHTHVDTLYVRCHYYTWAVWAYTWY
HLA-A02:192,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:193,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:194,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:195,YFAMYQENVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:196,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:197,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:198,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:199,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:20,YFAMYGENVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:200,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:201,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:202,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:203,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:204,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:205,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:206,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:207,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:208,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:209,YFAMYGEKVAHTHVDTLYLRYHYYTWAVWAYTWY
HLA-A02:21,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:210,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:211,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:212,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:213,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:214,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:215,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:216,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:217,YFAMYREKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:218,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:219,YFAMYGEKVAHTHVDTLYVRCHYYTWAVLAYTWY
HLA-A02:22,YFAMYGEKVAHTHVDTLYVRYHYYTWAVWAYTWY
HLA-A02:220,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:221,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:224,YFAMYGEKVAHTHVDTLYVGYHYYTWAVLAYTWY
HLA-A02:228,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:229,YYAMYGEKVAHTHVDTLYLRYRYYTWAVWAYTWY
HLA-A02:230,YFAMYGEKVAHTHVDTLYVRYHYYTWAEWAYTWY
HLA-A02:231,YFAMYGEKVAHTHVDTLYVRNHYYTWAVLAYTWY
HLA-A02:232,YYAMYGEKVAHTHVDTLYLRYHYYTWAVWAYTWY
HLA-A02:233,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTRY
HLA-A02:234,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:235,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:236,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:237,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:238,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:239,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:24,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:240,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:241,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:242,YFAMYGEKVAHTHVDTLYVRFHYYTWAVLAYTWY
HLA-A02:243,YTAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:244,YYAMYGEKVAHTHVDTLYVRFHYYTWAVLAYTWY
HLA-A02:245,YFAMYGEKVAHTHVDTLYIRYHYYTWAVLAYTWY
HLA-A02:246,YFAMYGEKVAHTHVDTLYVRYRDYTWAVLAYTWY
HLA-A02:247,YFAMYGEKVAHTDENTLYVRYHYYTWAVLAYTWY
HLA-A02:248,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:249,YFAMYVEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:25,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:251,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:252,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:253,YFAMYGEKVAHTHVDTLYVRYHYYTWAEWAYTWY
HLA-A02:254,YFAMYGEKVAHTHVDTLYVRYNFYTWAVLAYTWY
HLA-A02:255,YFAMYGEKVAHTHVDTLYVRCHYYTWAVLAYTGY
HLA-A02:256,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:257,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:258,YFAMYGEKVAHTHVDTLYVRYHYYTWAEWAYTWY
HLA-A02:259,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:26,YFAMYGEKVAHTHVDTLYVRYHYYTWAELAYTWY
HLA-A02:260,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:261,YFAMYGEKVAHTHMDTLYVRCHYYTWAVLAYTWY
HLA-A02:262,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLVYTWY
HLA-A02:263,YFAMYGEKVAHTHVDTLYVRYHYYTWSVLAYTWY
HLA-A02:264,YFAMYGEKVAHTHVDTLYVRYHYYTWAEWAYTWY
HLA-A02:265,YFAMYGEKVAHTHVDTLYVRCHYYTWAVLAYTWY
HLA-A02:266,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:267,YFAMYGEKVAHTHVDTLYVRYHYYTWAAWAYTWY
HLA-A02:268,YFAMYGEKVAHTHVDTLYVMFHYYTWAVLAYTWY
HLA-A02:269,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:27,YFAMYGEKVAHTHVDTLYVRYHYYTWAAQAYTWY
HLA-A02:270,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:271,YYAMYGEKVAHTHVDTLYLRYHYYTWAVQAYTWY
HLA-A02:272,YFAMYGEKLAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:273,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:274,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:275,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:276,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:277,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:278,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:279,YFAMYGEKVAHTHVDTLYVRYRDYTWAVLAYTWY
HLA-A02:28,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:280,YFAMYGEKVAHTHVDTLYVRYHYYTWAEQAYTWY
HLA-A02:281,YFAMYGEKVAHTHVDILYVRYHYYTWAEWAYTWY
HLA-A02:282,YFAMYGEKVAHTHVDTLYVRCHYYTWAVLAYTWY
HLA-A02:283,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:285,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:286,YYAMYGEKVAHTHVDTLYLRYHYYTWAVWAYTWY
HLA-A02:287,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:288,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:289,YFAMYGEKVAHTHVDTLYVRYQYYTWAVLAYTWY
HLA-A02:29,YFAMYGEQVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:290,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:291,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:292,YFAMYGEKVSHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:294,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:295,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:296,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:297,YFAMYGEKVAHIDVDTLYVRYHYYTWAVLAYTWY
HLA-A02:298,YFAMYGEKVAHIDVDTLYVRYHDYTWAVLAYTWY
HLA-A02:299,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:30,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:300,YYAMYGEKVAHTHVDTLYLMFHYYTWAVLAYTWY
HLA-A02:302,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:303,YFAMYGEKVAHTHVDTLYLMFHYYTWAVLAYTWY
HLA-A02:304,YFAMYGEKVAHTHVDTLYVRYQDYTWAVLAYTWY
HLA-A02:306,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:307,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:308,YFAMYGEKVAHIDVDTLYVRYHYYTWAVLAYTWY
HLA-A02:309,YFAMYGEKVAHTHVDTLYVRYQDYTWAVLAYTWY
HLA-A02:31,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:310,YYSMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:311,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:312,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:313,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:315,YFAMYGEKVAHTHVDTLYVRYHYYTWAEWAYTWY
HLA-A02:316,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:317,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:318,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:319,YSAMYGEKVAHTHVDTLYVRCHYYTWAVLAYTWY
HLA-A02:320,YFAMYGEKVAHTHVDTLYLRYHYYTWAVWAYTWY
HLA-A02:322,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:323,YFAMYGEKVAHTHVDTLYVRYHYYTWAEWAYTWY
HLA-A02:324,YYAMYGEKVAHTHVDTLYLRYHYYTWAVWAYTWY
HLA-A02:325,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:326,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:327,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:328,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:329,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:33,YFAMYGEKVAHTHVDTLYVRSHYYTWAVLAYTWY
HLA-A02:330,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:331,YYAMYGEKVAHTDVDTLYVRYHYYTWAVLAYTWY
HLA-A02:332,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWH
HLA-A02:333,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:334,YFAMYGEKVAHTHVDTLYIMYHYYTWAVLAYTWY
HLA-A02:335,YFAMYGEKVAHTHVDTLYVRCHYYTWAVLAYTWY
HLA-A02:336,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:337,YYAMYGEKVAHTHVDTLYLRYHYYTWAVWAYTWY
HLA-A02:338,YFAMYGEKVAHTHVDTLYIIYHYYTWAVLAYTWY
HLA-A02:339,YFAMYGEKVAHTHVDTLYVRYDLYTWAVLAYTWY
HLA-A02:34,YFAMYGEKVAQTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:340,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:341,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:342,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:343,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:344,YYAMYGEKVAHTHVDTLYLRYHYYTWAVWAYTWY
HLA-A02:345,YFAMYGEKVAHTHVDTLYVRYHYYTWAEWAYTWY
HLA-A02:346,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:347,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:348,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:349,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:35,YFAMYGEKVAQTDVDTLYVRYHYYTWAVLAYTWY
HLA-A02:351,YFAMYGEKVARTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:352,CFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:353,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:354,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYAWY
HLA-A02:355,YYAMYGEKVAHTHVDTLYVRYHYYTWAEWAYTWY
HLA-A02:357,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:358,YYAMYEEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:359,YYAMYGEKVAHTHVDTLYLRYHYYTWAVWAYTWY
HLA-A02:36,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTGY
HLA-A02:360,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:361,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:362,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:363,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:364,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:365,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:367,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:368,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:369,YFAMYEEKVAHTHVDTLYVRCHYYTWAVLAYTWY
HLA-A02:37,YFAMYGEKVAHTHVDTLYVRYHYYTWAVQAYTGY
HLA-A02:370,YFAMYGEKVAHTHVDTLYVRYHYYTWAEWAYTWY
HLA-A02:371,YFAMYGEKVAHTHVDTLYVRYHYYIWAVLAYTWY
HLA-A02:372,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:374,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:375,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:376,YYAMYGEKVAHTHVDTLYLRYHYYTWAVWAYLWY
HLA-A02:377,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:378,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:379,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:38,YFAMYGEKVAHTHVDTLYVRYHYYTWAEQAYRWY
HLA-A02:380,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:381,YFAMYGEKVAHTHVDSLYVRYHYYTWAVLAYTWY
HLA-A02:382,YYAMYGEKVAHTHVDTLYVRYHYYTWAVWAYTWY
HLA-A02:383,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:384,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:385,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYMWY
HLA-A02:386,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:387,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:388,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:389,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:39,YFAMYGEKVAHTHVDTLYVRFHYYTWAVLAYTWY
HLA-A02:390,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:391,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:392,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:393,YFAMYGEKVAHTHVDTLYVRYHYYTWAELAYTWY
HLA-A02:394,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:396,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:397,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:398,YYAMYGEKVAHTHVDTLYVRFHYYTWAVLAYTWY
HLA-A02:399,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:40,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:400,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:401,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:402,YFAMYGEKVAHTHVDTLYVRYHYYTWAEWAYTWY
HLA-A02:403,YFAMYGEKVAHTHVDTLYVRCHYYTWAVLTYTWY
HLA-A02:404,YYAMYGEKVAHTHVDTLYVRYHHYTWAVLAYTWY
HLA-A02:405,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:406,YFAMYGEKVAHTHVDTLYVRYHDYTWAVLAYTWY
HLA-A02:407,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:408,YFAMYGEKVAHTHVDTLYVRCHYYTWAALAYTWY
HLA-A02:409,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:41,YYAMYGEKVAHTHVDTLYVRYQYYTWAVLAYTWY
HLA-A02:410,YFAMYAEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:411,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:412,YFAMYGEKVAHTHVDTLYVRYHSYTWAEWAYTWY
HLA-A02:413,YYAMYGEKVAHTHVDTLYLRYHYYTWAVWAYTWY
HLA-A02:414,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:415,YYAMYGENVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:416,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:417,YFAMYGEKVAHTHVDTLYVRYHYYTWAVQAYTGY
HLA-A02:418,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:419,YYAMYREKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:42,YFSMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:420,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:421,YYAMYGEKVAHTHVDTLYLRYHYYTWAVWAYTWY
HLA-A02:422,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:423,YFAMYGEKVAHTHVDTLYVRYHHYTWAVLAYTWY
HLA-A02:424,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:425,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:426,YFAMYGEKVAHTHVDTLYVRCHYYTWAVLAYTWY
HLA-A02:427,YFAMYGEKVAHTHVDTLYVRYHYYTWAEWAYTWY
HLA-A02:428,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:429,YFAMYGEKVAHTHVDTLYVRCHYYTWAVLAYTWY
HLA-A02:430,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:431,YFAMYGEKVAHTHVDTLYVRYHYYTWAEWAYTWY
HLA-A02:432,YFAMYGEKVAHTHVDTLYVRCHYYTWAVLAYTSY
HLA-A02:433,YYAMYGEKVAHTHVDTLYLRYHYYTWAVWAYTWY
HLA-A02:434,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:435,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:436,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:437,YFAMYGEKVAHTHVDTLYVRCHYYTWAVLAYEWY
HLA-A02:438,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:44,YYAMYGEKVAHTHVDTLYVRYHYYTWAVQAYTWY
HLA-A02:441,HFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:442,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:443,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:444,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:445,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:446,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:447,YFAMYGEKVAHTHVDTLYVRYHYYTWAEWAYRWY
HLA-A02:448,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:449,YFAMYGEKVAHTHVDTLYVRCHYYTWAVLAYTWY
HLA-A02:45,YFAMYQEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:450,YFAMYGEKVAHTHVDTLYVRCHYYTWAVLAYTWY
HLA-A02:451,YFAMYGEKVAHTHVDTLYVRCHYYTWAVLAYTWY
HLA-A02:452,YFAMYGEKVAHTHVDTLYVRCHYYTWAVLAYTWY
HLA-A02:453,YYAMYGEKVAHTHVDTLYVRFHYYTWAVLAYTWY
HLA-A02:454,YYAMYGEKVAHTHVDTLYVRYQDYTWAVLAYTWY
HLA-A02:455,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:456,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:457,YFAMYGEKVAHIDVDTLYVRYHYYTWAVLAYTWY
HLA-A02:458,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:459,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:46,YFAMYEEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:460,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:461,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:462,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:463,YFAMYGEKVAHTHVDTLYVRYHYYTWAEWAYTWY
HLA-A02:464,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:465,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:466,YFAMYGEKVAHTHVDTLYVRYHYYTWAEWAYTWY
HLA-A02:467,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:469,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:47,YFAMYGEKVAHSHVDTLYLRYHYYTWAVWAYTWY
HLA-A02:470,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:471,YYAMYGEKVVHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:472,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:473,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:474,YYAMYGEKVAHTHVDTLYVRCHYYTWAVLAYTWY
HLA-A02:475,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:477,YFAMYGEKVAHTHVDTLYVRCHYYTWAVLAYTWY
HLA-A02:478,YFAMYGEKVAHTHVDTLYVRCHYYTWAVLAYTWY
HLA-A02:479,YFAMYGEKVAHSHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:48,YFAMYEEKVAHTDVDTLYVRYHYYTWAVLAYTWY
HLA-A02:480,YFAMYGEKVAHTHVDTLYVRYHYYTWAEWAYTWY
HLA-A02:481,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:482,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:483,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:484,YYAMYGEKVAHTHVDTLYLRYHYYTWAVWAYTWY
HLA-A02:485,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:486,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:487,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYEWY
HLA-A02:488,YFAMYGEKVAHTHVDTLYVRYHYCTWAVLAYTWY
HLA-A02:489,YYAMYGEKVAHTHVDTLYLRYHYYTWAEWAYTWY
HLA-A02:49,YFAMYGEKVAHTHVDTLYVRYHYYTWAVRAYTWY
HLA-A02:491,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:492,YFAMYGEKVAHTHVDTLYLRYHYYTWAVWAYTWY
HLA-A02:493,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:494,YFAMYGEKVAHIDVDTLYVRYHYYTWAVLAYTWY
HLA-A02:495,YYAMYGEKVAHTHVDTLYLRYHYYTWAVWAYTWY
HLA-A02:496,YYAMYGEKVAHTHVDTLYLRYHYYTWAVWAYTWY
HLA-A02:497,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:498,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:499,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:50,YFAMYGEKVAHTHVDTLYIRYHYYTWAVWAYTWY
HLA-A02:502,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:503,YFAMYGEKVAHIDVDTLYVRYHYYTWAVLAYTWY
HLA-A02:504,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:505,YFAMYGEKVAHTHVDTLYVRYHYYTWAEWAYTWY
HLA-A02:507,YYAMYGEKVAHTHVDTLYLRYHYYTWAVWAYTWY
HLA-A02:508,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:509,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:51,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:510,YFAMYGEKVAHTHVDTLYVRYHLYTWAVLAYTWY
HLA-A02:511,YFAMYGEKVAHTHVDTLYVSYHYYTWAVLAYTWY
HLA-A02:512,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:513,YFAMYGEKVAHTHVDTLYVRCHYYTWAVLAYTWY
HLA-A02:515,YFAMYGEKVAHTHMDTLYVRYHYYTWAVLAYTWY
HLA-A02:517,YFAMYGEKVAHTHVDTLYLRYHYYTWAVWAYTWY
HLA-A02:518,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:519,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:52,YFAMYGEKVAHTHVDTLYVRYEHYTWAVLAYTWY
HLA-A02:520,YFAMYGEKVAHTHVDTLYVRYYYYTWAVLAYTWY
HLA-A02:521,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:522,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:523,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:524,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:526,YFAMYGEKVAHTHVDTLYVKYHYYTWAVLAYTWY
HLA-A02:527,YYAMYGEKVAHTHVDTLYLRYRDYTWAVWAYTWY
HLA-A02:528,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:529,YFAMYGEKVAHTHVDTLYVRYHYYTWAEWAYMWY
HLA-A02:530,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:531,YFAMYGEKVAHTHVDTLYLRYHYYTWAVWAYTWY
HLA-A02:532,YYAMYGEKVAHTHVDTLYLRYHYYTWAVWAYTWY
HLA-A02:533,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:534,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:535,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:536,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:537,YFAMYGEKVAHTHVDTLYVRYHYYTWDVLAYTWY
HLA-A02:538,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:539,YFAMYGEKVAHTHVDTLYVRYHYYTLAVLAYTWY
HLA-A02:54,YYAMYGEKVAHTHVDTLYVRYHYYTWAVQAYTGY
HLA-A02:541,YFAMYGEKVAHTHVDTLYVRCHYYTWAELAYTWY
HLA-A02:542,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:543,YFAMYGEKVAHTHVDTLYVRYHYYTWAVQAYRWY
HLA-A02:544,YFAMYGEKVAHTHVDTLYVRCHYYTWAEWAYTWY
HLA-A02:545,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:546,YYAMYGEKVAHTHVDTLYLRYHYYTWAVWAYTWY
HLA-A02:547,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLVYTWY
HLA-A02:548,YFAMYGEKVAHTHVDTLYVRHHYYTWAVLAYTWY
HLA-A02:549,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:55,YFAMYRNNVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:550,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:551,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:552,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:553,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:554,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:555,YFAMYGEKVAHTHVDTLYVRYNYYTWAVLAYTWY
HLA-A02:556,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:557,YFAMYGEKVAHTHVDTLYVRYHYYTWAEWAYTWY
HLA-A02:558,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:559,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:56,YFAMYQENVAQTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:560,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYEWY
HLA-A02:561,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:562,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:563,YFAMYGEKVAHTHVDTLYVRYHYYAWAVLAYTWY
HLA-A02:564,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:565,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:566,YFAMYGEKVAHTHVDTLYVRCHYYTWAVLAYTWY
HLA-A02:567,YFAMYGEKVAHTHVDTLYVRCHYYTWAVLAYTWY
HLA-A02:568,YFAMYGEKVAHTHVDTLYVRYHYYTWAEWAYTWY
HLA-A02:569,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:57,YYAMYGEKVAHTHVDTLYLMYHYYTWAVLAYTWY
HLA-A02:570,YFTMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:571,YFAMYEEKVAHTDENIAYVRYHYYTWAVLAYTWY
HLA-A02:572,YYAMYGEKVAHTHVDTLYLRYHYYTWAVWAYTWY
HLA-A02:573,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:574,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:575,YFAMYGEKVAHTHVDTLYVRCHYYTWAVLAYTWY
HLA-A02:576,YFAMYGEKVAHTHVDTLYVRYHYYTWVVLAYTWY
HLA-A02:577,YYAMYGEKVAHTHGDTLYLRYHYYTWAVWAYTWY
HLA-A02:578,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:579,YFAMYGEKVAHIDVDTLYVRYHYYTWAVLAYTWY
HLA-A02:58,YFAMYGEKVAHTHVDTLYLRYHYYTWAVLAYTWY
HLA-A02:580,YFAMYGEKVAQTDVDTLYVRYHYYTWAVLAYTWY
HLA-A02:581,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYEWY
HLA-A02:582,YFAMYGEKVAHTHVDTLYVRYRDYTWAVWAYTWY
HLA-A02:583,YFAMYGEKVAHTHVDTLYVRCHYYTWAVLAYTWY
HLA-A02:584,YFAMYGEKVAHIDVDTLYVRYHYYTWAVLAYTWY
HLA-A02:585,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:586,YFAMYGEKVAHTHVDTLYVRCHYYTWAVLAYTWY
HLA-A02:587,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:588,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:589,YFAMYGEKVAHIDVDTLYVRYHYYTWAELAYTWY
HLA-A02:59,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:590,YFAMYGEKVAHTHVDTLYVRYHYYTWAALAYTWY
HLA-A02:591,YYAMYGEKVAHTHVDTLYLRYHYYTWAVWAYTWY
HLA-A02:592,YYAMYGEKVAHTHVDTLYVRYHYYTWAELAYTWY
HLA-A02:593,YYAMYGEKVAHTHVDTLYLRYHYYTWAVWAYTWY
HLA-A02:594,YFAMYGEKVAHTHVDTLYVRYNFYTWAVLAYTWY
HLA-A02:595,YFAMYGEKVAHTHVDTLYVRYHYYTWAEWAYTWY
HLA-A02:596,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:597,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:598,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:599,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:60,YFAMYGEKVAHTHVDTLYVRYHFYTWAVLAYTWY
HLA-A02:600,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:601,YFAMYGEKVAHTHVDTLYLRYHYYTWAVWAYTWY
HLA-A02:602,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:603,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:604,YFAMYGEKVAHTHVDTLYVRIHYYTWAVLAYTWY
HLA-A02:606,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:607,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:609,YFAMYGENMAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:61,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:610,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:611,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:612,YFAMYGEKVAHTHVDTLYVRYHYYTWAEWAYTWY
HLA-A02:613,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:614,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:615,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:616,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:617,YFAMYGEKVAHTHVDTLYLMFHYYTWAVLAYTWY
HLA-A02:619,YFAMYGEKVAHTHVDTLYVRYHYYTWAELAYTWY
HLA-A02:62,YFAMYGENVAQTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:620,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:621,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:623,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:624,YFAMYGEKVAHTHVDTLCVRYHYYTWAVLAYTWY
HLA-A02:625,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:626,YYAMYGEKVAHTHVDTLYLRYHYYTWAVWAYTWY
HLA-A02:627,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:628,YFAMYGEKVAHTHVDTLYVRFHYYTWAVLAYTWY
HLA-A02:629,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:63,YFAMYGEKVAHTHVDTLYLRYHYYTWAVWAYTWY
HLA-A02:630,YYAMYGEKVAHTHVDTLYVRFHYYTWAVQAYTWY
HLA-A02:631,YYAMYGEKVAHTHVDTLYLRYHYYTWAVWAYTWY
HLA-A02:632,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:633,YFAMYGEKVAHTHVDTLYVRYHYYTWAEWAYTWY
HLA-A02:634,YFAMYGENVAHTHVDTLYVRYHYYTWAEWAYTWY
HLA-A02:635,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:636,YFAMYGEKVAHTHVDTLYLRYHYYTWAVWAYTWY
HLA-A02:637,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:638,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:639,YFAMYGEKVAHTHVDILYVRYHYYTWAVLAYTWY
HLA-A02:64,YFAMYGEKVAHTHVDTLYVRYHSYTWAVLAYTWY
HLA-A02:640,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:641,YFAMYGEKVAHTHVDTLYVRYHYYTWAEWAYTWY
HLA-A02:642,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:644,YFAMYRNNVAHTDANTLYVRYHYYTWAVLAYTWY
HLA-A02:645,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:646,YYAMYGEKVAHTHVDTLYLRYHYYTWAVWAYTWY
HLA-A02:647,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:648,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:649,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:65,YFAMYGEKVAHTHVDTLYIMYQDYTWAVLAYTWY
HLA-A02:650,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:651,YFAMYGEKVAHTHVDTLYVRYHYYTWAVWAYTWY
HLA-A02:652,YFAMYGEKVAHTHVDTLNVRCHYYTWAVLAYTWY
HLA-A02:653,YFAMYGEKVAHTHVDTLHVRYHYYTWAVLAYTWY
HLA-A02:654,YFAMYGEKVAHTHVDTLYVRYHYYTCAVLAYTWY
HLA-A02:655,YYAMYGEKVAHTHVDTLYLRYHYYTWAVWAYTWY
HLA-A02:656,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:657,YFAMYGEKVAHTHVDTLYLMFHYYTWAVLAYTWY
HLA-A02:658,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:659,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:66,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:660,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:661,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:662,YFAMYGEKVAHTHVDTLYVRYRDYTWAAQAYTWY
HLA-A02:663,YFAMYGEKVAYTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:664,YFAMYGEKVAHTHVDTLYVMYHYYTWAVLAYTWY
HLA-A02:665,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:666,YSAMYGEKVAHTHVDTLYVRYHYYTWAEWAYTWY
HLA-A02:667,YFAMYGEKVAHTHVDTLYLRYHYYTWAVWAYTWY
HLA-A02:668,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:669,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:67,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:670,YYAMYGEKVAHTHVDTLHLRYHYYTWAVWAYTWY
HLA-A02:671,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:673,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:674,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:676,YYAMYGEKVAHTHVDTLYLRYHSYTWAVWAYTWY
HLA-A02:677,YFAMYGEKVDHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:678,YFAMYGEKVAHTHVDTLYVRCHSYTWAVLAYTWY
HLA-A02:679,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:68,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:680,YFAMYGEKVAHTHVDTLYLMFHYYTWAVWAYTWY
HLA-A02:681,YFAMYGEKVAHTHVDTLYVRYRYYTWAVLAYTWY
HLA-A02:682,YFAMYGEKVAHTHVDTLYVRYHYYTWVARAYTWY
HLA-A02:683,YFAMYGEKVAHTHVDTLYVRYHYYTWAVRAYTWY
HLA-A02:684,YFAMYGEKVAHTHVDTLYVRYHYYTWAEWAYTWY
HLA-A02:685,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:686,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:687,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:688,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:689,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:69,YFAMYGEKVAHIDVDTLYVRYHYYTWAVLAYTWY
HLA-A02:690,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:692,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:693,YFAMYGEKVAHTHVDTLYVRYHYYTWAVFAYEWY
HLA-A02:694,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:695,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:697,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:698,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:699,YFAMYGEKVAHTHVDTLYVRYHYYTWAGLAYTWY
HLA-A02:70,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:700,YFAMYGEKVAHTHVDTLYVRYHYYTWAVQVYTWY
HLA-A02:701,YFAMYGEKVAHIDVDTLYVRYHYYTWAVLAYTWY
HLA-A02:702,YFAMYGEKVALTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:703,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:704,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:705,YFAMYGEKVAHIDVDTLYVRYHYYTWAVLAYTWY
HLA-A02:706,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:707,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:708,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLANTWY
HLA-A02:709,YFAMYGEKVAHTHVDTLYLRYHYYTWAVWAYTWY
HLA-A02:71,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:711,YFAMYGEKVAHTHVDTLYVRYHYYTWAEWAYTWY
HLA-A02:712,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:713,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:714,YFAMYGEKVAHIDVDTLYVRYHYYTWAVLAYTWY
HLA-A02:716,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:717,YFAMYGEKVAHTHVDTLYLRYHYYTWAVWAYTWY
HLA-A02:718,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:719,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:72,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:720,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:721,YFAMYGEKVAHTHVDTLYVRCHYYTWAVLAYTWY
HLA-A02:722,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:723,YYAMYGEKVAHTHVDTLYVRYHYYTWAVQAYTGY
HLA-A02:724,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:725,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:726,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:727,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:728,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:729,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:73,YFAMYGEKVAHTHVDTLYIRYHYYTWAVLAYTWY
HLA-A02:730,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:731,YYAMYGEKVAHTHVDTLYLRYHYYTWAVWAYTWY
HLA-A02:732,YFAMYGEKVAHTHVYTLYVRYHYYTWAVLAYTWY
HLA-A02:733,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:734,YFAMYGEKVAHTHVDTLYLRYHYYTWAVWAYTWY
HLA-A02:735,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:736,YFAMYGEKVAHTHVDTLYVWYHYYTWAVLAYTWY
HLA-A02:737,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:738,YFAMYGEKVVHIDVDTLYVRYHYYTWAVLAYTWY
HLA-A02:739,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:74,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:740,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:741,YFAMYRNKVAQTDVDTLYVRYHYYTWAVLAYTWY
HLA-A02:742,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:743,YFAMYGEKVAHTHVDTLYVRYNYYTWAVLAYTWY
HLA-A02:744,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:745,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:746,YFAMYWEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:747,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:749,YFAMYGEKVAHTDANTLYVRYHYYTWAVLAYTWY
HLA-A02:75,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:750,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:751,YFAMYGEKVAHIDVDTLYVRYHYYTWAVLAYTWY
HLA-A02:752,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:753,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:754,YFAMYGEKVAHIDVDTLYVRYHYYTWAVLAYTWY
HLA-A02:755,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:756,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:757,YFAMYGEKVAHIDVDTLYVRYHYYTWAVLAYTWY
HLA-A02:758,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:759,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:76,YSAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:761,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:762,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:763,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:764,YFAMYGEKVAHTHVDTLYVRCHYYTWAVLAYTWY
HLA-A02:765,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:766,YYAMYGEKVAHTHVDTLYLRYHYYTWAVWAYTWY
HLA-A02:767,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:768,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:769,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:77,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:770,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:771,YFAMYGEKVAHTHVDTLYVRCHYYTWAVLAYTWY
HLA-A02:772,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:774,YFAMYGEKVAHTHVDTLYVRYHYYTWAEWAYTWY
HLA-A02:776,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:777,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:778,YYAMYGEKVAHNHVDTLYLRYHYYTWAVWAYTWY
HLA-A02:779,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:78,YYAMYQENVAQTDVDTLYVRYHYYTWAVLAYTWY
HLA-A02:780,YFAMYGEQVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:781,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:782,YFAMYGEKVAHTHVDTLYVRYHYYTWAELAYTWY
HLA-A02:783,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:784,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:785,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:786,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:787,YFAMYGEKVVHTHVDTLYVRCHYYTWAVLAYTWY
HLA-A02:79,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:790,YYAMYGEKVAHTHVDTLYLRYHYYTWAVWAYTWY
HLA-A02:794,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:795,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:798,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:799,YFAMYGEKVAHTHVDTLYLRYHYYTWAVWAYTWY
HLA-A02:80,YFAMYGEKVAHTHVDTLYVRYQDYTWAVLAYTWY
HLA-A02:800,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:801,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:802,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:804,YFAMYGEKVAHTHVDTLYLMFHYYTWAVQAYTGY
HLA-A02:808,YFAMYGEKVAHIDVDTLYVRYHYYTWAVLAYTWY
HLA-A02:809,YFAMYGEKVAHTHVDTLYVRYHYYTWAEWAYTWY
HLA-A02:81,YFAMYGEKVAHTDESIAYVRYHYYTWAVLAYTWY
HLA-A02:810,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:811,YFAMYGEKVAHTHVDTLYVRYHYYTWAVFAYTWY
HLA-A02:812,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:813,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:814,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:815,YFAMYRNNVAHIDVDTLYVRYHYYTWAVLAYTWY
HLA-A02:816,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:817,YYAMYGEKVAHTHVDTLYLRYHYYTWAVWAYTWY
HLA-A02:818,YYAMYGEKVAHTHVDTLYLRYHYYTWAVLAYTWY
HLA-A02:819,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:820,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:821,YFAMYGEKVAHIDVDTLYVRYHYYTWAVLAYTWY
HLA-A02:822,YFAMYGEKVAHTHVDTLYVRCHYYTWAVLAYTWY
HLA-A02:823,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:824,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:825,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:84,YYAMYGEKVAHTHVDTLYFRYHYYTWAVLAYTWY
HLA-A02:85,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:86,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:87,YFAMYGEKVAHTDENIAYVRYHYYTWAVLAYTWY
HLA-A02:89,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:90,YFAMYGEKVAHTDVDTLYVRYHYYTWAVLAYTWY
HLA-A02:91,YYAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:92,YFAMYEEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:93,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:95,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:96,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:97,YFAMYGEKVAHTHVDTLYVRYHYYTWAVLAYTWY
HLA-A02:99,YYAMYGEKVAHTHVDTLYVRYHYYTWAELAYTWY
HLA-A0301,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A0302,YFAMYQENVAQTDVDTLYIIYRDYTWAVQAYTWY
HLA-A0303,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A0304,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A0305,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A0306,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A0307,YFAMYQENVAQTDVDTLYIIYRDYTWAVLAYTWY
HLA-A0308,YFAMYQENVAHTDVDTLYIIYRDYTWAELAYTWY
HLA-A0309,YFAMYQENVAQTHVDTLYIIYRDYTWAELAYTWY
HLA-A0310,YFAMYQENVAQTDVDTLYIIYRDYTWAVQAYTWY
HLA-A0312,YYAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A0313,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A0314,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A0315,YFAMYQENVAQTDVDTLYIIFRDYTWAELAYTWY
HLA-A0316,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A0317,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A0318,YFAMYQENVAQTDVDTLYIIYRDYTWVARVYRGY
HLA-A0319,YFAMYQENVAQTDVDTLYIIFHYYTWAELAYTWY
HLA-A0320,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A0321,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A0322,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A0323,YFAMYGEKVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A0324,YFAMYRNNVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A0325,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A0326,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A0327,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A0328,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A0329,YFAMYQENVVQTDVDTLYIIYRDYTWAELAYTWY
HLA-A0330,YFAMYEEKVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:01,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:02,YFAMYQENVAQTDVDTLYIIYRDYTWAVQAYTWY
HLA-A03:04,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:05,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:06,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:07,YFAMYQENVAQTDVDTLYIIYRDYTWAVLAYTWY
HLA-A03:08,YFAMYQENVAHTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:09,YFAMYQENVAQTHVDTLYIIYRDYTWAELAYTWY
HLA-A03:10,YFAMYQENVAQTDVDTLYIIYRDYTWAVQAYTWY
HLA-A03:100,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:101,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:102,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:103,YFAMYQENVAQTDVDTLYIIYQDYTWAELAYTWY
HLA-A03:104,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWH
HLA-A03:105,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYRWY
HLA-A03:106,YFAMYQENVAQTDVDTLYIIYRDYTWAVQAYTWY
HLA-A03:107,YFAMYQENMAHTDANTLYIIYRDYTWAELAYTWY
HLA-A03:108,YFAMYQENVAHTHVDTLYIIYRDYTWAELAYTWY
HLA-A03:109,YFAMYQENVAQTDVHTLYIIYRDYTWAELAYTWY
HLA-A03:110,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:111,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:112,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:113,YFAMYQEKVAQTDVDTLYIIYRDYTWAVQAYTWY
HLA-A03:114,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:115,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:116,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:117,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:118,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:119,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:12,YYAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:120,YFAMYQENVAQTDVDTLYIIYRDCTWAELAYTWY
HLA-A03:121,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:122,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTGY
HLA-A03:123,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:124,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:125,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:126,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:127,YFAMYQENVAQTDVDTLYIIYRDYTWAALAYTWY
HLA-A03:128,YFAMYQENVAQTDLDTLYIIYRDYTWAELAYTWY
HLA-A03:13,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:130,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:131,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:132,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:133,YFAMYQENVAQTDVDTLYIIYRDYTWAVLAYTWY
HLA-A03:134,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:135,YFAMYQENVAQTDVDTLYIIYRDYTWAERVYRGY
HLA-A03:136,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:137,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:138,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:139,YFAMYQENVAQTDVDTLYIIYRDYTWAKLAYTWY
HLA-A03:14,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:140,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:141,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:142,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:143,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:144,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:145,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:146,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:147,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:148,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:149,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:15,YFAMYQENVAQTDVDTLYIIFRDYTWAELAYTWY
HLA-A03:150,YFAMYQENVAQTDVDTLYIIYRDYTWAELVYTWY
HLA-A03:151,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:152,YFAMYEEKVAHTDENIAYIIYRDYTWAELAYTWY
HLA-A03:153,YFAMYQENVAQTDVDTLYIIYRDYTWAERVYTWY
HLA-A03:154,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:155,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:156,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:157,YFAMYQEKVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:158,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:159,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:16,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:160,YFAMYQENVAQTDVDTLYIIYRDYTWAVQAYTWY
HLA-A03:163,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:164,YFAMYQENMAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:165,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:166,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:167,YFAMYQENVAQTDVDTLYIIYRDYTWAEQAYTGY
HLA-A03:169,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:17,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:170,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:171,YFAMYQENVAQTDVDTLYIIYRDYTWAVLAYTWY
HLA-A03:172,YFAMYQEKVAHTHVDTLYIIYRDYTWAELAYTWY
HLA-A03:173,YFAMYQENVAQTDEDTLYIIYRDYTWAELAYTWY
HLA-A03:174,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:175,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:176,YFAMYQEKVAHTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:177,YFAMYQENVAQTDVDTLYIRYRDYTWAELAYTWY
HLA-A03:179,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:18,YFAMYQENVAQTDVDTLYIIYRDYTWVARVYRGY
HLA-A03:180,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:181,YFAMYQENVAQTDVDSLYIIYRDYTWAELAYTWY
HLA-A03:182,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:183,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:184,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:185,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:186,YFAMYQENVAQTDVDTLYIIYEHYTWAELAYTWY
HLA-A03:187,YFAMYQENVAQTDVDTLYIIYRDYTWVARVYTWY
HLA-A03:188,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:189,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:19,YFAMYQENVAQTDVDTLYIIFHYYTWAELAYTWY
HLA-A03:190,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:191,YFAMYQENVAQTDVDTLYIIYGDYTWAELAYTWY
HLA-A03:193,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:195,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:196,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:198,YFAMYGEKVAHTHVDTLYIIYRDYTWAVQAYTWY
HLA-A03:199,YFAMYQENVAQSDVDTLYIIYRDYTWAELAYTWY
HLA-A03:20,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:201,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:202,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:203,YFAMYQENVAQSDVDTLYIIYRDYTWAELAYTWY
HLA-A03:204,YFAMYQENVAQTDVDTLYMVYRDYTWAELAYTWY
HLA-A03:205,YTAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:206,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:207,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:208,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYEWY
HLA-A03:209,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:210,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:211,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:212,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:213,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:214,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:215,YFAMYQENVAQTDVDTLYIMYRDYTWAELAYTWY
HLA-A03:216,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:217,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:218,YFAMYQENVAQTDVDTLYIIYRDYTWAVQAYTWY
HLA-A03:219,YFAMYQENVAQTDENIAYIIYRDYTWAELAYTWY
HLA-A03:22,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:220,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:221,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:222,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:223,YFAMYQENVAQTDVDTLYIIYRDYTWAVQAYTWY
HLA-A03:224,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:225,YFAMYQENVAQTDVDTLYIIYRDYTWAERAYTWY
HLA-A03:226,YFAMYQENVAQTDVDTLYIIYPDYTWAELAYTWY
HLA-A03:227,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:228,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:229,YFAMYQENVAQTDVDTLYIIYRDYTWAEQAYTWY
HLA-A03:23,YFAMYGEKVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:230,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:231,YFAMYQENVAQTDVDTLYIIYRDYTWARLAYTWY
HLA-A03:232,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:233,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:235,YFAMYQENVAQTDVDTLYNIYRDYTWAELAYTWY
HLA-A03:236,YFAMYQENVAQTDVDTLYIIYGDYTWAVQAYTWY
HLA-A03:237,YFAMYQENVAQTDVDTLYIIYRDYTWAVQAYTWY
HLA-A03:238,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:239,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:24,YFAMYRNNVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:240,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:241,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:242,YFAMYQENVAQTDVDTLYIIYRDYTWAVQAYTWY
HLA-A03:243,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:244,YFAMYQENVAQTDVDTLYIIYRDYTWAVQAYTWY
HLA-A03:245,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:246,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:247,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:248,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:249,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:25,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:250,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:251,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:252,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:253,YFAMYQENVAQTDVDTLYIIYRDYTWAVQAYTWY
HLA-A03:254,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:255,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:256,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:257,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:258,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:259,YFAMYQENVAQTYVDTLYIIYRDYTWAELAYTWY
HLA-A03:26,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:260,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:261,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:263,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:264,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:265,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:267,YFAMYQENVAQTDVNTLYIIYRDYTWAELAYTWY
HLA-A03:268,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:27,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:270,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:271,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:272,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:273,YFAMYEEKVAHTDENTLYIIYRDYTWAELAYTWY
HLA-A03:274,YFAMYQENVAQTDVDTLYIIYRDYTWAVQAYTWY
HLA-A03:276,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:277,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:278,YFAMYLQNVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:28,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:280,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:281,YFAMYQENVAQTDVDTLYIIYRDYTWAVQAYTWY
HLA-A03:282,YFAMYQENVAQTDVDTLYIIYQDYTWAELAYTWY
HLA-A03:285,YFAMYQENVAQTDVDTLYIIYRDYTWAVQAYTWY
HLA-A03:287,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:288,YFAMYQENVAQTDVDTLYMIYRDYTWAELAYTWY
HLA-A03:289,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:29,YFAMYQENVVQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:290,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:291,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:292,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:293,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:294,YFAMYQENVAQTDVDTLYIIYRDYIWAELAYTWY
HLA-A03:295,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:296,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:298,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:299,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:30,YFAMYEEKVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:300,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:301,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:302,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:303,YFAMYEENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:304,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:305,YFAMYQENVAQTDVDILYIIYRDYTWAELAYTWY
HLA-A03:306,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:307,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:308,YFAMYQENVAQTDVDTLYIIYRDYTWAELAHTWY
HLA-A03:309,YFAMYQENVAQTDVDTLYIIYRDYTWAVQAYTWY
HLA-A03:31,YFAMYQENVAQTDVDTLYIIYRYYTWAVQAYTWY
HLA-A03:310,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:311,YFAMYQENVAQTDVDTLYIIHRDYTWAELAYTWY
HLA-A03:312,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:313,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:314,YFAMYQENVAQTDVDTLYIIYRDYTWAVQAYTWY
HLA-A03:315,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:316,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:317,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:318,YFAMYQENVAQTDVDTLYIIYRDYTWAVQAYTWY
HLA-A03:319,YFAMYQENVAQTDVDTLYIIYRDYTWAVQAYTWY
HLA-A03:32,YFAMYQENVAHIDVDTLYIIYRDYTWAVQAYTWY
HLA-A03:320,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:321,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:322,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:324,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:325,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:326,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYKWY
HLA-A03:327,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:328,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:33,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:331,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:332,YFAMYQENVAQTDVDTLYIIYRDYTWAVLAYTWY
HLA-A03:333,YFAMYQENVAQTDVDTLYIIYRDYTWAVQAYTWY
HLA-A03:34,YFAMYQENVAPTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:35,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:37,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:38,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:39,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:40,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:41,YFAMYQENVAHTDANTLYIIYRDYTWAELAYTWY
HLA-A03:42,YFAMYQENVAQTDVDTLYIIYRDYTWAVLAYTWY
HLA-A03:43,YFAMYQENVAQTDVDTLYIIYEHYTWAELAYTWY
HLA-A03:44,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:45,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:46,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:47,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:48,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:49,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:50,YFAMYQENVAQTDVDTLYIIYRDYTWAEWAYTWY
HLA-A03:51,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:52,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:53,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:54,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:55,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:56,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:57,YFAMYQENVAQTDANTLYIIYRDYTWAELAYTWY
HLA-A03:58,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:59,CFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:60,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:61,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:62,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:63,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:64,YFAMYQENVAQTDVDTLYIIYRDYTWADLAYTWY
HLA-A03:65,YFAMYQENVAQTDVDTLYIIYRDYTWAEQAYTWY
HLA-A03:66,YFAMYQENVAQTDVDTLYIIYRDYTWAERAYTWY
HLA-A03:67,YFATYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:70,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:71,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:72,YSAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:73,YFAMYQENVAQTDVDTLYIIYRDYTWAVQAYTWY
HLA-A03:74,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:75,YFAMYQENVAQTDVDTLYLMYRDYTWAELAYTWY
HLA-A03:76,YFAMYQENVAQTDVDTLYIIYRDYTWAVQAYTWY
HLA-A03:77,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:78,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:79,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:80,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:81,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:82,YFAMYQENVAQTDVDTLYIIYEHYTWAVQAYTWY
HLA-A03:83,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:84,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:85,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:86,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:87,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:88,YYAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:89,YFAMYGEKVAHTHVDTLYIIYRDYTWAELAYTWY
HLA-A03:90,YFAMYQENVAQTDVDTLYIIYRDYTWAVQAYTWY
HLA-A03:92,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:93,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:94,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:95,YFAMYQENVAQTDVDTLYVRYRDYTWAELAYTWY
HLA-A03:96,YFDMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A03:97,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTGY
HLA-A03:98,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYMWY
HLA-A03:99,YFAMYQENVAQTDVDTLYIIYRDYTWAELAYTWY
HLA-A1101,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A1102,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A1103,YYAMYQENVAQTDVDTLYIIYRDYTWAEQAYRWY
HLA-A1104,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYTWY
HLA-A1105,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A1106,YYAMYQENVAQTHVDTLYIIYRDYTWAAQAYRWY
HLA-A1107,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A1108,YYAMYQENVAQTDVDTLYIIYRDYTWAERAYRWY
HLA-A1109,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A1110,YYAMYRNNVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A1111,YYAMYLQNVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A1112,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A1113,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A1114,YYAMYQENVAQTDVDTLYIIYRDYTWARQAYRWY
HLA-A1115,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A1116,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A1117,YYAMYQENMAHTDANTLYIIYRDYTWAAQAYRWY
HLA-A1118,YYAMYQENVAHTHVDTLYIIYRDYTWAAQAYRWY
HLA-A1119,YYAMYQENVAHTDVDTLYIIYRDYTWAAQAYRWY
HLA-A1120,YYAMYQENVAQTDVDTLYIIYRDYTWAEQAYRWY
HLA-A1121,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A1122,YYAMYQENVAQTDVDTLYIIYPDYTWAAQAYRWY
HLA-A1123,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A1124,YYAMYQENVAQTDVDTLYIIYRDYTWAALAYRWY
HLA-A1125,YYAMYQENVAQTDVDTLYIIYRDYTWAELAYRWY
HLA-A1126,YYAMYQENVAQTDVDTLYIMYRDYTWAAQAYRWY
HLA-A1127,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYTGY
HLA-A1128,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A1129,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A1130,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A1131,YYAMYQENVAQTDVDTLYIIYRDYTWAVLAYRWY
HLA-A1132,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:01,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:02,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:03,YYAMYQENVAQTDVDTLYIIYRDYTWAEQAYRWY
HLA-A11:04,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYTWY
HLA-A11:05,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:06,YYAMYQENVAQTHVDTLYIIYRDYTWAAQAYRWY
HLA-A11:07,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:08,YYAMYQENVAQTDVDTLYIIYRDYTWAERAYRWY
HLA-A11:09,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:10,YYAMYRNNVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:100,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:101,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:102,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:103,YYAMYRENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:104,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:105,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:106,YYAMYQEKVVHTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:107,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:108,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:11,YYAMYLQNVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:110,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:111,YYAMYQENVAQTDEDTLYIIYRDYTWAAQAYRWY
HLA-A11:112,YYAMYQENVAQTDVDTLYIIYRDYTWAAQVYRWY
HLA-A11:113,YYAMYQENVAQTDVDTLYIIYEHYTWAAQAYRWY
HLA-A11:114,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:116,YYAMYQENVAQTDVDTLYIIYQDYTWAAQAYRWY
HLA-A11:117,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:118,YYAMYQENVAQTDVDTLYIMYRDYTWAAQAYRWY
HLA-A11:119,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:12,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:120,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:121,YYAMYGEKVAHTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:122,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:123,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:124,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:125,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:126,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:128,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:129,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:13,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:130,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYTWY
HLA-A11:131,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:132,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:133,YYSMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:134,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:135,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:136,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:138,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:139,YYAMYQENVAQTDVDTLYLMFRDYTWAAQAYRWY
HLA-A11:14,YYAMYQENVAQTDVDTLYIIYRDYTWARQAYRWY
HLA-A11:140,YYAMYQENVAQTDVDTLYIIYQDYTWAAQAYRWY
HLA-A11:141,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:142,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:143,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:144,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:145,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:146,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:147,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:148,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:149,YYAMYQENVAQTDVDTLSIIYRDYTWAAQAYRWY
HLA-A11:15,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:150,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:151,YYAMYQENVAQTDVDTLYIISRDYTWAAQAYRWY
HLA-A11:152,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:153,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:154,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:155,YYAMYQENVAQTDVDTLYIIYRDYTWVAQAYRWY
HLA-A11:156,YYAMYQDNVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:157,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRRY
HLA-A11:158,YYAMYQENVAQTDVDTLYIIYRDYTWAVLAYRWY
HLA-A11:159,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:16,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:160,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:161,YYAMYQENVAQADVDTLYIIYRDYTWAAQAYRWY
HLA-A11:162,YYAMYQENVAQTDVDTLYIIYEHYTWAAQAYRWY
HLA-A11:163,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:164,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:165,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:166,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:167,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:168,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:169,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:17,YYAMYQENMAHTDANTLYIIYRDYTWAAQAYRWY
HLA-A11:171,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:172,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:173,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:174,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:175,YYAMYQENVAQTDVDTLYIIYRDYTWAEQAYRWY
HLA-A11:176,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:177,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:178,YYAMYQENVAHTDENIAYIIYRDYTWAAQAYRWY
HLA-A11:179,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:18,YYAMYQENVAHTHVDTLYIIYRDYTWAAQAYRWY
HLA-A11:181,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:183,YYAMYQENVAQTDVDTLYIIYRDYTWAVWAYRWY
HLA-A11:184,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:185,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:186,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:187,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:188,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:189,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:19,YYAMYQENVAHTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:190,YYAMYQENVAQTDENIAYIIYRDYTWAAQAYRWY
HLA-A11:191,YYAMYQENVAQTDVDTLYIIYRDYTWAEWAYRWY
HLA-A11:192,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:193,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:194,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:195,YYAMYQENVAQTDVDTLYIIYRDYTWGAQAYRWY
HLA-A11:196,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:197,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:198,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:199,YFAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:20,YYAMYQENVAQTDVDTLYIIYRDYTWAEQAYRWY
HLA-A11:200,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:201,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:202,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:203,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:204,YYAMYQENVAQTDVDTLYIIYRDYTWAAEAYRWY
HLA-A11:205,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:206,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:207,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:209,YYAMYQENVAQTDVDTLYIIYRDYTWAVQAYTGY
HLA-A11:211,YYAMYQENVAQTDVDTLYIIYRDYTWAARVYRWY
HLA-A11:212,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:213,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:214,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:216,YYAMYQENVAQTDVDTLYIIYWDYTWAAQAYRWY
HLA-A11:217,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:218,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:219,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:22,YYAMYQENVAQTDVDTLYIIYPDYTWAAQAYRWY
HLA-A11:220,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWD
HLA-A11:221,YHAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:222,YFAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:223,YYAMYQENVAQTDANTLYIIYRDYTWAAQAYRWY
HLA-A11:224,YYAMYQEKVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:225,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:226,YYAMYQENVAQTDVDTLYIIYRDYTWVARVYRWY
HLA-A11:227,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:228,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:229,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:23,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:230,YCAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:231,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:232,YYAMYQENVAQTDVDTLYIIYRDCTWAAQAYRWY
HLA-A11:233,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:234,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:236,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:237,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:239,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:24,YYAMYQENVAQTDVDTLYIIYRDYTWAALAYRWY
HLA-A11:240,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:241,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:242,YYAMYQENVAQTDVDTLYITYRDYTWAAQAYRWY
HLA-A11:243,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:244,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:245,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:246,YYAMYQENVAQTDVDALYIIYRDYTWAAQAYRWY
HLA-A11:247,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:248,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:249,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:25,YYAMYQENVAQTDVDTLYIIYRDYTWAELAYRWY
HLA-A11:250,YYAMYQENVAHIDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:252,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:253,YYAMYQENVAQTDVATLYIIYRDYTWAAQAYRWY
HLA-A11:254,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:255,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:257,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:258,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:259,YYAMYQENVAQTDVDSLYIIYRDYTWAAQAYRWY
HLA-A11:26,YYAMYQENVAQTDVDTLYIMYRDYTWAAQAYRWY
HLA-A11:260,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:261,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:262,YYAEYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:263,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:264,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYGWY
HLA-A11:265,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:266,YYAIYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:267,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:268,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:269,YYAMYQENVAQTDVDTLYIIYRDYTWAAWAYRWY
HLA-A11:27,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYTGY
HLA-A11:270,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:271,YYAMYQENVAQTDANTLYIIYRDYTWVARVYRGY
HLA-A11:273,YYAMYQENVAQTDVDTLYIIYRSYTWAAQAYRWY
HLA-A11:274,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:275,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:276,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:277,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:278,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:279,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:280,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:281,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:282,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:283,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:284,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:285,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:286,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:288,YYAMYQENVAQTDVDTLYVRYRDYTWAAQAYRWY
HLA-A11:289,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:29,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:290,YYAMYQENVAQTDVDTLYIIYRDYTWARRVYRWY
HLA-A11:291,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:292,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:293,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:294,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:295,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:296,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:297,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:298,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:299,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:30,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:300,YYAMYQENVAQTDVDTLYIIYRDYTWTAQAYRWY
HLA-A11:301,YYAMYQENVTQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:302,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:303,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:304,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:305,YYAMYQENVAQNDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:306,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:307,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:308,YYAMYQENVAHTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:309,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:31,YYAMYQENVAQTDVDTLYIIYRDYTWAVLAYRWY
HLA-A11:311,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:312,YYAMYQENVAHTHVDTLYIIYRDYTWAAQAYRWY
HLA-A11:32,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:33,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:34,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:35,YYAMYQENVAQTDVDTLYIIYRDYTWAVLAYTWY
HLA-A11:36,YYAMYQENVAQTDVDTLYIICRDYTWAAQAYRWY
HLA-A11:37,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:38,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRGY
HLA-A11:39,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRGY
HLA-A11:40,YYAMYQENVAHTDANTLYIIYRDYTWAAQAYRWY
HLA-A11:41,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:42,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:43,YTAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:44,YYAMYQENVAQTDVDTLYIIYRDYTWAARAYRWY
HLA-A11:45,YYAMYQENVAQTDADTLYIIYRDYTWAAQAYRWY
HLA-A11:46,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:47,YHAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:48,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:49,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:51,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:53,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:54,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:55,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:56,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:57,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:58,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:59,YYAMYQENVAQTDVDTLYIIYGDYTWAAQAYRWY
HLA-A11:60,YYAMYQENVAQTDVDTLYIIYRDYTWAVQAYRWY
HLA-A11:61,YYAMYQENAAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:62,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:63,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:64,YYAMYQENVAQTDVDTLHIIYRDYTWAAQAYRWY
HLA-A11:65,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:66,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:67,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:68,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:70,YYAMYGENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:71,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:72,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:73,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:74,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:75,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:76,YYAMYQENVAQTDVDTLYIIYRDYTRAAQAYRWY
HLA-A11:77,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:79,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:80,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:81,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:82,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:83,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:84,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:85,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:86,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:87,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:88,YSAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:89,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:90,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYEWY
HLA-A11:91,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:92,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:93,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:94,YYAMYQENVAQTDVDTLYIIYRDYTWAARVYRGY
HLA-A11:95,YYAMHQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:96,YYAMYQENVSQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:97,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A11:98,YYAMYQENVAHIDVDTLYIIYRDYTWAAQAYRWY
HLA-A2301,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A2302,YSAMYEEKVAHTDENIAYLMFHYYTWAVWAYTGY
HLA-A2303,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A2304,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTWY
HLA-A2305,CSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A2306,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A2307,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A2309,YSAMYQENMAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A2310,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYRGY
HLA-A2312,YSAMYEEKVAHTHENIAYLMFHYYTWAVLAYTGY
HLA-A2313,YSAMYEEKVAQTDENIAYLMFHYYTWAVLAYTGY
HLA-A2314,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A2315,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A2316,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:01,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:02,YSAMYEEKVAHTDENIAYLMFHYYTWAVWAYTGY
HLA-A23:03,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:04,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTWY
HLA-A23:05,CSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:06,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:09,YSAMYQENMAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:10,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYRGY
HLA-A23:12,YSAMYEEKVAHTHENIAYLMFHYYTWAVLAYTGY
HLA-A23:13,YSAMYEEKVAQTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:14,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:15,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:16,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:17,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:18,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:20,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:21,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:22,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:23,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:24,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:25,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:26,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:27,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:28,YSAMYQEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:29,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:30,YSAMYEEKVAHTDENIAYLMFHCYTWAVLAYTGY
HLA-A23:31,YSAMYEEKVAHTDENIAYLMFDDYTWAVLAYTGY
HLA-A23:32,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:33,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:34,YSAMYEEKVAHTDENIAYLMFHYYTWAVVAYTGY
HLA-A23:35,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:36,YSAMYEEKVAHTDESIAYLMFHYYTWAVLAYTGY
HLA-A23:37,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:39,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:40,YSAMYEEKVAHTDANIAYLMFHYYTWAVLAYTGY
HLA-A23:41,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:42,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:43,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTRY
HLA-A23:44,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:45,YSAMYEEKVAHTDENIAYLMFQDYTWAVLAYTGY
HLA-A23:46,YSAMYEEKVAHTDENIAYLMFEHYTWAVLAYTGY
HLA-A23:47,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:48,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:49,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:50,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:51,YSAMYEENVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:52,YSAMYEEKVAHTDENIAYLMFDYYTWAVLAYTGY
HLA-A23:53,YSAMYEEKVAHTDENIAYLMFRDYTWAVLAYTGY
HLA-A23:54,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:55,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:56,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:57,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:58,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:59,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:60,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:61,YSAMYKEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:62,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:63,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:64,YFAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:65,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:66,YSAMYEEKVAHTDENIAYLMFHYYTWAVWAYTGY
HLA-A23:67,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:68,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:70,YSAMYEEKVAHTDENIAYLMFRDYTWAVLAYTGY
HLA-A23:71,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:72,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:73,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:74,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:75,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:76,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:77,YSAMCEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:78,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:79,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:80,YSAMYGEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:81,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:82,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:83,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTWY
HLA-A23:85,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:86,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:87,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:88,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A23:89,YSAMYEEKVAHTDENIAYLMFHYCTWAVLAYTGY
HLA-A23:90,YSAMYEEKVAHTDENIAHLMFHYYTWAVLAYTGY
HLA-A23:92,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
HLA-A2402,YSAMYEEKVAHTDENIAYLMFHYYTWAVQAYTGY
HLA-A2403,YSAMYEEKVAHTDENIAYLMFHYYTWAVQAYTWY
HLA-A2404,YSAMYEEKVAHTDANTLYLMFHYYTWAVQAYTGY
HLA-A2405,YSAMYEEKVAHTDENIAYLMFHYYTWAVQAYTGY
HLA-A2406,YSAMYEEKVAHTDENIAYLMFHYYTWAVWAYTGY
HLA-A2407,YSAMYEEKVAQTDENIAYLMFHYYTWAVQAYTGY
HLA-A2408,YSAMYGEKVAHTDENIAYLMFHYYTWAVQAYTGY
HLA-A2409,YSAMYEEKVAHTDENIAYLMFHYYTWAVQAYTGY
HLA-A2410,YSAMYEEKVAHTDENIAYLMFHYYTWAVQAYRWY
HLA-A2411,YSAMYEE
gitextract_be3gthnp/

├── .dockerignore
├── .github/
│   └── workflows/
│       ├── build.yml
│       ├── ci.yml
│       ├── release.yml
│       └── release_testpypi.yml
├── .gitignore
├── AGENTS.md
├── CONTRIBUTING.md
├── Dockerfile
├── LICENSE
├── NOTES.md
├── README.md
├── TODO.md
├── code-of-conduct.md
├── compatibility_check/
│   └── figures/
│       └── summary.csv
├── develop.sh
├── docs/
│   ├── Makefile
│   ├── README.md
│   ├── api.rst
│   ├── commandline_tools.rst
│   ├── commandline_tutorial.rst
│   ├── conf.py
│   ├── doctest.sh
│   ├── example.fasta
│   ├── index.rst
│   ├── intro.rst
│   ├── python_tutorial.rst
│   └── requirements.txt
├── downloads-generation/
│   ├── README.md
│   ├── allele_sequences/
│   │   ├── GENERATE.sh
│   │   ├── class1_pseudosequences.csv
│   │   ├── filter_sequences.py
│   │   ├── make_allele_sequences.py
│   │   └── select_alleles_to_disambiguate.py
│   ├── analysis_predictor_info/
│   │   ├── GENERATE.sh
│   │   ├── cluster_submit_script_header.mssm_hpc.lsf
│   │   ├── generate_artifacts.py
│   │   ├── generate_model_selection_with_decoys.py
│   │   ├── predict_on_model_selection_data.py
│   │   └── requirements.txt
│   ├── data_curated/
│   │   ├── GENERATE.sh
│   │   ├── README.md
│   │   ├── curate.py
│   │   ├── curate_ms_by_pmid.py
│   │   └── requirements.txt
│   ├── data_evaluation/
│   │   ├── GENERATE.sh
│   │   ├── cluster_submit_script_header.mssm_hpc.lsf
│   │   ├── join_with_precomputed.py
│   │   ├── make_benchmark.py
│   │   └── split_by_sample.py
│   ├── data_iedb/
│   │   ├── GENERATE.sh
│   │   └── README.md
│   ├── data_mass_spec_annotated/
│   │   ├── GENERATE.sh
│   │   ├── annotate.py
│   │   └── requirements.txt
│   ├── data_predictions/
│   │   ├── GENERATE.WITH_HPC_CLUSTER.sh
│   │   ├── GENERATE.sh
│   │   ├── cluster_submit_script_header.mssm_hpc.gpu.lsf
│   │   ├── cluster_submit_script_header.mssm_hpc.nogpu.lsf
│   │   ├── requirements.txt
│   │   ├── run_predictors.py
│   │   ├── write_allele_list.py
│   │   └── write_proteome_peptides.py
│   ├── data_published/
│   │   ├── GENERATE.sh
│   │   └── README.md
│   ├── data_references/
│   │   ├── GENERATE.sh
│   │   ├── README.md
│   │   ├── process.py
│   │   └── requirements.txt
│   ├── data_systemhcatlas/
│   │   ├── GENERATE.sh
│   │   └── README.md
│   ├── models_class1/
│   │   ├── GENERATE.sh
│   │   ├── README.md
│   │   └── write_validation_data.py
│   ├── models_class1_kim_benchmark/
│   │   ├── GENERATE.sh
│   │   ├── README.md
│   │   ├── class1_pseudosequences.csv
│   │   ├── curate.py
│   │   ├── generate_hyperparameters.py
│   │   └── write_validation_data.py
│   ├── models_class1_minimal/
│   │   ├── GENERATE.sh
│   │   └── README.md
│   ├── models_class1_pan/
│   │   ├── GENERATE.WITH_HPC_CLUSTER.sh
│   │   ├── GENERATE.sh
│   │   ├── README.md
│   │   ├── additional_alleles.txt
│   │   ├── cluster_submit_script_header.mssm_hpc.lsf
│   │   ├── generate_hyperparameters.py
│   │   └── reassign_mass_spec_training_data.py
│   ├── models_class1_pan_variants/
│   │   ├── GENERATE.WITH_HPC_CLUSTER.sh
│   │   ├── GENERATE.sh
│   │   ├── cluster_submit_script_header.mssm_hpc.gpu.lsf
│   │   ├── exclude_data_from_training.py
│   │   └── generate_hyperparameters.py
│   ├── models_class1_presentation/
│   │   ├── GENERATE.sh
│   │   ├── cluster_submit_script_header.mssm_hpc.lsf
│   │   └── make_train_data.py
│   ├── models_class1_processing/
│   │   ├── GENERATE.WITH_HPC_CLUSTER.sh
│   │   ├── GENERATE.sh
│   │   ├── annotate_hits_with_expression.py
│   │   ├── cluster_submit_script_header.mssm_hpc.lsf
│   │   ├── generate_hyperparameters.base.py
│   │   ├── generate_hyperparameters.variants.py
│   │   └── make_train_data.py
│   ├── models_class1_selected_no_mass_spec/
│   │   └── GENERATE.sh
│   ├── models_class1_trained_with_mass_spec/
│   │   └── GENERATE.sh
│   ├── models_class1_unselected/
│   │   ├── GENERATE.sh
│   │   ├── README.md
│   │   ├── class1_pseudosequences.csv
│   │   └── generate_hyperparameters.py
│   ├── models_class1_unselected_with_mass_spec/
│   │   ├── GENERATE.sh
│   │   ├── README.md
│   │   ├── class1_pseudosequences.csv
│   │   └── generate_hyperparameters.py
│   └── random_peptide_predictions/
│       ├── GENERATE.sh
│       └── random_predictions.py
├── lint.sh
├── mhcflurry/
│   ├── __init__.py
│   ├── allele_encoding.py
│   ├── amino_acid.py
│   ├── calibrate_percentile_ranks_command.py
│   ├── class1_affinity_predictor.py
│   ├── class1_neural_network.py
│   ├── class1_presentation_predictor.py
│   ├── class1_processing_neural_network.py
│   ├── class1_processing_predictor.py
│   ├── cluster_parallelism.py
│   ├── cluster_worker_entry_point.py
│   ├── common.py
│   ├── custom_loss.py
│   ├── data_dependent_weights_initialization.py
│   ├── downloads.py
│   ├── downloads.yml
│   ├── downloads_command.py
│   ├── encodable_sequences.py
│   ├── ensemble_centrality.py
│   ├── fasta.py
│   ├── flanking_encoding.py
│   ├── hyperparameters.py
│   ├── local_parallelism.py
│   ├── percent_rank_transform.py
│   ├── predict_command.py
│   ├── predict_scan_command.py
│   ├── pytorch_layers.py
│   ├── pytorch_losses.py
│   ├── random_negative_peptides.py
│   ├── regression_target.py
│   ├── scoring.py
│   ├── select_allele_specific_models_command.py
│   ├── select_pan_allele_models_command.py
│   ├── select_processing_models_command.py
│   ├── testing_utils.py
│   ├── train_allele_specific_models_command.py
│   ├── train_pan_allele_models_command.py
│   ├── train_presentation_models_command.py
│   ├── train_processing_models_command.py
│   └── version.py
├── notebooks/
│   ├── example1.ipynb
│   └── mhcflurry-colab.ipynb
├── pylintrc
├── readthedocs.yml
├── requirements.txt
├── scripts/
│   ├── compare_tf_pytorch_random_outputs.py
│   ├── cross_allele_parity_analysis.py
│   ├── extract_high_presentation_fixture.py
│   ├── generate_fixture_error_report.py
│   ├── modal_train_mhcflurry.py
│   ├── plot_fixture_diffs.py
│   └── validate_allele_sequences.py
├── selected-peptides.csv
├── setup.py
├── setup_local_env.sh
├── test/
│   ├── __init__.py
│   ├── conftest.py
│   ├── data/
│   │   ├── data_10mer.csv
│   │   ├── data_8mer.csv
│   │   ├── data_9mer.csv
│   │   ├── example.fasta
│   │   ├── hpv_predictions.csv
│   │   ├── master_affinity_fixture_config.json
│   │   ├── master_affinity_fixture_predictions.json
│   │   ├── master_affinity_fixture_weights.npz
│   │   ├── master_densenet_fixture_config.json
│   │   ├── master_densenet_fixture_predictions.json
│   │   ├── master_densenet_fixture_weights.npz
│   │   ├── master_multi_output_fixture_config.json
│   │   ├── master_multi_output_fixture_predictions.json
│   │   ├── master_multi_output_fixture_weights.npz
│   │   ├── master_pan_concat_fixture_config.json
│   │   ├── master_pan_concat_fixture_predictions.json
│   │   ├── master_pan_concat_fixture_weights.npz
│   │   ├── master_pan_multiply_fixture_config.json
│   │   ├── master_pan_multiply_fixture_predictions.json
│   │   ├── master_pan_multiply_fixture_weights.npz
│   │   ├── master_released_class1_affinity_predictions.json
│   │   ├── master_released_class1_presentation_highscore_rows_metadata.json
│   │   ├── multiallelic.benchmark.small.csv.bz2
│   │   └── multiallelic_ms.benchmark1.csv.bz2
│   ├── expensive_verify_pretrain_optimizable.py
│   ├── pytest_helpers.py
│   ├── test_allele_encoding.py
│   ├── test_amino_acid.py
│   ├── test_api_compat_shims.py
│   ├── test_calibrate_percentile_ranks_command.py
│   ├── test_changing_allele_representations.py
│   ├── test_class1_affinity_predictor.py
│   ├── test_class1_neural_network.py
│   ├── test_class1_pan.py
│   ├── test_class1_presentation_predictor.py
│   ├── test_class1_processing_neural_network.py
│   ├── test_class1_processing_predictor.py
│   ├── test_custom_loss.py
│   ├── test_doctest.py
│   ├── test_download_models_class1.py
│   ├── test_ensemble_centrality.py
│   ├── test_hyperparameters.py
│   ├── test_local_parallelism.py
│   ├── test_master_compat_predictions.py
│   ├── test_multi_output.py
│   ├── test_network_merging.py
│   ├── test_percent_rank_transform.py
│   ├── test_predict_command.py
│   ├── test_predict_scan_command.py
│   ├── test_pytorch_coverage.py
│   ├── test_pytorch_regressions.py
│   ├── test_random_negative_peptides.py
│   ├── test_regression_target.py
│   ├── test_released_master_predictions.py
│   ├── test_released_predictors_on_hpv_dataset.py
│   ├── test_released_predictors_well_correlated.py
│   ├── test_released_presentation_highscore_rows.py
│   ├── test_selected_peptides_csv.py
│   ├── test_speed.py
│   ├── test_train_and_related_commands.py
│   ├── test_train_pan_allele_models_command.py
│   ├── test_train_processing_models_command.py
│   └── test_training_variants.py
└── test-environment.yml
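The allele rows above follow a simple two-column CSV format: an allele name, then a 34-residue binding-groove pseudosequence (as in `class1_pseudosequences.csv` from the tree above). A minimal sketch of parsing such rows into a lookup dict — the function name and length check are illustrative assumptions, not part of this export:

```python
import csv
from io import StringIO

# Two sample rows copied from the data above.
SAMPLE = """\
HLA-A11:01,YYAMYQENVAQTDVDTLYIIYRDYTWAAQAYRWY
HLA-A23:01,YSAMYEEKVAHTDENIAYLMFHYYTWAVLAYTGY
"""

def load_pseudosequences(handle):
    """Parse allele,pseudosequence rows into a dict, checking that each
    pseudosequence has the expected 34 residues."""
    result = {}
    for allele, seq in csv.reader(handle):
        if len(seq) != 34:
            raise ValueError(
                "%s: expected 34 residues, got %d" % (allele, len(seq)))
        result[allele] = seq
    return result

pseudosequences = load_pseudosequences(StringIO(SAMPLE))
print(len(pseudosequences))  # 2
```

Note that many distinct alleles share an identical pseudosequence (e.g. most HLA-A11 subtypes above), so the reverse mapping from sequence to allele is not unique.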
SYMBOL INDEX (794 symbols across 112 files)

FILE: downloads-generation/allele_sequences/filter_sequences.py
  function run (line 26) | def run():

FILE: downloads-generation/allele_sequences/make_allele_sequences.py
  function normalize_allele_name_optional (line 37) | def normalize_allele_name_optional(s):
  function run (line 40) | def run():

FILE: downloads-generation/allele_sequences/select_alleles_to_disambiguate.py
  function run (line 27) | def run():

FILE: downloads-generation/analysis_predictor_info/generate_artifacts.py
  function run (line 106) | def run():
  function do_job (line 246) | def do_job(tasks, constant_data=GLOBAL_DATA):

FILE: downloads-generation/analysis_predictor_info/generate_model_selection_with_decoys.py
  function run (line 41) | def run():

FILE: downloads-generation/analysis_predictor_info/predict_on_model_selection_data.py
  function do_predict (line 57) | def do_predict(predictor, key, sub_df, constant_data=None):
  function run (line 69) | def run():

FILE: downloads-generation/data_curated/curate.py
  function normalize_allele_name_or_return_unknown (line 15) | def normalize_allele_name_or_return_unknown(s):
  function load_data_kim2014 (line 88) | def load_data_kim2014(filename):
  function load_data_systemhc_atlas (line 110) | def load_data_systemhc_atlas(filename, min_probability=0.99):
  function load_data_iedb (line 139) | def load_data_iedb(iedb_csv, include_qualitative=True):
  function load_data_additional_ms (line 252) | def load_data_additional_ms(filename):
  function run (line 276) | def run():

FILE: downloads-generation/data_curated/curate_ms_by_pmid.py
  function normalize_allele_name_or_return_unknown (line 17) | def normalize_allele_name_or_return_unknown(s):
  function load (line 61) | def load(filenames, **kwargs):
  function debug (line 74) | def debug(*filenames):
  function handle_pmid_27600516 (line 80) | def handle_pmid_27600516(filename):
  function handle_pmid_23481700 (line 115) | def handle_pmid_23481700(filename):
  function handle_pmid_24616531 (line 154) | def handle_pmid_24616531(filename):
  function handle_pmid_25576301 (line 177) | def handle_pmid_25576301(filename):
  function handle_pmid_26992070 (line 246) | def handle_pmid_26992070(*filenames):
  function handle_pmid_27412690 (line 307) | def handle_pmid_27412690(filename):
  function handle_pmid_28832583 (line 359) | def handle_pmid_28832583(*filenames):
  function handle_pmid_31495665 (line 470) | def handle_pmid_31495665(filename):
  function handle_pmid_27869121 (line 656) | def handle_pmid_27869121(filename):
  function handle_pmid_31154438 (line 711) | def handle_pmid_31154438(*filenames):
  function handle_pmid_31844290 (line 767) | def handle_pmid_31844290(*filenames):
  function make_expression_groups (line 865) | def make_expression_groups(dataset_identifier, df, groups):
  function handle_expression_GSE113126 (line 877) | def handle_expression_GSE113126(*filenames):
  function handle_expression_expression_atlas_22460905 (line 900) | def handle_expression_expression_atlas_22460905(filename):
  function handle_expression_human_protein_atlas (line 939) | def handle_expression_human_protein_atlas(*filenames):
  function make_expression_mixtures (line 995) | def make_expression_mixtures(expression_df):
  function run (line 1027) | def run():

FILE: downloads-generation/data_evaluation/join_with_precomputed.py
  function load_results (line 31) | def load_results(dirname, result_df=None, columns=None):
  function run (line 67) | def run():

FILE: downloads-generation/data_evaluation/make_benchmark.py
  function run (line 63) | def run():

FILE: downloads-generation/data_evaluation/split_by_sample.py
  function run (line 30) | def run():

FILE: downloads-generation/data_mass_spec_annotated/annotate.py
  function run (line 51) | def run():

FILE: downloads-generation/data_predictions/run_predictors.py
  function load_results (line 97) | def load_results(dirname, result_df=None, dtype="float32"):
  function run (line 140) | def run(argv=sys.argv[1:]):
  function do_predictions_mhctools (line 361) | def do_predictions_mhctools(work_item_dicts, constant_data=None):
  function do_predictions_mhcflurry (line 444) | def do_predictions_mhcflurry(work_item_dicts, constant_data=None):

FILE: downloads-generation/data_predictions/write_allele_list.py
  function run (line 24) | def run():

FILE: downloads-generation/data_predictions/write_proteome_peptides.py
  function run (line 51) | def run():

FILE: downloads-generation/data_references/process.py
  function run (line 41) | def run():

FILE: downloads-generation/models_class1/write_validation_data.py
  function run (line 49) | def run(argv):

FILE: downloads-generation/models_class1_kim_benchmark/curate.py
  function normalize_allele_name_or_return_unknown (line 13) | def normalize_allele_name_or_return_unknown(s):
  function load_data_kim2014 (line 71) | def load_data_kim2014(filename):
  function load_data_systemhc_atlas (line 92) | def load_data_systemhc_atlas(filename, min_probability=0.99):
  function load_data_abelin_mass_spec (line 120) | def load_data_abelin_mass_spec(filename):
  function load_data_iedb (line 143) | def load_data_iedb(iedb_csv, include_qualitative=True, include_mass_spec...
  function run (line 232) | def run():

FILE: downloads-generation/models_class1_kim_benchmark/write_validation_data.py
  function run (line 49) | def run(argv):

FILE: downloads-generation/models_class1_pan/reassign_mass_spec_training_data.py
  function go (line 21) | def go(args):

FILE: downloads-generation/models_class1_pan_variants/exclude_data_from_training.py
  function normalize_allele_name_or_return_unknown (line 13) | def normalize_allele_name_or_return_unknown(s):
  function load_30377561 (line 49) | def load_30377561(filename):
  function go (line 83) | def go(args):

FILE: downloads-generation/models_class1_pan_variants/generate_hyperparameters.py
  function transform_to_single_hidden (line 26) | def transform_to_single_hidden(hyperparameters):
  function transform_to_no_pretrain (line 35) | def transform_to_no_pretrain(hyperparameters):
  function transform_to_compact_peptide (line 41) | def transform_to_compact_peptide(hyperparameters):

FILE: downloads-generation/models_class1_presentation/make_train_data.py
  function run (line 63) | def run():

FILE: downloads-generation/models_class1_processing/annotate_hits_with_expression.py
  function run (line 33) | def run():

FILE: downloads-generation/models_class1_processing/generate_hyperparameters.base.py
  function hyperparrameters_grid (line 26) | def hyperparrameters_grid():

FILE: downloads-generation/models_class1_processing/generate_hyperparameters.variants.py
  function transform (line 26) | def transform(kind, hyperparameters):

FILE: downloads-generation/models_class1_processing/make_train_data.py
  function do_process_samples (line 79) | def do_process_samples(samples, constant_data=None):
  function run (line 166) | def run():

FILE: downloads-generation/random_peptide_predictions/random_predictions.py
  function run (line 25) | def run():

FILE: mhcflurry/allele_encoding.py
  class AlleleEncoding (line 6) | class AlleleEncoding(object):
    method __init__ (line 7) | def __init__(self, alleles=None, allele_to_sequence=None, borrow_from=...
    method compact (line 73) | def compact(self):
    method allele_representations (line 89) | def allele_representations(self, encoding_name):
    method fixed_length_vector_encoded_sequences (line 121) | def fixed_length_vector_encoded_sequences(self, encoding_name):

FILE: mhcflurry/amino_acid.py
  function available_vector_encodings (line 82) | def available_vector_encodings():
  function vector_encoding_length (line 94) | def vector_encoding_length(name):
  function index_encoding (line 109) | def index_encoding(sequences, letter_to_index_dict):
  function fixed_vectors_encoding (line 137) | def fixed_vectors_encoding(index_encoded_sequences, letter_to_vector_df):

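The `index_encoding` helper listed above turns equal-length peptide strings into an integer matrix suitable for embedding layers. A minimal sketch of that idea, not the repository's code: the toy alphabet and example sequences here are hypothetical, and mhcflurry defines its own ordering over the 20 amino acids plus placeholder characters.

```python
import numpy as np

# Hypothetical toy alphabet for illustration only.
LETTER_TO_INDEX = {"A": 0, "C": 1, "G": 2}

def index_encoding(sequences, letter_to_index_dict):
    """Encode equal-length sequences as a (num_sequences, length) int array."""
    return np.array(
        [[letter_to_index_dict[letter] for letter in seq] for seq in sequences]
    )

encoded = index_encoding(["ACG", "GCA"], LETTER_TO_INDEX)
print(encoded.shape)  # (2, 3)
```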
FILE: mhcflurry/calibrate_percentile_ranks_command.py
  function run (line 130) | def run(argv=sys.argv[1:]):
  function run_class1_presentation_predictor (line 173) | def run_class1_presentation_predictor(args, peptides):
  function run_class1_affinity_predictor (line 233) | def run_class1_affinity_predictor(args, peptides):
  function do_class1_affinity_calibrate_percentile_ranks (line 365) | def do_class1_affinity_calibrate_percentile_ranks(
  function class1_affinity_calibrate_percentile_ranks (line 383) | def class1_affinity_calibrate_percentile_ranks(

FILE: mhcflurry/class1_affinity_predictor.py
  class Class1AffinityPredictor (line 40) | class Class1AffinityPredictor(object):
    method __init__ (line 50) | def __init__(
    method manifest_df (line 128) | def manifest_df(self):
    method clear_cache (line 162) | def clear_cache(self):
    method neural_networks (line 178) | def neural_networks(self):
    method merge (line 193) | def merge(cls, predictors):
    method merge_in_place (line 230) | def merge_in_place(self, others):
    method canonicalize_allele_name (line 285) | def canonicalize_allele_name(self, raw_name):
    method supported_alleles (line 314) | def supported_alleles(self):
    method supported_peptide_lengths (line 330) | def supported_peptide_lengths(self):
    method check_consistency (line 350) | def check_consistency(self):
    method save (line 371) | def save(self, models_dir, model_names_to_write=None, write_metadata=T...
    method load (line 489) | def load(models_dir=None, max_models=None, optimization_level=None):
    method __repr__ (line 681) | def __repr__(self):
    method optimize (line 700) | def optimize(self, warn=True):
    method model_name (line 739) | def model_name(allele, num):
    method weights_path (line 761) | def weights_path(models_dir, model_name):
    method master_allele_encoding (line 777) | def master_allele_encoding(self):
    method fit_allele_specific_predictors (line 793) | def fit_allele_specific_predictors(
    method fit_class1_pan_allele_models (line 933) | def fit_class1_pan_allele_models(
    method add_pan_allele_model (line 1025) | def add_pan_allele_model(self, model, models_dir_for_save=None):
    method percentile_ranks (line 1052) | def percentile_ranks(self, affinities, allele=None, alleles=None, thro...
    method predict (line 1114) | def predict(
    method predict_to_dataframe (line 1164) | def predict_to_dataframe(
    method calibrate_percentile_ranks (line 1462) | def calibrate_percentile_ranks(
    method model_select (line 1614) | def model_select(

FILE: mhcflurry/class1_neural_network.py
  class Class1NeuralNetworkModel (line 44) | class Class1NeuralNetworkModel(nn.Module):
    method __init__ (line 49) | def __init__(
    method _initialize_weights (line 232) | def _initialize_weights(self, init):
    method forward (line 247) | def forward(self, inputs):
    method get_weights_list (line 342) | def get_weights_list(self):
    method set_weights_list (line 358) | def set_weights_list(self, weights, auto_convert_keras=True):
    method to_json (line 632) | def to_json(self):
  class Class1NeuralNetwork (line 686) | class Class1NeuralNetwork(object):
    method apply_hyperparameter_renames (line 796) | def apply_hyperparameter_renames(cls, hyperparameters):
    method __init__ (line 816) | def __init__(self, **hyperparameters):
    method clear_model_cache (line 836) | def clear_model_cache(klass):
    method borrow_cached_network (line 843) | def borrow_cached_network(klass, network_json, network_weights):
    method _parse_keras_json_config (line 880) | def _parse_keras_json_config(cls, config):
    method _create_model_from_config (line 1005) | def _create_model_from_config(cls, config, instance_hyperparameters=No...
    method _create_merged_model_from_config (line 1105) | def _create_merged_model_from_config(cls, config, instance_hyperparame...
    method model_cache_key (line 1164) | def model_cache_key(network_json):
    method keras_network_cache_key (line 1187) | def keras_network_cache_key(network_json):
    method network (line 1193) | def network(self, borrow=False):
    method update_network_description (line 1229) | def update_network_description(self):
    method get_config (line 1290) | def get_config(self):
    method from_config (line 1307) | def from_config(cls, config, weights=None, weights_loader=None):
    method load_weights (line 1347) | def load_weights(self):
    method get_weights (line 1355) | def get_weights(self):
    method get_weights_list (line 1368) | def get_weights_list(self):
    method set_weights_list (line 1379) | def set_weights_list(self, weights, auto_convert_keras=True):
    method __getstate__ (line 1404) | def __getstate__(self):
    method __setstate__ (line 1420) | def __setstate__(self, state):
    method peptides_to_network_input (line 1427) | def peptides_to_network_input(self, peptides):
    method supported_peptide_lengths (line 1448) | def supported_peptide_lengths(self):
    method allele_encoding_to_network_input (line 1463) | def allele_encoding_to_network_input(self, allele_encoding):
    method data_dependent_weights_initialization (line 1487) | def data_dependent_weights_initialization(network, x_dict=None, method...
    method _regularized_parameters (line 1510) | def _regularized_parameters(network):
    method _regularization_penalty (line 1524) | def _regularization_penalty(parameters, l1=0.0, l2=0.0):
    method get_device (line 1539) | def get_device(self):
    method fit_generator (line 1543) | def fit_generator(
    method _create_optimizer (line 1747) | def _create_optimizer(self, network):
    method fit (line 1768) | def fit(
    method predict (line 2171) | def predict(
    method merge (line 2254) | def merge(cls, models, merge_method="average"):
    method make_network (line 2294) | def make_network(
    method clear_allele_representations (line 2339) | def clear_allele_representations(self):
    method set_allele_representations (line 2356) | def set_allele_representations(self, allele_representations, force_sur...
    method _update_embedding (line 2383) | def _update_embedding(self, network, reshaped, force_surgery):
  class MergedClass1NeuralNetwork (line 2421) | class MergedClass1NeuralNetwork(nn.Module):
    method __init__ (line 2426) | def __init__(self, networks, merge_method="average"):
    method forward (line 2431) | def forward(self, inputs):
    method get_weights_list (line 2444) | def get_weights_list(self):
    method set_weights_list (line 2451) | def set_weights_list(self, weights, auto_convert_keras=False):

FILE: mhcflurry/class1_presentation_predictor.py
  class Class1PresentationPredictor (line 36) | class Class1PresentationPredictor(object):
    method __init__ (line 51) | def __init__(
    method supported_alleles (line 72) | def supported_alleles(self):
    method supported_peptide_lengths (line 79) | def supported_peptide_lengths(self):
    method supports_affinity_prediction (line 86) | def supports_affinity_prediction(self):
    method supports_processing_prediction (line 91) | def supports_processing_prediction(self):
    method supports_presentation_prediction (line 98) | def supports_presentation_prediction(self):
    method predict_affinity (line 105) | def predict_affinity(
    method predict_processing (line 264) | def predict_processing(
    method fit (line 323) | def fit(
    method get_model (line 398) | def get_model(self, name=None):
    method predict (line 425) | def predict(
    method predict_sequences (line 598) | def predict_sequences(
    method save (line 856) | def save(
    method load (line 935) | def load(cls, models_dir=None, max_models=None):
    method __repr__ (line 1014) | def __repr__(self):
    method percentile_ranks (line 1020) | def percentile_ranks(self, presentation_scores, throw=True):
    method calibrate_percentile_ranks (line 1044) | def calibrate_percentile_ranks(self, scores, bins=None):

FILE: mhcflurry/class1_processing_neural_network.py
  class Class1ProcessingModel (line 20) | class Class1ProcessingModel(nn.Module):
    method __init__ (line 25) | def __init__(
    method forward (line 117) | def forward(self, inputs):
    method _process_n_flank (line 170) | def _process_n_flank(self, conv_result, peptide_length):
    method _process_c_flank (line 204) | def _process_c_flank(self, conv_result, peptide_length):
    method _max_pool_over_peptide_n (line 237) | def _max_pool_over_peptide_n(self, x, peptide_length):
    method _max_pool_over_peptide_c (line 262) | def _max_pool_over_peptide_c(self, x, peptide_length):
    method _extract_c_cleavage (line 284) | def _extract_c_cleavage(self, x, peptide_length):
    method _extract_c_flank_avg (line 294) | def _extract_c_flank_avg(self, conv_result, peptide_length):
    method _extract_n_flank_avg (line 318) | def _extract_n_flank_avg(self, conv_result):
    method get_weights_list (line 336) | def get_weights_list(self):
    method set_weights_list (line 345) | def set_weights_list(self, weights, auto_convert_keras=True):
    method _reorder_keras_weights (line 399) | def _reorder_keras_weights(self, weights):
  class Class1ProcessingNeuralNetwork (line 452) | class Class1ProcessingNeuralNetwork(object):
    method __init__ (line 513) | def __init__(self, **hyperparameters):
    method sequence_lengths (line 523) | def sequence_lengths(self):
    method get_device (line 540) | def get_device(self):
    method network (line 544) | def network(self):
    method _regularized_parameters (line 568) | def _regularized_parameters(network):
    method _regularization_penalty (line 582) | def _regularization_penalty(parameters, l1=0.0, l2=0.0):
    method update_network_description (line 597) | def update_network_description(self):
    method fit (line 607) | def fit(
    method _create_optimizer (line 835) | def _create_optimizer(self, network):
    method predict (line 855) | def predict(
    method predict_encoded (line 893) | def predict_encoded(
    method network_input (line 947) | def network_input(self, sequences, throw=True):
    method make_network (line 978) | def make_network(
    method __getstate__ (line 1016) | def __getstate__(self):
    method __setstate__ (line 1030) | def __setstate__(self, state):
    method get_weights (line 1036) | def get_weights(self):
    method get_config (line 1048) | def get_config(self):
    method from_config (line 1063) | def from_config(cls, config, weights=None):

FILE: mhcflurry/class1_processing_predictor.py
  class Class1ProcessingPredictor (line 23) | class Class1ProcessingPredictor(object):
    method __init__ (line 29) | def __init__(
    method sequence_lengths (line 60) | def sequence_lengths(self):
    method add_models (line 84) | def add_models(self, models):
    method manifest_df (line 122) | def manifest_df(self):
    method model_name (line 145) | def model_name(num):
    method weights_path (line 161) | def weights_path(models_dir, model_name):
    method predict (line 176) | def predict(
    method predict_to_dataframe (line 215) | def predict_to_dataframe(
    method predict_to_dataframe_encoded (line 245) | def predict_to_dataframe_encoded(
    method check_consistency (line 280) | def check_consistency(self):
    method save (line 294) | def save(self, models_dir, model_names_to_write=None, write_metadata=T...
    method load (line 361) | def load(cls, models_dir=None, max_models=None):
    method __repr__ (line 414) | def __repr__(self):

FILE: mhcflurry/cluster_parallelism.py
  function add_cluster_parallelism_args (line 25) | def add_cluster_parallelism_args(parser):
  function cluster_results_from_args (line 62) | def cluster_results_from_args(
  function cluster_results (line 106) | def cluster_results(
  function worker_entry_point (line 380) | def worker_entry_point(argv=sys.argv[1:]):

FILE: mhcflurry/common.py
  function normalize_allele_name (line 16) | def normalize_allele_name(
  function normalize_pytorch_backend (line 98) | def normalize_pytorch_backend(backend):
  function configure_pytorch (line 123) | def configure_pytorch(backend=None, gpu_device_nums=None, num_threads=No...
  function configure_tensorflow (line 154) | def configure_tensorflow(backend=None, gpu_device_nums=None, num_threads...
  function get_pytorch_device (line 190) | def get_pytorch_device():
  function configure_logging (line 226) | def configure_logging(verbose=False):
  function amino_acid_distribution (line 246) | def amino_acid_distribution(peptides, smoothing=0.0):
  function random_peptides (line 270) | def random_peptides(num, length=9, distribution=None):
  function positional_frequency_matrix (line 306) | def positional_frequency_matrix(peptides):
  function save_weights (line 333) | def save_weights(weights_list, filename):
  function load_weights (line 348) | def load_weights(filename):
  class NumpyJSONEncoder (line 366) | class NumpyJSONEncoder(json.JSONEncoder):
    method default (line 371) | def default(self, obj):

FILE: mhcflurry/custom_loss.py
  function get_loss (line 23) | def get_loss(name):
  class Loss (line 49) | class Loss(object):
    method __init__ (line 61) | def __init__(self, name=None):
    method __str__ (line 64) | def __str__(self):
    method loss (line 67) | def loss(self, y_true, y_pred):
    method get_keras_loss (line 70) | def get_keras_loss(self, reduction="sum_over_batch_size"):
  class StandardKerasLoss (line 83) | class StandardKerasLoss(Loss):
    method __init__ (line 90) | def __init__(self, loss_name="mse"):
    method encode_y (line 96) | def encode_y(y):
  class TransformPredictionsLossWrapper (line 100) | class TransformPredictionsLossWrapper(Loss):
    method __init__ (line 107) | def __init__(
    method encode_y (line 117) | def encode_y(self, *args, **kwargs):
    method loss (line 120) | def loss(self, y_true, y_pred):
  class MSEWithInequalities (line 125) | class MSEWithInequalities(Loss):
    method __init__ (line 154) | def __init__(self):
    method encode_y (line 158) | def encode_y(y, inequalities=None):
    method _max_value (line 162) | def _max_value(values):
    method loss (line 167) | def loss(self, y_true, y_pred):
  class MSEWithInequalitiesAndMultipleOutputs (line 175) | class MSEWithInequalitiesAndMultipleOutputs(Loss):
    method __init__ (line 200) | def __init__(self):
    method encode_y (line 204) | def encode_y(y, inequalities=None, output_indices=None):
    method loss (line 209) | def loss(self, y_true, y_pred):
  class MultiallelicMassSpecLoss (line 226) | class MultiallelicMassSpecLoss(Loss):
    method __init__ (line 234) | def __init__(self, delta=0.2, multiplier=1.0):
    method encode_y (line 240) | def encode_y(y):
    method loss (line 243) | def loss(self, y_true, y_pred):
  function check_shape (line 251) | def check_shape(name, arr, expected_shape):

FILE: mhcflurry/data_dependent_weights_initialization.py
  function svd_orthonormal (line 46) | def svd_orthonormal(shape):
  function get_activations_pytorch (line 72) | def get_activations_pytorch(model, layer_name, x_dict, device=None):
  function get_activations (line 133) | def get_activations(model, layer, X_batch):
  function lsuv_init (line 152) | def lsuv_init(model, batch, verbose=True, margin=0.1, max_iter=100):

FILE: mhcflurry/downloads.py
  function get_downloads_dir (line 34) | def get_downloads_dir():
  function get_current_release (line 41) | def get_current_release():
  function get_downloads_metadata (line 48) | def get_downloads_metadata():
  function get_default_class1_models_dir (line 60) | def get_default_class1_models_dir(test_exists=True):
  function get_default_class1_presentation_models_dir (line 91) | def get_default_class1_presentation_models_dir(test_exists=True):
  function get_default_class1_processing_models_dir (line 123) | def get_default_class1_processing_models_dir(test_exists=True):
  function get_current_release_downloads (line 157) | def get_current_release_downloads():
  function get_path (line 198) | def get_path(download_name, filename='', test_exists=True):
  function configure (line 228) | def configure():

FILE: mhcflurry/downloads_command.py
  function run (line 105) | def run(argv=sys.argv[1:]):
  function mkdir_p (line 122) | def mkdir_p(path):
  function yes_no (line 138) | def yes_no(boolean):
  class TqdmUpTo (line 143) | class TqdmUpTo(tqdm):
    method update_to (line 145) | def update_to(self, b=1, bsize=1, tsize=None):
  function fetch_subcommand (line 159) | def fetch_subcommand(args):
  function info_subcommand (line 273) | def info_subcommand(args):
  function path_subcommand (line 324) | def path_subcommand(args):
  function url_subcommand (line 331) | def url_subcommand(args):

FILE: mhcflurry/encodable_sequences.py
  class EncodingError (line 12) | class EncodingError(ValueError):
    method __init__ (line 16) | def __init__(self, message, supported_peptide_lengths):
  class EncodableSequences (line 23) | class EncodableSequences(object):
    method create (line 35) | def create(klass, sequences):
    method __init__ (line 45) | def __init__(self, sequences):
    method __len__ (line 60) | def __len__(self):
    method variable_length_to_fixed_length_categorical (line 63) | def variable_length_to_fixed_length_categorical(
    method variable_length_to_fixed_length_vector_encoding (line 111) | def variable_length_to_fixed_length_vector_encoding(
    method sequences_to_fixed_length_index_encoded_array (line 187) | def sequences_to_fixed_length_index_encoded_array(

FILE: mhcflurry/ensemble_centrality.py
  function _nanmean_no_warnings (line 10) | def _nanmean_no_warnings(log_values):
  function _nanmedian_no_warnings (line 22) | def _nanmedian_no_warnings(log_values):
  function robust_mean (line 33) | def robust_mean(log_values):

FILE: mhcflurry/fasta.py
  function read_fasta_to_dataframe (line 15) | def read_fasta_to_dataframe(filename, full_descriptions=False):
  class FastaParser (line 37) | class FastaParser(object):
    method __init__ (line 41) | def __init__(self):
    method iterate_over_file (line 45) | def iterate_over_file(self, fasta_path, full_descriptions=False):
    method _current_entry (line 85) | def _current_entry(self):
    method open_file (line 96) | def open_file(fasta_path):
    method _parse_header_id (line 106) | def _parse_header_id(line, full_description=False):

FILE: mhcflurry/flanking_encoding.py
  class FlankingEncoding (line 18) | class FlankingEncoding(object):
    method __init__ (line 31) | def __init__(self, peptides, n_flanks, c_flanks):
    method __len__ (line 51) | def __len__(self):
    method vector_encode (line 57) | def vector_encode(
    method encode (line 115) | def encode(

FILE: mhcflurry/hyperparameters.py
  class HyperparameterDefaults (line 7) | class HyperparameterDefaults(object):
    method __init__ (line 16) | def __init__(self, **defaults):
    method extend (line 19) | def extend(self, other):
    method with_defaults (line 36) | def with_defaults(self, obj):
    method subselect (line 49) | def subselect(self, obj):
    method check_valid_keys (line 59) | def check_valid_keys(self, obj):
    method models_grid (line 72) | def models_grid(self, **kwargs):

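`HyperparameterDefaults.models_grid` above expands keyword arguments whose values are lists into one hyperparameter dict per combination. A minimal sketch of that expansion as a free function; the class listed above additionally tracks defaults and key validation (`check_valid_keys`), which this sketch omits.

```python
from itertools import product

def models_grid(**kwargs):
    """Expand list-valued keyword args into one dict per combination."""
    keys = sorted(kwargs)
    return [
        dict(zip(keys, values))
        for values in product(*(kwargs[key] for key in keys))
    ]

grid = models_grid(layer_sizes=[[16], [64]], dropout=[0.0, 0.5])
print(len(grid))  # 2 x 2 = 4 combinations
```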
FILE: mhcflurry/local_parallelism.py
  function add_local_parallelism_args (line 22) | def add_local_parallelism_args(parser):
  function worker_pool_with_gpu_assignments_from_args (line 76) | def worker_pool_with_gpu_assignments_from_args(args):
  function worker_pool_with_gpu_assignments (line 101) | def worker_pool_with_gpu_assignments(
  function validate_worker_pool_args (line 165) | def validate_worker_pool_args(
  function worker_init_kwargs_for_scheduler (line 193) | def worker_init_kwargs_for_scheduler(
  function make_worker_pool (line 233) | def make_worker_pool(
  function worker_init_entry_point (line 309) | def worker_init_entry_point(
  function worker_init (line 329) | def worker_init(
  class WrapException (line 350) | class WrapException(Exception):
    method __init__ (line 355) | def __init__(self):
    method __str__ (line 359) | def __str__(self):
  function call_wrapped (line 363) | def call_wrapped(function, *args, **kwargs):
  function call_wrapped_kwargs (line 384) | def call_wrapped_kwargs(function, kwargs):

FILE: mhcflurry/percent_rank_transform.py
  class PercentRankTransform (line 8) | class PercentRankTransform(object):
    method __init__ (line 13) | def __init__(self):
    method fit (line 17) | def fit(self, values, bins):
    method transform (line 39) | def transform(self, values):
    method to_series (line 54) | def to_series(self):
    method from_series (line 69) | def from_series(series):

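`PercentRankTransform` above maps raw predictions to percentile ranks against a fitted reference distribution. The listed class fits a binned histogram (`fit(values, bins)`) and supports serialization via `to_series`/`from_series`; the following is a simplified ECDF-based sketch of the core idea, not the repository's implementation.

```python
import numpy as np

class PercentRankTransform:
    """Sketch: fit a reference score distribution, then map new scores
    to 0-100 percentile ranks via the empirical CDF."""

    def fit(self, values):
        self.sorted_values = np.sort(np.asarray(values, dtype=float))

    def transform(self, values):
        # Rank = fraction of fitted scores <= each value, as a percentage.
        positions = np.searchsorted(self.sorted_values, values, side="right")
        return 100.0 * positions / len(self.sorted_values)

t = PercentRankTransform()
t.fit(np.arange(1000))
print(t.transform([0, 500, 999]))  # roughly [0.1, 50.1, 100.0]
```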
FILE: mhcflurry/predict_command.py
  function run (line 176) | def run(argv=sys.argv[1:]):

FILE: mhcflurry/predict_scan_command.py
  function parse_peptide_lengths (line 185) | def parse_peptide_lengths(value):
  function run (line 202) | def run(argv=sys.argv[1:]):

FILE: mhcflurry/pytorch_layers.py
  function get_activation (line 9) | def get_activation(name):
  class LocallyConnected1D (line 36) | class LocallyConnected1D(nn.Module):
    method __init__ (line 57) | def __init__(self, in_channels, out_channels, input_length, kernel_size,
    method forward (line 82) | def forward(self, x):

FILE: mhcflurry/pytorch_losses.py
  class MSEWithInequalities (line 14) | class MSEWithInequalities(nn.Module):
    method encode_y (line 27) | def encode_y(y, inequalities=None):
    method forward (line 61) | def forward(self, y_pred, y_true, sample_weights=None):
  class MSEWithInequalitiesAndMultipleOutputs (line 108) | class MSEWithInequalitiesAndMultipleOutputs(nn.Module):
    method encode_y (line 119) | def encode_y(y, inequalities=None, output_indices=None):
    method forward (line 149) | def forward(self, y_pred, y_true, sample_weights=None):
  class MultiallelicMassSpecLoss (line 208) | class MultiallelicMassSpecLoss(nn.Module):
    method __init__ (line 223) | def __init__(self, delta=0.2, multiplier=1.0):
    method encode_y (line 229) | def encode_y(y):
    method forward (line 235) | def forward(self, y_pred, y_true, sample_weights=None):
  class StandardLoss (line 288) | class StandardLoss(nn.Module):
    method __init__ (line 295) | def __init__(self, loss_name="mse"):
    method encode_y (line 306) | def encode_y(y):
    method forward (line 310) | def forward(self, y_pred, y_true, sample_weights=None):
  function get_pytorch_loss (line 350) | def get_pytorch_loss(name):

FILE: mhcflurry/random_negative_peptides.py
  class RandomNegativePeptides (line 11) | class RandomNegativePeptides(object):
    method __init__ (line 50) | def __init__(self, **hyperparameters):
    method plan (line 56) | def plan(self, peptides, affinities, alleles=None, inequalities=None):
    method plan_by_length (line 134) | def plan_by_length(self, df_all, df_binders=None, df_nonbinders=None):
    method plan_by_allele (line 165) | def plan_by_allele(self, df_all, df_binders=None, df_nonbinders=None):
    method plan_by_allele_equalize_nonbinders (line 196) | def plan_by_allele_equalize_nonbinders(
    method get_alleles (line 245) | def get_alleles(self):
    method get_peptides (line 262) | def get_peptides(self):
    method get_total_count (line 284) | def get_total_count(self):

FILE: mhcflurry/regression_target.py
  function from_ic50 (line 4) | def from_ic50(ic50, max_ic50=50000.0):
  function to_ic50 (line 23) | def to_ic50(x, max_ic50=50000.0):

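`from_ic50` and `to_ic50` above convert between IC50 affinities (in nM) and the 0-1 regression target the networks train on. A sketch assuming mhcflurry's documented transform, target = 1 - log(ic50)/log(max_ic50) clipped to [0, 1], with the `max_ic50=50000.0` default taken from the signatures above; not the repository's exact code.

```python
import math

def from_ic50(ic50, max_ic50=50000.0):
    """Map an IC50 affinity (nM) to a 0-1 regression target."""
    x = 1.0 - math.log(ic50) / math.log(max_ic50)
    return min(max(x, 0.0), 1.0)  # clip to [0, 1]

def to_ic50(x, max_ic50=50000.0):
    """Invert the transform: map a 0-1 target back to IC50 (nM)."""
    return max_ic50 ** (1.0 - x)

print(from_ic50(50000.0))  # weakest binder -> 0.0
print(round(to_ic50(from_ic50(100.0)), 6))  # round-trip -> 100.0
```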
FILE: mhcflurry/scoring.py
  function make_scores (line 12) | def make_scores(

FILE: mhcflurry/select_allele_specific_models_command.py
  function run (line 185) | def run(argv=sys.argv[1:]):
  class ScrambledPredictor (line 411) | class ScrambledPredictor(object):
    method __init__ (line 412) | def __init__(self, predictor):
    method predict (line 417) | def predict(self, peptides, allele):
  function model_select (line 426) | def model_select(allele, constant_data=GLOBAL_DATA):
  function cache_encoding (line 490) | def cache_encoding(predictor, peptides):
  class ScoreFunction (line 497) | class ScoreFunction(object):
    method __init__ (line 502) | def __init__(self, function, summary=None):
    method __call__ (line 506) | def __call__(self, *args, **kwargs):
  class CombinedModelSelector (line 510) | class CombinedModelSelector(object):
    method __init__ (line 514) | def __init__(self, model_selectors, weights=None, min_contribution_per...
    method usable_for_allele (line 521) | def usable_for_allele(self, allele):
    method plan_summary (line 526) | def plan_summary(self, allele):
    method score_function (line 529) | def score_function(self, allele, dry_run=False):
  class ConsensusModelSelector (line 580) | class ConsensusModelSelector(object):
    method __init__ (line 585) | def __init__(
    method usable_for_allele (line 602) | def usable_for_allele(self, allele):
    method max_absolute_value (line 605) | def max_absolute_value(self, allele):
    method plan_summary (line 608) | def plan_summary(self, allele):
    method score_function (line 611) | def score_function(self, allele):
  class MSEModelSelector (line 630) | class MSEModelSelector(object):
    method __init__ (line 635) | def __init__(
    method usable_for_allele (line 647) | def usable_for_allele(self, allele):
    method max_absolute_value (line 650) | def max_absolute_value(self, allele):
    method plan_summary (line 656) | def plan_summary(self, allele):
    method score_function (line 659) | def score_function(self, allele):
  class MassSpecModelSelector (line 711) | class MassSpecModelSelector(object):
    method __init__ (line 716) | def __init__(
    method ppv (line 755) | def ppv(y_true, predictions):
    method usable_for_allele (line 761) | def usable_for_allele(self, allele):
    method max_absolute_value (line 765) | def max_absolute_value(self, allele):
    method plan_summary (line 771) | def plan_summary(self, allele):
    method score_function (line 774) | def score_function(self, allele):

FILE: mhcflurry/select_pan_allele_models_command.py
  function mse (line 93) | def mse(
  function run (line 134) | def run(argv=sys.argv[1:]):
  function do_model_select_task (line 296) | def do_model_select_task(item, constant_data=GLOBAL_DATA):
  function model_select (line 302) | def model_select(

FILE: mhcflurry/select_processing_models_command.py
  function run (line 87) | def run(argv=sys.argv[1:]):
  function do_model_select_task (line 229) | def do_model_select_task(item, constant_data=GLOBAL_DATA):
  function model_select (line 233) | def model_select(

FILE: mhcflurry/testing_utils.py
  function startup (line 8) | def startup():
  function cleanup (line 15) | def cleanup():

FILE: mhcflurry/train_allele_specific_models_command.py
  function run (line 131) | def run(argv=sys.argv[1:]):
  function alleles_by_similarity (line 331) | def alleles_by_similarity(allele):
  function train_model (line 345) | def train_model(
  function subselect_df_held_out (line 418) | def subselect_df_held_out(df, recriprocal_held_out_fraction=10, seed=0):

FILE: mhcflurry/train_pan_allele_models_command.py
  function assign_folds (line 134) | def assign_folds(df, num_folds, held_out_fraction, held_out_max):
  function pretrain_data_iterator (line 210) | def pretrain_data_iterator(
  function run (line 263) | def run(argv=sys.argv[1:]):
  function main (line 282) | def main(args):
  function initialize_training (line 296) | def initialize_training(args):
  function train_models (line 424) | def train_models(args):
  function train_model (line 537) | def train_model(

FILE: mhcflurry/train_presentation_models_command.py
  function run (line 66) | def run(argv=sys.argv[1:]):
  function main (line 85) | def main(args):

FILE: mhcflurry/train_processing_models_command.py
  function assign_folds (line 113) | def assign_folds(df, num_folds, held_out_samples):
  function run (line 153) | def run(argv=sys.argv[1:]):
  function main (line 172) | def main(args):
  function initialize_training (line 186) | def initialize_training(args):
  function train_models (line 270) | def train_models(args):
  function train_model (line 378) | def train_model(

FILE: scripts/compare_tf_pytorch_random_outputs.py
  function _json_default (line 94) | def _json_default(value):
  function _self_cmd (line 102) | def _self_cmd(*args: str) -> list[str]:
  function _run_subprocess_json (line 106) | def _run_subprocess_json(cmd: list[str]) -> dict:
  function _run_subprocess (line 115) | def _run_subprocess(cmd: list[str]) -> None:
  function _append_if_set (line 119) | def _append_if_set(cmd: list[str], flag: str, value: str | None) -> None:
  function _repo_root_default (line 124) | def _repo_root_default() -> Path:
  function _random_sequences (line 128) | def _random_sequences(rng: np.random.Generator, lengths: np.ndarray) -> ...
  function _generate_dataset (line 139) | def _generate_dataset(
  function _apply_allele_panel (line 171) | def _apply_allele_panel(
  function cmd_backend_metadata (line 194) | def cmd_backend_metadata(args: argparse.Namespace) -> None:
  function cmd_predict_backend (line 261) | def cmd_predict_backend(args: argparse.Namespace) -> None:
  function _numeric_stats (line 369) | def _numeric_stats(
  function cmd_analyze (line 433) | def cmd_analyze(args: argparse.Namespace) -> None:
  function cmd_run (line 533) | def cmd_run(args: argparse.Namespace) -> None:
  function _add_common_model_dir_args (line 701) | def _add_common_model_dir_args(parser: argparse.ArgumentParser) -> None:
  function build_parser (line 708) | def build_parser() -> argparse.ArgumentParser:
  function main (line 780) | def main() -> None:

FILE: scripts/cross_allele_parity_analysis.py
  function _repo_root_default (line 85) | def _repo_root_default() -> Path:
  function _compare_script_path (line 89) | def _compare_script_path() -> Path:
  function _self_cmd (line 93) | def _self_cmd(*args: str) -> list[str]:
  function _run_subprocess_json (line 97) | def _run_subprocess_json(cmd: list[str]) -> dict:
  function _run_subprocess (line 105) | def _run_subprocess(cmd: list[str]) -> None:
  function _append_if_set (line 109) | def _append_if_set(cmd: list[str], flag: str, value: str | None) -> None:
  function _select_alleles (line 114) | def _select_alleles(
  function _lengths_to_sample (line 149) | def _lengths_to_sample(
  function _random_sequences (line 166) | def _random_sequences(
  function _generate_uniform_peptides (line 189) | def _generate_uniform_peptides(
  function _cross_join_dataset (line 239) | def _cross_join_dataset(peptides_df: pd.DataFrame, alleles: list[str]) -...
  function _pre_run_sanity_checks (line 271) | def _pre_run_sanity_checks(peptides_df: pd.DataFrame, dataset: pd.DataFr...
  function _enforce_presentation_score_requirements (line 290) | def _enforce_presentation_score_requirements(
  function _numeric_output_columns (line 324) | def _numeric_output_columns(df: pd.DataFrame) -> list[str]:
  function _make_diff_frame (line 328) | def _make_diff_frame(merged: pd.DataFrame, numeric_columns: list[str]) -...
  function _per_output_summary (line 340) | def _per_output_summary(diff_df: pd.DataFrame, numeric_columns: list[str...
  function _break_thresholds_for_output (line 360) | def _break_thresholds_for_output(output: str) -> float:
  function _break_analysis (line 368) | def _break_analysis(
  function _plot_output_ranges (line 400) | def _plot_output_ranges(summary_df: pd.DataFrame, out_path: Path) -> None:
  function _plot_row_max_hist (line 437) | def _plot_row_max_hist(row_summary: pd.DataFrame, out_path: Path) -> None:
  function _plot_length_breakdown (line 459) | def _plot_length_breakdown(
  function _safe_plot_name (line 490) | def _safe_plot_name(name: str) -> str:
  function _plot_per_output_hists (line 494) | def _plot_per_output_hists(
  function _write_break_report (line 542) | def _write_break_report(
  function build_parser (line 590) | def build_parser() -> argparse.ArgumentParser:
  function main (line 620) | def main() -> None:

FILE: scripts/extract_high_presentation_fixture.py
  function _collect_model_metadata (line 33) | def _collect_model_metadata() -> dict:
  function build_parser (line 58) | def build_parser() -> argparse.ArgumentParser:
  function main (line 76) | def main() -> None:

FILE: scripts/generate_fixture_error_report.py
  class MetricReport (line 52) | class MetricReport:
    method summary (line 64) | def summary(self) -> dict:
  function build_parser (line 81) | def build_parser() -> argparse.ArgumentParser:
  function _format_number (line 91) | def _format_number(value: float, digits: int = 6) -> str:
  function _format_percent (line 102) | def _format_percent(value: float) -> str:
  function _as_float_array (line 108) | def _as_float_array(values: Iterable[float]) -> np.ndarray:
  function _clip_positive (line 112) | def _clip_positive(values: np.ndarray) -> np.ndarray:
  function _make_error_frame (line 118) | def _make_error_frame(
  function _load_affinity_fixture (line 137) | def _load_affinity_fixture() -> dict:
  function _load_presentation_fixture (line 142) | def _load_presentation_fixture() -> tuple[pd.DataFrame, dict]:
  function _predict_current_outputs (line 149) | def _predict_current_outputs() -> tuple[dict, pd.DataFrame, dict]:
  function _compute_metric_reports (line 290) | def _compute_metric_reports(
  function _axis_range (line 401) | def _axis_range(values: np.ndarray) -> tuple[float, float]:
  function _tick_values (line 411) | def _tick_values(start: float, stop: float, count: int = 5) -> list[float]:
  function _format_tick (line 417) | def _format_tick(value: float, log_scale: bool = False) -> str:
  function _render_scatter_svg (line 423) | def _render_scatter_svg(
  function _render_histogram_svg (line 500) | def _render_histogram_svg(
  function _render_summary_table (line 582) | def _render_summary_table(reports: list[MetricReport]) -> str:
  function _render_top_error_table (line 611) | def _render_top_error_table(report: MetricReport, limit: int = 10) -> str:
  function _render_metric_section (line 641) | def _render_metric_section(report: MetricReport) -> str:
  function _write_outputs (line 691) | def _write_outputs(
  function main (line 925) | def main() -> None:

FILE: scripts/modal_train_mhcflurry.py
  function _install_repo (line 39) | def _install_repo():
  function run_training_job (line 85) | def run_training_job(job: dict) -> dict:
  function main (line 131) | def main(command_template: str, workers: int = 1):

FILE: scripts/plot_fixture_diffs.py
  function load_fixture (line 40) | def load_fixture():
  function generate_predictions (line 47) | def generate_predictions(fixture_df):
  function plot_output (line 136) | def plot_output(col, tf_vals, pt_vals, out_dir):
  function main (line 232) | def main():

FILE: scripts/validate_allele_sequences.py
  function load_raw_csv (line 27) | def load_raw_csv(path):
  function renormalize (line 32) | def renormalize(raw_mapping):
  function main (line 53) | def main():

FILE: test/__init__.py
  function data_path (line 9) | def data_path(name):
  function initialize (line 17) | def initialize():

FILE: test/conftest.py
  function pytest_configure (line 9) | def pytest_configure(config):

FILE: test/expensive_verify_pretrain_optimizable.py
  function verify_optimizable (line 63) | def verify_optimizable():

FILE: test/pytest_helpers.py
  function mhcflurry_cli (line 20) | def mhcflurry_cli(command):
  function eq_ (line 32) | def eq_(a, b, msg=None):
  function assert_less (line 40) | def assert_less(a, b, msg=None):
  function assert_greater (line 48) | def assert_greater(a, b, msg=None):
  function assert_almost_equal (line 56) | def assert_almost_equal(a, b, places=7, msg=None):
  function assert_raises (line 68) | def assert_raises(exc_class, func=None, *args, **kwargs):
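The helpers above are nose-style assertion shims kept for pytest-based tests. A minimal sketch of how such shims are commonly written, matching the signatures listed (the actual bodies in test/pytest_helpers.py may differ):

```python
# Hypothetical nose-compat assertion shims; signatures follow the index above.
def eq_(a, b, msg=None):
    assert a == b, msg or f"{a!r} != {b!r}"

def assert_less(a, b, msg=None):
    assert a < b, msg or f"{a!r} is not less than {b!r}"

def assert_greater(a, b, msg=None):
    assert a > b, msg or f"{a!r} is not greater than {b!r}"

def assert_almost_equal(a, b, places=7, msg=None):
    # Mirrors unittest's assertAlmostEqual rounding semantics.
    assert round(abs(a - b), places) == 0, msg or (
        f"{a!r} != {b!r} within {places} places")
```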

FILE: test/test_allele_encoding.py
  function test_allele_encoding_speed (line 9) | def test_allele_encoding_speed():

FILE: test/test_amino_acid.py
  function test_index_and_one_hot_encoding (line 15) | def test_index_and_one_hot_encoding():
  function test_index_encoding_no_downcast_futurewarning (line 54) | def test_index_encoding_no_downcast_futurewarning():

FILE: test/test_api_compat_shims.py
  function test_legacy_configure_tensorflow_entry_point (line 11) | def test_legacy_configure_tensorflow_entry_point():
  function test_legacy_worker_init_signature_kept (line 16) | def test_legacy_worker_init_signature_kept():
  function test_worker_init_preserves_empty_gpu_assignment (line 21) | def test_worker_init_preserves_empty_gpu_assignment(monkeypatch):
  function test_legacy_cache_key_alias (line 37) | def test_legacy_cache_key_alias():
  function test_legacy_get_keras_loss_accessor (line 48) | def test_legacy_get_keras_loss_accessor():
  function test_legacy_get_activations_symbol_kept (line 56) | def test_legacy_get_activations_symbol_kept():

FILE: test/test_calibrate_percentile_ranks_command.py
  function setup_module (line 25) | def setup_module():
  function run_and_check (line 31) | def run_and_check(n_jobs=0, delete=True, additional_args=[]):
  function test_run_serial (line 71) | def test_run_serial():
  function test_run_parallel (line 75) | def test_run_parallel():
  function test_run_cluster_parallelism (line 79) | def test_run_cluster_parallelism(delete=True):

FILE: test/test_changing_allele_representations.py
  function setup_module (line 12) | def setup_module():
  function test_changing_allele_representations (line 60) | def test_changing_allele_representations():

FILE: test/test_class1_affinity_predictor.py
  function setup_teardown (line 25) | def setup_teardown():
  function warn_with_traceback (line 40) | def warn_with_traceback(message, category, filename, lineno, file=None, ...
  function predict_and_check (line 49) | def predict_and_check(
  function test_a1_known_epitopes_in_newly_trained_model (line 69) | def test_a1_known_epitopes_in_newly_trained_model():
  function test_class1_affinity_predictor_a0205_memorize_training_data (line 136) | def test_class1_affinity_predictor_a0205_memorize_training_data():
  function test_no_nans (line 212) | def test_no_nans():
  function test_predict_implementations_equivalent (line 220) | def test_predict_implementations_equivalent():
  function test_no_runtime_warnings_for_unsupported_rows (line 248) | def test_no_runtime_warnings_for_unsupported_rows():

FILE: test/test_class1_neural_network.py
  function setup_teardown (line 20) | def setup_teardown():
  function test_class1_neural_network_a0205_training_accuracy (line 28) | def test_class1_neural_network_a0205_training_accuracy():
  function test_inequalities (line 84) | def test_inequalities():
  function test_basic_training (line 166) | def test_basic_training():
  function test_serialization (line 194) | def test_serialization():
  function test_different_peptide_lengths (line 228) | def test_different_peptide_lengths():
  function test_early_stopping (line 253) | def test_early_stopping():
  function test_batch_normalization (line 276) | def test_batch_normalization():
  function test_dropout (line 296) | def test_dropout():
  function test_multiple_outputs (line 316) | def test_multiple_outputs():

FILE: test/test_class1_pan.py
  function setup_module (line 18) | def setup_module():
  function test_train_simple (line 94) | def test_train_simple():

FILE: test/test_class1_presentation_predictor.py
  function setup_module (line 23) | def setup_module():
  function teardown_module (line 42) | def teardown_module():
  function predictors (line 55) | def predictors():
  function test_basic (line 64) | def test_basic(predictors):
  function test_downloaded_predictor_small (line 166) | def test_downloaded_predictor_small(predictors):
  function test_downloaded_predictor (line 246) | def test_downloaded_predictor(predictors):
  function test_downloaded_predictor_invalid_peptides (line 429) | def test_downloaded_predictor_invalid_peptides(predictors):

FILE: test/test_class1_processing_neural_network.py
  function setup_teardown (line 22) | def setup_teardown():
  function decode_matrix (line 35) | def decode_matrix(array):
  function test_neural_network_input (line 61) | def test_neural_network_input():
  function test_small (line 138) | def test_small():
  function test_more (line 144) | def test_more():
  function test_basic_indexing (line 156) | def test_basic_indexing(num=10000, do_assertions=True, **hyperparameters):
  function train_basic_network (line 177) | def train_basic_network(num, do_assertions=True, is_hit=None, **hyperpar...
  function test_serialization (line 250) | def test_serialization():
  function test_different_peptide_lengths (line 284) | def test_different_peptide_lengths():
  function test_empty_flanks (line 312) | def test_empty_flanks():
  function test_prediction_range (line 335) | def test_prediction_range():

FILE: test/test_class1_processing_predictor.py
  function setup (line 17) | def setup():
  function teardown (line 21) | def teardown():
  function test_basic (line 25) | def test_basic():

FILE: test/test_custom_loss.py
  function setup_module (line 15) | def setup_module():
  function setup_teardown (line 21) | def setup_teardown():
  function evaluate_loss (line 28) | def evaluate_loss(loss_obj, y_true, y_pred):
  function test_mse_with_inequalities (line 47) | def test_mse_with_inequalities(loss_obj=None):
  function test_mse_with_inequalities_and_multiple_outputs (line 93) | def test_mse_with_inequalities_and_multiple_outputs():
  function test_multiallelic_mass_spec_loss (line 155) | def test_multiallelic_mass_spec_loss():
  function test_encode_y_basic (line 226) | def test_encode_y_basic():
  function test_loss_gradient_flow (line 244) | def test_loss_gradient_flow():
  function test_inequality_gradient_respects_constraint (line 262) | def test_inequality_gradient_respects_constraint():

FILE: test/test_doctest.py
  function setup_module (line 20) | def setup_module():
  function test_doctests (line 26) | def test_doctests():

FILE: test/test_download_models_class1.py
  function setup_module (line 17) | def setup_module():
  function teardown_module (line 23) | def teardown_module():
  function downloaded_predictor (line 30) | def downloaded_predictor():
  function predict_and_check (line 34) | def predict_and_check(
  function test_a1_titin_epitope_downloaded_models (line 51) | def test_a1_titin_epitope_downloaded_models(downloaded_predictor):
  function test_a1_mage_epitope_downloaded_models (line 59) | def test_a1_mage_epitope_downloaded_models(downloaded_predictor):
  function test_a2_hiv_epitope_downloaded_models (line 67) | def test_a2_hiv_epitope_downloaded_models(downloaded_predictor):
  function test_caching (line 73) | def test_caching(downloaded_predictor):
  function test_downloaded_predictor_is_serializable (line 84) | def test_downloaded_predictor_is_serializable(downloaded_predictor):
  function test_downloaded_predictor_is_savable (line 99) | def test_downloaded_predictor_is_savable(downloaded_predictor):
  function test_downloaded_predictor_gives_percentile_ranks (line 115) | def test_downloaded_predictor_gives_percentile_ranks(downloaded_predictor):

FILE: test/test_ensemble_centrality.py
  function test_robust_mean (line 10) | def test_robust_mean():
  function test_no_runtime_warnings_for_all_nan_rows (line 33) | def test_no_runtime_warnings_for_all_nan_rows():

FILE: test/test_hyperparameters.py
  function test_all_combinations_of_hyperparameters (line 6) | def test_all_combinations_of_hyperparameters():

FILE: test/test_local_parallelism.py
  function test_worker_init_kwargs_round_robin_across_gpus (line 12) | def test_worker_init_kwargs_round_robin_across_gpus():
  function test_worker_init_kwargs_without_gpu_scheduling_uses_backend (line 27) | def test_worker_init_kwargs_without_gpu_scheduling_uses_backend():
  function test_worker_init_kwargs_normalizes_default_backend_alias (line 40) | def test_worker_init_kwargs_normalizes_default_backend_alias():
  function test_worker_init_kwargs_with_gpus_normalizes_default_backend_alias (line 52) | def test_worker_init_kwargs_with_gpus_normalizes_default_backend_alias():
  function test_backend_default_alias_parses (line 65) | def test_backend_default_alias_parses():
  function test_validate_worker_pool_args_requires_parallelism_for_gpus (line 72) | def test_validate_worker_pool_args_requires_parallelism_for_gpus():
  function test_validate_worker_pool_args_rejects_non_cuda_backends_for_gpus (line 82) | def test_validate_worker_pool_args_rejects_non_cuda_backends_for_gpus():
  function test_validate_worker_pool_args_rejects_invalid_backend (line 92) | def test_validate_worker_pool_args_rejects_invalid_backend():

FILE: test/test_master_compat_predictions.py
  function setup_module (line 34) | def setup_module():
  function teardown_module (line 38) | def teardown_module():
  function _load_model_and_expected (line 42) | def _load_model_and_expected(name):
  function _predict (line 52) | def _predict(config, weights, expected, backend=None):
  function test_predictions_match_expected (line 72) | def test_predictions_match_expected(model_name):
  function test_mps_matches_cpu (line 85) | def test_mps_matches_cpu(model_name):

FILE: test/test_multi_output.py
  function setup_module (line 13) | def setup_module():
  function test_multi_output (line 19) | def test_multi_output(setup_module):

FILE: test/test_network_merging.py
  function setup_module (line 14) | def setup_module():
  function teardown_module (line 22) | def teardown_module():
  function predictors (line 29) | def predictors():
  function test_merge (line 33) | def test_merge(predictors):

FILE: test/test_percent_rank_transform.py
  function test_percent_rank_transform (line 9) | def test_percent_rank_transform():
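The test above exercises a percentile-rank transform. As a hedged sketch (class and method names are illustrative, not the repo's exact API), such a transform fits an empirical CDF on a sample of predictions and maps new values to percentiles:

```python
import numpy as np

# Illustrative percentile-rank transform: fit on a sample, then report the
# percentage of fitted values <= each query value. The real implementation
# in mhcflurry may interpolate or store a compressed CDF instead.
class PercentRankTransform:
    def fit(self, values):
        self.sorted_values = np.sort(np.asarray(values, dtype=float))
        return self

    def transform(self, values):
        # searchsorted with side="right" counts fitted values <= each query.
        counts = np.searchsorted(
            self.sorted_values, np.asarray(values, dtype=float), side="right")
        return 100.0 * counts / len(self.sorted_values)
```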

FILE: test/test_predict_command.py
  function setup_teardown (line 19) | def setup_teardown():
  function test_csv (line 35) | def test_csv():
  function test_no_csv (line 58) | def test_no_csv():

FILE: test/test_predict_scan_command.py
  function setup_module (line 16) | def setup_module():
  function read_output_csv (line 23) | def read_output_csv(filename):
  function test_fasta (line 29) | def test_fasta():
  function test_fasta_50nm (line 56) | def test_fasta_50nm():
  function test_fasta_percentile (line 81) | def test_fasta_percentile():
  function test_commandline_sequences (line 106) | def test_commandline_sequences():

FILE: test/test_pytorch_coverage.py
  function setup_teardown (line 21) | def setup_teardown():
  class TestGetPytorchLoss (line 30) | class TestGetPytorchLoss:
    method test_standard_mse (line 31) | def test_standard_mse(self):
    method test_standard_mae (line 40) | def test_standard_mae(self):
    method test_custom_loss_lookup (line 48) | def test_custom_loss_lookup(self):
    method test_custom_multi_output_lookup (line 54) | def test_custom_multi_output_lookup(self):
    method test_custom_mass_spec_lookup (line 60) | def test_custom_mass_spec_lookup(self):
    method test_unknown_standard_loss_raises (line 65) | def test_unknown_standard_loss_raises(self):
    method test_unknown_custom_loss_raises (line 70) | def test_unknown_custom_loss_raises(self):
  class TestStandardLossWeighted (line 76) | class TestStandardLossWeighted:
    method test_mse_with_sample_weights (line 77) | def test_mse_with_sample_weights(self):
    method test_mae_with_sample_weights (line 87) | def test_mae_with_sample_weights(self):
    method test_mse_column_vector_predictions_do_not_warn (line 96) | def test_mse_column_vector_predictions_do_not_warn(self):
  class TestMSEWithInequalitiesSampleWeights (line 108) | class TestMSEWithInequalitiesSampleWeights:
    method test_weighted_equality_loss (line 109) | def test_weighted_equality_loss(self):
    method test_encode_y_nan_raises (line 121) | def test_encode_y_nan_raises(self):
    method test_encode_y_length_mismatch_raises (line 126) | def test_encode_y_length_mismatch_raises(self):
  class TestMSEMultiOutputSampleWeights (line 132) | class TestMSEMultiOutputSampleWeights:
    method test_weighted_multi_output (line 133) | def test_weighted_multi_output(self):
    method test_encode_y_negative_output_indices_raises (line 144) | def test_encode_y_negative_output_indices_raises(self):
    method test_encode_y_output_indices_shape_mismatch_raises (line 150) | def test_encode_y_output_indices_shape_mismatch_raises(self):
  class TestMultiallelicMassSpecEdgeCases (line 157) | class TestMultiallelicMassSpecEdgeCases:
    method test_no_hits_returns_zero (line 158) | def test_no_hits_returns_zero(self):
    method test_no_decoys_returns_zero (line 167) | def test_no_decoys_returns_zero(self):
  class TestGetActivation (line 180) | class TestGetActivation:
    method test_tanh (line 181) | def test_tanh(self):
    method test_sigmoid (line 187) | def test_sigmoid(self):
    method test_relu (line 192) | def test_relu(self):
    method test_linear_returns_none (line 198) | def test_linear_returns_none(self):
    method test_unknown_raises (line 203) | def test_unknown_raises(self):
  class TestLocallyConnected1D (line 209) | class TestLocallyConnected1D:
    method test_output_shape (line 210) | def test_output_shape(self):
    method test_deterministic_forward (line 219) | def test_deterministic_forward(self):
  class TestEnsembleCentralityEdgeCases (line 236) | class TestEnsembleCentralityEdgeCases:
    method test_robust_mean_falls_back_to_nanmean_for_few_columns (line 237) | def test_robust_mean_falls_back_to_nanmean_for_few_columns(self):
    method test_robust_mean_two_columns (line 244) | def test_robust_mean_two_columns(self):
    method test_robust_mean_all_nan_many_columns (line 250) | def test_robust_mean_all_nan_many_columns(self):
    method test_nanmedian_mixed_nans (line 256) | def test_nanmedian_mixed_nans(self):
    method test_nanmean_single_value_per_row (line 268) | def test_nanmean_single_value_per_row(self):
    method test_centrality_measures_dict (line 276) | def test_centrality_measures_dict(self):
  class TestWeightInitialization (line 284) | class TestWeightInitialization:
    method _make_model (line 285) | def _make_model(self, init):
    method test_glorot_uniform (line 298) | def test_glorot_uniform(self):
    method test_glorot_normal (line 302) | def test_glorot_normal(self):
    method test_he_uniform (line 306) | def test_he_uniform(self):
    method test_he_normal (line 310) | def test_he_normal(self):
    method test_biases_are_zero (line 314) | def test_biases_are_zero(self):
  class TestMergedClass1NeuralNetwork (line 324) | class TestMergedClass1NeuralNetwork:
    method _make_merged (line 325) | def _make_merged(self, merge_method, n_networks=2):
    method test_average (line 343) | def test_average(self):
    method test_sum (line 352) | def test_sum(self):
    method test_concatenate (line 360) | def test_concatenate(self):
    method test_unknown_merge_method_raises (line 368) | def test_unknown_merge_method_raises(self):
    method test_get_set_weights_roundtrip (line 375) | def test_get_set_weights_roundtrip(self):
  class TestSkipConnectionsTopology (line 390) | class TestSkipConnectionsTopology:
    method test_forward_pass (line 391) | def test_forward_pass(self):
    method test_different_from_feedforward (line 408) | def test_different_from_feedforward(self):
  class TestCanonicalizeAlleleName (line 439) | class TestCanonicalizeAlleleName:
    method test_common_alleles_roundtrip (line 440) | def test_common_alleles_roundtrip(self):
    method test_aliases_false_avoids_remapping (line 451) | def test_aliases_false_avoids_remapping(self):
    method test_normalize_raises_on_invalid (line 458) | def test_normalize_raises_on_invalid(self):
    method test_normalize_returns_default_on_invalid (line 463) | def test_normalize_returns_default_on_invalid(self):
    method test_forbidden_substring_raises (line 469) | def test_forbidden_substring_raises(self):
    method test_forbidden_substring_returns_default (line 474) | def test_forbidden_substring_returns_default(self):
  class TestConfigurePyTorch (line 484) | class TestConfigurePyTorch:
    method test_reconfigure_backend (line 485) | def test_reconfigure_backend(self):
    method test_invalid_backend_raises (line 494) | def test_invalid_backend_raises(self):
    method test_default_backend_alias_maps_to_auto (line 499) | def test_default_backend_alias_maps_to_auto(self):
    method test_configure_tensorflow_cpu_backend_maps_to_cpu (line 508) | def test_configure_tensorflow_cpu_backend_maps_to_cpu(self):
    method test_configure_tensorflow_default_alias_maps_to_auto (line 519) | def test_configure_tensorflow_default_alias_maps_to_auto(self):
    method test_configure_tensorflow_gpu_backend_maps_to_gpu (line 530) | def test_configure_tensorflow_gpu_backend_maps_to_gpu(self):

FILE: test/test_pytorch_regressions.py
  function setup_teardown (line 32) | def setup_teardown():
  function _make_simple_affinity_model (line 38) | def _make_simple_affinity_model(**overrides):
  function _make_allele_representations (line 62) | def _make_allele_representations(num_alleles=2):
  function _seed_all (line 66) | def _seed_all(seed=1):
  function test_sample_weights_affect_training (line 72) | def test_sample_weights_affect_training():
  function test_validation_split_is_fixed_when_lr_zero (line 100) | def test_validation_split_is_fixed_when_lr_zero():
  function test_dropout_probability_is_keep_prob (line 122) | def test_dropout_probability_is_keep_prob():
  function test_batch_norm_uses_keras_defaults (line 134) | def test_batch_norm_uses_keras_defaults():
  function test_processing_dropout_is_spatial (line 151) | def test_processing_dropout_is_spatial():
  function test_processing_flank_averages_use_tf_masked_mean_semantics (line 178) | def test_processing_flank_averages_use_tf_masked_mean_semantics():
  function test_mse_with_inequalities_rejects_out_of_range_targets (line 207) | def test_mse_with_inequalities_rejects_out_of_range_targets():
  function test_mse_with_inequalities_rejects_invalid_inequality (line 214) | def test_mse_with_inequalities_rejects_invalid_inequality():
  function test_multiallelic_mass_spec_encode_y_validates_values (line 219) | def test_multiallelic_mass_spec_encode_y_validates_values():
  function test_merge_allele_specific_raises_not_implemented (line 224) | def test_merge_allele_specific_raises_not_implemented():
  function test_merged_network_serialization_preserves_dropout_keep_probability (line 242) | def test_merged_network_serialization_preserves_dropout_keep_probability():
  function test_dense_regularization_excludes_output_layer (line 279) | def test_dense_regularization_excludes_output_layer():
  function test_processing_validation_uses_last_fraction_and_sample_weights (line 311) | def test_processing_validation_uses_last_fraction_and_sample_weights():
  function test_optimizer_defaults_match_keras (line 380) | def test_optimizer_defaults_match_keras():
  function test_weight_and_embedding_updates_preserve_device (line 407) | def test_weight_and_embedding_updates_preserve_device():
  function test_cached_keras_weight_reload_preserves_device (line 459) | def test_cached_keras_weight_reload_preserves_device():
  function test_l1_regularization_changes_weights_even_with_zero_data_loss (line 485) | def test_l1_regularization_changes_weights_even_with_zero_data_loss():

FILE: test/test_random_negative_peptides.py
  function test_random_negative_peptides_by_allele_equalize_nonbinders (line 10) | def test_random_negative_peptides_by_allele_equalize_nonbinders():
  function test_random_negative_peptides_by_allele (line 63) | def test_random_negative_peptides_by_allele():

FILE: test/test_regression_target.py
  function test_regression_target_to_ic50 (line 9) | def test_regression_target_to_ic50():
  function test_ic50_to_regression_target (line 14) | def test_ic50_to_regression_target():
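These tests cover the IC50 ↔ regression-target conversion. MHCflurry-style predictors map affinities into [0, 1] via 1 − log(IC50)/log(max_ic50); a sketch under the assumption that max_ic50 defaults to 50000 nM (check mhcflurry.regression_target for the authoritative version):

```python
import math

# Assumed default ceiling; the repo's regression_target module defines the
# actual value and clipping behavior.
def from_ic50(ic50, max_ic50=50000.0):
    # Strong binders (low IC50) map near 1, weak binders near 0.
    x = 1.0 - math.log(ic50) / math.log(max_ic50)
    return min(1.0, max(0.0, x))  # clip into [0, 1]

def to_ic50(x, max_ic50=50000.0):
    # Inverse of from_ic50 on the unclipped range.
    return max_ic50 ** (1.0 - x)
```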

FILE: test/test_released_master_predictions.py
  function setup_module (line 18) | def setup_module():
  function teardown_module (line 22) | def teardown_module():
  function _load_expected (line 26) | def _load_expected(name):
  function test_allele_specific_affinity_predictions (line 32) | def test_allele_specific_affinity_predictions():
  function test_pan_allele_affinity_predictions (line 52) | def test_pan_allele_affinity_predictions():

FILE: test/test_released_predictors_on_hpv_dataset.py
  function data_path (line 19) | def data_path(name):
  function setup_module (line 30) | def setup_module():
  function teardown_module (line 41) | def teardown_module():
  function predictors (line 48) | def predictors():
  function test_on_hpv (line 52) | def test_on_hpv(predictors, df=DF):

FILE: test/test_released_predictors_well_correlated.py
  function setup (line 23) | def setup():
  function setup_teardown (line 34) | def setup_teardown():
  function predictors (line 53) | def predictors():
  function test_correlation (line 57) | def test_correlation(

FILE: test/test_released_presentation_highscore_rows.py
  function setup_module (line 34) | def setup_module():
  function teardown_module (line 38) | def teardown_module():
  function _load_expected (line 42) | def _load_expected():
  function _atol_for_output (line 48) | def _atol_for_output(column):
  function test_expected_data_has_high_and_low_contexts (line 56) | def test_expected_data_has_high_and_low_contexts():
  function test_presentation_predictions (line 72) | def test_presentation_predictions():

FILE: test/test_selected_peptides_csv.py
  function _normalize_allele (line 24) | def _normalize_allele(allele):
  function selected_peptides_predictions (line 36) | def selected_peptides_predictions():
  function test_selected_peptides_mhcflurry_matches_csv (line 64) | def test_selected_peptides_mhcflurry_matches_csv(selected_peptides_predi...
  function test_selected_peptides_netmhcpan_affinity_close (line 93) | def test_selected_peptides_netmhcpan_affinity_close(selected_peptides_pr...

FILE: test/test_speed.py
  function setup_module (line 28) | def setup_module():
  function teardown_module (line 38) | def teardown_module():
  function load_predictors (line 48) | def load_predictors():
  function predictors (line 57) | def predictors():
  function init (line 64) | def init():
  function test_speed_allele_specific (line 68) | def test_speed_allele_specific(predictors, profile=False, num=DEFAULT_NU...
  function test_speed_pan_allele (line 121) | def test_speed_pan_allele(predictors, profile=False, num=DEFAULT_NUM_PRE...

FILE: test/test_train_and_related_commands.py
  function setup_module (line 22) | def setup_module():
  function run_and_check (line 65) | def run_and_check(n_jobs=0):
  function run_and_check_with_model_selection (line 107) | def run_and_check_with_model_selection(n_jobs=1):
  function test_run_parallel (line 169) | def test_run_parallel():
  function test_run_serial (line 174) | def test_run_serial():

FILE: test/test_train_pan_allele_models_command.py
  function setup_module (line 24) | def setup_module():
  function run_and_check (line 138) | def run_and_check(n_jobs=0, delete=True, additional_args=[]):
  function test_run_parallel (line 197) | def test_run_parallel():
  function test_run_serial (line 202) | def test_run_serial():
  function test_run_cluster_parallelism (line 206) | def test_run_cluster_parallelism():

FILE: test/test_train_processing_models_command.py
  function setup_module (line 24) | def setup_module():
  function make_dataset (line 47) | def make_dataset(num=10000):
  function run_and_check (line 83) | def run_and_check(n_jobs=0, additional_args=[], delete=False):
  function Xtest_run_parallel (line 149) | def Xtest_run_parallel():
  function test_run_serial (line 153) | def test_run_serial():

FILE: test/test_training_variants.py
  function setup_teardown (line 21) | def setup_teardown():
  function _seed (line 27) | def _seed(s=42):
  function _make_model (line 33) | def _make_model(**overrides):
  function test_train_with_locally_connected (line 59) | def test_train_with_locally_connected():
  function test_train_with_dropout (line 81) | def test_train_with_dropout():
  function test_train_with_batch_normalization (line 100) | def test_train_with_batch_normalization():
  function test_train_lc_dropout_batchnorm (line 120) | def test_train_lc_dropout_batchnorm():
  function test_train_with_skip_connections (line 143) | def test_train_with_skip_connections():
  function test_train_with_peptide_dense_layers (line 162) | def test_train_with_peptide_dense_layers():
  function test_train_with_optimizer (line 182) | def test_train_with_optimizer(optimizer):
  function test_train_with_l2_regularization (line 202) | def test_train_with_l2_regularization():
  function test_train_with_l1_l2_regularization (line 221) | def test_train_with_l1_l2_regularization():
  function test_train_with_random_negatives (line 241) | def test_train_with_random_negatives():
  function test_train_with_early_stopping (line 261) | def test_train_with_early_stopping():
  function test_serialization_with_lc_dropout_batchnorm (line 285) | def test_serialization_with_lc_dropout_batchnorm():
  function test_train_mixed_lengths_with_lc (line 314) | def test_train_mixed_lengths_with_lc():
  function _random_aa (line 345) | def _random_aa(choices, rng):
  function _generate_a0201_binder (line 349) | def _generate_a0201_binder(rng, length=9):
  function _generate_non_binder (line 357) | def _generate_non_binder(rng, length=9):
  function test_learn_a0201_motif (line 367) | def test_learn_a0201_motif():
Condensed preview — 238 files, each showing path, character count, and a content snippet.
[
  {
    "path": ".dockerignore",
    "chars": 120,
    "preview": ".git\n.gitignore\nLICENSE\n*.zip\n*.swp\nexperiments\nmhc_ligand_full*\ntraining/class1_allele_specific/data/mhc_ligand_full.*\n"
  },
  {
    "path": ".github/workflows/build.yml",
    "chars": 633,
    "preview": "name: build\n\non:\n  workflow_dispatch: {}\n  workflow_call: {}\n\n\njobs:\n  build:\n    runs-on: ubuntu-latest\n\n    steps:\n   "
  },
  {
    "path": ".github/workflows/ci.yml",
    "chars": 2100,
    "preview": "name: CI\n\non:\n  push:\n    branches: [\"master\"]\n  pull_request:\n    branches: [\"master\"]\n\njobs:\n  build:\n    runs-on: ubu"
  },
  {
    "path": ".github/workflows/release.yml",
    "chars": 640,
    "preview": "# Based on https://docs.pypi.org/trusted-publishers/using-a-publisher/\n\nname: release\n\non:\n  release:\n    types: [publis"
  },
  {
    "path": ".github/workflows/release_testpypi.yml",
    "chars": 723,
    "preview": "# Based on https://docs.pypi.org/trusted-publishers/using-a-publisher/\n\nname: release_testpypi\n\non:\n  workflow_dispatch:"
  },
  {
    "path": ".gitignore",
    "chars": 903,
    "preview": "# Swap files\n*.swp\n\n# Byte-compiled / optimized / DLL files\n__pycache__/\n*.py[cod]\n\n# C extensions\n*.so\n\n# Distribution "
  },
  {
    "path": "AGENTS.md",
    "chars": 3994,
    "preview": "# AGENTS.md — mhcflurry\n\nGuide for coding agents working in this repo. Read this before touching code.\n\n---\n\n## Golden R"
  },
  {
    "path": "CONTRIBUTING.md",
    "chars": 2676,
    "preview": "# Contributing to MHCflurry\n\nWe would love your help in making MHCflurry a useful resource for the community. No contrib"
  },
  {
    "path": "Dockerfile",
    "chars": 1650,
    "preview": "FROM continuumio/miniconda3:latest\n\nLABEL maintainer=\"Tim O'Donnell timodonnell@gmail.com\"\n\nWORKDIR /root\n\n# Install sys"
  },
  {
    "path": "LICENSE",
    "chars": 11358,
    "preview": "                                 Apache License\n                           Version 2.0, January 2004\n                   "
  },
  {
    "path": "NOTES.md",
    "chars": 6321,
    "preview": "# Notes\n\n## 2026-02-10\n\n- Goal: match PyTorch branch behavior to TensorFlow master for class I presentation prediction.\n"
  },
  {
    "path": "README.md",
    "chars": 5910,
    "preview": "[![Build Status](https://github.com/openvax/mhcflurry/actions/workflows/ci.yml/badge.svg)](https://github.com/openvax/mh"
  },
  {
    "path": "TODO.md",
    "chars": 3162,
    "preview": "# TODO\n\n## DONE\n\n- [x] Run broader/full test suite before merge.\n  - `pytest -q` passed: 100 tests.\n\n- [x] Localize pari"
  },
  {
    "path": "code-of-conduct.md",
    "chars": 3368,
    "preview": "# Contributor Covenant Code of Conduct\n\n## Our Pledge\n\nIn the interest of fostering an open and welcoming environment, w"
  },
  {
    "path": "compatibility_check/figures/summary.csv",
    "chars": 1634,
    "preview": "column,mean_abs_diff,max_abs_diff,median_pct_diff,p99_pct_diff\naffinity_prediction,0.000855926090815266,0.00521581506109"
  },
  {
    "path": "develop.sh",
    "chars": 708,
    "preview": "#!/bin/bash\n# Development environment setup script\n# Source this script to activate the venv: source develop.sh\n\nSCRIPT_"
  },
  {
    "path": "docs/Makefile",
    "chars": 7840,
    "preview": "# Makefile for Sphinx documentation\n#\n\n# You can set these variables from the command line.\nSPHINXOPTS    =\nSPHINXBUILD "
  },
  {
    "path": "docs/README.md",
    "chars": 396,
    "preview": "# MHCflurry documentation\n\nTo generate Sphinx documentation, from this directory run:\n\n```\n$ pip install -r requirements"
  },
  {
    "path": "docs/api.rst",
    "chars": 113,
    "preview": ".. _api-documentation:\n\nAPI Documentation\n=================\n\n.. include:: _build/mhcflurry.rst\n    :start-line: 2"
  },
  {
    "path": "docs/commandline_tools.rst",
    "chars": 1647,
    "preview": "Command-line reference\n============================\n\nSee also the :ref:`tutorial <commandline_tutorial>`.\n\n.. _mhcflurry"
  },
  {
    "path": "docs/commandline_tutorial.rst",
    "chars": 10036,
    "preview": ".. _commandline_tutorial:\n\nCommand-line tutorial\n=====================\n\n.. _downloading:\n\nDownloading models\n-----------"
  },
  {
    "path": "docs/conf.py",
    "chars": 10929,
    "preview": "#!/usr/bin/env python3\n# -*- coding: utf-8 -*-\n#\n# MHCflurry documentation build configuration file, created by\n# sphinx"
  },
  {
    "path": "docs/doctest.sh",
    "chars": 156,
    "preview": "#!/bin/bash\n\nsphinx-build -b doctest -d _build/doctrees . _build/doctest\nRETVAL=$?\necho doctest returned $RETVAL\ncat _bu"
  },
  {
    "path": "docs/example.fasta",
    "chars": 62,
    "preview": ">protein1\nMSSSSTPVCPNGPGNCQV\n>protein2\nMVENKRLLEGMEMIFGQVIPGA\n"
  },
  {
    "path": "docs/index.rst",
    "chars": 202,
    "preview": "MHCflurry documentation\n=====================================\n\n.. toctree::\n   :maxdepth: 3\n\n   intro\n   commandline_tut"
  },
  {
    "path": "docs/intro.rst",
    "chars": 2498,
    "preview": "Introduction and setup\n=======================\n\nMHCflurry is an open source package for peptide/MHC I binding affinity p"
  },
  {
    "path": "docs/python_tutorial.rst",
    "chars": 9795,
    "preview": "Python library tutorial\n=======================\n\nThe MHCflurry Python API exposes additional options and features beyond"
  },
  {
    "path": "docs/requirements.txt",
    "chars": 126,
    "preview": "sphinx\nsphinxcontrib-programoutput\nsphinxcontrib-autoprogram\nsphinx-rtd-theme\nnumpydoc\npypandoc\npydot\ntabulate\nlogomaker"
  },
  {
    "path": "downloads-generation/README.md",
    "chars": 333,
    "preview": "# Downloads generation\n\nThis directory contains code and instructions needed to *generate* the datasets and trained mode"
  },
  {
    "path": "downloads-generation/allele_sequences/GENERATE.sh",
    "chars": 3506,
    "preview": "#!/bin/bash\n#\n# Create allele sequences (sometimes referred to as pseudosequences) by\n# performing a global alignment ac"
  },
  {
    "path": "downloads-generation/allele_sequences/class1_pseudosequences.csv",
    "chars": 603915,
    "preview": "allele,pseudosequence\nBoLA-100901,YYSMYREISENVYGSNLYLLYRDYTWEYLNYRWY\nBoLA-100902,YYSEYREISENVYESNLYLLYRDYTWEYLNYRWY\nBoLA"
  },
  {
    "path": "downloads-generation/allele_sequences/filter_sequences.py",
    "chars": 2937,
    "preview": "\"\"\"\nFilter and combine class I sequence fastas.\n\"\"\"\nfrom __future__ import print_function\n\nimport sys\nimport argparse\n\n\n"
  },
  {
    "path": "downloads-generation/allele_sequences/make_allele_sequences.py",
    "chars": 8863,
    "preview": "\"\"\"\nGenerate allele sequences for pan-class I models.\n\nAdditional dependency: biopython\n\"\"\"\nfrom __future__ import print"
  },
  {
    "path": "downloads-generation/allele_sequences/select_alleles_to_disambiguate.py",
    "chars": 903,
    "preview": "\"\"\"\nSelect alleles to disambiguate\n\n\"\"\"\nfrom __future__ import print_function\n\nimport sys\nimport argparse\n\nimport pandas"
  },
  {
    "path": "downloads-generation/analysis_predictor_info/GENERATE.sh",
    "chars": 5226,
    "preview": "#!/bin/bash\n#\n#\n# Usage: GENERATE.sh <local|cluster> <fresh|continue-incomplete>\n#\n# cluster mode uses an HPC cluster (M"
  },
  {
    "path": "downloads-generation/analysis_predictor_info/cluster_submit_script_header.mssm_hpc.lsf",
    "chars": 1329,
    "preview": "#!/bin/bash\n#BSUB -J MHCf-{work_item_num} # Job name\n#BSUB -P acc_nkcancer # allocation account or Unix group\n#BSUB -q g"
  },
  {
    "path": "downloads-generation/analysis_predictor_info/generate_artifacts.py",
    "chars": 12544,
    "preview": "\"\"\"\nGenerate images for MHC binding motifs.\n\nNote: a shared filesystem is assumed even when running on an HPC cluster.\nT"
  },
  {
    "path": "downloads-generation/analysis_predictor_info/generate_model_selection_with_decoys.py",
    "chars": 2844,
    "preview": "\"\"\"\nFrom affinity predictor model selection data, add decoys so that AUCs can be\ncalculated per-allele.\n\"\"\"\nimport sys\ni"
  },
  {
    "path": "downloads-generation/analysis_predictor_info/predict_on_model_selection_data.py",
    "chars": 5585,
    "preview": "\"\"\"\nEvaluate affinity predictor on its held-out model selection data, using only\nthe individual models that were not tra"
  },
  {
    "path": "downloads-generation/analysis_predictor_info/requirements.txt",
    "chars": 17,
    "preview": "logomaker\nseaborn"
  },
  {
    "path": "downloads-generation/data_curated/GENERATE.sh",
    "chars": 2705,
    "preview": "#!/bin/bash\n#\n# Create \"curated\" training data, which combines an IEDB download with additional\n# published data, remove"
  },
  {
    "path": "downloads-generation/data_curated/README.md",
    "chars": 483,
    "preview": "# Combined training data\n\nThis download contains the data used to train the production class1 MHCflurry models. This dat"
  },
  {
    "path": "downloads-generation/data_curated/curate.py",
    "chars": 11972,
    "preview": "\"\"\"\nFilter and combine various peptide/MHC datasets to derive a composite training set,\noptionally including eluted pept"
  },
  {
    "path": "downloads-generation/data_curated/curate_ms_by_pmid.py",
    "chars": 46974,
    "preview": "\"\"\"\nFilter and combine various peptide/MHC datasets to derive a composite training set,\noptionally including eluted pept"
  },
  {
    "path": "downloads-generation/data_curated/requirements.txt",
    "chars": 12,
    "preview": "xlrd>=1.1.0\n"
  },
  {
    "path": "downloads-generation/data_evaluation/GENERATE.sh",
    "chars": 15139,
    "preview": "#!/bin/bash\n#\n#\n# Usage: GENERATE.sh <local|cluster> <fresh|continue-incomplete>\n#\n# cluster mode uses an HPC cluster (M"
  },
  {
    "path": "downloads-generation/data_evaluation/cluster_submit_script_header.mssm_hpc.lsf",
    "chars": 678,
    "preview": "#!/bin/bash\n#BSUB -J MHCf # Job name\n#BSUB -P acc_nkcancer # allocation account or Unix group\n#BSUB -q premium # queue\n#"
  },
  {
    "path": "downloads-generation/data_evaluation/join_with_precomputed.py",
    "chars": 4560,
    "preview": "\"\"\"\nJoin benchmark with precomputed predictions.\n\"\"\"\nimport sys\nimport argparse\nimport os\nimport numpy\nimport collection"
  },
  {
    "path": "downloads-generation/data_evaluation/make_benchmark.py",
    "chars": 7210,
    "preview": "\"\"\"\nMake training data by selecting decoys, etc.\n\"\"\"\nimport sys\nimport argparse\nimport os\nimport numpy\nimport collection"
  },
  {
    "path": "downloads-generation/data_evaluation/split_by_sample.py",
    "chars": 1137,
    "preview": "\"\"\"\nSplit a big csv by a particular column (sample id)\n\"\"\"\nimport sys\nimport argparse\nimport re\n\nimport pandas\n\n\nparser "
  },
  {
    "path": "downloads-generation/data_iedb/GENERATE.sh",
    "chars": 1119,
    "preview": "#!/bin/bash\n#\n# Download latest MHC I ligand data from IEDB.\n#\nset -e\nset -x\n\nDOWNLOAD_NAME=data_iedb\nSCRATCH_DIR=${TMPD"
  },
  {
    "path": "downloads-generation/data_iedb/README.md",
    "chars": 173,
    "preview": "# IEDB Data\n\nThis download is a snapshot of the IEDB MHC ligand data, available at:\n\nhttp://www.iedb.org/doc/mhc_ligand_"
  },
  {
    "path": "downloads-generation/data_mass_spec_annotated/GENERATE.sh",
    "chars": 1175,
    "preview": "#!/bin/bash\n#\n#\nset -e\nset -x\n\nDOWNLOAD_NAME=data_mass_spec_annotated\nSCRATCH_DIR=${TMPDIR-/tmp}/mhcflurry-downloads-gen"
  },
  {
    "path": "downloads-generation/data_mass_spec_annotated/annotate.py",
    "chars": 3974,
    "preview": "\"\"\"\n\"\"\"\nimport sys\nimport argparse\nimport os\nimport time\nimport collections\nimport re\nfrom six.moves import StringIO\n\nim"
  },
  {
    "path": "downloads-generation/data_mass_spec_annotated/requirements.txt",
    "chars": 13,
    "preview": "shellinford\n\n"
  },
  {
    "path": "downloads-generation/data_predictions/GENERATE.WITH_HPC_CLUSTER.sh",
    "chars": 43,
    "preview": "bash GENERATE.sh cluster reuse-predictions\n"
  },
  {
    "path": "downloads-generation/data_predictions/GENERATE.sh",
    "chars": 6597,
    "preview": "#!/bin/bash\n#\n# This download includes predictions for NetMHCpan 4.0 and MixMHCpred over a\n# large number of peptides en"
  },
  {
    "path": "downloads-generation/data_predictions/cluster_submit_script_header.mssm_hpc.gpu.lsf",
    "chars": 1087,
    "preview": "#!/bin/bash\n#BSUB -J MHCf-{work_item_num} # Job name\n#BSUB -P acc_nkcancer # allocation account or Unix group\n#BSUB -q g"
  },
  {
    "path": "downloads-generation/data_predictions/cluster_submit_script_header.mssm_hpc.nogpu.lsf",
    "chars": 1057,
    "preview": "#!/bin/bash\n#BSUB -J MHCf-{work_item_num} # Job name\n#BSUB -P acc_nkcancer # allocation account or Unix group\n#BSUB -q e"
  },
  {
    "path": "downloads-generation/data_predictions/requirements.txt",
    "chars": 9,
    "preview": "mhctools\n"
  },
  {
    "path": "downloads-generation/data_predictions/run_predictors.py",
    "chars": 17362,
    "preview": "\"\"\"\n\"\"\"\nimport argparse\nimport os\nimport signal\nimport sys\nimport time\nimport traceback\nimport math\nimport collections\nf"
  },
  {
    "path": "downloads-generation/data_predictions/write_allele_list.py",
    "chars": 930,
    "preview": "\"\"\"\n\"\"\"\nimport sys\nimport argparse\nimport os\n\nimport pandas\nimport tqdm  # progress bar\ntqdm.monitor_interval = 0  # see"
  },
  {
    "path": "downloads-generation/data_predictions/write_proteome_peptides.py",
    "chars": 3754,
    "preview": "\"\"\"\n\"\"\"\nimport sys\nimport argparse\nimport os\nimport time\nimport collections\nimport re\nfrom six.moves import StringIO\n\nim"
  },
  {
    "path": "downloads-generation/data_published/GENERATE.sh",
    "chars": 6462,
    "preview": "#!/bin/bash\n#\n# Download published non-IEDB MHC I ligand data. Most data has made its way into\n# IEDB but not all. Here "
  },
  {
    "path": "downloads-generation/data_published/README.md",
    "chars": 668,
    "preview": "# Published datasets\n\nThese datasets are derived from publications and do not change.\n\nTo generate this download run:\n\n`"
  },
  {
    "path": "downloads-generation/data_references/GENERATE.sh",
    "chars": 2162,
    "preview": "#!/bin/bash\n#\n#\n#\nset -e\nset -x\n\nDOWNLOAD_NAME=data_references\nSCRATCH_DIR=${TMPDIR-/tmp}/mhcflurry-downloads-generation"
  },
  {
    "path": "downloads-generation/data_references/README.md",
    "chars": 181,
    "preview": "# data_mass_spec_annotated\n\nOn OS X, if you encounter problem installing shellinford, try this:\n\n```\nCXXFLAGS=\"-stdlib=l"
  },
  {
    "path": "downloads-generation/data_references/process.py",
    "chars": 3204,
    "preview": "\"\"\"\n\"\"\"\nimport sys\nimport argparse\nimport os\nimport gzip\n\nimport pandas\n\nimport gtfparse\nimport shellinford\nfrom Bio imp"
  },
  {
    "path": "downloads-generation/data_references/requirements.txt",
    "chars": 32,
    "preview": "shellinford\nbiopython\ngtfparse\n\n"
  },
  {
    "path": "downloads-generation/data_systemhcatlas/GENERATE.sh",
    "chars": 1195,
    "preview": "#!/bin/bash\n#\n# Download some published MHC I ligands identified by mass-spec\n#\n#\nset -e\nset -x\n\nDOWNLOAD_NAME=data_syst"
  },
  {
    "path": "downloads-generation/data_systemhcatlas/README.md",
    "chars": 286,
    "preview": "# SysteMHC database dump\n\nThis is a database export of the [SysteMHC Atlas](https://systemhcatlas.org/)\ndownloaded from "
  },
  {
    "path": "downloads-generation/models_class1/GENERATE.sh",
    "chars": 2212,
    "preview": "#!/bin/bash\n#\n# Model select standard MHCflurry Class I models.\n#\nset -e\nset -x\n\nDOWNLOAD_NAME=models_class1\nSCRATCH_DIR"
  },
  {
    "path": "downloads-generation/models_class1/README.md",
    "chars": 160,
    "preview": "# Class I allele-specific models (ensemble)\n\nThis download contains trained MHC Class I MHCflurry models.\n\nTo generate t"
  },
  {
    "path": "downloads-generation/models_class1/write_validation_data.py",
    "chars": 3216,
    "preview": "\"\"\"\nWrite and summarize model validation data, which is obtained by taking a full\ndataset and removing the data used for"
  },
  {
    "path": "downloads-generation/models_class1_kim_benchmark/GENERATE.sh",
    "chars": 2604,
    "preview": "#!/bin/bash\n#\nset -x\n\nDOWNLOAD_NAME=models_class1_kim_benchmark\nSCRATCH_DIR=${TMPDIR-/tmp}/mhcflurry-downloads-generatio"
  },
  {
    "path": "downloads-generation/models_class1_kim_benchmark/README.md",
    "chars": 606,
    "preview": "# Kim benchmark\n\nThis download trains MHCflurry predictors using the BD2009 dataset from \n[Dataset size and composition "
  },
  {
    "path": "downloads-generation/models_class1_kim_benchmark/class1_pseudosequences.csv",
    "chars": 137484,
    "preview": "allele,pseudosequence\nHLA-A*01:01,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY\nHLA-A*01:02,YSAMYQENMAHTDANTLYIIYRDYTWVARVYRGY\nHLA-"
  },
  {
    "path": "downloads-generation/models_class1_kim_benchmark/curate.py",
    "chars": 9601,
    "preview": "\"\"\"\nFilter and combine various peptide/MHC datasets to derive a composite training set,\noptionally including eluted pept"
  },
  {
    "path": "downloads-generation/models_class1_kim_benchmark/generate_hyperparameters.py",
    "chars": 3074,
    "preview": "\"\"\"\nGenerate grid of hyperparameters\n\"\"\"\n\nfrom sys import stdout\nfrom copy import deepcopy\nfrom yaml import dump\n\nbase_h"
  },
  {
    "path": "downloads-generation/models_class1_kim_benchmark/write_validation_data.py",
    "chars": 3216,
    "preview": "\"\"\"\nWrite and summarize model validation data, which is obtained by taking a full\ndataset and removing the data used for"
  },
  {
    "path": "downloads-generation/models_class1_minimal/GENERATE.sh",
    "chars": 2302,
    "preview": "#!/bin/bash\n#\n# Model select standard MHCflurry Class I models, limiting to 1 model per allele.\n#\nset -e\nset -x\n\nDOWNLOA"
  },
  {
    "path": "downloads-generation/models_class1_minimal/README.md",
    "chars": 547,
    "preview": "# Class I allele-specific models (minimal: ensemble size of 1)\n\nThis download contains \"minimal\" MHC Class I MHCflurry p"
  },
  {
    "path": "downloads-generation/models_class1_pan/GENERATE.WITH_HPC_CLUSTER.sh",
    "chars": 25,
    "preview": "bash GENERATE.sh cluster\n"
  },
  {
    "path": "downloads-generation/models_class1_pan/GENERATE.sh",
    "chars": 5338,
    "preview": "#!/bin/bash\n#\n# Train pan-allele MHCflurry Class I models. Supports re-starting a failed run.\n#\n# Usage: GENERATE.sh <lo"
  },
  {
    "path": "downloads-generation/models_class1_pan/README.md",
    "chars": 179,
    "preview": "# Class I pan-allele models (ensemble)\n\nThis download contains trained MHC Class I MHCflurry models before model selecti"
  },
  {
    "path": "downloads-generation/models_class1_pan/additional_alleles.txt",
    "chars": 121,
    "preview": "# Additional alleles besides those in the training data to include in percentile rank calibration\nHLA-C*02:10\nHLA-A*02:2"
  },
  {
    "path": "downloads-generation/models_class1_pan/cluster_submit_script_header.mssm_hpc.lsf",
    "chars": 1094,
    "preview": "#!/bin/bash\n#BSUB -J MHCf-{work_item_num} # Job name\n#BSUB -P acc_nkcancer # allocation account or Unix group\n#BSUB -q g"
  },
  {
    "path": "downloads-generation/models_class1_pan/generate_hyperparameters.py",
    "chars": 2709,
    "preview": "\"\"\"\nGenerate grid of hyperparameters\n\"\"\"\n\nfrom sys import stdout\nfrom copy import deepcopy\nfrom yaml import dump\n\nbase_h"
  },
  {
    "path": "downloads-generation/models_class1_pan/reassign_mass_spec_training_data.py",
    "chars": 1426,
    "preview": "\"\"\"\nReassign affinity values for mass spec data\n\"\"\"\nimport sys\nimport os\nimport argparse\n\nimport pandas\n\nparser = argpar"
  },
  {
    "path": "downloads-generation/models_class1_pan_variants/GENERATE.WITH_HPC_CLUSTER.sh",
    "chars": 25,
    "preview": "bash GENERATE.sh cluster\n"
  },
  {
    "path": "downloads-generation/models_class1_pan_variants/GENERATE.sh",
    "chars": 9537,
    "preview": "#!/bin/bash\n#\n# Uses an HPC cluster (Mount Sinai chimera cluster, which uses lsf job\n# scheduler). This would need to be"
  },
  {
    "path": "downloads-generation/models_class1_pan_variants/cluster_submit_script_header.mssm_hpc.gpu.lsf",
    "chars": 1087,
    "preview": "#!/bin/bash\n#BSUB -J MHCf-{work_item_num} # Job name\n#BSUB -P acc_nkcancer # allocation account or Unix group\n#BSUB -q g"
  },
  {
    "path": "downloads-generation/models_class1_pan_variants/exclude_data_from_training.py",
    "chars": 3703,
    "preview": "\"\"\"\nExtract allele/peptide pairs to exclude from training data.\n\"\"\"\nimport sys\nimport os\nimport argparse\n\nimport pandas\n"
  },
  {
    "path": "downloads-generation/models_class1_pan_variants/generate_hyperparameters.py",
    "chars": 1657,
    "preview": "\"\"\"\nGenerate grid of hyperparameters\n\"\"\"\n\nfrom sys import stdout, argv\nfrom copy import deepcopy\nfrom yaml import dump, "
  },
  {
    "path": "downloads-generation/models_class1_presentation/GENERATE.sh",
    "chars": 4325,
    "preview": "#!/bin/bash\n#\n#\n# Usage: GENERATE.sh <local|cluster> <fresh|continue-incomplete>\n#\n# cluster mode uses an HPC cluster (M"
  },
  {
    "path": "downloads-generation/models_class1_presentation/cluster_submit_script_header.mssm_hpc.lsf",
    "chars": 678,
    "preview": "#!/bin/bash\n#BSUB -J MHCf # Job name\n#BSUB -P acc_nkcancer # allocation account or Unix group\n#BSUB -q premium # queue\n#"
  },
  {
    "path": "downloads-generation/models_class1_presentation/make_train_data.py",
    "chars": 7214,
    "preview": "\"\"\"\nMake training data by selecting decoys, etc.\n\"\"\"\nimport sys\nimport argparse\nimport os\nimport numpy\nimport collection"
  },
  {
    "path": "downloads-generation/models_class1_processing/GENERATE.WITH_HPC_CLUSTER.sh",
    "chars": 25,
    "preview": "bash GENERATE.sh cluster\n"
  },
  {
    "path": "downloads-generation/models_class1_processing/GENERATE.sh",
    "chars": 6087,
    "preview": "#!/bin/bash\n#\n#\n# Usage: GENERATE.sh <local|cluster> <fresh|continue-incomplete>\n#\n# cluster mode uses an HPC cluster (M"
  },
  {
    "path": "downloads-generation/models_class1_processing/annotate_hits_with_expression.py",
    "chars": 1676,
    "preview": "\"\"\"\nAnnotate hits with expression (tpm), and roll up to just the highest-expressed\ngene for each peptide.\n\"\"\"\nimport sys"
  },
  {
    "path": "downloads-generation/models_class1_processing/cluster_submit_script_header.mssm_hpc.lsf",
    "chars": 1094,
    "preview": "#!/bin/bash\n#BSUB -J MHCf-{work_item_num} # Job name\n#BSUB -P acc_nkcancer # allocation account or Unix group\n#BSUB -q g"
  },
  {
    "path": "downloads-generation/models_class1_processing/generate_hyperparameters.base.py",
    "chars": 1947,
    "preview": "\"\"\"\nGenerate grid of hyperparameters\n\"\"\"\nfrom __future__ import print_function\nfrom sys import stdout, stderr\nfrom copy "
  },
  {
    "path": "downloads-generation/models_class1_processing/generate_hyperparameters.variants.py",
    "chars": 1349,
    "preview": "\"\"\"\nGenerate grid of hyperparameters\n\"\"\"\n\nfrom sys import stdout, argv\nfrom copy import deepcopy\nfrom yaml import dump, "
  },
  {
    "path": "downloads-generation/models_class1_processing/make_train_data.py",
    "chars": 9964,
    "preview": "\"\"\"\nMake training data by selecting decoys, etc.\n\"\"\"\nimport sys\nimport argparse\nimport os\nimport numpy\nimport time\nfrom "
  },
  {
    "path": "downloads-generation/models_class1_selected_no_mass_spec/GENERATE.sh",
    "chars": 2079,
    "preview": "#!/bin/bash\n#\n# Model select standard MHCflurry Class I models.\n#\nset -e\nset -x\n\nDOWNLOAD_NAME=models_class1_selected_no"
  },
  {
    "path": "downloads-generation/models_class1_trained_with_mass_spec/GENERATE.sh",
    "chars": 2344,
    "preview": "#!/bin/bash\n#\n# Model select MHCflurry Class I models that were trained on mass-spec. Model\n# selection uses both mass-s"
  },
  {
    "path": "downloads-generation/models_class1_unselected/GENERATE.sh",
    "chars": 1721,
    "preview": "#!/bin/bash\n#\n# Train standard MHCflurry Class I models.\n# Calls mhcflurry-class1-train-allele-specific-models on curate"
  },
  {
    "path": "downloads-generation/models_class1_unselected/README.md",
    "chars": 160,
    "preview": "# Class I allele-specific models (ensemble)\n\nThis download contains trained MHC Class I MHCflurry models.\n\nTo generate t"
  },
  {
    "path": "downloads-generation/models_class1_unselected/class1_pseudosequences.csv",
    "chars": 137484,
    "preview": "allele,pseudosequence\nHLA-A*01:01,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY\nHLA-A*01:02,YSAMYQENMAHTDANTLYIIYRDYTWVARVYRGY\nHLA-"
  },
  {
    "path": "downloads-generation/models_class1_unselected/generate_hyperparameters.py",
    "chars": 3074,
    "preview": "\"\"\"\nGenerate grid of hyperparameters\n\"\"\"\n\nfrom sys import stdout\nfrom copy import deepcopy\nfrom yaml import dump\n\nbase_h"
  },
  {
    "path": "downloads-generation/models_class1_unselected_with_mass_spec/GENERATE.sh",
    "chars": 1738,
    "preview": "#!/bin/bash\n#\n# Train standard MHCflurry Class I models.\n# Calls mhcflurry-class1-train-allele-specific-models on curate"
  },
  {
    "path": "downloads-generation/models_class1_unselected_with_mass_spec/README.md",
    "chars": 160,
    "preview": "# Class I allele-specific models (ensemble)\n\nThis download contains trained MHC Class I MHCflurry models.\n\nTo generate t"
  },
  {
    "path": "downloads-generation/models_class1_unselected_with_mass_spec/class1_pseudosequences.csv",
    "chars": 137484,
    "preview": "allele,pseudosequence\nHLA-A*01:01,YFAMYQENMAHTDANTLYIIYRDYTWVARVYRGY\nHLA-A*01:02,YSAMYQENMAHTDANTLYIIYRDYTWVARVYRGY\nHLA-"
  },
  {
    "path": "downloads-generation/models_class1_unselected_with_mass_spec/generate_hyperparameters.py",
    "chars": 3074,
    "preview": "\"\"\"\nGenerate grid of hyperparameters\n\"\"\"\n\nfrom sys import stdout\nfrom copy import deepcopy\nfrom yaml import dump\n\nbase_h"
  },
  {
    "path": "downloads-generation/random_peptide_predictions/GENERATE.sh",
    "chars": 1137,
    "preview": "#!/bin/bash\n#\n# Generate predictions for random peptides. Used for pre-training some models.\n#\nset -e\nset -x\n\nDOWNLOAD_N"
  },
  {
    "path": "downloads-generation/random_peptide_predictions/random_predictions.py",
    "chars": 1943,
    "preview": "\"\"\"\nGenerate predictions for random peptides.\n\"\"\"\nfrom __future__ import print_function\n\nimport sys\nimport argparse\nimpo"
  },
  {
    "path": "lint.sh",
    "chars": 210,
    "preview": "#!/bin/bash\nset -o errexit\n\n# Lint using ruff (fast Python linter)\n# Run from project root directory\n\necho \"Running ruff"
  },
  {
    "path": "mhcflurry/__init__.py",
    "chars": 610,
    "preview": "\"\"\"\nClass I MHC ligand prediction package\n\"\"\"\n\nfrom .class1_affinity_predictor import Class1AffinityPredictor\nfrom .clas"
  },
  {
    "path": "mhcflurry/allele_encoding.py",
    "chars": 5474,
    "preview": "import pandas\n\nfrom . import amino_acid\n\n\nclass AlleleEncoding(object):\n    def __init__(self, alleles=None, allele_to_s"
  },
  {
    "path": "mhcflurry/amino_acid.py",
    "chars": 5508,
    "preview": "\"\"\"\nFunctions for encoding fixed length sequences of amino acids into various\nvector representations, such as one-hot an"
  },
  {
    "path": "mhcflurry/calibrate_percentile_ranks_command.py",
    "chars": 14047,
    "preview": "\"\"\"\nCalibrate percentile ranks for models. Runs in-place.\n\"\"\"\nimport argparse\nimport os\nimport signal\nimport sys\nimport "
  },
  {
    "path": "mhcflurry/class1_affinity_predictor.py",
    "chars": 68310,
    "preview": "import collections\nimport hashlib\nimport json\nimport logging\nimport time\nimport warnings\nfrom os.path import join, exist"
  },
  {
    "path": "mhcflurry/class1_neural_network.py",
    "chars": 99419,
    "preview": "\"\"\"\nClass1NeuralNetwork - PyTorch implementation for MHC class I binding prediction.\n\"\"\"\n\nimport gc\nimport time\nimport c"
  },
  {
    "path": "mhcflurry/class1_presentation_predictor.py",
    "chars": 44215,
    "preview": "from os.path import join, exists\nfrom os import mkdir\nfrom socket import gethostname\nfrom getpass import getuser\n\nimport"
  },
  {
    "path": "mhcflurry/class1_processing_neural_network.py",
    "chars": 38409,
    "preview": "\"\"\"\nAntigen processing neural network implementation - PyTorch version\n\"\"\"\n\nimport time\nimport collections\nimport gc\nimp"
  },
  {
    "path": "mhcflurry/class1_processing_predictor.py",
    "chars": 13319,
    "preview": "from os.path import join, exists, abspath\nfrom os import mkdir\nfrom socket import gethostname\nfrom getpass import getuse"
  },
  {
    "path": "mhcflurry/cluster_parallelism.py",
    "chars": 14787,
    "preview": "\"\"\"\nSimple, relatively naive parallel map implementation for HPC clusters.\n\nUsed for training MHCflurry models.\n\"\"\"\nimpo"
  },
  {
    "path": "mhcflurry/cluster_worker_entry_point.py",
    "chars": 197,
    "preview": "\"\"\"\nModule entry point for cluster workers to ensure the current interpreter is used.\n\"\"\"\n\nfrom .cluster_parallelism imp"
  },
  {
    "path": "mhcflurry/common.py",
    "chars": 11103,
    "preview": "import collections\nimport logging\nimport sys\nimport os\nimport json\nimport warnings\n\nimport numpy\nimport pandas\nfrom mhcg"
  },
  {
    "path": "mhcflurry/custom_loss.py",
    "chars": 9056,
    "preview": "\"\"\"\nCustom loss functions.\n\nFor losses supporting inequalities, each training data point is associated with\none of (=), "
  },
  {
    "path": "mhcflurry/data_dependent_weights_initialization.py",
    "chars": 7926,
    "preview": "\"\"\"\nLayer-sequential unit-variance initialization for neural networks.\n\nSee:\n    Mishkin and Matas, \"All you need is a g"
  },
  {
    "path": "mhcflurry/downloads.py",
    "chars": 8408,
    "preview": "\"\"\"\nManage local downloaded data.\n\"\"\"\n\nimport logging\nimport yaml\nfrom os.path import join, exists\nfrom os import enviro"
  },
  {
    "path": "mhcflurry/downloads.yml",
    "chars": 42296,
    "preview": "# This file describes collections of data and trained model weights that are\n# released with MHCflurry. We refer to thes"
  },
  {
    "path": "mhcflurry/downloads_command.py",
    "chars": 10693,
    "preview": "'''\nDownload MHCflurry released datasets and trained models.\n\nExamples\n\nFetch the default downloads:\n    $ mhcflurry-dow"
  },
  {
    "path": "mhcflurry/encodable_sequences.py",
    "chars": 18813,
    "preview": "\"\"\"\nClass for encoding variable-length peptides to fixed-size numerical matrices\n\"\"\"\nimport math\n\nimport numpy\nimport pa"
  },
  {
    "path": "mhcflurry/ensemble_centrality.py",
    "chars": 2509,
    "preview": "\"\"\"\nMeasures of centrality (e.g. mean) used to combine predictions across an\nensemble. The input to these functions are "
  },
  {
    "path": "mhcflurry/fasta.py",
    "chars": 4122,
    "preview": "\"\"\"\nAdapted from pyensembl, github.com/openvax/pyensembl\nOriginal implementation by Alex Rubinsteyn.\n\nThe worse sin in b"
  },
  {
    "path": "mhcflurry/flanking_encoding.py",
    "chars": 6443,
    "preview": "\"\"\"\nClass for encoding variable-length flanking and peptides to\nfixed-size numerical matrices\n\"\"\"\nfrom collections impor"
  },
  {
    "path": "mhcflurry/hyperparameters.py",
    "chars": 3651,
    "preview": "\"\"\"\nHyperparameter (neural network options) management\n\"\"\"\nimport itertools\n\n\nclass HyperparameterDefaults(object):\n    "
  },
  {
    "path": "mhcflurry/local_parallelism.py",
    "chars": 12654,
    "preview": "\"\"\"\nInfrastructure for \"local\" parallelism, i.e. multiprocess parallelism on one\ncompute node.\n\"\"\"\n\nimport itertools\nimp"
  },
  {
    "path": "mhcflurry/percent_rank_transform.py",
    "chars": 2387,
    "preview": "\"\"\"\nClass for transforming arbitrary values into percent ranks given a distribution.\n\"\"\"\nimport numpy\nimport pandas\n\n\ncl"
  },
  {
    "path": "mhcflurry/predict_command.py",
    "chars": 10330,
    "preview": "'''\nRun MHCflurry predictor on specified peptides.\n\nBy default, the presentation predictor is used, and predictions for\n"
  },
  {
    "path": "mhcflurry/predict_scan_command.py",
    "chars": 11364,
    "preview": "'''\nScan protein sequences using the MHCflurry presentation predictor.\n\nBy default, sub-sequences (peptides) with affini"
  },
  {
    "path": "mhcflurry/pytorch_layers.py",
    "chars": 3606,
    "preview": "\"\"\"\nPyTorch custom layers for mhcflurry.\n\"\"\"\nimport torch\nimport torch.nn as nn\nimport torch.nn.functional as F\n\n\ndef ge"
  },
  {
    "path": "mhcflurry/pytorch_losses.py",
    "chars": 12270,
    "preview": "\"\"\"\nPyTorch loss functions for mhcflurry.\n\nSupports inequality constraints where training data includes (=), (<), and (>"
  },
  {
    "path": "mhcflurry/random_negative_peptides.py",
    "chars": 11367,
    "preview": "import logging\nimport math\n\nimport numpy\nimport pandas\n\nfrom .hyperparameters import HyperparameterDefaults\nfrom .common"
  },
  {
    "path": "mhcflurry/regression_target.py",
    "chars": 685,
    "preview": "import numpy\n\n\ndef from_ic50(ic50, max_ic50=50000.0):\n    \"\"\"\n    Convert ic50s to regression targets in the range [0.0,"
  },
  {
    "path": "mhcflurry/scoring.py",
    "chars": 1411,
    "preview": "\"\"\"\nMeasures of prediction accuracy\n\"\"\"\nimport logging\nimport sklearn.metrics\nimport numpy\nimport scipy\n\nfrom .regressio"
  },
  {
    "path": "mhcflurry/select_allele_specific_models_command.py",
    "chars": 28693,
    "preview": "\"\"\"\nModel select class1 single allele models.\n\"\"\"\nimport argparse\nimport os\nimport signal\nimport sys\nimport time\nimport "
  },
  {
    "path": "mhcflurry/select_pan_allele_models_command.py",
    "chars": 12391,
    "preview": "\"\"\"\nModel select class1 pan-allele models.\n\nAPPROACH: For each training fold, we select at least min and at most max mod"
  },
  {
    "path": "mhcflurry/select_processing_models_command.py",
    "chars": 9791,
    "preview": "\"\"\"\nModel select antigen processing models.\n\nAPPROACH: For each training fold, we select at least min and at most max mo"
  },
  {
    "path": "mhcflurry/testing_utils.py",
    "chars": 373,
    "preview": "\"\"\"\nUtilities used in MHCflurry unit tests.\n\"\"\"\nfrom . import Class1NeuralNetwork\nfrom .common import configure_pytorch\n"
  },
  {
    "path": "mhcflurry/train_allele_specific_models_command.py",
    "chars": 15402,
    "preview": "\"\"\"\nTrain Class1 single allele models.\n\"\"\"\nimport argparse\nimport os\nimport signal\nimport sys\nimport time\nimport traceba"
  },
  {
    "path": "mhcflurry/train_pan_allele_models_command.py",
    "chars": 24826,
    "preview": "\"\"\"\nTrain Class1 pan-allele models.\n\"\"\"\nimport argparse\nimport os\nfrom os.path import join\nimport signal\nimport sys\nimpo"
  },
  {
    "path": "mhcflurry/train_presentation_models_command.py",
    "chars": 5339,
    "preview": "\"\"\"\nTrain Class1 presentation models.\n\"\"\"\nimport argparse\nimport os\nimport signal\nimport sys\nimport time\nimport tracebac"
  },
  {
    "path": "mhcflurry/train_processing_models_command.py",
    "chars": 15855,
    "preview": "\"\"\"\nTrain Class1 processing models.\n\"\"\"\nimport argparse\nimport os\nfrom os.path import join\nimport signal\nimport sys\nimpo"
  },
  {
    "path": "mhcflurry/version.py",
    "chars": 22,
    "preview": "__version__ = \"2.2.1\"\n"
  },
  {
    "path": "notebooks/example1.ipynb",
    "chars": 10558,
    "preview": "{\n \"cells\": [\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 1,\n   \"metadata\": {\n    \"collapsed\": true\n   },\n   \"outp"
  },
  {
    "path": "notebooks/mhcflurry-colab.ipynb",
    "chars": 61096,
    "preview": "{\n  \"cells\": [\n    {\n      \"cell_type\": \"markdown\",\n      \"source\": [\n        \"# Setup\"\n      ],\n      \"metadata\": {\n   "
  },
  {
    "path": "pylintrc",
    "chars": 140,
    "preview": "[TYPECHECK]\n# Without ignoring this, we get errors like:\n# E:249,20: Module 'numpy' has no 'nan' member (no-member)\nigno"
  },
  {
    "path": "readthedocs.yml",
    "chars": 38,
    "preview": "conda:\n    file: docs/environment.yml\n"
  },
  {
    "path": "requirements.txt",
    "chars": 89,
    "preview": "numpy>=1.22.4\npandas>=2.0\ntorch>=2.0.0\nappdirs\nscikit-learn\nmhcgnomes>=3.0.1\npyyaml\ntqdm\n"
  },
  {
    "path": "scripts/compare_tf_pytorch_random_outputs.py",
    "chars": 27711,
    "preview": "\"\"\"\nLarge-scale TF vs PyTorch MHCflurry comparison on random peptide/allele examples.\n\nThis script is designed to run lo"
  },
  {
    "path": "scripts/cross_allele_parity_analysis.py",
    "chars": 29846,
    "preview": "\"\"\"\nCross-allele TF vs PyTorch parity analysis for MHCflurry.\n\nThis script:\n1. Selects a limited curated allele panel (d"
  },
  {
    "path": "scripts/extract_high_presentation_fixture.py",
    "chars": 6307,
    "preview": "\"\"\"\nExtract high-presentation TF rows for release regression fixtures.\n\nGiven a TF predictions table from `cross_allele_"
  },
  {
    "path": "scripts/generate_fixture_error_report.py",
    "chars": 34023,
    "preview": "\"\"\"\nGenerate an HTML parity/error report against cached master-release fixtures.\n\nThe report compares the current branch"
  },
  {
    "path": "scripts/modal_train_mhcflurry.py",
    "chars": 4314,
    "preview": "\"\"\"\nRun MHCflurry training jobs on Modal.\n\nThis script is intentionally generic: pass any supported training command\ntem"
  },
  {
    "path": "scripts/plot_fixture_diffs.py",
    "chars": 10393,
    "preview": "#!/usr/bin/env python\n\"\"\"\nCompare current PyTorch predictions against the TF fixture and generate\nper-output figures sho"
  },
  {
    "path": "scripts/validate_allele_sequences.py",
    "chars": 9477,
    "preview": "#!/usr/bin/env python\n\"\"\"\nValidate that allele name -> pseudosequence mappings are consistent.\n\nChecks:\n  1. The raw all"
  },
  {
    "path": "selected-peptides.csv",
    "chars": 15713,
    "preview": ",gene,rnav8_len,k,start_1based,end_1based,peptide,is_mut,is_ref,is_linker,is_leader,is_mitd,is_chimeric,is_CTA,mhcflurry"
  },
  {
    "path": "setup.py",
    "chars": 3775,
    "preview": "# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     htt"
  },
  {
    "path": "setup_local_env.sh",
    "chars": 472,
    "preview": "#!/bin/bash\nset -e\n\nVENV_DIR=\".venv\"\n\n# Create virtual environment if it doesn't exist\nif [ ! -d \"$VENV_DIR\" ]; then\n   "
  },
  {
    "path": "test/__init__.py",
    "chars": 1024,
    "preview": "'''\nUtility functions for tests.\n'''\n\nimport os\nimport time\n\n\ndef data_path(name):\n    '''\n    Return the absolute path "
  },
  {
    "path": "test/conftest.py",
    "chars": 760,
    "preview": "\"\"\"\nPytest configuration and session-wide initialization.\n\"\"\"\nfrom . import initialize\n\n# Ensure deterministic test setu"
  },
  {
    "path": "test/data/data_10mer.csv",
    "chars": 41,
    "preview": "peptide,mhc,meas\nAAAAAAAAAA,HLA-A0201,400"
  },
  {
    "path": "test/data/data_8mer.csv",
    "chars": 39,
    "preview": "peptide,mhc,meas\nAAAAAAAA,HLA-A0201,400"
  },
  {
    "path": "test/data/data_9mer.csv",
    "chars": 40,
    "preview": "peptide,mhc,meas\nAAAAAAAAA,HLA-A0201,400"
  },
  {
    "path": "test/data/example.fasta",
    "chars": 493,
    "preview": ">QHN73810.1 surface glycoprotein [Severe acute respiratory syndrome coronavirus 2] prefix\nMFVFLVLLPLVSSQCVNLTTRTQLPPAYTN"
  },
  {
    "path": "test/data/hpv_predictions.csv",
    "chars": 67196,
    "preview": "allele,peptide,Length,Affinity (uM),Binding Capacity,Status,Security,affinity,netmhcpan3,netmhcpan4,netmhc,MHCflurry 1.2"
  },
  {
    "path": "test/data/master_affinity_fixture_config.json",
    "chars": 6997,
    "preview": "{\"_network\": null, \"fit_info\": [], \"hyperparameters\": {\"activation\": \"tanh\", \"allele_amino_acid_encoding\": \"BLOSUM62\", \""
  },
  {
    "path": "test/data/master_affinity_fixture_predictions.json",
    "chars": 127,
    "preview": "{\"peptides\": [\"AAAAAAAAA\", \"CCCCCCCCC\", \"ACDEFGHIK\"], \"predictions\": [223.6097539934542, 223.6097539934542, 223.60975399"
  },
  {
    "path": "test/data/master_densenet_fixture_config.json",
    "chars": 12913,
    "preview": "{\"hyperparameters\": {\"peptide_dense_layer_sizes\": [3], \"layer_sizes\": [4, 3, 2], \"activation\": \"tanh\", \"output_activatio"
  },
  {
    "path": "test/data/master_densenet_fixture_predictions.json",
    "chars": 163,
    "preview": "{\"peptides\": [\"SYFPEITHI\", \"AAAAAAAAA\", \"CCCCCCCCC\", \"DDDDDDDDD\"], \"predictions\": [3600.1894226888244, 3600.189422688824"
  },
  {
    "path": "test/data/master_multi_output_fixture_config.json",
    "chars": 4059,
    "preview": "{\"hyperparameters\": {\"peptide_dense_layer_sizes\": [], \"layer_sizes\": [4], \"activation\": \"tanh\", \"output_activation\": \"si"
  },
  {
    "path": "test/data/master_multi_output_fixture_predictions.json",
    "chars": 159,
    "preview": "{\"peptides\": [\"SYFPEITHI\", \"AAAAAAAAA\", \"CCCCCCCCC\", \"DDDDDDDDD\"], \"predictions\": [39441.12426571068, 39441.12426571068,"
  },
  {
    "path": "test/data/master_pan_concat_fixture_config.json",
    "chars": 11159,
    "preview": "{\"hyperparameters\": {\"allele_amino_acid_encoding\": \"BLOSUM62\", \"allele_dense_layer_sizes\": [3], \"peptide_dense_layer_siz"
  },
  {
    "path": "test/data/master_pan_concat_fixture_predictions.json",
    "chars": 328,
    "preview": "{\"peptides\": [\"SYFPEITHI\", \"AAAAAAAAA\", \"CCCCCCCCC\", \"DDDDDDDDD\"], \"predictions\": [50000.0, 50000.0, 50000.0, 50000.0], "
  },
  {
    "path": "test/data/master_pan_multiply_fixture_config.json",
    "chars": 11139,
    "preview": "{\"hyperparameters\": {\"allele_amino_acid_encoding\": \"BLOSUM62\", \"allele_dense_layer_sizes\": [3], \"peptide_dense_layer_siz"
  },
  {
    "path": "test/data/master_pan_multiply_fixture_predictions.json",
    "chars": 328,
    "preview": "{\"peptides\": [\"SYFPEITHI\", \"AAAAAAAAA\", \"CCCCCCCCC\", \"DDDDDDDDD\"], \"predictions\": [50000.0, 50000.0, 50000.0, 50000.0], "
  },
  {
    "path": "test/data/master_released_class1_affinity_predictions.json",
    "chars": 715,
    "preview": "{\"release\": \"2.2.0\", \"allele_specific\": {\"alleles\": [\"HLA-A*01:01\", \"HLA-B*07:01\", \"HLA-C*03:03\", \"HLA-A*01:01\", \"HLA-B*"
  },
  {
    "path": "test/data/master_released_class1_presentation_highscore_rows_metadata.json",
    "chars": 827,
    "preview": "{\n  \"allele_count\": 35,\n  \"context_count\": 9,\n  \"high_score_threshold\": 0.9,\n  \"low_score_stats\": {\n    \"pres_with_prese"
  },
  {
    "path": "test/expensive_verify_pretrain_optimizable.py",
    "chars": 3513,
    "preview": "# Expensive test - not run by pytest.\n\nfrom mhcflurry import train_pan_allele_models_command\nfrom mhcflurry.downloads im"
  },
  {
    "path": "test/pytest_helpers.py",
    "chars": 2344,
    "preview": "\"\"\"\nTest helper functions providing assertion utilities.\n\"\"\"\n\n\nimport sys\n\n\n_MHCFLURRY_COMMANDS = {\n    \"mhcflurry-calib"
  },
  {
    "path": "test/test_allele_encoding.py",
    "chars": 1104,
    "preview": "\nimport time\n\nfrom mhcflurry.allele_encoding import AlleleEncoding\nfrom mhcflurry.amino_acid import BLOSUM62_MATRIX\nfrom"
  },
  {
    "path": "test/test_amino_acid.py",
    "chars": 1359,
    "preview": "\"\"\"Tests for amino acid encoding.\"\"\"\n\nfrom mhcflurry import amino_acid\nfrom numpy.testing import assert_equal\nimport pan"
  },
  {
    "path": "test/test_api_compat_shims.py",
    "chars": 1786,
    "preview": "import inspect\nimport pytest\n\nfrom mhcflurry.class1_neural_network import Class1NeuralNetwork\nfrom mhcflurry.common impo"
  },
  {
    "path": "test/test_calibrate_percentile_ranks_command.py",
    "chars": 2733,
    "preview": "\"\"\"\nTests for calibrate percentile ranks command\n\"\"\"\n\nimport os\nimport shutil\nimport tempfile\nimport subprocess\nimport p"
  },
  {
    "path": "test/test_changing_allele_representations.py",
    "chars": 3047,
    "preview": "\nimport pandas\nimport pytest\n\nfrom mhcflurry.class1_affinity_predictor import Class1AffinityPredictor\nfrom mhcflurry.dow"
  },
  {
    "path": "test/test_class1_affinity_predictor.py",
    "chars": 8845,
    "preview": "\"\"\"Tests for Class1AffinityPredictor.\"\"\"\nimport pytest\n\nimport tempfile\nimport shutil\nimport logging\nimport warnings\nimp"
  },
  {
    "path": "test/test_class1_neural_network.py",
    "chars": 10462,
    "preview": "\"\"\"\nTests for Class1NeuralNetwork.\n\"\"\"\nimport pytest\n\nimport numpy\nfrom numpy import testing\n\n\nimport pandas\n\nfrom mhcfl"
  }
]

// ... and 38 more files (download for full content)

About this extraction

This page contains the full source code of the openvax/mhcflurry GitHub repository, extracted and formatted as plain text for AI agents and large language models (LLMs). The extraction includes 238 files (2.4 MB), approximately 649.4k tokens, and a symbol index with 794 extracted functions, classes, methods, constants, and types. Use this with OpenClaw, Claude, ChatGPT, Cursor, Windsurf, or any other AI tool that accepts text input. You can copy the full output to your clipboard or download it as a .txt file.
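Each entry in the manifest above is a JSON object with a `path`, a `chars` count, and a truncated `preview`. As a minimal sketch of consuming the manifest programmatically, the snippet below parses a small inline excerpt (values copied from the listing above; the full extraction covers all 238 files) and computes simple aggregates. The excerpt literal is illustrative, not the complete manifest.

```python
import json

# Small inline excerpt of the file manifest; "chars" values are taken
# from the listing above. The real manifest has 238 entries.
manifest_excerpt = """
[
  {"path": "mhcflurry/version.py", "chars": 22},
  {"path": "mhcflurry/scoring.py", "chars": 1411},
  {"path": "requirements.txt", "chars": 89}
]
"""

entries = json.loads(manifest_excerpt)

# Total character count across the listed files.
total_chars = sum(e["chars"] for e in entries)
print(total_chars)  # 1522

# Paths that live under the mhcflurry/ package directory.
package_files = [e["path"] for e in entries if e["path"].startswith("mhcflurry/")]
print(package_files)
```

The same pattern (load, filter on `path`, aggregate on `chars`) scales to the full manifest, e.g. to budget which files fit in a model's context window.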

Extracted by GitExtract — free GitHub repo to text converter for AI. Built by Nikandr Surkov.