Repository: tensorflow/moonlight
Branch: master
Commit: d80279a3bf5d
Files: 162
Total size: 852.8 KB
Directory structure:
gitextract_080bjy0d/
├── AUTHORS
├── CONTRIBUTING.md
├── CONTRIBUTORS
├── LICENSE
├── README.md
├── WORKSPACE
├── docs/
│ ├── concepts.md
│ └── engine.md
├── moonlight/
│ ├── BUILD
│ ├── conversions/
│ │ ├── BUILD
│ │ ├── __init__.py
│ │ ├── musicxml.py
│ │ ├── musicxml_test.py
│ │ └── notesequence.py
│ ├── data/
│ │ ├── README.md
│ │ └── glyphs_nn_model_20180808/
│ │ ├── BUILD
│ │ ├── saved_model.pbtxt
│ │ └── variables/
│ │ ├── variables.data-00000-of-00001
│ │ └── variables.index
│ ├── engine.py
│ ├── evaluation/
│ │ ├── BUILD
│ │ ├── evaluator.py
│ │ ├── evaluator_endtoend_test.py
│ │ ├── musicxml.py
│ │ └── musicxml_test.py
│ ├── glyphs/
│ │ ├── BUILD
│ │ ├── base.py
│ │ ├── convolutional.py
│ │ ├── convolutional_test.py
│ │ ├── corpus.py
│ │ ├── geometry.py
│ │ ├── glyph_types.py
│ │ ├── knn.py
│ │ ├── knn_model.py
│ │ ├── knn_test.py
│ │ ├── neural.py
│ │ ├── neural_test.py
│ │ ├── note_dots.py
│ │ ├── repeated.py
│ │ ├── saved_classifier.py
│ │ ├── saved_classifier_fn.py
│ │ ├── saved_classifier_test.py
│ │ └── testing.py
│ ├── image.py
│ ├── models/
│ │ ├── base/
│ │ │ ├── BUILD
│ │ │ ├── batches.py
│ │ │ ├── batches_test.py
│ │ │ ├── glyph_patches.py
│ │ │ ├── glyph_patches_test.py
│ │ │ ├── hyperparameters.py
│ │ │ ├── hyperparameters_test.py
│ │ │ ├── label_weights.py
│ │ │ └── label_weights_test.py
│ │ └── glyphs_dnn/
│ │ ├── BUILD
│ │ ├── model.py
│ │ └── train.py
│ ├── music/
│ │ ├── BUILD
│ │ └── constants.py
│ ├── omr.py
│ ├── omr_endtoend_test.py
│ ├── omr_regression_test.py
│ ├── page_processors.py
│ ├── pipeline/
│ │ ├── BUILD
│ │ └── pipeline_flags.py
│ ├── protobuf/
│ │ ├── BUILD
│ │ ├── groundtruth.proto
│ │ └── musicscore.proto
│ ├── score/
│ │ ├── BUILD
│ │ ├── elements/
│ │ │ ├── BUILD
│ │ │ ├── clef.py
│ │ │ ├── clef_test.py
│ │ │ ├── key_signature.py
│ │ │ └── key_signature_test.py
│ │ ├── measures.py
│ │ ├── reader.py
│ │ ├── reader_test.py
│ │ └── state/
│ │ ├── BUILD
│ │ ├── __init__.py
│ │ ├── measure.py
│ │ └── staff.py
│ ├── score_processors.py
│ ├── scripts/
│ │ └── imslp_pdfs_to_pngs.sh
│ ├── staves/
│ │ ├── BUILD
│ │ ├── __init__.py
│ │ ├── base.py
│ │ ├── detectors_test.py
│ │ ├── filter.py
│ │ ├── hough.py
│ │ ├── projection.py
│ │ ├── removal.py
│ │ ├── removal_test.py
│ │ ├── staff_processor.py
│ │ ├── staff_processor_test.py
│ │ ├── staffline_distance.py
│ │ ├── staffline_distance_test.py
│ │ ├── staffline_extractor.py
│ │ ├── staffline_extractor_test.py
│ │ └── testing.py
│ ├── structure/
│ │ ├── BUILD
│ │ ├── __init__.py
│ │ ├── barlines.py
│ │ ├── barlines_test.py
│ │ ├── beam_processor.py
│ │ ├── beams.py
│ │ ├── components.py
│ │ ├── components_test.py
│ │ ├── section_barlines.py
│ │ ├── stems.py
│ │ ├── stems_test.py
│ │ ├── structure_test.py
│ │ ├── verticals.py
│ │ └── verticals_test.py
│ ├── testdata/
│ │ ├── BUILD
│ │ ├── IMSLP00747-000.LICENSE.md
│ │ ├── IMSLP00747.golden.LICENSE.md
│ │ ├── IMSLP00747.golden.xml
│ │ ├── README.md
│ │ ├── TWO_MEASURE_SAMPLE.LICENSE.md
│ │ └── TWO_MEASURE_SAMPLE.xml
│ ├── tools/
│ │ ├── BUILD
│ │ ├── export_kmeans_centroids.py
│ │ ├── export_kmeans_centroids_test.py
│ │ └── gen_structure_test_case.py
│ ├── training/
│ │ ├── clustering/
│ │ │ ├── BUILD
│ │ │ ├── kmeans_labeler.py
│ │ │ ├── kmeans_labeler_request_handler.py
│ │ │ ├── kmeans_labeler_request_handler_test.py
│ │ │ ├── kmeans_labeler_template.html
│ │ │ ├── staffline_patches_dofn.py
│ │ │ ├── staffline_patches_dofn_test.py
│ │ │ ├── staffline_patches_kmeans_pipeline.py
│ │ │ └── staffline_patches_kmeans_pipeline_test.py
│ │ └── generation/
│ │ ├── BUILD
│ │ ├── generation.py
│ │ ├── generation_test.py
│ │ ├── image_noise.py
│ │ ├── vexflow_generator.js
│ │ └── vexflow_generator_pipeline.py
│ ├── util/
│ │ ├── BUILD
│ │ ├── functional_ops.py
│ │ ├── functional_ops_test.py
│ │ ├── memoize.py
│ │ ├── more_iter_tools.py
│ │ ├── more_iter_tools_test.py
│ │ ├── patches.py
│ │ ├── patches_test.py
│ │ ├── run_length.py
│ │ ├── run_length_test.py
│ │ ├── segments.py
│ │ └── segments_test.py
│ └── vision/
│ ├── BUILD
│ ├── hough.py
│ ├── hough_test.py
│ ├── images.py
│ ├── images_test.py
│ ├── morphology.py
│ └── morphology_test.py
├── requirements.txt
├── sandbox/
│ └── README.md
├── six.BUILD
└── tools/
├── bazel_0.20.0-linux-x86_64.deb.sha256
└── travis_tests.sh
================================================
FILE CONTENTS
================================================
================================================
FILE: AUTHORS
================================================
# This is the list of Moonlight OMR authors for copyright purposes.
#
# Copyright for contributions made under Google's Corporate CLA belong to the
# contributor's organization. Contributions made under the Individual CLA belong
# to the author. Individual contributors are recognized separately in the
# "CONTRIBUTORS" file.
Google Inc.
Nuno Jesus
================================================
FILE: CONTRIBUTING.md
================================================
# How to Contribute
We'd love to accept your patches and contributions to this project. There are
just a few small guidelines you need to follow.
## Contributor License Agreement
Contributions to this project must be accompanied by a Contributor License
Agreement. You (or your employer) retain the copyright to your contribution;
this simply gives us permission to use and redistribute your contributions as
part of the project. Head over to <https://cla.developers.google.com/> to see
your current agreements on file or to sign a new one.
You generally only need to submit a CLA once, so if you've already submitted one
(even if it was for a different project), you probably don't need to do it
again.
## Code reviews
All submissions, including submissions by project members, require review. We
use GitHub pull requests for this purpose. Consult
[GitHub Help](https://help.github.com/articles/about-pull-requests/) for more
information on using pull requests.
## Community Guidelines
This project follows [Google's Open Source Community
Guidelines](https://opensource.google.com/conduct/).
================================================
FILE: CONTRIBUTORS
================================================
# This is the list of individual contributors to the Moonlight OMR project.
#
# Some contributions belong to the organization of the contributor. Copyright is
# tracked separately, in the "AUTHORS" file.
#
# We will make an effort to recognize all contributors here. However, the source
# of truth for contributors is the commit author in source control history.
Dan Ringwalt
Larry Li
Nuno Jesus
================================================
FILE: LICENSE
================================================
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "[]"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright [yyyy] [name of copyright owner]
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
================================================
FILE: README.md
================================================
<img align="center" width="400" src="https://user-images.githubusercontent.com/34600369/40580500-74088e4a-6137-11e8-9705-ecac1499b1ce.png">
# Moonlight Optical Music Recognition (OMR) [![Build Status](https://travis-ci.org/tensorflow/moonlight.svg?branch=master)](https://travis-ci.org/tensorflow/moonlight)
An experimental [optical music
recognition](https://en.wikipedia.org/wiki/Optical_music_recognition) engine.
Moonlight reads PNG image(s) containing sheet music and outputs
[MusicXML](https://www.musicxml.com/) or a
[NoteSequence message](https://github.com/tensorflow/magenta/blob/master/magenta/protobuf/music.proto).
MusicXML is a standard sheet music interchange format, and `NoteSequence` is
used by [Magenta](http://magenta.tensorflow.org) for training generative music
models.
Moonlight is not an officially supported Google product.
### Command-Line Usage

```sh
git clone https://github.com/tensorflow/moonlight
cd moonlight

# You may want to run this inside a virtualenv.
pip install -r requirements.txt

# Build the OMR command-line tool.
bazel build moonlight:omr

# Prints a Score message.
bazel-bin/moonlight/omr moonlight/testdata/IMSLP00747-000.png

# Scans several pages and prints a NoteSequence message.
bazel-bin/moonlight/omr --output_type=NoteSequence IMSLP00001-*.png

# Writes MusicXML to ~/mozart.xml.
bazel-bin/moonlight/omr --output_type=MusicXML --output=$HOME/mozart.xml \
    corpus/56/IMSLP56442-*.png
```
The `omr` CLI will print a [`Score`](moonlight/protobuf/musicscore.proto)
message by default, or [MusicXML](https://www.musicxml.com/) or a
`NoteSequence` message if specified.
Moonlight is intended to be run in bulk, and will not offer a full UI for
correcting the score. The main entry point will be an Apache Beam pipeline that
processes an entire corpus of images.
There is no release yet, and Moonlight is not ready for end users. To run
interactively or import the module, you can use the [sandbox
directory](sandbox/README.md). Moonlight will be used offline for digitizing
a scanned corpus (it is expected to run on Linux workstations or cloud compute
platforms, so broad OS compatibility is not a priority).
### Dependencies
* Linux
  - Note: our pinned Google dependency versions are fragile; updating them, or
    building on another OS, may break the build in subtle ways.
* [Protobuf 3.6.1](https://pypi.org/project/protobuf/3.6.1/)
* [Bazel 0.20.0](https://github.com/bazelbuild/bazel/releases/tag/0.20.0). We
encountered some errors using Bazel 0.21.0 to build Protobuf 3.6.1, which is
the latest Protobuf release at the time of writing.
* Python version supported by TensorFlow (Python 3.5-3.7)
* Python dependencies specified in the [requirements](requirements.txt).
### Resources
[Forum](https://groups.google.com/forum/#!forum/moonlight-omr)
================================================
FILE: WORKSPACE
================================================
http_archive(
name = "com_google_protobuf",
sha256 = "40f009cb0c190816a52fc21d45c26558ee7d63c3bd511b326bd85739b2fd99a6",
strip_prefix = "protobuf-3.6.1",
url = "https://github.com/google/protobuf/releases/download/v3.6.1/protobuf-python-3.6.1.tar.gz",
)
new_http_archive(
name = "six_archive",
build_file = "six.BUILD",
sha256 = "105f8d68616f8248e24bf0e9372ef04d3cc10104f1980f54d57b2ce73a5ad56a",
strip_prefix = "six-1.10.0",
url = "https://pypi.python.org/packages/source/s/six/six-1.10.0.tar.gz#md5=34eed507548117b2ab523ab14b2f8b55",
)
bind(
name = "six",
actual = "@six_archive//:six",
)
new_http_archive(
name = "magenta",
strip_prefix = "magenta-48a199085e303eeae7c36068f050696209b856bb/magenta",
url = "https://github.com/tensorflow/magenta/archive/48a199085e303eeae7c36068f050696209b856bb.tar.gz",
sha256 = "931fb7b57714d667db618b0c31db1444e44baab17865ad66b76fd24d9e20ad6d",
build_file_content = "",
)
new_http_archive(
name = "omr_regression_test_data",
strip_prefix = "omr_regression_test_data_20180516",
url = "https://github.com/tensorflow/moonlight/releases/download/v2018.05.16-data/omr_regression_test_data_20180516.tar.gz",
sha256 = "b47577ee6b359c2cbbcdb8064c6bd463692c728c55ec5c0ab78139165ba8f35a",
build_file_content = """
package(default_visibility = ["//visibility:public"])
filegroup(
name = "omr_regression_test_data",
srcs = glob(["**/*.png"]),
)
""",
)
================================================
FILE: docs/concepts.md
================================================
# Moonlight OMR Concepts
## Diagram
<img src="concepts_diagram.png" />
## Glossary
| Term | Definition |
| ------------ | ------------------------------------------------------------- |
| barline | A vertical line spanning one or more staves, which is not a stem. |
| beam | A thick line connecting multiple stems horizontally. Each beam halves the duration of a filled notehead connected to the stem, which would otherwise be a quarter note. |
| notehead | An ellipse representing a single note. May be filled (quarter or lesser value) or empty (half or whole note). |
| ledger line | An extra line above or below the 5 staff lines. |
| staff | The object which notes and other glyphs are placed on. |
| staff system | One or more staves joined by at least one barline. |
| staves | Plural of staff. |
| stem | A vertical line attached to a notehead. All noteheads except for whole notes should have a stem. |
## Staves
Staves have 5 parallel, horizontal lines, and are parameterized by the center
(3rd) line, and the staffline distance (vertical distance between consecutive
staff lines). The staffline distance is constant for reasonable quality printed
music scores, and this representation avoids redundancy and makes it possible to
find the coordinates of each staff line.
## Staff Positions
Glyphs are vertically centered either on staff lines (or ledger lines), or on
the space halfway between lines. We refer to each of these y coordinates as a
*position*.
**Note**: We still refer to positions as *stafflines* in many places which is
counter-intuitive, since positions are also on the space between lines, and can
be a potential ledger line which is empty in a particular image. We are in the
process of renaming stafflines in this context to staff positions.
* Staff position 0 is the third staff line (staff center line), which is also
  the y-coordinate output by staff detection.
* Staff positions are half the staffline distance apart. Always calculate the
relative y position by <code>tf.floordiv(staffline_distance * y_position,
2)</code> instead of dividing by 2 first.
* In treble clef, staff position 0 is B4, -6 is C4, -1 is A4, and +1 is C5.
* In bass clef, staff position 0 is D3, +3 is G3, and +6 is C4.
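The position arithmetic above can be sketched in plain Python (a hedged illustration, not moonlight code; `//` stands in for `tf.floordiv`, and the treble-clef mapping simply follows the bullet points):

```python
NOTE_NAMES = ['C', 'D', 'E', 'F', 'G', 'A', 'B']


def position_to_y_offset(staffline_distance, position):
    """Pixel offset from the staff center line to a staff position.

    Multiply before floor-dividing, as advised above: with an odd
    staffline_distance, (d * p) // 2 differs from (d // 2) * p.
    """
    return (staffline_distance * position) // 2


def treble_position_to_pitch(position):
    """Map a treble-clef staff position to a note name; position 0 is B4."""
    index = 34 + position  # B4 is 34 diatonic steps above C0.
    return NOTE_NAMES[index % 7] + str(index // 7)
```

For example, with `staffline_distance = 7` and `position = 3`, multiplying first gives `21 // 2 == 10`, while dividing first would give `(7 // 2) * 3 == 9`.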
## Glyphs
Glyphs are defined in
[<code>musicscore.proto</code>](../moonlight/protobuf/musicscore.proto). Each
glyph has an x coordinate on the original image, and a y position (staff
position). The staff position determines the pitch of the glyph, if applicable.
If the glyph is especially large (e.g. clefs) or is not centered on a particular
vertical position, we choose a *canonical* staff position (e.g. the G line for
treble clef, aka G clef).
### Glyph centers
* Every glyph needs a canonical center coordinate. The classifier will detect
the glyph if the window (e.g. convolutional filter) is centered on this
point.
* Noteheads are centered on the middle of the ellipse, which should be
  exactly on a staff line or halfway between staff lines.
* Accidentals should be centered on their center of mass. For flats, sharps,
and naturals, this is the center of the empty space inside them. For double
sharps, this is the middle of the crosshairs.
* Treble clef ("G clef") is centered on the intersection of the thin vertical
  line with the G staffline (staff position -2).
* Bass clef ("F clef") is centered on the F staffline (staff position +2). The
x coordinate is halfway between the filled circle on the left and the
vertical segment on the right.
* All rests should normally be centered at staff position 0, unless shifted
from their usual position.
================================================
FILE: docs/engine.md
================================================
## OMR Engine Detailed Reference
The OMR engine converts a PNG image to a Magenta
[NoteSequence message](https://github.com/magenta/note-seq/blob/master/note_seq/protobuf/music.proto),
which is interoperable with MIDI and MusicXML.
OMR uses TensorFlow for glyph (symbol) classification, as well as all other
compute-intensive steps. Final processing is done in pure Python. The entry
point is [OMREngine.run](../moonlight/engine.py). Glyph classification is
configurable via the `glyph_classifier_fn` argument, and other subtasks of image
recognition are part of the [Structure](../moonlight/structure/__init__.py).
### Diagram
<img src="engine_diagram.svg">
### API
The entry point for running OMR is [`OMREngine.run()`](../moonlight/engine.py).
It takes in a list of PNG filenames and outputs a `Score` or `NoteSequence`
message. The `Score` can further be converted to
[MusicXML](../moonlight/conversions/musicxml.py).
### TensorFlow Graph
For maximum parallelism, all processing is run in the same TensorFlow graph. The
graph is run by `OMREngine._get_page()`. This also evaluates the `Structure` in
the same graph. `Structure` is a wrapper for detectors which extract information
from the image, including [staves](../moonlight/staves/hough.py), [vertical
lines](../moonlight/structure/verticals.py), and [note
beams](../moonlight/structure/beams.py).
#### Structure
[Structure](../moonlight/structure/__init__.py) holds the structural elements
that need to be evaluated for OMR, but does not do any symbol recognition. The
structure encompasses staves, beams, and vertical lines, and may contain more
elements in the future (e.g. full connected component analysis) which can be
used to detect more elements (e.g. note dots). Structure detection is currently
simple computer vision rather than ML, but it can easily be swapped out with a
different TensorFlow model.
##### Staffline Distance Estimation
We estimate the [staffline
distance(s)](../moonlight/staves/staffline_distance.py) of the entire image.
There may be staves with multiple different sizes for different parts on a
single page, but there should be just a few possible staffline distance values.
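A standard heuristic for this estimate, sketched here for illustration (moonlight's actual implementation in `staffline_distance.py` differs in detail), is to take the most common length of a vertical black run plus its adjacent white run, which spans one staff line plus one space:

```python
import numpy as np


def estimate_staffline_distance(binary_image):
    """Estimate the staffline distance of a binary image (black pixels == 1).

    Counts the sum of each pair of adjacent vertical runs in every column;
    the most common sum approximates line thickness plus space height.
    """
    height, width = binary_image.shape
    counts = {}
    for x in range(width):
        column = binary_image[:, x]
        # Indices where the column switches between black and white.
        boundaries = np.flatnonzero(np.diff(column)) + 1
        run_lengths = np.diff(np.concatenate(([0], boundaries, [height])))
        # Each adjacent (black, white) or (white, black) pair of runs.
        for pair_sum in run_lengths[:-1] + run_lengths[1:]:
            counts[int(pair_sum)] = counts.get(int(pair_sum), 0) + 1
    return max(counts, key=counts.get) if counts else 0
```

On a synthetic page with 1-pixel staff lines every 8 rows, the dominant pair sum is 1 (line) + 7 (space) = 8.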
##### Staff Detection
Concrete subclasses of [BaseStaffDetector](../moonlight/staves/base.py) take in
the image and produce:
* `staves`: Tensor of shape `(num_staves, num_points, 2)`. Coordinates of the
staff center line (third line on the staff).
* `staffline_distance`: Vector of the estimated staffline distance (distance
between consecutive staff lines) for each staff.
* `staffline_thickness`: Scalar thickness of staff lines. Assumed to be the
same for all staves.
* `staves_interpolated_y`: Tensor of shape `(num_staves, width)`. For each
staff and column of the image, outputs the interpolated y position of the
staff center line.
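A minimal sketch of how `staves_interpolated_y` could be derived from the `staves` control points (assuming each point is stored as `(x, y)`; the real coordinate order and interpolation scheme live in the detector implementations):

```python
import numpy as np


def interpolate_staff_center(staff_points, width):
    """Linearly interpolate the center line's y for every image column.

    staff_points: (num_points, 2) array of (x, y) control points along one
    staff's center line (coordinate order is an assumption here).
    """
    xs = staff_points[:, 0]
    ys = staff_points[:, 1]
    # np.interp clamps to the endpoint values outside the control-point range.
    return np.interp(np.arange(width), xs, ys).round().astype(int)
```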
##### Staff Removal
[StaffRemover](../moonlight/staves/removal.py) takes in the image and staves,
and outputs `remove_staves` which is the image with the staff lines erased. This
is useful so that [glyphs](concepts.md) look the same whether they are centered
on a staff line or the space between lines. It is also used within the
structure, by beam detection.
##### Beam Detection
[Beams](../moonlight/structure/beams.py) are currently detected from connected
components on an
[eroded](https://en.wikipedia.org/wiki/Mathematical_morphology#Binary_morphology)
staves-removed image. These are attached to notes by a `BeamProcessor`.
##### Vertical Line Detection
[ColumnBasedVerticals](../moonlight/structure/verticals.py) detects all vertical
lines in the image. These will later be used as either stems or barlines.
#### Glyph Classification: 1-D Convolutional Model
We also run a glyph classifier as part of the TensorFlow graph, which outputs
predictions.
##### Staffline Extraction
Glyphs are considered to lie on a black staff line, or halfway between staff
lines. For OMR, extracted stafflines are slices of the image that are either
centered on a staff line or halfway between staff lines. The line that an
extracted staffline lies on may be referred to simply as a staffline, or as a
y position on the staff.
[StafflineExtractor](../moonlight/staves/staffline_extractor.py) extracts these
vertical slices of the image, and scales their height to a constant value
(currently, 18 pixels tall). `StaffRemover` is used so that all extracted
stafflines should look similar.
##### Glyph Classification
[Glyphs](concepts.md) are classified from small horizontal slices (currently, 15
pixels wide) of the extracted staffline, using a 1-D convolutional model.
A [GlyphClassifier](../moonlight/glyphs/base.py) outputs a Tensor
`staffline_predictions` of shape `(num_staves, num_stafflines, width)`. The
values are `Glyph.Type` enum values. Value 0 (`UNKNOWN_TYPE`) is not used; value
1 (`NONE`) corresponds to no glyph.
### Post-Processing
#### Page Construction
OMR processing operates on [Page
protos](../moonlight/protobuf/musicscore.proto). The Page is first constructed
by `BaseGlyphClassifier.get_page`, which populates the glyphs on each staff.
Staff location information is then added by `StaffProcessor`.
Single `Glyph`s are created from consecutive runs in `staffline_predictions`
that are classified as the same glyph type.
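As a rough illustration of this run-length step (a sketch, not moonlight's actual code; glyph type 1 is `NONE`, per the classifier section above):

```python
NONE = 1  # Glyph.Type.NONE: no glyph at this position.


def predictions_to_glyphs(predictions):
    """Collapse consecutive identical predictions along one staff position
    into (glyph_type, start_x, end_x) runs, dropping NONE runs."""
    glyphs = []
    start = 0
    for x in range(1, len(predictions) + 1):
        # A run ends at the array end or where the prediction changes.
        if x == len(predictions) or predictions[x] != predictions[start]:
            if predictions[start] != NONE:
                glyphs.append((predictions[start], start, x - 1))
            start = x
    return glyphs
```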
Additional processors modify the `Page` in place, usually adding information
from the `Structure`. Each page is run through
[`page_processors.process()`](../moonlight/page_processors.py), and then the
score (containing all pages) is run through
[`score_processors.process()`](../moonlight/score_processors.py).
#### Stem Detection
[Stems](../moonlight/structure/stems.py) finds stem candidates from the vertical
lines, and adds a `Stem` to notehead `Glyph`s if the closest stem is close
enough to the expected position. The `ScoreReader` considers multiple noteheads
with identical `Stem`s as a single chord. Stems will also be used as a negative
signal to avoid detecting barlines in the same area.
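The nearest-stem test might look like the following sketch (illustrative only; the expected offset and the one-staffline-distance threshold are assumptions, not moonlight's tuned values):

```python
def attach_stem(notehead_x, expected_offset, vertical_xs, staffline_distance):
    """Return the x of the vertical line closest to the expected stem
    position, or None if no candidate is close enough (the threshold of
    one staffline distance is an assumed value)."""
    if not vertical_xs:
        return None
    expected = notehead_x + expected_offset
    best = min(vertical_xs, key=lambda v: abs(v - expected))
    return best if abs(best - expected) <= staffline_distance else None
```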
#### Beam Processing
Beams from the `Structure` that are close enough to a stem are added to one or
more notes by [BeamProcessor](../moonlight/structure/beam_processor.py).
#### Barlines
[Barlines](../moonlight/structure/barlines.py) are detected from the verticals
if they have not already been used as a stem.
#### Score Reading
The [ScoreReader](../moonlight/score/reader.py) is the only score processor. It
can potentially use state that lasts across multiple pages, such as the current
time in the score, which needs to persist for the entire score.
Staves are scanned from left to right for glyphs. The `ScoreReader` maintains a
hierarchy of state, from the global `ScoreState` down to the `MeasureState`,
which holds local state such as accidentals. Based on the preceding `Glyph`s,
each notehead `Glyph` is assigned a
[`Note`](https://github.com/magenta/note-seq/blob/d7153cdb26758a69c2fa022782c5817970de7066/note_seq/protobuf/music.proto#L104)
field holding its musical value.
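The state hierarchy can be sketched as follows (illustrative classes; the real reader tracks considerably more, such as the active clef and note durations):

```python
class ScoreState(object):
  """Global state: persists for the entire score, across pages."""

  def __init__(self):
    self.time = 0.0  # running time, e.g. in quarter notes

class MeasureState(object):
  """Local state: reset at every barline."""

  def __init__(self):
    self.accidentals = {}  # y_position -> pitch alteration

def note_alteration(measure_state, y_position, key_alter=0):
  # An accidental seen earlier in the measure overrides the key signature.
  return measure_state.accidentals.get(y_position, key_alter)
```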
Afterwards, the `Score` can be converted to a `NoteSequence` (just pulling out
all of the `Note`s) or [MusicXML](../moonlight/conversions/musicxml.py).
================================================
FILE: moonlight/BUILD
================================================
# Description:
# Optical music recognition using TensorFlow.
package(
default_visibility = ["//moonlight:__subpackages__"],
)
licenses(["notice"]) # Apache 2.0
# The OMR engine. Entry point for running OMR.
py_library(
name = "engine",
srcs = ["engine.py"],
srcs_version = "PY2AND3",
deps = [
":image",
":page_processors",
":score_processors",
"//moonlight/conversions",
"//moonlight/glyphs:saved_classifier_fn",
"//moonlight/protobuf:protobuf_py_pb2",
"//moonlight/staves:base",
"//moonlight/structure",
"//moonlight/structure:beams",
"//moonlight/structure:components",
"//moonlight/structure:verticals",
# numpy dep
# six dep
# tensorflow dep
],
)
# The omr CLI for running locally on a single score.
py_binary(
name = "omr",
srcs = ["omr.py"],
srcs_version = "PY2AND3",
deps = [
":engine",
# disable_tf2
"@com_google_protobuf//:protobuf_python",
# absl dep
"//moonlight/conversions",
"//moonlight/glyphs:saved_classifier_fn",
# tensorflow dep
],
)
py_test(
name = "omr_endtoend_test",
size = "large",
srcs = ["omr_endtoend_test.py"],
data = ["//moonlight/testdata:images"],
shard_count = 4,
srcs_version = "PY2AND3",
deps = [
":engine",
# disable_tf2
# pillow dep
# absl/testing dep
# librosa dep
# lxml dep
"//moonlight/conversions",
"@magenta//protobuf:music_py_pb2",
# numpy dep
# tensorflow.python.platform dep
],
)
py_test(
name = "omr_regression_test",
size = "large",
srcs = ["omr_regression_test.py"],
args = ["--corpus_dir=../omr_regression_test_data"],
data = ["@omr_regression_test_data"],
shard_count = 4,
srcs_version = "PY2AND3",
deps = [
":engine",
# disable_tf2
# absl/testing dep
"//moonlight/protobuf:protobuf_py_pb2",
"//moonlight/score:reader",
],
)
py_library(
name = "image",
srcs = ["image.py"],
srcs_version = "PY2AND3",
deps = [], # tensorflow dep
)
py_library(
name = "page_processors",
srcs = ["page_processors.py"],
srcs_version = "PY2AND3",
deps = [
"//moonlight/glyphs:glyph_types",
"//moonlight/glyphs:note_dots",
"//moonlight/glyphs:repeated",
"//moonlight/staves:staff_processor",
"//moonlight/structure:barlines",
"//moonlight/structure:beam_processor",
"//moonlight/structure:section_barlines",
"//moonlight/structure:stems",
],
)
py_library(
name = "score_processors",
srcs = ["score_processors.py"],
srcs_version = "PY2AND3",
deps = ["//moonlight/score:reader"],
)
================================================
FILE: moonlight/conversions/BUILD
================================================
# Description:
# Format conversions for OMR.
package(
default_visibility = ["//moonlight:__subpackages__"],
)
licenses(["notice"]) # Apache 2.0
py_library(
name = "conversions",
srcs = ["__init__.py"],
deps = [
":musicxml",
":notesequence",
],
)
py_library(
name = "musicxml",
srcs = ["musicxml.py"],
deps = [
# librosa dep
# lxml dep
"//moonlight/protobuf:protobuf_py_pb2",
"//moonlight/score:measures",
],
)
py_test(
name = "musicxml_test",
srcs = ["musicxml_test.py"],
deps = [
":musicxml",
# absl/testing dep
"//moonlight/protobuf:protobuf_py_pb2",
"@magenta//protobuf:music_py_pb2",
],
)
py_library(
name = "notesequence",
srcs = ["notesequence.py"],
deps = [
"//moonlight/protobuf:protobuf_py_pb2",
"@magenta//protobuf:music_py_pb2",
],
)
================================================
FILE: moonlight/conversions/__init__.py
================================================
# Copyright 2018 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""OMR format conversions."""
# TODO(ringw): Score to MusicXML, preserving staves, etc.
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from moonlight.conversions.musicxml import score_to_musicxml
from moonlight.conversions.notesequence import page_to_notesequence
from moonlight.conversions.notesequence import score_to_notesequence
================================================
FILE: moonlight/conversions/musicxml.py
================================================
# Copyright 2018 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Score to MusicXML conversion."""
# TODO(ringw): Key signature
# TODO(ringw): Chords
# TODO(ringw): Stems--MusicXML supports "up" or "down".
# TODO(ringw): Accurate layout of pages, staves, and measures.
# TODO(ringw): Barline types.
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
import copy
import re
import librosa
from lxml import etree
from moonlight.protobuf import musicscore_pb2
from moonlight.score import measures
from six import moves
DOCTYPE = ('<!DOCTYPE score-partwise PUBLIC\n'
' "-//Recordare//DTD MusicXML 3.0 Partwise//EN"\n'
' "http://www.musicxml.org/dtds/partwise.dtd">\n')
MUSICXML_VERSION = '3.0'
# Number of divisions (duration units) per quarter note.
DIVISIONS = 1024
# TODO(ringw): Detect the actual time signature.
TIME_SIGNATURE = etree.Element('time', symbol='common')
etree.SubElement(TIME_SIGNATURE, 'beats').text = '4'
etree.SubElement(TIME_SIGNATURE, 'beat-type').text = '4'
# Note types.
HALF = 'half'
WHOLE = 'whole'
# Indexed by the number of beams on a filled note. Each beam halves the duration
# of the note.
FILLED = [
'quarter', 'eighth', '16th', '32nd', '64th', '128th', '256th', '512th',
'1024th'
]
# Maps the ASCII accidental names to the actual pitch alteration.
ACCIDENTAL_TO_ALTER = {'': 0, '#': 1, 'b': -1}
def score_to_musicxml(score):
"""Converts a `tensorflow.moonlight.Score` to MusicXML.
Args:
score: The OMR score.
Returns:
XML text.
"""
musicxml = MusicXMLScore(score)
measure_num = 0
previous_note_start_time = 0
previous_note_end_time = 0
for page in score.page:
for system in page.system:
system_measures = measures.Measures(system)
for system_measure_num in moves.xrange(system_measures.size()):
for staff_num, staff in enumerate(system.staff):
# Produce the measure, even if there are no glyphs.
measure = musicxml.get_measure(staff_num, measure_num)
for glyph in staff.glyph:
if system_measures.get_measure(glyph) == system_measure_num:
clef = _glyph_to_clef(glyph)
if clef is not None:
attributes = _get_attributes(measure)
if attributes.find('clef') is not None:
attributes.remove(attributes.find('clef'))
attributes.append(clef)
note = _glyph_to_note(glyph)
if note is not None:
if (glyph.note.start_time == previous_note_start_time and
glyph.note.end_time == previous_note_end_time):
position = note.index(note.find('pitch'))
chord = etree.Element('chord')
note.insert(position, chord)
previous_note_start_time = glyph.note.start_time
previous_note_end_time = glyph.note.end_time
measure.append(note)
measure_num += 1
# Add <divisions> and <time> to each part.
for part in musicxml.score:
# XPath indexing is 1-based.
measure = part.find('measure[1]')
if measure is not None:
attributes = _get_attributes(measure, position=0)
etree.SubElement(attributes, 'divisions').text = str(DIVISIONS)
attributes.append(copy.deepcopy(TIME_SIGNATURE))
return musicxml.to_string()
_TREBLE_CLEF = etree.Element('clef')
etree.SubElement(_TREBLE_CLEF, 'sign').text = 'G'
etree.SubElement(_TREBLE_CLEF, 'line').text = '2'
_BASS_CLEF = etree.Element('clef')
etree.SubElement(_BASS_CLEF, 'sign').text = 'F'
etree.SubElement(_BASS_CLEF, 'line').text = '4'
def _get_attributes(measure, position=-1):
"""Gets or creates an `<attributes>` tag in the `<measure>` tag.
If the child of `measure` at the given `position` is not an `<attributes>`
tag, creates a new `<attributes>` tag, appends, and returns it.
`position == -1` will get or insert attributes at the end of the measure.
Args:
measure: A `<measure>` etree tag.
position: The index where the attributes should be found or inserted.
Returns:
An `<attributes>` etree tag.
"""
if len(measure) and measure[position].tag == 'attributes':
return measure[position]
else:
attributes = etree.Element('attributes')
measure.insert(position, attributes)
return attributes
def _glyph_to_clef(glyph):
"""Converts a `Glyph` to a `<clef>` tag.
Args:
glyph: A `tensorflow.moonlight.Glyph` message.
Returns:
An etree `<clef>` tag, or `None` if the glyph is not a clef.
"""
if glyph.type == musicscore_pb2.Glyph.CLEF_TREBLE:
return copy.deepcopy(_TREBLE_CLEF)
elif glyph.type == musicscore_pb2.Glyph.CLEF_BASS:
return copy.deepcopy(_BASS_CLEF)
else:
return None
def _glyph_to_note(glyph):
"""Converts a `Glyph` message to a `<note>` tag.
Args:
glyph: A `tensorflow.moonlight.Glyph` message. The glyph type should be one
of `NOTEHEAD_*`.
Returns:
An etree `<note>` tag, or `None` if the glyph is not a notehead.
Raises:
ValueError: If the note duration is not a multiple of `1 / DIVISIONS`.
"""
if not glyph.HasField('note'):
return None
note = etree.Element('note')
etree.SubElement(note, 'voice').text = '1'
if glyph.type == musicscore_pb2.Glyph.NOTEHEAD_EMPTY:
note_type = HALF
elif glyph.type == musicscore_pb2.Glyph.NOTEHEAD_WHOLE:
note_type = WHOLE
else:
    index = min(len(FILLED) - 1, len(glyph.beam))
note_type = FILLED[index]
etree.SubElement(note, 'type').text = note_type
duration = DIVISIONS * (glyph.note.end_time - glyph.note.start_time)
if not duration.is_integer():
raise ValueError('Duration is not an integer: ' + str(duration))
etree.SubElement(note, 'duration').text = str(int(duration))
pitch_match = re.match('([A-G])([#b]?)([0-9]+)',
librosa.midi_to_note(glyph.note.pitch))
pitch = etree.SubElement(note, 'pitch')
etree.SubElement(pitch, 'step').text = pitch_match.group(1)
etree.SubElement(pitch, 'alter').text = str(
ACCIDENTAL_TO_ALTER[pitch_match.group(2)])
etree.SubElement(pitch, 'octave').text = pitch_match.group(3)
return note
class MusicXMLScore(object):
"""Manages the parts and measures of the MusicXML score.
Provides access to parts and measures by index, creating new parts and
measures as needed.
"""
def __init__(self, omr_score):
num_parts = max(
len(system.staff) for page in omr_score.page for system in page.system)
part_list = _create_part_list(num_parts)
self.score = etree.Element('score-partwise', version=MUSICXML_VERSION)
self.score.append(part_list)
def get_measure(self, part_ind, measure_ind):
while len(self.score.findall('part')) <= part_ind:
next_part_ind = len(self.score.findall('part'))
etree.SubElement(self.score, 'part', id=_get_part_id(next_part_ind))
# XPath indexing is 1-based.
part = self.score.find('part[%d]' % (part_ind + 1))
while len(part.findall('measure')) <= measure_ind:
next_measure_ind = len(part.findall('measure'))
etree.SubElement(part, 'measure', number=str(next_measure_ind + 1))
return part.find('measure[%d]' % (measure_ind + 1))
def to_string(self):
return etree.tostring(
self.score.getroottree(),
pretty_print=True,
xml_declaration=True,
encoding='UTF-8',
doctype=DOCTYPE)
def _create_part_list(num_parts):
part_list = etree.Element('part-list')
for part_num in moves.xrange(1, num_parts + 1):
score_part = etree.SubElement(part_list, 'score-part')
score_part.set('id', 'P%d' % part_num)
etree.SubElement(score_part, 'part-name').text = 'Part %d' % part_num
return part_list
def _get_part_id(part_ind):
return 'P%d' % (part_ind + 1)
================================================
FILE: moonlight/conversions/musicxml_test.py
================================================
# Copyright 2018 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Tests for MusicXML output."""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from absl.testing import absltest
from protobuf import music_pb2
from moonlight.conversions import musicxml
from moonlight.protobuf import musicscore_pb2
class MusicXMLTest(absltest.TestCase):
def testSmallScore(self):
score = musicscore_pb2.Score(page=[
musicscore_pb2.Page(system=[
musicscore_pb2.StaffSystem(staff=[
musicscore_pb2.Staff(glyph=[
musicscore_pb2.Glyph(
type=musicscore_pb2.Glyph.NOTEHEAD_FILLED,
x=10,
y_position=0,
note=music_pb2.NoteSequence.Note(
start_time=0, end_time=1, pitch=71)),
musicscore_pb2.Glyph(
type=musicscore_pb2.Glyph.NOTEHEAD_EMPTY,
x=110,
y_position=-6,
note=music_pb2.NoteSequence.Note(
start_time=1, end_time=2.5, pitch=61)),
]),
musicscore_pb2.Staff(glyph=[
musicscore_pb2.Glyph(
type=musicscore_pb2.Glyph.NOTEHEAD_WHOLE,
x=10,
y_position=2,
note=music_pb2.NoteSequence.Note(
start_time=0, end_time=4, pitch=50)),
musicscore_pb2.Glyph(
type=musicscore_pb2.Glyph.NOTEHEAD_FILLED,
beam=[
musicscore_pb2.LineSegment(),
musicscore_pb2.LineSegment()
],
x=110,
y_position=-4,
note=music_pb2.NoteSequence.Note(
start_time=4, end_time=4.25, pitch=60)),
]),
musicscore_pb2.Staff(glyph=[
musicscore_pb2.Glyph(
type=musicscore_pb2.Glyph.NOTEHEAD_FILLED,
x=10,
y_position=2,
note=music_pb2.NoteSequence.Note(
start_time=0, end_time=4, pitch=50)),
musicscore_pb2.Glyph(
type=musicscore_pb2.Glyph.NOTEHEAD_FILLED,
x=10,
y_position=-4,
note=music_pb2.NoteSequence.Note(
start_time=0, end_time=4, pitch=60)),
]),
]),
]),
])
self.assertEqual(
b"""<?xml version='1.0' encoding='UTF-8'?>
<!DOCTYPE score-partwise PUBLIC
"-//Recordare//DTD MusicXML 3.0 Partwise//EN"
"http://www.musicxml.org/dtds/partwise.dtd">
<score-partwise version="3.0">
<part-list>
<score-part id="P1">
<part-name>Part 1</part-name>
</score-part>
<score-part id="P2">
<part-name>Part 2</part-name>
</score-part>
<score-part id="P3">
<part-name>Part 3</part-name>
</score-part>
</part-list>
<part id="P1">
<measure number="1">
<attributes>
<divisions>1024</divisions>
<time symbol="common">
<beats>4</beats>
<beat-type>4</beat-type>
</time>
</attributes>
<note>
<voice>1</voice>
<type>quarter</type>
<duration>1024</duration>
<pitch>
<step>B</step>
<alter>0</alter>
<octave>4</octave>
</pitch>
</note>
<note>
<voice>1</voice>
<type>half</type>
<duration>1536</duration>
<pitch>
<step>C</step>
<alter>1</alter>
<octave>4</octave>
</pitch>
</note>
</measure>
</part>
<part id="P2">
<measure number="1">
<attributes>
<divisions>1024</divisions>
<time symbol="common">
<beats>4</beats>
<beat-type>4</beat-type>
</time>
</attributes>
<note>
<voice>1</voice>
<type>whole</type>
<duration>4096</duration>
<pitch>
<step>D</step>
<alter>0</alter>
<octave>3</octave>
</pitch>
</note>
<note>
<voice>1</voice>
<type>16th</type>
<duration>256</duration>
<pitch>
<step>C</step>
<alter>0</alter>
<octave>4</octave>
</pitch>
</note>
</measure>
</part>
<part id="P3">
<measure number="1">
<attributes>
<divisions>1024</divisions>
<time symbol="common">
<beats>4</beats>
<beat-type>4</beat-type>
</time>
</attributes>
<note>
<voice>1</voice>
<type>quarter</type>
<duration>4096</duration>
<pitch>
<step>D</step>
<alter>0</alter>
<octave>3</octave>
</pitch>
</note>
<note>
<voice>1</voice>
<type>quarter</type>
<duration>4096</duration>
<chord/>
<pitch>
<step>C</step>
<alter>0</alter>
<octave>4</octave>
</pitch>
</note>
</measure>
</part>
</score-partwise>
""", musicxml.score_to_musicxml(score))
if __name__ == '__main__':
absltest.main()
================================================
FILE: moonlight/conversions/notesequence.py
================================================
# Copyright 2018 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Converts an OMR `Score` to a `NoteSequence`."""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from protobuf import music_pb2
from moonlight.protobuf import musicscore_pb2
def score_to_notesequence(score):
"""Score to NoteSequence conversion.
Args:
score: A `tensorflow.moonlight.Score` message.
Returns:
A `tensorflow.magenta.NoteSequence` message containing the notes in the
score.
"""
return music_pb2.NoteSequence(notes=list(_score_notes(score)))
def page_to_notesequence(page):
return score_to_notesequence(musicscore_pb2.Score(page=[page]))
def _score_notes(score):
for page in score.page:
for system in page.system:
for staff in system.staff:
for glyph in staff.glyph:
if glyph.HasField('note'):
yield glyph.note
================================================
FILE: moonlight/data/README.md
================================================
Description:
Built-in models required for running OMR.
================================================
FILE: moonlight/data/glyphs_nn_model_20180808/BUILD
================================================
package(
default_visibility = ["//moonlight:__subpackages__"],
)
licenses(["notice"]) # Apache 2.0
filegroup(
name = "glyphs_nn_model_20180808",
srcs = glob(["**/*"]),
)
================================================
FILE: moonlight/data/glyphs_nn_model_20180808/saved_model.pbtxt
================================================
saved_model_schema_version: 1
meta_graphs {
meta_info_def {
stripped_op_list {
op {
name: "Add"
input_arg {
name: "x"
type_attr: "T"
}
input_arg {
name: "y"
type_attr: "T"
}
output_arg {
name: "z"
type_attr: "T"
}
attr {
name: "T"
type: "type"
allowed_values {
list {
type: DT_BFLOAT16
type: DT_HALF
type: DT_FLOAT
type: DT_DOUBLE
type: DT_UINT8
type: DT_INT8
type: DT_INT16
type: DT_INT32
type: DT_INT64
type: DT_COMPLEX64
type: DT_COMPLEX128
type: DT_STRING
}
}
}
}
op {
name: "ArgMax"
input_arg {
name: "input"
type_attr: "T"
}
input_arg {
name: "dimension"
type_attr: "Tidx"
}
output_arg {
name: "output"
type_attr: "output_type"
}
attr {
name: "T"
type: "type"
allowed_values {
list {
type: DT_FLOAT
type: DT_DOUBLE
type: DT_INT32
type: DT_UINT8
type: DT_INT16
type: DT_INT8
type: DT_COMPLEX64
type: DT_INT64
type: DT_QINT8
type: DT_QUINT8
type: DT_QINT32
type: DT_BFLOAT16
type: DT_UINT16
type: DT_COMPLEX128
type: DT_HALF
type: DT_UINT32
type: DT_UINT64
}
}
}
attr {
name: "Tidx"
type: "type"
default_value {
type: DT_INT32
}
allowed_values {
list {
type: DT_INT32
type: DT_INT64
}
}
}
attr {
name: "output_type"
type: "type"
default_value {
type: DT_INT64
}
allowed_values {
list {
type: DT_INT32
type: DT_INT64
}
}
}
}
op {
name: "AsString"
input_arg {
name: "input"
type_attr: "T"
}
output_arg {
name: "output"
type: DT_STRING
}
attr {
name: "T"
type: "type"
allowed_values {
list {
type: DT_INT8
type: DT_INT16
type: DT_INT32
type: DT_INT64
type: DT_COMPLEX64
type: DT_FLOAT
type: DT_DOUBLE
type: DT_BOOL
}
}
}
attr {
name: "precision"
type: "int"
default_value {
i: -1
}
}
attr {
name: "scientific"
type: "bool"
default_value {
b: false
}
}
attr {
name: "shortest"
type: "bool"
default_value {
b: false
}
}
attr {
name: "width"
type: "int"
default_value {
i: -1
}
}
attr {
name: "fill"
type: "string"
default_value {
s: ""
}
}
}
op {
name: "Assign"
input_arg {
name: "ref"
type_attr: "T"
is_ref: true
}
input_arg {
name: "value"
type_attr: "T"
}
output_arg {
name: "output_ref"
type_attr: "T"
is_ref: true
}
attr {
name: "T"
type: "type"
}
attr {
name: "validate_shape"
type: "bool"
default_value {
b: true
}
}
attr {
name: "use_locking"
type: "bool"
default_value {
b: true
}
}
allows_uninitialized_input: true
}
op {
name: "BiasAdd"
input_arg {
name: "value"
type_attr: "T"
}
input_arg {
name: "bias"
type_attr: "T"
}
output_arg {
name: "output"
type_attr: "T"
}
attr {
name: "T"
type: "type"
allowed_values {
list {
type: DT_FLOAT
type: DT_DOUBLE
type: DT_INT32
type: DT_UINT8
type: DT_INT16
type: DT_INT8
type: DT_COMPLEX64
type: DT_INT64
type: DT_QINT8
type: DT_QUINT8
type: DT_QINT32
type: DT_BFLOAT16
type: DT_UINT16
type: DT_COMPLEX128
type: DT_HALF
type: DT_UINT32
type: DT_UINT64
}
}
}
attr {
name: "data_format"
type: "string"
default_value {
s: "NHWC"
}
allowed_values {
list {
s: "NHWC"
s: "NCHW"
}
}
}
}
op {
name: "Cast"
input_arg {
name: "x"
type_attr: "SrcT"
}
output_arg {
name: "y"
type_attr: "DstT"
}
attr {
name: "SrcT"
type: "type"
}
attr {
name: "DstT"
type: "type"
}
attr {
name: "Truncate"
type: "bool"
default_value {
b: false
}
}
}
op {
name: "Const"
output_arg {
name: "output"
type_attr: "dtype"
}
attr {
name: "value"
type: "tensor"
}
attr {
name: "dtype"
type: "type"
}
}
op {
name: "Equal"
input_arg {
name: "x"
type_attr: "T"
}
input_arg {
name: "y"
type_attr: "T"
}
output_arg {
name: "z"
type: DT_BOOL
}
attr {
name: "T"
type: "type"
allowed_values {
list {
type: DT_BFLOAT16
type: DT_HALF
type: DT_FLOAT
type: DT_DOUBLE
type: DT_UINT8
type: DT_INT8
type: DT_INT16
type: DT_INT32
type: DT_INT64
type: DT_COMPLEX64
type: DT_QUINT8
type: DT_QINT8
type: DT_QINT32
type: DT_STRING
type: DT_BOOL
type: DT_COMPLEX128
}
}
}
is_commutative: true
}
op {
name: "ExpandDims"
input_arg {
name: "input"
type_attr: "T"
}
input_arg {
name: "dim"
type_attr: "Tdim"
}
output_arg {
name: "output"
type_attr: "T"
}
attr {
name: "T"
type: "type"
}
attr {
name: "Tdim"
type: "type"
default_value {
type: DT_INT32
}
allowed_values {
list {
type: DT_INT32
type: DT_INT64
}
}
}
}
op {
name: "HistogramSummary"
input_arg {
name: "tag"
type: DT_STRING
}
input_arg {
name: "values"
type_attr: "T"
}
output_arg {
name: "summary"
type: DT_STRING
}
attr {
name: "T"
type: "type"
default_value {
type: DT_FLOAT
}
allowed_values {
list {
type: DT_FLOAT
type: DT_DOUBLE
type: DT_INT32
type: DT_UINT8
type: DT_INT16
type: DT_INT8
type: DT_INT64
type: DT_BFLOAT16
type: DT_UINT16
type: DT_HALF
type: DT_UINT32
type: DT_UINT64
}
}
}
}
op {
name: "Identity"
input_arg {
name: "input"
type_attr: "T"
}
output_arg {
name: "output"
type_attr: "T"
}
attr {
name: "T"
type: "type"
}
}
op {
name: "MatMul"
input_arg {
name: "a"
type_attr: "T"
}
input_arg {
name: "b"
type_attr: "T"
}
output_arg {
name: "product"
type_attr: "T"
}
attr {
name: "transpose_a"
type: "bool"
default_value {
b: false
}
}
attr {
name: "transpose_b"
type: "bool"
default_value {
b: false
}
}
attr {
name: "T"
type: "type"
allowed_values {
list {
type: DT_BFLOAT16
type: DT_HALF
type: DT_FLOAT
type: DT_DOUBLE
type: DT_INT32
type: DT_COMPLEX64
type: DT_COMPLEX128
}
}
}
}
op {
name: "Mean"
input_arg {
name: "input"
type_attr: "T"
}
input_arg {
name: "reduction_indices"
type_attr: "Tidx"
}
output_arg {
name: "output"
type_attr: "T"
}
attr {
name: "keep_dims"
type: "bool"
default_value {
b: false
}
}
attr {
name: "T"
type: "type"
allowed_values {
list {
type: DT_FLOAT
type: DT_DOUBLE
type: DT_INT32
type: DT_UINT8
type: DT_INT16
type: DT_INT8
type: DT_COMPLEX64
type: DT_INT64
type: DT_QINT8
type: DT_QUINT8
type: DT_QINT32
type: DT_BFLOAT16
type: DT_UINT16
type: DT_COMPLEX128
type: DT_HALF
type: DT_UINT32
type: DT_UINT64
}
}
}
attr {
name: "Tidx"
type: "type"
default_value {
type: DT_INT32
}
allowed_values {
list {
type: DT_INT32
type: DT_INT64
}
}
}
}
op {
name: "MergeV2Checkpoints"
input_arg {
name: "checkpoint_prefixes"
type: DT_STRING
}
input_arg {
name: "destination_prefix"
type: DT_STRING
}
attr {
name: "delete_old_dirs"
type: "bool"
default_value {
b: true
}
}
is_stateful: true
}
op {
name: "Mul"
input_arg {
name: "x"
type_attr: "T"
}
input_arg {
name: "y"
type_attr: "T"
}
output_arg {
name: "z"
type_attr: "T"
}
attr {
name: "T"
type: "type"
allowed_values {
list {
type: DT_BFLOAT16
type: DT_HALF
type: DT_FLOAT
type: DT_DOUBLE
type: DT_UINT8
type: DT_INT8
type: DT_UINT16
type: DT_INT16
type: DT_INT32
type: DT_INT64
type: DT_COMPLEX64
type: DT_COMPLEX128
}
}
}
is_commutative: true
}
op {
name: "NoOp"
}
op {
name: "Pack"
input_arg {
name: "values"
type_attr: "T"
number_attr: "N"
}
output_arg {
name: "output"
type_attr: "T"
}
attr {
name: "N"
type: "int"
has_minimum: true
minimum: 1
}
attr {
name: "T"
type: "type"
}
attr {
name: "axis"
type: "int"
default_value {
i: 0
}
}
}
op {
name: "ParseExample"
input_arg {
name: "serialized"
type: DT_STRING
}
input_arg {
name: "names"
type: DT_STRING
}
input_arg {
name: "sparse_keys"
type: DT_STRING
number_attr: "Nsparse"
}
input_arg {
name: "dense_keys"
type: DT_STRING
number_attr: "Ndense"
}
input_arg {
name: "dense_defaults"
type_list_attr: "Tdense"
}
output_arg {
name: "sparse_indices"
type: DT_INT64
number_attr: "Nsparse"
}
output_arg {
name: "sparse_values"
type_list_attr: "sparse_types"
}
output_arg {
name: "sparse_shapes"
type: DT_INT64
number_attr: "Nsparse"
}
output_arg {
name: "dense_values"
type_list_attr: "Tdense"
}
attr {
name: "Nsparse"
type: "int"
has_minimum: true
}
attr {
name: "Ndense"
type: "int"
has_minimum: true
}
attr {
name: "sparse_types"
type: "list(type)"
has_minimum: true
allowed_values {
list {
type: DT_FLOAT
type: DT_INT64
type: DT_STRING
}
}
}
attr {
name: "Tdense"
type: "list(type)"
has_minimum: true
allowed_values {
list {
type: DT_FLOAT
type: DT_INT64
type: DT_STRING
}
}
}
attr {
name: "dense_shapes"
type: "list(shape)"
has_minimum: true
}
}
op {
name: "Placeholder"
output_arg {
name: "output"
type_attr: "dtype"
}
attr {
name: "dtype"
type: "type"
}
attr {
name: "shape"
type: "shape"
default_value {
shape {
unknown_rank: true
}
}
}
}
op {
name: "RandomUniform"
input_arg {
name: "shape"
type_attr: "T"
}
output_arg {
name: "output"
type_attr: "dtype"
}
attr {
name: "seed"
type: "int"
default_value {
i: 0
}
}
attr {
name: "seed2"
type: "int"
default_value {
i: 0
}
}
attr {
name: "dtype"
type: "type"
allowed_values {
list {
type: DT_HALF
type: DT_BFLOAT16
type: DT_FLOAT
type: DT_DOUBLE
}
}
}
attr {
name: "T"
type: "type"
allowed_values {
list {
type: DT_INT32
type: DT_INT64
}
}
}
is_stateful: true
}
op {
name: "Range"
input_arg {
name: "start"
type_attr: "Tidx"
}
input_arg {
name: "limit"
type_attr: "Tidx"
}
input_arg {
name: "delta"
type_attr: "Tidx"
}
output_arg {
name: "output"
type_attr: "Tidx"
}
attr {
name: "Tidx"
type: "type"
default_value {
type: DT_INT32
}
allowed_values {
list {
type: DT_BFLOAT16
type: DT_FLOAT
type: DT_DOUBLE
type: DT_INT32
type: DT_INT64
}
}
}
}
op {
name: "Reshape"
input_arg {
name: "tensor"
type_attr: "T"
}
input_arg {
name: "shape"
type_attr: "Tshape"
}
output_arg {
name: "output"
type_attr: "T"
}
attr {
name: "T"
type: "type"
}
attr {
name: "Tshape"
type: "type"
default_value {
type: DT_INT32
}
allowed_values {
list {
type: DT_INT32
type: DT_INT64
}
}
}
}
op {
name: "RestoreV2"
input_arg {
name: "prefix"
type: DT_STRING
}
input_arg {
name: "tensor_names"
type: DT_STRING
}
input_arg {
name: "shape_and_slices"
type: DT_STRING
}
output_arg {
name: "tensors"
type_list_attr: "dtypes"
}
attr {
name: "dtypes"
type: "list(type)"
has_minimum: true
minimum: 1
}
is_stateful: true
}
op {
name: "SaveV2"
input_arg {
name: "prefix"
type: DT_STRING
}
input_arg {
name: "tensor_names"
type: DT_STRING
}
input_arg {
name: "shape_and_slices"
type: DT_STRING
}
input_arg {
name: "tensors"
type_list_attr: "dtypes"
}
attr {
name: "dtypes"
type: "list(type)"
has_minimum: true
minimum: 1
}
is_stateful: true
}
op {
name: "ScalarSummary"
input_arg {
name: "tags"
type: DT_STRING
}
input_arg {
name: "values"
type_attr: "T"
}
output_arg {
name: "summary"
type: DT_STRING
}
attr {
name: "T"
type: "type"
allowed_values {
list {
type: DT_FLOAT
type: DT_DOUBLE
type: DT_INT32
type: DT_UINT8
type: DT_INT16
type: DT_INT8
type: DT_INT64
type: DT_BFLOAT16
type: DT_UINT16
type: DT_HALF
type: DT_UINT32
type: DT_UINT64
}
}
}
}
op {
name: "Shape"
input_arg {
name: "input"
type_attr: "T"
}
output_arg {
name: "output"
type_attr: "out_type"
}
attr {
name: "T"
type: "type"
}
attr {
name: "out_type"
type: "type"
default_value {
type: DT_INT32
}
allowed_values {
list {
type: DT_INT32
type: DT_INT64
}
}
}
}
op {
name: "ShardedFilename"
input_arg {
name: "basename"
type: DT_STRING
}
input_arg {
name: "shard"
type: DT_INT32
}
input_arg {
name: "num_shards"
type: DT_INT32
}
output_arg {
name: "filename"
type: DT_STRING
}
}
op {
name: "Sigmoid"
input_arg {
name: "x"
type_attr: "T"
}
output_arg {
name: "y"
type_attr: "T"
}
attr {
name: "T"
type: "type"
allowed_values {
list {
type: DT_BFLOAT16
type: DT_HALF
type: DT_FLOAT
type: DT_DOUBLE
type: DT_COMPLEX64
type: DT_COMPLEX128
}
}
}
}
op {
name: "Softmax"
input_arg {
name: "logits"
type_attr: "T"
}
output_arg {
name: "softmax"
type_attr: "T"
}
attr {
name: "T"
type: "type"
allowed_values {
list {
type: DT_HALF
type: DT_BFLOAT16
type: DT_FLOAT
type: DT_DOUBLE
}
}
}
}
op {
name: "StridedSlice"
input_arg {
name: "input"
type_attr: "T"
}
input_arg {
name: "begin"
type_attr: "Index"
}
input_arg {
name: "end"
type_attr: "Index"
}
input_arg {
name: "strides"
type_attr: "Index"
}
output_arg {
name: "output"
type_attr: "T"
}
attr {
name: "T"
type: "type"
}
attr {
name: "Index"
type: "type"
allowed_values {
list {
type: DT_INT32
type: DT_INT64
}
}
}
attr {
name: "begin_mask"
type: "int"
default_value {
i: 0
}
}
attr {
name: "end_mask"
type: "int"
default_value {
i: 0
}
}
attr {
name: "ellipsis_mask"
type: "int"
default_value {
i: 0
}
}
attr {
name: "new_axis_mask"
type: "int"
default_value {
i: 0
}
}
attr {
name: "shrink_axis_mask"
type: "int"
default_value {
i: 0
}
}
}
op {
name: "StringJoin"
input_arg {
name: "inputs"
type: DT_STRING
number_attr: "N"
}
output_arg {
name: "output"
type: DT_STRING
}
attr {
name: "N"
type: "int"
has_minimum: true
minimum: 1
}
attr {
name: "separator"
type: "string"
default_value {
s: ""
}
}
}
op {
name: "Sub"
input_arg {
name: "x"
type_attr: "T"
}
input_arg {
name: "y"
type_attr: "T"
}
output_arg {
name: "z"
type_attr: "T"
}
attr {
name: "T"
type: "type"
allowed_values {
list {
type: DT_BFLOAT16
type: DT_HALF
type: DT_FLOAT
type: DT_DOUBLE
type: DT_UINT8
type: DT_INT8
type: DT_UINT16
type: DT_INT16
type: DT_INT32
type: DT_INT64
type: DT_COMPLEX64
type: DT_COMPLEX128
}
}
}
}
op {
name: "Tile"
input_arg {
name: "input"
type_attr: "T"
}
input_arg {
name: "multiples"
type_attr: "Tmultiples"
}
output_arg {
name: "output"
type_attr: "T"
}
attr {
name: "T"
type: "type"
}
attr {
name: "Tmultiples"
type: "type"
default_value {
type: DT_INT32
}
allowed_values {
list {
type: DT_INT32
type: DT_INT64
}
}
}
}
op {
name: "VariableV2"
output_arg {
name: "ref"
type_attr: "dtype"
is_ref: true
}
attr {
name: "shape"
type: "shape"
}
attr {
name: "dtype"
type: "type"
}
attr {
name: "container"
type: "string"
default_value {
s: ""
}
}
attr {
name: "shared_name"
type: "string"
default_value {
s: ""
}
}
is_stateful: true
}
}
tags: "serve"
tensorflow_version: "1.10.0-rc1"
tensorflow_git_version: "unknown"
stripped_default_attrs: true
}
graph_def {
node {
name: "global_step/Initializer/zeros"
op: "Const"
attr {
key: "_class"
value {
list {
s: "loc:@global_step"
}
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
attr {
key: "dtype"
value {
type: DT_INT64
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_INT64
tensor_shape {
}
int64_val: 0
}
}
}
}
node {
name: "global_step"
op: "VariableV2"
attr {
key: "_class"
value {
list {
s: "loc:@global_step"
}
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
attr {
key: "dtype"
value {
type: DT_INT64
}
}
attr {
key: "shape"
value {
shape {
}
}
}
}
node {
name: "global_step/Assign"
op: "Assign"
input: "global_step"
input: "global_step/Initializer/zeros"
attr {
key: "T"
value {
type: DT_INT64
}
}
attr {
key: "_class"
value {
list {
s: "loc:@global_step"
}
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
}
node {
name: "global_step/read"
op: "Identity"
input: "global_step"
attr {
key: "T"
value {
type: DT_INT64
}
}
attr {
key: "_class"
value {
list {
s: "loc:@global_step"
}
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
}
node {
name: "Placeholder"
op: "Placeholder"
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: -1
}
}
}
}
}
attr {
key: "dtype"
value {
type: DT_STRING
}
}
attr {
key: "shape"
value {
shape {
dim {
size: -1
}
}
}
}
}
node {
name: "ParseExample/Const"
op: "Const"
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
}
}
}
}
}
attr {
key: "dtype"
value {
type: DT_FLOAT
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_FLOAT
tensor_shape {
dim {
}
}
}
}
}
}
node {
name: "ParseExample/ParseExample/names"
op: "Const"
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
}
}
}
}
}
attr {
key: "dtype"
value {
type: DT_STRING
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_STRING
tensor_shape {
dim {
}
}
}
}
}
}
node {
name: "ParseExample/ParseExample/dense_keys_0"
op: "Const"
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
attr {
key: "dtype"
value {
type: DT_STRING
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_STRING
tensor_shape {
}
string_val: "patch"
}
}
}
}
node {
name: "ParseExample/ParseExample"
op: "ParseExample"
input: "Placeholder"
input: "ParseExample/ParseExample/names"
input: "ParseExample/ParseExample/dense_keys_0"
input: "ParseExample/Const"
attr {
key: "Ndense"
value {
i: 1
}
}
attr {
key: "Nsparse"
value {
i: 0
}
}
attr {
key: "Tdense"
value {
list {
type: DT_FLOAT
}
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: -1
}
dim {
size: 15
}
dim {
size: 12
}
}
}
}
}
attr {
key: "dense_shapes"
value {
list {
shape {
dim {
size: 15
}
dim {
size: 12
}
}
}
}
}
attr {
key: "sparse_types"
value {
list {
}
}
}
}
node {
name: "params/l1_regularization_strength"
op: "Const"
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
attr {
key: "dtype"
value {
type: DT_FLOAT
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_FLOAT
tensor_shape {
}
float_val: 1
}
}
}
}
node {
name: "params/augmentation_max_rotation_degrees"
op: "Const"
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
attr {
key: "dtype"
value {
type: DT_FLOAT
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_FLOAT
tensor_shape {
}
float_val: 0
}
}
}
}
node {
name: "params/label_weights"
op: "Const"
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
attr {
key: "dtype"
value {
type: DT_STRING
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_STRING
tensor_shape {
}
string_val: "NONE=0.25,FLAT=5.0,NATURAL=5.0,CLEF_TREBLE=5.0,CLEF_BASS=5.0"
}
}
}
}
node {
name: "params/dropout"
op: "Const"
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
attr {
key: "dtype"
value {
type: DT_FLOAT
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_FLOAT
tensor_shape {
}
float_val: 0
}
}
}
}
node {
name: "params/learning_rate"
op: "Const"
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
attr {
key: "dtype"
value {
type: DT_FLOAT
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_FLOAT
tensor_shape {
}
float_val: 0.01
}
}
}
}
node {
name: "params/use_included_label_weight"
op: "Const"
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
attr {
key: "dtype"
value {
type: DT_BOOL
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_BOOL
tensor_shape {
}
bool_val: true
}
}
}
}
node {
name: "params/activation_fn"
op: "Const"
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
attr {
key: "dtype"
value {
type: DT_STRING
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_STRING
tensor_shape {
}
string_val: "sigmoid"
}
}
}
}
node {
name: "params/augmentation_x_shift_probability"
op: "Const"
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
attr {
key: "dtype"
value {
type: DT_FLOAT
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_FLOAT
tensor_shape {
}
float_val: 0.25
}
}
}
}
node {
name: "params/layer_dims"
op: "Const"
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 1
}
}
}
}
}
attr {
key: "dtype"
value {
type: DT_INT32
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_INT32
tensor_shape {
dim {
size: 1
}
}
int_val: 150
}
}
}
}
node {
name: "params/model_name"
op: "Const"
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
attr {
key: "dtype"
value {
type: DT_STRING
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_STRING
tensor_shape {
}
string_val: "glyphs_dnn"
}
}
}
}
node {
name: "params/l2_regularization_strength"
op: "Const"
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
attr {
key: "dtype"
value {
type: DT_FLOAT
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_FLOAT
tensor_shape {
}
float_val: 0
}
}
}
}
node {
name: "dnn/input_from_feature_columns/input_layer/patch/Shape"
op: "Shape"
input: "ParseExample/ParseExample"
attr {
key: "T"
value {
type: DT_FLOAT
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 3
}
}
}
}
}
}
node {
name: "dnn/input_from_feature_columns/input_layer/patch/strided_slice/stack"
op: "Const"
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 1
}
}
}
}
}
attr {
key: "dtype"
value {
type: DT_INT32
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_INT32
tensor_shape {
dim {
size: 1
}
}
int_val: 0
}
}
}
}
node {
name: "dnn/input_from_feature_columns/input_layer/patch/strided_slice/stack_1"
op: "Const"
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 1
}
}
}
}
}
attr {
key: "dtype"
value {
type: DT_INT32
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_INT32
tensor_shape {
dim {
size: 1
}
}
int_val: 1
}
}
}
}
node {
name: "dnn/input_from_feature_columns/input_layer/patch/strided_slice/stack_2"
op: "Const"
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 1
}
}
}
}
}
attr {
key: "dtype"
value {
type: DT_INT32
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_INT32
tensor_shape {
dim {
size: 1
}
}
int_val: 1
}
}
}
}
node {
name: "dnn/input_from_feature_columns/input_layer/patch/strided_slice"
op: "StridedSlice"
input: "dnn/input_from_feature_columns/input_layer/patch/Shape"
input: "dnn/input_from_feature_columns/input_layer/patch/strided_slice/stack"
input: "dnn/input_from_feature_columns/input_layer/patch/strided_slice/stack_1"
input: "dnn/input_from_feature_columns/input_layer/patch/strided_slice/stack_2"
attr {
key: "Index"
value {
type: DT_INT32
}
}
attr {
key: "T"
value {
type: DT_INT32
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
attr {
key: "shrink_axis_mask"
value {
i: 1
}
}
}
node {
name: "dnn/input_from_feature_columns/input_layer/patch/Reshape/shape/1"
op: "Const"
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
attr {
key: "dtype"
value {
type: DT_INT32
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_INT32
tensor_shape {
}
int_val: 180
}
}
}
}
node {
name: "dnn/input_from_feature_columns/input_layer/patch/Reshape/shape"
op: "Pack"
input: "dnn/input_from_feature_columns/input_layer/patch/strided_slice"
input: "dnn/input_from_feature_columns/input_layer/patch/Reshape/shape/1"
attr {
key: "N"
value {
i: 2
}
}
attr {
key: "T"
value {
type: DT_INT32
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 2
}
}
}
}
}
}
node {
name: "dnn/input_from_feature_columns/input_layer/patch/Reshape"
op: "Reshape"
input: "ParseExample/ParseExample"
input: "dnn/input_from_feature_columns/input_layer/patch/Reshape/shape"
attr {
key: "T"
value {
type: DT_FLOAT
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: -1
}
dim {
size: 180
}
}
}
}
}
}
node {
name: "dnn/input_from_feature_columns/input_layer/concat/concat_dim"
op: "Const"
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
attr {
key: "dtype"
value {
type: DT_INT32
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_INT32
tensor_shape {
}
int_val: 1
}
}
}
}
node {
name: "dnn/input_from_feature_columns/input_layer/concat"
op: "Identity"
input: "dnn/input_from_feature_columns/input_layer/patch/Reshape"
attr {
key: "T"
value {
type: DT_FLOAT
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: -1
}
dim {
size: 180
}
}
}
}
}
}
node {
name: "dnn/hiddenlayer_0/kernel/part_0/Initializer/random_uniform/shape"
op: "Const"
attr {
key: "_class"
value {
list {
s: "loc:@dnn/hiddenlayer_0/kernel/part_0"
}
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 2
}
}
}
}
}
attr {
key: "dtype"
value {
type: DT_INT32
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_INT32
tensor_shape {
dim {
size: 2
}
}
tensor_content: "\264\000\000\000\226\000\000\000"
}
}
}
}
node {
name: "dnn/hiddenlayer_0/kernel/part_0/Initializer/random_uniform/min"
op: "Const"
attr {
key: "_class"
value {
list {
s: "loc:@dnn/hiddenlayer_0/kernel/part_0"
}
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
attr {
key: "dtype"
value {
type: DT_FLOAT
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_FLOAT
tensor_shape {
}
float_val: -0.13483997
}
}
}
}
node {
name: "dnn/hiddenlayer_0/kernel/part_0/Initializer/random_uniform/max"
op: "Const"
attr {
key: "_class"
value {
list {
s: "loc:@dnn/hiddenlayer_0/kernel/part_0"
}
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
attr {
key: "dtype"
value {
type: DT_FLOAT
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_FLOAT
tensor_shape {
}
float_val: 0.13483997
}
}
}
}
node {
name: "dnn/hiddenlayer_0/kernel/part_0/Initializer/random_uniform/RandomUniform"
op: "RandomUniform"
input: "dnn/hiddenlayer_0/kernel/part_0/Initializer/random_uniform/shape"
attr {
key: "T"
value {
type: DT_INT32
}
}
attr {
key: "_class"
value {
list {
s: "loc:@dnn/hiddenlayer_0/kernel/part_0"
}
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 180
}
dim {
size: 150
}
}
}
}
}
attr {
key: "dtype"
value {
type: DT_FLOAT
}
}
}
node {
name: "dnn/hiddenlayer_0/kernel/part_0/Initializer/random_uniform/sub"
op: "Sub"
input: "dnn/hiddenlayer_0/kernel/part_0/Initializer/random_uniform/max"
input: "dnn/hiddenlayer_0/kernel/part_0/Initializer/random_uniform/min"
attr {
key: "T"
value {
type: DT_FLOAT
}
}
attr {
key: "_class"
value {
list {
s: "loc:@dnn/hiddenlayer_0/kernel/part_0"
}
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
}
node {
name: "dnn/hiddenlayer_0/kernel/part_0/Initializer/random_uniform/mul"
op: "Mul"
input: "dnn/hiddenlayer_0/kernel/part_0/Initializer/random_uniform/RandomUniform"
input: "dnn/hiddenlayer_0/kernel/part_0/Initializer/random_uniform/sub"
attr {
key: "T"
value {
type: DT_FLOAT
}
}
attr {
key: "_class"
value {
list {
s: "loc:@dnn/hiddenlayer_0/kernel/part_0"
}
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 180
}
dim {
size: 150
}
}
}
}
}
}
node {
name: "dnn/hiddenlayer_0/kernel/part_0/Initializer/random_uniform"
op: "Add"
input: "dnn/hiddenlayer_0/kernel/part_0/Initializer/random_uniform/mul"
input: "dnn/hiddenlayer_0/kernel/part_0/Initializer/random_uniform/min"
attr {
key: "T"
value {
type: DT_FLOAT
}
}
attr {
key: "_class"
value {
list {
s: "loc:@dnn/hiddenlayer_0/kernel/part_0"
}
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 180
}
dim {
size: 150
}
}
}
}
}
}
node {
name: "dnn/hiddenlayer_0/kernel/part_0"
op: "VariableV2"
attr {
key: "_class"
value {
list {
s: "loc:@dnn/hiddenlayer_0/kernel/part_0"
}
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 180
}
dim {
size: 150
}
}
}
}
}
attr {
key: "dtype"
value {
type: DT_FLOAT
}
}
attr {
key: "shape"
value {
shape {
dim {
size: 180
}
dim {
size: 150
}
}
}
}
}
node {
name: "dnn/hiddenlayer_0/kernel/part_0/Assign"
op: "Assign"
input: "dnn/hiddenlayer_0/kernel/part_0"
input: "dnn/hiddenlayer_0/kernel/part_0/Initializer/random_uniform"
attr {
key: "T"
value {
type: DT_FLOAT
}
}
attr {
key: "_class"
value {
list {
s: "loc:@dnn/hiddenlayer_0/kernel/part_0"
}
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 180
}
dim {
size: 150
}
}
}
}
}
}
node {
name: "dnn/hiddenlayer_0/kernel/part_0/read"
op: "Identity"
input: "dnn/hiddenlayer_0/kernel/part_0"
attr {
key: "T"
value {
type: DT_FLOAT
}
}
attr {
key: "_class"
value {
list {
s: "loc:@dnn/hiddenlayer_0/kernel/part_0"
}
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 180
}
dim {
size: 150
}
}
}
}
}
}
node {
name: "dnn/hiddenlayer_0/bias/part_0/Initializer/zeros"
op: "Const"
attr {
key: "_class"
value {
list {
s: "loc:@dnn/hiddenlayer_0/bias/part_0"
}
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 150
}
}
}
}
}
attr {
key: "dtype"
value {
type: DT_FLOAT
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_FLOAT
tensor_shape {
dim {
size: 150
}
}
float_val: 0
}
}
}
}
node {
name: "dnn/hiddenlayer_0/bias/part_0"
op: "VariableV2"
attr {
key: "_class"
value {
list {
s: "loc:@dnn/hiddenlayer_0/bias/part_0"
}
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 150
}
}
}
}
}
attr {
key: "dtype"
value {
type: DT_FLOAT
}
}
attr {
key: "shape"
value {
shape {
dim {
size: 150
}
}
}
}
}
node {
name: "dnn/hiddenlayer_0/bias/part_0/Assign"
op: "Assign"
input: "dnn/hiddenlayer_0/bias/part_0"
input: "dnn/hiddenlayer_0/bias/part_0/Initializer/zeros"
attr {
key: "T"
value {
type: DT_FLOAT
}
}
attr {
key: "_class"
value {
list {
s: "loc:@dnn/hiddenlayer_0/bias/part_0"
}
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 150
}
}
}
}
}
}
node {
name: "dnn/hiddenlayer_0/bias/part_0/read"
op: "Identity"
input: "dnn/hiddenlayer_0/bias/part_0"
attr {
key: "T"
value {
type: DT_FLOAT
}
}
attr {
key: "_class"
value {
list {
s: "loc:@dnn/hiddenlayer_0/bias/part_0"
}
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 150
}
}
}
}
}
}
node {
name: "dnn/hiddenlayer_0/kernel"
op: "Identity"
input: "dnn/hiddenlayer_0/kernel/part_0/read"
attr {
key: "T"
value {
type: DT_FLOAT
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 180
}
dim {
size: 150
}
}
}
}
}
}
node {
name: "dnn/hiddenlayer_0/MatMul"
op: "MatMul"
input: "dnn/input_from_feature_columns/input_layer/concat"
input: "dnn/hiddenlayer_0/kernel"
attr {
key: "T"
value {
type: DT_FLOAT
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: -1
}
dim {
size: 150
}
}
}
}
}
}
node {
name: "dnn/hiddenlayer_0/bias"
op: "Identity"
input: "dnn/hiddenlayer_0/bias/part_0/read"
attr {
key: "T"
value {
type: DT_FLOAT
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 150
}
}
}
}
}
}
node {
name: "dnn/hiddenlayer_0/BiasAdd"
op: "BiasAdd"
input: "dnn/hiddenlayer_0/MatMul"
input: "dnn/hiddenlayer_0/bias"
attr {
key: "T"
value {
type: DT_FLOAT
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: -1
}
dim {
size: 150
}
}
}
}
}
}
node {
name: "dnn/hiddenlayer_0/Sigmoid"
op: "Sigmoid"
input: "dnn/hiddenlayer_0/BiasAdd"
attr {
key: "T"
value {
type: DT_FLOAT
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: -1
}
dim {
size: 150
}
}
}
}
}
}
node {
name: "dnn/zero_fraction/zero"
op: "Const"
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
attr {
key: "dtype"
value {
type: DT_FLOAT
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_FLOAT
tensor_shape {
}
float_val: 0
}
}
}
}
node {
name: "dnn/zero_fraction/Equal"
op: "Equal"
input: "dnn/hiddenlayer_0/Sigmoid"
input: "dnn/zero_fraction/zero"
attr {
key: "T"
value {
type: DT_FLOAT
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: -1
}
dim {
size: 150
}
}
}
}
}
}
node {
name: "dnn/zero_fraction/Cast"
op: "Cast"
input: "dnn/zero_fraction/Equal"
attr {
key: "DstT"
value {
type: DT_FLOAT
}
}
attr {
key: "SrcT"
value {
type: DT_BOOL
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: -1
}
dim {
size: 150
}
}
}
}
}
}
node {
name: "dnn/zero_fraction/Const"
op: "Const"
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 2
}
}
}
}
}
attr {
key: "dtype"
value {
type: DT_INT32
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_INT32
tensor_shape {
dim {
size: 2
}
}
tensor_content: "\000\000\000\000\001\000\000\000"
}
}
}
}
node {
name: "dnn/zero_fraction/Mean"
op: "Mean"
input: "dnn/zero_fraction/Cast"
input: "dnn/zero_fraction/Const"
attr {
key: "T"
value {
type: DT_FLOAT
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
}
node {
name: "dnn/dnn/hiddenlayer_0/fraction_of_zero_values/tags"
op: "Const"
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
attr {
key: "dtype"
value {
type: DT_STRING
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_STRING
tensor_shape {
}
string_val: "dnn/dnn/hiddenlayer_0/fraction_of_zero_values"
}
}
}
}
node {
name: "dnn/dnn/hiddenlayer_0/fraction_of_zero_values"
op: "ScalarSummary"
input: "dnn/dnn/hiddenlayer_0/fraction_of_zero_values/tags"
input: "dnn/zero_fraction/Mean"
attr {
key: "T"
value {
type: DT_FLOAT
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
}
node {
name: "dnn/dnn/hiddenlayer_0/activation/tag"
op: "Const"
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
attr {
key: "dtype"
value {
type: DT_STRING
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_STRING
tensor_shape {
}
string_val: "dnn/dnn/hiddenlayer_0/activation"
}
}
}
}
node {
name: "dnn/dnn/hiddenlayer_0/activation"
op: "HistogramSummary"
input: "dnn/dnn/hiddenlayer_0/activation/tag"
input: "dnn/hiddenlayer_0/Sigmoid"
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
}
node {
name: "dnn/logits/kernel/part_0/Initializer/random_uniform/shape"
op: "Const"
attr {
key: "_class"
value {
list {
s: "loc:@dnn/logits/kernel/part_0"
}
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 2
}
}
}
}
}
attr {
key: "dtype"
value {
type: DT_INT32
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_INT32
tensor_shape {
dim {
size: 2
}
}
tensor_content: "\226\000\000\000\024\000\000\000"
}
}
}
}
node {
name: "dnn/logits/kernel/part_0/Initializer/random_uniform/min"
op: "Const"
attr {
key: "_class"
value {
list {
s: "loc:@dnn/logits/kernel/part_0"
}
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
attr {
key: "dtype"
value {
type: DT_FLOAT
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_FLOAT
tensor_shape {
}
float_val: -0.18786728
}
}
}
}
node {
name: "dnn/logits/kernel/part_0/Initializer/random_uniform/max"
op: "Const"
attr {
key: "_class"
value {
list {
s: "loc:@dnn/logits/kernel/part_0"
}
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
attr {
key: "dtype"
value {
type: DT_FLOAT
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_FLOAT
tensor_shape {
}
float_val: 0.18786728
}
}
}
}
node {
name: "dnn/logits/kernel/part_0/Initializer/random_uniform/RandomUniform"
op: "RandomUniform"
input: "dnn/logits/kernel/part_0/Initializer/random_uniform/shape"
attr {
key: "T"
value {
type: DT_INT32
}
}
attr {
key: "_class"
value {
list {
s: "loc:@dnn/logits/kernel/part_0"
}
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 150
}
dim {
size: 20
}
}
}
}
}
attr {
key: "dtype"
value {
type: DT_FLOAT
}
}
}
node {
name: "dnn/logits/kernel/part_0/Initializer/random_uniform/sub"
op: "Sub"
input: "dnn/logits/kernel/part_0/Initializer/random_uniform/max"
input: "dnn/logits/kernel/part_0/Initializer/random_uniform/min"
attr {
key: "T"
value {
type: DT_FLOAT
}
}
attr {
key: "_class"
value {
list {
s: "loc:@dnn/logits/kernel/part_0"
}
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
}
node {
name: "dnn/logits/kernel/part_0/Initializer/random_uniform/mul"
op: "Mul"
input: "dnn/logits/kernel/part_0/Initializer/random_uniform/RandomUniform"
input: "dnn/logits/kernel/part_0/Initializer/random_uniform/sub"
attr {
key: "T"
value {
type: DT_FLOAT
}
}
attr {
key: "_class"
value {
list {
s: "loc:@dnn/logits/kernel/part_0"
}
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 150
}
dim {
size: 20
}
}
}
}
}
}
node {
name: "dnn/logits/kernel/part_0/Initializer/random_uniform"
op: "Add"
input: "dnn/logits/kernel/part_0/Initializer/random_uniform/mul"
input: "dnn/logits/kernel/part_0/Initializer/random_uniform/min"
attr {
key: "T"
value {
type: DT_FLOAT
}
}
attr {
key: "_class"
value {
list {
s: "loc:@dnn/logits/kernel/part_0"
}
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 150
}
dim {
size: 20
}
}
}
}
}
}
node {
name: "dnn/logits/kernel/part_0"
op: "VariableV2"
attr {
key: "_class"
value {
list {
s: "loc:@dnn/logits/kernel/part_0"
}
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 150
}
dim {
size: 20
}
}
}
}
}
attr {
key: "dtype"
value {
type: DT_FLOAT
}
}
attr {
key: "shape"
value {
shape {
dim {
size: 150
}
dim {
size: 20
}
}
}
}
}
node {
name: "dnn/logits/kernel/part_0/Assign"
op: "Assign"
input: "dnn/logits/kernel/part_0"
input: "dnn/logits/kernel/part_0/Initializer/random_uniform"
attr {
key: "T"
value {
type: DT_FLOAT
}
}
attr {
key: "_class"
value {
list {
s: "loc:@dnn/logits/kernel/part_0"
}
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 150
}
dim {
size: 20
}
}
}
}
}
}
node {
name: "dnn/logits/kernel/part_0/read"
op: "Identity"
input: "dnn/logits/kernel/part_0"
attr {
key: "T"
value {
type: DT_FLOAT
}
}
attr {
key: "_class"
value {
list {
s: "loc:@dnn/logits/kernel/part_0"
}
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 150
}
dim {
size: 20
}
}
}
}
}
}
node {
name: "dnn/logits/bias/part_0/Initializer/zeros"
op: "Const"
attr {
key: "_class"
value {
list {
s: "loc:@dnn/logits/bias/part_0"
}
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 20
}
}
}
}
}
attr {
key: "dtype"
value {
type: DT_FLOAT
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_FLOAT
tensor_shape {
dim {
size: 20
}
}
float_val: 0
}
}
}
}
node {
name: "dnn/logits/bias/part_0"
op: "VariableV2"
attr {
key: "_class"
value {
list {
s: "loc:@dnn/logits/bias/part_0"
}
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 20
}
}
}
}
}
attr {
key: "dtype"
value {
type: DT_FLOAT
}
}
attr {
key: "shape"
value {
shape {
dim {
size: 20
}
}
}
}
}
node {
name: "dnn/logits/bias/part_0/Assign"
op: "Assign"
input: "dnn/logits/bias/part_0"
input: "dnn/logits/bias/part_0/Initializer/zeros"
attr {
key: "T"
value {
type: DT_FLOAT
}
}
attr {
key: "_class"
value {
list {
s: "loc:@dnn/logits/bias/part_0"
}
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 20
}
}
}
}
}
}
node {
name: "dnn/logits/bias/part_0/read"
op: "Identity"
input: "dnn/logits/bias/part_0"
attr {
key: "T"
value {
type: DT_FLOAT
}
}
attr {
key: "_class"
value {
list {
s: "loc:@dnn/logits/bias/part_0"
}
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 20
}
}
}
}
}
}
node {
name: "dnn/logits/kernel"
op: "Identity"
input: "dnn/logits/kernel/part_0/read"
attr {
key: "T"
value {
type: DT_FLOAT
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 150
}
dim {
size: 20
}
}
}
}
}
}
node {
name: "dnn/logits/MatMul"
op: "MatMul"
input: "dnn/hiddenlayer_0/Sigmoid"
input: "dnn/logits/kernel"
attr {
key: "T"
value {
type: DT_FLOAT
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: -1
}
dim {
size: 20
}
}
}
}
}
}
node {
name: "dnn/logits/bias"
op: "Identity"
input: "dnn/logits/bias/part_0/read"
attr {
key: "T"
value {
type: DT_FLOAT
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 20
}
}
}
}
}
}
node {
name: "dnn/logits/BiasAdd"
op: "BiasAdd"
input: "dnn/logits/MatMul"
input: "dnn/logits/bias"
attr {
key: "T"
value {
type: DT_FLOAT
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: -1
}
dim {
size: 20
}
}
}
}
}
}
node {
name: "dnn/zero_fraction_1/zero"
op: "Const"
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
attr {
key: "dtype"
value {
type: DT_FLOAT
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_FLOAT
tensor_shape {
}
float_val: 0
}
}
}
}
node {
name: "dnn/zero_fraction_1/Equal"
op: "Equal"
input: "dnn/logits/BiasAdd"
input: "dnn/zero_fraction_1/zero"
attr {
key: "T"
value {
type: DT_FLOAT
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: -1
}
dim {
size: 20
}
}
}
}
}
}
node {
name: "dnn/zero_fraction_1/Cast"
op: "Cast"
input: "dnn/zero_fraction_1/Equal"
attr {
key: "DstT"
value {
type: DT_FLOAT
}
}
attr {
key: "SrcT"
value {
type: DT_BOOL
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: -1
}
dim {
size: 20
}
}
}
}
}
}
node {
name: "dnn/zero_fraction_1/Const"
op: "Const"
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 2
}
}
}
}
}
attr {
key: "dtype"
value {
type: DT_INT32
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_INT32
tensor_shape {
dim {
size: 2
}
}
tensor_content: "\000\000\000\000\001\000\000\000"
}
}
}
}
node {
name: "dnn/zero_fraction_1/Mean"
op: "Mean"
input: "dnn/zero_fraction_1/Cast"
input: "dnn/zero_fraction_1/Const"
attr {
key: "T"
value {
type: DT_FLOAT
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
}
node {
name: "dnn/dnn/logits/fraction_of_zero_values/tags"
op: "Const"
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
attr {
key: "dtype"
value {
type: DT_STRING
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_STRING
tensor_shape {
}
string_val: "dnn/dnn/logits/fraction_of_zero_values"
}
}
}
}
node {
name: "dnn/dnn/logits/fraction_of_zero_values"
op: "ScalarSummary"
input: "dnn/dnn/logits/fraction_of_zero_values/tags"
input: "dnn/zero_fraction_1/Mean"
attr {
key: "T"
value {
type: DT_FLOAT
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
}
node {
name: "dnn/dnn/logits/activation/tag"
op: "Const"
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
attr {
key: "dtype"
value {
type: DT_STRING
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_STRING
tensor_shape {
}
string_val: "dnn/dnn/logits/activation"
}
}
}
}
node {
name: "dnn/dnn/logits/activation"
op: "HistogramSummary"
input: "dnn/dnn/logits/activation/tag"
input: "dnn/logits/BiasAdd"
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
}
node {
name: "dnn/head/logits/Shape"
op: "Shape"
input: "dnn/logits/BiasAdd"
attr {
key: "T"
value {
type: DT_FLOAT
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 2
}
}
}
}
}
}
node {
name: "dnn/head/logits/assert_rank_at_least/rank"
op: "Const"
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
attr {
key: "dtype"
value {
type: DT_INT32
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_INT32
tensor_shape {
}
int_val: 2
}
}
}
}
node {
name: "dnn/head/logits/assert_rank_at_least/assert_type/statically_determined_correct_type"
op: "NoOp"
}
node {
name: "dnn/head/logits/assert_rank_at_least/static_checks_determined_all_ok"
op: "NoOp"
}
node {
name: "dnn/head/predictions/class_ids/dimension"
op: "Const"
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
attr {
key: "dtype"
value {
type: DT_INT32
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_INT32
tensor_shape {
}
int_val: -1
}
}
}
}
node {
name: "dnn/head/predictions/class_ids"
op: "ArgMax"
input: "dnn/logits/BiasAdd"
input: "dnn/head/predictions/class_ids/dimension"
attr {
key: "T"
value {
type: DT_FLOAT
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: -1
}
}
}
}
}
}
node {
name: "dnn/head/predictions/ExpandDims/dim"
op: "Const"
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
attr {
key: "dtype"
value {
type: DT_INT32
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_INT32
tensor_shape {
}
int_val: -1
}
}
}
}
node {
name: "dnn/head/predictions/ExpandDims"
op: "ExpandDims"
input: "dnn/head/predictions/class_ids"
input: "dnn/head/predictions/ExpandDims/dim"
attr {
key: "T"
value {
type: DT_INT64
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: -1
}
dim {
size: 1
}
}
}
}
}
}
node {
name: "dnn/head/predictions/str_classes"
op: "AsString"
input: "dnn/head/predictions/ExpandDims"
attr {
key: "T"
value {
type: DT_INT64
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: -1
}
dim {
size: 1
}
}
}
}
}
}
node {
name: "dnn/head/predictions/probabilities"
op: "Softmax"
input: "dnn/logits/BiasAdd"
attr {
key: "T"
value {
type: DT_FLOAT
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: -1
}
dim {
size: 20
}
}
}
}
}
}
node {
name: "dnn/head/Shape"
op: "Shape"
input: "dnn/head/predictions/probabilities"
attr {
key: "T"
value {
type: DT_FLOAT
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 2
}
}
}
}
}
}
node {
name: "dnn/head/strided_slice/stack"
op: "Const"
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 1
}
}
}
}
}
attr {
key: "dtype"
value {
type: DT_INT32
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_INT32
tensor_shape {
dim {
size: 1
}
}
int_val: 0
}
}
}
}
node {
name: "dnn/head/strided_slice/stack_1"
op: "Const"
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 1
}
}
}
}
}
attr {
key: "dtype"
value {
type: DT_INT32
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_INT32
tensor_shape {
dim {
size: 1
}
}
int_val: 1
}
}
}
}
node {
name: "dnn/head/strided_slice/stack_2"
op: "Const"
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 1
}
}
}
}
}
attr {
key: "dtype"
value {
type: DT_INT32
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_INT32
tensor_shape {
dim {
size: 1
}
}
int_val: 1
}
}
}
}
node {
name: "dnn/head/strided_slice"
op: "StridedSlice"
input: "dnn/head/Shape"
input: "dnn/head/strided_slice/stack"
input: "dnn/head/strided_slice/stack_1"
input: "dnn/head/strided_slice/stack_2"
attr {
key: "Index"
value {
type: DT_INT32
}
}
attr {
key: "T"
value {
type: DT_INT32
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
attr {
key: "shrink_axis_mask"
value {
i: 1
}
}
}
node {
name: "dnn/head/range/start"
op: "Const"
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
attr {
key: "dtype"
value {
type: DT_INT32
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_INT32
tensor_shape {
}
int_val: 0
}
}
}
}
node {
name: "dnn/head/range/limit"
op: "Const"
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
attr {
key: "dtype"
value {
type: DT_INT32
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_INT32
tensor_shape {
}
int_val: 20
}
}
}
}
node {
name: "dnn/head/range/delta"
op: "Const"
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
attr {
key: "dtype"
value {
type: DT_INT32
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_INT32
tensor_shape {
}
int_val: 1
}
}
}
}
node {
name: "dnn/head/range"
op: "Range"
input: "dnn/head/range/start"
input: "dnn/head/range/limit"
input: "dnn/head/range/delta"
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 20
}
}
}
}
}
}
node {
name: "dnn/head/AsString"
op: "AsString"
input: "dnn/head/range"
attr {
key: "T"
value {
type: DT_INT32
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 20
}
}
}
}
}
}
node {
name: "dnn/head/ExpandDims/dim"
op: "Const"
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
attr {
key: "dtype"
value {
type: DT_INT32
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_INT32
tensor_shape {
}
int_val: 0
}
}
}
}
node {
name: "dnn/head/ExpandDims"
op: "ExpandDims"
input: "dnn/head/AsString"
input: "dnn/head/ExpandDims/dim"
attr {
key: "T"
value {
type: DT_STRING
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 1
}
dim {
size: 20
}
}
}
}
}
}
node {
name: "dnn/head/Tile/multiples/1"
op: "Const"
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
attr {
key: "dtype"
value {
type: DT_INT32
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_INT32
tensor_shape {
}
int_val: 1
}
}
}
}
node {
name: "dnn/head/Tile/multiples"
op: "Pack"
input: "dnn/head/strided_slice"
input: "dnn/head/Tile/multiples/1"
attr {
key: "N"
value {
i: 2
}
}
attr {
key: "T"
value {
type: DT_INT32
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 2
}
}
}
}
}
}
node {
name: "dnn/head/Tile"
op: "Tile"
input: "dnn/head/ExpandDims"
input: "dnn/head/Tile/multiples"
attr {
key: "T"
value {
type: DT_STRING
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: -1
}
dim {
size: 20
}
}
}
}
}
}
node {
name: "zero_fraction/zero"
op: "Const"
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
attr {
key: "dtype"
value {
type: DT_FLOAT
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_FLOAT
tensor_shape {
}
float_val: 0
}
}
}
}
node {
name: "zero_fraction/Equal"
op: "Equal"
input: "dnn/hiddenlayer_0/kernel/part_0/read"
input: "zero_fraction/zero"
attr {
key: "T"
value {
type: DT_FLOAT
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 180
}
dim {
size: 150
}
}
}
}
}
}
node {
name: "zero_fraction/Cast"
op: "Cast"
input: "zero_fraction/Equal"
attr {
key: "DstT"
value {
type: DT_FLOAT
}
}
attr {
key: "SrcT"
value {
type: DT_BOOL
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 180
}
dim {
size: 150
}
}
}
}
}
}
node {
name: "zero_fraction/Const"
op: "Const"
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 2
}
}
}
}
}
attr {
key: "dtype"
value {
type: DT_INT32
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_INT32
tensor_shape {
dim {
size: 2
}
}
tensor_content: "\000\000\000\000\001\000\000\000"
}
}
}
}
node {
name: "zero_fraction/Mean"
op: "Mean"
input: "zero_fraction/Cast"
input: "zero_fraction/Const"
attr {
key: "T"
value {
type: DT_FLOAT
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
}
node {
name: "dnn/hiddenlayer_0/kernel/part_0/fraction_of_zero_weights/tags"
op: "Const"
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
attr {
key: "dtype"
value {
type: DT_STRING
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_STRING
tensor_shape {
}
string_val: "dnn/hiddenlayer_0/kernel/part_0/fraction_of_zero_weights"
}
}
}
}
node {
name: "dnn/hiddenlayer_0/kernel/part_0/fraction_of_zero_weights"
op: "ScalarSummary"
input: "dnn/hiddenlayer_0/kernel/part_0/fraction_of_zero_weights/tags"
input: "zero_fraction/Mean"
attr {
key: "T"
value {
type: DT_FLOAT
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
}
node {
name: "zero_fraction_1/zero"
op: "Const"
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
attr {
key: "dtype"
value {
type: DT_FLOAT
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_FLOAT
tensor_shape {
}
float_val: 0
}
}
}
}
node {
name: "zero_fraction_1/Equal"
op: "Equal"
input: "dnn/logits/kernel/part_0/read"
input: "zero_fraction_1/zero"
attr {
key: "T"
value {
type: DT_FLOAT
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 150
}
dim {
size: 20
}
}
}
}
}
}
node {
name: "zero_fraction_1/Cast"
op: "Cast"
input: "zero_fraction_1/Equal"
attr {
key: "DstT"
value {
type: DT_FLOAT
}
}
attr {
key: "SrcT"
value {
type: DT_BOOL
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 150
}
dim {
size: 20
}
}
}
}
}
}
node {
name: "zero_fraction_1/Const"
op: "Const"
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 2
}
}
}
}
}
attr {
key: "dtype"
value {
type: DT_INT32
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_INT32
tensor_shape {
dim {
size: 2
}
}
tensor_content: "\000\000\000\000\001\000\000\000"
}
}
}
}
node {
name: "zero_fraction_1/Mean"
op: "Mean"
input: "zero_fraction_1/Cast"
input: "zero_fraction_1/Const"
attr {
key: "T"
value {
type: DT_FLOAT
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
}
node {
name: "dnn/logits/kernel/part_0/fraction_of_zero_weights/tags"
op: "Const"
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
attr {
key: "dtype"
value {
type: DT_STRING
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_STRING
tensor_shape {
}
string_val: "dnn/logits/kernel/part_0/fraction_of_zero_weights"
}
}
}
}
node {
name: "dnn/logits/kernel/part_0/fraction_of_zero_weights"
op: "ScalarSummary"
input: "dnn/logits/kernel/part_0/fraction_of_zero_weights/tags"
input: "zero_fraction_1/Mean"
attr {
key: "T"
value {
type: DT_FLOAT
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
}
node {
name: "init"
op: "NoOp"
}
node {
name: "init_all_tables"
op: "NoOp"
}
node {
name: "init_1"
op: "NoOp"
}
node {
name: "group_deps"
op: "NoOp"
input: "^init"
input: "^init_1"
input: "^init_all_tables"
}
node {
name: "save/Const"
op: "Const"
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
attr {
key: "dtype"
value {
type: DT_STRING
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_STRING
tensor_shape {
}
string_val: "model"
}
}
}
}
node {
name: "save/StringJoin/inputs_1"
op: "Const"
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
attr {
key: "dtype"
value {
type: DT_STRING
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_STRING
tensor_shape {
}
string_val: "_temp_579b5e65d43d45f3bd142ae66888bbab/part"
}
}
}
}
node {
name: "save/StringJoin"
op: "StringJoin"
input: "save/Const"
input: "save/StringJoin/inputs_1"
attr {
key: "N"
value {
i: 2
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
}
node {
name: "save/num_shards"
op: "Const"
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
attr {
key: "dtype"
value {
type: DT_INT32
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_INT32
tensor_shape {
}
int_val: 1
}
}
}
}
node {
name: "save/ShardedFilename/shard"
op: "Const"
device: "/device:CPU:0"
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
attr {
key: "dtype"
value {
type: DT_INT32
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_INT32
tensor_shape {
}
int_val: 0
}
}
}
}
node {
name: "save/ShardedFilename"
op: "ShardedFilename"
input: "save/StringJoin"
input: "save/ShardedFilename/shard"
input: "save/num_shards"
device: "/device:CPU:0"
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
}
node {
name: "save/SaveV2/tensor_names"
op: "Const"
device: "/device:CPU:0"
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 5
}
}
}
}
}
attr {
key: "dtype"
value {
type: DT_STRING
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_STRING
tensor_shape {
dim {
size: 5
}
}
string_val: "dnn/hiddenlayer_0/bias"
string_val: "dnn/hiddenlayer_0/kernel"
string_val: "dnn/logits/bias"
string_val: "dnn/logits/kernel"
string_val: "global_step"
}
}
}
}
node {
name: "save/SaveV2/shape_and_slices"
op: "Const"
device: "/device:CPU:0"
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 5
}
}
}
}
}
attr {
key: "dtype"
value {
type: DT_STRING
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_STRING
tensor_shape {
dim {
size: 5
}
}
string_val: "150 0,150"
string_val: "180 150 0,180:0,150"
string_val: "20 0,20"
string_val: "150 20 0,150:0,20"
string_val: ""
}
}
}
}
node {
name: "save/SaveV2"
op: "SaveV2"
input: "save/ShardedFilename"
input: "save/SaveV2/tensor_names"
input: "save/SaveV2/shape_and_slices"
input: "dnn/hiddenlayer_0/bias/part_0/read"
input: "dnn/hiddenlayer_0/kernel/part_0/read"
input: "dnn/logits/bias/part_0/read"
input: "dnn/logits/kernel/part_0/read"
input: "global_step"
device: "/device:CPU:0"
attr {
key: "dtypes"
value {
list {
type: DT_FLOAT
type: DT_FLOAT
type: DT_FLOAT
type: DT_FLOAT
type: DT_INT64
}
}
}
}
node {
name: "save/control_dependency"
op: "Identity"
input: "save/ShardedFilename"
input: "^save/SaveV2"
device: "/device:CPU:0"
attr {
key: "T"
value {
type: DT_STRING
}
}
attr {
key: "_class"
value {
list {
s: "loc:@save/ShardedFilename"
}
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
}
node {
name: "save/MergeV2Checkpoints/checkpoint_prefixes"
op: "Pack"
input: "save/ShardedFilename"
input: "^save/control_dependency"
device: "/device:CPU:0"
attr {
key: "N"
value {
i: 1
}
}
attr {
key: "T"
value {
type: DT_STRING
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 1
}
}
}
}
}
}
node {
name: "save/MergeV2Checkpoints"
op: "MergeV2Checkpoints"
input: "save/MergeV2Checkpoints/checkpoint_prefixes"
input: "save/Const"
device: "/device:CPU:0"
}
node {
name: "save/Identity"
op: "Identity"
input: "save/Const"
input: "^save/MergeV2Checkpoints"
input: "^save/control_dependency"
device: "/device:CPU:0"
attr {
key: "T"
value {
type: DT_STRING
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
}
node {
name: "save/RestoreV2/tensor_names"
op: "Const"
device: "/device:CPU:0"
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 5
}
}
}
}
}
attr {
key: "dtype"
value {
type: DT_STRING
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_STRING
tensor_shape {
dim {
size: 5
}
}
string_val: "dnn/hiddenlayer_0/bias"
string_val: "dnn/hiddenlayer_0/kernel"
string_val: "dnn/logits/bias"
string_val: "dnn/logits/kernel"
string_val: "global_step"
}
}
}
}
node {
name: "save/RestoreV2/shape_and_slices"
op: "Const"
device: "/device:CPU:0"
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 5
}
}
}
}
}
attr {
key: "dtype"
value {
type: DT_STRING
}
}
attr {
key: "value"
value {
tensor {
dtype: DT_STRING
tensor_shape {
dim {
size: 5
}
}
string_val: "150 0,150"
string_val: "180 150 0,180:0,150"
string_val: "20 0,20"
string_val: "150 20 0,150:0,20"
string_val: ""
}
}
}
}
node {
name: "save/RestoreV2"
op: "RestoreV2"
input: "save/Const"
input: "save/RestoreV2/tensor_names"
input: "save/RestoreV2/shape_and_slices"
device: "/device:CPU:0"
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 150
}
}
shape {
dim {
size: 180
}
dim {
size: 150
}
}
shape {
dim {
size: 20
}
}
shape {
dim {
size: 150
}
dim {
size: 20
}
}
shape {
unknown_rank: true
}
}
}
}
attr {
key: "dtypes"
value {
list {
type: DT_FLOAT
type: DT_FLOAT
type: DT_FLOAT
type: DT_FLOAT
type: DT_INT64
}
}
}
}
node {
name: "save/Assign"
op: "Assign"
input: "dnn/hiddenlayer_0/bias/part_0"
input: "save/RestoreV2"
attr {
key: "T"
value {
type: DT_FLOAT
}
}
attr {
key: "_class"
value {
list {
s: "loc:@dnn/hiddenlayer_0/bias/part_0"
}
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 150
}
}
}
}
}
}
node {
name: "save/Assign_1"
op: "Assign"
input: "dnn/hiddenlayer_0/kernel/part_0"
input: "save/RestoreV2:1"
attr {
key: "T"
value {
type: DT_FLOAT
}
}
attr {
key: "_class"
value {
list {
s: "loc:@dnn/hiddenlayer_0/kernel/part_0"
}
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 180
}
dim {
size: 150
}
}
}
}
}
}
node {
name: "save/Assign_2"
op: "Assign"
input: "dnn/logits/bias/part_0"
input: "save/RestoreV2:2"
attr {
key: "T"
value {
type: DT_FLOAT
}
}
attr {
key: "_class"
value {
list {
s: "loc:@dnn/logits/bias/part_0"
}
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 20
}
}
}
}
}
}
node {
name: "save/Assign_3"
op: "Assign"
input: "dnn/logits/kernel/part_0"
input: "save/RestoreV2:3"
attr {
key: "T"
value {
type: DT_FLOAT
}
}
attr {
key: "_class"
value {
list {
s: "loc:@dnn/logits/kernel/part_0"
}
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
dim {
size: 150
}
dim {
size: 20
}
}
}
}
}
}
node {
name: "save/Assign_4"
op: "Assign"
input: "global_step"
input: "save/RestoreV2:4"
attr {
key: "T"
value {
type: DT_INT64
}
}
attr {
key: "_class"
value {
list {
s: "loc:@global_step"
}
}
}
attr {
key: "_output_shapes"
value {
list {
shape {
}
}
}
}
}
node {
name: "save/restore_shard"
op: "NoOp"
input: "^save/Assign"
input: "^save/Assign_1"
input: "^save/Assign_2"
input: "^save/Assign_3"
input: "^save/Assign_4"
}
node {
name: "save/restore_all"
op: "NoOp"
input: "^save/restore_shard"
}
versions {
producer: 26
}
}
saver_def {
filename_tensor_name: "save/Const:0"
save_tensor_name: "save/Identity:0"
restore_op_name: "save/restore_all"
max_to_keep: 5
sharded: true
keep_checkpoint_every_n_hours: 10000
version: V2
}
collection_def {
key: "global_step"
value {
bytes_list {
value: "\n\rglobal_step:0\022\022global_step/Assign\032\022global_step/read:02\037global_step/Initializer/zeros:0"
}
}
}
collection_def {
key: "params"
value {
node_list {
value: "params/l1_regularization_strength:0"
value: "params/augmentation_max_rotation_degrees:0"
value: "params/label_weights:0"
value: "params/dropout:0"
value: "params/learning_rate:0"
value: "params/use_included_label_weight:0"
value: "params/activation_fn:0"
value: "params/augmentation_x_shift_probability:0"
value: "params/layer_dims:0"
value: "params/model_name:0"
value: "params/l2_regularization_strength:0"
}
}
}
collection_def {
key: "saved_model_main_op"
value {
node_list {
value: "group_deps"
}
}
}
collection_def {
key: "summaries"
value {
node_list {
value: "dnn/dnn/hiddenlayer_0/fraction_of_zero_values:0"
value: "dnn/dnn/hiddenlayer_0/activation:0"
value: "dnn/dnn/logits/fraction_of_zero_values:0"
value: "dnn/dnn/logits/activation:0"
value: "dnn/hiddenlayer_0/kernel/part_0/fraction_of_zero_weights:0"
value: "dnn/logits/kernel/part_0/fraction_of_zero_weights:0"
}
}
}
collection_def {
key: "trainable_variables"
value {
bytes_list {
value: "\n!dnn/hiddenlayer_0/kernel/part_0:0\022&dnn/hiddenlayer_0/kernel/part_0/Assign\032&dnn/hiddenlayer_0/kernel/part_0/read:0\"*\n\030dnn/hiddenlayer_0/kernel\022\004\264\001\226\001\032\002\000\000\"\004\264\001\226\0012<dnn/hiddenlayer_0/kernel/part_0/Initializer/random_uniform:08\001"
value: "\n\037dnn/hiddenlayer_0/bias/part_0:0\022$dnn/hiddenlayer_0/bias/part_0/Assign\032$dnn/hiddenlayer_0/bias/part_0/read:0\"#\n\026dnn/hiddenlayer_0/bias\022\002\226\001\032\001\000\"\002\226\00121dnn/hiddenlayer_0/bias/part_0/Initializer/zeros:08\001"
value: "\n\032dnn/logits/kernel/part_0:0\022\037dnn/logits/kernel/part_0/Assign\032\037dnn/logits/kernel/part_0/read:0\"!\n\021dnn/logits/kernel\022\003\226\001\024\032\002\000\000\"\003\226\001\02425dnn/logits/kernel/part_0/Initializer/random_uniform:08\001"
value: "\n\030dnn/logits/bias/part_0:0\022\035dnn/logits/bias/part_0/Assign\032\035dnn/logits/bias/part_0/read:0\"\032\n\017dnn/logits/bias\022\001\024\032\001\000\"\001\0242*dnn/logits/bias/part_0/Initializer/zeros:08\001"
}
}
}
collection_def {
key: "variables"
value {
bytes_list {
value: "\n\rglobal_step:0\022\022global_step/Assign\032\022global_step/read:02\037global_step/Initializer/zeros:0"
value: "\n!dnn/hiddenlayer_0/kernel/part_0:0\022&dnn/hiddenlayer_0/kernel/part_0/Assign\032&dnn/hiddenlayer_0/kernel/part_0/read:0\"*\n\030dnn/hiddenlayer_0/kernel\022\004\264\001\226\001\032\002\000\000\"\004\264\001\226\0012<dnn/hiddenlayer_0/kernel/part_0/Initializer/random_uniform:08\001"
value: "\n\037dnn/hiddenlayer_0/bias/part_0:0\022$dnn/hiddenlayer_0/bias/part_0/Assign\032$dnn/hiddenlayer_0/bias/part_0/read:0\"#\n\026dnn/hiddenlayer_0/bias\022\002\226\001\032\001\000\"\002\226\00121dnn/hiddenlayer_0/bias/part_0/Initializer/zeros:08\001"
value: "\n\032dnn/logits/kernel/part_0:0\022\037dnn/logits/kernel/part_0/Assign\032\037dnn/logits/kernel/part_0/read:0\"!\n\021dnn/logits/kernel\022\003\226\001\024\032\002\000\000\"\003\226\001\02425dnn/logits/kernel/part_0/Initializer/random_uniform:08\001"
value: "\n\030dnn/logits/bias/part_0:0\022\035dnn/logits/bias/part_0/Assign\032\035dnn/logits/bias/part_0/read:0\"\032\n\017dnn/logits/bias\022\001\024\032\001\000\"\001\0242*dnn/logits/bias/part_0/Initializer/zeros:08\001"
}
}
}
signature_def {
key: "example:classification"
value {
inputs {
key: "inputs"
value {
name: "Placeholder:0"
dtype: DT_STRING
tensor_shape {
dim {
size: -1
}
}
}
}
outputs {
key: "classes"
value {
name: "dnn/head/Tile:0"
dtype: DT_STRING
tensor_shape {
dim {
size: -1
}
dim {
size: 20
}
}
}
}
outputs {
key: "scores"
value {
name: "dnn/head/predictions/probabilities:0"
dtype: DT_FLOAT
tensor_shape {
dim {
size: -1
}
dim {
size: 20
}
}
}
}
method_name: "tensorflow/serving/classify"
}
}
signature_def {
key: "example:predict"
value {
inputs {
key: "input"
value {
name: "Placeholder:0"
dtype: DT_STRING
tensor_shape {
dim {
size: -1
}
}
}
}
outputs {
key: "class_ids"
value {
name: "dnn/head/predictions/ExpandDims:0"
dtype: DT_INT64
tensor_shape {
dim {
size: -1
}
dim {
size: 1
}
}
}
}
outputs {
key: "classes"
value {
name: "dnn/head/predictions/str_classes:0"
dtype: DT_STRING
tensor_shape {
dim {
size: -1
}
dim {
size: 1
}
}
}
}
outputs {
key: "logits"
value {
name: "dnn/logits/BiasAdd:0"
dtype: DT_FLOAT
tensor_shape {
dim {
size: -1
}
dim {
size: 20
}
}
}
}
outputs {
key: "probabilities"
value {
name: "dnn/head/predictions/probabilities:0"
dtype: DT_FLOAT
tensor_shape {
dim {
size: -1
}
dim {
size: 20
}
}
}
}
method_name: "tensorflow/serving/predict"
}
}
signature_def {
key: "example:serving_default"
value {
inputs {
key: "inputs"
value {
name: "Placeholder:0"
dtype: DT_STRING
tensor_shape {
dim {
size: -1
}
}
}
}
outputs {
key: "classes"
value {
name: "dnn/head/Tile:0"
dtype: DT_STRING
tensor_shape {
dim {
size: -1
}
dim {
size: 20
}
}
}
}
outputs {
key: "scores"
value {
name: "dnn/head/predictions/probabilities:0"
dtype: DT_FLOAT
tensor_shape {
dim {
size: -1
}
dim {
size: 20
}
}
}
}
method_name: "tensorflow/serving/classify"
}
}
signature_def {
key: "patch:predict"
value {
inputs {
key: "input"
value {
name: "ParseExample/ParseExample:0"
dtype: DT_FLOAT
tensor_shape {
dim {
size: -1
}
dim {
size: 15
}
dim {
size: 12
}
}
}
}
outputs {
key: "class_ids"
value {
name: "dnn/head/predictions/ExpandDims:0"
dtype: DT_INT64
tensor_shape {
dim {
size: -1
}
dim {
size: 1
}
}
}
}
outputs {
key: "classes"
value {
name: "dnn/head/predictions/str_classes:0"
dtype: DT_STRING
tensor_shape {
dim {
size: -1
}
dim {
size: 1
}
}
}
}
outputs {
key: "logits"
value {
name: "dnn/logits/BiasAdd:0"
dtype: DT_FLOAT
tensor_shape {
dim {
size: -1
}
dim {
size: 20
}
}
}
}
outputs {
key: "probabilities"
value {
name: "dnn/head/predictions/probabilities:0"
dtype: DT_FLOAT
tensor_shape {
dim {
size: -1
}
dim {
size: 20
}
}
}
}
method_name: "tensorflow/serving/predict"
}
}
signature_def {
key: "predict"
value {
inputs {
key: "input"
value {
name: "ParseExample/ParseExample:0"
dtype: DT_FLOAT
tensor_shape {
dim {
size: -1
}
dim {
size: 15
}
dim {
size: 12
}
}
}
}
outputs {
key: "class_ids"
value {
name: "dnn/head/predictions/ExpandDims:0"
dtype: DT_INT64
tensor_shape {
dim {
size: -1
}
dim {
size: 1
}
}
}
}
outputs {
key: "classes"
value {
name: "dnn/head/predictions/str_classes:0"
dtype: DT_STRING
tensor_shape {
dim {
size: -1
}
dim {
size: 1
}
}
}
}
outputs {
key: "logits"
value {
name: "dnn/logits/BiasAdd:0"
dtype: DT_FLOAT
tensor_shape {
dim {
size: -1
}
dim {
size: 20
}
}
}
}
outputs {
key: "probabilities"
value {
name: "dnn/head/predictions/probabilities:0"
dtype: DT_FLOAT
tensor_shape {
dim {
size: -1
}
dim {
size: 20
}
}
}
}
method_name: "tensorflow/serving/predict"
}
}
}
================================================
FILE: moonlight/engine.py
================================================
# Copyright 2018 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""The optical music recognition engine.
Parses PNG music score images.
The engine holds a TensorFlow graph containing all structural information and
classifier predictions from an image.
"""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
import numpy as np
import six
import tensorflow as tf
from moonlight import conversions
from moonlight import image
from moonlight import page_processors
from moonlight import score_processors
from moonlight import structure as structure_module
from moonlight.glyphs import saved_classifier_fn
from moonlight.protobuf import musicscore_pb2
from moonlight.staves import base as staves_base
from moonlight.structure import beams
from moonlight.structure import components
from moonlight.structure import verticals
# TODO(ringw): Get OMR running on GPU. It seems to create too many individual
# ops/allocations and can freeze the machine.
CONFIG = tf.ConfigProto(device_count={'GPU': 0})
class OMREngine(object):
"""The OMR engine.
The engine reads one music score page image at a time, and extracts musical
elements from it.
The glyph classifier can be chosen by a custom glyph_classifier_fn.
Attributes:
graph: The TensorFlow graph used by OMR.
png_path: A scalar tensor representing the PNG image filename to load. It
should be used as a key in the `feed_dict` to `get_page()`. The image is
expected to contain an entire music score page.
image: A 2D uint8 tensor representing the music score page image. The
background is white (255) and the foreground is black (0). It may be fed
in to `get_page()` instead of specifying a PNG filename with `png_path`.
structure: The `Structure` holds the tensors that represent structural
information in the music score.
glyph_classifier: An instance of `BaseGlyphClassifier`, which holds the
tensor representing the detected glyphs for the entire page. Defaults to
nearest-neighbor classification using the pre-packaged labeled clusters.
"""
def __init__(self, glyph_classifier_fn=None):
"""Creates the engine and TF graph for running OMR.
Args:
glyph_classifier_fn: Callable that loads the glyph classifier into the
graph. Accepts a `Structure` as the single argument, and returns an
instance of `BaseGlyphClassifier`. The function typically loads a TF
saved model or other external data, and wraps the classification in a
concrete glyph classifier subclass. If the classifier uses a
`StafflineExtractor` for classification, it must set the
`staffline_extractor` attribute of the `Structure`. Otherwise, glyph x
coordinates will not be scaled back to image coordinates.
"""
glyph_classifier_fn = (
glyph_classifier_fn or saved_classifier_fn.build_classifier_fn())
self.graph = tf.Graph()
self.session = tf.Session(graph=self.graph)
with self.graph.as_default():
with self.session.as_default():
with tf.name_scope('OMREngine'):
self.png_path = tf.placeholder(tf.string, name='png_path', shape=())
self.image = image.decode_music_score_png(
tf.read_file(self.png_path, name='page_image'))
self.structure = structure_module.create_structure(self.image)
# Loading saved models happens outside of the name scope, because scopes
# can rename tensors from the model and cause dangling references.
# TODO(ringw): TF should be able to load models gracefully within a
# name scope.
self.glyph_classifier = glyph_classifier_fn(self.structure)
def run(self, input_pngs, output_notesequence=False):
"""Converts input PNGs into a `Score` message.
Args:
input_pngs: A list of PNG filenames to process.
output_notesequence: Whether to return a NoteSequence, as opposed to a
Score containing Pages with Glyphs.
Returns:
A NoteSequence message, or a Score message holding Pages for each input
image (with their detected Glyphs).
"""
if isinstance(input_pngs, six.string_types):
input_pngs = [input_pngs]
score = musicscore_pb2.Score()
score.page.extend(
self._get_page(feed_dict={self.png_path: png}) for png in input_pngs)
score = score_processors.process(score)
return (conversions.score_to_notesequence(score)
if output_notesequence else score)
def process_image(self, image_arr, process_structure=True):
"""Processes a uint8 image array.
Args:
image_arr: A 2D (H, W) uint8 NumPy array. Must have a white (255)
background and black (0) foreground.
process_structure: Whether to add structural information to the page.
Returns:
A `Page` message constructed from the contents of the image.
"""
with tf.Session(graph=self.graph, config=CONFIG):
return self._get_page(
feed_dict={self.image: image_arr},
process_structure=process_structure)
def _get_page(self, feed_dict=None, process_structure=True):
"""Returns the Page holding Glyphs for the page.
Args:
feed_dict: The feed dict to use for the TensorFlow graph. The image must
be fed in.
process_structure: If True, run the page_processors, which add staff
locations and other structural information. If False, return a Page
containing only Glyphs for each staff.
Returns:
A Page message holding Staff protos which have location information and
detected Glyphs.
"""
# If structure is given, output all structural information in addition to
# the Page message.
structure_data = (
_nested_ndarrays_to_tensors(self.structure.data)
if self.structure else [])
structure_data, glyphs = self.session.run(
[structure_data,
self.glyph_classifier.get_detected_glyphs()],
feed_dict=feed_dict)
computed_staves, computed_beams, computed_verticals, computed_components = (
structure_data)
# Construct and return a computed Structure.
computed_structure = structure_module.Structure(
staves_base.ComputedStaves(*computed_staves),
beams.ComputedBeams(*computed_beams),
verticals.ComputedVerticals(*computed_verticals),
components.ComputedComponents(*computed_components))
# The Page without staff location information.
labels_page = self.glyph_classifier.glyph_predictions_to_page(glyphs)
# Process the Page using the computed structure.
if process_structure:
processed_page = page_processors.process(
labels_page, computed_structure,
self.glyph_classifier.staffline_extractor)
else:
processed_page = labels_page
return processed_page
def _nested_ndarrays_to_tensors(data):
"""Converts possibly nested lists of np.ndarrays and Tensors to Tensors.
This is necessary in case some data in the Structure is already computed. We
just pass everything to tf.Session.run, and some data may be a tf.constant
which is just spit back out.
Args:
data: An np.ndarray, Tensor, or list recursively containing the same types.
Returns:
data with all np.ndarrays converted to Tensors.
Raises:
ValueError: If unexpected data was given.
"""
if isinstance(data, list):
return [_nested_ndarrays_to_tensors(element) for element in data]
elif isinstance(data, np.ndarray):
return tf.constant(data)
elif isinstance(data, tf.Tensor):
return data
else:
raise ValueError('Unexpected data: %s' % data)
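The recursion in `_nested_ndarrays_to_tensors` above is a general pattern: walk arbitrarily nested lists, convert leaves that are not yet in the target representation, and pass through leaves that already are. A dependency-free sketch of that shape, with illustrative names (`convert_leaves` and the `is_converted` predicate are not part of moonlight):

```python
def convert_leaves(data, convert, is_converted):
  """Recursively applies `convert` to leaves not yet in the target form.

  Mirrors the structure of _nested_ndarrays_to_tensors: lists are
  traversed recursively, already-converted leaves are returned as-is,
  and everything else is converted.
  """
  if isinstance(data, list):
    return [convert_leaves(element, convert, is_converted) for element in data]
  elif is_converted(data):
    return data  # Already in the target representation; pass through.
  else:
    return convert(data)


# Example: wrap raw ints as strings, leaving existing strings untouched.
result = convert_leaves([1, ['a', 2], 3], str,
                        lambda x: isinstance(x, str))
# result == ['1', ['a', '2'], '3']
```

In `engine.py` the leaf conversion is `tf.constant` and the pass-through test is `isinstance(data, tf.Tensor)`, so precomputed NumPy arrays and live graph tensors can be mixed freely in `Structure.data` before the single `Session.run` call.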
================================================
FILE: moonlight/evaluation/BUILD
================================================
# Description:
# OMR evaluation and music score similarity metrics.
package(
default_visibility = ["//moonlight:__subpackages__"],
)
licenses(["notice"]) # Apache 2.0
py_library(
name = "evaluation",
deps = [
":evaluator_lib",
":musicxml",
],
)
py_binary(
name = "evaluator",
srcs = ["evaluator.py"],
srcs_version = "PY2AND3",
deps = [
":evaluator_lib",
# disable_tf2
],
)
py_library(
name = "evaluator_lib",
srcs = ["evaluator.py"],
srcs_version = "PY2AND3",
deps = [
":musicxml",
"@com_google_protobuf//:protobuf_python",
"//moonlight:engine",
"//moonlight/conversions",
"//moonlight/protobuf:protobuf_py_pb2",
# tensorflow dep
],
)
py_test(
name = "evaluator_endtoend_test",
size = "large",
srcs = ["evaluator_endtoend_test.py"],
data = [
"//moonlight/testdata:images",
"//moonlight/testdata:musicxml",
],
srcs_version = "PY2AND3",
deps = [
":evaluator_lib",
# disable_tf2
"//moonlight/protobuf:protobuf_py_pb2",
# tensorflow dep
],
)
py_library(
name = "musicxml",
srcs = ["musicxml.py"],
srcs_version = "PY2AND3",
deps = [
# lxml dep
"//moonlight/music:constants",
# pandas dep
# six dep
],
)
py_test(
name = "musicxml_test",
srcs = ["musicxml_test.py"],
data = ["//moonlight/testdata:musicxml"],
srcs_version = "PY2AND3",
deps = [
":musicxml",
# disable_tf2
# absl/testing dep
# lxml dep
# pandas dep
# tensorflow dep
],
)
================================================
FILE: moonlight/evaluation/evaluator.py
================================================
# Copyright 2018 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Evaluates OMR using ground truth images and MusicXML."""
# TODO(ringw): Maybe add a flag for exporting CSV. We might also want to join
# all DataFrames, with a new index for the ground truth title, if that's not too
# unwieldy.
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
import tensorflow as tf
from google.protobuf import text_format
from tensorflow.python.lib.io import file_io
from moonlight import conversions
from moonlight import engine
from moonlight.evaluation import musicxml
from moonlight.protobuf import groundtruth_pb2
class Evaluator(object):
def __init__(self, **kwargs):
self.omr = engine.OMREngine(**kwargs)
def evaluate(self, ground_truth):
expected = file_io.read_file_to_string(ground_truth.ground_truth_filename)
score = self.omr.run(
page_spec.filename for page_spec in ground_truth.page_spec)
actual = conversions.score_to_musicxml(score)
return musicxml.musicxml_similarity(actual, expected)
def main(argv):
if len(argv) <= 1:
raise ValueError('Ground truth filenames are required')
evaluator = Evaluator()
for ground_truth_file in argv[1:]:
truth = groundtruth_pb2.GroundTruth()
text_format.Parse(file_io.read_file_to_string(ground_truth_file), truth)
print(truth.title)
print(evaluator.evaluate(truth))
if __name__ == '__main__':
tf.app.run()
================================================
FILE: moonlight/evaluation/evaluator_endtoend_test.py
================================================
# Copyright 2018 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Runs OMR evaluation end to end and asserts on the evaluation metric."""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
import os.path
import tensorflow as tf
from moonlight.evaluation import evaluator
from moonlight.protobuf import groundtruth_pb2
class EvaluatorEndToEndTest(tf.test.TestCase):
def testIncludedGroundTruth(self):
ground_truth = groundtruth_pb2.GroundTruth(
ground_truth_filename=os.path.join(
tf.resource_loader.get_data_files_path(),
'../testdata/IMSLP00747.golden.xml'),
page_spec=[
groundtruth_pb2.PageSpec(
filename=os.path.join(tf.resource_loader.get_data_files_path(),
'../testdata/IMSLP00747-000.png')),
])
results = evaluator.Evaluator().evaluate(ground_truth)
# Evaluation score is known to be greater than 0.65.
self.assertGreater(results['overall_score']['total', ''], 0.65)
if __name__ == '__main__':
tf.test.main()
================================================
FILE: moonlight/evaluation/musicxml.py
================================================
# Copyright 2018 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Simple MusicXML high-level diffing and evaluation.
This is a high-level evaluation for OMR because it compares musical elements,
instead of the individual glyphs and lines that are detected. It is also easier
to find MusicXML ground truth that corresponds exactly to a scanned score.
The evaluation result is a pandas DataFrame. This can eventually have multiple
columns with different evaluation metrics/subscores for different musical
elements for each measure (and the average for the whole score), which we can
choose between as needed.
This is considered high-level evaluation [1] because MusicXML does not have
information on every visual element (e.g. it has the stem direction and number
of beams, but not the start and end coordinates of the stem and beams). We also
want to evaluate emergent properties of OMR, such as note durations and voices,
instead of just counting the raw elements that are detected. This has the
drawback that one mistake in low-level symbol detection can have drastic effects
on the evaluation, but has the benefit that we can obtain thousands of MusicXML
scores. Low-level evaluation would require labeling the coordinates of all
elements on a given music score image, and there is no public dataset for
printed music scores (MUSCIMA++ has handwritten music scores, which are
currently outside of our scope).
[1] D. Byrd and J. G. Simonsen. Towards a standard testbed for optical music
recognition: definitions, metrics, and page images, 2015.
http://www.informatics.indiana.edu/donbyrd/OMRTestbed/
OMRStandardTestbed1Mar2013.pdf
"""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
import copy
import fractions
from lxml import etree
import pandas as pd
import six
from six import moves
from moonlight.music import constants
OVERALL_SCORE = 'overall_score'
def musicxml_similarity(a, b):
"""Determines the similarity of two scores represented as musicXML strings.
We currently assume the following:
- The durations of each representation represent the same tempo.
- i.e. the unit of measure tempo divisions is the same for both pieces.
- Scores have an equal number of measures. They are dissimilar otherwise.
- Corresponding parts have an equal number of staves.
This currently accounts for:
- Intra-measure note-to-note pitch and tempo distance, as an edit distance.
- This can easily be generalized to weight differences based on distance,
such as penalizing larger gaps in pitch or tempo.
Args:
a: a musicXML string
b: a musicXML string
Returns:
A pd.DataFrame with scores on similarity between two scores. 1.0 means
exactly the same, 0.0 means not similar at all.
Example:
overall_score
staff measure
0 0 0.750
1 1.000
total 0.875
"""
if isinstance(a, six.text_type):
a = a.encode('utf-8')
if isinstance(a, bytes):
a = etree.fromstring(a)
if isinstance(b, six.text_type):
b = b.encode('utf-8')
if isinstance(b, bytes):
b = etree.fromstring(b)
a = PartStaves(a)
b = PartStaves(b)
# TODO(larryruili): Implement dissimilar measure count edit distance.
if a.all_measure_counts() != b.all_measure_counts():
return _not_similar()
measure_similarities = []
index = []
for part_staff in moves.xrange(a.num_partstaves()):
for measure_num in moves.xrange(a.num_measures(part_staff)):
measure_similarities.append(
measure_similarity(
a.get_measure(part_staff, measure_num),
b.get_measure(part_staff, measure_num)))
index.append((part_staff, measure_num))
df = pd.DataFrame(
measure_similarities,
columns=[OVERALL_SCORE],
index=pd.MultiIndex.from_tuples(index, names=['staff', 'measure']))
return df.append(
pd.DataFrame([df[OVERALL_SCORE].mean()],
columns=[OVERALL_SCORE],
index=pd.MultiIndex.from_tuples([('total', '')],
names=['staff', 'measure'])))
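The result shape built above (per-measure rows plus a `('total', '')` aggregate row under a staff/measure MultiIndex) can be sketched standalone. Note that `DataFrame.append`, used above, was removed in pandas 2.x, so this sketch uses `pd.concat`; the score values are illustrative only:

```python
import pandas as pd

# Two per-measure similarity scores for staff 0 (illustrative values).
scores = [0.75, 1.0]
index = pd.MultiIndex.from_tuples([(0, 0), (0, 1)], names=['staff', 'measure'])
df = pd.DataFrame(scores, columns=['overall_score'], index=index)

# Append the mean as a ('total', '') row, mirroring musicxml_similarity.
total = pd.DataFrame([df['overall_score'].mean()],
                     columns=['overall_score'],
                     index=pd.MultiIndex.from_tuples([('total', '')],
                                                     names=['staff', 'measure']))
result = pd.concat([df, total])

# The aggregate is read back with a MultiIndex lookup, as in the end-to-end
# test's results['overall_score']['total', ''].
assert result['overall_score']['total', ''] == 0.875
```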
def _not_similar():
"""Returns a pd.DataFrame representing 0 similarity between scores."""
return pd.DataFrame(
[[0]],
columns=[OVERALL_SCORE],
index=pd.MultiIndex.from_product([('total',), ('',)],
names=['staff', 'measure']),
)
def _index(num_staves, num_measures):
return pd.MultiIndex.from_product(
[range(num_staves), range(num_measures)], names=['staff', 'measure'])
class PartStaves(object):
"""Accessor for parts and staves in a MusicXML score.
self.score: An etree rooted at <score>
self.part_staves: A list of tuples (part number, staff number)
"""
def __init__(self, score):
self.score = score
num_parts = len(score.findall('part'))
part_staves = []
for i in moves.xrange(num_parts):
# XPath is 1-indexed.
part = score.find('part[%d]' % (i + 1))
def num_staves(part):
yield 1
for staves_tag in part.findall('measure/attributes/staves'):
yield int(staves_tag.text)
# Find all tags underneath any <attributes> tag. They may also include
# the staff number as a "number" XML attribute.
for attribute in part.findall('.//attributes//'):
if 'number' in attribute.attrib:
yield int(attribute.attrib['number'])
num_staves = max(num_staves(part))
for j in moves.xrange(num_staves):
part_staves.append((i, j))
self.part_staves = part_staves
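The staff-counting rule in `__init__` (at least one staff, bumped by any `<staves>` tag or 1-indexed `"number"` attribute under `<attributes>`) can be sketched with the stdlib `ElementTree`; `count_staves` is a hypothetical standalone name, and `iter` replaces lxml's `.//attributes//` path:

```python
import xml.etree.ElementTree as ET

def count_staves(part):
    """Maximum of 1, every <staves> value, and every staff "number" attribute
    found under an <attributes> tag."""
    candidates = [1]
    candidates += [int(t.text) for t in part.findall('measure/attributes/staves')]
    candidates += [int(t.attrib['number'])
                   for attrs in part.iter('attributes')
                   for t in attrs.iter()
                   if 'number' in t.attrib]
    return max(candidates)

part = ET.fromstring(
    '<part><measure><attributes><staves>2</staves>'
    '<clef number="2"/></attributes></measure></part>')
assert count_staves(part) == 2
```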
def get_measure(self, part_staff, measure_num):
"""Returns a single staff measure given a part and measure index.
Args:
part_staff: Index of the staff across all parts.
measure_num: Index of the measure.
Returns:
A <measure> etree element.
"""
part_num, staff_num = self.part_staves[part_staff]
part = self.score.find('part[%d]' % (part_num + 1))
measure = part.find('measure[%d]' % (measure_num + 1))
staff_measure = etree.Element('measure')
staff_measure.append(self._build_attributes(staff_num, measure))
for elem in measure:
if elem.tag != 'attributes':
new_elem = _filter_for_staff(staff_num, elem)
if new_elem is not None:
staff_measure.append(new_elem)
return staff_measure
def _build_attributes(self, staff_num, orig_measure):
"""Builds a new <attributes> tag for the given measure.
If the measure already contains an <attributes> tag, all children are copied
as long as they do not have a <staff> tag that's incompatible with the given
staff number.
If the existing <attributes> don't contain a <divisions> tag, we take the
most recent <divisions> from any measure on any part. The <divisions> tag is
required at the start of the score.
Args:
staff_num: Index of the staff across all parts.
orig_measure: The original <measure> tag, which may or may not contain its
own <attributes>.
Returns:
A new <attributes> tag, keeping any applicable attributes from
`orig_measure`, and the most recent <divisions>.
"""
new_attributes = etree.Element('attributes')
orig_attributes = orig_measure.find('attributes')
if orig_attributes is not None:
for attribute in orig_attributes:
new_attribute = _filter_for_staff(staff_num, attribute)
if new_attribute is not None:
new_attributes.append(new_attribute)
# Look for the most recent "divisions" tag.
def find_divisions():
divisions = None
for part in orig_measure.getparent().getparent().findall('part'):
for prev_measure in part.findall('measure'):
prev_attrs = prev_measure.find('attributes')
if (prev_attrs is not None and
prev_attrs.find('divisions') is not None):
divisions = prev_attrs.find('divisions')
if prev_measure is orig_measure:
# Done.
if divisions is None:
raise ValueError('<divisions> is required')
# Copy the tag, or else when appending the tag to a new element, it
# will be deleted from the old element.
return copy.deepcopy(divisions)
if new_attributes.find('divisions') is None:
new_attributes.append(find_divisions())
return new_attributes
def num_partstaves(self):
"""Returns the total count of parts/staves."""
return len(self.part_staves)
def num_measures(self, part_staff):
part_num, _ = self.part_staves[part_staff]
return len(self._get_part(part_num).findall('measure'))
def _get_part(self, part_num):
return self.score.find('part[%d]' % (part_num + 1))
def all_measure_counts(self):
return [
self.num_measures(part_staff)
for part_staff in moves.xrange(self.num_partstaves())
]
def _filter_for_staff(staff_num, elem):
"""Determines whether the element belongs to the given staff.
Elements normally have a <staff> child, with a 1-indexed staff number.
However, elements in <attributes> may use a "number" XML attribute instead
(also 1-indexed).
Args:
staff_num: The 0-indexed staff index.
elem: The XML element (a descendant of <measure>).
Returns:
A copied element (with any staff "number" attribute removed), or None if the
element does not belong to the given staff.
"""
staff = elem.find('staff')
new_elem = copy.deepcopy(elem)
if staff is not None:
for subelem in new_elem:
if subelem.tag == 'staff':
if subelem.text == str(staff_num + 1):
# Got the correct "staff" tag.
return new_elem
else:
# Incorrect "staff" value.
return None
# The "number" XML attribute can refer to the staff within <attributes>.
if 'number' in new_elem.attrib:
if new_elem.attrib['number'] == str(staff_num + 1):
del new_elem.attrib['number']
return new_elem
else:
# Incorrect "number" attribute.
return None
# No staff information, element should be copied to all staves.
return copy.deepcopy(elem)
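The three filtering rules above (match a `<staff>` child, match and strip a `"number"` attribute, or copy staff-agnostic elements to every staff) can be sketched with the stdlib `ElementTree` instead of lxml; `filter_for_staff` here is a simplified standalone version, not the module's function:

```python
import copy
import xml.etree.ElementTree as ET

def filter_for_staff(staff_num, elem):
    """Keep `elem` if it belongs to the 0-indexed staff `staff_num`."""
    staff = elem.find('staff')
    if staff is not None:
        # <staff> children are 1-indexed.
        return copy.deepcopy(elem) if staff.text == str(staff_num + 1) else None
    if 'number' in elem.attrib:
        # "number" attributes (e.g. on <clef>) are also 1-indexed; strip them.
        if elem.attrib['number'] == str(staff_num + 1):
            new_elem = copy.deepcopy(elem)
            del new_elem.attrib['number']
            return new_elem
        return None
    # No staff information: the element applies to all staves.
    return copy.deepcopy(elem)

note = ET.fromstring('<note><staff>2</staff></note>')
clef = ET.fromstring('<clef number="1"/>')
assert filter_for_staff(1, note) is not None  # staff "2" is index 1
assert filter_for_staff(0, note) is None
assert 'number' not in filter_for_staff(0, clef).attrib
```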
def measure_similarity(a, b):
"""Similarity metric between <measure> tags.
This currently converts <note> tags into tuples of the form
((pitch letter, octave number, alteration number), duration),
then determines the edit distance between the two sequences of notes,
normalized by the maximum edit distance possible.
The similarity is simply 1 - (normalized edit distance).
Args:
a: A <measure> etree tag.
b: A <measure> etree tag.
Returns:
A float between 0.0 (no similarity) and 1.0 (measures are equivalent).
"""
a = measure_to_note_list(a)
b = measure_to_note_list(b)
scale = max(len(a), len(b))
if scale == 0:
return 1
else:
return 1 - (levenshtein(a, b) / scale)
def measure_to_note_list(measure):
notes = measure.findall('note')
note_list = []
for note in notes:
duration = duration_to_fraction(note.find('duration'), measure)
pitch = pitch_to_int(note.find('pitch'))
note_list.append((pitch, duration))
return note_list
def pitch_to_int(pitch):
"""Converts a <pitch> tag to a an integer representation.
Note is a character [A-G].
Octave is an integer [0-9]
Alter is an integer [(-2)-2]
The formula is: pitch = 12 * octave + note_map[note] + alter
Where note_map maps a note to its 12-tone representation, where C = 0, D = 2,
and B = 12. We only represent white-key tones in this map because the alter is
a separate dimension.
This is a heuristic that treats A#4 the same as Bb4, which may not
be the semantics we want for an optical recognition pipeline. However, it is
a simple way of gauging 'how far' a note's pitch is from the other.
Args:
pitch: A <pitch> etree tag.
Returns:
An integer representation of the pitch, from 0 to 129, which is the number
of semitones the pitch is above C0.
"""
note_map = dict(zip('CDEFGAB', constants.MAJOR_SCALE))
if pitch is None:
return None
note = pitch.find('step').text
octave = int(pitch.find('octave').text)
alter_tag = pitch.find('alter')
alter = int(alter_tag.text) if alter_tag is not None else 0
return 12 * octave + note_map[note] + alter
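The formula can be checked in isolation; this sketch inlines the major-scale offsets that `constants.MAJOR_SCALE` provides, and the keyword-argument signature is hypothetical:

```python
# Semitone offsets of the white keys within one octave (C major scale).
NOTE_MAP = dict(zip('CDEFGAB', [0, 2, 4, 5, 7, 9, 11]))

def pitch_to_semitones(step, octave, alter=0):
    """Semitones above C0, mirroring pitch = 12 * octave + note_map + alter."""
    return 12 * octave + NOTE_MAP[step] + alter

assert pitch_to_semitones('C', 4) == 48
# Enharmonic equivalents collapse to the same integer:
assert pitch_to_semitones('A', 4, +1) == pitch_to_semitones('B', 4, -1)  # 58
```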
def duration_to_fraction(duration, measure):
if duration is None:
return None
else:
return fractions.Fraction(
int(duration.text), int(measure.find('attributes/divisions').text))
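Since `<duration>` is expressed in `<divisions>` units (divisions per quarter note), the conversion reduces to a `fractions.Fraction`; a minimal sketch with hypothetical integer arguments instead of etree tags:

```python
from fractions import Fraction

def duration_to_fraction(duration, divisions):
    """Length in quarter notes, given <duration> and <divisions> values."""
    return Fraction(duration, divisions)

assert duration_to_fraction(6, 4) == Fraction(3, 2)  # dotted quarter
assert duration_to_fraction(2, 4) == Fraction(1, 2)  # eighth note
```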
def _note_distance(n1, n2):
"""Returns the distance between two notes.
Pitch distance is a scaled absolute difference in pitch values between the two
notes. If the difference is below 6 half-steps, the distance is proportional
to difference / 8, else it's 1. This represents 'close-enough' semantics,
where we reward pitches that are at least reasonably close to the target,
since pitches can easily be very far off.
Duration distance is simply the ratio between the absolute distance between
representations, and the maximum duration between the notes. This represents
'percent-correct' semantics, because all durations will be at least somewhat
correct.
The note distance is defined as the average between pitch distance and
duration distance. Currently each distance has the same weight.
Args:
n1: a tuple of (pitch_int_representation, duration_int_representation)
n2: a tuple of (pitch_int_representation, duration_int_representation)
Returns:
A float from 0 to 1 representing the distance between the two notes.
"""
# If either pitch or duration is missing from either note, fall back to exact
# equality. Compare against None explicitly so that a pitch of 0 (C0) or a
# zero duration is not mistaken for a missing value.
if n1[0] is None or n2[0] is None or n1[1] is None or n2[1] is None:
return float(n1 != n2)
max_pitch_diff = 4.
pitch_diff = float(abs(n1[0] - n2[0]))
if pitch_diff > max_pitch_diff:
pitch_distance = 1
else:
pitch_distance = pitch_diff / max_pitch_diff
duration1 = float(n1[1])
duration2 = float(n2[1])
duration_distance = abs(duration1 - duration2) / max(duration1, duration2, 1)
return (pitch_distance + duration_distance) / 2
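The capped-pitch/ratio-duration averaging can be sketched standalone (`note_distance` is a hypothetical copy, not the module's private function); `min(diff / 4, 1)` is equivalent to the branch on `max_pitch_diff` above:

```python
def note_distance(n1, n2):
    """Average of a capped pitch distance and a ratio-based duration distance.
    Notes are (pitch_semitones, duration) tuples; None means missing."""
    if n1[0] is None or n2[0] is None or n1[1] is None or n2[1] is None:
        return float(n1 != n2)  # exact equality when information is missing
    pitch_distance = min(abs(n1[0] - n2[0]) / 4.0, 1.0)
    d1, d2 = float(n1[1]), float(n2[1])
    duration_distance = abs(d1 - d2) / max(d1, d2, 1)
    return (pitch_distance + duration_distance) / 2

assert note_distance((60, 1.0), (60, 1.0)) == 0.0   # identical notes
assert note_distance((60, 1.0), (62, 1.0)) == 0.25  # 2 half-steps off
assert note_distance((None, 1.0), (60, 1.0)) == 1.0  # missing pitch
```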
def levenshtein(s1, s2, distance=_note_distance):
"""Determines the edit distance between two sequences.
Args:
s1: A sequence.
s2: A sequence.
distance: A callable that accepts an element from each of s1 and s2, and
returns a float distance metric between 0 and 1.
Returns:
The Levenshtein distance between the two sequences.
"""
if len(s1) < len(s2):
return levenshtein(s2, s1, distance)
if not s2:
return len(s1)
previous_row = range(len(s2) + 1)
for i, c1 in enumerate(s1):
current_row = [i + 1]
for j, c2 in enumerate(s2):
insertions = previous_row[j + 1] + 1
deletions = current_row[j] + 1
substitutions = previous_row[j] + distance(c1, c2)
current_row.append(min(insertions, deletions, substitutions))
previous_row = current_row
return previous_row[-1]
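Because `distance` may return fractional substitution costs, this is a weighted generalization of Levenshtein: with a 0/1 metric it reduces to the classic edit distance. A self-contained copy of the recurrence, checked against a known value:

```python
def levenshtein(s1, s2, distance=lambda a, b: float(a != b)):
    """Edit distance with unit insert/delete cost and a pluggable
    substitution cost in [0, 1]."""
    if len(s1) < len(s2):
        return levenshtein(s2, s1, distance)
    if not s2:
        return float(len(s1))
    previous_row = list(range(len(s2) + 1))
    for i, c1 in enumerate(s1):
        current_row = [i + 1]
        for j, c2 in enumerate(s2):
            insertions = previous_row[j + 1] + 1
            deletions = current_row[j] + 1
            substitutions = previous_row[j] + distance(c1, c2)
            current_row.append(min(insertions, deletions, substitutions))
        previous_row = current_row
    return previous_row[-1]

assert levenshtein('kitten', 'sitting') == 3  # classic 0/1 edit distance
# A fuzzy substitution cost shrinks the distance below 1 per mismatch:
assert levenshtein([1], [2], distance=lambda a, b: 0.5) == 0.5
```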
def exact_match_distance(n1, n2):
"""Metric for `levenshtein
SYMBOL INDEX (544 symbols across 111 files)
FILE: moonlight/conversions/musicxml.py
function score_to_musicxml (line 61) | def score_to_musicxml(score):
function _get_attributes (line 121) | def _get_attributes(measure, position=-1):
function _glyph_to_clef (line 143) | def _glyph_to_clef(glyph):
function _glyph_to_note (line 160) | def _glyph_to_note(glyph):
class MusicXMLScore (line 199) | class MusicXMLScore(object):
method __init__ (line 206) | def __init__(self, omr_score):
method get_measure (line 213) | def get_measure(self, part_ind, measure_ind):
method to_string (line 224) | def to_string(self):
function _create_part_list (line 233) | def _create_part_list(num_parts):
function _get_part_id (line 242) | def _get_part_id(part_ind):
FILE: moonlight/conversions/musicxml_test.py
class MusicXMLTest (line 27) | class MusicXMLTest(absltest.TestCase):
method testSmallScore (line 29) | def testSmallScore(self):
FILE: moonlight/conversions/notesequence.py
function score_to_notesequence (line 25) | def score_to_notesequence(score):
function page_to_notesequence (line 38) | def page_to_notesequence(page):
function _score_notes (line 42) | def _score_notes(score):
FILE: moonlight/engine.py
class OMREngine (line 47) | class OMREngine(object):
method __init__ (line 70) | def __init__(self, glyph_classifier_fn=None):
method run (line 100) | def run(self, input_pngs, output_notesequence=False):
method process_image (line 121) | def process_image(self, image_arr, process_structure=True):
method _get_page (line 137) | def _get_page(self, feed_dict=None, process_structure=True):
function _nested_ndarrays_to_tensors (line 183) | def _nested_ndarrays_to_tensors(data):
FILE: moonlight/evaluation/evaluator.py
class Evaluator (line 33) | class Evaluator(object):
method __init__ (line 35) | def __init__(self, **kwargs):
method evaluate (line 38) | def evaluate(self, ground_truth):
function main (line 46) | def main(argv):
FILE: moonlight/evaluation/evaluator_endtoend_test.py
class EvaluatorEndToEndTest (line 28) | class EvaluatorEndToEndTest(tf.test.TestCase):
method testIncludedGroundTruth (line 30) | def testIncludedGroundTruth(self):
FILE: moonlight/evaluation/musicxml.py
function musicxml_similarity (line 60) | def musicxml_similarity(a, b):
function _not_similar (line 124) | def _not_similar():
function _index (line 134) | def _index(num_staves, num_measures):
class PartStaves (line 139) | class PartStaves(object):
method __init__ (line 146) | def __init__(self, score):
method get_measure (line 169) | def get_measure(self, part_staff, measure_num):
method _build_attributes (line 191) | def _build_attributes(self, staff_num, orig_measure):
method num_partstaves (line 240) | def num_partstaves(self):
method num_measures (line 244) | def num_measures(self, part_staff):
method _get_part (line 248) | def _get_part(self, part_num):
method all_measure_counts (line 251) | def all_measure_counts(self):
function _filter_for_staff (line 258) | def _filter_for_staff(staff_num, elem):
function measure_similarity (line 296) | def measure_similarity(a, b):
function measure_to_note_list (line 322) | def measure_to_note_list(measure):
function pitch_to_int (line 332) | def pitch_to_int(pitch):
function duration_to_fraction (line 365) | def duration_to_fraction(duration, measure):
function _note_distance (line 373) | def _note_distance(n1, n2):
function levenshtein (line 414) | def levenshtein(s1, s2, distance=_note_distance):
function exact_match_distance (line 445) | def exact_match_distance(n1, n2):
FILE: moonlight/evaluation/musicxml_test.py
class MusicXMLTest (line 33) | class MusicXMLTest(absltest.TestCase):
method testNotSimilar (line 35) | def testNotSimilar(self):
method testEmptyMeasures (line 38) | def testEmptyMeasures(self):
method testSomeMeasuresNotEqual (line 57) | def testSomeMeasuresNotEqual(self):
method testIdentical (line 88) | def testIdentical(self):
method testMoreChangesLessSimilar (line 97) | def testMoreChangesLessSimilar(self):
method testLevenshteinDistance (line 121) | def testLevenshteinDistance(self):
method testOptionalAlter (line 135) | def testOptionalAlter(self):
method testPitchToInt (line 149) | def testPitchToInt(self):
method testNoteDistance_samePitch_sameDuration_isEqual (line 167) | def testNoteDistance_samePitch_sameDuration_isEqual(self):
method testNoteDistance_samePitch_differentDuration_fuzzyDuration (line 170) | def testNoteDistance_samePitch_differentDuration_fuzzyDuration(self):
method testNoteDistance_differentPitch_sameDuration_fuzzyPitch (line 175) | def testNoteDistance_differentPitch_sameDuration_fuzzyPitch(self):
method testNoteDistance_differentPitchAndDuration_fuzzyDistance (line 180) | def testNoteDistance_differentPitchAndDuration_fuzzyDistance(self):
class PartStavesTest (line 187) | class PartStavesTest(absltest.TestCase):
method testDivisions_golden (line 189) | def testDivisions_golden(self):
method testAttributes_perStaff (line 204) | def testAttributes_perStaff(self):
FILE: moonlight/glyphs/base.py
class GlyphsTensorColumns (line 28) | class GlyphsTensorColumns(enum.IntEnum):
class BaseGlyphClassifier (line 41) | class BaseGlyphClassifier(object):
method __init__ (line 45) | def __init__(self):
method get_detected_glyphs (line 57) | def get_detected_glyphs(self):
method glyph_predictions_to_page (line 70) | def glyph_predictions_to_page(self, predictions):
FILE: moonlight/glyphs/convolutional.py
class Convolutional1DGlyphClassifier (line 36) | class Convolutional1DGlyphClassifier(base.BaseGlyphClassifier):
method __init__ (line 39) | def __init__(self, run_min_length=DEFAULT_RUN_MIN_LENGTH):
method staffline_predictions (line 51) | def staffline_predictions(self):
method _build_detected_glyphs (line 62) | def _build_detected_glyphs(self, predictions_arr):
method _create_glyph_arr (line 101) | def _create_glyph_arr(self, staff_index, y_position, x, type_value):
method get_detected_glyphs (line 109) | def get_detected_glyphs(self):
FILE: moonlight/glyphs/convolutional_test.py
class ConvolutionalTest (line 35) | class ConvolutionalTest(tf.test.TestCase):
method testGetGlyphsPage (line 37) | def testGetGlyphsPage(self):
method testNoGlyphs_dummyClassifier (line 62) | def testNoGlyphs_dummyClassifier(self):
FILE: moonlight/glyphs/corpus.py
function get_patch_shape (line 27) | def get_patch_shape(corpus_file):
function parse_corpus (line 48) | def parse_corpus(corpus_file, height, width):
FILE: moonlight/glyphs/geometry.py
function glyph_y (line 21) | def glyph_y(staff, glyph):
FILE: moonlight/glyphs/glyph_types.py
function is_notehead (line 27) | def is_notehead(glyph):
function is_stemmed_notehead (line 34) | def is_stemmed_notehead(glyph):
function is_beamed_notehead (line 40) | def is_beamed_notehead(glyph):
function is_dotted_notehead (line 44) | def is_dotted_notehead(glyph):
function is_clef (line 49) | def is_clef(glyph):
function is_rest (line 55) | def is_rest(glyph):
FILE: moonlight/glyphs/knn.py
class NearestNeighborGlyphClassifier (line 34) | class NearestNeighborGlyphClassifier(
method __init__ (line 38) | def __init__(self, corpus_file, staffline_extractor, **kwargs):
method staffline_predictions (line 116) | def staffline_predictions(self):
function _squared_euclidean_distance_matrix (line 120) | def _squared_euclidean_distance_matrix(a, b):
FILE: moonlight/glyphs/knn_model.py
function knn_kmeans_model (line 33) | def knn_kmeans_model(centroids, labels, patches=None):
function _to_uint8 (line 103) | def _to_uint8(values):
function _to_float (line 107) | def _to_float(values_t):
function export_knn_model (line 111) | def export_knn_model(centroids, labels, export_path):
function _squared_euclidean_distance_matrix (line 142) | def _squared_euclidean_distance_matrix(a, b):
FILE: moonlight/glyphs/knn_test.py
class KnnTest (line 36) | class KnnTest(tf.test.TestCase):
method testFakeStaffline (line 38) | def testFakeStaffline(self):
FILE: moonlight/glyphs/neural.py
class NeuralNetworkGlyphClassifier (line 33) | class NeuralNetworkGlyphClassifier(object):
method __init__ (line 36) | def __init__(self,
method get_autoencoder_initializers (line 98) | def get_autoencoder_initializers(self):
method get_classifier_initializers (line 106) | def get_classifier_initializers(self):
method semi_supervised_model (line 115) | def semi_supervised_model(batch_size,
class BaseLayer (line 170) | class BaseLayer(object):
method __init__ (line 172) | def __init__(self, filter_size, n_in, n_out, name):
method get (line 178) | def get(self):
class InputConvLayer (line 188) | class InputConvLayer(BaseLayer):
method __init__ (line 191) | def __init__(self, image, n_hidden, activation=tf.nn.sigmoid, name="in...
class HiddenLayer (line 215) | class HiddenLayer(BaseLayer):
method __init__ (line 218) | def __init__(self,
class ReconstructionLayer (line 242) | class ReconstructionLayer(BaseLayer):
method __init__ (line 245) | def __init__(self,
class PredictionLayer (line 271) | class PredictionLayer(BaseLayer):
method __init__ (line 274) | def __init__(self, layer_in, name="prediction"):
FILE: moonlight/glyphs/neural_test.py
class NeuralNetworkGlyphClassifierTest (line 30) | class NeuralNetworkGlyphClassifierTest(tf.test.TestCase):
method testSemiSupervisedClassifier (line 32) | def testSemiSupervisedClassifier(self):
FILE: moonlight/glyphs/note_dots.py
class NoteDots (line 41) | class NoteDots(object):
method __init__ (line 43) | def __init__(self, structure):
method apply (line 46) | def apply(self, page):
function _is_in_range (line 74) | def _is_in_range(min_value, values, max_value):
function _extract_dots (line 78) | def _extract_dots(structure):
FILE: moonlight/glyphs/repeated.py
class FixRepeatedRests (line 23) | class FixRepeatedRests(object):
method apply (line 25) | def apply(self, page):
FILE: moonlight/glyphs/saved_classifier.py
class SavedConvolutional1DClassifier (line 47) | class SavedConvolutional1DClassifier(
method __init__ (line 55) | def __init__(self,
method staffline_predictions (line 167) | def staffline_predictions(self):
FILE: moonlight/glyphs/saved_classifier_fn.py
function build_classifier_fn (line 39) | def build_classifier_fn(saved_model=None):
FILE: moonlight/glyphs/saved_classifier_test.py
class SavedClassifierTest (line 33) | class SavedClassifierTest(tf.test.TestCase):
method testSaveAndLoadDummyClassifier (line 35) | def testSaveAndLoadDummyClassifier(self):
FILE: moonlight/glyphs/testing.py
class DummyGlyphClassifier (line 54) | class DummyGlyphClassifier(convolutional.Convolutional1DGlyphClassifier):
method __init__ (line 60) | def __init__(self, predictions):
method staffline_predictions (line 65) | def staffline_predictions(self):
FILE: moonlight/image.py
function decode_music_score_png (line 27) | def decode_music_score_png(contents):
FILE: moonlight/models/base/batches.py
function get_batched_tensor (line 35) | def get_batched_tensor(dataset):
FILE: moonlight/models/base/batches_test.py
class BatchesTest (line 26) | class BatchesTest(tf.test.TestCase):
method testBatching (line 28) | def testBatching(self):
FILE: moonlight/models/base/glyph_patches.py
function read_patch_dimensions (line 88) | def read_patch_dimensions():
function input_fn (line 109) | def input_fn(input_patches):
function _augment (line 153) | def _augment(patch):
function _augment_shift (line 158) | def _augment_shift(patch):
function _shift_left (line 177) | def _shift_left(patch):
function _shift_right (line 182) | def _shift_right(patch):
function _augment_rotation (line 187) | def _augment_rotation(patch):
function serving_fn (line 199) | def serving_fn():
function create_patch_feature_column (line 221) | def create_patch_feature_column():
function multiclass_binary_metric (line 226) | def multiclass_binary_metric(class_number, binary_metric, labels, predic...
function metrics_fn (line 235) | def metrics_fn(features, labels, predictions):
function train_and_evaluate (line 268) | def train_and_evaluate(estimator):
FILE: moonlight/models/base/glyph_patches_test.py
class GlyphPatchesTest (line 29) | class GlyphPatchesTest(tf.test.TestCase):
method testInputFn (line 31) | def testInputFn(self):
method testShiftLeft (line 60) | def testShiftLeft(self):
method testShiftRight (line 68) | def testShiftRight(self):
method testMulticlassBinaryMetric (line 76) | def testMulticlassBinaryMetric(self):
FILE: moonlight/models/base/hyperparameters.py
function estimator_with_saved_params (line 30) | def estimator_with_saved_params(estimator, params):
FILE: moonlight/models/base/hyperparameters_test.py
class HyperparametersTest (line 23) | class HyperparametersTest(tf.test.TestCase):
method testSimpleModel (line 25) | def testSimpleModel(self):
FILE: moonlight/models/base/label_weights.py
function parse_label_weights_array (line 48) | def parse_label_weights_array(weights_str=None):
function weights_from_labels (line 79) | def weights_from_labels(labels, weights_str=None):
FILE: moonlight/models/base/label_weights_test.py
class LabelWeightsTest (line 25) | class LabelWeightsTest(tf.test.TestCase):
method testWeightsFromLabels (line 27) | def testWeightsFromLabels(self):
FILE: moonlight/models/glyphs_dnn/model.py
function get_flag_params (line 42) | def get_flag_params():
function create_estimator (line 83) | def create_estimator(params=None):
FILE: moonlight/models/glyphs_dnn/train.py
function main (line 29) | def main(_):
FILE: moonlight/omr.py
function run (line 52) | def run(input_pngs, glyphs_saved_model=None, output_notesequence=False):
function main (line 70) | def main(argv):
FILE: moonlight/omr_endtoend_test.py
class OmrEndToEndTest (line 34) | class OmrEndToEndTest(absltest.TestCase):
method setUp (line 36) | def setUp(self):
method testNoteSequence (line 39) | def testNoteSequence(self):
method testBeams_sixteenthNotes (line 53) | def testBeams_sixteenthNotes(self):
method testIMSLP00747_000_structure_barlines (line 74) | def testIMSLP00747_000_structure_barlines(self):
method testMusicXML (line 102) | def testMusicXML(self):
method testProcessImage (line 113) | def testProcessImage(self):
FILE: moonlight/omr_regression_test.py
class OmrRegressionTest (line 40) | class OmrRegressionTest(absltest.TestCase):
method testIMSLP01963_106_multipleStaffSizes (line 42) | def testIMSLP01963_106_multipleStaffSizes(self):
method testIMSLP00823_000_structure (line 53) | def testIMSLP00823_000_structure(self):
method testIMSLP00823_008_mergeStandardAndBeginRepeatBars (line 76) | def testIMSLP00823_008_mergeStandardAndBeginRepeatBars(self):
method testIMSLP39661_keySignature_CSharpMinor (line 104) | def testIMSLP39661_keySignature_CSharpMinor(self):
method testIMSLP00023_015_doubleNoteDots (line 117) | def testIMSLP00023_015_doubleNoteDots(self):
function _get_imslp_path (line 178) | def _get_imslp_path(filename):
FILE: moonlight/page_processors.py
function create_processors (line 38) | def create_processors(structure, staffline_extractor=None):
function process (line 63) | def process(page, structure, staffline_extractor=None):
class CenteredRests (line 70) | class CenteredRests(object):
method apply (line 72) | def apply(self, page):
FILE: moonlight/pipeline/pipeline_flags.py
function create_pipeline (line 33) | def create_pipeline(**kwargs):
FILE: moonlight/score/elements/clef.py
class Clef (line 30) | class Clef(object):
method y_position_to_midi (line 39) | def y_position_to_midi(self, y_position):
class TrebleClef (line 43) | class TrebleClef(Clef):
method __init__ (line 46) | def __init__(self):
class BassClef (line 52) | class BassClef(Clef):
method __init__ (line 55) | def __init__(self):
class _ScalePitch (line 61) | class _ScalePitch(object):
method __init__ (line 72) | def __init__(self, scale, midi):
method midi (line 78) | def midi(self):
method pitch_index (line 84) | def pitch_index(self):
method __add__ (line 88) | def __add__(self, interval):
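The `Clef` classes above expose `y_position_to_midi`, mapping a vertical staff position to a MIDI pitch. As a hedged, self-contained sketch (not Moonlight's actual implementation), assuming `y_position` 0 is the staff's center line (B4, MIDI 71, for a treble clef) and each step is one diatonic scale degree:

```python
# Semitone offsets of one diatonic octave starting from B (the treble
# staff's center-line note): B C D E F G A.
_STEPS_FROM_B = [0, 1, 3, 5, 6, 8, 10]

def y_position_to_midi(y_position, center_midi=71, steps=_STEPS_FROM_B):
    """Map a diatonic staff position to a MIDI note number.

    y_position counts scale steps from the center line (positive = up).
    A different clef would supply its own center_midi and step table.
    """
    octave, degree = divmod(y_position, 7)
    return center_midi + 12 * octave + steps[degree]
```

For example, position 1 (the space above the center line) yields C5, and position -1 yields A4; `divmod` handles negative positions because Python floors the quotient.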
FILE: moonlight/score/elements/clef_test.py
class ClefTest (line 26) | class ClefTest(absltest.TestCase):
method testTrebleClef (line 28) | def testTrebleClef(self):
method testBassClef (line 44) | def testBassClef(self):
FILE: moonlight/score/elements/key_signature.py
class _BaseAccidentals (line 37) | class _BaseAccidentals(object):
method __init__ (line 40) | def __init__(self, clef, accidentals=None):
method _normalize_position (line 44) | def _normalize_position(self, position):
method get_accidental_for_position (line 58) | def get_accidental_for_position(self, position):
class Accidentals (line 62) | class Accidentals(_BaseAccidentals):
method __init__ (line 65) | def __init__(self, clef):
method put (line 68) | def put(self, position, accidental):
class KeySignature (line 72) | class KeySignature(_BaseAccidentals):
method _normalize_position (line 80) | def _normalize_position(self, position):
method try_put (line 95) | def try_put(self, position, accidental):
method _can_put (line 111) | def _can_put(self, position, accidental):
method get_next_accidental (line 118) | def get_next_accidental(self):
method get_type (line 162) | def get_type(self):
method __len__ (line 167) | def __len__(self):
function _key_sig_pitch_classes (line 172) | def _key_sig_pitch_classes(note_name, ascending_fifths):
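`KeySignature.get_next_accidental` and `_key_sig_pitch_classes` rely on the circle-of-fifths ordering of accidentals. A minimal sketch of that ordering (the helper name and signature here are illustrative, not Moonlight's):

```python
# Sharps are added in ascending fifths; flats in descending fifths.
SHARP_ORDER = ['F', 'C', 'G', 'D', 'A', 'E', 'B']
FLAT_ORDER = SHARP_ORDER[::-1]

def key_signature_accidentals(num_accidentals, sharps=True):
    """Return the altered note names for a key signature.

    E.g. two sharps (D major) alters F and C; three flats (E-flat
    major) alters B, E, and A.
    """
    order = SHARP_ORDER if sharps else FLAT_ORDER
    return order[:num_accidentals]
```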
FILE: moonlight/score/elements/key_signature_test.py
class KeySignatureTest (line 27) | class KeySignatureTest(absltest.TestCase):
method testEmpty_noNextAccidental (line 29) | def testEmpty_noNextAccidental(self):
method testGMajor (line 34) | def testGMajor(self):
method testGMajor_bassClef (line 40) | def testGMajor_bassClef(self):
method testBMajor (line 46) | def testBMajor(self):
method testEFlatMajor (line 54) | def testEFlatMajor(self):
method testEFlatMajor_bassClef (line 62) | def testEFlatMajor_bassClef(self):
method testCFlatMajor_noMoreAccidentals (line 70) | def testCFlatMajor_noMoreAccidentals(self):
FILE: moonlight/score/measures.py
class Measures (line 30) | class Measures(object):
method __init__ (line 33) | def __init__(self, staff_system):
method size (line 36) | def size(self):
method get_measure (line 44) | def get_measure(self, glyph):
function _get_bar_intervals (line 59) | def _get_bar_intervals(staff_system):
FILE: moonlight/score/reader.py
class ScoreReader (line 39) | class ScoreReader(object):
method __init__ (line 50) | def __init__(self):
method __call__ (line 54) | def __call__(self, score):
method read_page (line 70) | def read_page(self, page):
method read_system (line 85) | def read_system(self, system):
method _read_glyph (line 95) | def _read_glyph(self, glyph, staff_state):
method _read_clef (line 103) | def _read_clef(self, staff_state, glyph):
method _read_note (line 125) | def _read_note(self, staff_state, glyph):
method _read_rest (line 129) | def _read_rest(self, staff_state, glyph):
method _read_accidental (line 132) | def _read_accidental(self, staff_state, glyph):
method _no_op_handler (line 135) | def _no_op_handler(self, glyph):
FILE: moonlight/score/reader_test.py
class ReaderTest (line 34) | class ReaderTest(absltest.TestCase):
method testTreble_simple (line 36) | def testTreble_simple(self):
method testBass_simple (line 56) | def testBass_simple(self):
method testTreble_accidentals (line 76) | def testTreble_accidentals(self):
method testChords (line 128) | def testChords(self):
method testBeams (line 172) | def testBeams(self):
method testAllNoteheadTypes (line 221) | def testAllNoteheadTypes(self):
method testStaffSystems (line 245) | def testStaffSystems(self):
method testMeasures (line 323) | def testMeasures(self):
method testKeySignatures (line 395) | def testKeySignatures(self):
function _bar (line 452) | def _bar(x):
FILE: moonlight/score/state/__init__.py
class ScoreState (line 24) | class ScoreState(object):
method __init__ (line 37) | def __init__(self):
method num_staves (line 40) | def num_staves(self, num_staves):
method add_measure (line 58) | def add_measure(self):
FILE: moonlight/score/state/measure.py
class _KeySignatureState (line 42) | class _KeySignatureState(enum.Enum):
class MeasureState (line 48) | class MeasureState(object):
method __init__ (line 61) | def __init__(self, start_time, clef, key_signature=None):
method new_measure (line 84) | def new_measure(self, start_time):
method set_accidental (line 98) | def set_accidental(self, y_position, accidental):
method get_note (line 111) | def get_note(self, glyph):
method set_clef (line 155) | def set_clef(self, clef):
method on_read_notehead (line 163) | def on_read_notehead(self):
function _get_note_duration (line 173) | def _get_note_duration(note):
FILE: moonlight/score/state/staff.py
class StaffState (line 24) | class StaffState(object):
method __init__ (line 32) | def __init__(self, start_time, clef=None):
method add_measure (line 36) | def add_measure(self, start_time):
method new_staff (line 46) | def new_staff(self, start_time):
method get_key_signature (line 59) | def get_key_signature(self):
method get_time (line 63) | def get_time(self):
method set_time (line 67) | def set_time(self, time):
method set_clef (line 75) | def set_clef(self, clef):
FILE: moonlight/score_processors.py
function create_processors (line 28) | def create_processors():
function process (line 32) | def process(score):
FILE: moonlight/staves/base.py
class BaseStaffDetector (line 28) | class BaseStaffDetector(object):
method __init__ (line 38) | def __init__(self, image=None):
method data (line 50) | def data(self):
method staves_interpolated_y (line 63) | def staves_interpolated_y(self):
method compute (line 162) | def compute(self, session=None, feed_dict=None):
class ComputedStaves (line 177) | class ComputedStaves(BaseStaffDetector):
method __init__ (line 184) | def __init__(self, staves, staffline_distance, staffline_thickness,
method staves_interpolated_y (line 195) | def staves_interpolated_y(self):
method compute (line 198) | def compute(self, session=None, feed_dict=None):
FILE: moonlight/staves/detectors_test.py
class StaffDetectorsTest (line 31) | class StaffDetectorsTest(tf.test.TestCase):
method setUp (line 33) | def setUp(self):
method tearDown (line 40) | def tearDown(self):
method test_single_staff (line 44) | def test_single_staff(self):
method test_corpus_image (line 70) | def test_corpus_image(self):
method test_staves_interpolated_y (line 86) | def test_staves_interpolated_y(self):
method test_staves_interpolated_y_empty (line 103) | def test_staves_interpolated_y_empty(self):
method test_staves_interpolated_y_staves_dont_extend_to_edge (line 109) | def test_staves_interpolated_y_staves_dont_extend_to_edge(self):
method generate_staff_detectors (line 119) | def generate_staff_detectors(self, image):
FILE: moonlight/staves/filter.py
function staff_center_filter (line 40) | def staff_center_filter(image,
function _shift_y (line 89) | def _shift_y(image, y_offset):
function _get_staff_ys (line 122) | def _get_staff_ys(is_staff, staffline_thickness):
FILE: moonlight/staves/hough.py
class FilteredHoughStaffDetector (line 45) | class FilteredHoughStaffDetector(base.BaseStaffDetector):
method __init__ (line 53) | def __init__(self,
method staves (line 75) | def staves(self):
method staffline_distance (line 80) | def staffline_distance(self):
method staffline_thickness (line 85) | def staffline_thickness(self):
method _data (line 90) | def _data(self):
class _SingleSizeFilteredHoughStaffDetector (line 144) | class _SingleSizeFilteredHoughStaffDetector(object):
method __init__ (line 155) | def __init__(self, image, staffline_distance, staffline_thickness,
method staves (line 178) | def staves(self):
function _argsort (line 215) | def _argsort(values):
FILE: moonlight/staves/projection.py
class ProjectionStaffDetector (line 28) | class ProjectionStaffDetector(base.BaseStaffDetector):
method __init__ (line 37) | def __init__(self, image=None):
method staves (line 52) | def staves(self):
method staffline_distance (line 57) | def staffline_distance(self):
method staffline_thickness (line 62) | def staffline_thickness(self):
function _projection_to_staves (line 66) | def _projection_to_staves(projection, width):
FILE: moonlight/staves/removal.py
class StaffRemover (line 30) | class StaffRemover(object):
method __init__ (line 38) | def __init__(self, staff_detector, threshold=127):
method remove_staves (line 44) | def remove_staves(self):
FILE: moonlight/staves/removal_test.py
class RemovalTest (line 31) | class RemovalTest(tf.test.TestCase):
method test_corpus_image (line 33) | def test_corpus_image(self):
FILE: moonlight/staves/staff_processor.py
class StaffProcessor (line 27) | class StaffProcessor(object):
method __init__ (line 29) | def __init__(self, structure, staffline_extractor):
method apply (line 33) | def apply(self, page):
FILE: moonlight/staves/staff_processor_test.py
class StaffProcessorTest (line 33) | class StaffProcessorTest(absltest.TestCase):
method testGetPage_x_scale (line 35) | def testGetPage_x_scale(self):
FILE: moonlight/staves/staffline_distance.py
function _single_peak (line 70) | def _single_peak(values, relative_cutoff, minval, invalidate_distance):
function _estimate_staffline_distance (line 104) | def _estimate_staffline_distance(columns, lengths):
function _estimate_staffline_thickness (line 168) | def _estimate_staffline_thickness(columns, values, lengths, staffline_di...
function estimate_staffline_distance_and_thickness (line 217) | def estimate_staffline_distance_and_thickness(image, threshold=127):
FILE: moonlight/staves/staffline_distance_test.py
class StafflineDistanceTest (line 28) | class StafflineDistanceTest(tf.test.TestCase):
method testCorpusImage (line 30) | def testCorpusImage(self):
method testZeros (line 43) | def testZeros(self):
method testSpeckles (line 53) | def testSpeckles(self):
FILE: moonlight/staves/staffline_extractor.py
class Axes (line 36) | class Axes(enum.IntEnum):
function get_staffline (line 44) | def get_staffline(y_position, extracted_staff_arr):
function y_position_to_index (line 65) | def y_position_to_index(y_position, num_stafflines):
class StafflineExtractor (line 73) | class StafflineExtractor(object):
method __init__ (line 89) | def __init__(self,
method extract_staves (line 119) | def extract_staves(self):
method _extract_staffline (line 166) | def _extract_staffline(self, staff_y, staffline_distance, staffline_num):
method _get_resized_width (line 217) | def _get_resized_width(self, staffline_distance):
method _get_staffline_window_size (line 224) | def _get_staffline_window_size(self, staffline_distance):
class StafflinePatchExtractor (line 231) | class StafflinePatchExtractor(object):
method __init__ (line 241) | def __init__(self,
method extract_staff_strip (line 285) | def extract_staff_strip(self, filename, staff_index, y_position):
method extract_staff_patch (line 310) | def extract_staff_patch(self, filename, staff_index, y_position, image...
method page_patch_iterator (line 336) | def page_patch_iterator(self, filename):
method make_patch_id (line 378) | def make_patch_id(filename, staff_index, y_position, image_x):
FILE: moonlight/staves/staffline_extractor_test.py
class StafflineExtractorTest (line 29) | class StafflineExtractorTest(tf.test.TestCase):
method setUp (line 31) | def setUp(self):
method testExtractStaff (line 50) | def testExtractStaff(self):
method testFloatMultiple (line 77) | def testFloatMultiple(self):
class StafflinePatchExtractorTest (line 96) | class StafflinePatchExtractorTest(tf.test.TestCase):
method testCompareIteratorWithSinglePatch (line 98) | def testCompareIteratorWithSinglePatch(self):
FILE: moonlight/staves/testing.py
class FakeStaves (line 23) | class FakeStaves(base.BaseStaffDetector):
method __init__ (line 36) | def __init__(self,
method staves (line 47) | def staves(self):
method staffline_distance (line 51) | def staffline_distance(self):
method staffline_thickness (line 55) | def staffline_thickness(self):
FILE: moonlight/structure/__init__.py
function create_structure (line 35) | def create_structure(image,
class Structure (line 77) | class Structure(object):
method __init__ (line 80) | def __init__(self,
method compute (line 94) | def compute(self, session=None, image=None):
method is_computed (line 152) | def is_computed(self):
method data (line 160) | def data(self):
FILE: moonlight/structure/barlines.py
class Barlines (line 27) | class Barlines(object):
method __init__ (line 30) | def __init__(self, structure, close_barline_threshold=None):
method apply (line 43) | def apply(self, page):
method _assign_barlines (line 63) | def _assign_barlines(self, systems):
method _get_blacklist_x (line 114) | def _get_blacklist_x(self, system):
function assign_barlines_to_staves (line 147) | def assign_barlines_to_staves(barline_x, barline_y0, barline_y1,
FILE: moonlight/structure/barlines_test.py
class BarlinesTest (line 34) | class BarlinesTest(absltest.TestCase):
method testDummy (line 36) | def testDummy(self):
FILE: moonlight/structure/beam_processor.py
class BeamProcessor (line 38) | class BeamProcessor(object):
method __init__ (line 40) | def __init__(self, structure):
method apply (line 45) | def apply(self, page):
function _get_overlapping_beams (line 83) | def _get_overlapping_beams(stem, beams):
function _maybe_duplicate_beams (line 105) | def _maybe_duplicate_beams(beams, staffline_distance):
function _do_intervals_overlap (line 135) | def _do_intervals_overlap(intervals_a, intervals_b):
FILE: moonlight/structure/beams.py
class Beams (line 42) | class Beams(object):
method __init__ (line 45) | def __init__(self, staff_remover, threshold=127):
class ComputedBeams (line 64) | class ComputedBeams(object):
method __init__ (line 67) | def __init__(self, beams):
FILE: moonlight/structure/components.py
class ConnectedComponentsColumns (line 29) | class ConnectedComponentsColumns(enum.IntEnum):
function from_staff_remover (line 39) | def from_staff_remover(staff_remover, threshold=127):
class ConnectedComponents (line 45) | class ConnectedComponents(object):
method __init__ (line 48) | def __init__(self, components):
class ComputedComponents (line 54) | class ComputedComponents(object):
method __init__ (line 57) | def __init__(self, components):
function get_component_bounds (line 62) | def get_component_bounds(image):
function _unsorted_segment_min (line 93) | def _unsorted_segment_min(data, segment_ids, num_segments):
FILE: moonlight/structure/components_test.py
class ComponentsTest (line 25) | class ComponentsTest(tf.test.TestCase):
method testComponents (line 27) | def testComponents(self):
FILE: moonlight/structure/section_barlines.py
class SectionBarlines (line 37) | class SectionBarlines(object):
method __init__ (line 40) | def __init__(self, structure):
method apply (line 44) | def apply(self, page):
method _section_min_width (line 127) | def _section_min_width(self):
method _section_max_width (line 130) | def _section_max_width(self, staff_index):
class MergeStandardAndBeginRepeatBars (line 134) | class MergeStandardAndBeginRepeatBars(object):
method __init__ (line 150) | def __init__(self, structure):
method apply (line 153) | def apply(self, page):
FILE: moonlight/structure/stems.py
class Stems (line 47) | class Stems(object):
method __init__ (line 50) | def __init__(self, structure):
method apply (line 66) | def apply(self, page):
function _get_stem_candidates (line 117) | def _get_stem_candidates(staffline_distance, verticals):
FILE: moonlight/structure/stems_test.py
class StemsTest (line 34) | class StemsTest(absltest.TestCase):
method testDummy (line 36) | def testDummy(self):
FILE: moonlight/structure/structure_test.py
class StructureTest (line 28) | class StructureTest(tf.test.TestCase):
method testCompute (line 30) | def testCompute(self):
FILE: moonlight/structure/verticals.py
class ColumnBasedVerticals (line 40) | class ColumnBasedVerticals(object):
method __init__ (line 58) | def __init__(
method lines (line 84) | def lines(self):
method _verticals_in_column (line 100) | def _verticals_in_column(self, max_gap, column):
method data (line 129) | def data(self):
function _horizontal_filter (line 138) | def _horizontal_filter(image, staffline_thickness):
class ComputedVerticals (line 163) | class ComputedVerticals(object):
method __init__ (line 170) | def __init__(self, lines):
method data (line 174) | def data(self):
FILE: moonlight/structure/verticals_test.py
class ColumnBasedVerticalsTest (line 27) | class ColumnBasedVerticalsTest(tf.test.TestCase):
method testVerticalLines_singleColumn (line 29) | def testVerticalLines_singleColumn(self):
FILE: moonlight/tools/export_kmeans_centroids.py
function run (line 27) | def run(tfrecords_filename, export_dir):
function main (line 35) | def main(argv):
FILE: moonlight/tools/export_kmeans_centroids_test.py
class ExportKmeansCentroidsTest (line 31) | class ExportKmeansCentroidsTest(tf.test.TestCase):
method testEndToEnd (line 33) | def testEndToEnd(self):
FILE: moonlight/tools/gen_structure_test_case.py
function main (line 32) | def main(argv):
function _sanitized_basename (line 54) | def _sanitized_basename(filename):
FILE: moonlight/training/clustering/kmeans_labeler.py
function load_clusters (line 43) | def load_clusters(input_path):
function main (line 70) | def main(_):
FILE: moonlight/training/clustering/kmeans_labeler_request_handler.py
class LabelerHTTPRequestHandler (line 40) | class LabelerHTTPRequestHandler(BaseHTTPServer.BaseHTTPRequestHandler):
method __init__ (line 49) | def __init__(self, clusters, output_path, *args):
method do_GET (line 54) | def do_GET(self):
method do_POST (line 64) | def do_POST(self):
function create_page (line 81) | def create_page(clusters, template):
function _process_cluster (line 105) | def _process_cluster(cluster):
function create_highlighted_image (line 129) | def create_highlighted_image(cluster, enlarge_ratio=10):
function create_examples (line 169) | def create_examples(clusters, labels):
FILE: moonlight/training/clustering/kmeans_labeler_request_handler_test.py
class KmeansLabelerRequestHandlerTest (line 33) | class KmeansLabelerRequestHandlerTest(absltest.TestCase):
method testCreatePage (line 35) | def testCreatePage(self):
method testCreateExamples (line 45) | def testCreateExamples(self):
FILE: moonlight/training/clustering/staffline_patches_dofn.py
function _filter_patch (line 37) | def _filter_patch(patch, min_num_dark_pixels=10):
class StafflinePatchesDoFn (line 42) | class StafflinePatchesDoFn(beam.DoFn):
method __init__ (line 45) | def __init__(self, patch_height, patch_width, num_stafflines, timeout_ms,
method start_bundle (line 65) | def start_bundle(self):
method process (line 72) | def process(self, png_path):
method finish_bundle (line 110) | def finish_bundle(self):
FILE: moonlight/training/clustering/staffline_patches_dofn_test.py
class StafflinePatchesDoFnTest (line 38) | class StafflinePatchesDoFnTest(absltest.TestCase):
method testPipeline_corpusImage (line 40) | def testPipeline_corpusImage(self):
FILE: moonlight/training/clustering/staffline_patches_kmeans_pipeline.py
function train_kmeans (line 63) | def train_kmeans(patch_file_pattern,
function main (line 147) | def main(_):
FILE: moonlight/training/clustering/staffline_patches_kmeans_pipeline_test.py
class StafflinePatchesKmeansPipelineTest (line 38) | class StafflinePatchesKmeansPipelineTest(absltest.TestCase):
method testKmeans (line 40) | def testKmeans(self):
FILE: moonlight/training/generation/generation.py
function _normalize_path (line 62) | def _normalize_path(filename):
class PageGenerationDoFn (line 83) | class PageGenerationDoFn(beam.DoFn):
method __init__ (line 90) | def __init__(self, num_pages_per_batch, vexflow_generator_command,
method process (line 96) | def process(self, batch_num):
method get_pages_for_batch (line 104) | def get_pages_for_batch(self, batch_num, num_pages_per_batch):
method get_pages (line 124) | def get_pages(self, seeds):
method _svg_to_png (line 135) | def _svg_to_png(self, svg):
class PatchExampleDoFn (line 150) | class PatchExampleDoFn(beam.DoFn):
method __init__ (line 153) | def __init__(self,
method start_bundle (line 164) | def start_bundle(self):
method process (line 172) | def process(self, item):
method _create_example (line 253) | def _create_example(self, staffline, x, label):
FILE: moonlight/training/generation/generation_test.py
class GenerationTest (line 30) | class GenerationTest(tf.test.TestCase):
method testDoFn (line 32) | def testDoFn(self):
FILE: moonlight/training/generation/image_noise.py
function placeholder_image (line 29) | def placeholder_image():
function random_rotation (line 33) | def random_rotation(image, angle=math.pi / 180):
function gaussian_noise (line 40) | def gaussian_noise(image, stddev=5):
FILE: moonlight/training/generation/vexflow_generator.js
function checkNotNull (line 40) | function checkNotNull(value, opt_message) {
function vexflowLineToOMR (line 56) | function vexflowLineToOMR(line) {
function absoluteYToOMR (line 67) | function absoluteYToOMR(stave, y) {
function resetPageState (line 75) | function resetPageState() {
constant CLEF_LINE_FOR_OMR (line 83) | const CLEF_LINE_FOR_OMR = {
constant ACCIDENTAL_TYPES (line 138) | const ACCIDENTAL_TYPES = {
function discreteSample (line 170) | function discreteSample(random, probs) {
constant PROB_LEDGER_NOTE (line 183) | const PROB_LEDGER_NOTE = 0.1;
constant PROB_MODIFIERS (line 184) | const PROB_MODIFIERS = {
class Clef (line 192) | class Clef {
method genNote (line 198) | genNote(random) {
method baseNotes_ (line 210) | baseNotes_() {
class TrebleClef (line 219) | class TrebleClef extends Clef {
method name (line 221) | name() {
class BassClef (line 227) | class BassClef extends Clef {
method name (line 229) | name() {
constant CLEFS (line 235) | const CLEFS = [new TrebleClef(), new BassClef()];
FILE: moonlight/training/generation/vexflow_generator_pipeline.py
function main (line 58) | def main(_):
FILE: moonlight/util/functional_ops.py
function flat_map_fn (line 23) | def flat_map_fn(fn, elems, dtype=None):
FILE: moonlight/util/functional_ops_test.py
class FunctionalOpsTest (line 25) | class FunctionalOpsTest(tf.test.TestCase):
method testFlatMap (line 27) | def testFlatMap(self):
FILE: moonlight/util/memoize.py
class MemoizedFunction (line 21) | class MemoizedFunction(object):
method __init__ (line 31) | def __init__(self, function):
method __call__ (line 35) | def __call__(self, *args):
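`MemoizedFunction` wraps a function and caches results keyed by its arguments. A minimal sketch matching the outlined `__init__`/`__call__` shape, assuming all positional arguments are hashable:

```python
class MemoizedFunction(object):
    """Caches a function's results, keyed by its positional args."""

    def __init__(self, function):
        self.function = function
        self._cache = {}

    def __call__(self, *args):
        # Compute at most once per distinct argument tuple.
        if args not in self._cache:
            self._cache[args] = self.function(*args)
        return self._cache[args]
```

Repeated calls with the same arguments return the cached value without re-invoking the wrapped function.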
FILE: moonlight/util/more_iter_tools.py
function iter_sample (line 10) | def iter_sample(iterator, count, rand=None):
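`iter_sample` draws a fixed-size uniform sample from an iterator of unknown length, which is classically done with reservoir sampling (Algorithm R). A self-contained sketch under that assumption:

```python
import random

def iter_sample(iterator, count, rand=None):
    """Uniformly sample up to `count` items from a stream in one pass."""
    rand = rand or random.Random()
    reservoir = []
    for i, item in enumerate(iterator):
        if i < count:
            # Fill the reservoir with the first `count` items.
            reservoir.append(item)
        else:
            # Keep item i with probability count / (i + 1).
            j = rand.randint(0, i)
            if j < count:
                reservoir[j] = item
    return reservoir
```

If the stream has fewer than `count` items, the whole stream is returned.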
FILE: moonlight/util/more_iter_tools_test.py
class MoreIterToolsTest (line 15) | class MoreIterToolsTest(absltest.TestCase):
method testSample_count_0 (line 17) | def testSample_count_0(self):
method testSample_iter_empty (line 20) | def testSample_iter_empty(self):
method testSample_distribution (line 23) | def testSample_distribution(self):
FILE: moonlight/util/patches.py
function patches_1d (line 24) | def patches_1d(images, patch_width):
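`patches_1d` extracts all horizontal windows of a given width from image rows. A pure-Python sketch of the idea for a single row, assuming stride 1 (the real op works on batched TensorFlow image tensors):

```python
def patches_1d(row, patch_width):
    """All contiguous windows of patch_width from a 1-D sequence."""
    return [row[i:i + patch_width]
            for i in range(len(row) - patch_width + 1)]
```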
FILE: moonlight/util/patches_test.py
class PatchesTest (line 26) | class PatchesTest(tf.test.TestCase):
method test2D (line 28) | def test2D(self):
method test4D (line 40) | def test4D(self):
method testEmpty (line 57) | def testEmpty(self):
FILE: moonlight/util/run_length.py
function vertical_run_length_encoding (line 31) | def vertical_run_length_encoding(image):
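`vertical_run_length_encoding` compresses each image column into runs of equal values, a common primitive for estimating staffline distance and thickness. A per-column sketch in plain Python (the real op encodes every column of a TensorFlow image tensor at once):

```python
def vertical_run_length_encoding(column):
    """Encode one column as (value, run_length) pairs."""
    runs = []
    for value in column:
        if runs and runs[-1][0] == value:
            runs[-1][1] += 1  # extend the current run
        else:
            runs.append([value, 1])  # start a new run
    return [(value, length) for value, length in runs]
```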
FILE: moonlight/util/run_length_test.py
class RunLengthTest (line 25) | class RunLengthTest(tf.test.TestCase):
method testEmpty (line 27) | def testEmpty(self):
method testBooleanImage (line 35) | def testBooleanImage(self):
FILE: moonlight/util/segments.py
class SegmentsMode (line 24) | class SegmentsMode(enum.Enum):
function true_segments_1d (line 32) | def true_segments_1d(segments,
function _segments_1d (line 116) | def _segments_1d(values, mode, name=None):
function peaks (line 175) | def peaks(values, minval=None, invalidate_distance=0, name=None):
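`true_segments_1d` finds runs of True in a 1-D boolean sequence, optionally dropping runs shorter than a minimum length (the real op also supports a `max_gap`, omitted here). A simplified sketch returning inclusive (start, end) index pairs:

```python
def true_segments_1d(values, min_length=1):
    """Inclusive (start, end) pairs of True runs of at least min_length."""
    segments, start = [], None
    # Append a sentinel False so a trailing run is flushed.
    for i, value in enumerate(list(values) + [False]):
        if value and start is None:
            start = i
        elif not value and start is not None:
            if i - start >= min_length:
                segments.append((start, i - 1))
            start = None
    return segments
```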
FILE: moonlight/util/segments_test.py
class SegmentsTest (line 29) | class SegmentsTest(tf.test.TestCase, absltest.TestCase):
method test_true_segments_1d (line 31) | def test_true_segments_1d(self):
method test_true_segments_1d_large (line 45) | def test_true_segments_1d_large(self):
method test_true_segments_1d_empty (line 71) | def test_true_segments_1d_empty(self):
method test_true_segments_1d_max_gap (line 81) | def test_true_segments_1d_max_gap(self):
method test_true_segments_1d_all_false_length_1 (line 114) | def test_true_segments_1d_all_false_length_1(self):
method test_true_segments_1d_all_false_length_2 (line 117) | def test_true_segments_1d_all_false_length_2(self):
method test_true_segments_1d_all_false_length_8 (line 120) | def test_true_segments_1d_all_false_length_8(self):
method test_true_segments_1d_all_false_length_11 (line 123) | def test_true_segments_1d_all_false_length_11(self):
method _test_true_segments_1d_all_false (line 126) | def _test_true_segments_1d_all_false(self, length):
method test_true_segments_1d_min_length_0 (line 132) | def test_true_segments_1d_min_length_0(self):
method test_true_segments_1d_min_length_1 (line 135) | def test_true_segments_1d_min_length_1(self):
method test_true_segments_1d_min_length_2 (line 138) | def test_true_segments_1d_min_length_2(self):
method test_true_segments_1d_min_length_3 (line 141) | def test_true_segments_1d_min_length_3(self):
method test_true_segments_1d_min_length_4 (line 144) | def test_true_segments_1d_min_length_4(self):
method test_true_segments_1d_min_length_5 (line 147) | def test_true_segments_1d_min_length_5(self):
method test_true_segments_1d_min_length_6 (line 150) | def test_true_segments_1d_min_length_6(self):
method _test_true_segments_1d_min_length (line 153) | def _test_true_segments_1d_min_length(self, min_length):
method test_peaks (line 176) | def test_peaks(self):
method test_peaks_empty (line 182) | def test_peaks_empty(self):
method test_peaks_invalidate_distance (line 186) | def test_peaks_invalidate_distance(self):
method test_peaks_array_filled_with_same_value (line 206) | def test_peaks_array_filled_with_same_value(self):
method test_peaks_one_segment (line 212) | def test_peaks_one_segment(self):
FILE: moonlight/vision/hough.py
function hough_lines (line 33) | def hough_lines(image, thetas):
function hough_peaks (line 58) | def hough_peaks(hough_bins, thetas, minval=0, invalidate_distance=0):
function _bincount_2d (line 115) | def _bincount_2d(values, num_values):
FILE: moonlight/vision/hough_test.py
class HoughTest (line 26) | class HoughTest(tf.test.TestCase):
method testHorizontalLines (line 28) | def testHorizontalLines(self):
method testHoughPeaks_verticalLines (line 54) | def testHoughPeaks_verticalLines(self):
method testHoughPeaks_minval (line 76) | def testHoughPeaks_minval(self):
method testHoughPeaks_minvalTooLarge (line 89) | def testHoughPeaks_minvalTooLarge(self):
FILE: moonlight/vision/images.py
function translate (line 26) | def translate(image, x, y):
FILE: moonlight/vision/images_test.py
class ImagesTest (line 25) | class ImagesTest(tf.test.TestCase):
method testTranslate (line 27) | def testTranslate(self):
FILE: moonlight/vision/morphology.py
function binary_erosion (line 32) | def binary_erosion(image, n):
function binary_dilation (line 50) | def binary_dilation(image, n):
function _repeated_morphological_op (line 68) | def _repeated_morphological_op(float_image, binary_op, n):
function _single_morphological_op (line 77) | def _single_morphological_op(float_image, binary_op):
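`binary_erosion` and `binary_dilation` shrink or grow True regions by `n` pixels. A 1-D pure-Python sketch of erosion (the module's real ops act on 2-D TensorFlow images; out-of-range neighbors are simply ignored here):

```python
def binary_erosion_1d(bits, n):
    """A cell survives only if every cell within distance n is True."""
    size = len(bits)
    return [all(bits[max(0, i - n):min(size, i + n + 1)])
            for i in range(size)]
```

Dilation is the dual: a cell becomes True if *any* neighbor within distance `n` is True (`any` in place of `all`).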
FILE: moonlight/vision/morphology_test.py
class MorphologyTest (line 26) | class MorphologyTest(tf.test.TestCase):
method testMorphology_false (line 28) | def testMorphology_false(self):
method testErosion_small (line 35) | def testErosion_small(self):
method testErosion (line 42) | def testErosion(self):
method testDilation (line 60) | def testDilation(self):
Condensed preview — 162 files, each showing path, character count, and a content snippet (full structured content: 923K chars).
[
{
"path": "AUTHORS",
"chars": 349,
"preview": "# This is the list of Moonlight OMR authors for copyright purposes.\n#\n# Copyright for contributions made under Google's "
},
{
"path": "CONTRIBUTING.md",
"chars": 1101,
"preview": "# How to Contribute\n\nWe'd love to accept your patches and contributions to this project. There are\njust a few small guid"
},
{
"path": "CONTRIBUTORS",
"chars": 396,
"preview": "# This is the list of individual contributors to the Moonlight OMR project.\n#\n# Some contributions belong to the organiz"
},
{
"path": "LICENSE",
"chars": 11358,
"preview": "\n Apache License\n Version 2.0, January 2004\n "
},
{
"path": "README.md",
"chars": 2841,
"preview": "<img align=\"center\" width=\"400\" height=\"94,358\" src=\"https://user-images.githubusercontent.com/34600369/40580500-74088e4"
},
{
"path": "WORKSPACE",
"chars": 1473,
"preview": "http_archive(\n name = \"com_google_protobuf\",\n sha256 = \"40f009cb0c190816a52fc21d45c26558ee7d63c3bd511b326bd85739b2"
},
{
"path": "docs/concepts.md",
"chars": 4063,
"preview": "# Moonlight OMR Concepts\n\n## Diagram\n\n<img src=\"concepts_diagram.png\" />\n\n## Glossary\n\n| Term | Definition "
},
{
"path": "docs/engine.md",
"chars": 7120,
"preview": "## OMR Engine Detailed Reference\n\nThe OMR engine converts a PNG image to a Magenta\n[NoteSequence message](https://github"
},
{
"path": "moonlight/BUILD",
"chars": 2833,
"preview": "# Description:\n# Optical music recognition using TensorFlow.\n\npackage(\n default_visibility = [\"//moonlight:__subpacka"
},
{
"path": "moonlight/conversions/BUILD",
"chars": 915,
"preview": "# Description:\n# Format conversions for OMR.\npackage(\n default_visibility = [\"//moonlight:__subpackages__\"],\n)\n\nlicen"
},
{
"path": "moonlight/conversions/__init__.py",
"chars": 972,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/conversions/musicxml.py",
"chars": 8347,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/conversions/musicxml_test.py",
"chars": 6027,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/conversions/notesequence.py",
"chars": 1437,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/data/README.md",
"chars": 56,
"preview": "Description:\n\nBuilt-in models required for running OMR.\n"
},
{
"path": "moonlight/data/glyphs_nn_model_20180808/BUILD",
"chars": 185,
"preview": "package(\n default_visibility = [\"//moonlight:__subpackages__\"],\n)\n\nlicenses([\"notice\"]) # Apache 2.0\n\nfilegroup(\n "
},
{
"path": "moonlight/data/glyphs_nn_model_20180808/saved_model.pbtxt",
"chars": 118603,
"preview": "saved_model_schema_version: 1\nmeta_graphs {\n meta_info_def {\n stripped_op_list {\n op {\n name: \"Add\"\n "
},
{
"path": "moonlight/engine.py",
"chars": 8170,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/evaluation/BUILD",
"chars": 1678,
"preview": "# Description:\n# OMR evaluation and music score similarity metrics.\n\npackage(\n default_visibility = [\"//moonlight:__s"
},
{
"path": "moonlight/evaluation/evaluator.py",
"chars": 1984,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/evaluation/evaluator_endtoend_test.py",
"chars": 1628,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/evaluation/musicxml.py",
"chars": 15619,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/evaluation/musicxml_test.py",
"chars": 10235,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/glyphs/BUILD",
"chars": 4345,
"preview": "# Description:\n# Glyph classification for OMR.\n\npackage(\n default_visibility = [\"//moonlight:__subpackages__\"],\n)\n\nli"
},
{
"path": "moonlight/glyphs/base.py",
"chars": 3542,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/glyphs/convolutional.py",
"chars": 4453,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/glyphs/convolutional_test.py",
"chars": 2773,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/glyphs/corpus.py",
"chars": 2905,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/glyphs/geometry.py",
"chars": 1644,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/glyphs/glyph_types.py",
"chars": 1656,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/glyphs/knn.py",
"chars": 5528,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/glyphs/knn_model.py",
"chars": 6074,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/glyphs/knn_test.py",
"chars": 4332,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/glyphs/neural.py",
"chars": 11877,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/glyphs/neural_test.py",
"chars": 2171,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/glyphs/note_dots.py",
"chars": 4129,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/glyphs/repeated.py",
"chars": 1334,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/glyphs/saved_classifier.py",
"chars": 6907,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/glyphs/saved_classifier_fn.py",
"chars": 1924,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/glyphs/saved_classifier_test.py",
"chars": 3071,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/glyphs/testing.py",
"chars": 2324,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/image.py",
"chars": 1914,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/models/base/BUILD",
"chars": 1936,
"preview": "# Description:\n# Common logic for all model training.\n\npackage(\n default_visibility = [\"//moonlight:__subpackages__\"]"
},
{
"path": "moonlight/models/base/batches.py",
"chars": 2071,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/models/base/batches_test.py",
"chars": 2226,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/models/base/glyph_patches.py",
"chars": 10257,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/models/base/glyph_patches_test.py",
"chars": 4382,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/models/base/hyperparameters.py",
"chars": 3107,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/models/base/hyperparameters_test.py",
"chars": 1755,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/models/base/label_weights.py",
"chars": 3066,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/models/base/label_weights_test.py",
"chars": 1339,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/models/glyphs_dnn/BUILD",
"chars": 647,
"preview": "# Description:\n# Glyph patches DNN classifier.\n\npackage(\n default_visibility = [\"//moonlight:__subpackages__\"],\n)\n\nli"
},
{
"path": "moonlight/models/glyphs_dnn/model.py",
"chars": 3915,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/models/glyphs_dnn/train.py",
"chars": 1087,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/music/BUILD",
"chars": 250,
"preview": "# Description:\n# General music theory for OMR.\npackage(\n default_visibility = [\"//moonlight:__subpackages__\"],\n)\n\nlic"
},
{
"path": "moonlight/music/constants.py",
"chars": 1360,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/omr.py",
"chars": 3536,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/omr_endtoend_test.py",
"chars": 4902,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/omr_regression_test.py",
"chars": 7224,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/page_processors.py",
"chars": 2979,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/pipeline/BUILD",
"chars": 336,
"preview": "# Description:\n# OMR pipelines and Apache Beam utilities.\npackage(\n default_visibility = [\"//moonlight:__subpackages_"
},
{
"path": "moonlight/pipeline/pipeline_flags.py",
"chars": 1119,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/protobuf/BUILD",
"chars": 1014,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/protobuf/groundtruth.proto",
"chars": 2173,
"preview": "// Copyright 2018 Google LLC\n//\n// Licensed under the Apache License, Version 2.0 (the \"License\");\n// you may not use th"
},
{
"path": "moonlight/protobuf/musicscore.proto",
"chars": 6254,
"preview": "// Copyright 2018 Google LLC\n//\n// Licensed under the Apache License, Version 2.0 (the \"License\");\n// you may not use th"
},
{
"path": "moonlight/score/BUILD",
"chars": 891,
"preview": "# Description:\n# Score reading for OMR.\n\npackage(\n default_visibility = [\"//moonlight:__subpackages__\"],\n)\n\nlicenses("
},
{
"path": "moonlight/score/elements/BUILD",
"chars": 969,
"preview": "# Description:\n# Score elements which encapsulate custom logic for score reading.\n\npackage(\n default_visibility = [\"/"
},
{
"path": "moonlight/score/elements/clef.py",
"chars": 2988,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/score/elements/clef_test.py",
"chars": 2493,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/score/elements/key_signature.py",
"chars": 7784,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/score/elements/key_signature_test.py",
"chars": 3997,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/score/measures.py",
"chars": 2225,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/score/reader.py",
"chars": 5142,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/score/reader_test.py",
"chars": 19072,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/score/state/BUILD",
"chars": 891,
"preview": "# Description:\n# The music score state. Holds information such as the key signature and current time in the score,\n# whi"
},
{
"path": "moonlight/score/state/__init__.py",
"chars": 2397,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/score/state/measure.py",
"chars": 7112,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/score/state/staff.py",
"chars": 2705,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/score_processors.py",
"chars": 1382,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/scripts/imslp_pdfs_to_pngs.sh",
"chars": 1821,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/staves/BUILD",
"chars": 3830,
"preview": "# Description:\n# Staff detection and removal routines.\n\npackage(\n default_visibility = [\"//moonlight:__subpackages__\""
},
{
"path": "moonlight/staves/__init__.py",
"chars": 1203,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/staves/base.py",
"chars": 7156,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/staves/detectors_test.py",
"chars": 5030,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/staves/filter.py",
"chars": 4740,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/staves/hough.py",
"chars": 7985,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/staves/projection.py",
"chars": 3624,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/staves/removal.py",
"chars": 4678,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/staves/removal_test.py",
"chars": 1901,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/staves/staff_processor.py",
"chars": 2566,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/staves/staff_processor_test.py",
"chars": 3622,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/staves/staffline_distance.py",
"chars": 10559,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/staves/staffline_distance_test.py",
"chars": 2617,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/staves/staffline_extractor.py",
"chars": 15498,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/staves/staffline_extractor_test.py",
"chars": 4521,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/staves/testing.py",
"chars": 1787,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/structure/BUILD",
"chars": 3668,
"preview": "# Description:\n# Music score structure detection above the staff level.\n\npackage(\n default_visibility = [\"//moonlight"
},
{
"path": "moonlight/structure/__init__.py",
"chars": 5971,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/structure/barlines.py",
"chars": 8394,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/structure/barlines_test.py",
"chars": 5147,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/structure/beam_processor.py",
"chars": 6203,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/structure/beams.py",
"chars": 2596,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/structure/components.py",
"chars": 3228,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/structure/components_test.py",
"chars": 1618,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/structure/section_barlines.py",
"chars": 7150,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/structure/stems.py",
"chars": 4980,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/structure/stems_test.py",
"chars": 3691,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/structure/structure_test.py",
"chars": 1654,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/structure/verticals.py",
"chars": 6024,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/structure/verticals_test.py",
"chars": 2046,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/testdata/BUILD",
"chars": 320,
"preview": "# Description:\n# Music scores used for inference and testing the OMR pipeline.\n\npackage(\n default_visibility = [\"//mo"
},
{
"path": "moonlight/testdata/IMSLP00747-000.LICENSE.md",
"chars": 150,
"preview": "The image was obtained from IMSLP:\n\nhttps://imslp.org/wiki/Special:ReverseLookup/747\n\nIt was published in 1853 and is in"
},
{
"path": "moonlight/testdata/IMSLP00747.golden.LICENSE.md",
"chars": 148,
"preview": "The score was transcribed by the Moonlight authors. The MusicXML transcription\nis released under the Apache License, ver"
},
{
"path": "moonlight/testdata/IMSLP00747.golden.xml",
"chars": 185361,
"preview": "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n<!-- Copyright 2018 Google LLC\n -\n - Licensed under the Apache License, Versi"
},
{
"path": "moonlight/testdata/README.md",
"chars": 309,
"preview": "# OMR Corpus\n\nContains test data for unit testing the OMR pipeline. Currently contains a\nsingle page of a music score, t"
},
{
"path": "moonlight/testdata/TWO_MEASURE_SAMPLE.LICENSE.md",
"chars": 145,
"preview": "The score was composed by the Moonlight authors. The MusicXML transcription is\nreleased under the Apache License, versio"
},
{
"path": "moonlight/testdata/TWO_MEASURE_SAMPLE.xml",
"chars": 5637,
"preview": "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n<!-- Copyright 2018 Google LLC\n -\n - Licensed under the Apache License, Versi"
},
{
"path": "moonlight/tools/BUILD",
"chars": 1354,
"preview": "package(default_visibility = [\"//moonlight:__subpackages__\"])\n\nlicenses([\"notice\"]) # Apache 2.0\n\npy_binary(\n name ="
},
{
"path": "moonlight/tools/export_kmeans_centroids.py",
"chars": 1245,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/tools/export_kmeans_centroids_test.py",
"chars": 1989,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/tools/gen_structure_test_case.py",
"chars": 2047,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/training/clustering/BUILD",
"chars": 2885,
"preview": "# Description:\n# Unsupervised learning pipeline for OMR glyph classification.\n\npackage(\n default_visibility = [\"//moo"
},
{
"path": "moonlight/training/clustering/kmeans_labeler.py",
"chars": 2854,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/training/clustering/kmeans_labeler_request_handler.py",
"chars": 6732,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/training/clustering/kmeans_labeler_request_handler_test.py",
"chars": 2443,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/training/clustering/kmeans_labeler_template.html",
"chars": 1549,
"preview": "<%def name=\"cluster_preview(cluster)\">\n ## cluster is a tuple (index, preview, is_content).\n <div class=\"cluster_previ"
},
{
"path": "moonlight/training/clustering/staffline_patches_dofn.py",
"chars": 4345,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/training/clustering/staffline_patches_dofn_test.py",
"chars": 2774,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/training/clustering/staffline_patches_kmeans_pipeline.py",
"chars": 7509,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/training/clustering/staffline_patches_kmeans_pipeline_test.py",
"chars": 1931,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/training/generation/BUILD",
"chars": 1399,
"preview": "# Description:\n# Generated training data for OMR.\npackage(default_visibility = [\"//moonlight:__subpackages__\"])\n\nlicense"
},
{
"path": "moonlight/training/generation/generation.py",
"chars": 11447,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/training/generation/generation_test.py",
"chars": 1955,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/training/generation/image_noise.py",
"chars": 1320,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/training/generation/vexflow_generator.js",
"chars": 8710,
"preview": "// Copyright 2018 Google LLC\n//\n// Licensed under the Apache License, Version 2.0 (the \"License\");\n// you may not use th"
},
{
"path": "moonlight/training/generation/vexflow_generator_pipeline.py",
"chars": 3602,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/util/BUILD",
"chars": 2264,
"preview": "# Description:\n# General utilities for OMR.\npackage(\n default_visibility = [\"//moonlight:__subpackages__\"],\n)\n\nlicens"
},
{
"path": "moonlight/util/functional_ops.py",
"chars": 1908,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/util/functional_ops_test.py",
"chars": 1078,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/util/memoize.py",
"chars": 1701,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/util/more_iter_tools.py",
"chars": 1083,
"preview": "\"\"\"More iterator utilities.\"\"\"\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ i"
},
{
"path": "moonlight/util/more_iter_tools_test.py",
"chars": 1195,
"preview": "\"\"\"Tests for more_iter_tools.\"\"\"\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__"
},
{
"path": "moonlight/util/patches.py",
"chars": 2832,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/util/patches_test.py",
"chars": 2471,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/util/run_length.py",
"chars": 3862,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/util/run_length_test.py",
"chars": 1821,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/util/segments.py",
"chars": 10411,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/util/segments_test.py",
"chars": 8482,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/vision/BUILD",
"chars": 1400,
"preview": "# Description:\n# General computer vision routines for OMR.\n\npackage(\n default_visibility = [\"//moonlight:__subpackage"
},
{
"path": "moonlight/vision/hough.py",
"chars": 5557,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/vision/hough_test.py",
"chars": 3977,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/vision/images.py",
"chars": 1621,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/vision/images_test.py",
"chars": 1420,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/vision/morphology.py",
"chars": 2686,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "moonlight/vision/morphology_test.py",
"chars": 2825,
"preview": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this f"
},
{
"path": "requirements.txt",
"chars": 355,
"preview": "# List of all packages used by Moonlight.\nabsl-py\n# TODO(ringw): Get the latest apache beam version working in py2 tests"
},
{
"path": "sandbox/README.md",
"chars": 510,
"preview": "# Moonlight Sandbox\n\nThis directory has the symlinks necessary to import Moonlight after building.\nYou can either add th"
},
{
"path": "six.BUILD",
"chars": 126,
"preview": "py_library(\n name = \"six\",\n srcs = [\"six.py\"],\n visibility = [\"//visibility:public\"],\n srcs_version = \"PY2AN"
},
{
"path": "tools/bazel_0.20.0-linux-x86_64.deb.sha256",
"chars": 96,
"preview": "ea47050fe839a7f5fb6c3ac1cc876a70993e614bab091aefc387715c3cb48a86 bazel_0.20.0-linux-x86_64.deb\n"
},
{
"path": "tools/travis_tests.sh",
"chars": 672,
"preview": "#!/bin/bash\n# Print commands before running them.\nset -x\n\n# Apache Beam only supports Python 2 :(\n# Filter tests with ta"
}
]
// ... and 2 more files (download for full content)
About this extraction
This page contains the source code of the tensorflow/moonlight GitHub repository, extracted and formatted as plain text. The extraction includes 162 files (852.8 KB, approximately 227.5k tokens) and a symbol index with 544 extracted functions, classes, methods, constants, and types.
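Each manifest entry above is a JSON object with a `path`, a `chars` count, and a truncated `preview`. As a minimal sketch of working with entries of this shape — note the surrounding document and the trailing `// ...` comment are not themselves valid JSON, so a small inline sample with the same structure is used here:

```python
import json

# Inline sample with the same shape as the manifest entries above:
# objects carrying "path", "chars", and a truncated "preview".
sample = """[
  {"path": "moonlight/engine.py", "chars": 8170, "preview": "# Copyright 2018 Google LLC"},
  {"path": "moonlight/omr.py", "chars": 3536, "preview": "# Copyright 2018 Google LLC"}
]"""

entries = json.loads(sample)

# Total size of the listed files, in characters.
total_chars = sum(entry["chars"] for entry in entries)

# Filter entries by extension to get a rough view of the repo layout.
python_files = [e["path"] for e in entries if e["path"].endswith(".py")]

print(total_chars)   # 11706
print(python_files)  # ['moonlight/engine.py', 'moonlight/omr.py']
```

The same pattern (parse, then filter or aggregate over `path`/`chars`) applies to the full 162-entry manifest once the non-JSON wrapper text is stripped.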