Repository: bj80heyue/One_Shot_Face_Reenactment
Branch: master
Commit: 880a80b2f891
Files: 42
Total size: 282.7 KB
Directory structure:
gitextract_8hbwdasm/
├── .gitignore
├── LICENSE
├── README.md
├── data/
│ ├── poseGuide/
│ │ └── lms_poseGuide.out
│ └── reference/
│ └── lms_ref.out
├── fusion/
│ ├── README.md
│ ├── affineFace.py
│ ├── calcAffine.py
│ ├── parts2lms.py
│ ├── points2heatmap.py
│ ├── test.py
│ └── warper.py
├── loader/
│ ├── __init__.py
│ ├── dataset_basic.py
│ ├── dataset_loader_demo.py
│ └── dataset_loader_train.py
├── model/
│ ├── base_model.py
│ └── spade_model.py
├── net/
│ ├── ResNet.py
│ ├── appear_decoder_net.py
│ ├── appear_encoder_net.py
│ ├── base_net.py
│ ├── discriminator_net.py
│ ├── face_id_mlp_net.py
│ ├── face_id_net.py
│ ├── generaotr_net.py
│ ├── generator_net_concat_1Layer.py
│ └── vgg_net.py
├── opt/
│ ├── __init__.py
│ ├── config.py
│ └── configTrain.py
├── requirements.txt
├── test.py
└── utils/
├── __init__.py
├── affineFace.py
├── affine_util.py
├── calcAffine.py
├── lms.test
├── metric.py
├── points2heatmap.py
├── transforms.py
└── warper.py
================================================
FILE CONTENTS
================================================
================================================
FILE: .gitignore
================================================
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class
# C extensions
*.so
# Distribution / packaging
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST
# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec
# Installer logs
pip-log.txt
pip-delete-this-directory.txt
# Unit test / coverage reports
htmlcov/
.tox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
.hypothesis/
.pytest_cache/
# Translations
*.mo
*.pot
# Django stuff:
*.log
local_settings.py
db.sqlite3
# Flask stuff:
instance/
.webassets-cache
# Scrapy stuff:
.scrapy
# Sphinx documentation
docs/_build/
# PyBuilder
target/
# Jupyter Notebook
.ipynb_checkpoints
# pyenv
.python-version
# celery beat schedule file
celerybeat-schedule
# SageMath parsed files
*.sage.py
# Environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/
# Spyder project settings
.spyderproject
.spyproject
# Rope project settings
.ropeproject
# mkdocs documentation
/site
# mypy
.mypy_cache/
================================================
FILE: LICENSE
================================================
MIT License
Copyright (c) 2019 Stan
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
================================================
FILE: README.md
================================================
# One-shot Face Reenactment
[[Project]](https://wywu.github.io/projects/ReenactGAN/OneShotReenact.html) [[Paper]](https://arxiv.org/abs/1908.03251) [[Demo]](https://www.youtube.com/watch?v=FE-D6wh11_A)
Official PyTorch test script for the BMVC 2019 spotlight paper 'One-shot Face Reenactment'.
## Installation
### Requirements
- Linux
- Python 3.6
- PyTorch 0.4+
- CUDA 9.0+
- GCC 4.9+
### Easy Install
```shell
pip install -r requirements.txt
```
## Getting Started
### Prepare Data
It is recommended to symlink the dataset root to `$PROJECT/data`.
```shell
Project
├── data
│ ├── poseGuide
│ │ ├── imgs
│ │ ├── lms
│ ├── reference
│ │ ├── imgs
│ │ ├── lms
```
- imgs : stores the images
- lms : stores landmarks extracted from the images
- format : 106 common facial key points plus 20 + 20 gaze key points (see the parsing sketch below)

Example input data is organized in the folder 'data'. If you want to test with your own data, please organize it in the same format as the example input data.
Output images are saved in the folder 'output'.
Due to company policy, the model used to extract the 106 + 40 facial landmarks cannot be released. However, if you want access to the following datasets, please fill in the license file in the repo (license/celebHQlms_license.pdf), then email the signed copy to siwei.1995@163.com:
- our preprocessed 106 + 40 facial landmark annotations of celebHQ dataset
- additional 80 images as pose guide with corresponding 106 + 40 facial landmark annotations
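For reference, here is a minimal sketch of how a landmark file in this format can be parsed (it mirrors `loaddata` in `fusion/affineFace.py`; `read_lms` is an illustrative helper, not part of the released scripts):
```python
import numpy as np

def read_lms(path):
    # Each record is two lines: the image name, then whitespace-separated
    # x y coordinates for the 106 + 40 points.
    with open(path) as fin:
        lines = fin.read().splitlines()
    records = []
    for i in range(len(lines) // 2):
        name = lines[2 * i]
        coords = np.array(list(map(float, lines[2 * i + 1].split())))
        records.append((name, coords.reshape(-1, 2) / 255.0))  # normalize to [0, 1]
    return records
```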
### Inference with pretrained model
```
python test.py --pose_path PATH/TO/POSE/GUIDE/IMG/DIR --ref_path PATH/TO/REF/IMG/DIR --pose_lms PATH/TO/POSE/LANDMARK/FILE --ref_lms PATH/TO/REF/LANDMARK/FILE
```
```
output sequence:
ref1-pose1, ref1-pose2, ref1-pose3, ... &
ref2-pose1, ref2-pose2, ref2-pose3, ... &
ref3-pose1, ref3-pose2, ref3-pose3, ... &
.
.
.
```
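The flat output index therefore encodes a (reference, pose) pair; a small illustrative helper (the `process` function in `fusion/affineFace.py` uses the same convention with 100 poses):
```python
def split_index(index, n_poses):
    # Outputs are reference-major: all poses for ref 1, then all poses for ref 2, ...
    ref_idx = index // n_poses   # which reference image
    pose_idx = index % n_poses   # which pose guide
    return ref_idx, pose_idx
```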
### Pretrained model
You can download the models from [here](https://drive.google.com/open?id=1Wnc2TGwFQM4PdCdeSn-trI75UeGbuY_E).
```shell
Project
├── pretrainModel
│ ├── id_200.pth
│ ├── vgg16-397923af.pth
├── trained_model
│ ├── latest_net_appEnc.pth
│ ├── latest_net_appDnc.pth
│ ├── latest_net_netG.pth
│ ├── latest_net_netD64.pth
│ ├── latest_net_netD128.pth
│ ├── latest_net_netD256.pth
```
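Each `.pth` file is a regular PyTorch checkpoint; assuming the `latest_net_*` files store plain state dicts (as the naming suggests), a minimal loading sketch is:
```python
import torch

# Load the generator weights onto CPU; map_location avoids requiring a GPU here.
state_dict = torch.load('trained_model/latest_net_netG.pth', map_location='cpu')
# netG = ...  # build the generator defined under net/, then:
# netG.load_state_dict(state_dict)
```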
### Visualization of results
You can download our sample data and corresponding results from [here](https://drive.google.com/open?id=1Ia8YJrtYTvNRwBfcKK7iBSAf5vb8gkqw).
## License and Citation
The use of this software follows **MIT License**.
```
@inproceedings{OneShotFace2019,
title={One-shot Face Reenactment},
author={Zhang, Yunxuan and Zhang, Siwei and He, Yue and Li, Cheng and Loy, Chen Change and Liu, Ziwei},
booktitle={British Machine Vision Conference (BMVC)},
year={2019}
}
```
================================================
FILE: data/poseGuide/lms_poseGuide.out
================================================
pose_1.jpg
90.44765716236793 148.92923857343578 91.57595251229804 162.8479934005091 92.9120000469461 176.46386363738992 94.4750702020267 189.95495589886139 97.02199692128647 203.29694912957 99.79529533135087 216.69020667075688 102.741526161524 230.20452567522284 107.24603891660149 243.02974731417868 112.59511909643379 255.8834214948385 119.47399642829623 267.9355943070909 127.98055826477298 278.4243267720386 137.41574048038206 287.8953381465611 147.8454651322761 296.04741539656624 159.34423801793673 301.4650047702521 172.04009073265115 306.14600984182664 184.8476145369723 309.1498658924654 198.1901765257437 309.2894505953745 210.300733963732 306.8638379074362 221.74102561555833 302.8266755169669 232.31119706456093 296.7210222250431 242.4531426761971 288.6321111047236 250.61525106310273 280.1429009636254 257.9335619693902 269.4977033895718 264.79963843573 258.62440630155766 269.747212813302 247.98834091017915 272.8998867302482 235.1301723854457 275.20565595853645 222.93601399814503 277.7970566608542 210.45285890859884 279.74133799998094 197.64873719251972 280.61281413259303 184.5151342861813 281.86739874645264 171.85992721532836 281.4792935965238 158.93559678704815 281.5092061164478 146.28295371195867 114.76937329328939 130.951598068198 128.08263322780795 117.98798787017822 142.75231443631833 115.1611843213403 157.86672034108483 116.6153814472695 173.90706075198796 120.19246576692521 206.74686782862727 121.02894499848259 220.99542433006582 116.48930599229095 236.1074216934221 114.23420102476143 250.73115505761848 115.86996634388765 263.3734861763337 127.31835338341557 191.33970313484292 140.2993282288778 192.45687057588907 153.99556019589238 193.40006374594554 167.7197145214593 194.32236469376062 181.56555264352028 174.32287399383358 199.9117998945228 184.1300182390903 200.7080648332646 194.13823396512498 200.9199239355632 203.11092044225438 199.86497334690608 212.42735582667092 198.39314190081785 131.27202642017005 145.25804889557963 139.60930390659155 138.39140341160612 159.2010831026048 137.80280976717103 166.42600991645304 143.9995468330668 158.77062924567076 148.28800882103064 140.55642625303972 148.86503435426226 213.4374142301142 143.2204520243767 220.39543471642332 136.74147974141397 239.92071927456345 137.12800959501914 247.93163833918334 142.84138090328952 240.00825505051557 147.15669896848198 221.6303427783007 146.4370558472341 128.0094815119112 127.30603943524707 143.35837320872042 124.87580588006517 158.72832845001858 126.4964054707917 173.69199365904 128.65707163001753 207.1552887017947 128.7516632692907 221.6274310617116 125.50257185064652 235.7439188520525 123.88641465163721 250.25624675797326 124.72534010103783 149.3436339426612 135.51948593642157 149.7487980940789 149.8507149705502 149.2291436246387 143.5389557296295 230.21048834458065 134.5325702413665 230.55831327048872 147.71896200290422 230.8006705913076 142.14327359606972 179.39126786362544 141.84649970810392 202.05051090427742 141.56137209773732 173.8039715509792 178.44659547107824 211.45557809523098 178.39411289071995 168.10655914660424 192.04339505424878 217.73949281404384 190.9817893520048 163.77671869691835 245.9297018326953 174.2606984559743 234.04905150188023 188.89939874358288 227.84672346345394 196.7329061228004 228.2799145924343 204.10161247457313 226.22829637685066 215.79634899438292 232.07471557651525 225.75460645391786 242.76781565926217 218.67711851772015 251.95207930554216 208.64262130213024 258.23235688939445 196.93652094532126 261.1568196873806 184.66714682070506 260.51896117531226 172.74227783176593 
254.4916159172529 167.5079387479136 245.17813995913383 181.0178831606829 236.37151769034227 196.79519987565118 235.13433671419762 210.26497756865683 235.59731745760598 221.92064485126207 242.76948430139294 209.8973809584047 248.94593227955767 196.5173815999451 251.3505374509133 180.9974006859345 249.87787983113742 149.24129540873582 143.53587629472423 230.8157833355565 142.13912050783574 147.133 140.543 146.515 142.549 145.134 144.308 143.303 145.533 141.195 146.09 139.025 146.064 137.013 145.291 135.327 143.832 134.289 141.928 133.844 140.046 133.787 139.301 134.208 137.402 135.234 135.42 136.895 134.055 138.873 133.177 141.105 132.998 143.193 133.411 145.055 134.593 146.375 136.332 147.056 138.359 211.016 134.501 211.428 135.177 212.929 139.128 215.603 142.521 219.29 144.814 223.556 145.707 227.95 145.317 232.052 143.676 235.325 140.702 237.262 136.833 237.814 132.601 237.099 128.309 235.103 124.352 231.856 121.555 227.845 120.103 223.408 119.916 219.124 120.821 215.37 122.99 212.551 126.254 211.221 130.155
pose_2.jpg
100.093194977701 161.4013559052053 100.53716781745582 174.1857192569044 101.90737024900535 186.330944275622 103.3735344503214 198.64378987292483 106.0354925127567 210.26739998238142 108.63544049277088 222.61087067996962 111.59644983652562 234.0313677912341 115.58285449158832 244.8659427127749 119.6903965821058 256.22302381253576 125.40724643070543 266.6858725541848 132.9721330817453 276.1873768745778 141.3874107390709 284.37879171946804 150.87961259137933 291.7796379709438 161.32208063157066 297.27698204120384 173.02991152939967 301.9868103222333 183.95909621077863 305.35365302459354 195.7679601753028 306.2800992849018 208.18786280319534 305.7388693777715 219.90713032948378 302.63420928143375 231.0674983570704 298.4270380602661 241.87467583557293 292.4041270658705 251.11430378350548 284.80162413595906 259.75921009425394 275.4774005034841 267.5936934573682 265.5298987339355 273.10328426726045 254.9936831616392 276.9043510885375 242.3450654350767 279.6842731168115 230.00664245652425 282.2207554656838 217.85061671549514 283.6271167176647 205.31749176197863 284.91158825286465 192.74645559564587 286.28405217837695 180.92652850754328 286.4006652981046 168.26830468338156 286.37802009263464 156.15744637477152 115.4900160996773 125.082193502199 125.97839064692833 111.80204842041258 140.26601488172514 106.87664397110419 155.14218390910088 106.70852137040197 169.09935687644764 110.27262179717772 206.40394643424622 109.51418879556448 220.59279429423606 105.65324388595778 235.85397794810945 105.46019037708922 250.0199630890238 110.31695931104736 261.8146174979548 122.08371614206064 188.27026113905077 141.11795251994283 188.94549470038856 155.79707940298127 189.08404129537774 170.40651295539388 189.0338830757779 185.08851954718264 171.16007470459334 206.46071102387398 180.93779563127492 207.0502404698997 190.87321681074877 206.35219052398807 200.97355808008274 204.79966402335427 211.08925122271853 204.3564037192203 129.99709531216052 146.82516613202017 138.45895319487153 138.65812639817906 158.9199691214858 139.59960169134345 165.7290131584893 147.94525838106998 156.7653382067972 151.544880685268 138.46813920180864 150.68082990304023 212.80803931969268 146.85949161785712 219.26231272640433 138.08401938986523 239.75751734527222 136.58540730340738 248.649051930705 143.87512069902633 240.21094885731281 148.74329648092575 221.15190422708247 149.42092601802844 126.86960645378379 120.33246236674327 140.8901007864652 116.54432321777693 154.93940323118028 117.14727923840363 168.52913532187097 118.78941690089661 207.75083328136253 118.21893080257749 222.02013148499196 115.72501499112207 236.30085682138304 115.43394613063205 249.963450327016 118.47854574588834 148.87490777257472 135.70718113818126 147.59434754904237 152.29876550495365 148.03748477546577 145.4896501933588 229.46599844979602 134.41317010619932 230.84721286967346 149.98417040083237 230.72517795734979 143.7379041389614 177.48241144005863 144.42981886262302 200.57731720506337 143.59579850675797 171.84385977785854 184.30467672499688 208.85265199799937 182.28690286702104 164.7157107525598 198.3152809435389 216.5810910941354 196.00943146426206 162.08994193115757 242.61385974228384 172.0858024620631 231.3517041461669 184.59666727117178 225.07210003155583 192.4492468801826 226.01037622068648 200.5588240919513 224.77790926122546 213.2634454736595 229.9549048592879 224.9381988568802 240.3493923009241 215.6243032223897 247.39660041613203 205.36281791541535 252.38760036802856 193.5081632119767 254.17419095783012 182.49965019869023 253.80076635628163 
170.88804392816343 249.59330429304387 167.57043887732817 242.73629516138735 180.19135949493068 238.375810332754 192.85939239711735 237.72187925766596 206.8976616720181 236.7157494378059 219.61588976882643 239.68200286313336 205.99954107757281 239.46004693594395 192.87395458689463 239.71164424499196 180.12457142150345 240.76992809596996 148.06175692301161 145.49601398018515 230.75213110892832 143.74539610046523 151.81004827804105 143.84939898317916 150.63144534208192 145.6441302756924 148.86913851684324 146.98729337759272 146.8314499157171 147.6939404451874 144.67901038421724 147.69476928060448 142.64910917316251 147.06602696415425 141.02872336421495 145.72688544037987 139.94956991302985 143.81564142727416 139.60619050377977 141.69161161706296 139.7873028215118 139.78759526241075 139.95654438421695 139.21832295211956 140.85339649250864 137.5185476486653 142.29640969176268 135.85395234715398 144.20622126850992 134.9262398383987 146.28762452136618 134.58486734190467 148.44938709275277 134.98154598440036 150.30228524201237 135.933428538162 151.69219101976256 137.56935702042549 152.39817535520797 139.56902405102318 152.4303516793084 141.772208047627 220.83087992185017 138.27456575190863 221.03940931940252 140.4295799316757 221.93522532027163 142.46753600857878 223.44225732523256 144.06060385796238 225.3969587320836 145.0880821494304 227.5350677449888 145.46792778177556 229.65563945935935 145.05177339486923 231.56306625944353 143.89634876044013 232.94612166470642 142.1701057576689 233.7429995319805 140.35694118816687 233.86461164644606 139.79091564686422 233.88532984187515 137.84824845062215 233.4031521329124 135.67294387241222 232.13258946772032 133.9224424167932 230.40870849293913 132.67311152665408 228.2875682138373 132.0047132346189 226.16157424927616 131.9614449361577 224.10586896110965 132.72545572561967 222.45252030932784 134.1380106049866 221.29291179707224 136.12143267485757
pose_3.jpg
100.68002376329395 153.79703744153008 101.4051291762957 167.29352731036488 102.36608555212445 180.38026696181367 103.5115085866463 193.2796673806298 105.60208513182278 206.22525492792363 108.05407168715283 219.1911996360299 110.72916034828148 232.04327056505645 114.662128816159 244.31299676982297 119.22729693999361 256.96663174064025 124.91038003602449 268.85800394432505 132.06981400976326 280.20769626878973 140.08399028936486 290.3547602041902 149.26992550691995 299.5458046262653 159.67522608282252 305.8702960749566 171.6053687509525 310.79222087866896 183.76463958488898 314.0718881616237 196.40643310382438 314.9296751937251 209.08260222463787 312.7460533253202 221.08109456329373 309.46714244205816 232.5904138851381 304.34993137439795 243.38614067947708 296.72779844763835 251.95556374662897 287.92424367136994 259.56128384934743 277.37881002812264 266.6055223864016 266.02289533102015 271.8223444494196 255.0853207857532 275.6882976698365 241.69952614724372 278.1332311458493 229.07474592014216 280.865130942551 216.16749828113603 282.6715383437519 203.17098731003347 283.4855144423068 189.7047846995279 284.54365417249846 176.60752783316747 284.2440201782424 163.29309326444684 284.41190175601093 150.34890426294282 119.32299327945958 131.65871208074068 131.24354204447127 119.4271237085135 144.7233606197231 116.76436231641082 158.3771063414871 117.49771461977107 172.81182338036558 120.10444812210659 203.4335669303016 118.8589482271039 217.87278465955853 115.02702907304996 232.91907468182228 113.44929822161964 247.31294938041242 115.33190812950377 260.2315024010835 126.97841904273312 188.22273343612926 139.3076198752563 188.63686152102878 152.0386047432662 188.7611761515883 164.79384529821388 188.91978090528823 177.55227075180463 169.7111954193623 196.77551803916765 180.06821536557868 197.67351115261314 190.63790976752796 197.7888044966272 200.8804305939036 196.19666268162234 211.24318198311573 194.77596339340437 132.59280263568814 145.38064500335693 140.33753945021442 139.0346919634389 158.93063754126615 138.179268931794 166.69930149762808 143.11074009078462 159.10171084791972 147.3394655494072 141.46611287868313 148.17486895478004 211.87448685223558 141.32109648552972 218.99927225556576 135.38629032260937 238.14200872137235 135.77969758234636 246.3949750306624 141.10883871630253 238.28261153261315 144.81048398646308 219.98203674843285 144.61163538712265 131.35538709568743 128.44829961151106 145.3210348187722 126.39810197004851 159.1387735034233 127.31037555545663 172.68949902201916 128.8156827288658 203.93057601866838 127.37584575978363 218.4936246992536 124.46089694241138 232.73524799057964 123.07160759666456 247.05706726507663 124.18618894799454 149.47918521005658 136.43638963621842 150.41669906636992 148.76269602136756 149.83792696451576 143.299579440643 228.63413541056707 133.3277985159665 228.90817521054475 145.45268978778392 229.24034281706986 140.37335793597077 177.9319146074116 140.92273666581062 199.7174690978967 140.0987596833736 169.30041816138362 174.84747082055745 210.38993478839757 173.7937801486287 163.225771472235 188.3553133264286 217.35450961702531 186.66968050371196 160.1394551245839 242.89775811526653 168.54736414930602 225.85190441340148 184.2487138192256 217.81560362723332 192.7308484038665 217.93019388434595 200.94263381193832 216.52182806789176 217.13170151835243 223.50943332277305 228.81417226616497 239.95764581058992 219.63434860680593 251.1279338845803 207.24764191007648 257.99722671866533 193.60312998843597 260.80504085767797 180.81085488093902 259.80105624440637 
168.6046248037361 253.04358964232298 164.01974631861418 241.9637881083339 175.805269270773 228.99728727707162 192.6064447100979 225.38489739724236 210.80045024409787 227.89017795717766 224.7944515525944 239.57564391391026 209.9331052046141 247.97277555588565 193.54048661491902 251.31926099718322 177.56903552895676 248.71010475718293 149.84582414347676 143.2984574691387 229.2632446360568 140.37010421860836 154.301461172647 141.28211941410973 153.98117099887304 143.8251636812882 152.89421223464961 146.2853838191459 151.1345283671787 148.25224364347358 148.80387764360444 149.42124922459743 146.23896934961363 149.66158067587995 143.84019972098628 148.7576380963099 141.85285060979106 146.94446913051868 140.55571670009954 144.5909333525596 139.88406480440915 142.26213159782014 139.72922406113173 141.18360134604575 139.81607205312503 138.8517732020959 140.515012731171 136.2065936517422 142.06859816590202 134.099020893086 144.20611444560933 132.65372132086355 146.81577006048656 132.16694641865024 149.30762030185718 132.6089132914571 151.4890601731642 134.0416290964237 153.06288105736593 136.13107669270323 154.0035887610266 138.58439595152163 215.59932696651867 137.60874451806953 216.85701835983605 139.9114879293186 219.56175336875458 143.1107864900333 223.08959800249056 145.3157822728301 227.1133598042387 146.31031579493748 231.25404792720292 146.05235790736765 235.10368937026135 144.48868952422154 238.22739565617837 141.6545095990617 240.1139335953012 137.84722886724492 240.63358006265992 133.6508493395479 239.9784498785409 129.64070899264615 238.04789058947233 125.85688882307153 234.99614525947288 122.83499049133499 231.17294812335706 121.25605822396645 227.08449592576687 120.91765881391294 222.96594299710102 121.87581135122085 219.39622629853466 123.95239154343281 216.78751784202973 127.16779559194907 215.28458818424303 131.02692301680122 214.9035510490429 133.3522078497409
================================================
FILE: data/reference/lms_ref.out
================================================
ref_1.png
91.5305110043 157.054464213 92.3770956017 170.115229042 93.7010275718 182.787981616 95.1548975719 195.332331205 97.3347102602 207.716766281 99.662236308 220.4665801 102.33052797 232.892260979 106.438173842 244.606825996 111.526303747 256.554821386 118.326501636 267.241535072 126.907554178 276.813854107 136.377258498 285.26307926 146.792371645 292.544235232 158.110919394 297.836709658 170.660884454 302.234441612 183.128704907 305.026855756 196.052484086 305.619976224 209.015038563 303.932347537 221.107902017 300.754207398 232.695360635 295.924145465 243.967578069 289.213744481 253.60576167 281.5663184 262.425012464 271.991909527 270.536534297 261.818957787 276.649322878 251.47003397 280.982866424 238.753417493 283.502200732 226.415968603 285.877343058 213.884350894 287.329861916 201.312966922 288.080086341 188.333151939 289.089321689 176.008753975 288.82966812 163.061934768 288.800723064 150.601424089 111.166514647 133.040716434 123.727850015 119.151674836 138.403570358 115.195718003 153.931016799 115.048281714 169.720489595 117.81474832 208.638735958 116.075327861 223.825603912 111.908791938 239.973167757 110.52262007 255.36881296 113.263118909 268.240874597 126.677018495 189.682041913 143.924696725 190.854041755 158.428013381 191.543603835 173.099139858 192.249936075 187.783461235 170.693937826 203.850342545 181.552014432 205.0961696 192.539693343 205.753910309 203.005895575 203.645338586 213.804201864 201.942667541 129.861998337 150.469447406 138.682325087 144.971438056 158.318952147 142.947134846 166.562767713 148.175911133 158.434543813 151.783553428 139.921782751 153.00548942 213.982344278 146.516031156 221.257153182 140.292642262 241.333860112 140.473593503 250.974868308 145.430719473 241.612688503 149.159617782 222.219029473 148.786122779 124.003302311 128.855280973 139.004461521 125.423794795 154.413255848 125.532565126 169.382919275 126.426350997 209.298274471 124.561489414 224.585108448 121.842109834 239.789645816 120.983759394 254.844007729 123.011316913 148.178048442 141.920938125 149.36627429 153.337310947 148.454744756 148.355641177 231.301475374 138.364326844 231.767179017 149.639055154 232.49717489 145.093280533 178.536600869 145.793460055 201.690026469 144.834209602 170.808068504 182.238448616 212.783621495 181.284539841 163.878859691 195.537120954 220.17502113 193.566153645 155.061694709 237.314899133 168.691774343 229.41669664 184.445533833 225.636771942 192.922252371 226.622102972 201.359987418 224.797551625 216.244562439 227.506028901 231.183770684 234.595217305 219.339705183 241.029675481 206.8282294 244.653452066 193.48598518 246.593223075 180.50678043 247.025890411 166.939154463 243.418623657 159.66420291 237.284550105 176.247472 234.802214372 193.01302325 235.541054382 209.885009946 233.825001783 226.476721585 234.730269342 209.673760794 234.29342297 193.075478992 235.183984642 176.303512976 235.572025677 148.470422359 148.355139093 232.508279859 145.09292489 155.279565975 145.477647446 154.609748868 147.675943207 153.171909168 149.577768712 151.204735999 150.877299851 148.911294491 151.503950821 146.567872056 151.488761482 144.388146126 150.70457047 142.543317208 149.188398723 141.370292655 147.169439163 140.856966418 145.147661715 140.800527419 144.549185855 141.191517902 142.546836043 142.207488061 140.417230767 143.951139828 138.896645721 146.047975359 137.929722343 148.444171137 137.632617459 150.732458337 137.968023504 152.812426749 139.128788161 154.338207325 140.912161647 155.171859036 143.155816221 226.565177903 142.22102609 227.396870455 144.280996543 
228.961440761 145.98923668 230.960699023 147.080002841 233.215103849 147.500847529 235.465646862 147.31107009 237.492576996 146.386927482 239.138685693 144.781189341 240.084062174 142.754545992 240.398573086 140.793879684 240.370511298 140.223920684 239.804647089 138.317604988 238.641406911 136.359145098 236.836733786 135.055513762 234.751298886 134.295766975 232.428229204 134.201457416 230.262410384 134.688663192 228.365602855 135.959784698 227.066455763 137.781969604 226.454437003 139.988222896
ref_2.png
92.039137758 156.12329245 92.9050358285 169.485384495 94.4162795053 182.406301682 96.1104208304 195.161424701 98.7067967479 207.76323503 101.650204186 220.637302423 104.656856193 233.139233976 109.003742607 245.093591736 113.935069382 257.263007266 120.449074914 268.455902014 128.651936371 278.822349738 137.573958433 288.281461939 147.617540506 296.645662325 158.961544532 302.47668445 171.597037381 307.133502321 184.26512518 310.22100852 197.36392408 310.812898844 210.115403508 308.7057881 221.723349692 304.902468686 232.433400559 299.163784305 242.376146097 291.152868024 250.247482235 282.220521752 257.279485702 271.442518292 263.809502706 260.366941295 268.315940536 249.563609167 271.33120125 236.662909339 273.438509977 224.500099191 275.749154425 212.326589649 277.225648649 200.04842333 278.089551694 187.299923103 279.124329773 175.181193895 278.652381786 162.544083451 278.440504309 150.367990519 111.907895958 131.531235069 125.275455017 117.599022821 141.013648304 113.249077652 157.206272745 113.356300912 173.625129093 116.363822971 208.025288848 117.11531083 222.629721985 113.376275223 237.988667189 112.278082276 252.366591804 115.164869446 264.708164729 127.833223769 192.266751163 140.459419472 193.995037436 154.739435239 195.437956791 169.13898836 196.781258921 183.608614787 173.455824323 202.774889104 184.130273167 204.044877226 195.136226464 204.092632645 205.483673375 202.652360976 216.199353772 200.823558418 127.959007135 146.38452335 136.576082323 138.250970054 158.359299034 137.18681324 166.537550992 144.585496212 157.568374849 148.991600409 137.497804009 149.872248132 214.242507489 143.822636835 221.288462029 136.104347199 242.420508443 136.275974524 250.738034297 143.624391031 242.152287276 148.082170471 222.516329474 147.196935303 125.761681549 126.665056316 141.819047605 123.172883829 157.944320564 123.833332018 173.46129159 125.367303396 208.675931776 125.775806801 223.493937889 123.183216472 237.993139935 122.390416611 252.240579909 124.406580011 147.362642976 134.611399153 147.696202459 150.589866343 147.427284461 143.886617095 231.859031092 133.272454046 232.088616173 148.400661618 232.608514118 142.302373647 179.8362925 142.274270671 203.070888711 141.780055796 173.602126818 180.473167562 215.098561496 179.091802358 167.039173259 194.238218679 221.816089887 192.044322644 159.552088031 241.879093966 171.939655217 230.56629131 187.823790129 224.704613898 196.280822527 225.320683281 204.407476176 223.261294069 217.308645593 228.39691398 228.456682798 239.236012992 219.483389839 248.10694088 208.77547541 253.650756487 196.003122315 256.115729966 182.941888656 255.641470828 170.063697991 250.148592102 164.18724798 241.650019357 179.885120337 238.392818704 195.939883631 238.82889921 210.514669451 237.464503248 224.1035427 239.103085833 209.988551017 237.922111755 195.836065085 238.750814194 179.870307548 239.237023435 147.444582396 143.885883792 232.62927164 142.301493684 155.127028764 140.537331076 154.454429858 142.837332404 152.976286017 144.840845065 150.930815747 146.215450575 148.534658311 146.888194252 146.08006162 146.885503984 143.793948875 146.082695223 141.852325861 144.511123652 140.601332083 142.405653462 140.03895484 140.291092756 139.965669494 139.657877826 140.353049746 137.562666354 141.394324906 135.326056255 143.204773107 133.721580259 145.391970301 132.697944438 147.895657387 132.370688549 150.291740828 132.70996288 152.476167005 133.910104603 154.090852598 135.76517545 154.986592178 138.107517235 223.322353839 139.639890978 224.129231465 141.815482269 
225.962528748 144.608836529 228.558435579 146.734926501 231.734648617 147.866683781 235.120995672 147.90542935 238.300858089 146.752373692 240.951424244 144.575776476 242.634433954 141.649299848 243.324806591 138.493184533 243.265992152 136.026638803 242.358140849 132.924412989 240.531109003 129.984536177 237.812378784 128.051499592 234.617845994 127.141637251 231.141121329 127.390918972 227.996289538 128.585465568 225.45099488 130.776315405 223.805921696 133.616618182 223.184736412 136.246464014
ref_3.png
89.7446818354 155.927528419 90.7091086051 169.45220918 92.1771452547 182.544666392 93.8190295425 195.494446186 96.315176578 208.265813727 99.1361861374 221.338389755 102.098395321 234.028489192 106.444773034 246.116655146 111.705677341 258.252967158 118.660060954 269.336292739 127.394685486 279.259712586 136.871520014 288.23173066 147.382824167 296.007851938 159.024171274 301.495403945 171.945588057 305.928041318 184.877657529 308.713101069 198.13098132 309.294782126 210.91666423 307.472922348 222.673864878 304.076525209 233.717266677 298.590029705 243.962595248 290.867063882 252.172612606 282.179417886 259.363784266 271.591163705 266.09216319 260.628978059 270.629296692 249.790642218 273.693222139 236.971069668 275.55531311 224.74186416 277.617786921 212.52923176 278.992960992 200.310134731 279.92342743 187.623430919 280.883729231 175.645397515 280.492032806 163.019975017 280.433642765 150.866565497 112.458881803 134.482923002 125.793396564 120.718675865 141.412204212 116.58009044 157.736266482 116.938735036 174.143618313 120.077149335 207.617215666 120.47122235 222.705830998 116.583712441 238.679772245 115.291092678 253.775657689 118.319344335 266.429632531 131.441166189 191.414226473 140.041489615 192.850822523 153.790601065 193.867209985 167.723469328 194.867778998 181.592537517 172.201717619 200.524667806 182.989602302 201.72866697 194.063698079 201.954069529 204.340615551 200.625265206 214.877793463 198.954489056 128.827440635 145.575422565 137.250661365 137.697763146 158.281694321 136.971861385 166.378688858 144.135400398 157.511216356 148.548000686 137.936630177 149.278622465 214.027648544 143.561100774 221.391715781 135.894704045 242.726200152 135.873806118 251.291436373 143.451470985 242.813075433 148.060481283 222.781634755 147.099606436 126.207495596 129.795013955 142.190211571 126.429900213 158.428127855 127.200662026 173.966468532 128.642187902 208.14434046 128.731465173 223.429448025 126.136964343 238.557853799 125.324946357 253.455571586 127.501648322 147.7060987 134.440665618 147.84086471 150.150124609 147.803482172 143.377713126 232.027555847 132.998389028 232.615159742 148.51752631 232.794367298 142.144380431 179.326938943 141.853703357 202.59662599 141.480726012 172.243260843 178.68304969 213.723770212 177.726980694 165.473446815 192.214971599 220.965017207 190.718945231 155.728922492 239.057606299 170.181175172 231.096228175 186.43091055 226.976940276 195.40539851 227.815860474 204.081826624 225.735051841 218.526224478 229.255215261 232.915680778 236.620503481 222.162507919 244.526047894 209.767832475 249.304907615 196.051622376 251.316624098 182.214783646 250.94673925 168.042340229 246.246573353 160.252157314 239.028301437 177.729957901 236.168927085 195.513964855 236.985212136 212.244918867 235.532484803 228.424807435 236.717161803 211.918022534 237.853759569 195.535605688 239.005363972 177.818326321 238.916362828 147.819888173 143.377165012 232.811456882 142.143809478 155.497042729 139.080294968 154.977936922 140.994646125 153.559323194 143.49473855 151.400137905 145.418032555 148.676044822 146.532825019 145.736284972 146.804864582 142.901565346 146.124206394 140.40355375 144.501587927 138.641411515 142.147454709 137.703539473 139.536527092 137.457627258 137.686058117 137.926338285 134.980233171 139.247572242 132.289724122 141.470811945 130.399191762 144.168431101 129.33181933 147.192121197 129.175367631 150.028614396 129.806693626 152.516618781 131.375583064 154.337109022 133.632642765 155.271760892 136.147185127 225.024125179 139.763086823 225.894796831 
142.021768143 227.699222084 144.269099369 230.12902897 145.816665373 232.963430422 146.508283733 235.864918962 146.360705139 238.522957238 145.276177727 240.710690042 143.30607876 242.040227711 140.741212021 242.541665625 138.113677869 242.503864887 136.634015215 241.735070996 134.085873437 240.153703585 131.591568754 237.781859577 129.976368993 235.047504717 129.186283992 232.076311952 129.316345317 229.362318434 130.200859979 227.082765877 131.98309395 225.550964847 134.372682048 224.898900853 136.875222715
ref_4.png
92.6273695423 151.863222686 93.5634544076 165.442673083 94.8552471172 178.650325358 96.1474669783 191.749293298 98.0906383413 204.755684179 100.285175961 218.185819107 102.771752809 231.218781595 106.694038527 243.517218451 111.578585938 256.102201048 118.105511997 267.511376806 126.399356176 278.087505265 135.433629718 287.672366713 145.548504929 296.115766537 156.790602001 302.502752179 169.567545866 307.633094546 182.648571695 310.755146004 196.200805323 311.565606788 209.726313039 309.554460514 222.045251117 305.930018398 233.409945926 300.075564793 244.155133582 291.95486222 252.97666835 283.019263644 260.964792575 272.252452492 268.496273838 260.855603759 273.975458397 249.594996703 278.038081101 236.255777027 280.263208217 223.304503915 282.451046605 210.303387374 283.742767823 197.41335475 284.44342505 183.967466947 285.381033961 171.210624512 285.241893339 157.86537336 285.678359822 144.974773932 113.571695264 137.773282591 126.423070037 126.077276146 141.172084872 122.730020142 156.786673448 122.593110636 172.634015099 124.836543465 207.450243229 123.717710033 222.274178504 120.207772688 238.075031062 119.034652488 253.256669102 120.945992859 266.378505116 132.251581801 190.798224303 140.716603443 192.4233368 158.219556799 193.510106082 176.018613632 194.593526928 193.511670397 171.785536057 203.409870542 182.366082381 206.668729094 193.510218649 208.510021741 204.046347598 205.124224658 214.935880011 201.577604946 131.014695568 145.367011031 139.467183559 138.890210226 158.250907697 137.78525507 166.742471689 143.410065194 158.406075004 146.365828471 140.52095462 147.231028374 212.998014834 141.651829149 220.247853138 134.842563125 239.687697507 134.623601761 249.027437664 140.825322088 239.784804206 143.673153006 221.191753251 143.161764316 126.683321326 133.941381975 141.829788841 130.897089652 157.413555648 131.388575641 172.45413795 132.438648698 207.91546898 131.300051503 222.85468888 128.599313474 237.814246381 127.719136616 252.80098701 129.302383065 148.675890017 136.023315453 149.560589595 147.399107602 149.075840872 142.890987351 229.896886431 132.295537257 230.1985368 143.703648159 231.003209289 139.701959552 179.198639948 141.826319745 201.798114566 140.820250173 171.076364201 183.059199156 214.990800393 181.784780491 164.273786399 195.597389697 222.433535348 193.198757019 150.300478792 234.949007132 166.707377679 229.025010333 185.00149523 227.935187017 194.085939754 228.77216209 202.846508575 226.960413473 219.437476402 226.808102856 236.225997626 231.52430752 224.084792536 240.17210242 209.752520443 244.824966141 194.238170627 247.014452882 179.226720943 247.450978793 163.6234829 243.117198749 154.322548581 235.645769488 174.051832814 235.379514895 193.84864258 236.667679431 213.380385096 233.964877996 232.149237377 232.054578053 213.174011753 234.392647807 193.899421574 236.315921248 174.046701477 236.207236101 149.091263629 142.890591717 231.003868388 139.702585682 154.232787412 139.498496855 153.559249613 141.367723554 152.229615328 142.978560688 150.492771132 144.056476928 148.496682712 144.553950002 146.471422963 144.490799448 144.61066478 143.746310258 143.063676723 142.373883662 142.135785891 140.594608659 141.784370627 138.852631608 141.815688391 138.368906882 142.252695627 136.604132584 143.200795401 134.784449975 144.76419749 133.538005825 146.595971128 132.776601117 148.674070754 132.59339881 150.63604423 132.921818296 152.388039139 133.963608833 153.620955407 135.526071872 154.266246094 137.49393295 226.065859144 136.26361424 226.868195419 
138.262170727 228.342357179 139.92904768 230.222562721 141.030566979 232.367304026 141.489978956 234.524264239 141.322758726 236.433610911 140.376741347 237.931692714 138.764798283 238.743687983 136.778202108 238.970360889 134.885714623 238.876300783 134.227084712 238.297546768 132.371334475 237.177941243 130.488304076 235.450083136 129.234334949 233.449811878 128.515930955 231.227253172 128.482277069 229.195261323 129.034043292 227.487947055 130.335622184 226.376902766 132.120259444 225.883529367 134.119259343
ref_5.png
95.8641666481 153.876216957 96.6446302127 167.22924384 97.9311325716 180.266378675 99.184959939 193.083168223 101.027597231 205.919003828 103.289807178 219.021574555 105.748259956 231.850852856 109.997457645 243.874168633 115.064778665 256.019911195 121.866812215 266.937287877 130.243278876 277.246220556 139.086598138 286.677196779 148.628167036 295.190484457 159.245096382 301.855525478 171.270807106 307.693979843 183.679301808 311.556503455 197.019283776 312.547509766 210.208056594 310.222936374 222.167477955 305.845679539 232.959303274 299.539884924 243.333168051 291.22890252 251.895625423 282.104000409 259.899547789 271.605086811 267.550411672 260.67534063 273.404815111 250.031511959 277.612279363 237.086154859 280.224974721 224.611605969 282.351300904 211.899376581 283.388452301 199.238960208 283.853415984 186.1044991 284.81094413 173.540800443 284.680510281 160.42545556 284.984174898 147.773904598 113.702919061 133.934425806 125.923380435 121.087837609 140.381271813 117.228598734 155.527007262 117.096417177 171.10423348 119.939581571 209.024718207 119.116092608 223.467736291 114.959349376 238.90147459 113.643068334 253.66274328 116.095616255 266.264419508 128.3682792 189.927230613 142.786284433 191.031431781 159.91561173 191.716484945 177.299107727 192.396402803 194.505825337 170.397055103 203.027801289 180.851436938 205.830811793 191.881432183 208.401207438 202.330991514 204.306038302 213.26702099 200.924225841 131.375682053 147.069434029 140.453815794 140.747815903 159.785033195 140.092646074 167.499734139 146.150652377 159.288103199 149.060921564 141.104433377 149.332161205 212.243008654 144.989128685 219.401922188 138.089053946 239.431172281 137.201448633 248.880813497 142.819445264 239.623307623 146.601572588 220.520097369 146.669603505 126.179256591 128.811103972 140.988925759 125.438628592 156.052577039 126.045256262 170.724849743 127.519761685 209.45863941 126.541451262 224.10024653 123.376039483 238.649646206 122.200545878 253.326614294 124.112638099 149.962146842 137.966419304 150.317830278 149.909939276 149.71119027 145.110796371 229.253804711 135.16935239 229.818628112 147.046677606 230.519357909 142.504417332 179.128902539 144.294270439 200.958214909 143.528436518 170.009334396 183.169208698 212.880929723 181.855934049 162.584727006 195.337426237 220.839542107 192.709053035 147.281824005 232.261709769 164.002657088 223.554253707 183.37014552 221.50797366 192.259443622 222.355737203 200.802915419 220.478495682 219.03648697 220.316533677 237.226191351 227.304014643 227.693805169 244.124502154 212.897176317 255.947635359 193.652382038 260.711780201 174.873795913 258.887172586 158.517661986 247.971968875 151.041529776 232.921207339 171.196964243 227.007403955 192.011174916 227.081699422 213.015536573 224.988189242 233.485042643 228.134080153 216.193361049 244.952965166 193.416850404 251.289293202 169.881928882 247.149151994 149.727135814 145.110213144 230.543964925 142.504182588 152.451951943 142.39464313 151.824151025 144.383527388 150.476923652 146.092693718 148.688914 147.264463788 146.621652379 147.822409065 144.506891575 147.809279979 142.551471213 147.077800954 140.910884152 145.677503436 139.903731134 143.842890243 139.482997763 142.034728686 139.470055027 141.498981138 139.860599833 139.662476694 140.810033368 137.736340433 142.40855364 136.385886219 144.306882875 135.520104602 146.474911233 135.279807056 148.526122102 135.592643636 150.374028544 136.658702515 151.697185583 138.274853377 152.412463839 140.296443166 227.934250089 139.199356545 228.741799454 
141.124431296 230.251117618 142.71913679 232.144641023 143.740948837 234.257583507 144.122713168 236.372046431 143.927618466 238.271866802 143.029848555 239.799014676 141.49239821 240.651000461 139.575813795 240.911201815 137.740750384 240.855348809 137.195927746 240.289193514 135.377118292 239.162060095 133.536248939 237.43771659 132.337935585 235.464632569 131.65101154 233.278206253 131.606574092 231.256229646 132.09300793 229.501001958 133.306157299 228.323876427 135.027729975 227.783515958 137.097242098
================================================
FILE: fusion/README.md
================================================
Overview:
This module fuses texture from a reference image into the result generated by Face2Face.
Inputs:
generated image img_gen
106 landmark points of the generated image + 40 eye points (optional)
reference image img_ref
106 landmark points of the reference image + 40 eye points (optional)
Output:
the fused image
Fusion options:
1. alpha blending
2. Poisson blending
3. NCC mask net
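For illustration, option 1 (alpha blending) amounts to the per-channel mix used by `fusion()` in `fusion/affineFace.py`; a minimal sketch, assuming `mask` is a blurred skin mask in [0, 1] and `ratio` weights the generated image:
```python
import numpy as np

def alpha_blend(img_gen, img_ref, mask, ratio=0.2):
    # Outside the mask, keep img_gen; inside, mix generated and reference pixels.
    mixed = img_gen * ratio + img_ref * (1 - ratio)
    m = mask[..., None]  # broadcast the single-channel mask over the color channels
    return img_gen * (1 - m) + m * mixed
```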
================================================
FILE: fusion/affineFace.py
================================================
from fusion.points2heatmap import *
from fusion.calcAffine import *
from fusion.warper import warping as warp
import matplotlib.pyplot as plt
from fusion.parts2lms import parts2lms
import time
from tqdm import tqdm
import random
import multiprocessing
import sys
def gammaTrans(img, gamma):
gamma_table = [np.power(x/255.0, gamma)*255.0 for x in range(256)]
gamma_table = np.round(np.array(gamma_table)).astype(np.uint8)
return cv2.LUT(img, gamma_table)
def erodeAndBlur(img,kernelSize=21,blurSize=21):
#img : ndarray float32
kernel = np.ones((int(kernelSize), int(kernelSize)), np.uint8)
res = cv2.erode(img,kernel)
res = cv2.GaussianBlur(res, (blurSize, blurSize), math.sqrt(blurSize))
return res
def affineface(img,src_pt,dst_pt,heatmapSize=256,needImg=True):
#src/dst_pt[ndarray] : [...,[x,y],...] in [0.0,1.0],with gaze
#naive mode: align 5 parts
curves_src,_ = points2curves(src_pt.copy())
pts_fivesense_src = np.vstack(curves_src[1:])
curves_dst,_ = points2curves(dst_pt.copy())
pts_fivesense_dst = np.vstack(curves_dst[1:])
affine_mat = calAffine(pts_fivesense_src,pts_fivesense_dst)
pt_aligned = affinePts(affine_mat,src_pt*255.0)/255.0
if needImg:
img_aligned = affineImg(img,affine_mat)
return pt_aligned,img_aligned
else:
return pt_aligned
def affineface_parts(img,src_pt,dst_pt):
curves_src,_ = points2curves(src_pt.copy())
curves_dst,_ = points2curves(dst_pt.copy())#[0,255]
parts_src = curves2parts(curves_src)
parts_dst = curves2parts(curves_dst) #[0,255]
partsList = []
for i in range(len(parts_src)-2):
affine_mat = calAffine(parts_src[i],parts_dst[i])
parts_aligned = affinePts(affine_mat,parts_src[i]) #[0,255]
partsList.append(parts_aligned)
partsList.append(parts_src[-2])
partsList.append(parts_src[-1])
'''
A = []
B = []
for i in range(len(parts_src)):
A.append(parts_src[i])
B.append(partsList[i])
A = np.vstack(A)
B = np.vstack(B)
res = warp(img,A,B)
'''
lms = parts2lms(partsList)
#bound
lms[:33] = dst_pt[:33]*256
res = warp(img,src_pt[:106]*256,lms[:106])
return lms/255.0,res
def lightEye(img_ref,lms_ref,img_gen,lms_gen,ratio=0.1):
#get curves
curves_ref,_ = points2curves(lms_ref.copy())
curves_gen,_ = points2curves(lms_gen.copy())
parts_ref = curves2parts(curves_ref)
parts_gen = curves2parts(curves_gen) #[0,255]
#get rois
gaze_ref = curves2gaze(curves_ref)
gaze_gen = curves2gaze(curves_gen)
#img_gazeL = np.dot(gaze_ref[0], img_ref)
img_gazeL = multi(img_ref,gaze_ref[0])
#img_gazeR = np.dot(gaze_ref[1] , img_ref)
img_gazeR = multi(img_ref,gaze_ref[1])
affine_mat = calAffine(parts_ref[-2],parts_gen[-2])
img_gazeL_affined = affineImg(img_gazeL,affine_mat)
affine_mat = calAffine(parts_ref[-1],parts_gen[-1])
img_gazeR_affined = affineImg(img_gazeR,affine_mat)
img_ref = img_gazeL_affined + img_gazeR_affined
mask = gaze_gen[0] + gaze_gen[1]
mask = erodeAndBlur(mask,5,5)
R = img_gen[:,:,0] * (1-mask) + mask* (img_gen[:,:,0]*ratio + img_ref[:,:,0]*(1-ratio))
G = img_gen[:,:,1] * (1-mask) + mask* (img_gen[:,:,1]*ratio + img_ref[:,:,1]*(1-ratio))
B = img_gen[:,:,2] * (1-mask) + mask* (img_gen[:,:,2]*ratio + img_ref[:,:,2]*(1-ratio))
res = np.stack([R,G,B]).transpose((1,2,0))
seg = mask
seg = seg * 127
return res,seg,img_ref
def multi(img,mask):
R = img[:,:,0] * mask
G = img[:,:,1] * mask
B = img[:,:,2] * mask
res = np.stack([R,G,B]).transpose((1,2,0))
return res
def fusion(img_ref,lms_ref,img_gen,lms_gen,ratio=0.2):
#img*: ndarray(np.uint8) [0,255]
#lms*: ndarray , [...,[x,y],...] in [0,1]
#ratio: weight of gen
#--------------------------------------------
#get curves
curves_ref,_ = points2curves(lms_ref.copy())
curves_gen,_ = points2curves(lms_gen.copy())
#get rois
roi_ref = curves2segments(curves_ref)
roi_gen = curves2segments(curves_gen)
#get seg
seg_ref = roi_ref.sum(0)
seg_gen = roi_gen.sum(0)
seg_ref = seg_ref / seg_ref.max() * 255
seg_gen = seg_gen / seg_gen.max() * 255
#get skin mask
skin_src = roi_ref[0] - roi_ref[2:].max(0)
skin_gen = roi_gen[0] - roi_gen[2:].max(0)
#blur edge
skin_src = erodeAndBlur(skin_src,7,7)
skin_gen = erodeAndBlur(skin_gen,7,7)
#fusion
skin = skin_src * skin_gen
R = img_gen[:,:,0] * (1-skin) + skin * (img_gen[:,:,0]*ratio + img_ref[:,:,0]*(1-ratio))
G = img_gen[:,:,1] * (1-skin) + skin * (img_gen[:,:,1]*ratio + img_ref[:,:,1]*(1-ratio))
B = img_gen[:,:,2] * (1-skin) + skin * (img_gen[:,:,2]*ratio + img_ref[:,:,2]*(1-ratio))
res = np.stack([R,G,B]).transpose((1,2,0))
return res,seg_ref,seg_gen
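# Example usage of fusion() (a sketch with hypothetical inputs: images are uint8
# BGR arrays from cv2.imread, landmarks are (N, 2) arrays normalized to [0, 1]):
#   fused, seg_ref, seg_gen = fusion(img_ref, lms_ref, img_gen, lms_gen, ratio=0.2)
#   cv2.imwrite('fused.jpg', fused)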
def loaddata(head,path_lms,flag=256,num = 50000):
#head: directory prefix prepended to each image name
#return res: list of (path, lms) tuples with lms normalized to [0, 1]
fin = open(path_lms,'r')
data = fin.read().splitlines()
res = []
for i in tqdm(range(min(len(data)//2,num))):
name = data[2*i]
path = os.path.join(head,name)
lms = list(map(float,data[2*i+1].split()))
if flag==256:
lms = np.array(lms).reshape(-1,2) / 255.0
else:
lms = (np.array(lms).reshape(-1,2)-64) / 255.0
res.append((path,lms))
return res
def gray2rgb(img):
res = np.stack([img,img,img]).transpose((1,2,0))
return res.astype(np.uint8)
def process(index, album_ref, album_gen, album_pose):
# 30ms
img_gen = cv2.imread(album_gen[index][0])
lms_gen = album_gen[index][1]
img_ref = cv2.imread(album_ref[index // 100][0])[64:64 + 256, 64:64 + 256, :]
lms_ref = album_ref[index // 100][1]
img_pose = cv2.imread(album_pose[index % 100][0])[64:64 + 256, 64:64 + 256, :]
lms_pose = album_pose[index % 100][1]
# affine
# 4ms
lms_ref_, img_ref_ = affineface(img_ref, lms_ref, lms_gen)
# 200ms
lms_ref_parts, img_ref_parts = affineface_parts(img_ref, lms_ref, lms_gen)
# fusion
# fuse_all,seg_ref_,seg_gen = fusion(img_ref_,lms_ref_,img_gen,lms_gen,0.1)
fuse_parts, seg_ref_parts, seg_gen = fusion(img_ref_parts, lms_ref_parts, img_gen, lms_gen, 0.1)
fuse_eye, mask_eye, img_eye = lightEye(img_ref, lms_ref, fuse_parts, lms_gen, 0.1)
res = np.hstack([img_ref, img_pose, img_gen, fuse_eye])
cv2.imwrite('proposed_wild/fuse/%d.jpg' % (index), fuse_eye)
================================================
FILE: fusion/calcAffine.py
================================================
# -*- coding: utf-8 -*-
"""
Created on Fri Dec 29 13:43:03 2017
"""
import numpy as np
import cv2
#estimate a similarity transform between two point sets via least squares
#src_p[input] -- np.array([[x,y],...])
#dst_p[input] -- np.array([[x,y],...])
#affine_mat[output] -- np.array() | 2x3 similarity (affine) matrix
def calAffine(src_p, dst_p):
p_N = len(src_p)
U = np.mat(list(dst_p[:,0]) + list(dst_p[:,1]))
xx_src,yy_src = list(src_p[:,0]),list(src_p[:,1])
X = np.mat(np.stack([xx_src + yy_src, yy_src + [-ii for ii in xx_src], \
[1 for ii in range(p_N)] + [0 for ii in range(p_N)], \
[0 for ii in range(p_N)] + [1 for ii in range(p_N)]], axis=1))
result = np.linalg.pinv(X) * U.T
affine_mat = np.zeros([2, 3])
affine_mat[0][0] = result[0][0]
affine_mat[0][1] = result[1][0]
affine_mat[0][2] = result[2][0]
affine_mat[1][0] = -result[1][0]
affine_mat[1][1] = result[0][0]
affine_mat[1][2] = result[3][0]
return affine_mat
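# Worked note (a sketch, not from the original sources): calAffine solves the
# least-squares system above for a similarity transform [[a, b, tx], [-b, a, ty]]
# mapping src_p onto dst_p; the unknowns (a, b, tx, ty) come from the
# pseudo-inverse solve. Example with hypothetical points (scale 2, shift (1, 1)):
#   M = calAffine(np.array([[0., 0.], [1., 0.], [0., 1.]]),
#                 np.array([[1., 1.], [3., 1.], [1., 3.]]))
#   # M is the 2x3 matrix accepted by cv2.warpAffine / affineImg below.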
def affinePts(affine_mat,pt):
src_align = pt.T
new_align = np.mat(affine_mat[:2, :2]) * np.mat(src_align) + np.reshape(affine_mat[:, 2], (-1, 1))
pt_align = np.array(np.reshape(new_align.T, -1))[0].reshape(-1,2)
return pt_align
#warp an image with a 2x3 affine matrix
#img[input] -- np.array()
#TransMat[input] -- 2x3 affine matrix, e.g. from calAffine
#img_align[output] -- np.array() | aligned image of size dsize x dsize
def affineImg(img,TransMat,dsize = 256):
img_align = cv2.warpAffine(img, TransMat, (dsize, dsize), borderValue=(155, 155, 155) )
return img_align
# NOTE: the demo below calls affineList, which is not defined in this file; it is
# kept from the original batch-alignment script and will not run as-is.
if __name__ == '__main__':
path_src = '/media/heyue/8d1c3fac-68d3-4428-af91-bc478fbdd541/Project/Face2Face/detectface/samples/common/output/landmarks.txt'
output_pt = 'lms/lms.txt'
output_img = 'imgs'
affineList(path_src, output_pt,output_img,'meanpose384.txt',k=2,head='/media/heyue/8d1c3fac-68d3-4428-af91-bc478fbdd541/Project/Face2Face/Data/test')
'''
path_src = 'alignedPoints_256.txt'
output_pt = 'output/AU_points.txt'
output_img = 'output/AU'
head = '/media/heyue/8d1c3fac-68d3-4428-af91-bc478fbdd541/Project/Face2Face/net/GANimation/dataset_emo'
affineList(path_src,output_pt,output_img,'meanpose384.txt',k=2,head = head)
print('done')
'''
================================================
FILE: fusion/parts2lms.py
================================================
import numpy as np
def parts2lms(parts):
bound,browL,browR,eyeL,eyeR,nose,lipU,lipD,gazeL,gazeR = parts
res = list()
res.append(bound) #0-32
res.append(browL[:5]) #33- 37
res.append(browR[:5]) #38-42
res.append(nose[:4]) #43,44,45,46
res.append(nose[6:6+5]) #47,48,49,50,51
res.append(eyeL[:2]) #52,53
res.append(eyeL[3:3+2]) #54,55
res.append(eyeL[6]) #56
res.append(eyeL[8]) #57
res.append(eyeR[:2]) #58,59
res.append(eyeR[3:3+2]) #60,61
res.append(eyeR[6]) #62
res.append(eyeR[8]) #63
res.append(browL[6:6+4])#64,65,66,67
res.append(browR[5:5+4])#68,69,70,71
res.append(eyeL[2]) #72
res.append(eyeL[7]) #73
res.append((eyeL[2]+eyeL[7])/2) #74 useless
res.append(eyeR[2]) #75
res.append(eyeR[7]) #76
res.append((eyeR[2]+eyeR[7])/2) #77 useless
res.append((nose[0]+eyeL[4])/2) #78
res.append((nose[0]+eyeR[0])/2) #79
res.append(nose[4]) #80
res.append(nose[12]) #81
res.append(nose[5]) #82
res.append(nose[11]) #83
res.append(lipU[:7]) #84,85,86,87,88,89,90
res.append(lipD[10]) #91
res.append(lipD[9]) #92
res.append(lipD[8]) #93
res.append(lipD[7]) #94
res.append(lipD[6]) #95
res.append(lipU[7:7+5]) #96,97,98,99,100
res.append(lipD[3]) #101
res.append(lipD[2]) #102
res.append(lipD[1]) #103
res.append((eyeL[2]+eyeL[7])/2) #104
res.append((eyeR[2]+eyeR[7])/2) #105
res.append(gazeL)
res.append(gazeR)
res = np.vstack(res)
return res
================================================
FILE: fusion/points2heatmap.py
================================================
import numpy as np
import cv2
import os
import math
def curve_interp(points, heatmapSize=256, sigma=3):
sigma = max(1,(sigma // 2)*2 + 1)
img = np.zeros((heatmapSize, heatmapSize), np.uint8)
for ii in range(1, points.shape[0]):
cv2.line(img, tuple(points[ii-1].astype(np.int32)),tuple(points[ii].astype(np.int32)), (255), sigma)
img = cv2.GaussianBlur(img, (sigma, sigma), sigma)
return img.astype(np.float64)/255.0
def curve_fill(points, heatmapSize=256, sigma=3, erode=False):
sigma = max(1,(sigma // 2)*2 + 1)
points = points.astype(np.int32)
canvas = np.zeros([heatmapSize, heatmapSize])
cv2.fillPoly(canvas,np.array([points]),255)
'''
kernel = np.ones((sigma, sigma), np.uint8)
if erode:
erode_kernel = np.ones((int(0.5*sigma), int(0.5*sigma)), np.uint8)
canvas = cv2.erode(canvas, erode_kernel)
else:
canvas = cv2.dilate(canvas, kernel)
'''
canvas = cv2.GaussianBlur(canvas, (sigma,sigma), sigma)
return canvas.astype(np.float64)/255.0
def curves2heatmap(curves,heatmapSize=256,sigma=3,flag='line'):
#-----------------------input--------------------------
# curves [list of ndarray] : points coordinate in [0,heatmapSize]
# heatmapSize[int]: the size of the generated heatmap
# sigma[float]: boundary blur width (Gaussian kernel size)
# flag[string]: 'line' or 'segment'
#-----------------------output----------------
# heatmap[ndarray,float64]: [D,D,num of curves],range in (0.0,1.0)
#=============================================
heatmap = np.zeros((heatmapSize, heatmapSize, len(curves)),np.float64)
for i in range(len(curves)):
if flag == 'line':
heatmap[:, :, i] = curve_interp(curves[i], heatmapSize, sigma)
else:
heatmap[:, :, i] = curve_fill(curves[i], heatmapSize, sigma)
return heatmap
def curves2segments(curves,heatmapSize=256,sigma=3):
#res[ndarray]: range in (0,1) [Channel,Size,Size]
face = curve_fill(np.vstack([curves[0],curves[2][::-1],curves[1][::-1]]),heatmapSize,sigma)
browL = curve_fill(np.vstack([curves[1],curves[13][::-1]]),heatmapSize,sigma)
browR = curve_fill(np.vstack([curves[2],curves[14][::-1]]),heatmapSize,sigma)
eyeL = curve_fill(np.vstack([curves[5],curves[6]]),heatmapSize,sigma)
eyeR = curve_fill(np.vstack([curves[7],curves[8]]),heatmapSize,sigma)
gazeL = curve_fill(curves[15],heatmapSize,sigma)
gazeR = curve_fill(curves[16],heatmapSize,sigma)
#intersect eye and gaze
gazeL = gazeL * eyeL
gazeR = gazeR * eyeR
#2 to 1
eye = np.max([eyeL,eyeR],axis=0)
gaze = np.max([gazeL,gazeR],axis=0)
brow = np.max([browL,browR],axis=0)
nose = curve_fill(np.vstack([curves[3][0:1],curves[4]]),heatmapSize,sigma)
lipU= curve_fill(np.vstack([curves[9],curves[10][::-1]]),heatmapSize,sigma)
lipD= curve_fill(np.vstack([curves[11],curves[12][::-1]]),heatmapSize,sigma)
tooth = curve_fill(np.vstack([curves[10],curves[11][::-1]]),heatmapSize,sigma)
return np.stack([face,brow,eye,gaze,nose,lipU,lipD,tooth])
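# Example usage (a sketch): given curves from points2curves, this yields an
# (8, heatmapSize, heatmapSize) stack ordered [face, brow, eye, gaze, nose,
# lipU, lipD, tooth], each channel in (0, 1):
#   segs = curves2segments(curves)
#   skin = segs[0] - segs[2:].max(0)   # skin mask, as used in fusion()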
def curves2gaze(curves,heatmapSize=256,sigma=3):
eyeL = curve_fill(np.vstack([curves[5],curves[6]]),heatmapSize,sigma)
eyeR = curve_fill(np.vstack([curves[7],curves[8]]),heatmapSize,sigma)
gazeL = curve_fill(curves[15],heatmapSize,sigma)
gazeR = curve_fill(curves[16],heatmapSize,sigma)
#intersect eye and gaze
gazeL = gazeL * eyeL
gazeR = gazeR * eyeR
return np.stack([gazeL,gazeR])
def curves2parts(curves):
bound = curves[0]
browL = np.vstack([curves[1],curves[13]])
browR = np.vstack([curves[2],curves[14]])
eyeL = np.vstack([curves[5],curves[6]])
eyeR = np.vstack([curves[7],curves[8]])
gazeL = curves[15]
gazeR = curves[16]
nose = np.vstack([curves[3],curves[4]])
lipU= np.vstack([curves[9],curves[10]])
lipD= np.vstack([curves[11],curves[12]])
return [bound,browL,browR,eyeL,eyeR,nose,lipU,lipD,gazeL,gazeR]
def points2curves(points, heatmapSize=256, sigma=1, heatmap_num=17):
#-----------------------input--------------------------
# points[ndarray]: [...,[x,y],...],range in (0.0,1.0)
# heatmapSize[int]: the size of the generated heatmap
# heatmap_num[int]: number of heatmap channels
#-----------------------output----------------
# curves [list of ndarray] : points coordinate in [0,heatmapSize]
# =====================================================
# resize points (0-1) to heatmapSize(0-D)
for i in range(points.shape[0]):
points[i] *= (float(heatmapSize))
# curve define
curves = [0]*heatmap_num
curves[0] = np.zeros((33, 2)) # contour
curves[1] = np.zeros((5, 2)) # left top eyebrow
curves[2] = np.zeros((5, 2)) # right top eyebrow
curves[3] = np.zeros((4, 2)) # nose bridge
curves[4] = np.zeros((9, 2)) # nose tip
curves[5] = np.zeros((5, 2)) # left top eye
curves[6] = np.zeros((5, 2)) # left bottom eye
curves[7] = np.zeros((5, 2)) # right top eye
curves[8] = np.zeros((5, 2)) # right bottom eye
curves[9] = np.zeros((7, 2)) # up up lip
curves[10] = np.zeros((5, 2)) # up bottom lip
curves[11] = np.zeros((5, 2)) # bottom up lip
curves[12] = np.zeros((7, 2)) # bottom bottom lip
curves[13] = np.zeros((5, 2)) # left bottom eyebrow
    curves[14] = np.zeros((5, 2)) # right bottom eyebrow
if heatmap_num == 17:
curves[15] = np.zeros((20, 2)) # left gaze
curves[16] = np.zeros((20, 2)) # right gaze
# assignment proccess
# countour
for i in range(33):
curves[0][i] = points[i]
for i in range(5):
# left top eyebrow
curves[1][i] = points[i+33]
# right top eyebrow
curves[2][i] = points[i+38]
# nose bridge
for i in range(4):
curves[3][i] = points[i+43]
# nose tip
curves[4][0] = points[80]
curves[4][1] = points[82]
for i in range(5):
curves[4][i+2] = points[i+47]
curves[4][7] = points[83]
curves[4][8] = points[81]
# left top eye
curves[5][0] = points[52]
curves[5][1] = points[53]
curves[5][2] = points[72]
curves[5][3] = points[54]
curves[5][4] = points[55]
# left bottom eye
curves[6][0] = points[55]
curves[6][1] = points[56]
curves[6][2] = points[73]
curves[6][3] = points[57]
curves[6][4] = points[52]
# right top eye
curves[7][0] = points[58]
curves[7][1] = points[59]
curves[7][2] = points[75]
curves[7][3] = points[60]
curves[7][4] = points[61]
# right bottom eye
curves[8][0] = points[61]
curves[8][1] = points[62]
curves[8][2] = points[76]
curves[8][3] = points[63]
curves[8][4] = points[58]
# up up lip
for i in range(7):
curves[9][i] = points[i+84]
# up bottom lip
for i in range(5):
curves[10][i] = points[i+96]
# bottom up lip
curves[11][0] = points[96]
curves[11][1] = points[103]
curves[11][2] = points[102]
curves[11][3] = points[101]
curves[11][4] = points[100]
# bottom bottom lip
curves[12][0] = points[84]
curves[12][1] = points[95]
curves[12][2] = points[94]
curves[12][3] = points[93]
curves[12][4] = points[92]
curves[12][5] = points[91]
curves[12][6] = points[90]
# left bottom eyebrow
curves[13][0] = points[33]
curves[13][1] = points[64]
curves[13][2] = points[65]
curves[13][3] = points[66]
curves[13][4] = points[67]
# right bottom eyebrow
curves[14][0] = points[68]
curves[14][1] = points[69]
curves[14][2] = points[70]
curves[14][3] = points[71]
curves[14][4] = points[42]
if heatmap_num == 17:
# left gaze
for i in range(20):
curves[15][i] = points[106+i]
# right gaze
for i in range(20):
curves[16][i] = points[106+20+i]
return curves,None
def distance(p1, p2):
return math.sqrt((p1[0]-p2[0])*(p1[0]-p2[0])+(p1[1]-p2[1])*(p1[1]-p2[1]))
def curve_fitting(points, heatmap_size, sigma):
curve_tmp = curve_interp(points, heatmap_size, sigma)
return curve_tmp
if __name__ == '__main__':
import matplotlib.pyplot as plt
res = list()
path = '../2019CVPR_reconstruct/data/celebHQ/lms.txt'
head = '../2019CVPR_reconstruct/data/celebHQ/align_384'
with open(path, 'r') as fin:
data = fin.read().splitlines()
N = len(data)//2
for i in range(N):
imgPath = os.path.join(head, data[2*i+0])
landmarks = list(map(float, data[2*i+1].split()))
res.append((imgPath, landmarks))
for path,landmark in res:
points = (np.array(landmark).reshape(-1,2).astype(np.float32)-64)/256.0
        curves, _ = points2curves(points)  # points2curves returns (curves, None)
segments = curves2segments(curves)
img= np.sum(segments,axis=0)
plt.figure()
plt.imshow(img)
for i in range(len(points)):
            plt.plot(points[i][0], points[i][1], '.')
if i<=106:
plt.text(points[i][0], (points[i][1]), str(i), fontsize=5)
else:
plt.text(points[i][0], (points[i][1]), str(i), fontsize=3)
plt.show()
================================================
FILE: fusion/test.py
================================================
import multiprocessing
import time

from tqdm import tqdm

# module-level data is inherited by the worker processes (fork start method);
# renamed from `list` to avoid shadowing the builtin
values = [1, 2, 3, 4]
data = []


def func(i):
    msg = "hello %d" % values[i]
    print("msg:", msg)
    time.sleep(3)
    print("end")


if __name__ == "__main__":
    pool = multiprocessing.Pool(processes=4)
    for i in tqdm(range(4)):
        data.append(i)
    pool.map(func, data)
    # pool.apply_async(func, (i, values))  # keeps the number of live workers at
    # `processes`; a new task is scheduled as soon as one finishes
    print("Mark~ Mark~ Mark~~~~~~~~~~~~~~~~~~~~~~")
    pool.close()
    pool.join()
    print("Sub-process(es) done.")
================================================
FILE: fusion/warper.py
================================================
import numpy as np
import scipy.spatial as spatial
from builtins import range
import cv2
from matplotlib import pyplot as plt
def warping(img, src_bound, dst_bound, size=(256, 256)):
d = 254
bound = np.array([0, 0, 0, d, d, 0, d, d]).reshape(-1, 2)
src_bound = np.vstack([src_bound, bound]).astype(np.int32)
dst_bound = np.vstack([dst_bound, bound]).astype(np.int32)
src_bound[src_bound > d] = d
dst_bound[dst_bound > d] = d
src_bound = src_bound.astype(np.int32)
dst_bound = dst_bound.astype(np.int32)
res = warp_image(img, src_bound, dst_bound, size)
return res
def bilinear_interpolate(img, coords):
""" Interpolates over every image channel
http://en.wikipedia.org/wiki/Bilinear_interpolation
:param img: max 3 channel image
:param coords: 2 x _m_ array. 1st row = xcoords, 2nd row = ycoords
:returns: array of interpolated pixels with same shape as coords
"""
int_coords = np.int32(coords)
x0, y0 = int_coords
dx, dy = coords - int_coords
# 4 Neighour pixels
q11 = img[y0, x0]
q21 = img[y0, x0+1]
q12 = img[y0+1, x0]
q22 = img[y0+1, x0+1]
btm = q21.T * dx + q11.T * (1 - dx)
top = q22.T * dx + q12.T * (1 - dx)
inter_pixel = top * dy + btm * (1 - dy)
return inter_pixel.T
def grid_coordinates(points):
""" x,y grid coordinates within the ROI of supplied points
:param points: points to generate grid coordinates
:returns: array of (x, y) coordinates
"""
xmin = np.min(points[:, 0])
xmax = np.max(points[:, 0]) + 1
ymin = np.min(points[:, 1])
ymax = np.max(points[:, 1]) + 1
return np.asarray([(x, y) for y in range(ymin, ymax)
for x in range(xmin, xmax)], np.uint32)
def process_warp(src_img, result_img, tri_affines, dst_points, delaunay):
"""
Warp each triangle from the src_image only within the
ROI of the destination image (points in dst_points).
"""
roi_coords = grid_coordinates(dst_points)
# indices to vertices. -1 if pixel is not in any triangle
roi_tri_indices = delaunay.find_simplex(roi_coords)
for simplex_index in range(len(delaunay.simplices)):
coords = roi_coords[roi_tri_indices == simplex_index]
num_coords = len(coords)
out_coords = np.dot(tri_affines[simplex_index],
np.vstack((coords.T, np.ones(num_coords))))
x, y = coords.T
result_img[y, x] = bilinear_interpolate(src_img, out_coords)
return None
def triangular_affine_matrices(vertices, src_points, dest_points):
"""
Calculate the affine transformation matrix for each
triangle (x,y) vertex from dest_points to src_points
:param vertices: array of triplet indices to corners of triangle
:param src_points: array of [x, y] points to landmarks for source image
:param dest_points: array of [x, y] points to landmarks for destination image
    :yields: a 2 x 3 affine transformation matrix per triangle
"""
ones = [1, 1, 1]
for tri_indices in vertices:
src_tri = np.vstack((src_points[tri_indices, :].T, ones))
dst_tri = np.vstack((dest_points[tri_indices, :].T, ones))
mat = np.dot(src_tri, np.linalg.inv(dst_tri))[:2, :]
yield mat
def warp_image(src_img, src_points, dest_points, dest_shape, dtype=np.uint8):
# Resultant image will not have an alpha channel
num_chans = 3
src_img = src_img[:, :, :3]
rows, cols = dest_shape[:2]
result_img = np.zeros((rows, cols, num_chans), dtype)
delaunay = spatial.Delaunay(dest_points)
tri_affines = np.asarray(list(triangular_affine_matrices(
delaunay.simplices, src_points, dest_points)))
process_warp(src_img, result_img, tri_affines, dest_points, delaunay)
return result_img
if __name__ == "__main__":
pass
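    # Hedged usage sketch (not part of the original file; the paths below are
    # hypothetical placeholders): piecewise-affine warp driven by landmarks.
    #   img = cv2.imread('face.png', 1)
    #   src_pts = np.loadtxt('src_lms.txt').reshape(-1, 2)
    #   dst_pts = np.loadtxt('dst_lms.txt').reshape(-1, 2)
    #   out = warping(img, src_pts, dst_pts, size=(256, 256))
    #   cv2.imwrite('warped.png', out)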
================================================
FILE: loader/__init__.py
================================================
================================================
FILE: loader/dataset_basic.py
================================================
# coding:utf-8
import sys
from utils.points2heatmap import curves2segments,points2curves
from utils import warper
import os
import numpy as np
import torch
import torch.utils.data
import cv2
from tqdm import tqdm
class DatasetBasic(torch.utils.data.Dataset):
def __init__(self, imgSize=256):
# imgSize[int]
self.boundList = None
self.appearList = None
self.imgSize = imgSize
self.sigma = 3
    def __len__(self):
        return -1  # placeholder; subclasses override with the real dataset length
def shape(self):
return self.__len__()
def loadtxt(self, path, head=''):
# path[string] : path to lms.txt
# format: subpath of img
# landmarks [106*2 + 20*2] or 40*2
# head[string] : head of subpath
res = list()
with open(path, 'r') as fin:
data = fin.read().splitlines()
N = len(data)//2
for i in tqdm(range(N)):
imgPath = os.path.join(head, data[2*i+0])
landmarks = list(map(float, data[2*i+1].split()))
res.append((imgPath, landmarks))
return res
def loadtxtList(self, pathList, head):
res = list()
for path in pathList:
res += self.loadtxt(path, head)
return res
def warp(self, img, srcPt, dstPt):
# img[ndarray]: shape = (3,D,D)
# srcPt[ndarray]: shape = (K,2) ,K key points
# dstPt[ndarray]: shape = (K,2) ,K key points
return warper.warping(img, srcPt.reshape(-1, 2), dstPt.reshape(-1, 2), (self.imgSize, self.imgSize))
def np2tensor(self, img, scale=1/255.0):
#========input=======
#img[ndarray][H,W,C] (0.0,1/scale)
#========output======
#img[ndarray][C,H,W] (-1.0,1.0)
img = img.transpose((2, 0, 1))
img = torch.from_numpy(img).float() * scale
img = (img - 0.5) / 0.5
return img
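    # Usage sketch (added for illustration): a (H,W,3) uint8 BGR image in
    # [0,255] becomes a (3,H,W) float tensor in [-1,1]:
    #   t = self.np2tensor(img)
    #   assert t.min() >= -1.0 and t.max() <= 1.0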
def points2heatmap(self, landmarks, mapSize, sigma, landmarkSize=255.0, heatmap_num=17):
# landmarks[ndarray] : shape = (K,2), K = 106 + 20*2
# landmarkSize [float]
# mapSize[int] : output size
# sigma[float] : gaussian sigma
# _________Return__________
# heatmap[tensor]: [C,H,W]
# curve[list] : list[list]
if landmarks.max() > 1:
landmarks /= landmarkSize
landmarks[landmarks > 1] = 1
curves, boundary = points2curves(landmarks, mapSize, sigma, heatmap_num)
# [C,H,W] (0.0,1.0)
heatmap = curves2segments(curves)
# np 2 tensor
heatmap = torch.from_numpy(heatmap).float()
# boundary heatmap
boundary = boundary.transpose([2, 0, 1])
boundary = torch.from_numpy(boundary).float()
return heatmap, curves, boundary
'''
def getRois(self, curve, sigma, onlyMask=False):
# curves[list[list]] :
# sigma[float] : gaussian sigma
# onlyMask[bool] : for train ,just need mask of face
bound = np.vstack([curve[1], curve[2], curve[0]])
mask_bound = genROI(bound, D=5, sigma=5)
if onlyMask:
return None, mask_bound
browL = np.vstack([curve[1], curve[13]])
browR = np.vstack([curve[2], curve[14]])
eyeL = np.vstack([curve[5], curve[6]])
eyeR = np.vstack([curve[7], curve[8]])
nose = np.vstack([curve[3], curve[4]])
teeth = np.vstack([curve[10], curve[11]])
mouth = np.vstack([curve[9], curve[12]])
mask_browL = genROI(browL)
mask_browR = genROI(browR)
mask_eyeL = genROI(eyeL, )
mask_eyeR = genROI(eyeR)
mask_nose = genROI(nose)
mask_mouth = genROI(mouth)
mask_teeth = genROI(teeth)
mask_skin = (1-mask_eyeL)*(1-mask_eyeR)*(1-mask_nose)*(1-mask_browL)*(1-mask_browR)\
* (1-mask_teeth)*(1-mask_mouth)
if len(curve) == 17:
# gaze
gazeL = curve[15]
gazeR = curve[16]
mask_gazeL = genROI(gazeL,erode=True)
mask_gazeR = genROI(gazeR,erode=True)
return {'browL': mask_browL, 'browR': mask_browR, 'eyeL': mask_eyeL, 'eyeR': mask_eyeR, 'nose': mask_nose,
'mouth': mask_mouth, 'teeth': mask_teeth, 'skin': mask_skin, 'gazeL': mask_gazeL, 'gazeR': mask_gazeR}, mask_bound
else:
return {'browL': mask_browL, 'browR': mask_browR, 'eyeL': mask_eyeL, 'eyeR': mask_eyeR, 'nose': mask_nose,
'mouth': mask_mouth, 'teeth': mask_teeth, 'skin': mask_skin, }, mask_bound
def fix_gaze(self, eye_roi, gaze_roi):
intersect = eye_roi * gaze_roi
return intersect
'''
def gammaTrans(self, img, gamma):
gamma_table = [np.power(x/255.0, gamma)*255.0 for x in range(256)]
gamma_table = np.round(np.array(gamma_table)).astype(np.uint8)
return cv2.LUT(img, gamma_table)
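    # Usage note (added for illustration): gamma < 1 brightens dark regions via a
    # 256-entry lookup table; the loaders call self.gammaTrans(img, 0.5) on the
    # source image.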
def __getitem__(self, index):
pass
================================================
FILE: loader/dataset_loader_demo.py
================================================
from loader.dataset_basic import *
import random
import numpy as np
import copy
import torch as th
from utils.affineFace import affineface
class DatasetLoaderDemo(DatasetBasic):
def __init__(self, imgSize=256, gaze=True):
super(DatasetLoaderDemo, self).__init__(imgSize)
self.boundList = None
self.appearList = None
self.rule = 'sequence'
self.indexAppear = 0
def loadBounds(self, pathList, head):
self.boundList = self.loadtxtList(pathList, head)
def loadAppears(self, pathList, head):
self.appearList = self.loadtxtList(pathList, head)
def setAppearRule(self, flag='random'):
# flag[string]: random / similar / sequence
# call this function after load data
        if self.appearList is None:
            print('please call setAppearRule after loading data!')
            return
        if flag not in ('random', 'similar', 'sequence'):
            print('rule must be one of: random / similar / sequence')
else:
self.rule = flag
if flag == 'random':
self.indexAppear = random.randint(0, len(self.appearList)-1)
else:
pass
def findSimilar(self, pt_dst):
minVal = 1e5
res = 0
for index in range(len(self.appearList)):
_, pt = self.appearList[index]
pt = (np.array(pt) - 64).reshape(-1, 2)
diff = np.linalg.norm(pt[:106] - pt_dst[:106])
if diff < minVal:
res = index
minVal = diff
return res
def adjustPose(self, img_src, pt_src, pt_dst):
img_align, pt_align = affineface(img_src, pt_src, pt_dst)
return img_align, pt_align
def add_nose_bridge(self, boundary, heatmap):
# add nose bridge boundary and dilate
nose_bridge = copy.copy(boundary[3:4])
kernel = np.ones((4, 4), np.uint8)
nose_bridge = 255 * torch.from_numpy(cv2.dilate(nose_bridge.squeeze(0).numpy(), kernel)).unsqueeze(0).float()
heatmap = torch.cat((heatmap, nose_bridge), 0)
return heatmap
def __getitem__(self, index):
# load dst
path, pt = self.boundList[index]
img_dst = cv2.imread(path, 1)[64:64+256, 64:64+256]
pt_dst = (np.array(pt) - 64).reshape(-1, 2)
# dst
heatmap_dst, curves_dst, boundary_dst = self.points2heatmap(pt_dst, self.imgSize, sigma=self.sigma)
heatmap_dst = self.add_nose_bridge(boundary_dst, heatmap_dst) # add nose bridge boundary and dilate
weighted_mask_dst = heatmap_dst[0:1] + 2 * heatmap_dst[1:2] + 3 * heatmap_dst[2:3] + 4 * heatmap_dst[3:4] + 2 * heatmap_dst[4:5] + \
3 * heatmap_dst[5:6] + 3 * heatmap_dst[6:7] + 2 * heatmap_dst[7:8] + heatmap_dst[8:]
#select reference
if self.rule == 'random':
index = self.indexAppear
elif self.rule == 'similar':
index = self.findSimilar(pt_dst)
elif self.rule == 'sequence':
index = min(index, len(self.appearList)-1)
# load src
path, pt = self.appearList[index]
img_src = cv2.imread(path, 1)[64:64+256, 64:64+256]
img_src_np = img_src
img_src = self.gammaTrans(img_src, 0.5)
pt_src = (np.array(pt) - 64).reshape(-1, 2)
pt_src_np = pt_src
# align pose src 2 dst
img_src,pt_src = self.adjustPose(img_src,pt_src/256.0,pt_dst/256.0)
img_src = self.warp(img_src, pt_src, np.vstack([pt_dst[:33], pt_src[33:]]))
# src
heatmap_src, curves_src, boundary_src = self.points2heatmap(pt_src, self.imgSize, sigma=self.sigma)
#np 2 tensor scale = [-1,1]
img_src = self.np2tensor(img_src)
img_dst = self.np2tensor(img_dst)
return {'img_src': img_src, 'face_mask_src': heatmap_src[0:1],
'img_dst': img_dst, 'face_mask_dst': heatmap_dst[0:1], 'seg_dst': heatmap_dst, 'weighted_mask_dst': weighted_mask_dst,
'pt_src': pt_src_np, 'pt_dst': pt_dst, 'img_src_np': img_src_np}
def __len__(self):
return len(self.boundList)
================================================
FILE: loader/dataset_loader_train.py
================================================
from loader.dataset_basic import *
from utils.transforms import initAlignTransfer
#from utils.transforms import shakeCurve
import random
import numpy as np
import torch as th
import copy
class DatasetLoaderTrain(DatasetBasic):
def __init__(self, imgSize=256, gaze=True):
super(DatasetLoaderTrain, self).__init__(imgSize)
self.transformAlign = initAlignTransfer(self.imgSize,mirror=False, gaze=gaze)
self.dataList = None
self.isTransform = True
self.SampleCurveType = 'Bound'
if gaze:
self.heatmap_num = 17
else:
self.heatmap_num = 15
def setSampleCurve(self, flag='Bound'):
self.SampleCurveType = flag
def transform(self, img, pt):
pack = self.transformAlign(pt, img)
pt = pack[0]
img = pack[1]
return img, pt
def loaddata(self, pathList, head):
# pathList[list] : [path1,path2,...]
# head[string]: head of subpath
self.dataList = self.loadtxtList(pathList, head)
def sampleCurve(self, img, pt, flag='Bound'):
# if flag == 'None':
# return img_src.copy(), pt_src.copy()
if flag == 'Bound':
_, pt_sample = random.sample(self.dataList, 1)[0]
pt_sample = (np.array(pt_sample) - 64).reshape(-1, 2)
pt_ = pt.copy()
pt_[:33] = pt_sample[:33]
img_warped = self.warp(img, pt, pt_)
return img_warped, pt_
# if flag == 'Shake':
# pt_ = shakeCurve(pt)
# img_warped = self.warp(img, pt, pt_)
# return img_warped, pt_
return None
def add_nose_bridge(self, boundary, heatmap):
# add nose bridge boundary and dilate
nose_bridge = copy.copy(boundary[3:4])
kernel = np.ones((4, 4), np.uint8)
nose_bridge = 255 * torch.from_numpy(cv2.dilate(nose_bridge.squeeze(0).numpy(), kernel)).unsqueeze(0).float() # todo
heatmap = torch.cat((heatmap, nose_bridge), 0)
return heatmap
def __getitem__(self, index):
path, pt = self.dataList[index]
img_src = cv2.imread(path, 1)[64:64+256, 64:64+256] # [256, 256, 3]
pt_src = (np.array(pt) - 64).reshape(-1, 2) # [146,2]
img_src = self.gammaTrans(img_src, 0.5)
# sample strategy
img_dst, pt_dst = self.sampleCurve(img_src, pt_src, flag=self.SampleCurveType)
# img_dst = copy.copy(img_src)
# pt_dst = copy.copy(pt_src)
# data augmentation
if self.isTransform:
img_dst, pt_dst = self.transform(img_dst, pt_dst)
# src
        heatmap_src, curves_src, boundary_src = self.points2heatmap(pt_src, self.imgSize, sigma=self.sigma) # heatmap_src: tensor [8, 256, 256]; curves_src: list of 17, each element an ndarray
heatmap_src = self.add_nose_bridge(boundary_src, heatmap_src)
weighted_mask_src = heatmap_src[0:1] + 2 * heatmap_src[1:2] + 3 * heatmap_src[2:3] + 4 * heatmap_src[3:4] + 2 * heatmap_src[4:5] + \
3 * heatmap_src[5:6] + 3 * heatmap_src[6:7] + 2 * heatmap_src[7:8] + heatmap_src[8:]
# dst
heatmap_dst, curves_dst, boundary_dst = self.points2heatmap(pt_dst, self.imgSize, sigma=self.sigma)
heatmap_dst = self.add_nose_bridge(boundary_dst, heatmap_dst) # add nose bridge boundary and dilate
weighted_mask_dst = heatmap_dst[0:1] + 2 * heatmap_dst[1:2] + 3 * heatmap_dst[2:3] + 4 * heatmap_dst[3:4] + 2 * heatmap_dst[4:5] + \
3 * heatmap_dst[5:6] + 3 * heatmap_dst[6:7] + 2 * heatmap_dst[7:8] + heatmap_dst[8:]
img_src = self.np2tensor(img_src)
img_dst = self.np2tensor(img_dst)
return {'img_src':img_src,'seg_src':heatmap_src,'face_mask_src':heatmap_src[0:1], 'boundary_dst': boundary_dst,
'img_dst':img_dst,'seg_dst':heatmap_dst,'face_mask_dst':heatmap_dst[0:1], 'weighted_mask_dst': weighted_mask_dst}
def __len__(self):
return len(self.dataList)
================================================
FILE: model/base_model.py
================================================
import os
import torch
from collections import OrderedDict
import net.base_net as base_net
import shutil
from tensorboardX import SummaryWriter
import json
import random
import logging
import datetime
class BaseModel():
# modify parser to add command line options,
# and also change the default values if needed
@staticmethod
def modify_commandline_options(parser, is_train):
return parser
def name(self):
return 'BaseModel'
def initialize(self, opt):
self.opt = opt
self.gpu_ids = opt.gpu_ids
self.isTrain = ('train' == opt.phase)
self.device = torch.device('cuda:{}'.format(
self.gpu_ids[0])) if self.gpu_ids else torch.device('cpu')
# self.save_dir = opt.save_dir
self.loss_names = []
self.visual_names = []
self.image_paths = []
self.train_model_name = []
# if os.path.exists(self.save_dir):
# shutil.rmtree(self.save_dir)
# os.makedirs(self.save_dir)
def set_input(self, input):
pass
def forward(self):
pass
def set_logger(self, opt):
if self.isTrain:
run_id = random.randint(1,100000)
self.logdir = os.path.join(opt.save_dir,str(run_id))
self.writer = SummaryWriter(self.logdir)
self.logger = self.get_logger(self.logdir)
self.logger.info('Let the games begin')
self.logger.info('save dir: runs/{}'.format(run_id))
print('log dir : ', self.logdir)
else:
self.logdir = os.path.join(opt.save_dir,'test_res')
self.writer = SummaryWriter(self.logdir)
def get_logger(self, logdir):
logger = logging.getLogger('myLogger')
ts = str(datetime.datetime.now()).split('.')[0].replace(" ", "_")
ts = ts.replace(":", "_").replace("-", "_")
file_path = os.path.join(logdir, 'run_{}.log'.format(ts))
hdlr = logging.FileHandler(file_path)
formatter = logging.Formatter('%(asctime)s %(levelname)s %(message)s')
hdlr.setFormatter(formatter)
logger.addHandler(hdlr)
logger.setLevel(logging.INFO)
return logger
def save_config(self, config):
param_path = os.path.join(self.logdir, "params.json")
print("[*] PARAM path: %s" % param_path)
with open(param_path, 'w') as fp:
json.dump(config.__dict__, fp, indent=4, sort_keys=True)
# load and print networks; create schedulers
def setup(self, opt, parser=None):
if self.isTrain:
self.schedulers = [base_net.get_scheduler(
optimizer, opt) for optimizer in self.optimizers]
if not self.isTrain or opt.load_path:
# load_suffix = 'iter_%d' % opt.load_iter if opt.load_iter > 0 else opt.epoch
load_suffix = '{}/{}_net'.format(opt.load_path, opt.load_model_iter)
self.load_networks_all(load_suffix)
print('load {} successful!'.format(load_suffix))
self.print_networks(opt.verbose)
def load_networks_all(self, prefix):
for name in self.train_model_name:
if 'netD' in name:
continue
net = getattr(self, name)
load_filename = '{}_{}.pth'.format(prefix, name)
self.load_networks(net, load_filename)
# load model
def load_networks(self, model, path):
if isinstance(model, torch.nn.DataParallel):
model = model.module
pretrainDict = torch.load(path, map_location=self.device)
modelDict = model.state_dict()
for kk, vv in pretrainDict.items():
kk = kk.replace('module.', '')
if kk in modelDict:
modelDict[kk].copy_(vv)
else:
print('{} not in modelDict'.format(kk))
# model.load_state_dict(pretrainDict)
# print(modelDict.keys())
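    # Note (added comment): load_networks copies matching keys in place instead of
    # calling load_state_dict, so checkpoints saved with or without the
    # DataParallel 'module.' prefix both load, and unexpected keys are printed
    # rather than raising.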
# make models eval mode during test time
def eval(self):
for name in self.train_model_name:
if isinstance(name, str):
net = getattr(self, name)
net.eval()
# used in test time, wrapping `forward` in no_grad() so we don't save
# intermediate steps for backprop
def test(self):
with torch.no_grad():
self.forward()
# get image paths
def get_image_paths(self):
return self.image_paths
def optimize_parameters(self):
pass
# update learning rate (called once every epoch)
def update_learning_rate(self):
for scheduler in self.schedulers:
scheduler.step()
lr = self.optimizers[0].param_groups[0]['lr']
print('learning rate = %.7f' % lr)
# # return visualization images. train.py will display these images, and save the images to a html
# def get_current_visuals(self):
# visual_ret = OrderedDict()
# for name in self.visual_names:
# if isinstance(name, str):
# visual_ret[name] = getattr(self, name)
# return visual_ret
#
# # return traning losses/errors. train.py will print out these errors as debugging information
# def get_current_losses(self):
# errors_ret = OrderedDict()
# for name in self.loss_names:
# if isinstance(name, str):
# # float(...) works for both scalar tensor and float number
# errors_ret[name] = float(getattr(self, 'loss_' + name))
# return errors_ret
# save models to the disk
def save_networks(self, epoch):
for name in self.train_model_name:
if isinstance(name, str):
save_filename = '%s_net_%s.pth' % (epoch, name)
save_path = os.path.join(self.logdir, save_filename)
net = getattr(self, name)
if len(self.gpu_ids) > 0 and torch.cuda.is_available():
torch.save(net.module.cpu().state_dict(), save_path)
net.cuda(self.gpu_ids[0])
if len(self.gpu_ids) > 1:
net = torch.nn.DataParallel(net, self.opt.gpu_ids)
else:
torch.save(net.cpu().state_dict(), save_path)
# print network information
def print_networks(self, verbose):
print('---------- Networks initialized -------------')
for name in self.train_model_name:
if isinstance(name, str):
net = getattr(self, name)
num_params = 0
for param in net.parameters():
num_params += param.numel()
if verbose:
print(net)
print('[Network %s] Total number of parameters : %.3f M' %
(name, num_params / 1e6))
print('-----------------------------------------------')
    # set requires_grad=False to avoid unnecessary computation
def set_requires_grad(self, nets, requires_grad=False):
if not isinstance(nets, list):
nets = [nets]
for net in nets:
if net is not None:
for param in net.parameters():
param.requires_grad = requires_grad
================================================
FILE: model/spade_model.py
================================================
import sys
from model.base_model import BaseModel
import net.vgg_net as vgg_net
import net.generaotr_net as generator_net
import net.discriminator_net as discriminator_net
import net.appear_decoder_net as appDec
import net.appear_encoder_net as appEnc
import net.face_id_net as face_id_net
import torch
import torch.nn.functional as F
import itertools
from utils import metric
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
class SpadeModel(BaseModel):
def __init__(self, opt):
super(SpadeModel, self).initialize(opt)
# specify the training losses you want to print out. The program will call base_model.get_current_losses
# self.loss_names = ['vgg', 'id', 'reconstruct', 'gan']
self.train_model_name = ['appEnc', 'appDnc', 'netG']
# define appearance encoder, decoder
self.appEnc = appEnc.defineAppEnc(
3, norm='instance', init_type='normal', init_gain=0.02, gpu_ids=self.opt.gpu_ids, conv_k=3)
self.appDnc = appDec.defineAppDec(
3, norm='instance', init_type='normal', init_gain=0.02, gpu_ids=self.opt.gpu_ids)
self.netG = generator_net.defineSPADEGenerator(opt.input_nc, opt.output_nc, 64, norm='instance',
init_type='normal', init_gain=0.02, gpu_ids=self.opt.gpu_ids,
latent_chl=1024, up_mode='convT')
# -----pass-----
if self.isTrain:
# specify the models you want to save to the disk. The program will call base_model.save_networks and base_model.load_networks
self.pretrain_model_name = []
if self.opt.loss_percept:
self.pretrain_model_name.append('vgg')
if self.opt.loss_faceID:
self.pretrain_model_name.append('faceId')
self.train_model_name += ['netD256', 'netD128', 'netD64']
# load vgg and faceID networks
if self.opt.loss_percept:
self.vgg = vgg_net.defineVGG(
init_type='no', gpu_ids=self.opt.gpu_ids).eval()
if self.opt.loss_faceID:
self.faceId = face_id_net.defineFaceID(
input_nc=opt.output_nc, gpu_ids=self.opt.gpu_ids).eval()
faceId_path = 'pretrainModel/id_200.pth'
self.load_networks(self.faceId, faceId_path)
use_sigmoid = opt.no_lsgan
self.netD256 = discriminator_net.define_D(opt.output_nc, opt.ndf, opt.netD,
opt.n_layers_D, opt.norm, use_sigmoid, opt.init_type,
opt.init_gain,
self.gpu_ids)
self.netD128 = discriminator_net.define_D(opt.output_nc, opt.ndf, opt.netD,
2, opt.norm, use_sigmoid, opt.init_type, opt.init_gain,
self.gpu_ids)
self.netD64 = discriminator_net.define_D(opt.output_nc, opt.ndf, opt.netD,
2, opt.norm, use_sigmoid, opt.init_type, opt.init_gain,
self.gpu_ids)
# initialize optimizers
self.optimizers = []
self.optimizer_G = torch.optim.Adam(filter(lambda p: p.requires_grad,
itertools.chain(self.netG.parameters(),
self.appEnc.parameters(),
self.appDnc.parameters())),
lr=opt.lr, betas=(opt.beta1, 0.999))
self.optimizer_D = torch.optim.Adam(
itertools.chain(self.netD256.parameters(), self.netD128.parameters(), self.netD64.parameters()),
lr=0.5 * opt.lr, betas=(opt.beta1, 0.999))
self.optimizers.append(self.optimizer_G)
self.optimizers.append(self.optimizer_D)
self.loss_D = torch.tensor(0).float().to(
device) # initialize D_loss and gan_loss for G to 0 (first several epochs may not use gan)
self.loss_gan = torch.tensor(0).float().to(device)
# define loss functions
self.criterionVGG = torch.nn.L1Loss().to(self.device)
self.criterionId = torch.nn.L1Loss().to(self.device)
self.criterionReconstruct = torch.nn.L1Loss().to(self.device)
self.criterionPix = torch.nn.L1Loss().to(self.device)
self.criterionGAN = discriminator_net.GANLoss(use_lsgan=not opt.no_lsgan).to(self.device)
def set_input(self, input):
self.seg_dst = input['seg_dst'].to(self.device)
self.img_src = input['img_src'].to(self.device)
self.srcMask = input['face_mask_src'].to(self.device)
self.dstMask = input['face_mask_dst'].to(self.device)
# apply ref & mask
if 'img_dst' in input and self.isTrain:
self.groundtruth = input['img_dst'].to(self.device)
self.groundtruth = self.groundtruth * self.srcMask
if 'weighted_mask_dst' in input:
self.weightMask = input['weighted_mask_dst'].to(self.device)
def forward(self):
sample_z, kl_loss, _ = self.appEnc(self.img_src) # [batch_size,1024,1,1]
out16, out32, out64, out128, self.out256 = self.appDnc(sample_z) # [1024, 16, 16,] [512, 32, 32], [256, 64, 64], [128, 128, 128], [3, 256, 256]
self.fake_B = self.netG(self.seg_dst, sample_z, [out16, out32, out64, out128]) # [batch_size, 3, 256, 256]
if self.isTrain:
self.gt128 = F.max_pool2d(self.groundtruth, 3, stride=2)
self.gt64 = F.max_pool2d(self.gt128, 3, stride=2)
self.fake128 = F.max_pool2d(self.fake_B, 3, stride=2)
self.fake64 = F.max_pool2d(self.fake128, 3, stride=2)
return self.fake_B
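    # Note (added comment): the 3x3 stride-2 max-pools above build a coarse image
    # pyramid (256 -> 128 -> 64) so netD128 and netD64 can compare the fake and
    # ground-truth images at matching resolutions.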
def backward_D_basic(self, netD, real, fake):
# Real
pred_real = netD(real)
loss_D_real = self.criterionGAN(pred_real, True)
# Fake
pred_fake = netD(fake.detach())
loss_D_fake = self.criterionGAN(pred_fake, False)
# Combined loss
self.loss_D = (loss_D_real + loss_D_fake) * 0.5
return self.loss_D
def backward_D(self):
self.lossD256 = self.backward_D_basic(self.netD256, self.groundtruth, self.fake_B)
self.lossD128 = self.backward_D_basic(self.netD128, self.gt128, self.fake128)
self.lossD64 = self.backward_D_basic(self.netD64, self.gt64, self.fake64)
self.loss_D = self.lossD256 + self.lossD128 + self.lossD64
self.loss_D.backward()
def backward_G(self, epoch):
# perceptual loss
if self.opt.loss_percept:
self.loss_vgg = self.opt.lambda_vgg * (
self.vgg.module.perceptual_loss(self.fake_B, self.groundtruth, self.criterionVGG) if hasattr(
self.vgg, 'module') else self.vgg.perceptual_loss(self.fake_B, self.groundtruth, self.criterionVGG))
# Identity loss
if self.opt.loss_faceID:
fake_B_id = self.fake_B
gt_id = self.groundtruth
fake_B_id = fake_B_id[:,:,28:228, 28:228]
gt_id = gt_id[:,:,28:228, 28:228]
self.loss_id = self.opt.lambda_id * (
self.faceId.module.face_id_loss(fake_B_id, gt_id, self.criterionId) if hasattr(
self.faceId, 'module') else self.faceId.face_id_loss(fake_B_id, gt_id, self.criterionId))
# GAN loss
if epoch >= self.opt.gan_start_epoch:
self.loss_gan = self.opt.lambda_gan * (
self.criterionGAN(self.netD256(self.fake_B), True) + self.criterionGAN(self.netD128(self.fake128), True) + self.criterionGAN(self.netD64(self.fake64), True))
# AE reconstruction loss
self.loss_reconstruct = self.opt.lambda_reconstruct * self.criterionReconstruct(self.out256, self.img_src)
# pixel loss between gt and generated image
fake_B_pix = self.fake_B * (0.5 + self.weightMask)
gt_pix = self.groundtruth * (0.5 + self.weightMask)
self.loss_pix = self.opt.lambda_pix * self.criterionPix(fake_B_pix, gt_pix)
# combined loss
self.loss_G = torch.tensor(0).float().to(device)
self.loss_G += self.loss_reconstruct
self.loss_G += self.loss_pix
if self.opt.loss_percept:
self.loss_G += self.loss_vgg
if self.opt.loss_faceID:
self.loss_G += self.loss_id
self.loss_G += self.loss_gan
self.loss_G.backward()
def func_require_grad(self, model_, flag_):
for mm in model_:
self.set_requires_grad(mm, flag_)
def func_zero_grad(self, model_):
for mm in model_:
mm.zero_grad()
def optimize_parameters(self, epoch):
self.forward()
# D
if epoch >= self.opt.gan_start_epoch: # start to include D after xxx epochs
self.func_require_grad([self.netD256, self.netD128, self.netD64], True)
self.func_zero_grad([self.netD256, self.netD128, self.netD64])
self.backward_D()
self.optimizer_D.step()
# G
self.func_require_grad([self.netD256, self.netD128, self.netD64], False)
self.optimizer_G.zero_grad()
self.backward_G(epoch) # start to include gan loss for G after xxx epochs
self.optimizer_G.step()
================================================
FILE: net/ResNet.py
================================================
# -*- coding: utf-8 -*-
"""
Created on 18-5-21 at 5:26 PM
@author: ronghuaiyang
"""
import torch
import torch.nn as nn
import math
import torch.utils.model_zoo as model_zoo
import torch.nn.utils.weight_norm as weight_norm
import torch.nn.functional as F
model_urls = {
'resnet18': 'https://download.pytorch.org/models/resnet18-5c106cde.pth',
'resnet34': 'https://download.pytorch.org/models/resnet34-333f7ec4.pth',
'resnet50': 'https://download.pytorch.org/models/resnet50-19c8e357.pth',
'resnet101': 'https://download.pytorch.org/models/resnet101-5d3b4d8f.pth',
'resnet152': 'https://download.pytorch.org/models/resnet152-b121ed2d.pth',
}
def conv3x3(in_planes, out_planes, stride=1):
"""3x3 convolution with padding"""
return nn.Conv2d(in_planes, out_planes, kernel_size=3, stride=stride,
padding=1, bias=False)
class BasicBlock(nn.Module):
expansion = 1
def __init__(self, inplanes, planes, stride=1, downsample=None):
super(BasicBlock, self).__init__()
self.conv1 = conv3x3(inplanes, planes, stride)
self.bn1 = nn.BatchNorm2d(planes)
self.relu = nn.ReLU(inplace=True)
self.conv2 = conv3x3(planes, planes)
self.bn2 = nn.BatchNorm2d(planes)
self.downsample = downsample
self.stride = stride
def forward(self, x):
residual = x
out = self.conv1(x)
out = self.bn1(out)
out = self.relu(out)
out = self.conv2(out)
out = self.bn2(out)
if self.downsample is not None:
residual = self.downsample(x)
out += residual
out = self.relu(out)
return out
class IRBlock(nn.Module):
expansion = 1
def __init__(self, inplanes, planes, stride=1, downsample=None, use_se=True):
super(IRBlock, self).__init__()
self.bn0 = nn.BatchNorm2d(inplanes)
self.conv1 = conv3x3(inplanes, inplanes)
self.bn1 = nn.BatchNorm2d(inplanes)
self.prelu = nn.PReLU()
self.conv2 = conv3x3(inplanes, planes, stride)
self.bn2 = nn.BatchNorm2d(planes)
self.downsample = downsample
self.stride = stride
self.use_se = use_se
if self.use_se:
self.se = SEBlock(planes)
def forward(self, x):
residual = x
out = self.bn0(x)
out = self.conv1(out)
out = self.bn1(out)
out = self.prelu(out)
out = self.conv2(out)
out = self.bn2(out)
if self.use_se:
out = self.se(out)
if self.downsample is not None:
residual = self.downsample(x)
out += residual
out = self.prelu(out)
return out
class Bottleneck(nn.Module):
expansion = 4
def __init__(self, inplanes, planes, stride=1, downsample=None):
super(Bottleneck, self).__init__()
self.conv1 = nn.Conv2d(inplanes, planes, kernel_size=1, bias=False)
self.bn1 = nn.BatchNorm2d(planes)
self.conv2 = nn.Conv2d(planes, planes, kernel_size=3, stride=stride,
padding=1, bias=False)
self.bn2 = nn.BatchNorm2d(planes)
self.conv3 = nn.Conv2d(
planes, planes * self.expansion, kernel_size=1, bias=False)
self.bn3 = nn.BatchNorm2d(planes * self.expansion)
self.relu = nn.ReLU(inplace=True)
self.downsample = downsample
self.stride = stride
def forward(self, x):
residual = x
out = self.conv1(x)
out = self.bn1(out)
out = self.relu(out)
out = self.conv2(out)
out = self.bn2(out)
out = self.relu(out)
out = self.conv3(out)
out = self.bn3(out)
if self.downsample is not None:
residual = self.downsample(x)
out += residual
out = self.relu(out)
return out
class SEBlock(nn.Module):
def __init__(self, channel, reduction=16):
super(SEBlock, self).__init__()
self.avg_pool = nn.AdaptiveAvgPool2d(1)
self.fc = nn.Sequential(
nn.Linear(channel, channel // reduction),
nn.PReLU(),
nn.Linear(channel // reduction, channel),
nn.Sigmoid()
)
def forward(self, x):
b, c, _, _ = x.size()
y = self.avg_pool(x).view(b, c)
y = self.fc(y).view(b, c, 1, 1)
return x * y
class ResNetFace(nn.Module):
def __init__(self, block, layers, input_nc, use_se=True):
self.inplanes = 64
self.use_se = use_se
super(ResNetFace, self).__init__()
self.conv1 = nn.Conv2d(input_nc, 64, kernel_size=3, padding=1, bias=False)
self.bn1 = nn.BatchNorm2d(64)
self.prelu = nn.PReLU()
self.maxpool = nn.MaxPool2d(kernel_size=2, stride=2)
self.layer1 = self._make_layer(block, 64, layers[0])
self.layer2 = self._make_layer(block, 128, layers[1], stride=2)
self.layer3 = self._make_layer(block, 256, layers[2], stride=2)
self.layer4 = self._make_layer(block, 512, layers[3], stride=2)
self.bn4 = nn.BatchNorm2d(512)
self.dropout = nn.Dropout()
#self.fc5 = nn.Linear(512 * 8 * 8, 512) # 128
self.fc5 = nn.Linear(512 * 13 * 13, 512) # 200
self.bn5 = nn.BatchNorm1d(512)
for m in self.modules():
if isinstance(m, nn.Conv2d):
nn.init.xavier_normal_(m.weight)
elif isinstance(m, nn.BatchNorm2d) or isinstance(m, nn.BatchNorm1d):
nn.init.constant_(m.weight, 1)
nn.init.constant_(m.bias, 0)
elif isinstance(m, nn.Linear):
nn.init.xavier_normal_(m.weight)
nn.init.constant_(m.bias, 0)
def _make_layer(self, block, planes, blocks, stride=1):
downsample = None
if stride != 1 or self.inplanes != planes * block.expansion:
downsample = nn.Sequential(
nn.Conv2d(self.inplanes, planes * block.expansion,
kernel_size=1, stride=stride, bias=False),
nn.BatchNorm2d(planes * block.expansion),
)
layers = []
layers.append(block(self.inplanes, planes, stride,
downsample, use_se=self.use_se))
self.inplanes = planes
for i in range(1, blocks):
layers.append(block(self.inplanes, planes, use_se=self.use_se))
return nn.Sequential(*layers)
def forward(self, x):
x = self.conv1(x)
x = self.bn1(x)
x = self.prelu(x)
x = self.maxpool(x)
x = self.layer1(x)
x = self.layer2(x)
x = self.layer3(x)
x = self.layer4(x)
x = self.bn4(x)
#print(x.size())
x = self.dropout(x)
x = x.view(x.size(0), -1)
x = self.fc5(x)
x = self.bn5(x)
return x
class ResNet(nn.Module):
def __init__(self, block, layers):
self.inplanes = 64
super(ResNet, self).__init__()
# self.conv1 = nn.Conv2d(3, 64, kernel_size=7, stride=2, padding=3,
# bias=False)
self.conv1 = nn.Conv2d(1, 64, kernel_size=3, stride=1, padding=1,
bias=False)
self.bn1 = nn.BatchNorm2d(64)
self.relu = nn.ReLU(inplace=True)
# self.maxpool = nn.MaxPool2d(kernel_size=3, stride=2, padding=1)
self.layer1 = self._make_layer(block, 64, layers[0], stride=2)
self.layer2 = self._make_layer(block, 128, layers[1], stride=2)
self.layer3 = self._make_layer(block, 256, layers[2], stride=2)
self.layer4 = self._make_layer(block, 512, layers[3], stride=2)
# self.avgpool = nn.AvgPool2d(8, stride=1)
# self.fc = nn.Linear(512 * block.expansion, num_classes)
self.fc5 = nn.Linear(512 * 8 * 8, 512)
for m in self.modules():
if isinstance(m, nn.Conv2d):
nn.init.kaiming_normal_(
m.weight, mode='fan_out', nonlinearity='relu')
elif isinstance(m, nn.BatchNorm2d):
nn.init.constant_(m.weight, 1)
nn.init.constant_(m.bias, 0)
def _make_layer(self, block, planes, blocks, stride=1):
downsample = None
if stride != 1 or self.inplanes != planes * block.expansion:
downsample = nn.Sequential(
nn.Conv2d(self.inplanes, planes * block.expansion,
kernel_size=1, stride=stride, bias=False),
nn.BatchNorm2d(planes * block.expansion),
)
layers = []
layers.append(block(self.inplanes, planes, stride, downsample))
self.inplanes = planes * block.expansion
for i in range(1, blocks):
layers.append(block(self.inplanes, planes))
return nn.Sequential(*layers)
def forward(self, x):
x = self.conv1(x)
x = self.bn1(x)
x = self.relu(x)
# x = self.maxpool(x)
x = self.layer1(x)
x = self.layer2(x)
x = self.layer3(x)
x = self.layer4(x)
# x = nn.AvgPool2d(kernel_size=x.size()[2:])(x)
# x = self.avgpool(x)
x = x.view(x.size(0), -1)
x = self.fc5(x)
return x
def resnet18(pretrained=False, **kwargs):
"""Constructs a ResNet-18 model.
Args:
pretrained (bool): If True, returns a model pre-trained on ImageNet
"""
model = ResNet(BasicBlock, [2, 2, 2, 2], **kwargs)
if pretrained:
model.load_state_dict(model_zoo.load_url(model_urls['resnet18']))
return model
def resnet34(pretrained=False, **kwargs):
"""Constructs a ResNet-34 model.
Args:
pretrained (bool): If True, returns a model pre-trained on ImageNet
"""
model = ResNet(BasicBlock, [3, 4, 6, 3], **kwargs)
if pretrained:
model.load_state_dict(model_zoo.load_url(model_urls['resnet34']))
return model
def resnet50(pretrained=False, **kwargs):
"""Constructs a ResNet-50 model.
Args:
pretrained (bool): If True, returns a model pre-trained on ImageNet
"""
model = ResNet(Bottleneck, [3, 4, 6, 3], **kwargs)
if pretrained:
model.load_state_dict(model_zoo.load_url(model_urls['resnet50']))
return model
def resnet101(pretrained=False, **kwargs):
"""Constructs a ResNet-101 model.
Args:
pretrained (bool): If True, returns a model pre-trained on ImageNet
"""
model = ResNet(Bottleneck, [3, 4, 23, 3], **kwargs)
if pretrained:
model.load_state_dict(model_zoo.load_url(model_urls['resnet101']))
return model
def resnet152(pretrained=False, **kwargs):
"""Constructs a ResNet-152 model.
Args:
pretrained (bool): If True, returns a model pre-trained on ImageNet
"""
model = ResNet(Bottleneck, [3, 8, 36, 3], **kwargs)
if pretrained:
model.load_state_dict(model_zoo.load_url(model_urls['resnet152']))
return model
def resnet_face18(input_nc, use_se=True, **kwargs):
model = ResNetFace(IRBlock, [2, 2, 2, 2], input_nc, use_se=use_se, **kwargs)
return model
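# Hedged usage sketch (added for illustration): the fc5 layer above assumes
# single-channel 200x200 crops, so a toy forward pass looks like:
if __name__ == '__main__':
    model = resnet_face18(input_nc=1)
    feat = model(torch.randn(2, 1, 200, 200))
    print(feat.shape)  # -> torch.Size([2, 512]) identity embeddings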
================================================
FILE: net/appear_decoder_net.py
================================================
import torch as th
from torch import nn
import net.base_net as base_net
###############################################################################
# define
###############################################################################
def defineAppDec(input_nc, size_=256, norm='batch', init_type='normal', init_gain=0.02, gpu_ids=[]):
net = None
norm_layer = base_net.get_norm_layer(norm_type=norm)
if 128 == size_:
net = appearDec128(input_nc, norm_layer=norm_layer, size_=size_)
elif 256 == size_:
net = appearDec(input_nc, norm_layer=norm_layer, size_=size_)
return base_net.init_net(net, init_type, init_gain, gpu_ids)
class appearDec(nn.Module):
def __init__(self, input_c, norm_layer, size_=256):
super(appearDec, self).__init__()
# input 3x256x256
# encoder
layers = []
channel_list = [1024, 1024, 1024, 1024]
c0 = 1024
for cc in channel_list:
layers.append(nn.ConvTranspose2d(c0, cc, 4, 2, 1))
layers.append(norm_layer(cc))
layers.append(nn.ReLU(True))
c0 = cc
self.decoder16 = nn.Sequential(*layers)
self.decoder32 = nn.Sequential(nn.ConvTranspose2d(1024, 512, 4, 2, 1), norm_layer(512), nn.ReLU(True))
self.decoder64 = nn.Sequential(nn.ConvTranspose2d(512, 256, 4, 2, 1), norm_layer(256), nn.ReLU(True))
self.decoder128 = nn.Sequential(nn.ConvTranspose2d(256, 128, 4, 2, 1), norm_layer(128), nn.ReLU(True))
layers = []
layers.append(nn.ConvTranspose2d(128, 3, 4, 2, 1))
layers.append(nn.Tanh())
self.decoder256 = nn.Sequential(*layers)
def forward(self, input):
out16 = self.decoder16(input)
out32 = self.decoder32(out16)
out64 = self.decoder64(out32)
out128 = self.decoder128(out64)
out256 = self.decoder256(out128)
return out16, out32, out64, out128, out256
class appearDec128(nn.Module):
def __init__(self, input_c, norm_layer, size_=256):
super(appearDec128, self).__init__()
# input 3x256x256
# encoder
layers = []
channel_list = [1024, 1024, 1024]
c0 = 1024
for cc in channel_list:
layers.append(nn.ConvTranspose2d(c0, cc, 4, 2, 1))
layers.append(norm_layer(cc))
layers.append(nn.ReLU(True))
c0 = cc
self.decoder8 = nn.Sequential(*layers)
self.decoder16 = nn.Sequential(nn.ConvTranspose2d(1024, 512, 4, 2, 1), norm_layer(512), nn.ReLU(True))
self.decoder32 = nn.Sequential(nn.ConvTranspose2d(512, 256, 4, 2, 1), norm_layer(256), nn.ReLU(True))
self.decoder64 = nn.Sequential(nn.ConvTranspose2d(256, 128, 4, 2, 1), norm_layer(128), nn.ReLU(True))
layers = []
layers.append(nn.ConvTranspose2d(128, 3, 4, 2, 1))
layers.append(nn.Tanh())
self.decoder128 = nn.Sequential(*layers)
def forward(self, input):
out8 = self.decoder8(input)
out16 = self.decoder16(out8)
out32 = self.decoder32(out16)
out64 = self.decoder64(out32)
out128 = self.decoder128(out64)
return out8, out16, out32, out64, out128
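# Hedged usage sketch (added for illustration): decode a 1x1 latent into the
# feature pyramid consumed by the SPADE generator (shapes for the 256 variant).
if __name__ == '__main__':
    dec = defineAppDec(3, size_=256, norm='instance')
    z = th.randn(1, 1024, 1, 1)
    out16, out32, out64, out128, out256 = dec(z)
    print(out256.shape)  # -> torch.Size([1, 3, 256, 256])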
================================================
FILE: net/appear_encoder_net.py
================================================
import torch as th
from torch import nn
import net.base_net as base_net
###############################################################################
# define
###############################################################################
def defineAppEnc(input_nc, size_=256, norm='batch', init_type='normal', init_gain=0.02, gpu_ids=[], conv_k=4):
net = None
norm_layer = base_net.get_norm_layer(norm_type=norm)
net = appearEnc(input_nc, norm_layer=norm_layer, size_=size_, conv_k=conv_k)
return base_net.init_net(net, init_type, init_gain, gpu_ids)
# class appearEnc(nn.Module):
# def __init__(self, input_c, norm_layer, size_=256, conv_k=4):
# super(appearEnc, self).__init__()
# # input 3x256x256
# # encoder
# channel_list = [128, 256, 512, 1024, 1024, 1024]
# c0 = 64
# self.layer1 = nn.Sequential(
# nn.Conv2d(input_c, c0, conv_k, 2, 1),
# nn.LeakyReLU(0.2)
# )
# self.layer2 = nn.Sequential(
# nn.Conv2d(c0, channel_list[0], conv_k, 2, 1),
# norm_layer(channel_list[0]),
# nn.LeakyReLU(0.2)
# )
# self.layer3 = nn.Sequential(
# nn.Conv2d(channel_list[0], channel_list[1], conv_k, 2, 1),
# norm_layer(channel_list[1]),
# nn.LeakyReLU(0.2)
# )
# self.layer4 = nn.Sequential(
# nn.Conv2d(channel_list[1], channel_list[2], conv_k, 2, 1),
# norm_layer(channel_list[2]),
# nn.LeakyReLU(0.2)
# )
# self.layer5 = nn.Sequential(
# nn.Conv2d(channel_list[2], channel_list[3], conv_k, 2, 1),
# norm_layer(channel_list[3]),
# nn.LeakyReLU(0.2)
# )
# self.layer6 = nn.Sequential(
# nn.Conv2d(channel_list[3], channel_list[4], conv_k, 2, 1),
# norm_layer(channel_list[4]),
# nn.LeakyReLU(0.2)
# )
# self.layer7 = nn.Sequential(
# nn.Conv2d(channel_list[4], channel_list[5], conv_k, 2, 1),
# norm_layer(channel_list[5]),
# nn.LeakyReLU(0.2)
# )
# self.mean = nn.Conv2d(1024, 1024, conv_k, 2, 1)
class appearEnc(nn.Module):
def __init__(self, input_c, norm_layer, size_=256, conv_k=4):
super(appearEnc, self).__init__()
# input 3x256x256
# encoder
layers = []
channel_list = [128, 256, 512, 1024, 1024, 1024]
c0 = 64
layers.append(nn.Conv2d(input_c, c0, conv_k, 2, 1))
layers.append(nn.LeakyReLU(0.2))
for cc in channel_list:
layers.append(nn.Conv2d(c0, cc, conv_k, 2, 1))
layers.append(norm_layer(cc))
layers.append(nn.LeakyReLU(0.2))
c0 = cc
self.encoder = nn.Sequential(*layers)
# mean
layers = []
layers.append(nn.Conv2d(1024, 1024, conv_k, 2, 1))
# layers.append(nn.ReLU())
self.mean = nn.Sequential(*layers)
# self.logvar = nn.Sequential(*layers)
def sample_z(self, z_mu):
z_std = 1.0
        eps = th.randn(z_mu.size()).type_as(z_mu)  # standard-normal noise (mean 0, std 1)
return z_mu + z_std * eps
# def sample_z(self, z_mu, z_logvar):
# z_std = th.exp(0.5 * z_logvar)
# eps = th.randn_like(z_std)
# return z_mu + z_std * eps
def kl_loss(self, z_mu):
#kl_loss = torch.mean(0.5 * torch.sum(torch.exp(z_var) + z_mu**2 - 1. - z_var, 1))
        z_var = th.ones(z_mu.size()).type_as(z_mu)  # log-variance fixed at 1 (not learned); [batch_size, 1024, 1, 1]
kl_loss_ = th.mean(0.5 * th.sum(th.exp(z_var) + z_mu**2 - 1. - z_var, 1))
return kl_loss_ # scalar loss
# def kl_loss(self, z_mu, z_logvar):
# kl_loss = -0.5 * th.mean(1 + z_logvar - z_mu.pow(2) - z_logvar.exp())
# return kl_loss # scalar loss
def freeze(self):
for module_ in self.encoder:
for p in module_.parameters():
p.requires_grad = False
for module_ in self.mean:
for p in module_.parameters():
p.requires_grad = False
# def forward(self, input):
# encoder = self.encoder(input) # input: [batch_size,3,256,256], encoder: [1, 1024, 2, 2]
# z_mu = self.mean(encoder) # [batch_size,1024,1,1]
# z_logvar = self.logvar(encoder) # [batch_size,1024,1,1]
#
# sample_z = self.sample_z(z_mu, z_logvar) # [batch_size,1024,1,1]
# kl_loss = self.kl_loss(z_mu, z_logvar) # scalar KL loss
# return sample_z, kl_loss, z_mu
def forward(self, input):
encoder = self.encoder(input) # input: [1,3,200,200]
z_mu = self.mean(encoder) # [1,1024,1,1]
sample_z = self.sample_z(z_mu) # [1,1024,1,1]
kl_loss = self.kl_loss(z_mu) # scalar KL loss
return sample_z, kl_loss, z_mu
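    # Note (added comment): sample_z is the reparameterization trick with the
    # standard deviation fixed at 1, so the encoder only predicts the mean z_mu
    # and kl_loss pulls it toward the standard-normal prior.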
# def forward(self, input):
# encode128 = self.layer1(input)
# encode64 = self.layer2(encode128)
# encode32 = self.layer3(encode64)
# encode16 = self.layer4(encode32)
# encode8 = self.layer5(encode16)
# encode4 = self.layer6(encode8)
# encode2 = self.layer7(encode4)
# z_mu = self.mean(encode2) # [1,1024,1,1]
# sample_z = self.sample_z(z_mu) # [1,1024,1,1]
# return sample_z, z_mu, encode2, encode4, encode8, encode16, encode32, encode64, encode128
================================================
FILE: net/base_net.py
================================================
import torch
import torch.nn as nn
from torch.nn import init
import functools
from torch.optim import lr_scheduler
###############################################################################
# base module set
###############################################################################
def get_norm_layer(norm_type='instance'):
if norm_type == 'batch':
norm_layer = functools.partial(nn.BatchNorm2d, affine=True)
elif norm_type == 'instance':
norm_layer = functools.partial(
nn.InstanceNorm2d, affine=False, track_running_stats=False)
elif norm_type == 'none':
norm_layer = None
else:
raise NotImplementedError(
'normalization layer [%s] is not found' % norm_type)
return norm_layer
def get_scheduler(optimizer, opt):
if opt.lr_policy == 'lambda':
def lambda_rule(epoch):
lr_l = 1.0 - max(0, epoch-
opt.niter) / float(opt.niter_decay + 1)
return lr_l
scheduler = lr_scheduler.LambdaLR(optimizer, lr_lambda=lambda_rule)
elif opt.lr_policy == 'step':
scheduler = lr_scheduler.StepLR(
optimizer, step_size=opt.lr_decay_iters, gamma=0.5)
elif opt.lr_policy == 'plateau':
scheduler = lr_scheduler.ReduceLROnPlateau(
optimizer, mode='min', factor=0.2, threshold=0.01, patience=5)
elif opt.lr_policy == 'cosine':
scheduler = lr_scheduler.CosineAnnealingLR(
optimizer, T_max=opt.niter, eta_min=0)
else:
        raise NotImplementedError('learning rate policy [%s] is not implemented' % opt.lr_policy)
return scheduler
def init_weights(net, init_type='normal', gain=0.02):
def init_func(m):
classname = m.__class__.__name__
if hasattr(m, 'weight') and (classname.find('Conv') != -1 or classname.find('Linear') != -1):
if init_type == 'normal':
init.normal_(m.weight.data, 0.0, gain)
elif init_type == 'xavier':
init.xavier_normal_(m.weight.data, gain=gain)
elif init_type == 'kaiming':
init.kaiming_normal_(m.weight.data, a=0, mode='fan_in')
elif init_type == 'orthogonal':
init.orthogonal_(m.weight.data, gain=gain)
else:
raise NotImplementedError(
'initialization method [%s] is not implemented' % init_type)
if hasattr(m, 'bias') and m.bias is not None:
init.constant_(m.bias.data, 0.0)
elif classname.find('BatchNorm2d') != -1:
init.normal_(m.weight.data, 1.0, gain)
init.constant_(m.bias.data, 0.0)
if init_type == 'no':
print('not init')
else:
print('initialize network with %s' % init_type)
net.apply(init_func)
def init_net(net, init_type='normal', init_gain=0.02, gpu_ids=[]):
if len(gpu_ids) > 0:
assert(torch.cuda.is_available())
net.to(gpu_ids[0])
net = torch.nn.DataParallel(net, gpu_ids)
init_weights(net, init_type, gain=init_gain)
return net
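# Hedged usage sketch (added for illustration): initialize a toy module on CPU
# with the repo's default normal-init scheme.
if __name__ == '__main__':
    toy = nn.Sequential(nn.Conv2d(3, 8, 3, padding=1))
    toy = init_net(toy, init_type='normal', init_gain=0.02, gpu_ids=[])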
================================================
FILE: net/discriminator_net.py
================================================
import torch
import torch.nn as nn
import functools
import net.base_net as base_net
###############################################################################
# discriminator define
###############################################################################
def define_D(input_nc, ndf, netD,
n_layers_D=3, norm='batch', use_sigmoid=False, init_type='normal', init_gain=0.02, gpu_ids=[]):
net = None
norm_layer = base_net.get_norm_layer(norm_type=norm)
if netD == 'basic':
net = NLayerDiscriminator(
input_nc, ndf, n_layers=3, norm_layer=norm_layer, use_sigmoid=use_sigmoid)
elif netD == 'n_layers':
net = NLayerDiscriminator(
input_nc, ndf, n_layers_D, norm_layer=norm_layer, use_sigmoid=use_sigmoid)
elif netD == 'pixel':
net = PixelDiscriminator(
input_nc, ndf, norm_layer=norm_layer, use_sigmoid=use_sigmoid)
else:
raise NotImplementedError(
            'Discriminator model name [%s] is not recognized' % netD)
return base_net.init_net(net, init_type, init_gain, gpu_ids)
##############################################################################
# Classes
##############################################################################
# Defines the GAN loss which uses either LSGAN or the regular GAN.
# When LSGAN is used, it is basically same as MSELoss,
# but it abstracts away the need to create the target label tensor
# that has the same size as the input
class GANLoss(nn.Module):
def __init__(self, use_lsgan=True, target_real_label=1.0, target_fake_label=0.0):
super(GANLoss, self).__init__()
self.register_buffer('real_label', torch.tensor(target_real_label))
self.register_buffer('fake_label', torch.tensor(target_fake_label))
if use_lsgan:
self.loss = nn.MSELoss()
else:
self.loss = nn.BCELoss()
def get_target_tensor(self, input, target_is_real):
if target_is_real:
target_tensor = self.real_label
else:
target_tensor = self.fake_label
return target_tensor.expand_as(input)
def __call__(self, input, target_is_real):
target_tensor = self.get_target_tensor(input, target_is_real)
return self.loss(input, target_tensor)
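# Usage sketch (added for illustration): GANLoss builds the target tensor
# internally, so callers only pass the prediction map and a real/fake flag:
#   criterion = GANLoss(use_lsgan=True)
#   loss_g = criterion(netD(fake_img), True)             # generator wants "real"
#   loss_d = criterion(netD(fake_img.detach()), False)   # discriminator marks fake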
# Defines the PatchGAN discriminator with the specified arguments.
class NLayerDiscriminator(nn.Module):
def __init__(self, input_nc, ndf=64, n_layers=3, norm_layer=nn.BatchNorm2d, use_sigmoid=False):
super(NLayerDiscriminator, self).__init__()
if type(norm_layer) == functools.partial:
use_bias = norm_layer.func == nn.InstanceNorm2d
else:
use_bias = norm_layer == nn.InstanceNorm2d
kw = 4
padw = 1
sequence = [
nn.Conv2d(input_nc, ndf, kernel_size=kw, stride=2, padding=padw),
nn.LeakyReLU(0.2, True)
]
nf_mult = 1
nf_mult_prev = 1
for n in range(1, n_layers):
nf_mult_prev = nf_mult
nf_mult = min(2**n, 8)
sequence += [
nn.Conv2d(ndf * nf_mult_prev, ndf * nf_mult,
kernel_size=kw, stride=2, padding=padw, bias=use_bias),
norm_layer(ndf * nf_mult),
nn.LeakyReLU(0.2, True)
]
nf_mult_prev = nf_mult
nf_mult = min(2**n_layers, 8)
sequence += [
nn.Conv2d(ndf * nf_mult_prev, ndf * nf_mult,
kernel_size=kw, stride=1, padding=padw, bias=use_bias),
norm_layer(ndf * nf_mult),
nn.LeakyReLU(0.2, True)
]
sequence += [nn.Conv2d(ndf * nf_mult, 1,
kernel_size=kw, stride=1, padding=padw)]
if use_sigmoid:
sequence += [nn.Sigmoid()]
self.model = nn.Sequential(*sequence)
def forward(self, input):
return self.model(input)
class PixelDiscriminator(nn.Module):
def __init__(self, input_nc, ndf=64, norm_layer=nn.BatchNorm2d, use_sigmoid=False):
super(PixelDiscriminator, self).__init__()
if type(norm_layer) == functools.partial:
use_bias = norm_layer.func == nn.InstanceNorm2d
else:
use_bias = norm_layer == nn.InstanceNorm2d
self.net = [
nn.Conv2d(input_nc, ndf, kernel_size=1, stride=1, padding=0),
nn.LeakyReLU(0.2, True),
nn.Conv2d(ndf, ndf * 2, kernel_size=1,
stride=1, padding=0, bias=use_bias),
norm_layer(ndf * 2),
nn.LeakyReLU(0.2, True),
nn.Conv2d(ndf * 2, 1, kernel_size=1, stride=1, padding=0, bias=use_bias)]
if use_sigmoid:
self.net.append(nn.Sigmoid())
self.net = nn.Sequential(*self.net)
def forward(self, input):
return self.net(input)
================================================
FILE: net/face_id_mlp_net.py
================================================
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.nn import Parameter
import math
class MLP(nn.Module):
def __init__(self, input_nc, output_nc):
super(MLP, self).__init__()
self.fc = nn.Linear(input_nc, output_nc)
def forward(self, input):
return self.fc(input)
class ArcMarginProduct(nn.Module):
r"""Implement of large margin arc distance: :
Args:
in_features: size of each input sample
out_features: size of each output sample
s: norm of input feature
m: margin
cos(theta + m)
"""
def __init__(self, in_features, out_features, s=30.0, m=0.50, easy_margin=False):
super(ArcMarginProduct, self).__init__()
self.in_features = in_features
self.out_features = out_features
self.s = s
self.m = m
self.weight = Parameter(torch.FloatTensor(out_features, in_features))
nn.init.xavier_uniform_(self.weight)
self.easy_margin = easy_margin
self.cos_m = math.cos(m)
self.sin_m = math.sin(m)
self.th = math.cos(math.pi - m)
self.mm = math.sin(math.pi - m) * m
def forward(self, input, label):
# --------------------------- cos(theta) & phi(theta) ---------------------------
cosine = F.linear(F.normalize(input), F.normalize(self.weight))
sine = torch.sqrt((1.0 - torch.pow(cosine, 2)).clamp(0, 1))  # clamp avoids NaN when |cosine| ~ 1
phi = cosine * self.cos_m - sine * self.sin_m
if self.easy_margin:
phi = torch.where(cosine > 0, phi, cosine)
else:
phi = torch.where(cosine > self.th, phi, cosine - self.mm)
# --------------------------- convert label to one-hot ---------------------------
# one_hot = torch.zeros(cosine.size(), requires_grad=True, device='cuda')
one_hot = torch.zeros(cosine.size(), device='cuda')
one_hot.scatter_(1, label.view(-1, 1).long(), 1)
# ------------- out_i = phi_i if one_hot_i else cosine_i -------------
# (equivalent to torch.where, which is available since torch 0.4)
output = (one_hot * phi) + ((1.0 - one_hot) * cosine)
output *= self.s
# print(output)
return output
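# Usage sketch (illustrative; names are placeholders): the margin head acts
# as the classifier's last layer during training, and its logits feed a
# plain cross-entropy loss:
#   metric_fc = ArcMarginProduct(512, class_num).cuda()
#   logits = metric_fc(features, labels)          # features: N x 512
#   loss = nn.CrossEntropyLoss()(logits, labels)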
class AddMarginProduct(nn.Module):
r"""Implement of large margin cosine distance: :
Args:
in_features: size of each input sample
out_features: size of each output sample
s: norm of input feature
m: margin
cos(theta) - m
"""
def __init__(self, in_features, out_features, s=30.0, m=0.40):
super(AddMarginProduct, self).__init__()
self.in_features = in_features
self.out_features = out_features
self.s = s
self.m = m
self.weight = Parameter(torch.FloatTensor(out_features, in_features))
nn.init.xavier_uniform_(self.weight)
def forward(self, input, label):
# --------------------------- cos(theta) & phi(theta) ---------------------------
cosine = F.linear(F.normalize(input), F.normalize(self.weight))
phi = cosine - self.m
# --------------------------- convert label to one-hot ---------------------------
one_hot = torch.zeros(cosine.size(), device='cuda')
# one_hot = one_hot.cuda() if cosine.is_cuda else one_hot
one_hot.scatter_(1, label.view(-1, 1).long(), 1)
# ------------- out_i = phi_i if one_hot_i else cosine_i -------------
# (equivalent to torch.where, which is available since torch 0.4)
output = (one_hot * phi) + ((1.0 - one_hot) * cosine)
output *= self.s
# print(output)
return output
def __repr__(self):
return self.__class__.__name__ + '(' \
+ 'in_features=' + str(self.in_features) \
+ ', out_features=' + str(self.out_features) \
+ ', s=' + str(self.s) \
+ ', m=' + str(self.m) + ')'
class SphereProduct(nn.Module):
r"""Implement of large margin cosine distance: :
Args:
in_features: size of each input sample
out_features: size of each output sample
m: margin
cos(m*theta)
"""
def __init__(self, in_features, out_features, m=4):
super(SphereProduct, self).__init__()
self.in_features = in_features
self.out_features = out_features
self.m = m
self.base = 1000.0
self.gamma = 0.12
self.power = 1
self.LambdaMin = 5.0
self.iter = 0
self.weight = Parameter(torch.FloatTensor(out_features, in_features))
nn.init.xavier_uniform_(self.weight)
# duplication formula
self.mlambda = [
lambda x: x ** 0,
lambda x: x ** 1,
lambda x: 2 * x ** 2 - 1,
lambda x: 4 * x ** 3 - 3 * x,
lambda x: 8 * x ** 4 - 8 * x ** 2 + 1,
lambda x: 16 * x ** 5 - 20 * x ** 3 + 5 * x
]
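# e.g. m = 2: cos(2*theta) = 2*cos(theta)**2 - 1, so mlambda[m] evaluates
# cos(m*theta) directly from cos(theta) (Chebyshev polynomials), avoiding
# an explicit acos/cos round trip for the margin term.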
def forward(self, input, label):
# lambda = max(lambda_min,base*(1+gamma*iteration)^(-power))
self.iter += 1
self.lamb = max(self.LambdaMin, self.base *
(1 + self.gamma * self.iter) ** (-1 * self.power))
# --------------------------- cos(theta) & phi(theta) ---------------------------
cos_theta = F.linear(F.normalize(input), F.normalize(self.weight))
cos_theta = cos_theta.clamp(-1, 1)
cos_m_theta = self.mlambda[self.m](cos_theta)
theta = cos_theta.data.acos()
k = (self.m * theta / 3.14159265).floor()
phi_theta = ((-1.0) ** k) * cos_m_theta - 2 * k
NormOfFeature = torch.norm(input, 2, 1)
# --------------------------- convert label to one-hot ---------------------------
one_hot = torch.zeros(cos_theta.size())
one_hot = one_hot.cuda() if cos_theta.is_cuda else one_hot
one_hot.scatter_(1, label.view(-1, 1), 1)
# --------------------------- Calculate output ---------------------------
output = (one_hot * (phi_theta - cos_theta) /
(1 + self.lamb)) + cos_theta
output *= NormOfFeature.view(-1, 1)
return output
def __repr__(self):
return self.__class__.__name__ + '(' \
+ 'in_features=' + str(self.in_features) \
+ ', out_features=' + str(self.out_features) \
+ ', m=' + str(self.m) + ')'
================================================
FILE: net/face_id_net.py
================================================
import torch as th
from torch import nn
from net.ResNet import resnet_face18 as resnet18
from net.face_id_mlp_net import MLP
import net.base_net as base_net
###############################################################################
# define
###############################################################################
def defineFaceID(input_nc=3, class_num=10173, init_type='normal', init_gain=0.02, gpu_ids=[]):
net = faceIDNet(input_nc, class_num)
return base_net.init_net(net, init_type, init_gain, gpu_ids)
class faceIDNet(nn.Module):
def __init__(self, input_nc, class_num):
super(faceIDNet, self).__init__()
# input 3x256x256
self.feat = resnet18(input_nc, use_se=False)
self.mlp = MLP(512, class_num)
def forward(self, input):
feat = self.feat(input)
pred = self.mlp(feat)
return pred
def face_id_loss(self, x, target, loss_func):
targetIdFeat256 = self.feat(target).detach()
faceIDFeat = self.feat(x)
id_loss = loss_func(faceIDFeat, targetIdFeat256)
return id_loss
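# Usage sketch (illustrative; fake_B and real_B are placeholders): the
# identity loss compares 512-d embeddings of the generated and target
# faces, with the target branch detached so only the generator is
# penalized. Depending on how init_net wraps the network (e.g.
# DataParallel), a .module indirection may be needed:
#   id_net = defineFaceID(gpu_ids=[0])
#   loss_id = id_net.module.face_id_loss(fake_B, real_B, nn.L1Loss())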
================================================
FILE: net/generaotr_net.py
================================================
import torch
import torch.nn as nn
import torch.nn.functional as F
import functools
import net.base_net as base_net
###############################################################################
# define spatially adaptive normalized generator
# input: boundary heatmaps and appearance latent vector
###############################################################################
def defineSPADEGenerator(input_nc, output_nc, ngf, norm='instance', use_dropout=False, init_type='normal',
init_gain=0.02, gpu_ids=[], latent_chl=1024, up_mode='NF'):
norm_layer = base_net.get_norm_layer(norm_type=norm)
net = SPADEGenerator(input_nc, output_nc, ngf,
norm_layer=norm_layer, latent_chl=latent_chl, up_mode=up_mode)
return base_net.init_net(net, init_type, init_gain, gpu_ids)
##############################################################################
# Classes
##############################################################################
# class BasicSPADE(nn.Module):
# def __init__(self, norm_layer, input_nc, planes):
# super(BasicSPADE, self).__init__()
# self.norm = norm_layer(planes, affine=False)
#
# self.conv_weight1=nn.Conv2d(input_nc, input_nc, kernel_size=3, stride=1, padding=1)
# self.conv_bias1=nn.Conv2d(input_nc, input_nc, kernel_size=3, stride=1, padding=1)
# self.conv_weight2=nn.Conv2d(input_nc, input_nc, kernel_size=3, stride=1, padding=1)
# self.conv_bias2=nn.Conv2d(input_nc, input_nc, kernel_size=3, stride=1, padding=1)
# self.conv_weight3=nn.Conv2d(input_nc, input_nc, kernel_size=3, stride=1, padding=1)
# self.conv_bias3=nn.Conv2d(input_nc, input_nc, kernel_size=3, stride=1, padding=1)
# self.conv_weight4=nn.Conv2d(input_nc, input_nc, kernel_size=3, stride=1, padding=1)
# self.conv_bias4=nn.Conv2d(input_nc, input_nc, kernel_size=3, stride=1, padding=1)
#
# self.conv_weight=nn.Conv2d(input_nc, planes, kernel_size=3, stride=1, padding=1)
# self.conv_bias=nn.Conv2d(input_nc, planes, kernel_size=3, stride=1, padding=1)
#
# def forward(self, x, bound):
# out = self.norm(x)
#
# weight_norm1 = self.conv_weight1(bound)
# bias_norm1 = self.conv_bias1(bound)
# weight_norm2 = self.conv_weight2(weight_norm1)
# bias_norm2 = self.conv_bias2(bias_norm1)
# weight_norm3 = self.conv_weight3(weight_norm2)
# bias_norm3 = self.conv_bias3(bias_norm2)
# weight_norm4 = self.conv_weight4(weight_norm3)
# bias_norm4 = self.conv_bias4(bias_norm3)
#
# weight_norm = self.conv_weight(weight_norm4)
# bias_norm = self.conv_bias(bias_norm4)
#
# out = out * weight_norm + bias_norm
# return out
class BasicSPADE(nn.Module):
def __init__(self, norm_layer, input_nc, planes):
super(BasicSPADE, self).__init__()
self.conv_weight = nn.Conv2d(input_nc, planes, kernel_size=3, stride=1, padding=1)
self.conv_bias = nn.Conv2d(input_nc, planes, kernel_size=3, stride=1, padding=1)
self.norm = norm_layer(planes, affine=False)
def forward(self, x, bound):
out = self.norm(x)
weight_norm = self.conv_weight(bound)
bias_norm = self.conv_bias(bound)
out = out * weight_norm + bias_norm
return out
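# In SPADE terms: the boundary map predicts a per-pixel scale and bias that
# re-inject spatial layout after the parameter-free normalization, i.e.
# out = norm(x) * gamma(bound) + beta(bound).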
class ResBlkSPADE(nn.Module):
def __init__(self, norm_layer, input_nc, planes, conv_kernel_size=1, padding=0): # todo: change conv kernel size, kernel=3, padding=1 or kernel=1, padding=0
super(ResBlkSPADE, self).__init__()
self.spade1 = BasicSPADE(norm_layer=norm_layer, input_nc=input_nc, planes=planes)
self.relu = nn.ReLU(inplace=True)
self.conv1 = nn.Conv2d(planes, planes, kernel_size=conv_kernel_size, stride=1, padding=padding)
self.spade2 = BasicSPADE(norm_layer=norm_layer, input_nc=input_nc, planes=planes)
self.conv2 = nn.Conv2d(planes, planes, kernel_size=conv_kernel_size, stride=1, padding=padding)
self.spade_res = BasicSPADE(norm_layer=norm_layer, input_nc=input_nc, planes=planes)
self.conv_res = nn.Conv2d(planes, planes, kernel_size=conv_kernel_size, stride=1, padding=padding)
def forward(self, x, bound):
out = self.spade1(x, bound)
out = self.relu(out)
out = self.conv1(out)
out = self.spade2(out, bound)
out = self.relu(out)
out = self.conv2(out)
residual = x
residual = self.spade_res(residual, bound)
residual = self.relu(residual)
residual = self.conv_res(residual)
out = out + residual
return out
# Defines the SPADE generator: the appearance latent code is progressively
# upsampled while boundary heatmaps modulate every scale through SPADE
# residual blocks, with appearance-decoder skips concatenated on the way up.
class SPADEGenerator(nn.Module):
def __init__(self, input_nc, output_nc, ngf=64,
norm_layer=nn.InstanceNorm2d, latent_chl=1024, up_mode='NF'):
super(SPADEGenerator, self).__init__()
self.up_mode = up_mode
self.up1 = nn.ConvTranspose2d(in_channels=latent_chl, out_channels=512, kernel_size=4, stride=2, padding=1)
self.up2 = nn.ConvTranspose2d(in_channels=512, out_channels=512, kernel_size=4, stride=2, padding=1)
if self.up_mode == 'convT':
self.up3 = nn.ConvTranspose2d(in_channels=512, out_channels=512, kernel_size=4, stride=2, padding=1)
self.up4 = nn.ConvTranspose2d(in_channels=512, out_channels=512, kernel_size=4, stride=2, padding=1)
self.up5 = nn.ConvTranspose2d(in_channels=256, out_channels=256, kernel_size=4, stride=2, padding=1)
self.up6 = nn.ConvTranspose2d(in_channels=128, out_channels=128, kernel_size=4, stride=2, padding=1)
self.up7 = nn.ConvTranspose2d(in_channels=64, out_channels=64, kernel_size=4, stride=2, padding=1)
self.up8 = nn.ConvTranspose2d(in_channels=64, out_channels=64, kernel_size=4, stride=2, padding=1)
# self.up3 = nn.ConvTranspose2d(in_channels=512, out_channels=512, kernel_size=4, stride=2, padding=1)
# self.up4 = nn.ConvTranspose2d(in_channels=512, out_channels=512, kernel_size=4, stride=2, padding=1)
# self.up5 = nn.ConvTranspose2d(in_channels=512, out_channels=512, kernel_size=4, stride=2, padding=1)
# self.up6 = nn.ConvTranspose2d(in_channels=256, out_channels=256, kernel_size=4, stride=2, padding=1)
# self.up7 = nn.ConvTranspose2d(in_channels=128, out_channels=128, kernel_size=4, stride=2, padding=1)
# self.up8 = nn.ConvTranspose2d(in_channels=64, out_channels=64, kernel_size=4, stride=2, padding=1)
elif self.up_mode == 'NF':
self.up3 = nn.Upsample(scale_factor=2, mode='nearest')
self.up4 = nn.Upsample(scale_factor=2, mode='nearest')
self.up5 = nn.Upsample(scale_factor=2, mode='nearest')
self.up6 = nn.Upsample(scale_factor=2, mode='nearest')
self.up7 = nn.Upsample(scale_factor=2, mode='nearest')
self.up8 = nn.Upsample(scale_factor=2, mode='nearest')
self.spade_blc3 = ResBlkSPADE(norm_layer=norm_layer, input_nc=input_nc, planes=512)
self.spade_blc4 = ResBlkSPADE(norm_layer=norm_layer, input_nc=input_nc, planes=512)
self.spade_blc5 = ResBlkSPADE(norm_layer=norm_layer, input_nc=input_nc, planes=1024+512)
self.spade_blc6 = ResBlkSPADE(norm_layer=norm_layer, input_nc=input_nc, planes=512+256)
self.spade_blc7 = ResBlkSPADE(norm_layer=norm_layer, input_nc=input_nc, planes=256+128)
self.spade_blc8 = ResBlkSPADE(norm_layer=norm_layer, input_nc=input_nc, planes=128+64)
self.conv5 = nn.Conv2d(in_channels=1024+512, out_channels=256, kernel_size=1, stride=1, padding=0)
self.conv6 = nn.Conv2d(in_channels=512+256, out_channels=128, kernel_size=1, stride=1, padding=0)
self.conv7 = nn.Conv2d(in_channels=256+128, out_channels=64, kernel_size=1, stride=1, padding=0)
self.conv8 = nn.Conv2d(in_channels=128+64, out_channels=64, kernel_size=1, stride=1, padding=0)
# self.spade_blc3 = ResBlkSPADE(norm_layer=norm_layer, input_nc=input_nc, planes=512)
# self.spade_blc4 = ResBlkSPADE(norm_layer=norm_layer, input_nc=input_nc, planes=512)
# self.spade_blc5 = ResBlkSPADE(norm_layer=norm_layer, input_nc=input_nc, planes=512)
# self.spade_blc6 = ResBlkSPADE(norm_layer=norm_layer, input_nc=input_nc, planes=256)
# self.spade_blc7 = ResBlkSPADE(norm_layer=norm_layer, input_nc=input_nc, planes=128)
# self.spade_blc8 = ResBlkSPADE(norm_layer=norm_layer, input_nc=input_nc, planes=64)
#
# self.conv5 = nn.Conv2d(in_channels=1024 + 512, out_channels=512, kernel_size=1, stride=1, padding=0)
# self.conv6 = nn.Conv2d(in_channels=512 + 512, out_channels=256, kernel_size=1, stride=1, padding=0)
# self.conv7 = nn.Conv2d(in_channels=256 + 256, out_channels=128, kernel_size=1, stride=1, padding=0)
# self.conv8 = nn.Conv2d(in_channels=128 + 128, out_channels=64, kernel_size=1, stride=1, padding=0)
self.same = nn.Conv2d(in_channels=64, out_channels=3, kernel_size=3, stride=1, padding=1)
self.tanh = nn.Tanh()
def forward(self, input, latent_z, decoder_result): # input: bound, batch_size*17*256*256
bound128 = F.interpolate(input, scale_factor=0.5)
bound64 = F.interpolate(bound128, scale_factor=0.5)
bound32 = F.interpolate(bound64, scale_factor=0.5)
bound16 = F.interpolate(bound32, scale_factor=0.5)
bound8 = F.interpolate(bound16, scale_factor=0.5)
bound4 = F.interpolate(bound8, scale_factor=0.5)
x_up1 = self.up1(latent_z)
x_up2 = self.up2(x_up1)
x_up3 = self.spade_blc3(x_up2, bound4) # 4*4 bound
x_up3 = self.up3(x_up3)
x_up4 = self.spade_blc4(x_up3, bound8) # 8*8 bound
x_up4 = self.up4(x_up4)
x_up5 = self.spade_blc5(torch.cat([x_up4, decoder_result[0]], 1), bound16) # 16*16 bound
x_up5 = self.conv5(x_up5)
x_up5 = self.up5(x_up5)
x_up6 = self.spade_blc6(torch.cat([x_up5, decoder_result[1]], 1), bound32) # 32*32 bound
x_up6 = self.conv6(x_up6)
x_up6 = self.up6(x_up6)
x_up7 = self.spade_blc7(torch.cat([x_up6, decoder_result[2]], 1), bound64) # 64*64 bound
x_up7 = self.conv7(x_up7)
x_up7 = self.up7(x_up7)
x_up8 = self.spade_blc8(torch.cat([x_up7, decoder_result[3]], 1), bound128) # 128*128 bound
x_up8 = self.conv8(x_up8)
x_up8 = self.up8(x_up8)
# x_up5 = self.conv5(torch.cat([x_up4, decoder_result[0]], 1))
# x_up5 = self.spade_blc5(x_up5, bound16) # 16*16 bound
# x_up5 = self.up5(x_up5)
#
# x_up6 = self.conv6(torch.cat([x_up5, decoder_result[1]], 1))
# x_up6 = self.spade_blc6(x_up6, bound32) # 16*16 bound
# x_up6 = self.up6(x_up6)
#
# x_up7 = self.conv7(torch.cat([x_up6, decoder_result[2]], 1))
# x_up7 = self.spade_blc7(x_up7, bound64) # 16*16 bound
# x_up7 = self.up7(x_up7)
#
# x_up8 = self.conv8(torch.cat([x_up7, decoder_result[3]], 1))
# x_up8 = self.spade_blc8(x_up8, bound128) # 16*16 bound
# x_up8 = self.up8(x_up8)
x_out = self.same(x_up8)
x_out = self.tanh(x_out)
return x_out
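# Shape sketch (assuming latent_chl=1024, a 17x256x256 boundary input and a
# 1024x1x1 latent_z): up1 -> 512x2x2, up2 -> 512x4x4, then each SPADE block
# is modulated by the matching bound4/8/.../128 pyramid level while the
# decoder skips are concatenated at resolutions 16/32/64/128; after up8 the
# tensor is 64x256x256 and same/tanh maps it to a 3x256x256 image in [-1, 1].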
# # define upSample Module
# class UpSampleBlock(nn.Module):
# def __init__(self, input_nc, output_nc,
# outermost=False, innermost=False, norm_layer=nn.BatchNorm2d, use_dropout=False):
# super(UpSampleBlock, self).__init__()
#
# if type(norm_layer) == functools.partial:
# use_bias = norm_layer.func == nn.InstanceNorm2d
# else:
# use_bias = norm_layer == nn.InstanceNorm2d
#
# uprelu = nn.ReLU(True)
# upnorm = norm_layer(output_nc)
#
# if outermost:
# upconv = nn.ConvTranspose2d(input_nc, output_nc,
# kernel_size=4, stride=2,
# padding=1)
# up = [uprelu, upconv, nn.Tanh()]
#
# elif innermost:
# upconv = nn.ConvTranspose2d(input_nc, output_nc,
# kernel_size=4, stride=2,
# padding=1, bias=use_bias)
# up = [uprelu, upconv, upnorm]
#
# else:
# upconv = nn.ConvTranspose2d(input_nc, output_nc,
# kernel_size=4, stride=2,
# padding=1, bias=use_bias)
# up = [uprelu, upconv, upnorm]
# if use_dropout:
# up = up + [nn.Dropout(0.5)]
#
# self.up = nn.Sequential(*up)
#
# def forward(self, x):
# return self.up(x)
================================================
FILE: net/generator_net_concat_1Layer.py
================================================
import torch
import torch.nn as nn
import torch.nn.functional as F
import functools
import net.base_net as base_net
###############################################################################
# define spatially adaptive normalized generator
# input: boundary heatmaps and appearance latent vector
###############################################################################
def defineSPADEGenerator(input_nc, output_nc, ngf, norm='instance', use_dropout=False, init_type='normal',
init_gain=0.02, gpu_ids=[], latent_chl=1024, up_mode='NF'):
norm_layer = base_net.get_norm_layer(norm_type=norm)
net = SPADEGenerator(input_nc, output_nc, ngf,
norm_layer=norm_layer, latent_chl=latent_chl, up_mode=up_mode)
return base_net.init_net(net, init_type, init_gain, gpu_ids)
##############################################################################
# Classes
##############################################################################
class BasicSPADE(nn.Module):
def __init__(self, norm_layer, input_nc, planes):
super(BasicSPADE, self).__init__()
self.conv_weight = nn.Conv2d(input_nc, planes, kernel_size=3, stride=1, padding=1)
self.conv_bias = nn.Conv2d(input_nc, planes, kernel_size=3, stride=1, padding=1)
self.norm = norm_layer(planes, affine=False)
def forward(self, x, bound):
out = self.norm(x)
weight_norm = self.conv_weight(bound)
bias_norm = self.conv_bias(bound)
out = out * weight_norm + bias_norm
return out
class ResBlkSPADE(nn.Module):
def __init__(self, norm_layer, input_nc, planes, conv_kernel_size=1, padding=0): # todo: change conv kernel size, kernel=3, padding=1 or kernel=1, padding=0
super(ResBlkSPADE, self).__init__()
self.spade1 = BasicSPADE(norm_layer=norm_layer, input_nc=input_nc, planes=planes)
self.relu = nn.ReLU(inplace=True)
self.conv1 = nn.Conv2d(planes, planes, kernel_size=conv_kernel_size, stride=1, padding=padding)
self.spade2 = BasicSPADE(norm_layer=norm_layer, input_nc=input_nc, planes=planes)
self.conv2 = nn.Conv2d(planes, planes, kernel_size=conv_kernel_size, stride=1, padding=padding)
self.spade_res = BasicSPADE(norm_layer=norm_layer, input_nc=input_nc, planes=planes)
self.conv_res = nn.Conv2d(planes, planes, kernel_size=conv_kernel_size, stride=1, padding=padding)
def forward(self, x, bound):
out = self.spade1(x, bound)
out = self.relu(out)
out = self.conv1(out)
out = self.spade2(out, bound)
out = self.relu(out)
out = self.conv2(out)
residual = x
residual = self.spade_res(residual, bound)
residual = self.relu(residual)
residual = self.conv_res(residual)
out = out + residual
return out
# Defines the no-concat variant of the SPADE generator: the same upsampling
# tower as net/generaotr_net.py, but without the appearance-decoder skip
# connections (the partial-concat variants are kept below as comments).
class SPADEGenerator(nn.Module):
def __init__(self, input_nc, output_nc, ngf=64,
norm_layer=nn.InstanceNorm2d, latent_chl=1024, up_mode='NF'):
super(SPADEGenerator, self).__init__()
self.up_mode = up_mode
self.up1 = nn.ConvTranspose2d(in_channels=latent_chl, out_channels=512, kernel_size=4, stride=2, padding=1)
self.up2 = nn.ConvTranspose2d(in_channels=512, out_channels=512, kernel_size=4, stride=2, padding=1)
self.up3 = nn.ConvTranspose2d(in_channels=512, out_channels=512, kernel_size=4, stride=2, padding=1)
self.up4 = nn.ConvTranspose2d(in_channels=512, out_channels=512, kernel_size=4, stride=2, padding=1)
self.up5 = nn.ConvTranspose2d(in_channels=256, out_channels=256, kernel_size=4, stride=2, padding=1)
self.up6 = nn.ConvTranspose2d(in_channels=128, out_channels=128, kernel_size=4, stride=2, padding=1)
self.up7 = nn.ConvTranspose2d(in_channels=64, out_channels=64, kernel_size=4, stride=2, padding=1)
self.up8 = nn.ConvTranspose2d(in_channels=64, out_channels=64, kernel_size=4, stride=2, padding=1)
# without concat
self.spade_blc3 = ResBlkSPADE(norm_layer=norm_layer, input_nc=input_nc, planes=512)
self.spade_blc4 = ResBlkSPADE(norm_layer=norm_layer, input_nc=input_nc, planes=512)
self.spade_blc5 = ResBlkSPADE(norm_layer=norm_layer, input_nc=input_nc, planes=512)
self.spade_blc6 = ResBlkSPADE(norm_layer=norm_layer, input_nc=input_nc, planes=256)
self.spade_blc7 = ResBlkSPADE(norm_layer=norm_layer, input_nc=input_nc, planes=128)
self.spade_blc8 = ResBlkSPADE(norm_layer=norm_layer, input_nc=input_nc, planes=64)
self.conv5 = nn.Conv2d(in_channels=512, out_channels=256, kernel_size=1, stride=1, padding=0)
self.conv6 = nn.Conv2d(in_channels=256, out_channels=128, kernel_size=1, stride=1, padding=0)
self.conv7 = nn.Conv2d(in_channels=128, out_channels=64, kernel_size=1, stride=1, padding=0)
self.conv8 = nn.Conv2d(in_channels=64, out_channels=64, kernel_size=1, stride=1, padding=0)
# # only concat out16
# self.spade_blc3 = ResBlkSPADE(norm_layer=norm_layer, input_nc=input_nc, planes=512)
# self.spade_blc4 = ResBlkSPADE(norm_layer=norm_layer, input_nc=input_nc, planes=512)
# self.spade_blc5 = ResBlkSPADE(norm_layer=norm_layer, input_nc=input_nc, planes=1024+512)
# self.spade_blc6 = ResBlkSPADE(norm_layer=norm_layer, input_nc=input_nc, planes=256)
# self.spade_blc7 = ResBlkSPADE(norm_layer=norm_layer, input_nc=input_nc, planes=128)
# self.spade_blc8 = ResBlkSPADE(norm_layer=norm_layer, input_nc=input_nc, planes=64)
#
# self.conv5 = nn.Conv2d(in_channels=1024+512, out_channels=256, kernel_size=1, stride=1, padding=0)
# self.conv6 = nn.Conv2d(in_channels=256, out_channels=128, kernel_size=1, stride=1, padding=0)
# self.conv7 = nn.Conv2d(in_channels=128, out_channels=64, kernel_size=1, stride=1, padding=0)
# self.conv8 = nn.Conv2d(in_channels=64, out_channels=64, kernel_size=1, stride=1, padding=0)
# # only concat out32
# self.spade_blc3 = ResBlkSPADE(norm_layer=norm_layer, input_nc=input_nc, planes=512)
# self.spade_blc4 = ResBlkSPADE(norm_layer=norm_layer, input_nc=input_nc, planes=512)
# self.spade_blc5 = ResBlkSPADE(norm_layer=norm_layer, input_nc=input_nc, planes=512)
# self.spade_blc6 = ResBlkSPADE(norm_layer=norm_layer, input_nc=input_nc, planes=256+512)
# self.spade_blc7 = ResBlkSPADE(norm_layer=norm_layer, input_nc=input_nc, planes=128)
# self.spade_blc8 = ResBlkSPADE(norm_layer=norm_layer, input_nc=input_nc, planes=64)
#
# self.conv5 = nn.Conv2d(in_channels=512, out_channels=256, kernel_size=1, stride=1, padding=0)
# self.conv6 = nn.Conv2d(in_channels=256+512, out_channels=128, kernel_size=1, stride=1, padding=0)
# self.conv7 = nn.Conv2d(in_channels=128, out_channels=64, kernel_size=1, stride=1, padding=0)
# self.conv8 = nn.Conv2d(in_channels=64, out_channels=64, kernel_size=1, stride=1, padding=0)
#
# # only concat out64
# self.spade_blc3 = ResBlkSPADE(norm_layer=norm_layer, input_nc=input_nc, planes=512)
# self.spade_blc4 = ResBlkSPADE(norm_layer=norm_layer, input_nc=input_nc, planes=512)
# self.spade_blc5 = ResBlkSPADE(norm_layer=norm_layer, input_nc=input_nc, planes=512)
# self.spade_blc6 = ResBlkSPADE(norm_layer=norm_layer, input_nc=input_nc, planes=256)
# self.spade_blc7 = ResBlkSPADE(norm_layer=norm_layer, input_nc=input_nc, planes=128+256)
# self.spade_blc8 = ResBlkSPADE(norm_layer=norm_layer, input_nc=input_nc, planes=64)
#
# self.conv5 = nn.Conv2d(in_channels=512, out_channels=256, kernel_size=1, stride=1, padding=0)
# self.conv6 = nn.Conv2d(in_channels=256, out_channels=128, kernel_size=1, stride=1, padding=0)
# self.conv7 = nn.Conv2d(in_channels=128+256, out_channels=64, kernel_size=1, stride=1, padding=0)
# self.conv8 = nn.Conv2d(in_channels=64, out_channels=64, kernel_size=1, stride=1, padding=0)
#
# # only concat out128
# self.spade_blc3 = ResBlkSPADE(norm_layer=norm_layer, input_nc=input_nc, planes=512)
# self.spade_blc4 = ResBlkSPADE(norm_layer=norm_layer, input_nc=input_nc, planes=512)
# self.spade_blc5 = ResBlkSPADE(norm_layer=norm_layer, input_nc=input_nc, planes=512)
# self.spade_blc6 = ResBlkSPADE(norm_layer=norm_layer, input_nc=input_nc, planes=256)
# self.spade_blc7 = ResBlkSPADE(norm_layer=norm_layer, input_nc=input_nc, planes=128)
# self.spade_blc8 = ResBlkSPADE(norm_layer=norm_layer, input_nc=input_nc, planes=64+128)
#
# self.conv5 = nn.Conv2d(in_channels=512, out_channels=256, kernel_size=1, stride=1, padding=0)
# self.conv6 = nn.Conv2d(in_channels=256, out_channels=128, kernel_size=1, stride=1, padding=0)
# self.conv7 = nn.Conv2d(in_channels=128, out_channels=64, kernel_size=1, stride=1, padding=0)
# self.conv8 = nn.Conv2d(in_channels=64+128, out_channels=64, kernel_size=1, stride=1, padding=0)
self.same = nn.Conv2d(in_channels=64, out_channels=3, kernel_size=3, stride=1, padding=1)
self.tanh = nn.Tanh()
# def forward(self, input, latent_z, decoder_result): # input: bound, batch_size*17*256*256
def forward(self, input, latent_z):
bound128 = F.interpolate(input, scale_factor=0.5)
bound64 = F.interpolate(bound128, scale_factor=0.5)
bound32 = F.interpolate(bound64, scale_factor=0.5)
bound16 = F.interpolate(bound32, scale_factor=0.5)
bound8 = F.interpolate(bound16, scale_factor=0.5)
bound4 = F.interpolate(bound8, scale_factor=0.5)
x_up1 = self.up1(latent_z)
x_up2 = self.up2(x_up1)
x_up3 = self.spade_blc3(x_up2, bound4) # 4*4 bound
x_up3 = self.up3(x_up3)
x_up4 = self.spade_blc4(x_up3, bound8) # 8*8 bound
x_up4 = self.up4(x_up4)
# x_up5 = self.spade_blc5(torch.cat([x_up4, decoder_result[0]], 1), bound16) # 16*16 bound
x_up5 = self.spade_blc5(x_up4, bound16) # 16*16 bound
x_up5 = self.conv5(x_up5)
x_up5 = self.up5(x_up5)
# x_up6 = self.spade_blc6(torch.cat([x_up5, decoder_result[1]], 1), bound32) # 32*32 bound
x_up6 = self.spade_blc6(x_up5, bound32)
x_up6 = self.conv6(x_up6)
x_up6 = self.up6(x_up6)
# x_up7 = self.spade_blc7(torch.cat([x_up6, decoder_result[2]], 1), bound64) # 64*64 bound
x_up7 = self.spade_blc7(x_up6, bound64) # 64*64 bound
x_up7 = self.conv7(x_up7)
x_up7 = self.up7(x_up7)
# x_up8 = self.spade_blc8(torch.cat([x_up7, decoder_result[3]], 1), bound128) # 128*128 bound
x_up8 = self.spade_blc8(x_up7, bound128)
x_up8 = self.conv8(x_up8)
x_up8 = self.up8(x_up8)
x_out = self.same(x_up8)
x_out = self.tanh(x_out)
return x_out
================================================
FILE: net/vgg_net.py
================================================
import torch as th
from torch import nn
from torchvision.models import vgg16
import net.base_net as base_net
from utils.metric import gram_matrix
###############################################################################
# define
###############################################################################
def defineVGG(init_type='normal', init_gain=0.02, gpu_ids=[]):
net = VGGNet()
return base_net.init_net(net, init_type, init_gain, gpu_ids)
class VGGNet(nn.Module):
def __init__(self):
super(VGGNet, self).__init__()
self.net = vgg16()
vgg_path = 'pretrainModel/vgg16-397923af.pth'
self.net.load_state_dict(th.load(vgg_path))
def forward(self, x):
map_ = ["relu1_2", "relu2_2", "relu3_3", "relu4_3", "relu5_3"]
vgg_layers = self.net.features
layer_name_mapping = {
'3': "relu1_2",
'8': "relu2_2",
'15': "relu3_3",
'22': "relu4_3"
}
output = []
for name, module in vgg_layers._modules.items():
x = module(x)
#x = nn.parallel.data_parallel(module, x, range(num_gpu))
if name in layer_name_mapping:
output.append(x)
return output
def perceptual_loss(self, x, target, loss_func):
self.x_result = self.forward(x)
self.target_result = self.forward(target)
loss_ = 0
for xx, yy in zip(self.x_result, self.target_result):
loss_ += loss_func(xx, yy.detach())
return loss_
def style_loss(self, x, target, loss_func):
#x_result = self.forward(x)
#target_result = self.forward(target)
loss_ = 0
for xx, yy in zip(self.x_result, self.target_result):
loss_ += loss_func(gram_matrix(xx), gram_matrix(yy.detach()))
return loss_
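# Usage sketch (illustrative; fake_B and real_B are placeholders): note that
# style_loss reuses the activations cached by the most recent
# perceptual_loss call, so it must be called after perceptual_loss on the
# same image pair:
#   vgg = VGGNet().cuda()
#   l_percept = vgg.perceptual_loss(fake_B, real_B, nn.L1Loss())
#   l_style = vgg.style_loss(fake_B, real_B, nn.L1Loss())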
================================================
FILE: opt/__init__.py
================================================
================================================
FILE: opt/config.py
================================================
# -*- coding: utf-8 -*-
import argparse
import torch
class BaseOptions():
def __init__(self):
"""Reset the class; indicates the class hasn't been initailized"""
self.initialized = False
def initialize(self, misc_arg):
# data set
misc_arg.add_argument('--batch_size', type=int,
default=6, help='input batch size')
misc_arg.add_argument('--no_flip', action='store_true',
help='if specified, do not flip the images for data augmentation')
# net set
misc_arg.add_argument('--input_nc', type=int, default=9,
help='# of input image channels')
misc_arg.add_argument('--output_nc', type=int, default=3,
help='# of output image channels')
misc_arg.add_argument('--ngf', type=int, default=64,
help='# of gen filters in first conv layer')
misc_arg.add_argument('--ndf', type=int, default=64,
help='# of discrim filters in first conv layer')
misc_arg.add_argument('--netD', type=str, default='basic',
help='selects model to use for netD')
misc_arg.add_argument('--n_layers_D', type=int, default=3,
help='only used if netD==n_layers')
misc_arg.add_argument('--gpu_ids', type=str, default='0',
help='gpu ids, e.g. 0 or 0,1,2 or 0,2; use -1 for CPU')
# loss set
misc_arg.add_argument('--loss_percept', action='store_true',
help='include perceptual loss')
misc_arg.add_argument('--loss_faceID', action='store_true',
help='include face identity loss')
# misc_arg.add_argument('--loss_percept', type=bool, default=True,
# help='include perceptual loss')
# misc_arg.add_argument('--loss_faceID', type=bool, default=True,
# help='include face identity loss')
misc_arg.add_argument('--gan_start_epoch', type=int, default=0,
help='start to include GAN loss from which epoch')
# path and name
misc_arg.add_argument('--name', type=str, default='experiment_name',
help='name of the experiment. It decides where to store samples and models')
misc_arg.add_argument('--load_path', type=str, default='trained_model')
misc_arg.add_argument('--load_model_iter', type=str, default='latest',
help='which iteration to load; if load_model_iter > 0 the code loads models saved as iter_[load_model_iter], otherwise it loads by [epoch]')
misc_arg.add_argument('--num_threads', default=4, type=int,
help='# threads for loading data')
misc_arg.add_argument('--save_dir', type=str,
default='runs', help='output path')
# norm and dropout
misc_arg.add_argument('--norm', type=str, default='instance',
help='instance normalization or batch normalization for discriminator')
# misc_arg.add_argument('--no_dropout', action='store_true',
# help='no dropout for the generator')
# init
misc_arg.add_argument('--init_type', type=str, default='normal',
help='network initialization [normal|xavier|kaiming|orthogonal]')
misc_arg.add_argument('--init_gain', type=float, default=0.02,
help='scaling factor for normal, xavier and orthogonal.')
misc_arg.add_argument('--verbose', action='store_true',
help='if specified, print more debugging information')
# display
misc_arg.add_argument('--log_step', type=int, default=200,
help='log after n iters')
misc_arg.add_argument('--save_step', type=int,
default=200, help='save after n iters')
misc_arg.add_argument('--save_by_iter', action='store_true',
help='whether to save the model by iteration')
misc_arg.add_argument('--phase', type=str, default='test',
help='train, val, test, etc')
# optimizer
misc_arg.add_argument('--niter', type=int, default=100,
help='# of iter at starting learning rate')
misc_arg.add_argument('--niter_decay', type=int, default=100,
help='# of iter to linearly decay learning rate to zero')
misc_arg.add_argument('--beta1', type=float, default=0.5,
help='momentum term of adam')
misc_arg.add_argument('--lr', type=float, default=0.0001,
help='initial learning rate for adam')
misc_arg.add_argument('--no_lsgan', action='store_true',
help='do *not* use least-squares GAN; if set, use the vanilla (BCE) GAN loss')
misc_arg.add_argument('--pool_size', type=int, default=50,
help='the size of image buffer that stores previously generated images')
misc_arg.add_argument('--lr_policy', type=str, default='lambda',
help='learning rate policy: lambda|step|plateau|cosine')
misc_arg.add_argument('--lr_decay_iters', type=int, default=50,
help='multiply by a gamma every lr_decay_iters iterations')
self.initialized = True
return misc_arg
def get_config(self):
"""Initialize our parser with basic options(only once).
Add additional model-specific and dataset-specific options.
These options are defined in the function
in model and dataset classes.
"""
if not self.initialized: # check if it has been initialized
parser = argparse.ArgumentParser(formatter_class=argparse.ArgumentDefaultsHelpFormatter)
parser = self.initialize(parser)
# get the basic options
config, _ = parser.parse_known_args()
# set gpu ids: transform the comma-separated string into a list of ints
str_ids = config.gpu_ids.split(',')
config.gpu_ids = []
for str_id in str_ids:
id = int(str_id)
if id >= 0:
config.gpu_ids.append(id)
if len(config.gpu_ids) > 0:
torch.cuda.set_device(config.gpu_ids[0])
return config
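# Usage sketch (illustrative): because get_config() relies on
# parse_known_args(), script-specific flags (such as test.py's --pose_path)
# can coexist with this parser, e.g.
#   python test.py --batch_size 2 --gpu_ids 0 --name demo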
================================================
FILE: opt/configTrain.py
================================================
# -*- coding: utf-8 -*-
from opt.config import BaseOptions
class TrainOptions(BaseOptions):
"""This class includes training options.
It also includes shared options defined in BaseOptions.
"""
def initialize(self, misc_arg):
misc_arg = BaseOptions.initialize(self, misc_arg)
misc_arg.add_argument('--lambda_vgg', type=int, default=1)
misc_arg.add_argument('--lambda_reconstruct', type=int, default=25)
misc_arg.add_argument('--lambda_pix', type=int, default=25)
misc_arg.add_argument('--lambda_id', type=int, default=1)
misc_arg.add_argument('--lambda_gan', type=int, default=1)
self.initialized = True
return misc_arg
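# Sketch inferred from the option names (the exact combination lives in
# model/spade_model.py): the lambdas weight the generator objective roughly as
#   L_G = lambda_gan * L_GAN + lambda_pix * L_pix
#       + lambda_reconstruct * L_rec + lambda_vgg * L_percept + lambda_id * L_id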
================================================
FILE: requirements.txt
================================================
tqdm==4.32.2
opencv_python==3.4.1.15
torch==0.4.1
torchvision==0.2.1
scipy==1.0.1
matplotlib==2.2.2
numpy==1.15.0
tensorboardX==1.8
================================================
FILE: test.py
================================================
import time
import numpy as np
import cv2
import torch
import argparse
from tqdm import tqdm
from model.spade_model import SpadeModel
from opt.configTrain import TrainOptions
from loader.dataset_loader_demo import DatasetLoaderDemo
from fusion.affineFace import *
parser = argparse.ArgumentParser()
parser.add_argument('--pose_path', type=str, default='data/poseGuide/imgs', help='path to pose guide images')
parser.add_argument('--ref_path', type=str, default='data/reference/imgs', help='path to appearance/reference images')
parser.add_argument('--pose_lms', type=str, default='data/poseGuide/lms_poseGuide.out', help='path to pose guide landmark file')
parser.add_argument('--ref_lms', type=str, default='data/reference/lms_ref.out', help='path to reference landmark file')
args = parser.parse_args()
if __name__ == '__main__':
trainConfig = TrainOptions()
opt = trainConfig.get_config() # namespace of arguments
# init test dataset
dataset = DatasetLoaderDemo(gaze=(opt.input_nc == 9), imgSize=256)
root = args.pose_path # root to pose guide img
path_Appears = args.pose_lms.format(root) # root to pose guide dir&landmark
dataset.loadBounds([path_Appears], head='{}/'.format(root))
root = args.ref_path # root to reference img
path_Appears = args.ref_lms.format(root) # root to reference dir&landmark
dataset.loadAppears([path_Appears], '{}/'.format(root))
dataset.setAppearRule('sequence')
# dataloader
data_loader = torch.utils.data.DataLoader(dataset=dataset,
batch_size=opt.batch_size,
shuffle=False,
num_workers=12, drop_last=False)
print('dataset size: {}\n'.format(dataset.shape()))
# output sequence: ref1-pose1, ref1-pose2, ref1-pose3, ... ref2-pose1, ref2-pose2, ref2-pose3, ...
boundNew = []
appNew = []
for aa in dataset.appearList:
for bb in dataset.boundList:
boundNew.append(bb)
appNew.append(aa)
dataset.boundList = boundNew
dataset.appearList = appNew
model = SpadeModel(opt) # define model
model.setup(opt) # initialize schedulers (if isTrain), load pretrained models
model.set_logger(opt) # set writer to runs/test_res
model.eval()
iter_start_time = time.time()
cnt = 1
with torch.no_grad():
for step, data in tqdm(enumerate(data_loader)):
model.set_input(data) # set device for data
model.forward()
# fusionNet
for i in range(data['img_src'].shape[0]):
img_gen = model.fake_B.cpu().numpy()[i].transpose(1, 2, 0)
img_gen = (img_gen * 0.5 + 0.5) * 255.0
img_gen = img_gen.astype(np.uint8)
img_gen = dataset.gammaTrans(img_gen, 2.0) # model output image, 256*256*3
# cv2.imwrite('output_noFusion/{}.jpg'.format(cnt), img_gen)
lms_gen = data['pt_dst'].cpu().numpy()[i] / 255.0 # [146, 2]
img_ref = data['img_src_np'].cpu().numpy()[i]
lms_ref = data['pt_src'].cpu().numpy()[i] / 255.0
lms_ref_parts, img_ref_parts = affineface_parts(img_ref, lms_ref, lms_gen)
# fusion
fuse_parts, seg_ref_parts, seg_gen = fusion(img_ref_parts, lms_ref_parts, img_gen, lms_gen, 0.1)
fuse_eye, mask_eye, img_eye = lightEye(img_ref, lms_ref, fuse_parts, lms_gen, 0.1)
# res = np.hstack([img_ref, img_pose, img_gen, fuse_eye])
cv2.imwrite('output/{}.jpg'.format(cnt), fuse_eye)
cnt += 1
iter_end_time = time.time()
print('length of dataset:', len(dataset))
print('time per img: ', (iter_end_time - iter_start_time) / len(dataset))
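# Usage sketch (illustrative): with the pretrained models in place, e.g.
#   python test.py --batch_size 6 --gpu_ids 0 \
#       --pose_path data/poseGuide/imgs --ref_path data/reference/imgs
# writes one fused image per (reference, pose) pair to output/<n>.jpg in the
# order ref1-pose1, ref1-pose2, ...; the output/ directory must already
# exist, since cv2.imwrite does not create it.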
================================================
FILE: utils/__init__.py
================================================
#coding:utf-8
================================================
FILE: utils/affineFace.py
================================================
from utils.points2heatmap import *
from utils.calcAffine import *
def affineface(img, src_pt, dst_pt, heatmapSize=256):
# naive mode
curves_src, _ = points2curves(src_pt)
pts_fivesense_src = np.vstack(curves_src[1:])
curves_dst, _ = points2curves(dst_pt)
pts_fivesense_dst = np.vstack(curves_dst[1:])
affine_mat = calAffine(pts_fivesense_src, pts_fivesense_dst)
pt_aligned = affinePts(affine_mat, src_pt)
img_aligned = affineImg(img, affine_mat)
return img_aligned, pt_aligned
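# Usage sketch (illustrative; arguments are placeholders): align a reference
# face so its inner-face curves match the target's layout (curve 0,
# presumably the face contour, is excluded, hence curves[1:] above):
#   img_aligned, pt_aligned = affineface(img_ref, lms_ref, lms_target)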
if __name__ == '__main__':
pass
================================================
FILE: utils/affine_util.py
================================================
from __future__ import print_function
import torch
import numpy as np
import inspect
import re
import os
import collections
import cv2
# 2D affine helpers for warping images and landmark arrays
def th_affine2d(x, matrix, output_img_width, output_img_height, center=True, is_landmarks=False):
"""
2D affine transform applied to an image (via cv2.warpAffine) or to a set of landmarks
"""
assert(matrix.ndim == 2)
matrix = matrix[:2, :]
transform_matrix = matrix
src = x
if is_landmarks:
dst = np.empty((x.shape[0], 2), dtype=np.float32)
for i in range(src.shape[0]):
dst[i, :] = AffinePoint(np.expand_dims(
src[i, :], axis=0), transform_matrix)
else:
# cols, rows, channels = src.shape
dst = cv2.warpAffine(src, transform_matrix, (output_img_width, output_img_height),
flags=cv2.INTER_AREA, borderMode=cv2.BORDER_CONSTANT, borderValue=(0, 0, 0))
# for gray image
if dst.ndim == 2:
dst = np.expand_dims(np.asarray(dst), axis=2)
return dst
def AffinePoint(point, affine_mat):
"""
Affine 2d point
"""
assert(affine_mat.shape[0] == 2)
assert(affine_mat.shape[1] == 3)
assert(point.shape[1] == 2)
point_x = point[0, 0]
point_y = point[0, 1]
result = np.empty((1, 2), dtype=np.float32)
result[0, 0] = affine_mat[0, 0] * point_x + \
affine_mat[0, 1] * point_y + \
affine_mat[0, 2]
result[0, 1] = affine_mat[1, 0] * point_x + \
affine_mat[1, 1] * point_y + \
affine_mat[1, 2]
return result
def exchange_landmarks(input_tf, corr_list):
"""
Exchange value of pair of landmarks
"""
#print(corr_list.shape)
for i in range(corr_list.shape[0]):
temp = input_tf[corr_list[i][0], :].copy()
input_tf[corr_list[i][0], :] = input_tf[corr_list[i][1], :]
input_tf[corr_list[i][1], :] = temp
return input_tf
================================================
FILE: utils/calcAffine.py
================================================
# -*- coding: utf-8 -*-
"""
Created on Fri Dec 29 13:43:03 2017
"""
import numpy as np
import os, sys, shutil
import cv2
import matplotlib.pyplot as plt
from tqdm import tqdm
# fit a 2D similarity transform (rotation + uniform scale + translation) via least squares
# src_p[input] -- np.array([[x,y],...])
# dst_p[input] -- np.array([[x,y],...])
# affine_mat[output] -- np.array() | 2x3 affine matrix
def calAffine(src_p, dst_p):
p_N = len(src_p)
U = np.mat(list(dst_p[:, 0]) + list(dst_p[:, 1]))
xx_src, yy_src = list(src_p[:, 0]), list(src_p[:, 1])
X = np.mat(np.stack([xx_src + yy_src, yy_src + [-ii for ii in xx_src], [1 for ii in range(p_N)] + [0 for ii in range(p_N)], [0 for ii in range(p_N)] + [1 for ii in range(p_N)]], axis=1))
result = np.linalg.pinv(X) * U.T
affine_mat = np.zeros([2, 3])
affine_mat[0][0] = result[0][0]
affine_mat[0][1] = result[1][0]
affine_mat[0][2] = result[2][0]
affine_mat[1][0] = -result[1][0]
affine_mat[1][1] = result[0][0]
affine_mat[1][2] = result[3][0]
return affine_mat
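# The solved parametrization [a, b, tx, ty] is a similarity transform
# (rotation + uniform scale + translation, no shear):
#   x' =  a*x + b*y + tx
#   y' = -b*x + a*y + ty
# Stacking all points gives U = X @ [a, b, tx, ty]^T, which pinv(X) solves
# in the least-squares sense; the rows of X above encode exactly these two
# equations.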
def affinePts(affine_mat, pt):
src_align = pt.T
new_align = np.mat(affine_mat[:2, :2]) * np.mat(src_align) + np.reshape(affine_mat[:, 2], (-1, 1))
pt_align = np.array(np.reshape(new_align.T, -1))[0].reshape(-1, 2)
return pt_align
# warp an image with a 2x3 affine matrix
# img[input] -- np.array()
# TransMat[input] -- np.array() | 2x3 affine matrix (e.g. from calAffine)
# img_align[output] -- np.array() | aligned dsize x dsize image
def affineImg(img, TransMat, dsize=256):
img_align = cv2.warpAffine(img, TransMat, (dsize, dsize), borderValue=(155, 155, 155))
return img_align
# if __name__ == '__main__':
# path_src = '/media/heyue/8d1c3fac-68d3-4428-af91-bc478fbdd541/Project/Face2Face/detectface/samples/common/output/landmarks.txt'
# output_pt = 'lms/lms.txt'
# output_img = 'imgs'
# affineList(path_src, output_pt, output_img, 'meanpose384.txt', k=2,
# head='/media/heyue/8d1c3fac-68d3-4428-af91-bc478fbdd541/Project/Face2Face/Data/test')
#
# '''
# path_src = 'alignedPoints_256.txt'
# output_pt = 'output/AU_points.txt'
# output_img = 'output/AU'
# head = '/media/heyue/8d1c3fac-68d3-4428-af91-bc478fbdd541/Project/Face2Face/net/GANimation/dataset_emo'
# affineList(path_src,output_pt,output_img,'meanpose384.txt',k=2,head = head)
# print('done')
# '''
================================================
FILE: utils/lms.test
================================================
9/19256.png
59.2707332221 140.50139565 60.7953501505 155.840424728 63.0589911357 170.835862842 65.4653911442 185.631580966 68.932073428 200.297533329 73.1550212248 214.977364271 78.3881163686 228.991811878 85.782502989 242.139749636 94.4189019168 254.626618041 104.924653651 265.730908768 117.005647331 275.139883829 129.815078926 283.476031021 143.37127461 290.826952689 157.705309344 295.664839878 172.967870215 299.278662395 188.142331646 301.003628508 203.391112954 299.106629196 214.045289167 294.2596111 222.110408022 286.9237645 227.824291319 277.536297759 232.693593582 267.177742414 236.99960223 257.480352842 241.334591417 246.280666171 245.480590158 235.443775065 247.893018636 225.0246153 250.039055401 213.309511006 252.46659434 202.406234293 254.773921111 191.514755894 256.225473022 180.407283758 256.372423739 168.78606124 256.828096529 158.1029349 256.357407963 146.970661227 256.985755887 136.084872502 114.586808868 135.321729885 132.74553607 124.70268833 152.040049834 122.438303213 170.91265272 124.446420684 189.598270691 128.753271858 218.845519791 130.545330844 228.999942361 125.463643143 239.492576365 121.450320408 249.411474877 118.880680476 258.63657448 126.027651995 205.708653694 147.024191699 209.353867509 164.673407501 213.07951614 182.28544999 216.65254958 199.749653618 185.393391977 212.936340413 196.392694475 214.773223325 208.047812188 215.409272233 216.617837427 213.903127509 224.635758048 208.808564363 131.858618758 149.400812206 141.803898017 141.837268135 165.392099346 141.877548881 173.496075357 150.424830919 163.490303701 155.657311596 141.529345276 154.72979133 218.371201941 148.56854991 223.046388058 140.281171009 241.050909426 136.381149039 248.061006408 142.790943892 242.89240324 149.838912486 225.871684848 151.684002643 133.124127304 132.692006387 152.894224179 131.235446775 171.782386626 133.906632401 189.462860167 136.799074227 219.621570411 137.486735175 230.071839672 132.992997019 239.766994382 129.429999215 249.825871997 127.146855576 153.647803621 138.716963931 152.532061597 156.828066976 153.08423455 148.835235305 231.661808615 135.504713658 234.498881293 151.922103669 233.452793735 144.753889108 189.998753616 148.65085426 211.829934016 147.614811963 183.50083607 192.236196887 224.472572542 190.084188921 177.735356983 205.116053997 229.367980432 201.176585545 165.681562295 246.508253322 182.299302633 238.905006917 200.693193025 234.064595888 208.664367869 234.20392116 215.735333293 232.49493141 222.794397869 233.948770072 228.546120971 240.699878289 223.70320508 247.896904973 216.815692657 252.697373346 207.576602906 255.753809513 193.485770695 256.258864646 178.869826197 252.538349197 169.738702535 246.227459702 188.733957672 243.310082701 207.91329263 243.195118021 217.199482234 241.363744993 225.538937849 241.024455906 216.417664974 241.68074608 207.352268122 242.826354474 188.391096125 244.116216393 153.102035611 148.834301512 233.454847703 144.753781362 153.725930258 145.114166383 153.084286376 147.501284993 151.563270626 149.641380309 149.466570187 151.172505503 146.994850192 151.941333886 144.421277117 151.998757233 142.008750244 151.173766886 139.965573287 149.524089837 138.660797851 147.318713561 138.066185917 145.105605337 137.981505574 144.276587636 138.369191154 142.03460668 139.460180447 139.654124445 141.345427985 137.9398006 143.643632066 136.821013852 146.278905487 136.500155145 148.780330685 136.899632999 151.03364833 138.197076184 152.669510423 140.170197508 153.563862259 142.548142291 221.73503399 142.544143402 222.324681326 144.296157045 
223.747033131 146.727416182 225.871748815 148.622716569 228.565630813 149.654700302 231.454922255 149.674084725 234.115583168 148.620042416 236.265651641 146.666641629 237.611757594 144.110753208 238.165864921 141.363673677 238.111969948 139.014414155 237.364507682 136.34796902 235.853386482 133.808087344 233.553001096 132.136889175 230.82140856 131.419945958 227.864313086 131.743955739 225.252816986 132.923802015 223.246766375 134.939675132 222.010432246 137.428703155 221.574681609 139.64437078
9/18313.png
106.48914616 132.672242138 107.415715889 145.711047527 108.089733661 158.217376145 109.097645413 170.943788737 111.49622388 183.266693209 114.497992701 195.61203933 117.4158239 207.928001363 121.647678133 219.699144661 125.653509329 231.889662553 130.676008838 243.633838019 136.936671504 254.620283763 143.778754614 265.558231432 150.979034341 276.034626287 158.980762336 285.070427474 168.636700268 293.83382138 179.453387051 300.075893109 192.162578599 302.277568977 206.841866015 301.3882526 220.759331914 297.429317905 233.685173048 290.916029164 246.050769856 282.338019796 256.605411476 272.805536915 266.830653498 261.502662583 276.33434463 250.115474513 284.151631088 238.342934186 289.489777878 224.304358509 293.74386166 210.381942125 297.746770812 196.27342125 300.729988231 181.733988216 302.378821257 167.174265776 304.249079801 152.938836638 304.658511111 138.276047208 305.071288243 123.954987635 112.847168039 131.899352655 125.202342376 123.503510827 138.703473483 124.821782401 152.072583536 127.758609354 165.981191284 131.636220127 203.018679197 126.642964252 219.322462372 119.756744827 236.917121769 114.683829451 254.534167436 112.995767178 270.818333052 122.148798935 182.686748366 149.842311592 181.890113083 168.336652534 180.591242808 186.82105254 179.082507225 205.369823936 164.673462438 213.922283489 173.873411976 217.958740668 183.403257567 219.123765839 194.323037043 216.449363124 205.249015343 213.573475192 128.488183749 151.31398383 136.87693746 147.744556713 155.183765167 148.024973775 163.032068502 152.258383687 154.814616274 154.89594445 137.053182665 154.737244799 213.122829346 150.005593506 220.596053083 142.584600921 242.236942333 140.518517549 252.98372581 143.783167642 243.973767857 150.058514219 222.740135367 151.989105048 124.883888821 133.189669134 138.785197683 134.127358206 152.084180916 136.918700167 165.205495626 139.182662383 203.899336715 134.347678637 220.240167696 129.001630836 236.932956357 124.692478267 254.349869402 122.707408406 145.810942286 146.553356026 145.984092888 155.815542063 145.753005675 151.444313936 231.248896524 139.688929543 233.28793687 152.067511801 232.949303358 146.533350971 173.385300061 150.939051443 197.712775443 149.666354053 164.406491946 193.347326232 205.614876812 192.808966763 159.113212686 205.590542538 211.774756737 204.963119795 159.334247959 246.17033926 166.361691723 240.000440723 175.891471738 237.031314863 183.560816212 238.085511132 191.568983323 236.614927606 206.898009026 239.205955284 223.0520094 245.333374235 211.611766532 253.858648396 199.353988616 259.518893372 185.877344433 261.787193023 175.724784804 260.756856459 166.151721142 254.625118803 163.100185194 246.472029038 173.701483102 246.022122151 184.69960444 247.829447571 201.401870894 245.822144351 218.333681184 245.686391746 201.318101332 246.444242567 185.088446002 247.508323029 173.863314132 246.861554284 145.771761246 151.443138211 232.961306923 146.532598507 152.459077151 150.360388735 151.880913147 152.034778676 150.687841843 153.463769705 149.114967358 154.396881419 147.323728686 154.826253251 145.522226445 154.779511285 143.873800922 154.134946958 142.489117828 152.927800571 141.65883591 151.36128393 141.311614635 149.823233132 141.344734109 149.402393953 141.743326134 147.807046173 142.598382835 146.188356315 144.025360235 145.126560133 145.65737248 144.500047682 147.49907096 144.358989834 149.212416454 144.633443477 150.760126283 145.545671647 151.860918363 146.88584938 152.480365294 148.597387485 227.924359649 143.328780536 228.776704934 
145.535615715 230.567435638 147.669316343 232.937918279 149.103130649 235.667134951 149.687475432 238.451677638 149.496351824 240.994527509 148.41732722 243.08501528 146.477918036 244.344520333 143.994089069 244.817729155 141.474442122 244.82921745 140.215926534 244.110093791 137.803409348 242.600089712 135.41410888 240.322193509 133.871306219 237.694357661 133.065952494 234.826894059 133.147792164 232.21192646 133.96546826 229.994982439 135.677815247 228.497249141 138.001842309 227.831235297 140.545626305
9/19869.png
110.245464177 148.829301092 109.958189408 161.084203848 109.730843772 173.034262362 109.703148885 184.953274131 110.916384523 196.671940639 113.143185036 208.720416614 115.752609491 220.390328909 119.946544729 231.346135802 124.347461589 242.609061802 129.981212104 253.288024557 136.91716085 263.271156305 144.070928045 273.036081186 151.492454317 282.448806427 159.563820795 290.668665906 169.184807464 298.297906136 179.867381002 303.752905258 192.143799331 305.526843759 206.24083174 304.108790136 219.470034352 300.325274358 232.277552849 295.280395413 245.211004896 288.568311956 256.672340579 281.003872168 267.540954022 271.907935343 277.918025824 262.282467807 286.591136251 252.298826524 292.950527309 239.504488134 297.274798901 226.666050766 300.799673183 213.35461447 302.804299458 199.763470165 304.156516395 185.843153816 305.621658405 172.364673163 305.978605954 158.397528001 306.344036754 144.847427091 115.158408801 130.593369953 125.425441538 119.748865704 137.4032323 118.079374989 149.884781878 119.656907731 162.719434597 123.50112244 201.00370918 120.63808576 216.996071381 115.407738672 234.637367238 113.135471059 251.870550227 115.413017023 267.731011378 126.321561511 180.770483915 147.683802573 179.679613463 163.421609287 178.059323268 179.209566407 176.610655732 195.201430783 162.33964871 207.993385361 171.860432863 211.177135312 181.665372 211.519677529 193.687845484 207.812373844 205.711453541 205.052011661 127.2937918 151.246436155 135.399786431 143.656834038 156.388568435 144.762517163 163.54204846 152.166695511 154.903900711 155.65213948 135.779192622 155.485417804 210.936260237 150.107521268 218.781286879 141.369186129 242.343032089 139.727266587 252.365699653 145.971512562 243.113231495 151.393710048 221.003683915 152.36716455 125.458452893 127.680096213 137.771680259 126.615953447 150.045906506 128.786541335 162.193643974 131.462699002 201.756171426 128.96726159 218.000932428 124.592997073 234.776869397 122.025030265 251.804959575 123.101921725 145.938906608 141.262334928 145.436329759 156.675274923 145.53128924 150.086237403 230.399095311 137.532877223 232.004505762 152.798354303 231.712248934 146.590700504 172.575974279 149.75751461 195.776529787 148.601153999 162.457116329 186.948189109 204.796838533 184.870006129 156.005636829 199.960778122 213.203829315 196.137553083 151.395921113 233.946685725 162.837917004 228.078855349 177.219584857 225.339022655 184.891009804 225.126288288 192.518319946 224.257546259 215.058344801 224.357921954 237.677210995 227.443201326 224.046897741 243.328721249 207.601315789 255.323932135 187.126877986 260.766652313 171.94667495 258.166824733 159.785582063 247.588910705 155.872394208 235.230526391 170.417260005 232.66075966 185.720946114 232.710684618 209.613279863 229.613787876 232.91840261 229.14295881 210.711825919 243.101625664 186.32442677 247.400546622 169.584992156 244.641951312 145.545987519 150.085773429 231.731645027 146.590757006 154.172787887 146.897621223 153.402829241 149.263956198 151.804805735 151.350802737 149.604541104 152.769833872 147.037637941 153.421322286 144.426923571 153.320490677 142.024766445 152.369088795 140.030218457 150.623028808 138.784199077 148.331700328 138.268655644 146.046070342 138.23486678 145.229900865 138.734284085 143.000509643 139.933199842 140.664103987 141.918387227 139.020467918 144.287662217 138.061565141 146.952257375 137.877806511 149.463285489 138.401103214 151.696301445 139.799232027 153.302375298 141.844456817 154.131819591 144.307433446 224.682316483 143.905470924 225.62914445 
146.252388917 227.410255457 148.207154693 229.696735598 149.455797814 232.273203237 149.933558815 234.846022687 149.702539529 237.165950716 148.642203992 239.048083799 146.805871587 240.133154552 144.485766239 240.487980555 142.230844643 240.449351438 141.557944585 239.807654626 139.385974986 238.475737586 137.148853232 236.415543577 135.653647638 234.027570233 134.791413109 231.37351349 134.694624745 228.895926096 135.273072995 226.731815861 136.735770404 225.246307919 138.827759218 224.554754566 141.347442455
9/18824.png
69.4807351723 151.801376805 70.2700786681 166.669154882 71.8143754474 181.241441228 73.4269978983 195.55873371 76.2527942641 209.799132528 80.0043269856 223.968911671 84.8441464888 237.466449121 91.8781325344 250.004923335 100.188678082 261.797310386 110.160991578 272.024613754 121.769699443 280.811142146 133.89600159 288.788219684 146.713776859 295.961609825 160.525106331 300.493339418 175.307859426 304.226228537 189.91308948 306.171458044 204.221558442 303.956575289 213.323620951 298.455847863 219.22790111 290.601383296 223.374967269 281.460247035 227.643970704 271.693185861 231.593874444 262.524965878 235.968566423 252.060185303 240.35214401 242.119502116 243.165813416 232.687239338 245.348476302 221.666267774 247.349904237 211.386183925 249.018476926 201.06432608 249.498083565 190.297317376 247.903824721 179.234798687 246.743679457 169.20659787 244.739155353 158.487321574 244.238579347 148.053784737 125.096213735 116.720124836 140.081668454 105.896495598 155.067981074 103.540847004 169.68936401 104.447227641 184.639339075 108.401194107 214.707007946 113.958461967 221.008128327 112.470160527 227.603550136 112.180608215 233.442581862 113.653703605 239.327521598 122.446098298 204.111672709 147.838110575 209.606371136 161.160861426 215.466825351 174.494745106 221.161057828 187.997954986 186.465877545 212.474209367 199.336571231 212.283758816 212.428688328 211.637724313 221.738915823 210.458519637 229.935808671 205.838826967 137.223800922 144.126809682 146.829807002 138.094431327 168.772421818 140.051422374 175.774119466 149.162405184 165.696483193 152.449315441 145.496437668 149.459778139 217.567376811 152.587047705 221.540459568 146.267503436 236.241405671 144.799717822 240.913939855 150.633364888 236.385024141 155.688155581 223.052157322 156.054708474 140.112800115 114.549578725 155.80052298 113.380681874 170.537003696 115.706630278 184.613403176 118.915844355 215.310954826 122.387534635 221.804408493 121.319734116 227.354584946 121.143225458 233.528637123 121.744218044 157.929718299 136.172247856 155.425268244 152.612596014 156.91544926 145.444983919 228.826987893 143.334772928 229.701924243 157.00342149 229.595553687 150.854403745 190.246896155 148.296107106 210.582852603 149.946050881 185.962823549 186.981646229 227.617763787 184.383302305 179.027541323 202.195566143 233.168787548 196.85038802 167.145586134 246.648869672 186.555274252 239.135192231 206.694950861 232.161149718 213.87626209 232.684044523 219.856638543 232.038303182 225.499612324 236.286903068 228.035337953 243.14564591 224.196178584 250.672063621 218.978127782 255.970407105 211.339631337 259.487898436 196.188639049 258.621431544 180.748136138 253.921478848 171.60830814 246.888249998 191.93566201 244.827638362 212.028395822 245.149003446 219.127299427 244.12759024 225.165272106 243.994396347 218.490940622 244.586956248 211.694394091 244.782245347 191.750990766 245.612113479 156.93274299 145.44396392 229.596218831 150.854364514 172.474556654 140.488510493 171.820439665 142.9779942 170.040697724 147.2392217 167.0493125 150.908460333 163.018419099 153.35081736 158.331348966 154.128525109 153.685226026 153.042946702 149.706897066 150.356131081 146.958625721 146.42337564 145.544828545 141.917325918 145.224489794 137.357462427 146.117691929 132.765403269 148.299332118 128.449100982 151.792609744 125.390351905 156.169352453 123.7645879 161.063809693 123.843677741 165.569437351 125.47818312 169.11164603 128.556488098 171.448275582 132.537550978 172.309353356 135.673205226 214.77651454 149.516489364 215.404518082 151.1674488 216.878394689 153.806414101 219.112808663 155.929964853 221.952639081 157.137431413 225.047389323 157.167151907 227.936031163 156.010942924 230.236804261 153.895562952 231.644607371 151.113999539 232.164982597 148.077079527 231.976165067 145.124705563 231.029750895 142.228362052 229.293492133 139.588185835 226.781661447 137.898825644 223.817967893 137.261998628 220.661911894 137.765855125 217.901300241 139.2255479 215.911027798 141.546879951 214.770808519 144.315998334 214.48742804 146.391818237
9/19595.png
74.4244224358 133.664743499 75.7215088889 148.601916915 77.5045565718 163.196653964 79.372922785 177.746965257 82.0947246067 192.140287097 85.1699660466 206.698737736 88.6776595779 221.007755836 94.1089962315 234.738807963 100.841536719 248.141829955 109.626610062 260.195864335 120.212759837 270.885950946 131.437117639 280.639295749 143.538643841 289.344346602 156.635847135 295.887845075 170.876182508 301.3601746 185.332310332 304.391321099 200.486917729 304.356448861 213.185886687 300.626640372 224.063752659 294.420254316 232.957510443 285.2431258 240.855873541 274.566582542 247.298216757 263.979042936 253.443482862 251.953029683 259.442473639 240.368286023 263.532753707 228.861978904 266.332035021 215.791926375 268.418502645 203.429471171 270.485203653 190.973954974 272.023174831 178.485633389 272.534792985 165.514580256 273.071758094 153.191967367 272.331694216 140.52304056 272.151693699 128.190352571 110.303971361 137.062857659 126.409594615 126.71107324 143.870491914 125.333281857 161.272057564 127.566410872 178.541846833 131.663706289 214.532898112 131.261295824 228.119166808 125.812189923 242.347597053 121.660183258 256.199047753 119.927602512 268.12735852 128.136923585 198.646036779 146.974272846 201.163338592 166.444477235 203.403248911 186.132748885 205.394274456 205.564629019 178.846316127 213.16832158 190.005644421 216.508481686 201.593162862 218.767305983 210.562510683 215.610155382 219.750087388 210.43823679 130.382465566 148.780554258 140.671786435 143.040899546 161.959339805 142.842848778 169.837128152 149.631126133 160.611529936 152.593054854 140.500967755 152.198327072 216.285222655 147.351955144 222.526275362 140.047176789 241.920904494 137.473142137 251.133152725 142.102756151 242.984825721 147.31136935 224.272999256 148.560671699 126.614115503 135.246504733 144.40996987 134.085376593 161.760119924 136.556068166 178.108201583 139.09450361 215.24301508 138.022410922 229.031723194 133.594276038 242.345269617 130.225272566 256.028206416 128.458423855 151.346459205 140.686512334 150.565711107 153.476446626 150.574161738 148.042763817 231.972476668 136.613570114 233.484430776 148.655362906 233.628068292 143.788081302 184.701490912 148.204111106 207.32031875 146.925008419 177.405145437 192.669867461 219.294310999 191.203743729 171.303442528 204.762176519 225.387123161 202.379446638 155.321585787 240.472759857 173.410240067 239.224676358 192.392173918 238.744526493 200.378743862 238.942749996 208.05991076 236.743701944 219.250296194 236.788541255 231.401795941 236.95067661 222.963494325 246.363338758 212.79694842 253.722414247 200.062823497 257.332821485 184.178010836 256.469342495 168.672586426 249.706529102 159.94236505 241.828740835 179.914510128 243.970996493 200.120400264 246.304086892 213.956847516 242.720078847 227.168608703 238.686498525 213.556908547 243.142432657 199.828198637 245.971925079 179.588351937 244.872634313 150.591700421 148.041968548 233.629448009 143.788694692 157.052391021 144.000865004 156.725924638 145.193851973 155.537185169 148.04686971 153.535098909 150.433859528 150.784096514 151.986885995 147.641126014 152.587695073 144.480942824 152.182792582 141.550740224 150.79747126 139.311047603 148.540631736 137.974005847 145.767668669 137.51970161 143.157464544 137.892553227 140.080524813 139.207082419 137.094724392 141.521431953 134.960576324 144.416661598 133.741013759 147.6975064 133.527356833 150.798184153 134.125575894 153.556505935 135.700521927 155.619484955 138.040404932 156.704023739 140.834956686 224.447863221 141.639108912 225.19253752 143.676458598 226.759139195 145.668427655 228.889131066 147.026494014 231.377366851 147.638964412 233.92721994 147.538215636 236.275745596 146.619906878 238.22486648 144.898427306 239.432089843 142.646844392 239.911467719 140.339039045 239.942716151 139.194587831 239.334721309 136.981376723 238.022392403 134.768332983 235.982589614 133.297348897 233.603989522 132.508891438 230.983595582 132.499954865 228.565585496 133.182803358 226.493655688 134.700306517 225.065470119 136.789540991 224.407701278 139.098675723
9/19022.png
105.853911786 150.688842749 106.304755687 163.478940126 107.058193774 175.858800554 108.093608467 188.254071331 110.229936105 200.396513414 112.766738097 212.860480821 115.210241137 225.029971473 118.716371193 236.735315629 122.300263461 248.923973988 127.100442957 260.508494302 133.61804031 271.318109138 140.868919038 281.410912745 149.169107994 290.565297806 158.540567204 297.753250206 169.432213007 304.124802445 180.946535516 308.6148877 193.542154585 309.948068083 207.549631581 308.505301114 220.76968111 305.208204526 233.451038713 300.090467454 245.975225286 292.97329979 256.672108338 284.777100812 266.381493607 274.504398947 275.362423646 263.545693043 282.061802271 252.456912497 286.854969624 238.833684208 289.928046451 225.49834043 292.790172123 212.034993583 294.366884196 198.369996927 295.474931021 184.364889576 296.831564829 170.878509905 296.903260249 156.938634422 297.266115108 143.399566056 116.304739438 132.882383554 127.912101677 120.314053712 141.71663141 117.955114229 155.815006556 119.31210216 170.258223912 123.024928223 199.684637675 121.269258893 215.400914864 115.767697653 232.299514693 113.01896136 248.852728629 114.843558631 264.054600974 126.807430409 185.04317789 141.528634167 185.183521016 157.61319909 184.738888808 173.74555895 184.290973012 189.749690102 168.237920197 204.288286214 177.145813987 206.714088635 186.403867424 207.14750767 196.980990698 204.61604892 207.730063248 202.267617824 132.39440308 145.597696654 140.744441652 137.702594423 160.522211282 138.494458912 167.145615638 145.593700621 159.357529903 149.313758229 141.099701459 149.300662591 209.918812446 143.450660882 216.583980293 134.931619486 237.958407537 133.411665142 247.353427929 140.202166925 238.825297548 145.378180406 218.771264736 146.113453826 128.244438385 128.968131144 142.40245084 127.061883512 156.396350369 128.854706759 170.03519427 131.101479494 200.311398144 129.538880662 216.186949721 125.32534772 232.331881862 122.835928446 248.788032596 123.68794603 150.760209549 135.090558168 150.280482337 150.424354998 150.065194766 143.988975242 227.144313534 131.12358597 228.71299771 146.634009031 228.667597337 140.35964247 176.638400006 143.360093585 197.331703788 142.19173124 168.345583532 183.783124179 207.410676297 182.164130291 162.476447813 196.749745738 213.989893988 193.982610498 158.243354322 242.16225384 166.597125534 231.693527273 179.101008482 226.682022275 187.224280453 227.232709673 195.451885443 225.917992458 210.86120123 229.524986825 225.09774603 239.637971883 214.81208181 249.816069769 202.589610148 256.6727466 188.389553595 259.804956041 176.405821916 258.802618562 165.482536053 252.01117807 161.314320622 241.739365482 173.479273129 235.07489689 187.42068506 234.461425635 204.747234017 234.212221293 221.29974001 239.398270593 205.17836789 245.854164848 188.109749171 248.6157591 173.864728577 247.179831606 150.082426937 143.988441253 228.682761647 140.35917256 161.280066397 138.860105662 161.07098251 139.415783897 160.109143918 142.4199581 158.282766052 145.044542752 155.599418544 146.843596422 152.426789823 147.570452718 149.177360876 147.26614573 146.139145028 146.007304175 143.726931864 143.820401682 142.245033322 140.974057598 141.717994561 137.837323827 142.07024103 134.620906772 143.365123516 131.565705272 145.677808647 129.360718675 148.628181238 128.163305121 151.962385722 127.997199873 155.123444948 128.67819745 157.887808842 130.317407447 159.944511568 132.720931168 160.98639282 135.639556093 220.549012597 137.867294167 221.362117279 140.127207891 223.248967977 142.482178967 225.797227766 144.094278085 228.778872635 144.885707534 231.874281312 144.895349546 234.71252579 143.909085163 237.047996569 141.882779151 238.496647026 139.180896086 239.074888831 136.22172627 238.876118418 133.503278306 237.700859522 130.755754057 235.615071044 128.460073692 232.815525094 127.177277495 229.788044255 126.710164399 226.687095102 127.211946065 224.048886248 128.552854942 222.092566418 130.813723559 220.85843946 133.491710279 220.414261706 134.810167405
9/19916.png
95.2767910751 126.809887821 96.1610041532 140.668978114 97.1712750455 154.117666679 98.4616574186 167.729488962 100.946784689 181.011968182 103.82395298 194.399018442 106.804852556 207.638992554 111.326188645 220.272837509 116.288335188 233.153280591 122.68213832 245.247516364 130.50029673 256.423697433 138.752939865 267.349674917 147.384651295 277.874381193 156.911089211 286.907533908 168.162598012 295.261146595 180.504350865 300.860278984 194.363685923 302.529834898 209.025394305 300.909261294 222.570543927 296.277509068 234.922790029 288.92073999 246.79259185 279.454876017 256.627390857 269.182875567 265.980324209 257.221358307 274.684298266 245.28327563 281.807384886 233.053646155 286.57573427 218.81150233 289.923920006 204.836247919 292.906251951 190.753547444 295.102180642 176.380123942 296.321788426 161.903134424 297.759033857 147.813987976 297.945669692 133.238594198 298.320027286 119.04763484 108.261737636 134.004490065 122.09696225 125.843724022 137.160102923 127.177999277 152.081053372 130.451708983 167.373611453 134.708808776 207.540427842 133.687352451 223.633872672 128.039538618 240.641058617 123.614570224 257.345820876 121.735653215 272.721029723 129.621207649 186.223892518 151.843003149 186.594737554 171.41424043 186.656708261 191.009175528 186.546199842 210.555770115 168.448512052 215.80480723 178.302937662 220.021394777 188.381443098 222.257169315 198.995154047 218.026183293 209.769180464 213.865183088 125.970944268 150.958869117 135.147962166 145.715159958 155.564648984 147.311026132 163.296938038 153.797433225 154.14238206 157.634755045 134.524803773 156.526623129 214.208236727 152.31762099 221.588227797 144.525828148 243.409154786 142.026368976 253.848387604 146.410200299 245.010445462 152.816428609 223.767055061 154.578831735 121.981574118 134.738719706 137.320556833 135.661881309 152.198061079 138.855890165 166.789620953 141.507095253 208.312171788 140.524451521 224.49504723 136.0235408 240.678113594 132.477539482 257.170548827 130.721809475 145.430379772 144.263208195 144.266829269 158.614357297 144.73901428 151.854040728 232.215202045 141.085375621 234.331365443 154.849775567 233.98566871 148.838323374 175.235384347 152.75771575 200.044185772 151.840971513 167.417272377 195.605573901 210.24279548 194.005098234 161.731340094 207.691365978 216.639011485 205.235937146 155.288749948 240.094709745 167.856775942 238.20072971 181.535675806 237.346492655 189.471367774 238.455450162 197.432037133 236.599556146 213.822148539 236.113330904 230.803225116 237.707553765 218.543772034 247.427315499 205.19993674 254.218760009 190.010100801 256.965469449 176.978238902 255.707583508 164.841519746 248.930450484 159.67706025 241.465764564 174.443174874 243.240155171 189.705550444 245.637982244 208.170370814 242.362876072 226.147260118 238.981667366 207.994412399 242.940831498 189.828418694 245.290148968 174.314287036 244.100050871 144.758168067 151.853495747 233.988506309 148.838242636 151.942139093 148.02680443 151.253052703 150.407783968 149.61795321 152.802190806 147.287519784 154.538356869 144.487826326 155.452235508 141.559990154 155.555420165 138.802158399 154.69672295 136.449788213 152.892163589 134.90110295 150.413691048 134.174055429 147.81741402 134.059528385 146.362386346 134.614766594 143.76867315 135.994145419 141.145121009 138.239563595 139.315527404 140.918838265 138.261660309 143.915370972 138.102563024 146.724548199 138.744677235 149.176257867 140.347617421 150.933244618 142.636279955 151.808874129 145.114733581 228.570364828 146.006964338 229.339683222 148.16753464 230.978209497 150.210366893 233.164021747 151.600780733 235.701300045 152.21868243 238.302042471 152.122674612 240.70427328 151.188156109 242.717080312 149.442568113 243.960132479 147.167942153 244.462294822 144.854758843 244.499324458 143.835602735 243.911076197 141.576163917 242.605022403 139.291000437 240.546755312 137.767032354 238.138536427 136.8941278 235.45905107 136.817722513 232.973008085 137.426617793 230.80359742 138.908291498 229.295657058 141.005861089 228.564793889 143.415718277
9/19638.png
100.562046405 149.522166586 101.084157241 162.439269839 101.947238113 175.056760522 102.734229182 187.490330143 104.210175671 199.87879917 106.27297844 212.472004531 108.377955609 224.932726484 112.497850369 236.668689804 117.259191588 248.448771535 123.821631321 259.139847402 131.920641077 269.098920938 140.357523179 278.608859553 149.123828448 287.428382136 158.933599084 294.616923918 170.064358 301.185135553 181.763451086 305.615925063 194.702235042 306.88758026 207.879887482 304.903847575 220.05139032 300.707947749 231.334158812 294.565731417 242.419468885 286.652651596 251.727855705 278.037759875 260.911970447 268.039213233 269.763922142 257.948991468 276.92149081 247.788908873 281.748290998 235.058860217 285.218757715 222.608221538 287.901698151 209.741752421 289.580845021 196.788339825 290.382182455 183.522055736 291.667296606 170.69353414 291.586823136 157.37114526 291.694665625 144.493910893 111.031315352 131.142610839 122.48030853 118.462805359 136.22162 115.285825298 150.565479922 116.029236928 165.41248357 119.675515663 207.189196059 117.766869613 222.329217896 112.867907297 238.912820797 111.017271406 254.919127705 113.898026846 268.646102584 126.358467912 187.05447568 146.182883817 187.78273271 164.092824198 188.116529556 182.240836309 188.373615832 200.375048672 167.552119632 209.170120996 178.675890689 211.918853464 190.150807745 214.205025916 201.527352352 209.706884892 213.179542658 206.512584546 127.320623552 149.854295586 136.05495604 143.363102976 156.408373623 143.678717978 164.391960419 150.296938883 155.239056389 152.515434145 136.677067311 152.4031807 214.369255367 148.941780174 221.782687612 141.084009225 243.225780202 139.970472252 252.987319797 145.645117545 243.336464298 149.568906017 223.383933309 150.005133425 122.687428461 127.437123544 136.775728881 124.859100785 150.926091571 125.960293127 164.795966807 127.940217549 208.039367778 125.800546295 223.404842144 122.224517103 238.864104498 120.762756493 254.639036395 122.628764211 146.039563554 140.883846711 146.071398396 153.095085059 145.914800681 148.301070708 232.340277127 137.800330253 233.138102745 150.154550912 233.522889935 145.617123016 176.127801844 148.066963901 200.581995155 147.197405752 167.429095582 187.923685241 211.911422505 186.540396013 159.490813303 200.52033127 220.929170421 197.851549627 146.217187106 232.251362427 163.958354131 226.689036946 183.058904598 225.948974135 191.570300823 226.854726373 199.838552226 224.989339717 219.955165653 223.859688205 240.958052903 227.369745249 229.3430783 245.771841683 213.309901349 259.121022074 192.377911275 264.358482335 173.162479861 261.827241178 157.128264341 249.345332893 151.013067369 233.717492764 171.38291654 231.579269121 191.548672527 232.858867683 213.954820494 229.506873687 236.239248318 229.056939626 216.558390737 246.1421495 192.240408345 252.118967966 169.431945234 247.777340674 145.930910998 148.300131134 233.546004737 145.615774931 149.760008638 145.457361894 149.160750119 147.55863469 147.793996326 149.394720318 145.94047872 150.669519401 143.775257639 151.311675893 141.543956368 151.363091206 139.453979443 150.668291707 137.6637737 149.256692277 136.523560382 147.360886733 136.006646634 145.463168269 135.959501465 144.90786999 136.315054676 142.96629643 137.258734596 140.913813973 138.908067997 139.452021993 140.883716673 138.487779697 143.162208099 138.160913749 145.339833833 138.418360366 147.337831735 139.476265201 148.804439173 141.135860224 149.633388262 143.250125949 226.85717886 143.362528403 227.74158097 145.403276932 229.367172287 147.089444191 231.417750663 148.133037168 233.705497252 148.495108698 235.975792519 148.244626513 238.007769017 147.270070795 239.648343359 145.609888471 240.569247247 143.540460734 240.852806712 141.545453797 240.820867156 140.985754552 240.200631771 139.063840206 238.97569408 137.113235259 237.111454984 135.8542509 234.985060715 135.14962718 232.633452688 135.114582411 230.455881612 135.653440176 228.560435724 136.980585575 227.279900865 138.848892849 226.701204418 141.109914746
9/18256.png
97.9301833087 147.204343791 98.5089133455 160.119439014 99.5679568871 172.532794924 101.068926641 184.952006988 103.732883512 197.011718803 106.839766752 209.284611589 109.920417108 221.254628186 114.404952398 232.699405014 119.302861719 244.31607469 125.610773037 255.031838326 133.458928669 264.728662318 141.736245603 274.06969498 150.457054286 282.974054216 160.029717139 290.072527996 170.941304121 296.725082231 182.44288001 301.305509919 195.115122196 302.579873045 208.0301251 301.073413234 219.877297542 296.973388214 230.837653485 290.560922142 240.998010822 282.307647595 249.594710429 273.42508193 257.949684341 262.967532167 265.693792657 252.473510298 271.79638881 241.818894568 275.937061076 229.108633232 279.143192542 216.774860687 282.349954568 204.29785078 284.788132824 191.606683247 286.31514767 178.698573161 288.072315608 166.322577408 288.141140649 153.401693177 288.052910625 140.833239197 108.677118906 124.451551792 121.857505002 112.792427044 137.01127831 110.108169768 152.288748959 110.801073122 167.712257946 114.06895678 208.048492458 112.522436335 223.143893281 107.880168707 239.610990063 105.788107177 255.645292304 107.689465897 269.779943299 118.859479334 188.578849096 147.270541764 189.106098633 163.875074006 189.368556634 180.518408395 189.49677702 197.542806738 170.166951894 213.843156309 180.55526181 214.940495062 191.067575665 215.422603324 201.557909423 213.276696525 212.371530509 212.046733021 123.403585446 150.059033746 132.837125312 143.317032626 154.989771887 144.871747114 162.606741747 152.875749078 152.724758397 157.647576419 131.891048632 156.457613351 216.984334601 151.356004657 223.862464905 142.252814548 246.704975192 139.583828834 256.659628012 145.536787871 248.662288433 153.443593562 226.86158984 155.163325856 122.136896055 122.120355166 137.57226634 120.197767438 152.635557638 121.490382219 167.299457402 123.188621446 209.03946991 121.476698608 224.38776511 118.064259653 239.802093615 116.182822338 255.504492889 116.96965725 144.025707909 141.309405857 142.246218468 159.052850037 143.153293208 150.680180236 234.953827559 138.160620555 237.924720561 156.038281026 236.869302265 147.771838213 176.015355897 149.905989907 202.597433307 149.006577488 170.301923611 191.704546508 211.011418135 190.517626114 163.909694276 205.342115225 218.048239316 203.584763418 159.126920109 244.119318973 171.377912437 239.283649014 184.111685939 235.611391983 192.170752971 236.963235801 200.248736705 235.138077693 214.166641682 238.37800516 229.137227813 242.972056975 218.613139193 252.276325879 207.024722559 258.816642467 193.295830627 261.352773968 180.645527129 259.932493658 168.666140242 253.235855415 164.28088856 244.992862094 178.436116724 245.154261645 192.621417081 247.085567528 208.398159535 244.913453689 224.008216508 243.975431921 208.100171838 245.523848162 192.775906478 246.688312297 178.497250687 245.876650654 143.172519444 150.679691021 236.88568017 147.771421474 153.733606839 147.029941398 153.049358428 149.430098945 151.417545201 151.862917992 149.061426029 153.609461255 146.227408763 154.524512783 143.263406564 154.615608367 140.46245693 153.749883888 138.057079362 151.936386263 136.448562407 149.432108579 135.680234349 146.791732559 135.534052755 145.415254696 136.056719989 142.797607469 137.40683291 140.131116513 139.660270837 138.269616255 142.361927683 137.193335028 145.393068613 136.996499681 148.242434468 137.608751668 150.7632492 139.198423371 152.600237859 141.49666272 153.558920339 144.085167239 229.211049209 145.090164351 230.036599419 147.26308065 231.603958772 149.129966287 233.653336655 150.366098787 236.010747506 150.882561997 238.387357482 150.697393511 240.515595583 149.684156021 242.21968787 147.935677569 243.186988276 145.766307362 243.51358445 143.675943861 243.49887719 143.069943254 242.962990666 141.023614655 241.818584983 138.893013803 239.959775342 137.445394534 237.772853097 136.600385745 235.316767096 136.490445719 233.04392368 137.029245781 231.07596035 138.404652598 229.744376148 140.343164793 229.098145352 142.725253922
9/19311.png
62.5969909545 147.055776445 64.0947240374 162.050613955 66.4204562104 176.991401088 68.4712110315 191.485101291 71.3024088633 206.105868019 75.0420934412 220.82717253 79.9471914125 234.786713554 87.2153421301 247.763679923 95.8768793891 259.891811492 106.577766269 270.179147392 118.770141967 279.056968772 131.40811085 286.81230048 144.512676183 293.482862607 158.277636614 298.370772334 172.866413551 302.4287694 187.358836723 304.828671355 202.298300558 304.158705301 213.644366728 299.905886425 222.673293116 292.909058477 229.311366241 283.510681475 235.039381941 272.764207059 240.045125894 262.477995653 245.432743142 251.17189122 250.876394488 240.390675643 254.922681523 230.072526674 258.494253787 218.127032252 261.344640941 206.993565944 263.612734185 195.707152437 264.680288345 184.337386816 264.297796059 172.293148774 263.713962021 161.239206291 262.303173717 149.673029193 261.597626004 138.443461666 116.91270198 138.987700983 132.993239783 128.512085933 149.885950659 125.67610617 166.958499742 125.890478466 183.749932884 128.604176484 219.458245055 128.175503577 229.811999644 124.100308216 240.717246446 121.373675794 251.131562099 120.258522231 260.300534404 128.458110123 204.042481796 146.908906688 208.005827982 164.409185297 211.756443298 182.061454341 215.35073646 199.510882121 182.728070955 208.338090693 195.130803827 211.023954795 208.504877395 213.418434794 216.950671833 211.140197749 225.200180815 205.237279404 134.070704744 150.487634363 143.686748613 143.930607679 165.283531587 142.381901184 173.742608213 149.562367566 164.282683006 152.444787558 144.342689652 152.607542114 218.031816261 147.531224731 224.064372249 140.251832752 242.410446258 137.560516103 250.122654062 143.44055716 242.995966372 147.558547161 225.674396603 148.513732626 133.08733111 135.483124437 150.462060475 133.092465124 167.569364759 134.441277737 183.487558825 136.451692829 220.001836389 135.011962506 230.623215077 131.085252838 240.622747844 128.42500201 251.023409843 127.434103956 154.254762014 140.47062802 154.427968939 153.286309935 154.303963529 148.330023139 232.950370209 136.441845051 234.236109722 148.405481491 234.173697611 143.985213798 189.239403744 148.142125177 210.984403625 146.956992536 181.018640381 188.164989666 224.653421896 186.966304487 173.846680446 199.876501874 230.992399215 197.609729731 147.695150116 234.194256391 172.502352754 229.680100626 198.440155487 228.342199542 205.919428675 228.103662202 213.012567841 227.276419239 223.242316398 225.505260391 232.273532147 227.649986617 226.256282359 241.086811899 217.00520146 251.492385738 203.264144849 257.252012446 182.497292265 256.679356619 163.117504506 247.891206918 151.802056911 235.408944147 178.277044801 232.722943721 204.694999352 233.310370296 217.664186836 231.243634539 229.012891366 229.505717606 217.739632625 241.35087254 203.387143844 247.150422889 176.490139993 246.403910924 154.319418915 148.330029721 234.178024856 143.985833856 167.957362454 145.171191883 167.272187404 147.074385761 165.575275469 149.767228267 163.106484146 151.821743642 160.056934143 153.021637832 156.761730338 153.291631241 153.562799833 152.505135287 150.786566955 150.646833586 148.904016297 147.94133894 148.027474312 144.831959724 148.032410123 141.951695622 149.000612183 138.868448799 150.889195383 136.118193868 153.637895346 134.399488727 156.780602385 133.654022629 160.154713995 133.909546031 163.210692471 135.034763227 165.674007778 137.129090013 167.324734495 139.818412337 167.967412473 141.895272013 227.861711332 141.094841143 228.597744579 143.160612693 230.128220454 145.142645231 232.199269868 146.530184614 234.627339438 147.190236337 237.128044803 147.135043611 239.434365073 146.251359284 241.349063728 144.59968449 242.532020138 142.419742022 242.994774001 140.19621778 242.977328832 139.061784834 242.38786661 136.869341513 241.118987211 134.672899579 239.146893334 133.188960711 236.83325838 132.376876695 234.270768854 132.344256504 231.892225315 132.974441704 229.853356446 134.412161045 228.451007798 136.417673305 227.811459328 138.605883977
9/18991.png
82.7764495822 172.641761681 83.840797426 185.600370608 85.6419387463 198.162813342 87.4655619433 210.37222825 90.1248170426 222.419799575 93.2915690643 234.768154032 97.1191217713 246.343047587 102.514460893 256.975203684 108.968744583 267.259542663 117.35645982 276.050210208 127.384547382 283.176639193 138.070863211 288.866173924 149.398497469 293.574405876 161.372502362 296.341132249 173.918717547 298.948555496 185.987027758 301.322927743 198.444194452 301.90873122 210.016647688 300.995667563 220.887649372 298.716944442 231.309949443 295.228853148 241.594145231 289.643920559 249.99462331 283.052409319 257.577850941 274.30087953 264.290566844 264.691650039 268.693986212 255.07642969 271.455017984 243.180110034 273.198767777 231.937736991 275.036849989 220.702328337 276.075144998 209.401815355 276.47257117 197.762631835 276.990397968 186.98086194 275.88716635 175.377852186 274.861989262 164.327517879 111.628327137 128.290462151 124.810940838 114.879772129 140.872103945 110.088323379 157.373101323 110.481467232 173.575480654 115.052219952 211.113278603 116.599792494 224.593500872 112.510691355 239.112642586 112.13850652 252.575834885 116.594667785 262.652483525 129.558712522 194.388549367 142.136202888 196.404286467 152.972999108 197.962266823 163.87955838 199.500568412 174.945705822 175.149497943 197.549023374 186.06327186 197.810804235 197.204925263 197.648097531 206.706402276 197.496892261 216.563542879 197.546318627 127.839509696 147.66090265 136.628089597 139.865247493 158.745676858 140.326461603 166.861674451 148.035466572 157.333769174 151.536004153 137.111686708 151.191831355 216.592800764 148.925594802 224.283904227 141.123770263 245.014683585 141.148941872 252.191929391 149.341493673 244.293060252 152.823675323 225.283915213 151.84775481 125.132331422 123.143970242 141.335957788 119.489164849 157.571575099 120.890785533 172.890436082 123.34219357 212.044281806 124.304413238 225.752905008 121.941318488 239.160621178 121.452140224 252.184936664 124.617059548 147.662767086 136.903900433 147.327922707 152.42560356 147.490081599 145.990305161 234.878735168 138.11697923 234.74717484 153.068089058 234.753123012 147.292484617 181.029649151 144.841396306 205.360734381 145.191418564 175.737274981 176.080134772 215.31031067 176.317036318 168.03560026 189.492599899 222.449409757 190.043470033 158.804705771 233.371465542 173.292205268 222.653959126 189.897587217 217.171671856 198.153242684 217.807042241 206.169196549 216.686675228 218.419586194 222.980165256 229.562678822 234.589724384 219.674890765 239.561806993 208.980582567 241.234766933 197.802862616 242.185965032 185.071126253 241.693597462 171.189104442 238.805064684 164.243327822 233.072661834 180.986920727 227.428568686 197.749046294 227.294389102 211.584005104 228.118025472 224.514480298 233.916610546 211.285335651 231.892849686 197.918875225 230.949460102 181.198623191 231.518714366 147.507412287 145.990183633 234.763521425 147.292411701 155.647674277 142.72537994 154.781284909 145.223730934 153.045285918 147.3884817 150.711570406 148.843364353 148.013104666 149.4938482 145.278682725 149.366744622 142.767448248 148.343768466 140.689906127 146.4886102 139.416773568 144.072508587 138.921661075 141.682148026 138.905168718 140.912049993 139.455304316 138.59339165 140.726282849 136.154597096 142.81618555 134.433825142 145.302671558 133.410466478 148.110292793 133.200131602 150.755349507 133.732033374 153.112981931 135.193534732 154.799598065 137.355776154 155.645939614 140.006419948 228.587729392 143.70233404 229.389953328 146.191063977 231.082086115 148.349107145 233.387823696 149.824837638 236.06770898 150.479202491 238.796953665 150.362314276 241.272202199 149.318236564 243.285909235 147.408738489 244.504149002 144.962687879 245.000283853 142.555670716 245.068551015 141.613742697 244.524075445 139.302066851 243.223101978 136.870215449 241.127165462 135.160077854 238.62633912 134.138415236 235.825890422 133.987510501 233.218877247 134.629727175 230.960871286 136.204013193 229.381979224 138.430055824 228.605380795 140.979771799
9/19837.png
103.759570973 144.778023042 104.201788827 157.723470124 104.821637997 170.237612357 105.7967451 182.804209322 107.939409406 195.014356612 110.703623298 207.494351128 113.460653865 219.689181821 117.564967 231.338937426 121.864606876 243.291295299 127.447656961 254.543562686 134.580815965 264.987807295 142.122451341 275.037573752 150.305516098 284.449773637 159.278989327 292.401705097 169.618191989 299.927500472 180.88466791 305.220450251 193.601130708 307.14442905 207.457507589 305.78870569 220.323835207 301.964133401 232.415024489 295.84245708 244.181638342 287.741903961 254.165662766 278.856019525 263.465155287 268.149556264 272.140248379 257.041129936 278.844477296 245.742779961 283.577450361 232.208219069 286.833821235 218.978720499 289.88270422 205.570203089 291.963058982 192.023065883 293.304963161 178.148196624 294.741532173 164.802169838 294.773777628 150.982652299 294.980833939 137.534365754 110.796570513 131.736421932 122.682935892 120.986931583 136.875661235 118.954698444 151.50090672 120.494463758 166.289799648 124.292673539 204.409825365 121.679019959 220.335243434 115.962205214 237.662637701 112.897493676 254.72902643 114.388496997 269.772773417 125.009360724 185.534972923 144.598703427 185.606742003 162.241084462 185.123314931 179.94693629 184.619609909 197.671737013 168.052115307 209.089078351 177.451497111 212.367883366 187.205645348 213.538126753 197.788729197 210.653276978 208.442809616 207.758772006 126.816995084 147.49487326 135.654584396 139.41728233 157.757336687 141.039353242 165.272212657 148.731103162 156.178289811 152.692111258 135.793498948 152.614049642 212.721911996 146.676059763 220.377691026 137.173595612 244.275380371 134.344800729 254.080360373 141.784268544 245.406088555 148.245285032 222.936455032 149.398239596 122.877317106 128.60448657 137.405913417 127.040991036 151.8662158 129.220063882 165.850257718 131.627552555 205.180295298 129.137177292 221.272290143 124.580340351 237.73412213 121.585827745 254.507208765 122.063616817 146.92035115 137.089620361 146.054237664 154.122808554 146.24807748 146.615938796 232.028389157 132.478093278 234.181211102 150.095693949 233.54018856 142.75021914 175.892966209 146.515749607 198.987199299 145.377877119 168.300657859 188.436954624 207.956103877 187.443407166 162.062864743 201.131006107 214.89802472 199.561938853 156.337344004 239.641743348 167.997277664 236.238738155 180.323917116 234.722041402 188.264608165 235.664983639 196.241597904 234.252444527 212.23863094 235.621197338 229.605514932 238.989839218 218.354112128 250.217851919 205.338069767 258.543599977 189.717258724 261.933360919 176.294900096 259.69162412 164.623807091 250.9739439 160.496156478 240.624901416 174.675574445 240.179837772 188.76737211 242.494442295 206.568338509 240.380925054 224.917793407 240.042012267 207.42469956 246.69553283 189.191289852 249.310763685 173.965838182 246.875617459 146.264844127 146.615310774 233.547873273 142.749931297 154.466392116 142.400895969 153.742328668 144.791932936 152.188540116 146.883841939 150.038735261 148.322970964 147.522986894 149.032142397 144.944298356 149.032187913 142.543739454 148.184653622 140.521311724 146.524723155 139.224850707 144.298915805 138.659685538 142.060407973 138.597585154 141.31767026 139.038214907 139.110748187 140.165277295 136.776674048 142.082394379 135.102103672 144.394416854 134.052866692 147.025916132 133.744422465 149.539232351 134.146991289 151.803438193 135.444572406 153.461990445 137.419655889 154.356594202 139.843108556 227.576415491 138.722191096 228.556959292 141.292137085 230.512325435 143.645774776 233.114714688 145.249701907 236.11487955 145.930661353 239.165932649 145.728930374 241.929220201 144.512378585 244.164067337 142.373971707 245.489421166 139.637201659 245.958969514 136.913598261 245.922786367 135.592727689 245.144352695 132.964840826 243.520786208 130.317539693 241.065041792 128.557008212 238.20591948 127.625909145 235.075161494 127.679514072 232.199098221 128.556091579 229.770897253 130.407213279 228.136225781 132.935897937 227.433009342 135.678048021
9/19167.png
100.904717881 134.262934269 101.437299822 147.227736379 102.060826572 159.688925874 103.282625974 172.35413436 105.922858002 184.540939649 109.223424817 196.721818479 112.567191503 208.808048108 117.286429762 220.445577431 122.050576661 232.231168473 128.076234094 243.171522746 135.35265905 253.416815506 143.117518852 263.369863172 151.125695959 273.012733844 159.620590005 281.465660394 169.711501345 289.699393738 180.748925543 295.56518977 193.430922151 297.865220861 207.697333189 297.028136057 221.16193563 293.418326007 233.815329359 287.087085004 245.721576583 278.936717552 256.07418633 269.933393441 265.998437442 259.120201956 275.352068741 248.45378809 282.949085702 237.191027003 288.155616 223.561400827 292.073271397 210.222927882 295.730046854 196.611113756 298.586863046 182.659396216 300.293732739 168.662182013 302.078871931 155.01178854 302.380964248 140.954461084 302.522919772 127.260972796 107.560994225 128.839680979 120.738389542 119.031205061 135.672198097 120.305961752 150.382455047 124.1618857 165.339958281 129.28985212 204.320323922 127.689163164 220.735112859 120.998680049 238.700101282 115.873486673 256.812952003 114.461109363 273.459341892 124.351701745 183.698606264 150.618111552 183.499605815 167.434860965 182.990730677 184.30683397 182.376843188 201.488329623 165.974672239 213.33197366 175.63581014 215.952319269 185.70685629 216.688569148 196.688730147 214.268891903 207.898636226 212.435480654 124.259905044 150.912658622 133.662707498 147.611669334 152.474323963 148.63774448 159.950889226 154.000714721 150.790845891 156.785459752 132.336726529 155.376371571 215.538202946 152.915974949 223.924242644 147.085360497 245.088454959 145.968704196 255.505851726 147.676242881 246.522976814 153.400872646 225.330275677 154.874698758 120.611567621 129.88870136 135.811421475 130.995408396 150.326515515 134.412326373 164.517757335 137.399581078 205.327570443 135.7570294 222.001409088 131.156726501 238.966660077 127.223469719 256.720706704 125.332381738 143.067050671 146.862798838 141.490933512 157.464105515 142.268702655 152.280784446 234.542615142 145.445663803 235.867923378 155.305693496 235.544653754 150.471061408 172.250053619 152.15729268 199.4576888 151.492509638 166.065961832 192.034573873 207.478345835 191.349624936 160.190745067 204.961919732 214.259062876 203.777681456 159.357825622 245.475069175 168.842869325 240.481623995 179.263864848 237.286447377 187.299110787 238.700358612 195.285863703 236.983650813 210.159957467 239.893963469 226.444018235 244.596881792 215.578135105 254.126625526 203.320970684 260.839397321 189.251538079 263.321619684 177.976754746 261.840320589 167.437550301 254.719063485 164.106056189 246.246647973 176.011226774 246.239226382 187.993370063 248.350265907 204.587964696 246.198185981 221.292347321 245.321583205 204.544941373 247.201905202 188.417846222 248.648492127 176.199747988 247.550240036 142.285598651 152.279928485 235.561585415 150.470909447 147.687278303 149.914265553 147.001758687 152.139818189 145.52183271 154.065417078 143.530491324 155.383815984 141.215869974 156.019138038 138.849918825 156.003483531 136.660685787 155.19896004 134.812694368 153.648756602 133.662487858 151.605840292 133.171502755 149.576816513 133.15668479 148.99174768 133.571276073 146.949314016 134.606917737 144.794249838 136.375299017 143.26834615 138.487185557 142.295332073 140.906864064 142.006179477 143.205052824 142.342174049 145.285229373 143.515712826 146.790414681 145.307023716 147.618007786 147.571673531 226.279050381 148.56683414 227.241630057 150.709598979 228.961490351 152.475212378 231.133825776 153.545269405 233.550937779 153.902059914 235.934595999 153.598498573 238.063358398 152.540638564 239.768604599 150.771606813 240.709918049 148.579678761 240.977995198 146.473381111 240.921954464 145.938395498 240.254363095 143.906319975 238.961163987 141.855120002 236.98657504 140.550598517 234.745035096 139.856007533 232.279222965 139.847529457 229.999978396 140.428829613 228.014921816 141.829489505 226.677124712 143.788119047 226.072516959 146.203669473
9/19292.png
88.5545004149 137.801212047 89.5323647629 151.510807886 90.9431243119 164.779013452 92.6939513271 178.110437698 95.4429688682 190.996563638 98.6781745357 204.057901051 101.898260486 216.86704677 106.718854305 229.183839319 112.042768837 241.575400706 119.104725404 252.987426175 127.816509159 263.353895346 137.20980965 273.010577985 147.2624093 281.941517091 158.098305014 289.318964926 170.240726283 296.156052138 182.901213041 300.805777723 196.639776084 302.098813527 210.06359791 300.03813785 222.380321634 295.493377223 233.590489894 288.412389138 244.234917009 279.652826062 253.098635408 270.345115914 261.368948423 259.238068893 269.033373079 248.156469337 274.646475985 236.763227651 278.105017941 223.399343769 280.637627301 210.545829239 283.090384374 197.662231327 285.021852576 184.6153782 286.262676449 171.314274113 287.659022077 158.472372744 287.490981053 145.193571493 287.355869624 132.273552411 105.425463331 131.071116272 119.914472195 121.209726357 136.418990361 120.778001263 152.727062027 124.045364899 169.06288191 129.173902027 209.638638531 128.513467171 225.389427485 122.541614711 242.18841633 118.443212327 258.741799606 118.046158679 273.062513206 127.010971153 189.359970377 147.222359366 190.324390344 166.03923137 190.931225986 184.904198827 191.506780833 203.718018142 172.360955202 212.435532933 182.378249859 215.486799471 192.579686353 217.341388207 202.113618981 213.866725321 211.981018056 210.800095182 124.87808562 147.116683391 134.982137902 140.896302532 157.263029599 142.730368203 165.137018561 150.056295437 155.471862422 152.737248055 134.524365182 151.639782176 215.27865434 149.306912375 223.063060777 141.346925001 245.38480925 139.066863272 255.295776749 144.27597492 246.392767251 149.91657711 224.956926059 150.984549157 119.943733102 129.629984432 136.674946876 129.502481313 152.826092642 133.076642121 168.296380472 136.454887424 210.484443112 135.30897041 226.427343116 130.64294156 242.271243065 127.025412982 258.53539234 126.060395933 146.302018149 139.249594888 145.033096971 153.412372472 145.282553649 147.331951168 234.079292246 137.72097069 235.660219968 151.337878498 235.382870661 145.705401661 177.737022853 148.553476013 202.146725516 147.989129768 171.709576974 192.029185747 211.327512679 191.090269742 165.620316731 204.448686074 218.162996719 202.99298092 159.040026485 242.534368187 171.679645197 236.715473671 185.64645583 234.238202377 193.972889066 235.423068247 202.081361626 233.432190099 216.73312784 235.452512172 231.617545833 240.678525792 221.70111438 252.277903212 209.500574316 260.216605911 194.886737428 263.395346377 181.289278967 261.798903624 168.48266119 253.723593243 163.876122067 243.461243443 178.795284525 241.110591331 194.107365064 242.924221765 210.752337056 240.917101815 226.766209575 241.650365571 210.685341143 248.110349347 194.429118536 250.858246944 178.880261351 248.904325842 145.302081665 147.331340989 235.394442818 145.705040074 150.690431467 142.090869277 149.66411376 145.366305535 147.197334058 148.579603936 143.827997545 150.827098598 139.915203121 151.993317906 135.830570496 152.131807058 132.037958934 150.934865942 128.890861793 148.362490559 126.871962955 144.848457734 125.998821993 140.95474037 126.184171432 137.132200198 127.747804691 133.490299244 130.574586148 130.491629508 134.29361687 128.754463706 138.307146264 128.008616906 142.41571178 128.543392274 145.952400383 130.284788833 148.562362508 133.298915734 150.245193938 136.877746871 150.723603404 138.032003415 231.459380398 142.150249002 231.978970734 143.232270129 233.825665589 146.84217944 236.630116901 149.824890463 240.269440262 151.653575814 244.385170578 152.185955411 248.547171463 151.488318943 252.34612304 149.574904073 255.149624758 146.526284123 256.637813429 142.763666316 256.850850648 138.807775227 255.876792895 134.765902248 253.681215775 131.111872803 250.376650695 128.736268697 246.452360472 127.588327943 242.180742449 127.766686099 238.201249839 128.953088628 234.80974052 131.301907381 232.438182269 134.573767573 231.469611755 138.005947996
9/19617.png
154.634225187 144.729168652 156.824124871 155.292200733 157.554136932 165.262787285 158.063520952 175.362649019 158.851579422 185.430925938 159.488637686 195.834981676 159.963623843 205.906873383 161.31127889 215.227499967 161.416044927 225.537614132 163.592589942 235.319941047 167.420064951 243.870039273 169.090400363 253.492847163 173.468969647 262.066642757 177.243227201 270.260393651 177.935220199 280.295855194 179.849294357 290.426860156 186.802034638 297.573142945 201.644645681 301.785188611 216.434031158 300.391818412 231.158772024 297.274413715 246.19998582 292.107900308 259.799847055 285.754054745 272.911666459 277.747743345 285.866021461 268.61204014 296.999616323 258.901277764 305.775063425 246.437523531 312.362677573 233.082454511 318.469012348 219.045296577 321.816606112 204.269489554 323.70022061 189.137345424 325.930458046 174.287563311 327.048928222 158.873672626 328.719947807 143.848279094 148.5183475 132.384983669 151.37125883 129.087982697 153.904423963 129.64482007 156.929644831 130.227813997 160.511034131 131.107123314 175.826147752 125.747173084 191.039262048 122.36341358 207.316520664 122.331303376 222.84555747 126.357236233 237.468959251 136.931913501 165.519581958 144.363639175 159.007135833 161.489608866 150.761594979 178.409581847 143.311360796 195.5152272 145.759776905 205.923904497 155.454885021 210.917616369 166.067664245 211.150782148 175.253071567 210.180743351 183.662601404 208.359842544 158.083989381 146.507431248 158.849774816 145.584029861 161.473628159 144.781474556 162.76248926 145.344838137 161.981370761 147.099921788 159.584590596 147.426336187 197.256373447 147.305297448 200.881693215 140.260076962 218.493839051 144.549533306 225.743902521 148.812922096 218.951160489 152.123516109 202.542824591 153.089734364 151.229043221 133.010056905 154.119703363 132.760213087 156.990003292 133.66106484 160.127770735 133.90351083 175.703125505 134.327410915 191.017785684 131.327351253 206.862654243 130.671679877 222.449600909 133.423010166 160.035514867 144.627517799 160.695618249 147.609728997 160.484514892 146.05036621 209.860907809 141.082825979 210.47403064 153.524349043 210.621641113 147.959388496 164.740678226 144.636335847 181.29415132 145.52375282 145.712860525 186.076249046 185.271079826 189.353998492 141.171800128 196.284219838 188.964513804 201.356607107 170.83030467 241.122060949 166.5444896 238.437024601 165.117999054 234.541635775 166.739749575 235.125990398 170.688145378 233.110032532 187.489586018 237.6614689 204.212619129 243.872133441 193.397014038 247.032221713 182.753034172 249.231141274 172.103085541 249.612035578 169.724628447 246.984333859 169.281381625 244.011457197 170.840548573 240.830258583 171.348061517 239.986927253 171.998925468 240.733246354 186.568931373 240.704815613 201.360996074 243.141138512 187.008333085 241.616919496 173.414601295 240.852639722 172.222671595 240.664338402 160.500433048 146.050418481 210.622969804 147.958729596 146.686397426 133.399730716 146.449170913 134.746040187 145.972523929 135.682993048 145.248687002 136.556787411 144.162238136 136.96775954 142.891301478 137.005371684 141.840384961 136.290902083 141.019943996 135.276067616 140.65937821 133.988152124 140.622398832 132.726503234 140.553438366 131.71082014 140.714397038 130.766197994 141.180422056 129.630231158 141.843280767 128.743634547 142.966587709 128.217375577 144.23863526 128.245430182 145.281901973 128.865691225 146.110869441 129.909737291 146.516026899 130.949075345 146.673983524 131.928571182 196.171768969 145.942708641 197.062914179 148.452108282 198.886059938 150.724466961 201.390158611 152.255543185 204.269570921 153.00719054 207.220697453 152.926014247 209.865459076 151.815773268 211.919131446 149.732573307 213.137760439 147.041734807 213.556514017 144.142657967 213.237349581 141.355902585 211.961026586 138.792177015 209.838714093 136.732406836 207.097885244 135.588562478 204.170072755 135.235377639 201.241218283 135.805452377 198.79883931 137.293091686 197.155381917 139.683403242 196.176201587 142.371048455 195.95254385 143.018985861
9/18674.png
91.4131631482 150.114602711 92.1357007664 163.820218198 93.2364127845 177.284845376 94.2332237325 190.55292335 95.8455428805 203.795269372 97.9171573215 217.245768912 100.212009123 230.529514312 104.547517315 243.092887676 110.068979997 255.457779626 117.638829931 266.420948643 126.914130288 276.384504562 136.629233337 285.530463155 146.981543747 293.720495369 158.390008241 300.013215313 170.93285359 305.774915926 183.709787801 309.562212864 197.487535412 310.620333504 210.641175825 308.362382438 222.506452232 303.836046079 233.004092269 296.938849121 243.002611476 288.071657247 251.162746469 278.572777592 258.978029462 267.598634511 266.398088873 256.536545736 271.832104355 245.614338479 275.413433025 232.574953725 277.62738819 219.975012216 279.347970019 207.195631686 280.338682664 194.535431839 280.602316675 181.485912926 281.279117684 169.035551496 280.805696915 156.018094669 280.793769434 143.545814189 110.156126409 134.577548809 123.687454356 122.676289013 139.513364183 119.660661337 155.792435322 121.010051579 172.33569481 125.319400501 210.796277354 124.659465884 225.354610591 118.811284749 241.125164585 115.708165362 256.441130826 116.921434459 269.272787498 128.265417676 192.769356758 144.093441887 194.553039724 160.963573233 195.994218088 178.091327824 197.26635351 195.126637627 172.982005413 204.650049626 183.904376346 207.65831545 195.447239305 210.077235859 205.795876082 206.500543957 216.627393197 202.647542472 128.033940456 147.939690661 137.00308027 141.085824324 157.573518722 141.236138834 165.78442922 147.883941765 156.578622468 150.837131248 137.448530477 150.777980489 215.093010051 146.58457226 222.433683994 139.042804958 243.492126145 136.926364276 252.61532501 143.183719385 243.634208309 147.48154996 223.867988582 147.938681798 123.81658927 130.177390579 140.066202136 127.786693019 156.29364192 130.017107286 171.89214997 132.714018514 211.4534718 131.760215361 226.299938802 127.280590905 241.067889082 124.386028381 256.215250767 124.87253898 147.201491887 138.46241298 147.098233175 151.573034792 147.100363203 146.263291419 232.707893663 135.305679115 233.524145328 148.133609199 233.80779243 143.298307872 179.703431744 145.851607667 203.784354586 145.036642998 172.485521374 184.389880574 215.901108513 183.092341974 164.950125378 196.615228781 223.961088688 194.201353848 143.662275354 229.063584415 165.206681556 227.483893611 186.546842195 227.552842277 195.125574665 228.157084957 203.322698184 226.082049966 220.535642444 224.617930648 239.806977457 224.604703814 229.175039087 240.690181117 214.218204985 252.589878916 194.994276381 257.246541881 175.028819596 254.830181993 157.355902194 243.836371211 148.708515489 230.946900155 172.310587479 231.679452118 194.934915207 233.858014827 214.979723476 229.930670142 235.121954649 226.799314451 216.574876677 241.15514658 194.748870078 246.354443079 170.053797282 242.863013347 147.117541018 146.262688481 233.815374787 143.298729692 150.873977246 143.425783109 150.207357846 145.484164421 148.786550309 147.262875748 146.903827191 148.464779431 144.730194414 149.022644838 142.515014273 148.984820018 140.461453628 148.220341217 138.725559868 146.765008755 137.650590902 144.853063728 137.198809435 142.957770675 137.180709018 142.422485828 137.596328966 140.504314735 138.598416795 138.497450798 140.283984467 137.107774791 142.275070392 136.229536147 144.543649074 135.985487322 146.694485108 136.306945891 148.647104504 137.410003806 150.054597204 139.093570198 150.821802608 141.233712743 228.331660523 140.398142531 229.14523634 142.232579546 230.62552515 143.775311025 232.476859988 144.736764628 234.545254803 145.08106768 236.599816111 144.854850203 238.442473416 143.961870554 239.934864556 142.45858215 240.765163824 140.594651821 241.011140724 138.802140049 240.94249615 138.336049286 240.371134386 136.561906527 239.273166769 134.778729147 237.583626859 133.64442251 235.667032393 133.018919686 233.541659242 132.991701845 231.573206901 133.456370574 229.853907821 134.629835459 228.702169898 136.293156797 228.165993603 138.363470605
9/19391.png
130.281055237 151.946157939 130.638205149 163.616451706 130.443775618 174.773130042 130.290168164 186.069154982 131.638478556 197.10618708 134.090362763 208.463924608 136.955966344 219.434702825 140.885774478 229.614225226 144.44662045 240.636822225 148.412035097 251.280333165 152.794591971 261.979232297 156.809678388 272.884536759 161.119186018 283.436702491 165.987864065 292.723322439 172.247719454 301.854971619 179.837873485 309.328369627 190.066933452 312.952788579 204.372657093 313.43582169 218.178832402 310.643403804 231.804445551 306.4338017 245.186360151 299.694755843 256.728445294 291.312486426 267.396388634 281.391914452 277.582971965 270.5840382 285.909312521 259.928959319 292.440150099 246.706354163 297.321958464 233.411314044 301.723325352 219.752191786 304.149993254 205.718202514 305.581346638 191.279956734 307.240737419 177.359563344 307.843637804 162.865254489 308.727242746 148.760895164 129.824572513 130.873092835 136.799172658 119.338953559 144.992536317 116.939846693 153.751807376 116.731416285 163.344591497 118.291814374 192.323642385 113.830376229 207.673545194 110.087562528 224.256714728 109.438873869 240.168703281 113.101213215 254.480895676 126.035718791 176.568541001 141.636914374 173.348373996 154.602048287 169.426580894 167.631023652 165.768393893 180.925154472 157.800257649 198.446097179 166.156381641 201.405389804 174.994883607 200.522891983 186.493218013 199.078692166 197.546781355 197.766293909 139.423703924 149.81694379 144.380909356 144.077693172 159.160163206 142.043436771 165.163965895 146.205655749 159.868655252 150.190651057 146.406564481 152.038622067 205.85330061 143.700945365 212.226321447 137.659030997 231.464619641 139.373965151 240.191740285 143.591967645 232.114219883 147.448470088 213.818484225 147.223318005 136.820297755 128.372613984 145.372626425 126.017914092 154.163780139 125.961444891 163.080954535 126.341860179 192.709483188 122.915652866 208.088316504 120.218076145 223.882402826 119.589618395 239.679610496 122.260090005 151.412430621 141.20777412 153.364189816 152.110005996 152.289931988 147.192645032 221.866855952 136.666139345 222.778602173 148.12576146 222.890275531 143.07750657 171.339707965 143.610730266 191.104787863 142.267129267 158.630616681 177.389666079 197.069127189 176.75118142 153.472379052 190.299316927 203.691756061 189.180664689 164.313478112 243.161239878 163.205871085 231.144325433 169.797810936 223.176739412 175.839485632 222.538317288 182.595104641 221.62511369 201.568577279 227.753773696 218.113216923 240.78561795 209.770517014 253.799842363 197.482490718 263.398970944 182.177037308 267.149305495 172.696627048 263.861407544 166.309270141 254.503330976 166.908306681 242.648584792 170.040049711 233.380094532 177.574137753 229.954337603 197.255582859 230.769954334 214.473842296 240.794408152 199.725240863 253.157502508 181.780790623 256.200834462 172.580823594 252.215650687 152.304833382 147.191981019 222.928883687 143.075786174 166.885486024 141.081593302 166.528468463 141.476884851 165.344043737 145.129387473 163.084194205 148.346590603 159.863403253 150.590628211 156.018442866 151.590630843 151.985424216 151.431682075 148.157930143 150.143072675 145.021707417 147.629181324 143.064688185 144.208963428 142.368973688 140.338906252 142.829334295 136.39900948 144.51185936 132.745350402 147.341700674 130.074197611 150.931909269 128.581286829 154.988561257 128.234922486 158.934246796 128.901344174 162.468393717 130.719501508 165.192499452 133.54969794 166.575819377 137.163114339 217.67406195 142.501634215 218.465564135 144.393818637 219.939168765 145.95155918 221.804374249 146.930580614 223.89344895 147.267774306 225.964574099 147.019387435 227.797061581 146.091415762 229.25131001 144.534194253 230.042447495 142.625781202 230.270574856 140.800144013 230.222135392 140.261393722 229.670022163 138.517353383 228.561973175 136.725466913 226.876766964 135.551787353 224.936835083 134.891003754 222.790705086 134.872617827 220.821785633 135.404211121 219.131336158 136.656371176 218.008386651 138.3834311 217.510915555 140.439605652
9/18339.png
114.794961042 150.208104178 114.841742315 162.549118176 114.748246815 174.396014493 114.863539242 186.246563343 116.399251858 197.8669066 118.852454844 209.789579154 121.551586033 221.372090055 125.418903343 232.284962704 129.112660323 243.754584091 133.778565219 254.629560952 139.565229088 265.300608233 145.753717381 275.726471439 152.507164183 285.572231927 160.123704899 294.029939842 169.572205874 301.785640474 179.811495055 307.320664377 191.480706965 309.348115445 205.741978424 308.473584457 219.242033829 305.076770549 232.374876156 300.113803935 244.913886482 292.947295996 255.842592218 284.453456474 266.079451872 274.233696483 275.524368175 263.513106141 283.513654128 252.591003255 289.267864542 239.242270941 293.347577651 225.973858879 296.946039735 212.44386056 299.122339314 198.699161923 300.519602071 184.56102672 301.932448285 171.021046628 302.319949851 156.856248505 302.456997041 143.080235176 118.722023288 134.302518594 128.421994776 122.781742723 139.795695461 121.204252884 151.731287272 121.746971394 164.148451299 123.739540814 201.418936822 118.642858509 217.726378315 113.420952293 235.363894622 110.605286007 252.61127751 111.558470126 268.635568704 122.942351744 180.292686508 144.062145629 178.912657347 158.348035645 177.026999041 172.671592435 175.172094388 187.237289256 161.434649233 204.335370447 171.06076331 206.379441768 181.151005293 205.671404162 193.343649497 203.521324515 205.317543717 201.549035267 129.934595468 153.129044176 136.798174905 147.652685951 154.054568742 145.112892655 161.825763248 149.171254731 154.348915517 152.645416315 138.220284573 154.759468605 213.7023313 143.79640073 223.199939409 137.716333011 244.4860944 137.5968022 253.469345518 141.248129065 244.702390972 144.623916992 223.916705523 144.889251221 128.601105259 132.528520275 140.321245984 130.967621831 152.065478839 131.564773987 163.888839576 132.237472391 202.085091286 127.579971293 218.568690512 123.532365384 235.450866229 120.902892291 252.586451057 121.31252045 145.105976752 144.678457442 146.547618811 154.362220544 145.903010518 150.12808843 233.979828214 136.072264708 234.159696526 145.205054852 233.926201561 141.675753531 171.420809614 146.398088281 196.989938684 143.545318243 161.484102773 182.280985734 204.291371899 179.690804609 155.595550602 195.972693463 212.172035564 192.147888151 155.971566503 241.81814939 162.850237215 230.700865849 174.006332428 225.11895525 181.796750049 225.172289235 189.950777689 223.872465962 208.181531541 227.417907691 225.622402261 236.612607861 214.341178808 248.869616725 200.906966955 257.561249424 185.070786877 261.464319929 173.256061568 260.492454364 162.647673885 253.045622984 159.8567622 241.703182553 171.032192866 235.524979827 183.178588516 234.558627458 202.423044361 233.331784479 221.110140921 236.96440459 202.639167096 244.342604412 183.740310165 247.063162557 171.207214106 246.529863587 145.92149107 150.127987625 233.952899425 141.676292389 156.774653537 146.73329084 156.034689939 148.734655056 154.63246297 150.417923611 152.745184298 151.519287542 150.593106012 152.010850527 148.427531264 151.910407383 146.43950913 151.106713879 144.799747646 149.649490774 143.796112854 147.734299615 143.40318502 145.845894618 143.398331491 145.33255715 143.865334239 143.48395308 144.895940305 141.559719317 146.585579096 140.239713674 148.563675091 139.474621113 150.77567271 139.30170142 152.871575181 139.67974113 154.743639039 140.808513156 156.093675266 142.48224498 156.803388408 144.597540101 231.805405485 138.867242461 232.598925759 141.03752375 234.210167498 142.913122912 236.290950847 144.158487912 238.65385313 144.671283799 241.048748481 144.524061389 243.24289767 143.588722477 245.058653439 141.923732827 246.132681938 139.803499427 246.537520318 137.726616366 246.535413348 137.089372085 246.006600547 135.01005242 244.826518151 132.871312555 242.936380657 131.440781117 240.732445934 130.582137548 238.262797033 130.451866699 235.959384802 130.931934421 233.913057666 132.238826996 232.490122544 134.14124379 231.77050252 136.484055181
9/18052.png
102.97048691 152.713267565 103.379721039 165.569362608 104.155490585 178.050376734 105.087720539 190.446901406 106.728032608 202.70569848 108.855310542 215.327728113 111.186038194 227.706424758 115.107137657 239.345713324 119.73861193 251.236103499 125.867832189 262.051890178 133.371182036 272.351429109 141.208472878 282.238518025 149.524303008 291.26835849 158.871599235 298.659908974 169.958181302 305.097007351 181.718458587 309.190884994 194.417843844 310.329440509 207.816967786 308.482967491 220.069310429 304.610929807 231.593721142 298.884704989 242.766843093 291.231629539 252.194787627 282.768382108 261.216814181 272.836573241 269.988432816 262.53034169 277.212578659 252.307790079 282.515039037 239.598285498 285.911369071 227.040265447 288.803834456 214.082769342 290.579939512 201.119079861 291.582439793 187.693933451 292.948780565 174.878928581 293.20955712 161.435621515 293.800802589 148.427008232 113.04016548 134.183556704 124.113931873 120.726231394 137.316748951 117.013131155 151.468701751 116.468354621 166.105761588 118.496824031 206.104621822 116.316640047 221.666739774 112.755179379 238.420604039 111.972006476 254.421072769 115.094670647 268.296624718 128.388188687 185.847333965 143.543867285 186.066154591 160.294960824 185.835539841 177.356453558 185.71280001 194.36338608 166.237577644 205.804967837 177.253232805 208.407126745 188.549517735 210.019157471 200.14410784 206.065888706 211.743203563 202.827086291 128.616737051 149.548292297 137.3547717 144.342536181 155.988473146 142.663290659 164.395309223 147.211161764 155.772466902 149.133097456 138.029191523 150.389397504 212.598261588 145.197089143 220.825255462 139.048878267 241.286200738 139.881770134 251.114064411 144.196928727 240.891545076 146.645069296 221.225837906 146.017100709 124.430399023 130.185050766 137.996181831 126.781793065 152.040297611 126.502011833 165.836369996 127.015185419 206.77438398 124.791917755 222.489898033 122.358585897 238.286659962 122.018818996 254.049674562 124.54356062 146.415569172 141.781796039 147.023875793 150.216016802 146.65899051 147.032473748 231.16072129 137.555095574 230.727624645 146.563567013 231.854024166 143.453718829 175.487060278 145.135383322 199.12174461 143.949728805 165.840704128 184.115725729 210.598595651 182.30118501 158.733386201 196.908170126 219.15856202 193.905921976 148.838654535 231.419906094 164.769887004 226.804356941 181.763245787 226.176624879 190.122831883 226.967790064 198.35256319 225.386505862 218.007485565 224.23121505 239.28344568 227.234888467 227.01990614 242.034610053 211.233887605 252.471532427 191.759440673 256.637031626 174.519190043 254.958434022 159.578682581 245.321391945 153.251809738 232.761532624 172.150384738 230.324276388 190.472685195 231.556778288 212.153691755 228.721745177 234.561109482 228.600318597 214.507395084 242.411365586 191.157796647 247.033653602 170.54258399 244.032220626 146.67328382 147.031718887 231.871565957 143.452792409 154.712529264 143.817391424 154.16964566 145.49177156 153.050164904 146.95619659 151.527268812 147.937008146 149.762425093 148.432193847 147.96193271 148.449752326 146.28019336 147.866212374 144.841442902 146.714754811 143.929975022 145.172560335 143.519392859 143.628018787 143.498499165 143.244732495 143.822585861 141.658785448 144.593553739 140.010147149 145.959844128 138.8766838 147.566190531 138.168355961 149.407198428 137.931054089 151.156608603 138.130539512 152.767065466 138.980401782 153.950274219 140.301716986 154.64792487 142.051307532 227.675222308 140.492445293 228.444995896 142.275362827 
229.859218809 143.751915231 231.62845504 144.672224486 233.60497897 145.014774446 235.570876882 144.835376154 237.338771638 144.003199673 238.780409791 142.581699546 239.589824683 140.809850284 239.849569186 139.103514764 239.813687027 138.658475276 239.279736588 136.961481118 238.234946893 135.251072 236.622155465 134.159623546 234.79077126 133.530563007 232.751415348 133.47864638 230.862567035 133.895684991 229.202607394 135.007521697 228.079797517 136.593954546 227.547769838 138.547909806
9/19294.png
94.9903636366 155.477297946 95.847200892 168.77001105 97.200837065 181.560320952 98.5958013753 194.229075628 100.761244041 206.735747986 103.318409183 219.622647148 105.975389103 232.037119962 110.086711255 243.811826299 114.853926654 255.895406598 121.219367826 266.928198635 129.359869054 277.0242411 138.323024791 286.270960189 148.176894431 294.382964607 159.284807313 300.143082493 171.760348408 304.805695005 184.150349498 307.906551951 197.109072275 308.648387759 209.9090878 306.997853933 221.763652957 303.632736686 232.976811538 298.371380112 243.313391724 290.753303406 251.547055176 281.909683838 259.079713355 271.285673345 266.055612925 260.371505042 270.922446323 249.513108368 274.173416983 236.570678986 276.318786832 224.165358577 278.511594085 211.856639382 279.850656213 199.43452275 280.48207916 186.627318385 281.359786384 174.480477717 280.71008111 161.788855584 280.240005428 149.546923494 114.776718843 130.026054686 126.802660607 116.470047142 140.926979387 112.161135443 155.809044008 111.513703432 170.989931321 113.734198084 206.81956405 113.416443746 221.231868202 110.551935394 236.446110433 110.387913174 250.598873566 114.121100401 262.7949305 126.762582186 190.462867385 141.165597113 191.704650084 156.929746184 192.486128419 172.887552667 193.197635118 188.752884044 170.352755649 207.031523004 181.722657514 208.266898868 193.350185977 208.487303863 204.369989586 206.833425733 215.501570173 205.369303632 130.445505512 146.089353405 139.575983997 138.814455686 160.011201053 137.842320346 167.214147406 145.028317219 158.773639532 148.070897989 140.222149763 148.531044565 214.284781993 144.156017362 221.417594051 136.553173644 241.751017434 136.684627323 250.372160509 143.291921358 241.559255523 146.705148481 222.667926356 146.166783324 127.236679659 126.268764673 141.661375807 122.64085233 156.336433188 122.356325844 170.626515193 123.084118034 207.666381709 122.44431167 222.223457157 120.615482902 236.415668678 120.86156479 250.407494877 123.554781847 149.705670127 135.506039277 149.583286841 149.069716394 149.185053546 143.764638728 231.644849473 133.962891545 231.924800855 146.899922154 232.375212511 142.080449347 179.314053531 142.876625959 202.236645331 142.239443482 170.81541418 183.768115179 213.859986713 182.847743463 163.36504142 197.712984135 221.715279899 196.474880138 155.383622151 241.740713986 169.128137125 231.598364283 185.698832757 227.337143351 194.389010614 228.297262451 203.091920123 226.42736867 217.939850558 230.334496616 231.742232546 240.185106294 220.587117123 248.5110909 208.325994632 253.496385837 194.767459539 255.309756454 181.02197186 255.085275321 166.99591882 250.014994271 160.23407408 241.855617836 177.244596047 239.569995204 194.413595416 240.393053561 211.024200359 238.993029696 226.884265788 240.182753975 210.652131777 239.783050873 194.514502169 240.325932349 177.266949982 240.601636261 149.204051663 143.764719473 232.398432433 142.080548036 161.902638372 141.97302522 161.083904465 144.217696648 159.390825662 146.394620123 157.065910036 147.900393834 154.377663803 148.636944791 151.613118789 148.617454921 149.029636705 147.717065292 146.857904827 145.961383998 145.469209489 143.572230479 144.886584764 141.090103291 144.895117244 139.744767082 145.599834677 137.340586913 147.061494803 134.979028085 149.286748601 133.413020827 151.851600845 132.571102068 154.676939495 132.524707533 157.309908368 133.205066073 159.574569917 134.783823367 161.171677626 136.976137953 161.922718235 139.235221287 230.3750665 140.51477828 231.144853166 
142.667690638 232.691438079 144.496005154 234.690187078 145.728141752 236.969894953 146.250085665 239.279855676 146.110730268 241.371559996 145.173215545 243.070117913 143.528272256 244.042921599 141.455945441 244.383568896 139.449889236 244.347560528 138.816454173 243.843760322 136.827214686 242.719701849 134.753715986 240.923700261 133.333160137 238.802368137 132.467102467 236.420383835 132.33454413 234.208155178 132.822620016 232.271848014 134.116840428 230.946729198 135.973761253 230.299017227 138.210012145
9/19866.png
71.1380890222 165.903371544 72.1373437769 179.861751895 73.9160065208 193.614105309 75.5480820819 206.993013268 77.9833270948 220.260312009 81.3361714966 233.697589371 85.9171983494 246.168097055 92.5661745047 257.606296512 100.713659407 268.247811494 110.923776691 277.054518239 122.670659475 284.170051944 134.861420229 289.945543165 147.626347692 294.565163269 160.942726672 297.338217906 174.778051097 299.936826219 188.224394245 302.088968578 201.845274656 302.380157842 212.884697364 300.488288372 222.47167363 296.612291054 230.929110439 290.720362474 238.63571484 283.040936489 245.006885454 274.913737172 250.615079297 264.862158101 255.804705782 254.761867456 258.699722786 244.916690728 260.798001426 233.29678996 262.350103692 222.571138742 264.149218579 211.998393559 265.11077503 201.294840875 265.037411178 190.131847287 264.687755095 180.021403717 263.096848336 169.094678711 262.002008497 158.651814152 114.355088401 128.201018348 129.452349106 115.55958619 145.988209527 111.827849811 162.95867987 112.546047582 179.61050592 116.844981784 214.501007075 120.48391656 226.009715434 117.553267626 238.099074234 117.262456669 249.162728851 120.480730753 257.761550799 133.427375413 200.263190979 142.965720146 203.511968717 153.808344999 206.541994853 164.765310602 209.578876645 175.772068472 179.426290566 198.871030337 191.533196385 199.261059932 204.216471854 198.744612514 213.325675653 199.5963106 222.558897538 198.604378825 129.766719244 145.258136285 138.264043522 138.141188862 160.147508246 137.434389998 168.487493719 145.406067456 158.898600014 148.707367395 138.844084224 148.348133258 217.523367788 148.927459019 224.75104213 142.451034786 243.776052192 144.02910018 249.614602917 152.065300183 242.241884061 154.716912205 225.045427595 152.22466383 129.508915494 123.945260076 146.505998079 121.098374856 163.474358439 122.973780736 179.351196635 125.854864339 215.130726934 128.617628921 226.825064579 126.860540475 237.905745859 126.669445271 248.799051915 129.154768406 149.021113564 134.864983559 148.986714194 149.402028591 149.372567805 143.579310594 234.543688901 140.542036241 233.422088116 153.945105046 233.948576683 148.79818778 184.669695617 143.976513245 208.81139265 145.625299941 179.278638491 176.489401064 221.422219994 177.172191145 171.989326524 189.798118449 227.748028697 191.037145751 160.078160049 236.170271768 177.76068865 227.243542964 197.209128245 221.54018557 205.445580006 221.955163711 213.066006696 221.533310732 223.763649853 227.231061301 231.831781057 237.506795589 223.880410198 242.531033348 214.706312324 244.722487011 204.650519006 245.782694371 190.155366921 244.706560587 174.754485318 241.336555332 164.677039046 236.156250318 184.648527173 232.61975641 204.723492276 233.145384512 217.011964598 233.687213117 228.064864904 237.040902156 216.497164633 234.019431694 204.724889329 232.753980612 184.738615802 233.316807412 149.388930466 143.579423724 233.959048786 148.798260183 162.311358347 139.145912425 161.499830404 141.403693053 159.974168473 143.382284924 157.875471968 144.743613043 155.427583739 145.394317067 152.93510269 145.29984885 150.681798294 144.31424482 148.88153413 142.556800399 147.809970123 140.297127402 147.410881244 138.09381231 147.417074586 137.292700956 147.952818363 135.161957514 149.113044381 132.928842032 151.021382359 131.361982023 153.295564335 130.463970827 155.847298171 130.319287904 158.231195323 130.87707263 160.269421329 132.297766292 161.68311161 134.301916853 162.387373678 136.67163085 230.923196606 145.310838814 231.570450797 
147.310999387 232.86267316 149.085876465 234.596894708 150.360469971 236.67420681 150.956061799 238.811438963 150.837312714 240.683678704 149.8527417 242.118819224 148.186393006 242.889952887 146.185566421 243.139185337 144.313348083 243.107911784 143.629794476 242.704939057 141.798884498 241.78458825 139.821053657 240.222269877 138.391868694 238.297544168 137.500826244 236.087417854 137.370347011 234.070932468 137.917344062 232.397291204 139.241029632 231.327733049 141.046049773 230.81829171 143.159302548
9/19592.png
93.1103828317 128.555316423 93.9579375696 141.997442624 94.9630698611 155.075739337 96.2527014618 168.153925337 98.949971285 180.91907184 102.493106105 193.710115934 106.147364026 206.15917502 111.701013192 218.025525132 117.490821356 229.89503002 124.667765791 241.049012847 133.163913252 251.200697234 141.666107575 261.28671298 150.265603769 271.430234609 159.533024374 280.345535155 170.151899215 289.17689395 181.897672381 295.497431121 195.552087571 297.5141972 209.319749494 295.762251851 221.863714807 290.656649386 233.005970491 282.790299228 243.699518017 273.37099591 252.504897892 263.559259825 261.234215311 252.24173429 269.423320072 241.331204645 276.105440973 230.091608723 280.403979463 216.951564564 283.875600598 204.150155327 286.846254463 191.11663797 289.121432594 177.740467305 290.333348564 164.20227521 291.749545422 151.058176743 291.687914371 137.520339937 291.654577003 124.366200265 103.811588331 127.036356879 118.161415933 119.337675353 133.820788065 121.04206593 148.990330866 125.445520681 164.324128673 131.089980206 208.194472805 130.83337884 223.559479444 125.092637181 240.139683793 120.526540509 256.640657977 119.054413893 271.974614727 125.858475006 187.310276574 153.625661944 187.875499807 174.501389727 188.210644592 195.344787042 188.355252839 216.298750767 169.450216531 220.182332433 179.734993839 224.873200817 190.436355297 227.621018239 201.097943955 223.16255668 212.079249722 219.132964731 121.585178291 148.039526834 132.529798333 141.66171675 156.458066296 147.298606759 163.538194173 156.567410784 152.790827474 159.693076499 130.302048228 155.725042037 216.41996051 156.437341196 223.621121138 146.87059545 247.482474749 141.9689452 258.013996769 147.333779212 250.328263998 155.551658713 227.45316976 158.716323091 118.124889398 127.924893697 134.003485338 129.571729267 149.008858017 134.274359254 163.501557502 138.440520402 209.093339967 137.847800726 224.821456296 132.998993984 240.499978384 128.967511748 256.739293283 126.925260518 145.03468624 141.392731508 141.333071077 159.717038747 142.751317023 151.277600901 235.05196336 141.604779943 239.202063746 158.72413358 237.359143259 151.061047631 175.898944494 155.09274053 201.707729548 154.793181484 168.850225478 199.282768939 211.826491789 198.730202612 162.642130483 211.341925478 218.803904487 210.306257078 157.000843629 238.834750574 170.461468081 238.819452724 184.580801051 239.343466676 192.282692033 240.367078182 199.720242694 239.019026605 216.700923959 238.562958374 234.478645346 238.252992256 223.866465511 252.324619915 210.660077513 263.071081076 193.519329823 267.300583324 178.507670296 263.769608724 165.842047597 252.523428893 162.328257057 241.073087393 177.484753051 243.258899053 192.537038629 246.774111653 211.093206322 243.469209608 229.19303173 240.627606317 211.991032229 251.133092075 193.204495945 254.60154403 176.75074308 250.711072944 142.772905493 151.277140381 237.360632119 151.061015871 151.860776796 146.99226547 150.989102504 149.701709366 149.17980628 152.063983359 146.698374342 153.676462411 143.807775261 154.437461777 140.861519949 154.369975975 138.132332036 153.340831877 135.851558438 151.406910261 134.408567373 148.840683054 133.801999985 146.276722238 133.747133893 145.415450389 134.28787995 142.903726571 135.614420726 140.255332618 137.836727942 138.36611378 140.497973714 137.222727157 143.513521127 136.942417359 146.373352357 137.462331844 148.941738633 138.983474434 150.811510868 141.271736131 151.783479863 144.066973074 230.513224231 147.220335715 231.370199713 149.966636624 
233.053320218 152.442837813 235.39915323 154.249505676 238.249578312 155.096574301 241.186996191 154.889428944 243.724746426 153.473112249 245.631580792 151.1445065 246.666376681 148.366043286 247.000703394 145.74402836 246.997052338 144.734944557 246.512686492 142.243598616 245.299333538 139.479577718 243.191230159 137.435470813 240.531278544 136.19556976 237.488179949 136.057162453 234.725374541 136.908531855 232.478294694 138.788597142 231.041016688 141.308274008 230.375324817 144.247181337
9/19815.png
67.4290280087 142.58021438 68.7332482742 157.556119916 70.8687359055 172.098550856 73.0435762275 186.57646974 76.2482887896 200.795390503 80.1144470239 214.986703628 84.9175257136 228.735898739 91.8145191665 241.757809092 99.8222948141 254.228945954 109.615489135 265.130777031 120.926757184 275.14062748 132.797298827 283.869562173 145.443135448 291.670269503 159.01715241 296.816230584 173.750948237 300.993853204 188.139961554 303.327691227 202.773286403 302.309763079 213.743195797 297.949526992 222.356421727 291.075639304 229.315866311 282.088206754 235.432463092 272.013718472 240.430579698 261.853007718 245.356423329 250.702123866 250.347628365 240.171474841 253.992150748 230.069242236 256.710479884 218.251783001 259.317692677 207.542237912 262.016607504 196.570257832 263.821329472 185.33595519 263.948435854 173.590934305 264.088819275 162.675738204 262.667714493 151.214914998 261.648610784 140.020577199 111.335152922 122.31913206 128.227550748 109.853975094 145.755284676 109.522645493 162.819527924 112.900347418 179.887830415 118.192103765 216.116874797 121.134720992 227.633485885 117.276084359 239.316046954 114.262553955 250.089046742 113.09263191 259.791047002 123.430274022 200.862480196 149.382364574 203.773738076 163.39201736 206.773886579 177.298563837 209.590355425 191.50959993 181.08715819 208.64168369 193.2564458 209.420059653 205.61404096 209.434561057 214.095267204 208.833225566 222.241641604 205.754245662 127.789244925 150.392967961 137.449054779 151.736041688 158.07812133 151.495329448 168.97085196 150.504643892 158.185233937 152.88311973 137.265018893 152.672170336 220.334084675 151.430888728 228.275527885 152.738341272 244.324741534 153.772738064 251.577740234 152.316022598 244.557262283 155.146746603 227.887229653 153.816591721 128.322149525 121.228643525 146.207698313 121.475553475 163.313393939 124.827841798 179.519664898 128.178100946 216.838873304 129.687378488 228.568504771 126.941500065 239.256114776 124.736312837 250.111425352 123.521066045 147.653135638 152.049045652 147.855761699 153.686508813 148.336859036 151.839811923 236.383653364 153.751202025 236.010509601 154.954837517 236.223545852 153.457303275 185.190144701 149.81429215 210.501283229 150.129859745 179.612844527 186.674677563 220.964266959 186.26995086 174.335284541 199.037761594 225.615432235 198.249696948 167.353503709 243.418303414 182.227736679 233.664681272 198.681277616 227.433300351 207.445531558 227.576236702 215.618407436 226.969791509 223.202835404 232.11218362 228.859941356 241.407039552 223.258178139 251.639698155 215.701009918 259.827050671 206.116373004 263.93054695 192.407323216 262.551253683 178.215629664 254.23623787 171.684264822 243.578858589 189.025303288 237.660718158 206.637133313 237.716565089 216.477124843 238.057665488 225.310979492 241.64320383 216.15471287 247.376245289 206.148382767 249.471790765 188.51482164 248.940525056 148.352645392 151.839546283 236.251698297 153.457516107 160.267712116 149.179113448 159.43095829 151.636652317 157.702232389 153.841443064 155.333738175 155.358770166 152.59641424 156.092075771 149.797117789 156.080136032 147.199270397 155.146878899 145.029494059 153.336550002 143.66686267 150.899543163 143.096761456 148.436155694 143.070837799 147.425979055 143.678866011 145.003184083 145.032964525 142.526641405 147.208044483 140.816899986 149.767295025 139.823639271 152.629750051 139.631295165 155.324199687 140.185320049 157.698350527 141.696990114 159.399215421 143.874965669 160.265850352 146.416436792 223.541692917 150.901129328 224.571934803 
153.658859301 226.86525514 156.527840872 229.985994479 158.506165209 233.595381357 159.494489555 237.33585514 159.500890314 240.795276061 158.309171393 243.628697056 155.901794311 245.413763891 152.638756726 246.1138041 149.033534295 245.833852835 145.575248294 244.322678028 142.266753131 241.704549493 139.557026074 238.281906105 138.03828309 234.607445166 137.507614593 230.856624932 138.110364571 227.62984499 139.785624737 225.26134778 142.56759108 223.773938238 145.853082244 223.324218803 147.198064048
9/19930.png
115.696064505 143.544442442 116.350187579 155.611988809 116.47203394 167.338800154 116.704314515 179.070601861 118.276395389 190.588378671 120.944312914 202.390024051 123.972408673 213.738630948 128.421165984 224.350530204 132.663799648 235.517313695 137.654386105 246.136345186 143.772528311 256.32047847 149.841142133 266.588146394 156.042601824 276.680705439 162.407950504 285.773264109 170.046750491 294.878155891 179.21571043 302.01893034 190.68146125 305.264524969 205.367399748 305.359275362 219.283801015 302.775914619 232.778978244 298.05655063 246.388756508 291.059001013 258.023895532 282.986947554 269.058382374 273.084829181 279.582029186 262.413335461 288.261920312 251.445214989 294.850085049 237.819360227 299.156896006 224.112732267 302.942496136 209.957194298 305.264275248 195.602036595 306.54800555 180.842693008 307.873552603 166.564100341 307.994536973 151.875868243 308.378566582 137.525359624 115.507286856 130.690233517 125.933540515 121.472236341 137.622330835 122.169121934 149.329663323 124.920928001 161.543384578 129.122250021 196.220352911 126.274359194 213.166657302 120.735326624 231.483137204 117.587069281 249.681690467 118.505555204 266.754481254 128.462296907 177.171909671 145.566004337 175.666929463 163.749656926 173.482990913 181.969009303 171.550421166 200.211609148 160.482647571 209.081828488 169.433834445 214.346267208 179.319617596 215.231300039 191.053389007 212.024247427 202.461342479 209.080817272 127.769300554 146.347970071 135.896175947 138.845267178 156.422990102 141.518512622 163.178789754 149.08788459 154.775641765 152.716552289 135.774260647 152.07887395 209.029306855 147.600041184 217.088200594 137.894490053 242.012090078 136.260150865 252.155522929 143.525520352 243.329370195 149.859107256 219.935829044 150.802659843 125.724588989 129.550633975 137.740465715 130.004260155 149.45763498 133.23155562 160.98285783 136.352112075 196.593701246 134.003918243 213.846052074 129.312960248 231.487463567 126.051131173 249.555206909 126.034133736 146.463564263 137.217649062 145.318450633 153.874085336 145.645357738 146.355442246 229.376111413 133.928874171 231.633476806 151.747595501 230.75665818 144.13973958 170.674301814 147.116830624 192.920814509 146.274403245 160.567766753 188.556655196 202.970306352 188.341306581 154.803398908 200.756300261 210.073515122 200.429153001 154.493607992 233.712371383 163.628481562 232.567662664 174.270259283 232.928181718 182.079036509 233.625414775 189.947315063 232.864258029 211.377737714 233.899923257 234.060836745 235.154295734 219.620397683 247.786909148 203.374888222 257.282336871 184.553236389 260.641425213 171.672609176 257.156234377 161.295739072 246.535104176 158.432814048 235.731441092 170.320372275 237.893707999 182.724819265 240.36140031 206.053183208 238.742044446 229.17920533 236.859654867 207.037145567 245.391681229 184.163314974 247.289353972 170.634106519 244.05228257 145.65980893 146.354542045 230.755960389 144.139123626 150.04037739 142.459464538 149.80211944 143.557075938 148.780916239 146.662599529 146.917237617 149.401870499 144.175560025 151.236376261 140.916020259 151.913544977 137.641702898 151.390168676 134.682979849 149.814479516 132.474286557 147.375046191 131.16347445 144.391709879 130.708565381 141.164484496 131.073924049 137.894075654 132.370654106 134.723251059 134.689776662 132.425117298 137.694417136 131.12814479 141.141495674 131.026838168 144.340446248 131.907040211 147.010534486 133.803383511 148.871589005 136.410204219 149.744987363 139.114816651 213.043686594 140.66344385 213.435628363 
141.461327145 214.917338616 144.689873238 217.330305608 147.381009776 220.592641193 149.067032726 224.270039223 149.529691383 227.895956643 148.795410188 231.146949289 146.964786675 233.549515306 144.164251378 234.835701551 140.760648454 235.046499866 137.253244007 234.241058777 133.691766649 232.414507978 130.400295606 229.537861163 128.183811613 226.064861671 127.170332709 222.270079972 127.353305959 218.778531738 128.487398728 215.86854101 130.67924472 213.848648998 133.631348361 212.999203225 136.995876053
9/18021.png
98.1740145477 143.737152885 98.8121578128 156.817437379 99.7980012578 169.454570115 101.160277694 182.180083799 103.576969893 194.454867027 106.527041041 206.95687014 109.339843816 219.201557849 113.63691457 230.905988874 118.275806515 242.773715094 124.527257821 253.767344001 132.378374802 263.630616185 140.724951679 273.256145464 149.458747713 282.262626976 159.060119945 289.682351933 169.995988156 296.645620481 181.693667578 301.382499982 194.641197274 302.687351513 207.911637798 301.224459036 220.187874892 297.15900161 231.559592649 290.659651019 242.391782651 282.277989219 251.386126257 273.345289202 260.121970439 262.643698973 268.290586433 251.942063862 274.524003477 240.948459542 278.458482506 227.91206073 281.49333905 215.144548716 284.448089475 202.284506029 286.759129138 189.19784927 288.218788629 175.955177197 289.921639903 163.196717898 289.914935047 149.856489129 289.843777086 136.883671477 108.297142219 127.705236058 121.315077832 117.674595986 136.623006957 115.765469824 151.948324197 117.638649494 167.231888559 122.055301798 207.8346635 120.807503685 223.703587462 115.024456338 241.067658098 111.917955868 258.071261307 113.397106663 272.414574765 123.212803889 186.956867759 145.96701694 187.494614143 164.836026862 187.747965101 183.808043001 187.930737648 202.857105717 169.208511864 214.810194211 179.662309874 217.335297847 190.28444211 218.473687409 200.750415036 215.022208799 211.385334307 212.454369527 123.366297154 147.763151528 132.431093263 140.019553416 154.792576409 142.212485902 162.570274821 150.658703655 152.529690888 153.64047128 132.053508241 152.576467692 215.426662605 149.650920584 222.98415415 140.584939865 246.453545802 137.679840618 256.566028346 144.090198973 247.46102067 150.286705288 225.489836739 151.608516446 121.457065654 126.200285856 136.99220992 125.045995592 152.00246264 127.444929794 166.433354932 130.042929836 208.942802751 128.579838374 224.986069715 124.293060791 241.180550544 121.532143493 257.804736581 121.804026077 143.745767305 137.96238106 142.242355379 154.454877584 143.053116479 147.52103608 234.436755353 136.130643567 236.470232786 151.988463482 236.005109003 145.39109008 175.207477251 148.177910507 201.0059318 147.485118888 168.95193292 193.140483222 210.25594688 191.668286907 162.214269176 206.362246271 217.865957379 204.170321446 155.745107366 241.712335536 169.825724335 239.079115781 183.992935379 237.12232236 192.260749277 238.277827147 200.561339062 236.298703268 216.430851617 237.237744639 233.581266984 238.651602649 221.638588672 249.658701054 208.373373068 257.697833166 192.718246806 260.953533931 178.813498546 259.419528187 165.877125036 251.827667613 161.471315219 243.366756126 176.955150374 244.585626651 192.460791815 246.70709279 210.37895407 243.451799751 227.868811532 240.344829033 210.150275491 245.008597189 192.503082291 247.303576536 176.859917787 246.301303818 143.075154317 147.520180323 236.013924138 145.390747778 149.751394076 142.947511137 148.984696314 145.483820588 147.324804097 147.715655164 145.045501464 149.266739461 142.37289414 150.029709406 139.62804261 150.031090034 137.080657748 149.111796273 134.939107412 147.332892951 133.577235021 144.962372922 132.986836639 142.584702354 132.933977423 141.753939983 133.412043064 139.400992812 134.623562567 136.914359277 136.664992954 135.129730112 139.125731715 134.00713616 141.929380454 133.701098733 144.597661321 134.152941226 146.984118304 135.550804871 148.71853803 137.663340967 149.645536926 140.221430607 229.130265137 142.13992073 229.974513525 144.177403666 
231.507327549 145.948309791 233.472149537 147.1168215 235.729792027 147.584609599 237.996073951 147.346523749 239.986151379 146.285612604 241.541372783 144.541198914 242.371743337 142.43206709 242.59949638 140.434619936 242.529462461 139.843627715 241.957584297 137.886059678 240.830574042 135.861028147 239.034645256 134.513803772 236.937066989 133.753611997 234.587534389 133.723456441 232.442138338 134.308513822 230.629895797 135.686228409 229.458474514 137.576121749 228.927047535 139.881113255
================================================
FILE: utils/metric.py
================================================
import torch as th
import torch.nn as nn
def gram_matrix(feat):
    # https://github.com/pytorch/examples/blob/master/fast_neural_style/neural_style/utils.py
    (b, ch, h, w) = feat.size()
    feat = feat.view(b, ch, h * w)
    feat_t = feat.transpose(1, 2)
    # (b, ch, h*w) @ (b, h*w, ch) -> (b, ch, ch), as in the referenced implementation
    gram = th.bmm(feat, feat_t) / (ch * h * w)
    return gram
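# Minimal self-check sketch (synthetic tensor; the shape is illustrative only):
if __name__ == '__main__':
    feats = th.randn(2, 64, 32, 32)  # (batch, channels, H, W), e.g. a VGG activation
    g = gram_matrix(feats)           # per-sample channel-correlation (Gram) matrix
    print(g.shape)                   # torch.Size([2, 64, 64])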
================================================
FILE: utils/points2heatmap.py
================================================
import numpy as np
import cv2
import os
import math
def curve_interp(points, heatmapSize=256, sigma=3):
    sigma = max(1, (sigma // 2) * 2 + 1)  # GaussianBlur needs an odd kernel size
    img = np.zeros((heatmapSize, heatmapSize), np.uint8)
    for ii in range(1, points.shape[0]):
        cv2.line(img, tuple(points[ii - 1].astype(np.int32)),
                 tuple(points[ii].astype(np.int32)), (255), sigma)
    img = cv2.GaussianBlur(img, (sigma, sigma), sigma)
    return img.astype(np.float64) / 255.0
def curve_fill(points, heatmapSize=256, sigma=3, erode=False):
    sigma = max(1, (sigma // 2) * 2 + 1)
    points = points.astype(np.int32)
    canvas = np.zeros([heatmapSize, heatmapSize])
    # hull = cv2.convexHull(points)
    # cv2.fillConvexPoly(canvas, hull, 255)
    cv2.fillPoly(canvas, np.array([points]), 255)
    kernel = np.ones((sigma, sigma), np.uint8)
    if erode:
        erode_kernel = np.ones((int(0.5 * sigma), int(0.5 * sigma)), np.uint8)
        canvas = cv2.erode(canvas, erode_kernel)
    else:
        canvas = cv2.dilate(canvas, kernel)
    canvas = cv2.GaussianBlur(canvas, (sigma, sigma), sigma)
    return canvas.astype(np.float64) / 255.0
def curves2heatmap(curves, heatmapSize=256, sigma=3, flag='line'):
    # -----------------------input--------------------------
    # curves [list of ndarray]: point coordinates in [0, heatmapSize]
    # heatmapSize [int]: the size of the generated heatmap
    # sigma [float]: boundary vagueness
    # flag [string]: 'line' or 'segment'
    # -----------------------output--------------------------
    # heatmap [ndarray, float64]: [D, D, num of curves], range in (0.0, 1.0)
    # =======================================================
    heatmap = np.zeros((heatmapSize, heatmapSize, len(curves)), np.float64)
    for i in range(len(curves)):
        if flag == 'line':
            heatmap[:, :, i] = curve_interp(curves[i], heatmapSize, sigma)
        else:
            heatmap[:, :, i] = curve_fill(curves[i], heatmapSize, sigma)
    return heatmap
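# Usage sketch (a synthetic three-point curve, not real landmark data):
#   curves = [np.array([[10.0, 10.0], [120.0, 40.0], [200.0, 200.0]])]
#   hm = curves2heatmap(curves, heatmapSize=256, sigma=3, flag='line')
#   # hm.shape == (256, 256, 1), values in [0.0, 1.0]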
def curves2segments(curves, heatmapSize=256, sigma=3):
    # res [ndarray]: range in (0, 1), [Channel, Size, Size]
    face = curve_fill(np.vstack([curves[0], curves[2][::-1], curves[1][::-1]]), heatmapSize, sigma)
    browL = curve_fill(np.vstack([curves[1], curves[13][::-1]]), heatmapSize, sigma)
    browR = curve_fill(np.vstack([curves[2], curves[14][::-1]]), heatmapSize, sigma)
    eyeL = curve_fill(np.vstack([curves[5], curves[6]]), heatmapSize, sigma)
    eyeR = curve_fill(np.vstack([curves[7], curves[8]]), heatmapSize, sigma)
    gazeL = curve_fill(curves[15], heatmapSize, sigma)
    gazeR = curve_fill(curves[16], heatmapSize, sigma)
    # intersect eye and gaze
    gazeL = gazeL * eyeL
    gazeR = gazeR * eyeR
    # 2 to 1
    eye = np.max([eyeL, eyeR], axis=0)
    gaze = np.max([gazeL, gazeR], axis=0)
    brow = np.max([browL, browR], axis=0)
    nose = curve_fill(np.vstack([curves[3][0:1], curves[4]]), heatmapSize, sigma)
    lipU = curve_fill(np.vstack([curves[9], curves[10][::-1]]), heatmapSize, sigma)
    lipD = curve_fill(np.vstack([curves[11], curves[12][::-1]]), heatmapSize, sigma)
    tooth = curve_fill(np.vstack([curves[10], curves[11][::-1]]), heatmapSize, sigma)
    return np.stack([face, brow, eye, gaze, nose, lipU, lipD, tooth])
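# Usage sketch: with the 17 curves produced by points2curves below, this yields
# an (8, D, D) stack ordered [face, brow, eye, gaze, nose, lipU, lipD, tooth]:
#   curves, _ = points2curves(points)   # points normalized to (0.0, 1.0)
#   segs = curves2segments(curves)      # segs.shape == (8, 256, 256)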
def points2curves(points, heatmapSize=256, sigma=1, heatmap_num=17):
    # -----------------------input--------------------------
    # points [ndarray]: [..., [x, y], ...], range in (0.0, 1.0)
    # heatmapSize [int]: the size of the generated heatmap
    # heatmap_num [int]: number of heatmap channels
    # -----------------------output--------------------------
    # curves [list of ndarray]: point coordinates in [0, heatmapSize]
    # boundary [ndarray, float64]: (heatmapSize, heatmapSize, heatmap_num)
    # =======================================================
    # resize points from (0, 1) to heatmap scale (0, D); note: modifies `points` in place
    for i in range(points.shape[0]):
        points[i] *= float(heatmapSize)
    # curve definitions
    curves = [0] * heatmap_num
    curves[0] = np.zeros((33, 2))   # contour
    curves[1] = np.zeros((5, 2))    # left top eyebrow
    curves[2] = np.zeros((5, 2))    # right top eyebrow
    curves[3] = np.zeros((4, 2))    # nose bridge
    curves[4] = np.zeros((9, 2))    # nose tip
    curves[5] = np.zeros((5, 2))    # left top eye
    curves[6] = np.zeros((5, 2))    # left bottom eye
    curves[7] = np.zeros((5, 2))    # right top eye
    curves[8] = np.zeros((5, 2))    # right bottom eye
    curves[9] = np.zeros((7, 2))    # upper lip, top edge
    curves[10] = np.zeros((5, 2))   # upper lip, bottom edge
    curves[11] = np.zeros((5, 2))   # lower lip, top edge
    curves[12] = np.zeros((7, 2))   # lower lip, bottom edge
    curves[13] = np.zeros((5, 2))   # left bottom eyebrow
    curves[14] = np.zeros((5, 2))   # right bottom eyebrow
    if heatmap_num == 17:
        curves[15] = np.zeros((20, 2))  # left gaze
        curves[16] = np.zeros((20, 2))  # right gaze
    # assignment process
    # contour
    for i in range(33):
        curves[0][i] = points[i]
    for i in range(5):
        # left top eyebrow
        curves[1][i] = points[i + 33]
        # right top eyebrow
        curves[2][i] = points[i + 38]
    # nose bridge
    for i in range(4):
        curves[3][i] = points[i + 43]
    # nose tip
    curves[4][0] = points[80]
    curves[4][1] = points[82]
    for i in range(5):
        curves[4][i + 2] = points[i + 47]
    curves[4][7] = points[83]
    curves[4][8] = points[81]
    # left top eye
    curves[5][0] = points[52]
    curves[5][1] = points[53]
    curves[5][2] = points[72]
    curves[5][3] = points[54]
    curves[5][4] = points[55]
    # left bottom eye
    curves[6][0] = points[55]
    curves[6][1] = points[56]
    curves[6][2] = points[73]
    curves[6][3] = points[57]
    curves[6][4] = points[52]
    # right top eye
    curves[7][0] = points[58]
    curves[7][1] = points[59]
    curves[7][2] = points[75]
    curves[7][3] = points[60]
    curves[7][4] = points[61]
    # right bottom eye
    curves[8][0] = points[61]
    curves[8][1] = points[62]
    curves[8][2] = points[76]
    curves[8][3] = points[63]
    curves[8][4] = points[58]
    # upper lip, top edge
    for i in range(7):
        curves[9][i] = points[i + 84]
    # upper lip, bottom edge
    for i in range(5):
        curves[10][i] = points[i + 96]
    # lower lip, top edge
    curves[11][0] = points[96]
    curves[11][1] = points[103]
    curves[11][2] = points[102]
    curves[11][3] = points[101]
    curves[11][4] = points[100]
    # lower lip, bottom edge
    curves[12][0] = points[84]
    curves[12][1] = points[95]
    curves[12][2] = points[94]
    curves[12][3] = points[93]
    curves[12][4] = points[92]
    curves[12][5] = points[91]
    curves[12][6] = points[90]
    # left bottom eyebrow
    curves[13][0] = points[33]
    curves[13][1] = points[64]
    curves[13][2] = points[65]
    curves[13][3] = points[66]
    curves[13][4] = points[67]
    # right bottom eyebrow
    curves[14][0] = points[68]
    curves[14][1] = points[69]
    curves[14][2] = points[70]
    curves[14][3] = points[71]
    curves[14][4] = points[42]
    if heatmap_num == 17:
        # left gaze
        for i in range(20):
            curves[15][i] = points[106 + i]
        # right gaze
        for i in range(20):
            curves[16][i] = points[106 + 20 + i]
    boundary = np.zeros((heatmapSize, heatmapSize, heatmap_num))
    for i in range(heatmap_num):
        # drop consecutive duplicate points before fitting
        valid_points = [curves[i][0, :]]
        for j in range(1, curves[i].shape[0]):
            if distance(curves[i][j, :], curves[i][j - 1, :]) > 0.001:
                valid_points.append(curves[i][j, :])
        if len(valid_points) > 1:
            curve_map = curve_fitting(curves[i], heatmapSize, sigma)
            boundary[:, :, i] = curve_map
    # curve_interp already returns values in [0.0, 1.0]; no further scaling needed
    return curves, boundary.astype(np.float64)
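# Sketch of the landmark -> boundary-heatmap path (names illustrative; the
# 106+40-point layout is implied by the indexing above). Note that
# points2curves rescales `pts` in place:
#   pts = lms.reshape(-1, 2) / 256.0        # normalize pixel coords to (0, 1)
#   curves, boundary = points2curves(pts)   # boundary: (256, 256, 17)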
def distance(p1, p2):
    return math.sqrt((p1[0] - p2[0]) * (p1[0] - p2[0]) + (p1[1] - p2[1]) * (p1[1] - p2[1]))

def curve_fitting(points, heatmap_size, sigma):
    curve_tmp = curve_interp(points, heatmap_size, sigma)
    return curve_tmp
if __name__ == '__main__':
    import matplotlib.pyplot as plt
    res = list()
    path = '../2019CVPR_reconstruct/data/celebHQ/lms.txt'
    head = '../2019CVPR_reconstruct/data/celebHQ/align_384'
    with open(path, 'r') as fin:
        data = fin.read().splitlines()
    N = len(data) // 2
    for i in range(N):
        imgPath = os.path.join(head, data[2 * i + 0])
        landmarks = list(map(float, data[2 * i + 1].split()))
        res.append((imgPath, landmarks))
    for path, landmark in res:
        points = (np.array(landmark).reshape(-1, 2).astype(np.float32) - 64) / 256.0
        curves, _ = points2curves(points)  # points2curves returns (curves, boundary)
        segments = curves2segments(curves)
        img = np.sum(segments, axis=0)
        plt.figure()
        plt.imshow(img)
        for i in range(len(points)):
            plt.plot(points[i][0], points[i][1], '.', markersize=1)
            if i <= 106:
                plt.text(points[i][0], points[i][1], str(i), fontsize=5)
            else:
                plt.text(points[i][0], points[i][1], str(i), fontsize=3)
        plt.show()
================================================
FILE: utils/transforms.py
================================================
"""
Affine transforms implemented on torch tensors, and
requiring only one interpolation
"""
import math
import random
import torch as th
import cv2
from utils.affine_util import th_affine2d, exchange_landmarks
import numpy as np
#from utils.points2heatmap import getCurve17
def initAlignTransfer(size, mirror=False, gaze=True):
    corr_list_base = np.array([0, 32, 1, 31, 2, 30, 3, 29, 4, 28, 5, 27, 6, 26, 7, 25, 8, 24, 9, 23,
                               10, 22, 11, 21, 12, 20, 13, 19, 14, 18, 15, 17, 33, 42, 34, 41, 35, 40,
                               36, 39, 37, 38, 64, 71, 65, 70, 66, 69, 67, 68, 52, 61, 53, 60, 72, 75,
                               54, 59, 55, 58, 56, 63, 73, 76, 57, 62, 74, 77, 104, 105, 78, 79, 80, 81,
                               82, 83, 47, 51, 48, 50, 84, 90, 96, 100, 85, 89, 86, 88, 95, 91, 94, 92,
                               97, 99, 103, 101]).reshape([-1, 2])
    corr_list_gaze = 106 + np.array([0, 20, 1, 21, 2, 22, 3, 23, 4, 24, 5, 25, 6, 26, 7, 27, 8, 28, 9, 29,
                                     10, 30, 11, 31, 12, 32, 13, 33, 14, 34, 15, 35, 16, 36, 17, 37, 18, 38,
                                     19, 39]).reshape(-1, 2)
    if gaze:
        corr_list = np.vstack([corr_list_base, corr_list_gaze])
    else:
        corr_list = corr_list_base
    transform_align = AffineCompose(rotation_range=10,
                                    translation_range=10,
                                    zoom_range=[0.9, 1.1],
                                    fine_size=size,
                                    mirror=mirror,
                                    corr_list=corr_list)
    return transform_align
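# Usage sketch (synthetic inputs; per the comments below, img is a 3x256x256
# array in 0-255 and lms is Nx2 in pixel coordinates):
#   align = initAlignTransfer(256, mirror=False, gaze=True)
#   img_aug, lms_aug = align(img, lms)   # one random affine applied to both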
# input pt: numpy N*2
# input img: numpy, 3*256*256, 0-255
class AffineCompose(object):
    def __init__(self,
                 rotation_range,
                 translation_range,
                 zoom_range,
                 fine_size,
                 mirror=False,
                 corr_list=None):
        self.fine_size = fine_size
        self.rotation_range = rotation_range
        self.translation_range = translation_range
        self.zoom_range = zoom_range
        self.mirror = mirror
        self.corr_list = corr_list
    def __call__(self, *inputs):
        rotate = random.uniform(-self.rotation_range, self.rotation_range)
        trans_x = random.uniform(-self.translation_range, self.translation_range)
        trans_y = random.uniform(-self.translation_range, self.translation_range)
        if not isinstance(self.zoom_range, (list, tuple)):
            raise ValueError('zoom_range must be tuple or list with 2 values')
        zoom = random.uniform(self.zoom_range[0], self.zoom_range[1])
        # rotate
        transform_matrix = np.zeros([3, 3])
        center = (self.fine_size / 2. - 0.5, self.fine_size / 2 - 0.5)
        M = cv2.getRotationMatrix2D(center, rotate, 1)
        transform_matrix[:2, :] = M
        transform_matrix[2, :] = np.array([[0, 0, 1]])
        # translate
        transform_matrix[0, 2] += trans_x
        transform_matrix[1, 2] += trans_y
        # zoom
        for i in range(3):
            transform_matrix[0, i] *= zoom
            transform_matrix[1, i] *= zoom
        transform_matrix[0, 2] += (1.0 - zoom) * center[0]
        transform_matrix[1, 2] += (1.0 - zoom) * center[1]
        # mirror about x axis in cropped image (currently disabled)
        do_mirror = False
        '''
        if self.mirror:
            mirror_rng = random.uniform(0., 1.)
            if mirror_rng > 0.5:
                do_mirror = True
        do_mirror = True
        if do_mirror:
            transform_matrix[0, 0] = -transform_matrix[0, 0]
            transform_matrix[0, 1] = -transform_matrix[0, 1]
            transform_matrix[0, 2] = float(self.fine_size) - transform_matrix[0, 2]
        '''
        outputs = []
        for idx, _input in enumerate(inputs):
            # images are 3-dimensional arrays; anything else is a landmark array
            if _input.ndim == 3:
                is_landmarks = False
            else:
                is_landmarks = True
            input_tf = th_affine2d(_input,
                                   transform_matrix,
                                   output_img_width=self.fine_size,
                                   output_img_height=self.fine_size,
                                   is_landmarks=is_landmarks)
            '''
            if do_mirror:
                if is_landmarks:
                    input_tf = exchange_landmarks(input_tf, self.corr_list)
                else:
                    # input_tf = cv2.flip(input_tf, 1)
                    pass
            '''
            outputs.append(input_tf)
        return outputs if len(inputs) > 1 else outputs[0]
def dealcurve(curve):
    cmean = curve.mean(0)
    angle = (random.random() * 10) - 5             # +/- 5 degrees
    scale = ((random.random() - 0.5) * 0.1) + 1.0  # +/- 5% scale
    m = cv2.getRotationMatrix2D((0, 0), angle, scale)
    m = np.vstack([m, [0, 0, 1]])
    dmean = (np.random.rand(1, 2) - 0.5) * 10      # +/- 5 px shift
    curve = curve - cmean
    curve = cv2.perspectiveTransform(np.array([curve]), m)
    curve += cmean
    curve += dmean
    return curve[0]
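# Usage sketch (synthetic curve; dealcurve jitters it about its own centroid):
#   eye = (np.random.rand(10, 2) * 256).astype(np.float32)
#   eye_jittered = dealcurve(eye)   # same shape, randomly rotated/scaled/shifted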
'''
def shakeCurve(points):
    pts = points.copy()
    curves, _ = getCurve17(pts)
    Lbrow = np.vstack([curves[1], curves[13]])
    Rbrow = np.vstack([curves[2], curves[14]])
    Leye = np.vstack([curves[5], curves[6], curves[15]])
    Reye = np.vstack([curves[7], curves[8], curves[16]])
    Nose = np.vstack([curves[3], curves[4]])
    Mouth = np.vstack([curves[9], curves[10], curves[11], curves[12]])
    Bound = curves[0]
    Lbrow = dealcurve(Lbrow)
    Rbrow = dealcurve(Rbrow)
    Leye = dealcurve(Leye)
    Reye = dealcurve(Reye)
    Nose = dealcurve(Nose)
    Mouth = dealcurve(Mouth)
    Bound = dealcurve(Bound)
    for i in range(33):
        pts[i] = Bound[i]
    for i in range(5):
        pts[i + 33] = Lbrow[i]
    for i in range(5):
        pts[i + 38] = Rbrow[i]
    for i in range(4):
        pts[i + 43] = Nose[i]
    pts[80] = Nose[4 + 0]
    pts[82] = Nose[4 + 1]
    for i in range(5):
        pts[i + 47] = Nose[6 + i]
    pts[83] = Nose[11]
    pts[81] = Nose[12]
    pts[52] = Leye[0]
    pts[53] = Leye[1]
    pts[72] = Leye[2]
    pts[54] = Leye[3]
    pts[55] = Leye[4]
    pts[55] = Leye[5]
    pts[56] = Leye[6]
    pts[73] = Leye[7]
    pts[57] = Leye[8]
    pts[52] = Leye[9]
    pts[58] = Reye[0]
    pts[59] = Reye[1]
    pts[75] = Reye[2]
    pts[60] = Reye[3]
    pts[61] = Reye[4]
    pts[61] = Reye[5]
    pts[62] = Reye[6]
    pts[76] = Reye[7]
    pts[63] = Reye[8]
    pts[58] = Reye[9]
    for i in range(7):
        pts[i + 84] = Mouth[i]
    for i in range(5):
        pts[96 + i] = Mouth[7 + i]
    pts[96] = Mouth[12]
    pts[103] = Mouth[13]
    pts[102] = Mouth[14]
    pts[101] = Mouth[15]
    pts[100] = Mouth[16]
    pts[84] = Mouth[17]
    pts[95] = Mouth[18]
    pts[94] = Mouth[19]
    pts[93] = Mouth[20]
    pts[92] = Mouth[21]
    pts[91] = Mouth[22]
    pts[90] = Mouth[23]
    pts[33] = Lbrow[5 + 0]
    pts[64] = Lbrow[5 + 1]
    pts[65] = Lbrow[5 + 2]
    pts[66] = Lbrow[5 + 3]
    pts[67] = Lbrow[5 + 4]
    pts[68] = Rbrow[5 + 0]
    pts[69] = Rbrow[5 + 1]
    pts[70] = Rbrow[5 + 2]
    pts[71] = Rbrow[5 + 3]
    pts[42] = Rbrow[5 + 4]
    for i in range(20):
        pts[106 + i] = Leye[10 + i]
        pts[106 + 20 + i] = Reye[10 + i]
    return pts
'''
================================================
FILE: utils/warper.py
================================================
import numpy as np
import scipy.spatial as spatial
from builtins import range
import cv2
from matplotlib import pyplot as plt
def warping(img, src_bound, dst_bound, size=(256, 256)):
    d = 254
    # pin the four image corners so the Delaunay triangulation covers the frame
    bound = np.array([0, 0, 0, d, d, 0, d, d]).reshape(-1, 2)
    src_bound = np.vstack([src_bound, bound]).astype(np.int32)
    dst_bound = np.vstack([dst_bound, bound]).astype(np.int32)
    src_bound[src_bound > d] = d
    dst_bound[dst_bound > d] = d
    res = warp_image(img, src_bound, dst_bound, size)
    return res
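# Usage sketch (hypothetical file name; landmarks are Nx2 pixel coordinates):
#   src = cv2.imread('reference.png')        # 256x256 source face
#   warped = warping(src, src_lms, dst_lms)  # piecewise-affine warp to dst_lms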
def bilinear_interpolate(img, coords):
    """ Interpolates over every image channel
    http://en.wikipedia.org/wiki/Bilinear_interpolation
    :param img: max 3 channel image
    :param coords: 2 x _m_ array. 1st row = xcoords, 2nd row = ycoords
    :returns: array of interpolated pixels with same shape as coords
    """
    int_coords = np.int32(coords)
    x0, y0 = int_coords
    dx, dy = coords - int_coords
    # 4 neighbouring pixels
    q11 = img[y0, x0]
    q21 = img[y0, x0 + 1]
    q12 = img[y0 + 1, x0]
    q22 = img[y0 + 1, x0 + 1]
    btm = q21.T * dx + q11.T * (1 - dx)
    top = q22.T * dx + q12.T * (1 - dx)
    inter_pixel = top * dy + btm * (1 - dy)
    return inter_pixel.T
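# Sketch: sampling one fractional coordinate blends the four surrounding pixels
# (tiny synthetic image; values are illustrative):
#   img = np.zeros((4, 4, 3), np.float64); img[1, 1] = 1.0
#   bilinear_interpolate(img, np.array([[0.5], [0.5]]))  # -> 0.25 per channel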
def grid_coordinates(points):
    """ x,y grid coordinates within the ROI of supplied points
    :param points: points to generate grid coordinates
    :returns: array of (x, y) coordinates
    """
    xmin = np.min(points[:, 0])
    xmax = np.max(points[:, 0]) + 1
    ymin = np.min(points[:, 1])
    ymax = np.max(points[:, 1]) + 1
    return np.asarray([(x, y) for y in range(ymin, ymax)
                       for x in range(xmin, xmax)], np.uint32)
def process_warp(src_img, result_img, tri_affines, dst_points, delaunay):
    """
    Warp each triangle from the src_image only within the
    ROI of the destination image (points in dst_points).
    """
    roi_coords = grid_coordinates(dst_points)
    # indices to vertices. -1 if pixel is not in any triangle
    roi_tri_indices = delaunay.find_simplex(roi_coords)
    for simplex_index in range(len(delaunay.simplices)):
        coords = roi_coords[roi_tri_indices == simplex_index]
        num_coords = len(coords)
        out_coords = np.dot(tri_affines[simplex_index],
                            np.vstack((coords.T, np.ones(num_coords))))
        x, y = coords.T
        result_img[y, x] = bilinear_interpolate(src_img, out_coords)
    return None
def triangular_affine_matrices(vertices, src_points, dest_points):
    """
    Calculate the affine transformation matrix for each
    triangle (x,y) vertex from dest_points to src_points
    :param vertices: array of triplet indices to corners of triangle
    :param src_points: array of [x, y] points to landmarks for source image
    :param dest_points: array of [x, y] points to landmarks for destination image
    :returns: 2 x 3 affine matrix transformation for a triangle
    """
    ones = [1, 1, 1]
    for tri_indices in vertices:
        src_tri = np.vstack((src_points[tri_indices, :].T, ones))
        dst_tri = np.vstack((dest_points[tri_indices, :].T, ones))
        mat = np.dot(src_tri, np.linalg.inv(dst_tri))[:2, :]
        yield mat
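# Note: each mat solves src_tri = M @ dst_tri in homogeneous coordinates, i.e.
# M = src_tri @ inv(dst_tri); applying it to a destination pixel returns the
# source location to sample from (inverse warping), which is what process_warp
# does above.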
def warp_image(src_img, src_points, dest_points, dest_shape, dtype=np.uint8):
    # Resultant image will not have an alpha channel
    num_chans = 3
    src_img = src_img[:, :, :3]
    rows, cols = dest_shape[:2]
    result_img = np.zeros((rows, cols, num_chans), dtype)
    delaunay = spatial.Delaunay(dest_points)
    tri_affines = np.asarray(list(triangular_affine_matrices(
        delaunay.simplices, src_points, dest_points)))
    process_warp(src_img, result_img, tri_affines, dest_points, delaunay)
    return result_img
if __name__ == "__main__":
    pass