master edf583442af9 cached
22 files
132.6 KB
34.1k tokens
1 requests
Download .txt
Repository: unibas-gravis/basel-face-pipeline
Branch: master
Commit: edf583442af9
Files: 22
Total size: 132.6 KB

Directory structure:
gitextract_eb7rw3gs/

├── LICENSE.txt
├── README.md
├── build.sbt
├── pipeline-data/
│   └── .gitignore
└── src/
    └── main/
        ├── resources/
        │   └── logback.xml
        └── scala/
            ├── ch/
            │   └── unibas/
            │       └── cs/
            │           └── gravis/
            │               └── facepipeline/
            │                   ├── BU3DDataProvider.scala
            │                   ├── DataProvider.scala
            │                   └── PipelineStep.scala
            ├── fitting/
            │   ├── StandardFitScript.scala
            │   └── experiments/
            │       ├── QualitativeLFW.scala
            │       └── RecognitionMultiPIE.scala
            ├── modelbuilding/
            │   └── ModelBuilding.scala
            ├── preprocessing/
            │   ├── ConvertBu3DRawData.scala
            │   └── PrepareReferenceLandmarks.scala
            └── registration/
                ├── Registration.scala
                ├── experiments/
                │   └── Bu3DFELandmarkEvaluation.scala
                ├── metrics/
                │   └── HuberDistanceMetric.scala
                ├── modelbuilding/
                │   ├── BuildCoreExpressionPrior.scala
                │   ├── BuildNeutralPrior.scala
                │   ├── FaceKernel.scala
                │   └── FaceMask.scala
                └── utils/
                    └── VisualLogger.scala

================================================
FILE CONTENTS
================================================

================================================
FILE: LICENSE.txt
================================================
                                 Apache License
                           Version 2.0, January 2004
                        http://www.apache.org/licenses/

   TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION

   1. Definitions.

      "License" shall mean the terms and conditions for use, reproduction,
      and distribution as defined by Sections 1 through 9 of this document.

      "Licensor" shall mean the copyright owner or entity authorized by
      the copyright owner that is granting the License.

      "Legal Entity" shall mean the union of the acting entity and all
      other entities that control, are controlled by, or are under common
      control with that entity. For the purposes of this definition,
      "control" means (i) the power, direct or indirect, to cause the
      direction or management of such entity, whether by contract or
      otherwise, or (ii) ownership of fifty percent (50%) or more of the
      outstanding shares, or (iii) beneficial ownership of such entity.

      "You" (or "Your") shall mean an individual or Legal Entity
      exercising permissions granted by this License.

      "Source" form shall mean the preferred form for making modifications,
      including but not limited to software source code, documentation
      source, and configuration files.

      "Object" form shall mean any form resulting from mechanical
      transformation or translation of a Source form, including but
      not limited to compiled object code, generated documentation,
      and conversions to other media types.

      "Work" shall mean the work of authorship, whether in Source or
      Object form, made available under the License, as indicated by a
      copyright notice that is included in or attached to the work
      (an example is provided in the Appendix below).

      "Derivative Works" shall mean any work, whether in Source or Object
      form, that is based on (or derived from) the Work and for which the
      editorial revisions, annotations, elaborations, or other modifications
      represent, as a whole, an original work of authorship. For the purposes
      of this License, Derivative Works shall not include works that remain
      separable from, or merely link (or bind by name) to the interfaces of,
      the Work and Derivative Works thereof.

      "Contribution" shall mean any work of authorship, including
      the original version of the Work and any modifications or additions
      to that Work or Derivative Works thereof, that is intentionally
      submitted to Licensor for inclusion in the Work by the copyright owner
      or by an individual or Legal Entity authorized to submit on behalf of
      the copyright owner. For the purposes of this definition, "submitted"
      means any form of electronic, verbal, or written communication sent
      to the Licensor or its representatives, including but not limited to
      communication on electronic mailing lists, source code control systems,
      and issue tracking systems that are managed by, or on behalf of, the
      Licensor for the purpose of discussing and improving the Work, but
      excluding communication that is conspicuously marked or otherwise
      designated in writing by the copyright owner as "Not a Contribution."

      "Contributor" shall mean Licensor and any individual or Legal Entity
      on behalf of whom a Contribution has been received by Licensor and
      subsequently incorporated within the Work.

   2. Grant of Copyright License. Subject to the terms and conditions of
      this License, each Contributor hereby grants to You a perpetual,
      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
      copyright license to reproduce, prepare Derivative Works of,
      publicly display, publicly perform, sublicense, and distribute the
      Work and such Derivative Works in Source or Object form.

   3. Grant of Patent License. Subject to the terms and conditions of
      this License, each Contributor hereby grants to You a perpetual,
      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
      (except as stated in this section) patent license to make, have made,
      use, offer to sell, sell, import, and otherwise transfer the Work,
      where such license applies only to those patent claims licensable
      by such Contributor that are necessarily infringed by their
      Contribution(s) alone or by combination of their Contribution(s)
      with the Work to which such Contribution(s) was submitted. If You
      institute patent litigation against any entity (including a
      cross-claim or counterclaim in a lawsuit) alleging that the Work
      or a Contribution incorporated within the Work constitutes direct
      or contributory patent infringement, then any patent licenses
      granted to You under this License for that Work shall terminate
      as of the date such litigation is filed.

   4. Redistribution. You may reproduce and distribute copies of the
      Work or Derivative Works thereof in any medium, with or without
      modifications, and in Source or Object form, provided that You
      meet the following conditions:

      (a) You must give any other recipients of the Work or
          Derivative Works a copy of this License; and

      (b) You must cause any modified files to carry prominent notices
          stating that You changed the files; and

      (c) You must retain, in the Source form of any Derivative Works
          that You distribute, all copyright, patent, trademark, and
          attribution notices from the Source form of the Work,
          excluding those notices that do not pertain to any part of
          the Derivative Works; and

      (d) If the Work includes a "NOTICE" text file as part of its
          distribution, then any Derivative Works that You distribute must
          include a readable copy of the attribution notices contained
          within such NOTICE file, excluding those notices that do not
          pertain to any part of the Derivative Works, in at least one
          of the following places: within a NOTICE text file distributed
          as part of the Derivative Works; within the Source form or
          documentation, if provided along with the Derivative Works; or,
          within a display generated by the Derivative Works, if and
          wherever such third-party notices normally appear. The contents
          of the NOTICE file are for informational purposes only and
          do not modify the License. You may add Your own attribution
          notices within Derivative Works that You distribute, alongside
          or as an addendum to the NOTICE text from the Work, provided
          that such additional attribution notices cannot be construed
          as modifying the License.

      You may add Your own copyright statement to Your modifications and
      may provide additional or different license terms and conditions
      for use, reproduction, or distribution of Your modifications, or
      for any such Derivative Works as a whole, provided Your use,
      reproduction, and distribution of the Work otherwise complies with
      the conditions stated in this License.

   5. Submission of Contributions. Unless You explicitly state otherwise,
      any Contribution intentionally submitted for inclusion in the Work
      by You to the Licensor shall be under the terms and conditions of
      this License, without any additional terms or conditions.
      Notwithstanding the above, nothing herein shall supersede or modify
      the terms of any separate license agreement you may have executed
      with Licensor regarding such Contributions.

   6. Trademarks. This License does not grant permission to use the trade
      names, trademarks, service marks, or product names of the Licensor,
      except as required for reasonable and customary use in describing the
      origin of the Work and reproducing the content of the NOTICE file.

   7. Disclaimer of Warranty. Unless required by applicable law or
      agreed to in writing, Licensor provides the Work (and each
      Contributor provides its Contributions) on an "AS IS" BASIS,
      WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
      implied, including, without limitation, any warranties or conditions
      of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
      PARTICULAR PURPOSE. You are solely responsible for determining the
      appropriateness of using or redistributing the Work and assume any
      risks associated with Your exercise of permissions under this License.

   8. Limitation of Liability. In no event and under no legal theory,
      whether in tort (including negligence), contract, or otherwise,
      unless required by applicable law (such as deliberate and grossly
      negligent acts) or agreed to in writing, shall any Contributor be
      liable to You for damages, including any direct, indirect, special,
      incidental, or consequential damages of any character arising as a
      result of this License or out of the use or inability to use the
      Work (including but not limited to damages for loss of goodwill,
      work stoppage, computer failure or malfunction, or any and all
      other commercial damages or losses), even if such Contributor
      has been advised of the possibility of such damages.

   9. Accepting Warranty or Additional Liability. While redistributing
      the Work or Derivative Works thereof, You may choose to offer,
      and charge a fee for, acceptance of support, warranty, indemnity,
      or other liability obligations and/or rights consistent with this
      License. However, in accepting such obligations, You may act only
      on Your own behalf and on Your sole responsibility, not on behalf
      of any other Contributor, and only if You agree to indemnify,
      defend, and hold each Contributor harmless for any liability
      incurred by, or claims asserted against, such Contributor by reason
      of your accepting any such warranty or additional liability.

   END OF TERMS AND CONDITIONS

   APPENDIX: How to apply the Apache License to your work.

      To apply the Apache License to your work, attach the following
      boilerplate notice, with the fields enclosed by brackets "{}"
      replaced with your own identifying information. (Don't include
      the brackets!)  The text should be enclosed in the appropriate
      comment syntax for the file format. We also recommend that a
      file or class name and description of purpose be included on the
      same "printed page" as the copyright notice for easier
      identification within third-party archives.

   Copyright {yyyy} {name of copyright owner}

   Licensed under the Apache License, Version 2.0 (the "License");
   you may not use this file except in compliance with the License.
   You may obtain a copy of the License at

       http://www.apache.org/licenses/LICENSE-2.0

   Unless required by applicable law or agreed to in writing, software
   distributed under the License is distributed on an "AS IS" BASIS,
   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
   See the License for the specific language governing permissions and
   limitations under the License.


================================================
FILE: README.md
================================================
# Basel Face Registration Pipeline

This repository contains all the code to reproduce our results from our recent publication:
- Thomas Gerig, Andreas Morel-Forster, Clemens Blumer, Bernhard Egger, Marcel Lüthi, Sandro Schönborn and Thomas Vetter 
"Morphable Face Models - An Open Framework" ( [edoc-unibas](https://edoc.unibas.ch/69084/) / [IEEE](https://ieeexplore.ieee.org/document/8373814)  )
IN: 13th IEEE Conference on Automatic Face and Gesture Recognition (FG 2018) pp. 75-82

## Overview

After the following information, we list all nescessary steps that you need to take to aquire and prepare the data as well as to run the pipeline and the experiments in the next sections.

### Problems under Windows

When you expericence some problems under Windows while importing the data, please use the branch **updateScalismoFaces**.

### Reporting problems and discussion

When you experience problems, you have questions or feedback please use the mailing list for [Morphable face models - an open framework](https://groups.google.com/forum/#!categories/scalismo-faces/morphable-face-models---an-open-framework).

## Preparation

### Step 1: Folder structure and Basel reference mesh

For the registration pipeline and the experiments to work properly, some data, such as reference templates and landmarks are needed. The files are available
for download at [Registration Pipeline Data](https://faces.dmi.unibas.ch/bfm/bfm2017.html). The download contains the following in detail:

* Manually clicked landmarks for the BU3D-FE database.
* BFM reference mesh and expression means.
* Landmarks of the reference mesh.
* Region mask for model-building.

You can copy the content of the zip folder into `pipeline-data`. The coarse structure looks the following:

```
pipeline-data
├── data
│   ├── incoming
│   ├── bu3dfe
│   │   ├── original
├── recognition-experiment
```

If needed, you can change the location of the `pipeline-data` directory in the BU3DDataProvider.scala file.

### Step 2: Bu3DFE Database

To register the BU-3DFE you have to acquire the dataset here:

[BU-3DFE](http://www.cs.binghamton.edu/~lijun/Research/3DFE/3DFE_Analysis.html)

and copy the `/original` folder to `data/bu3dfe/original/`.

### Step 3: Sbt (Scala build tool)

We assume that you have sbt already installed. If not, please follow the instructions given
[here](http://www.scala-sbt.org/release/tutorial/Setup.html).

Generally you can run the code using SBT. An example is how to run it in the terminal with:

```
cd /code-directory/
sbt run
```

If you do not have enough memory use:
```
sbt -J-Xmx50g run
```

Then the different steps are then listed and can be executed by entering the number of the script or by using:
```
sbt "run-main package.Classname"
```


## Running the Pipeline

### Step 0: Data pre-processing & Folder Structure Creation

During the pipeline we do not use the BU3DFE database data directly but first convert the data to match our formats.
This step is done only once as a pre-processing and the output can be reused whenever you run a new registration.

To convert the original data from the BU3DFE database to our format use the command:

```
sbt "run-main preprocessing.ConvertBu3DRawData"
```

Explain raw data preprocessing steps in script. (The script might need some cleanup.)

### Step 1: Building the Neutral Prior Model

Pre-computing the neutral prior model can take quite some time.
However, it has to be computed only once offline and is stored in `pipeline-data/data/incoming/reference/gpmodels/`.

You can run the building process with:

```
sbt "run-main registration.BuildNeutralPrior"
```

### Step 2: Building the Core Expression Model

The core expression model augments the neutral model with expression deformations.

```
sbt "run-main registration.BuildCoreExpressionModel"
```

### Step 3: Preprocess Landmarks

This step is used to transform the reference landmarks to the new mean of the generated models and to change the uncertainty
of the individual landmarks.

```
sbt "run-main preprocessing.PrepareReferenceLandmarks"
```

### Step 4: Registration

```
sbt -J-Xmx40g "run-main registration.Registration"
```

### Step 5: Building the Morphable Model

The model building contains two steps:

 - First for each registration result the color is extracted using the input mesh.
 - Based on all meshes with color a model containing shape, color and expression variations is built.

This process may need some time and memory. Once the first step, the color extraction is computed it
can be reused if you change for example the mask of the model that should be built. But to change this
you have to out comment the corresponding line in the source code.

```
sbt -mem 40000 "run-main modelbuilding.ModelBuilding"
```
## Face Reconstruction from 2D Image

First you have to download the Multi-PIE database and copy the necessary files to the correct folders. 
This is described in the README file in the folder recogniton-experiment (comes with seperate download of the Basel Face Pipeline [Data](https://faces.dmi.unibas.ch/bfm/bfm2017.html)). 
For those experiments you need the Basel Face Model 2009 and 2017, which can be downloaded at:
[Probabilistic Morphable Models](https://gravis.dmi.unibas.ch/PMM/)

To run the 3D reconstructions from the Multi-PIE database, you may want to execute it multiple times in parallel
since a single fit taks ~20 minutes:
```
sbt -mem 5000 "fitting.experiments.RecognitionMultiPiePose"
```
And to calculate the recognition scores execute:
```
sbt -mem 5000 "fitting.experiments.RecognitionEvaluation"
```
Those where the neutral scores. To perform the expression experiments, run:
```
sbt -mem 5000 "fitting.experiments.RecognitionMultiPieExpression"
sbt -mem 5000 "fitting.experiments.RecognitionEvaluationEx"
```





================================================
FILE: build.sbt
================================================
name := "basel-face-pipeline"

version := "0.1"

scalaVersion := "2.11.8"

scalacOptions := Seq("-unchecked", "-deprecation", "-encoding", "utf8")

resolvers += Resolver.jcenterRepo

resolvers += Resolver.bintrayRepo("unibas-gravis", "maven")

libraryDependencies += "ch.unibas.cs.gravis" %% "scalismo-faces" % "0.5.0"

libraryDependencies += "ch.unibas.cs.gravis" %% "scalismo-ui" % "0.11.+"

libraryDependencies += "com.github.tototoshi" %% "scala-csv" % "1.3.3"

libraryDependencies += "com.typesafe.scala-logging" %% "scala-logging" % "3.5.0"

libraryDependencies += "ch.qos.logback" % "logback-classic" % "1.1.7"

libraryDependencies ~= { _.map(_.exclude("org.slf4j", "slf4j-nop")) }



================================================
FILE: pipeline-data/.gitignore
================================================


================================================
FILE: src/main/resources/logback.xml
================================================
<!--
  ~ Copyright University of Basel, Graphics and Vision Research Group
  ~
  ~ Licensed under the Apache License, Version 2.0 (the "License");
  ~ you may not use this file except in compliance with the License.
  ~ You may obtain a copy of the License at
  ~
  ~     http://www.apache.org/licenses/LICENSE-2.0
  ~
  ~ Unless required by applicable law or agreed to in writing, software
  ~ distributed under the License is distributed on an "AS IS" BASIS,
  ~ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
  ~ See the License for the specific language governing permissions and
  ~ limitations under the License.
  -->
<configuration>
    <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
        <encoder>
            <pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</pattern>
        </encoder>
    </appender>

    <appender name="FILE" class="ch.qos.logback.core.FileAppender">
        <!-- path to your log file, where you want to store logs -->
        <file>/tmp/face-registration.log</file>
        <append>false</append>
        <encoder>
            <pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</pattern>
        </encoder>
    </appender>

    <logger name="registration.Registration" level="DEBUG" />
    <logger name="breeze.optimize" level="ERROR" />

    <root level="debug">
        <appender-ref ref="STDOUT" />
        <appender-ref ref="FILE" />
    </root>
</configuration>

================================================
FILE: src/main/scala/ch/unibas/cs/gravis/facepipeline/BU3DDataProvider.scala
================================================
/*
 * Copyright University of Basel, Graphics and Vision Research Group
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package ch.unibas.cs.gravis.facepipeline

import java.io.File

import registration.modelbuilding.FaceMask
import scalismo.faces.io.{MoMoIO, TLMSLandmarksIO}
import scalismo.faces.landmarks.TLMSLandmark3D
import scalismo.faces.mesh.{BinaryMask, ColorNormalMesh3D}
import scalismo.faces.momo.MoMo
import scalismo.geometry.{Landmark, _3D}
import scalismo.io.{LandmarkIO, MeshIO, StatismoIO}
import scalismo.mesh.TriangleMesh
import scalismo.statisticalmodel.StatisticalMeshModel

import scala.io.Source
import scala.reflect.io.Path
import scala.util.{Failure, Success, Try}

object BU3DDataProvider extends DataProvider {

  override case object Neutral extends ExpressionType { override def toString: String = "_NE00" }
  case object Sadness extends ExpressionType { override def toString: String = "_SA04" }
  case object Surprise extends ExpressionType { override def toString: String = "_SU04" }
  case object Disgust extends ExpressionType { override def toString: String = "_DI04" }
  case object Fear extends ExpressionType { override def toString: String = "_FE04" }
  case object Joy extends ExpressionType { override def toString: String = "_HA04" }
  case object Anger extends ExpressionType { override def toString: String = "_AN04" }
  case object CoreExpression extends ExpressionType { override def toString: String = "_ALLEXP" }

  object Expressions {
    def expressionList(): Seq[ExpressionType] = Seq(Neutral, Sadness, Surprise, Disgust, Fear, Joy, Anger)
    def expressionModelTypes(): Seq[ExpressionType] = Seq(Neutral,CoreExpression)
  }
  override def expressions() = Expressions.expressionList()



  case object RAW extends MaskType { override def toString: String = "_RAW" }
  case object F3D extends MaskType { override def toString: String = "_F3D" }

  object Masks {
    def maskList(): Seq[MaskType] = Seq(RAW, F3D)
  }

  override def masks: Seq[MaskType] = Masks.maskList()



  case class BU3DID(override val id: String, override val raceTag: String) extends Person
  object BU3DID {
    def fromFilename(filename: String): BU3DID = {
      BU3DID(filename.substring(0, 5), filename.substring(10, 12))
    }
  }

  override def personFromFilename(filename: String): Person = BU3DID.fromFilename(filename)




  case object Basel extends DataFlag { override def toString: String = "_basel" }
  case object Original extends DataFlag { override def toString: String = "" }
  case object Aligned extends DataFlag { override def toString: String = "_aligned" }

  object Flags {
    def lmFlagList(): Seq[DataFlag] = Seq(Basel, Original, Aligned)
  }



  private def setFileAccessMode(filename: String): Unit = setFileAccessMode(new File(filename))
  private def setFileAccessMode(path: Path): Unit = setFileAccessMode(path.jfile)
  private def setFileAccessMode(file: File): Unit = {
    file.setReadable(true,false)
    file.setWritable(true,false)
  }

  override def repositoryRoot: Path = Path("pipeline-data/")

  override def incoming: BU3DDataProvider.Incoming = {

    new BU3DDataProvider.Incoming {

      val incomingPath = repositoryRoot / "data" / "incoming"
      incomingPath.jfile.mkdirs()

      override def reference: BU3DDataProvider.Reference = new BU3DDataProvider.Reference {

        val referencePath = incomingPath / "reference"
        referencePath.jfile.mkdirs()

        override def loadFaceMask(): Try[FaceMask] = {

          val maskPath = referencePath / "masks"
          maskPath.jfile.mkdirs()

          for {
            level_mask <- MeshIO.readScalarMeshField[Int](new File(maskPath.jfile, "level-mask-l7.vtk"))
            semantic_mask <- MeshIO.readScalarMeshField[Short](new File(maskPath.jfile, "semantic-mask-l7.vtk")).map(_.map(_.toInt))
          } yield {
            FaceMask(level_mask,semantic_mask)
          }

        }

        override def loadMesh(expression: ExpressionType): Try[TriangleMesh[_3D]] = {
          import scalismo.faces.io.MeshIO
          val mshPath = referencePath / "mesh"
          mshPath.jfile.mkdirs()
          expression match {
            case Neutral => MeshIO.read(new File(mshPath.jfile, "mean2012_l7_bfm_nomouth.ply"))
              .map(_.shape)
            case Sadness => MeshIO.read(new File(mshPath.jfile, "mean2015.1_l7_bfm_nomouth-sadness.ply"))
              .map(_.shape)
            case Surprise => MeshIO.read(new File(mshPath.jfile, "mean2015.1_l7_bfm_nomouth-surprise.ply"))
              .map(_.shape)
            case Disgust => MeshIO.read(new File(mshPath.jfile, "mean2015.1_l7_bfm_nomouth-disgust.ply"))
              .map(_.shape)
            case Fear => MeshIO.read(new File(mshPath.jfile, "mean2015.1_l7_bfm_nomouth-fear.ply"))
              .map(_.shape)
            case Joy => MeshIO.read(new File(mshPath.jfile, "mean2015.1_l7_bfm_nomouth-joy.ply"))
              .map(_.shape)
            case Anger => MeshIO.read(new File(mshPath.jfile, "mean2015.1_l7_bfm_nomouth-anger.ply"))
              .map(_.shape)
          }
        }

        override def loadLandmarks(expression: ExpressionType): Try[Seq[Landmark[_3D]]] = {
          val lmPath = referencePath / "landmarks"
          lmPath.jfile.mkdirs()
            LandmarkIO.readLandmarksJson[_3D](new File(lmPath.jfile, s"reference${expression.toString}.json"))
        }

        override def saveLandmarks(expression: ExpressionType, landmarks: Seq[Landmark[_3D]]): Try[Unit] = {
          val lmPath = referencePath / "landmarks"
          lmPath.jfile.mkdirs()
          val res = LandmarkIO.writeLandmarksJson[_3D](landmarks.toIndexedSeq, new File(lmPath.jfile, s"reference${expression.toString}.json"))
          res match {
              case Success(_) => setFileAccessMode(lmPath)
              case _ =>
            }
          res
        }


        override def loadLineLandmarks(expression: ExpressionType): Try[Seq[Landmark[_3D]]] = ???
      }

      def landmarksPath(id: Person, expression: ExpressionType, mask: MaskType = RAW, flag: DataFlag = Basel): Path = {
        incomingPath / "landmarks" / s"${id.id}$expression${id.raceTag}${mask}$flag.tlms"
      }

      override def loadLandmarks(id: Person, expression: ExpressionType): Try[Seq[Landmark[_3D]]] = loadLandmarks(id, expression, RAW, Basel)
      override def loadLandmarks(id: Person, expression: ExpressionType, mask: MaskType): Try[Seq[Landmark[_3D]]] = loadLandmarks(id, expression, mask, Basel)
      override def loadLandmarks(id: Person, expression: ExpressionType, mask: MaskType, flag: DataFlag): Try[Seq[Landmark[_3D]]] = {
        val path = landmarksPath(id, expression, mask, flag)
        TLMSLandmarksIO.read3D(path.jfile) match {
          case Success(tlmsLandmarks) => Success(tlmsLandmarks.filter(_.visible).map(_.toLandmark))
          case Failure(t) => Failure(t)
        }
      }

      override def saveLandmarks(id: Person, expression: ExpressionType, landmarks: Seq[Landmark[_3D]]): Try[Unit] = saveLandmarks(id, expression, RAW, Basel, landmarks)
      override def saveLandmarks(id: Person, expression: ExpressionType, mask: MaskType, landmarks: Seq[Landmark[_3D]]): Try[Unit] = saveLandmarks(id, expression, mask, Basel, landmarks)
      override def saveLandmarks(id: Person, expression: ExpressionType, mask: MaskType, flag: DataFlag, landmarks: Seq[Landmark[_3D]]): Try[Unit] = {
        val path = landmarksPath(id, expression, mask, flag)
        path.jfile.getParentFile.mkdirs()
        val tlms = landmarks.map { lm =>
          TLMSLandmark3D(lm.id, lm.point, visible = true)
        }.toIndexedSeq
        val res = TLMSLandmarksIO.write3D(tlms, path.jfile)
        res match {
          case Success(_) => setFileAccessMode(path)
          case _ =>
        }
        res
      }

      def meshPath(id: Person, expression: ExpressionType, mask: MaskType = RAW, flag: DataFlag = Original): Path = {
        incomingPath / "mesh" / s"${id.id}${expression}${id.raceTag}${mask}$flag.ply"
      }

      override def loadMesh(id: Person, expression: ExpressionType) = loadMesh(id, expression, RAW, Original)
      override def loadMesh(id: Person, expression: ExpressionType, mask: MaskType) = loadMesh(id, expression, mask, Original)
      override def loadMesh(id: Person, expression: ExpressionType, mask: MaskType, flag: DataFlag): Try[TriangleMesh[_3D]] = {
        import scalismo.faces.io.MeshIO
        val path = meshPath(id, expression, mask, flag)
        MeshIO.read(path.jfile).map(_.shape)
      }

      override def loadColoredMesh(id: Person, expression: ExpressionType): Try[ColorNormalMesh3D] = loadColoredMesh(id, expression, RAW, Original)
      override def loadColoredMesh(id: Person, expression: ExpressionType, mask: MaskType): Try[ColorNormalMesh3D] = loadColoredMesh(id, expression, mask, Original)
      override def loadColoredMesh(id: Person, expression: ExpressionType, mask: MaskType, flag: DataFlag): Try[ColorNormalMesh3D] = {
        import scalismo.faces.io.MeshIO
        val path = meshPath(id, expression, mask, flag)
        MeshIO.read(path.jfile).map(ocnm => ColorNormalMesh3D(ocnm.shape,ocnm.color.get,ocnm.normals.get))
      }

      override def saveMesh(id: Person, expression: ExpressionType, mesh: TriangleMesh[_3D]): Try[Unit] = saveMesh(id, expression, RAW, Original, mesh)
      override def saveMesh(id: Person, expression: ExpressionType, mask: MaskType, mesh: TriangleMesh[_3D]): Try[Unit] = saveMesh(id, expression, mask, Original, mesh)
      override def saveMesh(id: Person, expression: ExpressionType, mask: MaskType, flag: DataFlag, mesh: TriangleMesh[_3D]): Try[Unit] = {
        import scalismo.faces.io.MeshIO
        val path = meshPath(id, expression, mask, flag)
        path.jfile.getParentFile.mkdirs()
        MeshIO.write(mesh, None, None, path.jfile)
        setFileAccessMode(path)
        Success(Unit)
      }

      override def saveMesh(id: Person, expression: ExpressionType, mesh: ColorNormalMesh3D): Try[Unit] = saveMesh(id, expression, RAW, Original, mesh)
      override def saveMesh(id: Person, expression: ExpressionType, mask: MaskType, mesh: ColorNormalMesh3D): Try[Unit] = saveMesh(id, expression, mask, Original, mesh)
      override def saveMesh(id: Person, expression: ExpressionType, mask: MaskType, flag: DataFlag, mesh: ColorNormalMesh3D): Try[Unit] = {
        import scalismo.faces.io.MeshIO
        val path = meshPath(id, expression, mask, flag)
        path.jfile.getParentFile.mkdirs()
        MeshIO.write(mesh, path.jfile)
        setFileAccessMode(path)
        Success(Unit)
      }

      override def ids(expression: ExpressionType): Seq[Person] = {
        new File(incomingPath.jfile, "mesh").listFiles()
          .filter(_.getName.endsWith(".ply"))
          .filter(_.getName.contains(RAW.toString))
          .filter(_.getName.contains(expression.toString))
          .map(file => BU3DID.fromFilename(file.getName))
          .toSeq
      }
    }
  }

  override def registration: BU3DDataProvider.SurfaceRegistration = {

    new BU3DDataProvider.SurfaceRegistration {
      val registrationPath = repositoryRoot / "data" / "registered"
      registrationPath.jfile.mkdirs()

      val referencePath = registrationPath / "reference"
      referencePath.jfile.mkdirs()

      val modelPath = referencePath / "gpmodels"
      modelPath.jfile.mkdirs()

      override def loadPriorModel(expression: ExpressionType): Try[StatisticalMeshModel] = {
        expression match {
          case Neutral => StatismoIO.readStatismoMeshModel(new File(modelPath.jfile, "face-model-neutral.h5"))
          case Sadness => StatismoIO.readStatismoMeshModel(new File(modelPath.jfile, "face-model-sadness.h5"))
          case Surprise => StatismoIO.readStatismoMeshModel(new File(modelPath.jfile, "face-model-surprise.h5"))
          case Disgust => StatismoIO.readStatismoMeshModel(new File(modelPath.jfile, "face-model-disgust.h5"))
          case Fear => StatismoIO.readStatismoMeshModel(new File(modelPath.jfile, "face-model-fear.h5"))
          case Joy => StatismoIO.readStatismoMeshModel(new File(modelPath.jfile, "face-model-joy.h5"))
          case Anger => StatismoIO.readStatismoMeshModel(new File(modelPath.jfile, "face-model-anger.h5"))
          case CoreExpression => StatismoIO.readStatismoMeshModel(new File(modelPath.jfile, "face-model-combined-expressions.h5"))
        }
      }

      override def savePriorModel(model: StatisticalMeshModel, expression: ExpressionType): Try[Unit] = {
        expression match {
          case Neutral => StatismoIO.writeStatismoMeshModel(model, new File(modelPath.jfile, "face-model-neutral.h5"))
          case Sadness => StatismoIO.writeStatismoMeshModel(model, new File(modelPath.jfile, "face-model-sadness.h5"))
          case Surprise => StatismoIO.writeStatismoMeshModel(model, new File(modelPath.jfile, "face-model-surprise.h5"))
          case Disgust => StatismoIO.writeStatismoMeshModel(model, new File(modelPath.jfile, "face-model-disgust.h5"))
          case Fear => StatismoIO.writeStatismoMeshModel(model, new File(modelPath.jfile, "face-model-fear.h5"))
          case Joy => StatismoIO.writeStatismoMeshModel(model, new File(modelPath.jfile, "face-model-joy.h5"))
          case Anger => StatismoIO.writeStatismoMeshModel(model, new File(modelPath.jfile, "face-model-anger.h5"))
          case CoreExpression => StatismoIO.writeStatismoMeshModel(model, new File(modelPath.jfile, "face-model-combined-expressions.h5"))
        }

      }

      def meshPath(id: Person, expression: ExpressionType, mask: MaskType = RAW, flag: DataFlag = Original): Path = {
        registrationPath / "mesh" / s"${id.id}${expression}${id.raceTag}${mask}${flag}.ply"
      }

      override def loadMesh(id: Person, expression: ExpressionType): Try[TriangleMesh[_3D]] = {
        import scalismo.faces.io.MeshIO
        val path = meshPath(id, expression)
        MeshIO.read(path.jfile).map(_.shape)
      }

      override def loadMesh(id: Person, expression: ExpressionType, mask: MaskType): Try[TriangleMesh[_3D]] = ???
      override def loadMesh(id: Person, expression: ExpressionType, mask: MaskType, flag: DataFlag): Try[TriangleMesh[_3D]] = ???
      override def loadColoredMesh(id: Person, expression: ExpressionType): Try[ColorNormalMesh3D] = ???
      override def loadColoredMesh(id: Person, expression: ExpressionType, mask: MaskType): Try[ColorNormalMesh3D] = ???
      override def loadColoredMesh(id: Person, expression: ExpressionType, mask: MaskType, flag: DataFlag): Try[ColorNormalMesh3D] = ???

      override def saveMesh(id: Person, expression: ExpressionType, mesh: TriangleMesh[_3D]): Try[Unit] = {
        import scalismo.faces.io.MeshIO
        val path = meshPath(id, expression)
        MeshIO.write(mesh, None, None, path.jfile)
        setFileAccessMode(path)
        Success(Unit)
      }

      override def saveMesh(id: Person, expression: ExpressionType, mask: MaskType, mesh: TriangleMesh[_3D]): Try[Unit] = ???
      override def saveMesh(id: Person, expression: ExpressionType, mask: MaskType, flag: DataFlag, mesh: TriangleMesh[_3D]): Try[Unit] = ???

      override def saveMesh(id: Person, expression: ExpressionType, mesh: ColorNormalMesh3D): Try[Unit] = ???
      override def saveMesh(id: Person, expression: ExpressionType, mask: MaskType, mesh: ColorNormalMesh3D): Try[Unit] = ???
      override def saveMesh(id: Person, expression: ExpressionType, mask: MaskType, flag: DataFlag, mesh: ColorNormalMesh3D): Try[Unit] = ???

      override def ids(expression: ExpressionType): Seq[Person] = {
        new File(registrationPath.jfile, "mesh").listFiles()
          .filter(_.getName.endsWith("ply"))
          .filter(_.getName.contains(expression.toString))
          .map(file => BU3DID.fromFilename(file.getName))
          .toSeq
      }

    }
  }

  override def model : BU3DDataProvider.ModelBuilding = BU3DModelBuilding

  object BU3DModelBuilding extends BU3DDataProvider.ModelBuilding {
    val modelBuildingPath = repositoryRoot / "data" / "modelbuilding"
    modelBuildingPath.jfile.mkdirs()

    val modelDirectoryPath = modelBuildingPath / "model"
    modelDirectoryPath.jfile.mkdirs()

    val colorExtractdMeshPath = modelBuildingPath / "mesh"
    colorExtractdMeshPath.jfile.mkdirs()

    def meshPath(id: Person, expression: ExpressionType, mask: MaskType = RAW, flag: DataFlag = Original): Path = {
      colorExtractdMeshPath / s"${id.id}${expression}${id.raceTag}${mask}$flag.ply"
    }

    override def loadColoredMesh(id: Person, expression: ExpressionType): Try[ColorNormalMesh3D] = {
      loadColoredMesh(id,expression,RAW,Original)
    }

    def loadColoredMesh(id: Person, expression: ExpressionType, mask: MaskType = RAW, flag: DataFlag = Original): Try[ColorNormalMesh3D] = {
      import scalismo.faces.io.MeshIO
      val path = meshPath(id, expression, mask, flag)
      MeshIO.read(path.jfile).map(_.colorNormalMesh3D.get)
    }

    override def saveColoredMesh(id: Person, expression: ExpressionType, mesh: ColorNormalMesh3D): Try[Unit] = {
      saveColoredMesh(id,expression,mesh,RAW,Original)
    }

    def saveColoredMesh(id: Person, expression: ExpressionType, mesh: ColorNormalMesh3D, mask: MaskType = RAW, flag: DataFlag = Original): Try[Unit] = {
      import scalismo.faces.io.MeshIO
      val path = meshPath(id, expression, mask, flag)
      path.jfile.getParentFile.mkdirs
      MeshIO.write(mesh, path.jfile)
      setFileAccessMode(path)
      Success(Unit)
    }

    def modelPath(mask: MaskType): Path = {
      modelDirectoryPath / s"bu3d_pami17${mask}.h5" // @todo think about registration identifier in name
    }

    override def saveModel(mask: MaskType, momo: MoMo): Try[Unit]  = {
      val path = modelPath(mask)
      path.jfile.getParentFile.mkdirs
      val res = MoMoIO.write(momo, path.jfile, "")
      res match {
        case Success(_) => setFileAccessMode(path)
        case _ =>
      }
      res
    }

    override def loadModel(mask: MaskType): Try[MoMo] = {
      MoMoIO.read(modelPath(mask).jfile, "")
    }

  }

  override def fitting: BU3DDataProvider.Fitting = {
    new BU3DDataProvider.Fitting {

    }
  }





  override def loadMeshMask(from: String, to: String): Try[BinaryMask] = {
    BinaryMask.load(Source.fromFile(new File(repositoryRoot / "data" / "incoming" / "reference" / "masks" / from+"_TO_"+to+".mask")))
  }


}


================================================
FILE: src/main/scala/ch/unibas/cs/gravis/facepipeline/DataProvider.scala
================================================
/*
 * Copyright University of Basel, Graphics and Vision Research Group
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package ch.unibas.cs.gravis.facepipeline

import registration.modelbuilding.FaceMask
import scalismo.faces.mesh.{BinaryMask, ColorNormalMesh3D}
import scalismo.faces.momo.MoMo
import scalismo.geometry.{Landmark, _3D}
import scalismo.mesh.TriangleMesh
import scalismo.statisticalmodel.StatisticalMeshModel

import scala.reflect.io.Path
import scala.util.Try

trait ExpressionType {
  override def toString : String
}

trait MaskType {
  override def toString: String
}

trait DataFlag {
  override def toString: String
}

trait DataProvider {

  trait Person {
    def id: String
    def raceTag: String
  }

  trait WithMesh {
    def loadMesh(id: Person, expression: ExpressionType): Try[TriangleMesh[_3D]]
    def loadMesh(id: Person, expression: ExpressionType, mask: MaskType): Try[TriangleMesh[_3D]]
    def loadMesh(id: Person, expression: ExpressionType, mask: MaskType, flag: DataFlag): Try[TriangleMesh[_3D]]
    def loadColoredMesh(id: Person, expression: ExpressionType): Try[ColorNormalMesh3D]
    def loadColoredMesh(id: Person, expression: ExpressionType, mask: MaskType): Try[ColorNormalMesh3D]
    def loadColoredMesh(id: Person, expression: ExpressionType, mask: MaskType, flag: DataFlag): Try[ColorNormalMesh3D]
    def saveMesh(id: Person, expression: ExpressionType, mesh: TriangleMesh[_3D]): Try[Unit]
    def saveMesh(id: Person, expression: ExpressionType, mask: MaskType, mesh: TriangleMesh[_3D]): Try[Unit]
    def saveMesh(id: Person, expression: ExpressionType, mask: MaskType, flag: DataFlag, mesh: TriangleMesh[_3D]): Try[Unit]
    def saveMesh(id: Person, expression: ExpressionType, mesh: ColorNormalMesh3D): Try[Unit]
    def saveMesh(id: Person, expression: ExpressionType, mask: MaskType, mesh: ColorNormalMesh3D): Try[Unit]
    def saveMesh(id: Person, expression: ExpressionType, mask: MaskType, flag: DataFlag, mesh: ColorNormalMesh3D): Try[Unit]
  }

  trait WithLandmarks {
    def loadLandmarks(id: Person, expression: ExpressionType): Try[Seq[Landmark[_3D]]]
    def loadLandmarks(id: Person, expression: ExpressionType, mask: MaskType): Try[Seq[Landmark[_3D]]]
    def loadLandmarks(id: Person, expression: ExpressionType, mask: MaskType, flag: DataFlag): Try[Seq[Landmark[_3D]]]
    def saveLandmarks(id: Person, expression: ExpressionType, landmarks: Seq[Landmark[_3D]]): Try[Unit]
    def saveLandmarks(id: Person, expression: ExpressionType, mask: MaskType, landmarks: Seq[Landmark[_3D]]): Try[Unit]
    def saveLandmarks(id: Person, expression: ExpressionType, mask: MaskType, flag: DataFlag, landmarks: Seq[Landmark[_3D]]): Try[Unit]
  }

  trait WithLineLandmarks {
    def loadLineLandmarks(id: Person, expression: ExpressionType): Try[Seq[Landmark[_3D]]]
    def saveLineLandmarks(id: Person, expression: ExpressionType): Try[Seq[Landmark[_3D]]]
  }

  trait WithIds {
    def ids(expression: ExpressionType): Seq[Person]
  }

  trait Reference {
    def loadMesh(expression: ExpressionType): Try[TriangleMesh[_3D]]
    def loadFaceMask(): Try[FaceMask]
    def loadLandmarks(expression: ExpressionType): Try[Seq[Landmark[_3D]]]
    def saveLandmarks(expression: ExpressionType, landmarks : Seq[Landmark[_3D]]): Try[Unit]
    def loadLineLandmarks(expression: ExpressionType): Try[Seq[Landmark[_3D]]]
  }

  trait Incoming extends WithIds with WithMesh with WithLandmarks {

    def reference: Reference
  }

  trait SurfaceRegistration extends WithIds with WithMesh {
    def loadPriorModel(expression: ExpressionType): Try[StatisticalMeshModel]
    def savePriorModel(model: StatisticalMeshModel, expressionType: ExpressionType): Try[Unit]
  }

  trait ModelBuilding {
    def loadModel( mask: MaskType ) : Try[MoMo]
    def saveModel( mask: MaskType, momo: MoMo ) : Try[Unit]
    def loadColoredMesh(id: Person, expression: ExpressionType): Try[ColorNormalMesh3D]
    def saveColoredMesh(id: Person, expression: ExpressionType, mesh: ColorNormalMesh3D): Try[Unit]
  }

  trait Fitting {}

  def Neutral: ExpressionType

  def repositoryRoot: Path

  def incoming: Incoming

  def registration: SurfaceRegistration

  def model: ModelBuilding

  def fitting: Fitting

  def expressions: Seq[ExpressionType]

  def masks: Seq[MaskType]

  def loadMeshMask(from: String, to: String): Try[BinaryMask]

  def personFromFilename(filename: String): Person
}



================================================
FILE: src/main/scala/ch/unibas/cs/gravis/facepipeline/PipelineStep.scala
================================================
/*
 * Copyright University of Basel, Graphics and Vision Research Group
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package ch.unibas.cs.gravis.facepipeline

trait PipelineStep {

  def run() : Unit
}


================================================
FILE: src/main/scala/fitting/StandardFitScript.scala
================================================
/*
 * Copyright University of Basel, Graphics and Vision Research Group
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package fitting

import java.io.File

import scalismo.faces.color.{RGB, RGBA}
import scalismo.faces.deluminate.SphericalHarmonicsOptimizer
import scalismo.faces.image.PixelImage
import scalismo.faces.io.{PixelImageIO, RenderParameterIO, TLMSLandmarksIO}
import scalismo.faces.mesh.MeshSurfaceSampling
import scalismo.faces.parameters.RenderParameter
import scalismo.faces.sampling.face.evaluators.PixelEvaluators._
import scalismo.faces.sampling.face.evaluators.PointEvaluators.IsotropicGaussianPointEvaluator
import scalismo.faces.sampling.face.evaluators.PriorEvaluators.{GaussianShapePrior, GaussianTexturePrior}
import scalismo.faces.sampling.face.evaluators._
import scalismo.faces.sampling.face.loggers._
import scalismo.faces.sampling.face.proposals.ImageCenteredProposal.implicits._
import scalismo.faces.sampling.face.proposals.ParameterProposals.implicits._
import scalismo.faces.sampling.face.proposals.SphericalHarmonicsLightProposals._
import scalismo.faces.sampling.face.proposals._
import scalismo.faces.sampling.face.{MoMoRenderer, ParametricLandmarksRenderer, ParametricModel}
import scalismo.geometry.{Vector, Vector3D, _2D}
import scalismo.sampling.algorithms.MetropolisHastings
import scalismo.sampling.evaluators.ProductEvaluator
import scalismo.sampling.loggers.ChainStateLogger.implicits._
import scalismo.sampling.loggers.ChainStateLoggerContainer.implicits._
import scalismo.sampling.loggers.{BestSampleLogger}
import scalismo.sampling.proposals.MixtureProposal.implicits._
import scalismo.sampling.proposals.{MetropolisFilterProposal, MixtureProposal}
import scalismo.sampling.{ProposalGenerator, TransitionProbability}
import scalismo.utils.Random

/* This Fitscript with its evaluators and the proposal distribution follows closely the proposed setting of:

Markov Chain Monte Carlo for Automated Face Image Analysis
Sandro Sch�nborn, Bernhard Egger, Andreas Morel-Forster and Thomas Vetter
International Journal of Computer Vision 123(2), 160-183 , June 2017
DOI: http://dx.doi.org/10.1007/s11263-016-0967-5

To understand the concepts behind the fitscript and the underlying methods there is a tutorial on:
http://gravis.dmi.unibas.ch/pmm/

 */

object StandardFitScript {

  /* Collection of all pose related proposals */
  def defaultPoseProposal(lmRenderer: ParametricLandmarksRenderer)(implicit rnd: Random):
  ProposalGenerator[RenderParameter] with TransitionProbability[RenderParameter] = {
    import MixtureProposal.implicits._

    val yawProposalC = GaussianRotationProposal(Vector3D.unitY, 0.75f)
    val yawProposalI = GaussianRotationProposal(Vector3D.unitY, 0.10f)
    val yawProposalF = GaussianRotationProposal(Vector3D.unitY, 0.01f)
    val rotationYaw = MixtureProposal(0.1 *: yawProposalC + 0.4 *: yawProposalI + 0.5 *: yawProposalF)

    val pitchProposalC = GaussianRotationProposal(Vector3D.unitX, 0.75f)
    val pitchProposalI = GaussianRotationProposal(Vector3D.unitX, 0.10f)
    val pitchProposalF = GaussianRotationProposal(Vector3D.unitX, 0.01f)
    val rotationPitch = MixtureProposal(0.1 *: pitchProposalC + 0.4 *: pitchProposalI + 0.5 *: pitchProposalF)

    val rollProposalC = GaussianRotationProposal(Vector3D.unitZ, 0.75f)
    val rollProposalI = GaussianRotationProposal(Vector3D.unitZ, 0.10f)
    val rollProposalF = GaussianRotationProposal(Vector3D.unitZ, 0.01f)
    val rotationRoll = MixtureProposal(0.1 *: rollProposalC + 0.4 *: rollProposalI + 0.5 *: rollProposalF)

    val rotationProposal = MixtureProposal(0.5 *: rotationYaw + 0.3 *: rotationPitch + 0.2 *: rotationRoll).toParameterProposal

    val translationC = GaussianTranslationProposal(Vector(300f, 300f)).toParameterProposal
    val translationF = GaussianTranslationProposal(Vector(50f, 50f)).toParameterProposal
    val translationHF = GaussianTranslationProposal(Vector(10f, 10f)).toParameterProposal
    val translationProposal = MixtureProposal(0.2 *: translationC + 0.2 *: translationF + 0.6 *: translationHF)

    val distanceProposalC = GaussianDistanceProposal(500f, compensateScaling = true).toParameterProposal
    val distanceProposalF = GaussianDistanceProposal(50f, compensateScaling = true).toParameterProposal
    val distanceProposalHF = GaussianDistanceProposal(5f, compensateScaling = true).toParameterProposal
    val distanceProposal = MixtureProposal(0.2 *: distanceProposalC + 0.6 *: distanceProposalF + 0.2 *: distanceProposalHF)

    val scalingProposalC = GaussianScalingProposal(0.15f).toParameterProposal
    val scalingProposalF = GaussianScalingProposal(0.05f).toParameterProposal
    val scalingProposalHF = GaussianScalingProposal(0.01f).toParameterProposal
    val scalingProposal = MixtureProposal(0.2 *: scalingProposalC + 0.6 *: scalingProposalF + 0.2 *: scalingProposalHF)

    val poseMovingNoTransProposal = MixtureProposal(rotationProposal + distanceProposal + scalingProposal)
    val centerREyeProposal = poseMovingNoTransProposal.centeredAt("right.eye.corner_outer", lmRenderer).get
    val centerLEyeProposal = poseMovingNoTransProposal.centeredAt("left.eye.corner_outer", lmRenderer).get
    val centerRLipsProposal = poseMovingNoTransProposal.centeredAt("right.lips.corner", lmRenderer).get
    val centerLLipsProposal = poseMovingNoTransProposal.centeredAt("left.lips.corner", lmRenderer).get

    MixtureProposal(centerREyeProposal + centerLEyeProposal + centerRLipsProposal + centerLLipsProposal + 0.2 *: translationProposal)
  }


  /* Collection of all illumination related proposals */
  def defaultIlluminationProposal(modelRenderer: ParametricModel, target: PixelImage[RGBA])(implicit rnd: Random):
  ProposalGenerator[RenderParameter] with TransitionProbability[RenderParameter] = {
    val shOpt = SphericalHarmonicsOptimizer(modelRenderer, target)
    val shOptimizerProposal = SHLightSolverProposal(shOpt, MeshSurfaceSampling.sampleUniformlyOnSurface(100))

    val lightSHPert = SHLightPerturbationProposal(0.001f, fixIntensity = true)
    val lightSHIntensity = SHLightIntensityProposal(0.1f)
    val lightSHBandMixter = SHLightBandEnergyMixer(0.1f)
    val lightSHSpatial = SHLightSpatialPerturbation(0.05f)
    val lightSHColor = SHLightColorProposal(0.01f)

    MixtureProposal((5f / 6f) *: MixtureProposal(lightSHSpatial + lightSHBandMixter + lightSHIntensity + lightSHPert + lightSHColor).toParameterProposal + (1f / 6f) *: shOptimizerProposal)
  }

  /* Collection of all statistical model (shape, texture) related proposals */
  def neutralMorphableModelProposal(implicit rnd: Random):
  ProposalGenerator[RenderParameter] with TransitionProbability[RenderParameter] = {

    val shapeC = GaussianMoMoShapeProposal(0.2f)
    val shapeF = GaussianMoMoShapeProposal(0.1f)
    val shapeHF = GaussianMoMoShapeProposal(0.025f)
    val shapeScaleProposal = GaussianMoMoShapeCaricatureProposal(0.2f)
    val shapeProposal = MixtureProposal(0.1f *: shapeC + 0.5f *: shapeF + 0.2f *: shapeHF + 0.2f *: shapeScaleProposal).toParameterProposal

    val textureC = GaussianMoMoColorProposal(0.2f)
    val textureF = GaussianMoMoColorProposal(0.1f)
    val textureHF = GaussianMoMoColorProposal(0.025f)
    val textureScale = GaussianMoMoColorCaricatureProposal(0.2f)
    val textureProposal = MixtureProposal(0.1f *: textureC + 0.5f *: textureF + 0.2 *: textureHF + 0.2f *: textureScale).toParameterProposal

    MixtureProposal(shapeProposal + textureProposal )
  }

  /* Collection of all statistical model (shape, texture, expression) related proposals */
  def defaultMorphableModelProposal(implicit rnd: Random):
  ProposalGenerator[RenderParameter] with TransitionProbability[RenderParameter] = {


    val expressionC = GaussianMoMoExpressionProposal(0.2f)
    val expressionF = GaussianMoMoExpressionProposal(0.1f)
    val expressionHF = GaussianMoMoExpressionProposal(0.025f)
    val expressionScaleProposal = GaussianMoMoExpressionCaricatureProposal(0.2f)
    val expressionProposal = MixtureProposal(0.1f *: expressionC + 0.5f *: expressionF + 0.2f *: expressionHF + 0.2f *: expressionScaleProposal).toParameterProposal


    MixtureProposal(neutralMorphableModelProposal + expressionProposal)
  }

  /* Collection of all color transform proposals */
  def defaultColorProposal(implicit rnd: Random):
  ProposalGenerator[RenderParameter] with TransitionProbability[RenderParameter] = {
    val colorC = GaussianColorProposal(RGB(0.01f, 0.01f, 0.01f), 0.01f, RGB(1e-4f, 1e-4f, 1e-4f))
    val colorF = GaussianColorProposal(RGB(0.001f, 0.001f, 0.001f), 0.01f, RGB(1e-4f, 1e-4f, 1e-4f))
    val colorHF = GaussianColorProposal(RGB(0.0005f, 0.0005f, 0.0005f), 0.01f, RGB(1e-4f, 1e-4f, 1e-4f))

    MixtureProposal(0.2f *: colorC + 0.6f *: colorF + 0.2f *: colorHF).toParameterProposal
  }


def fit(targetFn : String, lmFn: String, outputDir: String, modelRenderer: MoMoRenderer, expression: Boolean = true)(implicit rnd: Random):RenderParameter = {
  val target = PixelImageIO.read[RGBA](new File(targetFn)).get
  val targetLM = TLMSLandmarksIO.read2D(new File(lmFn)).get.filter(lm => lm.visible)

  PixelImageIO.write(target, new File(s"$outputDir/target.png")).get

  val init: RenderParameter = RenderParameter.defaultSquare.fitToImageSize(target.width, target.height)


  val sdev = 0.043f

  /* Foreground Evaluator */
  val pixEval = IsotropicGaussianPixelEvaluator(sdev)

  /* Background Evaluator */
  val histBGEval = HistogramRGB.fromImageRGBA(target, 25)

  /* Pixel Evaluator */
  val imgEval = IndependentPixelEvaluator(pixEval, histBGEval)

  /* Prior Evaluator */
  val priorEval = ProductEvaluator(GaussianShapePrior(0, 1), GaussianTexturePrior(0, 1))

  /* Image Evaluator */
  val allEval = ImageRendererEvaluator(modelRenderer, imgEval.toDistributionEvaluator(target))

  /* Landmarks Evaluator */
  val pointEval = IsotropicGaussianPointEvaluator[_2D](4.0) //lm click uncertainty in pixel! -> should be related to image/face size
  val landmarksEval = LandmarkPointEvaluator(targetLM, pointEval, modelRenderer)



  //logging
  val imageLogger = ImageRenderLogger(modelRenderer, new File(s"$outputDir/"), "mc-").withBackground(target)

  // Metropolis logger
  val printLogger = PrintLogger[RenderParameter](Console.out, "").verbose
  val mhLogger = printLogger


  // keep track of best sample
  val bestFileLogger = ParametersFileBestLogger(allEval, new File(s"$outputDir/fit-best.rps"))
  val bestSampleLogger = BestSampleLogger(allEval)
  val parametersLogger = ParametersFileLogger(new File(s"$outputDir/"), "mc-")

  val fitLogger = bestFileLogger :+ bestSampleLogger

  // pose proposal
  val totalPose = defaultPoseProposal(modelRenderer)

  //light proposals
  val lightProposal = defaultIlluminationProposal(modelRenderer, target)

  //color proposals
  val colorProposal = defaultColorProposal

  //Morphable Model  proposals
  val momoProposal = if(expression) defaultMorphableModelProposal else neutralMorphableModelProposal


  // full proposal filtered by the landmark and prior Evaluator
  val proposal = MetropolisFilterProposal(MetropolisFilterProposal(MixtureProposal(totalPose + colorProposal + 3f*:momoProposal + 2f *: lightProposal), landmarksEval), priorEval)

  //pose and image chains
  val imageFitter = MetropolisHastings(proposal, allEval)
  val poseFitter = MetropolisHastings(totalPose, landmarksEval)



  println("everyting setup. starting fitter ...")


  //landmark chain for initialisation
  val initDefault: RenderParameter = RenderParameter.defaultSquare.fitToImageSize(target.width, target.height)
  val init10 = initDefault.withMoMo(init.momo.withNumberOfCoefficients(50, 50, 5))
  val initLMSamples: IndexedSeq[RenderParameter] = poseFitter.iterator(init10, mhLogger).take(5000).toIndexedSeq

  val lmScores = initLMSamples.map(rps => (landmarksEval.logValue(rps), rps))

  val bestLM = lmScores.maxBy(_._1)._2
  RenderParameterIO.write(bestLM, new File(s"$outputDir/fitter-lminit.rps")).get

  val imgLM = modelRenderer.renderImage(bestLM)
  PixelImageIO.write(imgLM, new File(s"$outputDir/fitter-lminit.png")).get

  def printer(sample: RenderParameter): RenderParameter = {
  println(s"${sample.momo.shape} ${sample.momo.color} ${sample.momo.expression}")
  sample
}

  // image chain, fitting
  val fitsamples = imageFitter.iterator(bestLM, mhLogger).loggedWith(fitLogger).take(10000).toIndexedSeq
  val best = bestSampleLogger.currentBestSample().get

  val imgBest = modelRenderer.renderImage(best)
  PixelImageIO.write(imgBest, new File(s"$outputDir/fitter-best.png")).get
  best
}

}

================================================
FILE: src/main/scala/fitting/experiments/QualitativeLFW.scala
================================================
/*
 * Copyright University of Basel, Graphics and Vision Research Group
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package fitting.experiments

import java.io.File

import ch.unibas.cs.gravis.facepipeline.BU3DDataProvider
import fitting.StandardFitScript
import scalismo.faces.color.RGBA
import scalismo.faces.io.MoMoIO
import scalismo.faces.momo.MoMo
import scalismo.faces.sampling.face.MoMoRenderer
import scalismo.utils.Random

import scala.reflect.io.Path

object QualitativeLFW extends App{
  scalismo.initialize()
  val seed = 1986L
  implicit val rnd = new Random(seed)

  def fitModel(model:MoMo, modelName: String) = {
    val targetsPath =  BU3DDataProvider.repositoryRoot + "/recognition-experiment/fit-lfw-qualitative/lfwSelection/"
    val outPath =  BU3DDataProvider.repositoryRoot + "/recognition-experiment/fit-lfw-qualitative/lfwResults/" + modelName + "/"


    val files = new File(targetsPath).listFiles.filter(_.getName.endsWith(".png"))
    val listTarget = files.map(p => p.getName.substring(0, p.getName.length - 4)).toList



    listTarget.foreach{ targetName =>
      val outPathTarget = outPath + targetName + "/"

      if (!Path(outPathTarget).exists) {
        try {
          Path(outPathTarget).createDirectory(failIfExists = false)

          val renderer = MoMoRenderer(model, RGBA.BlackTransparent).cached(5)

          val targetFn = targetsPath + targetName + ".png"
          val targetLM = targetsPath + targetName + "_face0.tlms"

          StandardFitScript.fit(targetFn, targetLM, outPathTarget, renderer)
        }
      }
    }
  }

  val bfm = MoMoIO.read(new File( BU3DDataProvider.repositoryRoot + "/data/modelbuilding/model/model2017-1_face12_nomouth.h5")).get
  val bfmOld = MoMoIO.read(new File(BU3DDataProvider.repositoryRoot + "/model2009-face12.h5")).get
  val bu3d = MoMoIO.read(new File(BU3DDataProvider.repositoryRoot + "/data/modelbuilding/model/bu3d-face12_nomouth.h5")).get

  fitModel(bfm, "bfm")
  fitModel(bfmOld, "bfmOld")
  fitModel(bu3d, "bu3d")
}

================================================
FILE: src/main/scala/fitting/experiments/RecognitionMultiPIE.scala
================================================
/*
 * Copyright University of Basel, Graphics and Vision Research Group
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package fitting.experiments

import java.io.File

import ch.unibas.cs.gravis.facepipeline.BU3DDataProvider
import fitting.StandardFitScript
import scalismo.faces.color.RGBA
import scalismo.faces.io.{MoMoIO, RenderParameterIO}
import scalismo.faces.momo.MoMo
import scalismo.faces.sampling.face.MoMoRenderer
import scalismo.utils.Random

import scala.reflect.io.Path

// script to fit multiPie Neutral Samples with landmarks
object RecognitionMultiPiePose extends App{
  scalismo.initialize()
  val seed = 1986L
  implicit val rnd = new Random(seed)

  def fitModel(model:MoMo, modelName: String) = {
    val targetsPath = BU3DDataProvider.repositoryRoot + "/recognition-experiment"
    val outPath = targetsPath + "/results/" + modelName + "/"


    val files = new File(targetsPath + "/originals/").listFiles.filter(_.getName.endsWith(".png"))
    val listTarget = files.map(p => p.getName.substring(0, p.getName.length - 4)).toList



    listTarget.foreach(targetName => {
      val outPathTarget = outPath + targetName + "/"

      if (!Path(outPathTarget).exists) {
        try {
          Path(outPathTarget).createDirectory(failIfExists = false)

          val renderer = MoMoRenderer(model, RGBA.BlackTransparent).cached(5)

          val targetFn = targetsPath + "/originals/" +  targetName + ".png"
          val targetLM = targetsPath + "/landmarks/" + targetName + "_face0.tlms"

          StandardFitScript.fit(targetFn, targetLM, outPathTarget, renderer, false)
        }
      }
    })
  }
  val bfm = MoMoIO.read(new File(BU3DDataProvider.repositoryRoot + "data/modelbuilding/model/model2017-1_face12_nomouth.h5")).get.neutralModel
  val bfmOld = MoMoIO.read(new File(BU3DDataProvider.repositoryRoot + "/export/faces/model/bfm2009/model2009-face12.h5")).get.neutralModel
  val bu3d = MoMoIO.read(new File(BU3DDataProvider.repositoryRoot + "/export/faces/projects/pami-ppm2017/basel-face-pipeline/data/modelbuilding/model/bu3d-face12_nomouth.h5")).get.neutralModel

  fitModel(bfm, "bfm")
  fitModel(bfmOld, "bfmOld")
  fitModel(bu3d, "bu3d")


}

// script to fit multiPie Expression Samples with landmarks
object RecognitionMultiPieExpression extends App{
  scalismo.initialize()
  val seed = 1986L
  implicit val rnd = new Random(seed)

  def fitModel(model:MoMo, modelName: String) = {
    val targetsPath = "/export/faces/projects/pami-ppm2017/experiments/fit-multipie-recognition/multipie/"
    val outPath = targetsPath + "/results/" + modelName + "/"


    val files = new File(targetsPath + "/originalsExpressionsNotForPublishing").listFiles.filter(_.getName.endsWith(".png"))
    val listTarget = files.map(p => p.getName.substring(0, p.getName.length - 4)).toList



    listTarget.foreach(targetName => {
      val outPathTarget = outPath + targetName + "/"

      if (!Path(outPathTarget).exists) {
        try {
          Path(outPathTarget).createDirectory(failIfExists = false)

          val renderer = MoMoRenderer(model, RGBA.BlackTransparent).cached(5)

          val targetFn = targetsPath + "/originalsExpressionsNotForPublishing/" +  targetName + ".png"
          val targetLM = targetsPath + "landmarksExpressions/" + targetName + "_face0.tlms"

          StandardFitScript.fit(targetFn, targetLM, outPathTarget, renderer)
        }
      }
    })
  }
  val bu3dEx = MoMoIO.read(new File(BU3DDataProvider.repositoryRoot + "/data/modelbuilding/model/bu3d-face12_nomouth.h5")).get
  val bfmEx = MoMoIO.read(new File(BU3DDataProvider.repositoryRoot + "/data/modelbuilding/model/model2017-1_face12_nomouth.h5")).get

  fitModel(bfmEx, "bfmEx")
  fitModel(bu3dEx, "bu3dEx")

}

// Script to calculate recognition results over pose fitted multipie data
object RecognitionEvaluation extends App {

  case class Fit( id: String, pose: String, coeffs: IndexedSeq[Double])

  case class Match( id: String, similarity: Double)

  val models = IndexedSeq("bfmOld", "bu3d", "bfm")
  val resultsPath = BU3DDataProvider.repositoryRoot + "/recognition-experiment/results/"

  models.foreach { model =>
    val resultPath = resultsPath + model + "/"
    val files = new File(resultPath).listFiles.filter(_.isDirectory).toIndexedSeq.sortBy(_.getAbsoluteFile)
    val allFits = files.map { f =>
      val name = f.getName
      val id = name.substring(0, 3)
      val pose = name.substring(10, 13)
      val rps = RenderParameterIO.read(new File(resultPath + name + "/fit-best.rps")).get
      val coeffs = rps.momo.color ++ rps.momo.shape

      Fit(id, pose, coeffs)
    }

    val gallery = allFits.filter(fit => fit.pose == "051")

    val listOfPoses = IndexedSeq("051", "140", "130", "080")
    val listOfExperiments = listOfPoses.map { queryPose =>
      allFits.filter(fit => fit.pose == queryPose)
    }

    val queriesWithSimilarities = listOfExperiments.map { queriesInExperiment =>
      queriesInExperiment.map { query =>
        val similaritiesForQuery = gallery.map { subject =>
          Match(subject.id, cosineAngle(query.coeffs, subject.coeffs))
        }
        (query.id, similaritiesForQuery)
      }
    }

    val correctMatchesPerExperiment: IndexedSeq[Double] = queriesWithSimilarities.map { experiment =>
      val correctMatches = experiment.map { case (query_id, similarities) =>
        val bestMatch = similarities.maxBy(m => m.similarity)
        //println(query_id, bestMatch)
        if (bestMatch.id == query_id) 1.0 else 0.0
      }.sum
      correctMatches / experiment.length
    }

    println(model + correctMatchesPerExperiment)

  }


  def cosineAngle(aa: IndexedSeq[Double], bb: IndexedSeq[Double]): Double = {
    import breeze.linalg._

    val a = DenseVector(aa.toArray)
    val b = DenseVector(bb.toArray)

    (a dot b) / (norm(a) * norm(b))
  }

}

// Script to calculate recognition results over expression on fitted multipie data

object RecognitionEvaluationEx extends App {

  case class Fit( id: String, pose: String, coeffs: IndexedSeq[Double])

  case class Match( id: String, similarity: Double)

  val models = IndexedSeq("bfmEx", "bu3dEx")
  val resultsPath =  BU3DDataProvider.repositoryRoot + "/recognition-experiment/results/"

  models.foreach { model =>
    val resultPath = resultsPath + model + "/"
    val files = new File(resultPath).listFiles.filter(_.isDirectory).toIndexedSeq.sortBy(_.getAbsoluteFile)
    val allFits = files.map { f =>
      val name = f.getName
      val id = name.substring(0, 3)
      val expression = name.substring(7, 9)
      val rps = RenderParameterIO.read(new File(resultPath + name + "/fit-best.rps")).get
      val coeffs = rps.momo.color ++ rps.momo.shape

      Fit(id, expression, coeffs)
    }
    val gallery = allFits.filter(fit => fit.pose == "01")

    val listOfPoses = IndexedSeq("01","02")
    val listOfExperiments = listOfPoses.map { queryPose =>
      allFits.filter(fit => fit.pose == queryPose)
    }

    val queriesWithSimilarities = listOfExperiments.map { queriesInExperiment =>
      queriesInExperiment.map { query =>
        val similaritiesForQuery = gallery.map { subject =>
          Match(subject.id, cosineAngle(query.coeffs, subject.coeffs))
        }
        (query.id, similaritiesForQuery)
      }
    }

    val correctMatchesPerExperiment: IndexedSeq[Double] = queriesWithSimilarities.map { experiment =>
      val correctMatches = experiment.map { case (query_id, similarities) =>
        val bestMatch = similarities.maxBy(m => m.similarity)
       // println(query_id, bestMatch)
        if (bestMatch.id == query_id) 1.0 else 0.0
      }.sum
      correctMatches / experiment.length
    }

    println(model + correctMatchesPerExperiment)

  }


  def cosineAngle(aa: IndexedSeq[Double], bb: IndexedSeq[Double]): Double = {
    import breeze.linalg._

    val a = DenseVector(aa.toArray)
    val b = DenseVector(bb.toArray)

    (a dot b) / (norm(a) * norm(b))
  }

}

================================================
FILE: src/main/scala/modelbuilding/ModelBuilding.scala
================================================
/*
 * Copyright University of Basel, Graphics and Vision Research Group
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package modelbuilding

import breeze.linalg.DenseMatrix
import ch.unibas.cs.gravis.facepipeline.{BU3DDataProvider, DataProvider, ExpressionType, PipelineStep}
import scalismo.common._
import scalismo.faces.color.{RGB, RGBA}
import scalismo.faces.io.MoMoIO
import scalismo.faces.mesh.{ColorNormalMesh3D, VertexColorMesh3D}
import scalismo.faces.momo.MoMo.NeutralWithExpression
import scalismo.faces.momo.{MoMo, PancakeDLRGP}
import scalismo.faces.render.Transform3D
import scalismo.geometry.{Point, Vector, _3D}
import scalismo.kernels.{DiagonalKernel, GaussianKernel, MatrixValuedPDKernel}
import scalismo.mesh.{SurfacePointProperty, TriangleMesh, TriangleMesh3D}
import scalismo.numerics.{UniformMeshSampler3D}
import scalismo.registration.{LandmarkRegistration, RigidTransformation}
import scalismo.statisticalmodel.{GaussianProcess, LowRankGaussianProcess}
import scala.util.{Success, Try}

object ModelBuilding {

  def main(args: Array[String]) {
    scalismo.initialize()
    ModelBuilding(BU3DDataProvider).run()
  }

}

case class ModelBuilding(dataProvider: DataProvider)  extends PipelineStep {

  override def run() {
    createMeshesWithVertexColor()
//    buildMoMoExpress()
    buildMoMoExpress("face12_nomouth")
  }

  /**
   * Extracts color for all registration results from the input meshes.
   */
  def createMeshesWithVertexColor(): Unit = {
    println("Extracting color from meshes ...")

    val expressions = dataProvider.expressions
    expressions.zipWithIndex.flatMap { case (exp,idx) =>
      println(s"... processing expression ${exp} (${idx+1}/${expressions.size})")

      val ids = dataProvider.registration.ids(exp)
      ids.zipWithIndex.map { case(id,idx) =>
        println(s"... ... processing id ${id} (${idx+1}/${ids.size})")

        val registeredShape: TriangleMesh[_3D] = dataProvider.registration.loadMesh(id, exp).get
        val coloredMesh: ColorNormalMesh3D = dataProvider.incoming.loadColoredMesh(id, exp).get
        val vertexColor = extractVertexColor(registeredShape, coloredMesh)

        val registeredShapeWithVertexColor = ColorNormalMesh3D(registeredShape, vertexColor, registeredShape.vertexNormals)
        dataProvider.model.saveColoredMesh(id, exp, registeredShapeWithVertexColor).get
      }

    }
  }

  /**
   * Extracts per vertex color for the registered mesh from the input mesh. The correspondence for each
   * vertex is sought after along the normal.
   *
   * @param registeredMesh Registration result without color.
   * @param colorMesh Input mesh with color.
   * @return Registeration result with color.
   */
  def extractVertexColor(registeredMesh: TriangleMesh3D, colorMesh: ColorNormalMesh3D): SurfacePointProperty[RGBA] = {

    val pointsReg = registeredMesh.pointSet.points.toIndexedSeq
    val normalsReg = registeredMesh.vertexNormals.pointData

    val shapeFromColorMesh = colorMesh.shape
    val meshOperations = shapeFromColorMesh.operations

    val colors = pointsReg.zip(normalsReg).map {
      case (point, normal) =>

        val intersections = meshOperations.getIntersectionPointsOnSurface(point, normal)

        val sortedIntersections = intersections.map { i =>
          val pointI = shapeFromColorMesh.position(i._1, i._2)
          val dist = (pointI - point).norm
          (i, dist)
        }.sortWith((a, b) => a._2 < b._2)

        if (sortedIntersections.nonEmpty && sortedIntersections.head._2 < 2.0) {
          val i = sortedIntersections.head._1
          colorMesh.color(i._1, i._2)
        } else {
          RGBA(0.9, 0.8, 0.1, 0.0)
        }
    }

    new SurfacePointProperty[RGBA](registeredMesh.triangulation, colors)
  }




  /**
   * Build a model from meshes with vertex color.
   * This step assumes that the registration was performed using the "bfm_nomouth" masked reference.
   */
  def buildMoMoExpress(maskType: String = "bfm_nomouth"): Unit = {
    println("Building model from meshes with vertex color...")

    val mask = {
      if ( maskType != "bfm_nomouth" ) Some(dataProvider.loadMeshMask("bfm_nomouth", "face12_nomouth").get)
      else None
    }

    val reference = {
      val originalRef = dataProvider.registration.loadPriorModel(dataProvider.Neutral).get.referenceMesh
      if ( mask.isDefined ) originalRef.operations.maskPoints(mask.get).transformedMesh
      else originalRef
    }
    val ids = dataProvider.registration.ids(dataProvider.Neutral)
    val otherExpressions = dataProvider.expressions.filter(exp => exp != dataProvider.Neutral)

    buildModel(ids, otherExpressions, reference)


    def buildModel(ids: Seq[dataProvider.Person], expressions: Seq[ExpressionType], reference: TriangleMesh3D) = {
      val data: Seq[Try[(DiscreteField[_3D, RGBA], VertexColorMesh3D, Seq[NeutralWithExpression])]] = ids.zipWithIndex.map {
        case (id, idx) =>
          println(s"... loading data for ${id.id} (${idx + 1}/${ids.size})")
          prepareData(reference, expressions, id)
      }

      val neutralMeshes: Seq[VertexColorMesh3D] = data.collect({ case Success(e) => e._2 })
      val neutralWithExpressions: Seq[NeutralWithExpression] = data.collect({ case Success(e) => e._3 }).flatten

      println(".. data loaded ...")

      val momo = MoMo.buildFromRegisteredSamples(
        reference = reference,
        samplesShape = neutralMeshes.toIndexedSeq,
        samplesColor = neutralMeshes.toIndexedSeq,
        samplesExpression = neutralWithExpressions.toIndexedSeq,
        shapeNoiseVariance = 0,
        colorNoiseVariance = 0,
        expressionNoiseVariance = 0)

      println("... initial model is built - (not handling missing color) ...")
      println(s"... ... shape rank: ${momo.shape.rank}")
      println(s"... ... color rank: ${momo.color.rank}")
      println(s"... ... exp rank: ${momo.expression.rank}")


      val colors: Seq[DiscreteField[_3D, RGBA]] = data.collect({ case Success(e) => e._1 })

      val colorModel = buildColorModel(reference, colors.toIndexedSeq, colors.size - 1)
      println("... color model is built ...")

      val bu3dModel = MoMo(momo.referenceMesh, momo.shape, colorModel, momo.expression, momo.landmarks)

      val newModelPath = dataProvider.repositoryRoot / "data" / "modelbuilding" / "model" / s"bu3d-${maskType}.h5"
      newModelPath.jfile.getParentFile.mkdirs()
      MoMoIO.write(bu3dModel, newModelPath.jfile)
      println("... model building finished!")
    }


    /** Align vcm to reference. */
    def align(reference: TriangleMesh3D, vcm: VertexColorMesh3D): VertexColorMesh3D = {
      val t: RigidTransformation[_3D] = LandmarkRegistration.rigid3DLandmarkRegistration(
        vcm.shape.pointSet.points.zip(reference.pointSet.points).toSeq,
        Point(0, 0, 0)
      )

      val transform = new Transform3D {
        override def apply(x: Point[_3D]): Point[_3D] = t(x)

        override def apply(v: Vector[_3D]): Vector[_3D] = t.rotation(v.toPoint).toVector
      }

      vcm.transform(transform)
    }

    /** Align mesh to reference. */
    def alignIt(vcm: VertexColorMesh3D): VertexColorMesh3D = align(reference, vcm)

    /** Execute function f for neutral and expression in ne. */
    def doForBothInNWE(ne: NeutralWithExpression, f: VertexColorMesh3D => VertexColorMesh3D): NeutralWithExpression = {
      ne match {
        case NeutralWithExpression(n, e) =>
          NeutralWithExpression(f(n), f(e))
      }
    }

    /** Mask mesh with precalculated mask. */
    def maskMesh(mesh: VertexColorMesh3D): VertexColorMesh3D = {
      if (mask.isDefined) {
        val reducer = mesh.shape.operations.maskPoints(mask.get)
        VertexColorMesh3D(
          reducer.transformedMesh,
          SurfacePointProperty.sampleSurfaceProperty(reducer.applyToSurfaceProperty(mesh.color), _.head)
        )
      } else {
        println("Warning: maskMesh is called but no mask is defined!")
        println("\t Hence the mesh is left unaltered.")
        mesh
      }
    }


    /**
      * Loads the neutral face and the expressions.
      *
      * @param reference
      * @param otherExpressions
      * @param id
      * @return
      */
    def prepareData(reference: TriangleMesh3D, otherExpressions: Seq[ExpressionType], id: dataProvider.Person) = {

      for {
        unalignedRegistrationWithColor <- dataProvider.model.loadColoredMesh(id, dataProvider.Neutral)
      } yield {

        val unaligned = VertexColorMesh3D(
          unalignedRegistrationWithColor.shape,
          unalignedRegistrationWithColor.color.asInstanceOf[SurfacePointProperty[RGBA]]
        )
        val masked = if ( mask.isDefined) maskMesh(unaligned) else unaligned
        val neutral = alignIt(masked)

        val neutralWithExpressionList = otherExpressions.par.map {
          exp =>
            println(s"... ... ${exp}")
            val unalignedExpression = dataProvider.model.loadColoredMesh(id, exp).get

            val unaligned = VertexColorMesh3D(
              unalignedExpression.shape,
              unalignedExpression.color.asInstanceOf[SurfacePointProperty[RGBA]]
            )
            val masked = if (mask.isDefined) maskMesh(unaligned) else unaligned

            val alignedExpression = alignIt(masked)
            NeutralWithExpression(neutral, alignedExpression)
        }.toIndexedSeq

        val neutralColor = DiscreteField[_3D, RGBA](neutral.shape.pointSet, neutral.color.pointData)
        (
          neutralColor,
          neutral,
          neutralWithExpressionList
        )
      }

    }
  }



  /**
    * Calculate the rigid 3d transform that aligns the mesh to the target.
    */
  def calculateShapeAligningTransform(mesh: TriangleMesh3D, target: TriangleMesh3D): Transform3D = {
    val t: RigidTransformation[_3D] = LandmarkRegistration.rigid3DLandmarkRegistration(
      mesh.pointSet.points.zip(target.pointSet.points).toSeq,
      Point(0,0,0)
    )

    val transform = new Transform3D{
      override def apply(x: Point[_3D]): Point[_3D] = t(x)
      override def apply(v: Vector[_3D]): Vector[_3D] = t.rotation(v.toPoint).toVector
    }

    transform
  }


  /**
   * Builds a color model. This model building accounts for missing color values.
   *
   * @param referenceMesh Reference mesh.
   * @param colorFields Colorfields to build the model from.
   * @param numberOfComponents Number of desired components.
   * @return Color model.
   */
  def buildColorModel(
    referenceMesh: TriangleMesh3D,
    colorFields: IndexedSeq[DiscreteField[_3D, RGBA]],
    numberOfComponents: Int
  ): PancakeDLRGP[_3D, RGB] = {

    val domain = referenceMesh.pointSet

    val meanRGBA = DiscreteField[_3D, RGBA](domain, saveMean(colorFields.map(_.data)))

    val meanFreeColors = saveMeanFreeColors(colorFields.map(_.data), meanRGBA.data)
    val meanFreeColorFields = meanFreeColors.map{ a => DiscreteField(domain, a) }

    val meanRGB = DiscreteField[_3D, RGB](domain, meanRGBA.data.map(_.toRGB))

    val kernel: MatrixValuedPDKernel[_3D] = ReducedEntryKernel(meanFreeColorFields)
    val gp: GaussianProcess[_3D, RGB] = GaussianProcess(meanRGB.interpolateNearestNeighbor(), kernel)
    val lrgp = LowRankGaussianProcess.approximateGP[_3D,RGB](gp,UniformMeshSampler3D(referenceMesh,500),numberOfComponents)
    val grf = lrgp.discretize(domain)

    PancakeDLRGP(grf)
  }

  /**
   * Calculates the mean based on available samples only.
   * If no value is available, i.e. the alpha channel is zero for all samples at a given vertex, BlackTransparent is set as mean color.
   */
  def saveMean(colorVectors: Seq[IndexedSeq[RGBA]]): IndexedSeq[RGBA] = {
    val numberOfColorValues = colorVectors.size
    val numberOfSamples = colorVectors.head.size

    val accumulatedColor = Array.fill(numberOfSamples)(RGBA.BlackTransparent)
    val numberOfUsedColors = Array.fill(numberOfSamples)(0)

    for (
      i <- 0 until numberOfColorValues;
      j <- 0 until numberOfSamples
    ) {
      val color = colorVectors(i)(j)
      if (color.a == 1.0) {
        accumulatedColor(j) = accumulatedColor(j) + color
        numberOfUsedColors(j) += 1
      }
    }

    val mean = accumulatedColor.zip(numberOfUsedColors).map {
      case (sumOfColors, counter) =>
        if (counter == 0) RGBA.BlackTransparent
        else sumOfColors / counter
    }

    mean
  }

  /**
   * Substracts the mean color vector from all color samples. Else BlackTransparent.
   */
  def saveMeanFreeColors(colorVectors: Seq[IndexedSeq[RGBA]], meanColorVector: IndexedSeq[RGBA]): Seq[IndexedSeq[RGBA]] = {
    colorVectors.map { colorVector =>
      colorVector.zip(meanColorVector).map {
        case (color, meanColor) =>
          if (color.a == 1.0) {
            val d = RGBA(color.r - meanColor.r, color.g - meanColor.g, color.b - meanColor.b, color.a)
            d
          } else {
            RGBA.BlackTransparent
          }
      }
    }
  }

  /**
   * Covariance kernel that has a backupkernel used when some of the data is missing.
   *
   * @param colorFields Input data with possibly some data missing.
   */
  case class MissingEntryKernel(
                                 colorFields: Seq[DiscreteField[_3D, RGBA]],
                                 backupKernel: MatrixValuedPDKernel[_3D] = DiagonalKernel(GaussianKernel[_3D](10), 3) * 0.0001
                               ) extends MatrixValuedPDKernel[_3D] {

    val fs = colorFields.map(f => f.interpolateNearestNeighbor())

    override protected def k(x: Point[_3D], y: Point[_3D]): DenseMatrix[Double] = {

      val correlation = fs.foldLeft(DenseMatrix.zeros[Double](outputDim, outputDim)) { (sum, field) =>
        val xc = field(x)
        val yc = field(y)
        val addend =
          if (xc.a > 0.5 && yc.a > 0.5) {
            xc.toRGB.toVector.outer(yc.toRGB.toVector).toBreezeMatrix
          } else {
            backupKernel(x, y)
          }
        sum + addend * (1.0 / fs.size)
      }

      correlation
    }

    override def domain: Domain[_3D] = RealSpace[_3D]

    override def outputDim: Int = 3
  }


  /**
    * Kernel estimating the covariance only on the available data.
    */
  case class ReducedEntryKernel(
                                 colorFields: Seq[DiscreteField[_3D, RGBA]],
                                 backupKernel: MatrixValuedPDKernel[_3D] = DiagonalKernel(GaussianKernel[_3D](5), 3) * 0.1
                               ) extends MatrixValuedPDKernel[_3D] {

    val fs = colorFields.map(f => f.interpolateNearestNeighbor())
    private val originalDomain = colorFields.head.domain
    val countEntries = DiscreteField(originalDomain,colorFields.foldLeft(
      IndexedSeq.fill[Int](originalDomain.numberOfPoints)(0)
    ) { case (sum, field) =>
        field.data.map(c => if(c.a==1.0) 1 else 0).zip(sum).map(p => p._1+p._2)
    }).interpolateNearestNeighbor()

    override protected def k(x: Point[_3D], y: Point[_3D]): DenseMatrix[Double] = {
      var count = 0

      val correlation = fs.foldLeft(DenseMatrix.zeros[Double](outputDim, outputDim)) { (sum, field) =>
        val xc = field(x)
        val yc = field(y)
        if (xc.a > 0.5 && yc.a > 0.5) {
          val addend = xc.toRGB.toVector.outer(yc.toRGB.toVector).toBreezeMatrix
          count += 1
          sum + addend
        } else {
          sum
        }
      }

      if (count>0) {
        correlation*(1.0/count)
      } else {
        backupKernel(x,y)
      }
    }

    override def domain: Domain[_3D] = RealSpace[_3D]

    override def outputDim: Int = 3
  }

}


================================================
FILE: src/main/scala/preprocessing/ConvertBu3DRawData.scala
================================================
/*
 * Copyright University of Basel, Graphics and Vision Research Group
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package preprocessing

import java.awt.image.BufferedImage
import java.io.{File, FileInputStream, InputStream}
import javax.imageio.ImageIO

import ch.unibas.cs.gravis.facepipeline._
import scalismo.faces.landmarks.TLMSLandmark3D
import scalismo.faces.mesh.{ColorNormalMesh3D, TextureMappedProperty, VertexPropertyPerTriangle}
import scalismo.faces.render.Transform3D
import scalismo.faces.utils.ResourceManagement
import scalismo.faces.color.RGBA
import scalismo.faces.image.PixelImage
import scalismo.common.PointId
import scalismo.geometry._
import scalismo.mesh._
import scalismo.registration.{LandmarkRegistration, RigidTransformation, TranslationTransform}

import scala.io.Source
import scala.reflect.io._
import scala.collection.mutable.ListBuffer
import scala.util.Try



object ConvertBu3DRawData {

  def main(args: Array[String]) {
    import ch.unibas.cs.gravis.facepipeline.BU3DDataProvider
    scalismo.initialize()
    ConvertBu3DRawData(BU3DDataProvider).run()
  }

}



case class ConvertBu3DRawData(dataProvider : DataProvider) extends PipelineStep {

  override def run() {
    val inputDirectory = dataProvider.repositoryRoot / "data" / "bu3dfe" / "original"
    val ids = getIds(inputDirectory)
    println(s"found ${ids.size} ids....")

    ids.take(4).foreach { id => // todo: remove take(4) to build on full database
      preprocessMaximalExpressions(id, inputDirectory)
    }
  }

  def getIds(directory: Path): Seq[dataProvider.Person] = {
    require(
      directory.isDirectory,
      "Expected path to the parent DIRECTORY containing the 100 folders of the BU3D, one for each person."
    )
    val subDirectories = directory.toDirectory.list
    val subDirectoryNames = subDirectories.map(_.name).toIndexedSeq
    val sortedList = subDirectoryNames.sortWith((l, r) => l.compareTo(r) < 0)
    sortedList.map { subdir =>
      val filenames = (directory / subdir).toDirectory.list.map(_.name).toIndexedSeq
      val filename = filenames.filter(file => file.contains("NE00") && file.contains("RAW") && file.contains(".wrl")).head
      dataProvider.personFromFilename(filename)
    }
  }

  def getFilestem(id: dataProvider.Person, expression: ExpressionType, mask: MaskType): String = {
    val base = id + expression.toString
    s"${id.id}${expression}${id.raceTag}${mask}"
  }

  def preprocessMaximalExpressions(id: dataProvider.Person, inputDirectory: Path): Unit = {
    import BU3DDataProvider.{RAW,F3D,Original,Aligned}

    val expressions = dataProvider.expressions
    val outputPath = dataProvider.repositoryRoot / "data" / "incoming"

    expressions.foreach { expression =>
      Seq(RAW,F3D).map { mask =>
        val meshStem = getFilestem(id, expression, mask)
        val textureStem = getFilestem(id, expression, F3D)

        val meshPath = outputPath / "mesh" / s"${id.id}${expression}${id.raceTag}${mask}.ply"
        if (! meshPath.jfile.exists() ) {
          println(meshStem + " converting mesh")
          val texture = loadTexture(id, textureStem, inputDirectory)
          val mesh = loadBu3DWRLHacky(id, meshStem, inputDirectory, texture)
          dataProvider.incoming.saveMesh(id, expression, mask, Original, mesh).get
        } else
          println(meshStem + " already converted mesh")

        // convert all available landmarks
        val landmarkPath = outputPath / "landmarks" / s"${id.id}${expression}${id.raceTag}${mask}.tlms"
        if (! landmarkPath.jfile.exists() ) {
          println(meshStem + " converting landmarks")
          if (meshStem.endsWith("F3D")) {
            val landmarks = readBU3DFEbnd(inputDirectory / id.id / s"${meshStem}.bnd").get
            dataProvider.incoming.saveLandmarks(id, expression, mask, Original, landmarks.map { lm => Landmark[_3D](id = lm.id, point = lm.point) }).get
          }
          if (meshStem.endsWith("RAW")) {
            val landmarks = readBU3DFEpse(inputDirectory / id.id / s"${meshStem}.pse").get
            dataProvider.incoming.saveLandmarks(id, expression, mask, Original, landmarks.map { lm => Landmark[_3D](id = lm.id, point = lm.point) }).get
          }
        } else {
          println(meshStem + " already converted landmarks")
        }
      }

    }

    // do alignment after converting all data as we need for each id raw+f3d available
    expressions.foreach { expression =>
      // align f3d data to original raw data
      val meshStem = getFilestem(id, expression, F3D)
      val landmarkPath = outputPath / "landmarks" / s"${id.id}${expression}${id.raceTag}${F3D}_aligned.tlms"
      val meshPath = outputPath / "mesh" / s"${id.id}${expression}${id.raceTag}${F3D}_aligned.ply"
      if ( (!landmarkPath.jfile.exists()) && (!meshPath.jfile.exists()) ) {
        println(meshStem + " calculating alignment")
        val rawMesh = dataProvider.incoming.loadColoredMesh(id, expression, RAW).get
        val rawLandmarks = dataProvider.incoming.loadLandmarks(id, expression, RAW, Original).get
        val f3dMesh = dataProvider.incoming.loadColoredMesh(id, expression, F3D).get
        val f3dLandmarks = dataProvider.incoming.loadLandmarks(id, expression, F3D, Original).get

        val aligned = align(f3dMesh, f3dLandmarks, rawMesh, rawLandmarks)

        aligned match {
          case Some((meshAligned, landmarksAligned)) =>
            dataProvider.incoming.saveLandmarks(id, expression, F3D, Aligned, landmarksAligned).get
            dataProvider.incoming.saveMesh(id, expression, F3D, Aligned, meshAligned).get
          case None =>
        }
      } else {
        println(meshStem + " already aligned")
      }
    }

  }

  def loadTexture(id: dataProvider.Person, stem: String, inputDirectory: Path): PixelImage[RGBA] = {
    import scalismo.faces.image.BufferedImageConverter
    val textureName = inputDirectory / id.id / s"${stem}.bmp"
    println(textureName.toString())
    val img: BufferedImage = ImageIO.read(new java.io.File(textureName.toString()))
    BufferedImageConverter.toPixelImage(img)
  }

  def loadBu3DWRLHacky(id: dataProvider.Person, stem: String, inputDirectory: Path, texture: PixelImage[RGBA]): ColorNormalMesh3D = {
    val meshName = inputDirectory / id.id / s"$stem.wrl"
    val lines = Source.fromFile(meshName.toString()).getLines()

    val coordinates = ListBuffer[Point[_3D]]()
    val textureCoordinates = ListBuffer[Point[_2D]]()
    val textureCoordinateIndex = ListBuffer[TriangleCell]()
    val triangleVertexIndex = ListBuffer[TriangleCell]()
    parseBu3DWRL(lines, coordinates, triangleVertexIndex, textureCoordinates, textureCoordinateIndex)

    val vc = coordinates.toIndexedSeq
    val tc = textureCoordinates.toIndexedSeq
    val tvi = TriangleList(triangleVertexIndex.toIndexedSeq)
    val tci = TriangleList(textureCoordinateIndex.toIndexedSeq)

    val mesh = TriangleMesh3D(vc, tvi)
    val texCoords = VertexPropertyPerTriangle(tvi, tci.triangles.map(_.toIntVector3D), tc)
    val tex = TextureMappedProperty(tvi, texCoords, texture)
    ColorNormalMesh3D(mesh, tex, mesh.cellNormals)

  }

  def readBU3DFEbnd(path: Path): Try[IndexedSeq[TLMSLandmark3D]] = {
    ResourceManagement.usingTry(Try(new FileInputStream(path.toString())))(readBU3DFEbnd)
  }

  def readBU3DFEbnd(stream: InputStream): Try[IndexedSeq[TLMSLandmark3D]] = Try {
    // little bit unsafe, we read each line and expect to have a landmark!

    var counter = 0
    val lines = Source.fromInputStream(stream).getLines()
    lines.map { line =>
      val fields = line.split("\\s+").map(_.trim)

      counter += 1
      val name = "bu3dfe-lm-" + counter
      val x = fields(1).toFloat
      val y = fields(2).toFloat
      val z = fields(3).toFloat
      TLMSLandmark3D(name, Point(x, y, z), true)
    }.toIndexedSeq
  }

  def readBU3DFEpse(path: Path): Try[IndexedSeq[TLMSLandmark3D]] = {
    ResourceManagement.usingTry(Try(new FileInputStream(path.toString())))(readBU3DFEpse)
  }

  def readBU3DFEpse(stream: InputStream): Try[IndexedSeq[TLMSLandmark3D]] = Try {
    // little bit unsafe, we read each line and expect to have an landmark

    var counter = 0
    val lines = Source.fromInputStream(stream).getLines().toIndexedSeq

    val idOrder = Array(1, 5, 9, 13, 40, 45, 84, 85).toIndexedSeq

    // filter out pose normal (entry starting with n)
    val linesFiltered = lines.filterNot(_.substring(0, 1) == "n")
    if (linesFiltered.length == 8) {
      for (line <- linesFiltered) yield {

        val fields = line.split("\\s+").map(_.trim)

        val name = "bu3dfe-lm-" + idOrder(counter)
        counter += 1
        val x = fields(1).toFloat
        val y = fields(2).toFloat
        val z = fields(3).toFloat
        TLMSLandmark3D(name, Point(x, y, z), true)
      }
    } else {
      // empty list
      IndexedSeq[TLMSLandmark3D]()
    }
  }

  def align(
    f3dMeshOrig: ColorNormalMesh3D,
    f3dLandmarks: Seq[Landmark[_3D]],
    rawMeshOrig: ColorNormalMesh3D,
    rawLandmarks: Seq[Landmark[_3D]]
  ): Option[(ColorNormalMesh3D, Seq[Landmark[_3D]])] = {

    if (f3dLandmarks.isEmpty || rawLandmarks.isEmpty) {
      println(s"Not enough landmarks: f3dLandmarks.size=${f3dLandmarks.size} / rawLandmarks.size=${rawLandmarks.size}")
      return None
    }

    // landmark transformation
    val transLM = LandmarkRegistration.rigid3DLandmarkRegistration(f3dLandmarks, rawLandmarks, Point(0,0,0))
    val transLMNormal: RigidTransformation[_3D] = RigidTransformation(transLM.rotation, TranslationTransform(Vector(0, 0, 0)))
    val transLMFaces: Transform3D = new Transform3D {
      override def apply(x: Point[_3D]): Point[_3D] = transLM.f(x)
      override def apply(v: Vector[_3D]): Vector[_3D] = transLMNormal.f(v.toPoint).toVector
    }

    var f3dLandmarksT1 = transformLandmarks(f3dLandmarks, transLM)

    var f3dPointsT1 = f3dMeshOrig.transform(transLMFaces)

    val n0 = for (p <- f3dPointsT1.shape.pointSet.points) yield {
      rawMeshOrig.shape.pointSet.findClosestPoint(p).point
    }
    var avgError = f3dPointsT1.shape.pointSet.points.zip(n0).map(p => (p._1 - p._2).norm2).sum / f3dMeshOrig.shape.pointSet.points.toIndexedSeq.length

    var iteration = 0
    var r = scala.util.Random
    while (avgError > 0.2 && iteration < 500) {
      val f3DPointsSampled = for (i <- 1 to 1000) yield {
        f3dPointsT1.shape.pointSet.pointsWithId.toIndexedSeq(r.nextInt(f3dPointsT1.shape.pointSet.points.length - 1))
      }

      val nn = for (f3dPoint <- f3DPointsSampled) yield {
        val rawNeighbor = rawMeshOrig.shape.pointSet.findClosestPoint(f3dPoint._1)
        if ((f3dPoint._1 - rawNeighbor.point).norm < 5) {
          (f3dPoint._1, rawNeighbor.point)
        } else {
          (None, None)
        }
      }
      val nnFiltered = nn.filterNot(p => p._1 == None || p._2 == None).map(p => (p._1.asInstanceOf[Point3D], p._2.asInstanceOf[Point3D]))
      if (nnFiltered.length > 0) {

        val transT2 = LandmarkRegistration.rigid3DLandmarkRegistration(nnFiltered, Point(0,0,0))
        val transT2Normal: RigidTransformation[_3D] = RigidTransformation(transT2.rotation, TranslationTransform(Vector(0, 0, 0)))
        val transT2Faces: Transform3D = new Transform3D {
          override def apply(x: Point[_3D]): Point[_3D] = transT2.f(x)

          override def apply(v: Vector[_3D]): Vector[_3D] = transT2Normal.f(v.toPoint).toVector
        }

        val f3dLandmarksT2 = transformLandmarks(f3dLandmarksT1, transT2)
        val f3dPointsT2 = f3dPointsT1.transform(transT2Faces)
        val sampledPointsTrans = transformPoints(nnFiltered.map(p => p._1), transT2)

        avgError = sampledPointsTrans.zip(nnFiltered).map(p => (p._1 - p._2._2).norm).sum / sampledPointsTrans.length

        f3dPointsT1 = f3dPointsT2
        f3dLandmarksT1 = f3dLandmarksT2
        iteration = iteration + 1
      } else {
        println("icp alignment did not find corresponding points!")
        avgError = -1
      }

    }
    println("... alignment error after iteration " + iteration + ": " + avgError)

    val outTLMSLandmarks = for (lm <- f3dLandmarksT1.zip(f3dLandmarks)) yield {
      TLMSLandmark3D(lm._2.id, lm._1.point, true)
    }

    Some((f3dPointsT1, f3dLandmarksT1))
  }

  def transformLandmarks(landmarks: Seq[Landmark[_3D]], trans: RigidTransformation[_3D]): Seq[Landmark[_3D]] = {
    for (lm <- landmarks) yield {
      Landmark[_3D](lm.id, trans(lm.point), lm.description, lm.uncertainty)
    }
  }

  def transformPoints(points: Seq[Point[_3D]], trans: RigidTransformation[_3D]): Seq[Point[_3D]] = {
    for (p <- points) yield {
      trans(p)
    }
  }

  def parseBu3DWRL(
    lines: Iterator[String],
    coordinates: ListBuffer[Point[_3D]],
    tvi: ListBuffer[TriangleCell],
    textureCoordinates: ListBuffer[Point[_2D]],
    tci: ListBuffer[TriangleCell]
  ): Unit = {

    val BOILERPLATE = "boilerplate"
    val COORDINATES = "coord Coordinate"
    val COORDINATES2 = "coord Coordinate {"
    val TEXTURECOORDS = "texCoord TextureCoordinate"
    val TEXTURECOORDS2 = "texCoord TextureCoordinate {"
    val TEXCOORDINDEX = "texCoordIndex ["
    val VERTEXINDEX = "coordIndex ["

    var mode = "boilerplate"

    for (line <- lines) {
      val trimmed = line.trim
      mode match {
        case BOILERPLATE => {
          trimmed match {
            case COORDINATES => mode = COORDINATES
            case COORDINATES2 => mode = COORDINATES
            case TEXTURECOORDS => mode = TEXTURECOORDS
            case TEXTURECOORDS2 => mode = TEXTURECOORDS
            case TEXCOORDINDEX => mode = TEXCOORDINDEX
            case VERTEXINDEX => mode = VERTEXINDEX
            case _ =>
          }
        }
        case COORDINATES => {
          trimmed match {
            case "{" =>
            case "point" =>
            case "point [" =>
            case "[" =>
            case "" =>
            case "]" => mode = BOILERPLATE
            case data => {
              data.split(',').map(_.trim).filter(s => s.nonEmpty).foreach { tripplet =>
                val point = Point[_3D](tripplet.split(" ").map { w =>
                  try {
                    w.toDouble
                  } catch {
                    case e: Throwable => {
                      println(w)
                      throw (e)
                    }
                  }
                })
                coordinates += point
              }
            }
          }
        }
        case TEXTURECOORDS => {
          trimmed match {
            case "{" =>
            case "point" =>
            case "point [" =>
            case "[" =>
            case "" =>
            case "]" => mode = BOILERPLATE
            case data => {
              data.split(",").map(_.trim).filter(_.nonEmpty).flatMap(_.split(" ")).map(_.trim).filter(_.nonEmpty).grouped(2).foreach { pair =>
                val point = Point[_2D](pair.map { w =>
                  try {
                    w.toDouble
                  } catch {
                    case e: Throwable => {
                      println(w)
                      throw (e)
                    }
                  }
                })
                textureCoordinates += point
              }
            }
          }
        }
        case VERTEXINDEX => {
          trimmed match {
            case "[" =>
            case "" =>
            case "]" => mode = BOILERPLATE
            case data => {
              val stringToParse = if (data.last == ',') data.init else data
              val indices = stringToParse.split(" ").map(_.trim).filter(_.nonEmpty).map(_.toInt).toIndexedSeq
              require(indices.last == -1)
              val alwaysFirst = indices.head
              val firstAndlastRemoved = indices.init.tail
              val seconds = firstAndlastRemoved.init
              val thirds = firstAndlastRemoved.tail
              val grouped = seconds.zip(thirds)
              grouped.foreach { pair =>
                tvi += TriangleCell(
                  PointId(alwaysFirst),
                  PointId(pair._1),
                  PointId(pair._2)
                )
              }
            }
          }
        }
        case TEXCOORDINDEX => {
          trimmed match {
            case "[" =>
            case "" =>
            case "]" => mode = BOILERPLATE
            case data => {
              val stringToParse = if (data.last == ',') data.init else data
              val indices = stringToParse.split(" ").map(_.trim).filter(_.nonEmpty).map(_.toInt).toIndexedSeq
              require(indices.last == -1)
              val alwaysFirst = indices.head
              val firstAndlastRemoved = indices.init.tail
              val seconds = firstAndlastRemoved.init
              val thirds = firstAndlastRemoved.tail
              val grouped = seconds.zip(thirds)
              grouped.foreach { pair =>
                tci += TriangleCell(
                  PointId(alwaysFirst),
                  PointId(pair._1),
                  PointId(pair._2)
                )
              }
            }
          }
        }
      }
    }
  }

}


================================================
FILE: src/main/scala/preprocessing/PrepareReferenceLandmarks.scala
================================================
/*
 * Copyright University of Basel, Graphics and Vision Research Group
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package preprocessing

import breeze.linalg.{DenseMatrix, DenseVector}
import ch.unibas.cs.gravis.facepipeline.BU3DDataProvider
import ch.unibas.cs.gravis.facepipeline.BU3DDataProvider.Expressions
import ch.unibas.cs.gravis.facepipeline.{DataProvider, PipelineStep}
import scalismo.faces.io.TLMSLandmarksIO
import scalismo.statisticalmodel.MultivariateNormalDistribution

object PrepareReferenceLandmarks {

  def main(args: Array[String]): Unit = {
    scalismo.initialize()

    PrepareReferenceLandmarks(BU3DDataProvider).run()
  }

}

case class PrepareReferenceLandmarks(dataProvider : DataProvider) extends PipelineStep {

  override def run(): Unit = {

    scalismo.initialize()

    val rawRefLmsFile = (dataProvider.repositoryRoot / "data" / "incoming" / "reference" / "landmarks" / "mean2012_l7_bfm_nomouth.tlms").jfile

    val referenceLandmarksTLMS = TLMSLandmarksIO.read3D(rawRefLmsFile).get
    val referenceLandmarks = for (lmTlms <- referenceLandmarksTLMS if lmTlms.visible) yield {
      val lm = lmTlms.toLandmark
      val noiseVariance = lm.id.trim match {
        case lmid if lmid.contains("eyebrow") => 3.0
        case lmid if lmid.contains("eye.bottom") => 3.0
        case lmid if lmid.contains("eye.top") => 3.0
        case _ => 1.0
      }
      lm.copy(uncertainty = Some(MultivariateNormalDistribution(DenseVector.zeros[Double](3), DenseMatrix.eye[Double](3) * noiseVariance)))
    }

    // Transfer the reference landmarks to all the expressions and save them.
    for (expression <- Expressions.expressionModelTypes()) {

      val neutralRef = dataProvider.incoming.reference.loadMesh(dataProvider.Neutral).get
      val expressionRef = dataProvider.registration.loadPriorModel(expression).get.referenceMesh
      val expressionLms = for (lm <- referenceLandmarks) yield {
        val id = neutralRef.pointSet.findClosestPoint(lm.point).id
        lm.copy(point = expressionRef.pointSet.point(id))
      }

      dataProvider.incoming.reference.saveLandmarks(expression, expressionLms)
    }
  }

}

================================================
FILE: src/main/scala/registration/Registration.scala
================================================
/*
 * Copyright University of Basel, Graphics and Vision Research Group
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package registration

import breeze.linalg.DenseVector
import ch.unibas.cs.gravis.facepipeline._
import com.typesafe.scalalogging.LazyLogging
import ch.unibas.cs.gravis.facepipeline.BU3DDataProvider._
import _root_.registration.utils.VisualLogger
import _root_.registration.modelbuilding.FaceMask
import _root_.registration.metrics.HuberDistanceMetric
import scalismo.common.{PointId, UnstructuredPointsDomain}
import scalismo.geometry.{Landmark, Point, _3D}
import scalismo.mesh.{MeshBoundaryPredicates, TriangleMesh, TriangleMesh3DOperations}
import scalismo.numerics.{LBFGSOptimizer, Sampler, UniformMeshSampler3D}
import scalismo.registration._
import scalismo.statisticalmodel.{DiscreteLowRankGaussianProcess, StatisticalMeshModel}
import scalismo.utils.Random

case class Registration(dataProvider: DataProvider) extends PipelineStep with LazyLogging {

  type CoefficientVector = DenseVector[Double]

  case class LandmarkPair(referenceLandmark: Landmark[_3D], targetLandmark: Landmark[_3D])

  case class LevelConfig(regularizationWeight : Double, outlierThreshold : Option[Double], numBasisFunctions : Int)

  case class OutlierAwarePointSampler(referenceMesh: TriangleMesh[_3D], sampledNumberOfPoints: Int, isValidTargetPoint: Point[_3D] => Boolean)(implicit rand: Random) extends Sampler[_3D] with LazyLogging {


    private val points = UniformMeshSampler3D(referenceMesh, sampledNumberOfPoints).sample()(rand).map(_._1)
    private val validPointsOnly = points.filter(isValidTargetPoint)
    override val numberOfPoints: Int = validPointsOnly.size
    logger.info(s"sampling $numberOfPoints points")
    override def volumeOfSampleRegion: Double = referenceMesh.area

    override def sample()(implicit rand: Random): IndexedSeq[(Point[_3D], Double)] = {
      validPointsOnly.map(p => (p, 1.0 / referenceMesh.area))
    }

  }

  def registration(gpModel: StatisticalMeshModel,
                   targetMesh: TriangleMesh[_3D],
                   faceMask : FaceMask,
                   landmarkPairs: Seq[LandmarkPair]): TriangleMesh[_3D] = {

    val referenceMesh = gpModel.referenceMesh

    VisualLogger.showTargetMesh(targetMesh)

    val landmarkConstraints = for (landmarkPair <- landmarkPairs.toIndexedSeq) yield {
      val referencePointId = referenceMesh.pointSet.findClosestPoint(landmarkPair.referenceLandmark.point).id
      val targetPoint = landmarkPair.targetLandmark.point
      (referencePointId, targetPoint, landmarkPair.referenceLandmark.uncertainty.get)
    }

    val posteriorModel = gpModel.posterior(landmarkConstraints)

    VisualLogger.ui.map(_.show(posteriorModel,"M"))

    var initialCoefficients = DenseVector.zeros[Double](posteriorModel.rank)

    val levelConfigs = Seq(LevelConfig(1.0, None, gpModel.rank),
                           LevelConfig(1E-1, None, gpModel.rank),
                           LevelConfig(1E-3, None, gpModel.rank),
                          LevelConfig(1E-4, Some(4.0), gpModel.rank),
                          LevelConfig(1E-5, Some(2.0), gpModel.rank),
                          LevelConfig(1E-6, Some(1.0), gpModel.rank)
    )
    val finalCoefficients = levelConfigs.foldLeft[DenseVector[Double]](initialCoefficients){
      case(currentCoefficients, levelConfig) => {
        registrationForLevel(posteriorModel, targetMesh, faceMask, levelConfig, numberOfIterations = 20, currentCoefficients)
      }
    }

    posteriorModel.instance(finalCoefficients)

  }

  def registrationForLevel(gpModel: StatisticalMeshModel,
                           targetMesh : TriangleMesh[_3D],
                           faceMask: FaceMask,
                           levelConfig : LevelConfig,
                           numberOfIterations: Int,
                           initialCoefficients: CoefficientVector): CoefficientVector = {

    val LevelConfig(regularizationWeight, outlierThreshold,  numBasisFunctions) = levelConfig


    val reducedGPModel = reduceModel(gpModel, numBasisFunctions)
    val reducedInitialCoefficients = initialCoefficients(0 until numBasisFunctions)

    val referenceMesh = reducedGPModel.referenceMesh
    val currentFit = reducedGPModel.instance(reducedInitialCoefficients)


    VisualLogger.showStatisticalShapeModel(reducedGPModel)
    VisualLogger.updateModelView(reducedInitialCoefficients)

    // here we need to compute a new posterior based on the line landmarks

    def isValidTargetPoint(currentFit: TriangleMesh[_3D],
                           targetMeshOps: TriangleMesh3DOperations,
                           targetMeshBoundary: UnstructuredPointsDomain[_3D])
                          (p: Point[_3D]): Boolean = {

      val ptId = referenceMesh.pointSet.findClosestPoint(p).id
      val closestPt = targetMeshOps.closestPointOnSurface(currentFit.pointSet.point(ptId))
      val closestPtId = targetMesh.pointSet.findClosestPoint(closestPt.point).id

      def isOnValidBoundary(ptId : PointId, closestPtId : PointId) : Boolean = {

        if(faceMask.isLipPoint(ptId)) {
          true
        } else {
          (closestPt.point - targetMeshBoundary.findClosestPoint(closestPt.point).point).norm > 8.0 // Points that are close to a border
        }

      }

      def getOutlierTreshold(ptId : PointId) : Double = {

        if(faceMask.isLipPoint(ptId)) {
          Double.MaxValue
        } else {
          outlierThreshold.getOrElse(Double.MaxValue)
        }
      }


      Math.sqrt(closestPt.distanceSquared) < getOutlierTreshold(ptId) &&
        isOnValidBoundary(ptId,closestPtId) && !faceMask.isEarRegion(ptId) && !faceMask.isNoseRegion(ptId)
    }

    val targetMeshBoundaryPred = MeshBoundaryPredicates(targetMesh)
    val targetMeshBoundary = UnstructuredPointsDomain(targetMesh.pointSet.pointIds
      .filter(targetMeshBoundaryPred.pointIsOnBoundary)
      .map(targetMesh.pointSet.point).toIndexedSeq
    )

    val optimizationPointSampler = OutlierAwarePointSampler(referenceMesh,
      sampledNumberOfPoints = referenceMesh.pointSet.numberOfPoints,
      isValidTargetPoint(currentFit, targetMesh.operations, targetMeshBoundary))


    val config = RegistrationConfiguration[_3D, GaussianProcessTransformationSpace[_3D]](
      optimizer = LBFGSOptimizer(numIterations = numberOfIterations),
      metric = HuberDistanceMetric[_3D](optimizationPointSampler),
      transformationSpace = GaussianProcessTransformationSpace(reducedGPModel.gp.interpolateNearestNeighbor),
      regularizer = L2Regularizer,
      regularizationWeight = regularizationWeight)

    // Scalismo implements registration always as image to image registration.
    // Therefore we compute distance images from the meshes
    val fixedImage = referenceMesh.operations.toDistanceImage
    val movingImage = targetMesh.operations.toDistanceImage

    val registrationIterator = scalismo.registration.Registration.iterations(config)(fixedImage, movingImage, reducedInitialCoefficients)
    val iteratorWithLogging = for ((regState, itNum) <- registrationIterator.zipWithIndex) yield {
      logger.debug(s"Iteration $itNum: value = ${regState.optimizerState.value}")
      VisualLogger.updateModelView(regState.optimizerState.parameters)
      regState
    }

    val lastRegistrationState = iteratorWithLogging.toSeq.last


    val fullFinalParameters = DenseVector.zeros[Double](initialCoefficients.length)
    fullFinalParameters(0 until numBasisFunctions) := lastRegistrationState.optimizerState.parameters
    fullFinalParameters
  }


  private def reduceModel(model : StatisticalMeshModel, numBasisFunctions : Int) : StatisticalMeshModel = {
    val reducedGp = DiscreteLowRankGaussianProcess(model.gp.mean, model.gp.klBasis.take(numBasisFunctions))
    model.copy(gp = reducedGp)
  }

  override def run(): Unit = {
    // transforms the mesh using the best similarity transform between the reference and target landmarks.

    for (expression <- Seq(Neutral,Sadness,Joy,Disgust,Anger,Fear,Surprise).reverse) {

      val referenceLandmarks = dataProvider.incoming.reference.loadLandmarks(expression = if(expression == Neutral) Neutral else CoreExpression).get
      val model = dataProvider.registration.loadPriorModel(expression = if(expression == Neutral) Neutral else CoreExpression).get

      val faceMask = dataProvider.incoming.reference.loadFaceMask().get

      logger.info("Successfully loaded reference and model")

      for (id <- scala.util.Random.shuffle(dataProvider.incoming.ids(expression)) if dataProvider.registration.loadMesh(id,expression).isFailure &&  dataProvider.incoming.loadLandmarks(id,expression).isSuccess) {

        logger.info("Performing registration for id " + id)
        val targetMesh = dataProvider.incoming.loadMesh(id,expression).get
        val targetLandmarks = dataProvider.incoming.loadLandmarks(id,expression).get

        val correspondingLandmarks = correspondingLandmarkPairs(referenceLandmarks, targetLandmarks)

        val correspondingLandmarkPoints = correspondingLandmarks.map(lmPair => (lmPair.targetLandmark.point, lmPair.referenceLandmark.point))
        val alignmentTransform = LandmarkRegistration.similarity3DLandmarkRegistration(correspondingLandmarkPoints, center = Point(0.0, 0.0, 0.0))
        val alignedTargetMesh = targetMesh.transform(alignmentTransform)
        val alignedLandmarkPairs = correspondingLandmarks.map(lmPair =>
          LandmarkPair(lmPair.referenceLandmark, lmPair.targetLandmark.transform(alignmentTransform))
        )

        VisualLogger.ui.map(_.show(alignedLandmarkPairs.map(_.targetLandmark),"Test"))
        VisualLogger.ui.map(_.show(alignedLandmarkPairs.map(_.referenceLandmark),"Test"))
        val registeredMesh = registration(model, alignedTargetMesh, faceMask, alignedLandmarkPairs)

        // we realign the registered mesh with the target.
        val registeredMeshOrigSpace = registeredMesh.transform(alignmentTransform.inverse)
        dataProvider.registration.saveMesh(id,expression, registeredMeshOrigSpace)
      }
    }
  }

  private def correspondingLandmarkPairs(referenceLandmarks: Seq[Landmark[_3D]], targetLandmarks: Seq[Landmark[_3D]]): Seq[LandmarkPair] = {

    referenceLandmarks
      .map(refLm => (refLm, targetLandmarks.find(targetLm => targetLm.id == refLm.id)))
      .filter(lmTuple => lmTuple._2.nonEmpty)
      .map(lmTuple => LandmarkPair(lmTuple._1, lmTuple._2.get))
  }


}

object Registration {

  def main(args: Array[String]): Unit = {

    scalismo.initialize()
    Registration(BU3DDataProvider).run()

  }
}


================================================
FILE: src/main/scala/registration/experiments/Bu3DFELandmarkEvaluation.scala
================================================
/*
 * Copyright University of Basel, Graphics and Vision Research Group
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package registration.experiments

import ch.unibas.cs.gravis.facepipeline.BU3DDataProvider.Neutral
import ch.unibas.cs.gravis.facepipeline.BU3DDataProvider._
import scalismo.geometry.{Landmark,_3D}
import scalismo.io.LandmarkIO
import java.io.File
import breeze.linalg.DenseVector
import breeze.numerics.sqrt
import ch.unibas.cs.gravis.facepipeline.BU3DDataProvider
import scalismo.common.PointId

case class FeaturePoints(f : Feature, points : Seq[PointId], lms: Seq[Landmark[_3D]])

case class Feature(lmids: Seq[Int],name: String) {
  override def toString: String = name
}

object Bu3DFELandmarkEvaluation {

  def main(args: Array[String]): Unit = {

    scalismo.initialize()

    val dataProvider = BU3DDataProvider

    val eyeLeft = Feature(Seq(2,3,4,6,7,8),"eye.left")
    val eyeRight = Feature(Seq(10,11,12,14,15,16),"eye.right")
    val eyebrowLeft = Feature((17 to 26),"eyebrow.left")
    val eyebrowRight = Feature((27 to 36),"eyebrow.right")
    val nose = Feature((37 to 47),"nose")
    val mouth = Feature((49 to 60),"mouth")
    val faceRight = Feature((69 to 74),"face.right")
    val faceLeft = Feature((78 to 83),"face.left")
    val chin = Feature((75 to 77),"chin")

    val features = Seq(eyeLeft,eyeRight,eyebrowLeft,eyebrowRight,nose,mouth,faceRight,faceLeft,chin)

    val referenceMesh = dataProvider.incoming.reference.loadMesh(Neutral).get

    val bu3DreferenceLandmarks = {
      LandmarkIO.readLandmarksCsv[_3D](new File(dataProvider.repositoryRoot.jfile,"/data/incoming/reference/landmarks/mean2012-bu3dfe-eval-landmarks.csv")).get
    }

    val featuresWithLandmarks = features.map{
      f =>
        val ids = f.lmids
        val lms = ids.map(id => bu3DreferenceLandmarks.find(lm => lm.id.replace("bu3dfe-lm-","") == s"$id").get)
        val points = lms.map(lm => referenceMesh.pointSet.findClosestPoint(lm.point).id)
        FeaturePoints(f,points,lms)
    }

    val finalRes = for (expression <- Seq(Neutral,Sadness,Joy,Disgust,Anger,Fear,Surprise).reverse) yield {

      val evalPerSample = for (id <- dataProvider.incoming.ids(expression) if dataProvider.incoming.loadLandmarks(id,expression, F3D, BU3DDataProvider.Aligned).isSuccess && dataProvider.incoming.loadLandmarks(id, expression).isSuccess) yield {

        val bu3Dlandmarks = dataProvider.incoming.loadLandmarks(id, expression, F3D, BU3DDataProvider.Aligned).get
        val registeredMesh = dataProvider.registration.loadMesh(id, expression).get

        val alldists = for (f <- featuresWithLandmarks) yield {

          val dists = for (id <- f.lms.zip(f.points)) yield {

            val registeredPoint = registeredMesh.pointSet.point(id._2)
            val gtPoint = bu3Dlandmarks.find(lm => lm.id == id._1.id).get.point

            val dist = sqrt((registeredPoint - gtPoint).norm2)
            dist

          }

          (dists.sum / dists.length)

        }

        alldists

      }

      evalPerSample

    }

    println("Landmark Evaluation Result:")

    val table = for((f,i) <- features.zipWithIndex) yield {
      val data = DenseVector(finalRes.map(d => d.map(_(i))).flatten.toArray)
      val error = breeze.stats.meanAndVariance(data)
      Seq(f.name,s"${error.mean}",s"${error.stdDev}")
    }

    println(Tabulator.format(Seq(Seq("Region","Mean","Std")) ++ table))

  }
}

object Tabulator {
  def format(table: Seq[Seq[Any]]) = table match {
    case Seq() => ""
    case _ =>
      val sizes = for (row <- table) yield (for (cell <- row) yield if (cell == null) 0 else cell.toString.length)
      val colSizes = for (col <- sizes.transpose) yield col.max
      val rows = for (row <- table) yield formatRow(row, colSizes)
      formatRows(rowSeparator(colSizes), rows)
  }

  def formatRows(rowSeparator: String, rows: Seq[String]): String = (
    rowSeparator ::
      rows.head ::
      rowSeparator ::
      rows.tail.toList :::
      rowSeparator ::
      List()).mkString("\n")

  def formatRow(row: Seq[Any], colSizes: Seq[Int]) = {
    val cells = (for ((item, size) <- row.zip(colSizes)) yield if (size == 0) "" else ("%" + size + "s").format(item))
    cells.mkString("|", "|", "|")
  }

  def rowSeparator(colSizes: Seq[Int]) = colSizes map { "-" * _ } mkString("+", "+", "+")
}



================================================
FILE: src/main/scala/registration/metrics/HuberDistanceMetric.scala
================================================
/*
 * Copyright University of Basel, Graphics and Vision Research Group
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package registration.metrics

import scalismo.common.Domain
import scalismo.geometry.{Dim, NDSpace, Point}
import scalismo.image.{DifferentiableScalarImage, ScalarImage}
import scalismo.numerics.Sampler
import scalismo.registration.{ImageMetric, Transformation}

case class HuberDistanceMetric[D <: Dim: NDSpace](sampler: Sampler[D]) extends ImageMetric[D] {


  override val ndSpace = implicitly[NDSpace[D]]

  def value(fixedImage: ScalarImage[D], movingImage: ScalarImage[D], transform: Transformation[D]) = {
    val warpedImage = fixedImage.compose(transform)


    def rhoHuber(v : Float ) : Float = {
      val k = 1.345
      if (v < k)
        (v * v / 2f) / (1 + v * v)
      else
        (k * ( Math.abs(v) - k / 2 )).toFloat
    }
    integrator.integrateScalar((warpedImage - movingImage).andThen(rhoHuber _)) / integrator.sampler.volumeOfSampleRegion
  }


  def takeDerivativeWRTToTransform(fixedImage: DifferentiableScalarImage[D], movingImage: ScalarImage[D], transform: Transformation[D]) = {

    def psiHuber(v : Float) : Float = {
      val k = 1.345
      if (v < k) v else (k * Math.signum(v)).toFloat
    }

    val movingGradientImage = fixedImage.differentiate
    val warpedImage = fixedImage.compose(transform)
    val dDMovingImage = (warpedImage - movingImage).andThen(psiHuber _) * (1.0 / sampler.volumeOfSampleRegion)

    val fullMetricGradient = (x: Point[D]) => {
      val domain = Domain.intersection(warpedImage.domain, dDMovingImage.domain)
      if (domain.isDefinedAt(x))
        Some(movingGradientImage(transform(x)).toFloatBreezeVector * dDMovingImage(x))
      else None
    }

    fullMetricGradient
  }
}

================================================
FILE: src/main/scala/registration/modelbuilding/BuildCoreExpressionPrior.scala
================================================
/*
 * Copyright University of Basel, Graphics and Vision Research Group
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package registration.modelbuilding

import ch.unibas.cs.gravis.facepipeline.BU3DDataProvider._
import ch.unibas.cs.gravis.facepipeline.{BU3DDataProvider, DataProvider, PipelineStep}
import com.typesafe.scalalogging.StrictLogging
import scalismo.common._
import scalismo.geometry.{Point, Vector, _3D}
import scalismo.statisticalmodel._

case class BuildCoreExpressionPrior(dataProvider: DataProvider) extends PipelineStep with StrictLogging {

  def run(): Unit = {

    scalismo.initialize()

    val dataProvider = BU3DDataProvider

    logger.info("load reference mesh ...")

    val referenceMesh = dataProvider.incoming.reference.loadMesh(Neutral).get

    logger.info("make core model from expressions ...")

    val references = for(exp <- Seq(Neutral, Sadness, Joy, Disgust, Anger, Fear, Surprise) ) yield {

      val expression = dataProvider.incoming.reference.loadMesh(exp).get

      def t(p : Point[_3D]) : Vector[_3D] = {
        val id = referenceMesh.pointSet.findClosestPoint(p).id
        val cp = expression.pointSet.point(id)
        cp - p
      }

      Field[_3D,Vector[_3D]](RealSpace[_3D],t)

    }

    logger.info("augment neutral with core model ...")

    val ssm = StatisticalMeshModel.createUsingPCA(referenceMesh,references)
    val neutralModel = dataProvider.registration.loadPriorModel(Neutral).get
    val augmentedExpressionModel = StatisticalMeshModel.augmentModel(ssm,neutralModel.gp.interpolateNearestNeighbor)

    logger.info("save core model ...")

    dataProvider.registration.savePriorModel(augmentedExpressionModel, dataProvider.CoreExpression)

  }

}

object BuildCoreExpressionPrior {

  def main(args: Array[String]): Unit = {

    BuildCoreExpressionPrior(BU3DDataProvider).run()

  }

}


================================================
FILE: src/main/scala/registration/modelbuilding/BuildNeutralPrior.scala
================================================
/*
 * Copyright University of Basel, Graphics and Vision Research Group
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package registration.modelbuilding

import breeze.linalg.DenseMatrix
import ch.unibas.cs.gravis.facepipeline.BU3DDataProvider._
import ch.unibas.cs.gravis.facepipeline._
import com.typesafe.scalalogging.StrictLogging
import scalismo.common._
import scalismo.geometry.{Vector, _3D}
import scalismo.numerics.PivotedCholesky
import scalismo.numerics.PivotedCholesky.StoppingCriterion
import scalismo.statisticalmodel._

case class BuildNeutralPrior(dataProvider: DataProvider) extends PipelineStep with StrictLogging {


  def approximatePointSet(points: UnstructuredPointsDomain[_3D], D: Double, gp : GaussianProcess[_3D,Vector[_3D]], sc: StoppingCriterion) : (DiscreteLowRankGaussianProcess[_3D,Vector[_3D]],UnstructuredPointsDomain[_3D]) = {

    def phiWithDim(i: Int, dim : Int, ptId : Int, phi: DenseMatrix[Double]) = {
      phi(ptId*3 + dim,i)
    }

    def phiVec(i : Int, ptID : PointId,phi : DenseMatrix[Double]) = {
      Vector(phiWithDim(i,0,ptID.id,phi),phiWithDim(i,1,ptID.id,phi),phiWithDim(i,2,ptID.id,phi))
    }
    val (phi,lambda) = PivotedCholesky.computeApproximateEig(gp.cov,points.points.toIndexedSeq,D,sc)

    val nPhi = phi.cols

    val klBasis: DiscreteLowRankGaussianProcess.KLBasis[_3D, Vector[_3D]] = for(i <- 0 until nPhi) yield {
      val v = DiscreteField[_3D,Vector[_3D]](points,points.pointsWithId.toIndexedSeq.map(f => phiVec(i,f._2,phi)))
      DiscreteLowRankGaussianProcess.Eigenpair(lambda(i),v)
    }
    val mean = DiscreteField[_3D,Vector[_3D]](points,points.points.toIndexedSeq.map(p => gp.mean(p)))

    val r = DiscreteLowRankGaussianProcess[_3D,Vector[_3D]](mean, klBasis)
    (r,points)

  }


  override def run(): Unit = {

    scalismo.initialize()

    logger.info(s"building model for neutral expression")
    val referenceMesh = dataProvider.incoming.reference.loadMesh(Neutral).get
    val mask = dataProvider.incoming.reference.loadFaceMask().get
    val faceKernel = FaceKernel(mask,referenceMesh)
    val gp = GaussianProcess[_3D, Vector[_3D]](faceKernel)
    val ldg = approximatePointSet(referenceMesh.pointSet, 1.0, gp, PivotedCholesky.NumberOfEigenfunctions(1000))
    val lowRankGaussianProcess = ldg._1.interpolateNearestNeighbor
    logger.info("computed nystrom approximation")

    val model = StatisticalMeshModel(referenceMesh, lowRankGaussianProcess)

    dataProvider.registration.savePriorModel(model, Neutral)

    logger.info("model building done")

  }

}

object BuildNeutralPrior {

  def main(args: Array[String]): Unit = {
    BuildNeutralPrior(BU3DDataProvider).run()
  }

}



================================================
FILE: src/main/scala/registration/modelbuilding/FaceKernel.scala
================================================
/*
 * Copyright University of Basel, Graphics and Vision Research Group
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package registration.modelbuilding

import breeze.linalg.DenseMatrix
import registration.modelbuilding.FaceKernel.LevelWithScale
import scalismo.common._
import scalismo.geometry.{Point, SquareMatrix, _3D}
import scalismo.kernels.{BSplineKernel, DiagonalKernel, MatrixValuedPDKernel}
import scalismo.mesh.TriangleMesh

case class SpatiallyVaryingMultiscaleKernel(levelsWithScale : Seq[LevelWithScale],
                                            mask: FaceMask, referenceMesh: TriangleMesh[_3D]) extends MatrixValuedPDKernel[_3D] {


  val bSplineKernel = DiagonalKernel(BSplineKernel[_3D](order = 3, scale = 0), 3)

  val smoothedRegionWeights = levelsWithScale.map(levelWithScale =>
    (levelWithScale.level, mask.computeSmoothedRegions(referenceMesh,levelWithScale.level, 40))
  ).toMap

  def k(x: Point[_3D], y: Point[_3D]): DenseMatrix[Double] = {

    var sum = SquareMatrix.zeros[_3D].toBreezeMatrix

    for (LevelWithScale(level, scale) <- levelsWithScale) {

      val weightX = smoothedRegionWeights(level)(x)
      val weightY = smoothedRegionWeights(level)(y)

      sum += bSplineKernel((x.toVector * Math.pow(2, level)).toPoint, (y.toVector * Math.pow(2, level)).toPoint) * scale * weightX * weightY

    }

    sum
  }

  override def outputDim = 3

  override def domain = RealSpace[_3D]

}

case class FaceKernel(faceMask : FaceMask, referenceMesh: TriangleMesh[_3D]) extends MatrixValuedPDKernel[_3D] {


  private val faceKernel = {

    val levelsAndScales =   Seq(
      LevelWithScale(-6,128.0),
      LevelWithScale(-5, 64.0),
      LevelWithScale(-4, 32.0),
      LevelWithScale(-3, 10.0),
      LevelWithScale(-2, 4.0))

    val spatiallyVaryingKernel = SpatiallyVaryingMultiscaleKernel(levelsAndScales, faceMask, referenceMesh)

    val symmetricKernel = symmetrize(spatiallyVaryingKernel)

    symmetricKernel * 0.7 + spatiallyVaryingKernel * 0.3
  }

  override protected def k(x: Point[_3D], y: Point[_3D]): DenseMatrix[Double] = faceKernel(x,y)

  override def domain: Domain[_3D] = RealSpace[_3D]

  override def outputDim: Int = 3

  private def symmetrize(kernel: MatrixValuedPDKernel[_3D]) :  MatrixValuedPDKernel[_3D] = {

    new MatrixValuedPDKernel[_3D] {
      override def outputDim = 3

      override def k(x: Point[_3D], y: Point[_3D]): DenseMatrix[Double] = {

        val ybar = Point(-y.x, y.y, y.z)
        val xbar = Point(-x.x, x.y, x.z)

        val I = DenseMatrix.eye[Double](3)
        I(0, 0) = 1

        val IBar = DenseMatrix.eye[Double](3)
        IBar(0, 0) = -1

        I * kernel(x, y) + IBar * (kernel(x, ybar))
      }

      override def domain: Domain[_3D] = kernel.domain
    }

  }


}

object FaceKernel {
  case class LevelWithScale(level : Int, scale : Double)
}


================================================
FILE: src/main/scala/registration/modelbuilding/FaceMask.scala
================================================
/*
 * Copyright University of Basel, Graphics and Vision Research Group
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package registration.modelbuilding

import scalismo.common.{PointId, UnstructuredPointsDomain}
import scalismo.geometry.{Point, _3D}
import scalismo.kernels.GaussianKernel
import scalismo.mesh.{ScalarMeshField, TriangleMesh}
import scalismo.utils.Memoize

case class FaceMask(levelMask: ScalarMeshField[Int], semanticMask: ScalarMeshField[Int]) {

  def isEarRegion(id : PointId) : Boolean = {
    semanticMask(id) == 1
  }

  def isLipPoint(id : PointId) : Boolean = {
    semanticMask(id) == 2
  }

  def isNoseRegion(id : PointId) : Boolean = {
    semanticMask(id) == 3
  }

  // Returns a value in the interval [0,1] indicating whether a point belongs to the region
  def computeSmoothedRegions(referenceMesh: TriangleMesh[_3D], level : Int, stddev : Double) : Point[_3D] => Double = {

    val transformedMask = levelMask.copy(mesh = referenceMesh)
    val pointsWithRegions = transformedMask.pointsWithValues.toIndexedSeq

    val regionSmoother = GaussianKernel[_3D](stddev)
    val regionPts = UnstructuredPointsDomain(pointsWithRegions.filter(_._2 >= level).map(_._1))

    def regionWeight(p : Point[_3D]) : Double = {
      regionSmoother(regionPts.findClosestPoint(p).point,p)
    }

    Memoize(regionWeight,referenceMesh.pointSet.numberOfPoints)
  }

}



================================================
FILE: src/main/scala/registration/utils/VisualLogger.scala
================================================
/*
 * Copyright University of Basel, Graphics and Vision Research Group
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package registration.utils

import java.awt.Color

import breeze.linalg.DenseVector
import scalismo.geometry._3D
import scalismo.mesh.TriangleMesh
import scalismo.statisticalmodel.StatisticalMeshModel
import scalismo.ui.api._

object VisualLogger {
  var ui : Option[ScalismoUI] = None//Some(ScalismoUI("Visual Logger"))

  val modelGroup = ui.map(_.createGroup("Model"))
  var modelView : Option[StatisticalMeshModelViewControls] = None

  val targetGroup = ui.map(_.createGroup("Target"))
  var targetMeshView : Option[TriangleMeshView] = None



  def showTargetMesh(targetMesh : TriangleMesh[_3D]) : Unit = {
    remove(targetMeshView)
    targetMeshView = show(VisualLogger.targetGroup, targetMesh, "target")
    targetMeshView.map(_.color = Color.RED)
  }

  def showStatisticalShapeModel(ssm : StatisticalMeshModel) : Unit = {
    removeModel(modelView)
    modelView = show(modelGroup, ssm, "gpmodel")
    modelView.map(_.meshView.opacity = 0.7)
  }

  def updateModelView(coeffs : DenseVector[Double]) : Unit = {
    if (modelView.isDefined) {
      modelView.get.shapeModelTransformationView.shapeTransformationView.coefficients = coeffs
    }
  }


  private def show[A](group : Option[Group], t : A, name : String)(implicit sic : ShowInScene[A]): Option[sic.View] = {
    for {
      ui <- ui
      g <- group
    } yield {
      ui.show(g, t, name)
    }
  }

  def remove[V <: ObjectView](view : Option[V]): Unit = {
    view.foreach(_.remove())
  }

  def removeModel(view : Option[StatisticalMeshModelViewControls]): Unit = {
    for {v <- view} {
      v.meshView.remove()
      v.shapeModelTransformationView.remove()
    }
  }

}
Download .txt
gitextract_eb7rw3gs/

├── LICENSE.txt
├── README.md
├── build.sbt
├── pipeline-data/
│   └── .gitignore
└── src/
    └── main/
        ├── resources/
        │   └── logback.xml
        └── scala/
            ├── ch/
            │   └── unibas/
            │       └── cs/
            │           └── gravis/
            │               └── facepipeline/
            │                   ├── BU3DDataProvider.scala
            │                   ├── DataProvider.scala
            │                   └── PipelineStep.scala
            ├── fitting/
            │   ├── StandardFitScript.scala
            │   └── experiments/
            │       ├── QualitativeLFW.scala
            │       └── RecognitionMultiPIE.scala
            ├── modelbuilding/
            │   └── ModelBuilding.scala
            ├── preprocessing/
            │   ├── ConvertBu3DRawData.scala
            │   └── PrepareReferenceLandmarks.scala
            └── registration/
                ├── Registration.scala
                ├── experiments/
                │   └── Bu3DFELandmarkEvaluation.scala
                ├── metrics/
                │   └── HuberDistanceMetric.scala
                ├── modelbuilding/
                │   ├── BuildCoreExpressionPrior.scala
                │   ├── BuildNeutralPrior.scala
                │   ├── FaceKernel.scala
                │   └── FaceMask.scala
                └── utils/
                    └── VisualLogger.scala
Condensed preview — 22 files, each showing path, character count, and a content snippet. Download the .json file or copy for the full structured content (142K chars).
[
  {
    "path": "LICENSE.txt",
    "chars": 11357,
    "preview": "                                 Apache License\n                           Version 2.0, January 2004\n                   "
  },
  {
    "path": "README.md",
    "chars": 5803,
    "preview": "# Basel Face Registration Pipeline\n\nThis repository contains all the code to reproduce our results from our recent publi"
  },
  {
    "path": "build.sbt",
    "chars": 690,
    "preview": "name := \"basel-face-pipeline\"\n\nversion := \"0.1\"\n\nscalaVersion := \"2.11.8\"\n\nscalacOptions := Seq(\"-unchecked\", \"-deprecat"
  },
  {
    "path": "pipeline-data/.gitignore",
    "chars": 0,
    "preview": ""
  },
  {
    "path": "src/main/resources/logback.xml",
    "chars": 1482,
    "preview": "<!--\n  ~ Copyright University of Basel, Graphics and Vision Research Group\n  ~\n  ~ Licensed under the Apache License, Ve"
  },
  {
    "path": "src/main/scala/ch/unibas/cs/gravis/facepipeline/BU3DDataProvider.scala",
    "chars": 18963,
    "preview": "/*\n * Copyright University of Basel, Graphics and Vision Research Group\n *\n * Licensed under the Apache License, Version"
  },
  {
    "path": "src/main/scala/ch/unibas/cs/gravis/facepipeline/DataProvider.scala",
    "chars": 4941,
    "preview": "/*\n * Copyright University of Basel, Graphics and Vision Research Group\n *\n * Licensed under the Apache License, Version"
  },
  {
    "path": "src/main/scala/ch/unibas/cs/gravis/facepipeline/PipelineStep.scala",
    "chars": 720,
    "preview": "/*\n * Copyright University of Basel, Graphics and Vision Research Group\n *\n * Licensed under the Apache License, Version"
  },
  {
    "path": "src/main/scala/fitting/StandardFitScript.scala",
    "chars": 13085,
    "preview": "/*\n * Copyright University of Basel, Graphics and Vision Research Group\n *\n * Licensed under the Apache License, Version"
  },
  {
    "path": "src/main/scala/fitting/experiments/QualitativeLFW.scala",
    "chars": 2533,
    "preview": "/*\n * Copyright University of Basel, Graphics and Vision Research Group\n *\n * Licensed under the Apache License, Version"
  },
  {
    "path": "src/main/scala/fitting/experiments/RecognitionMultiPIE.scala",
    "chars": 8465,
    "preview": "/*\n * Copyright University of Basel, Graphics and Vision Research Group\n *\n * Licensed under the Apache License, Version"
  },
  {
    "path": "src/main/scala/modelbuilding/ModelBuilding.scala",
    "chars": 16082,
    "preview": "/*\n * Copyright University of Basel, Graphics and Vision Research Group\n *\n * Licensed under the Apache License, Version"
  },
  {
    "path": "src/main/scala/preprocessing/ConvertBu3DRawData.scala",
    "chars": 17628,
    "preview": "/*\n * Copyright University of Basel, Graphics and Vision Research Group\n *\n * Licensed under the Apache License, Version"
  },
  {
    "path": "src/main/scala/preprocessing/PrepareReferenceLandmarks.scala",
    "chars": 2669,
    "preview": "/*\n * Copyright University of Basel, Graphics and Vision Research Group\n *\n * Licensed under the Apache License, Version"
  },
  {
    "path": "src/main/scala/registration/Registration.scala",
    "chars": 11114,
    "preview": "/*\n * Copyright University of Basel, Graphics and Vision Research Group\n *\n * Licensed under the Apache License, Version"
  },
  {
    "path": "src/main/scala/registration/experiments/Bu3DFELandmarkEvaluation.scala",
    "chars": 4863,
    "preview": "/*\n * Copyright University of Basel, Graphics and Vision Research Group\n *\n * Licensed under the Apache License, Version"
  },
  {
    "path": "src/main/scala/registration/metrics/HuberDistanceMetric.scala",
    "chars": 2286,
    "preview": "/*\n * Copyright University of Basel, Graphics and Vision Research Group\n *\n * Licensed under the Apache License, Version"
  },
  {
    "path": "src/main/scala/registration/modelbuilding/BuildCoreExpressionPrior.scala",
    "chars": 2375,
    "preview": "/*\n * Copyright University of Basel, Graphics and Vision Research Group\n *\n * Licensed under the Apache License, Version"
  },
  {
    "path": "src/main/scala/registration/modelbuilding/BuildNeutralPrior.scala",
    "chars": 3194,
    "preview": "/*\n * Copyright University of Basel, Graphics and Vision Research Group\n *\n * Licensed under the Apache License, Version"
  },
  {
    "path": "src/main/scala/registration/modelbuilding/FaceKernel.scala",
    "chars": 3372,
    "preview": "/*\n * Copyright University of Basel, Graphics and Vision Research Group\n *\n * Licensed under the Apache License, Version"
  },
  {
    "path": "src/main/scala/registration/modelbuilding/FaceMask.scala",
    "chars": 1904,
    "preview": "/*\n * Copyright University of Basel, Graphics and Vision Research Group\n *\n * Licensed under the Apache License, Version"
  },
  {
    "path": "src/main/scala/registration/utils/VisualLogger.scala",
    "chars": 2283,
    "preview": "/*\n * Copyright University of Basel, Graphics and Vision Research Group\n *\n * Licensed under the Apache License, Version"
  }
]

About this extraction

This page contains the full source code of the unibas-gravis/basel-face-pipeline GitHub repository, extracted and formatted as plain text for AI agents and large language models (LLMs). The extraction includes 22 files (132.6 KB), approximately 34.1k tokens. Use this with OpenClaw, Claude, ChatGPT, Cursor, Windsurf, or any other AI tool that accepts text input. You can copy the full output to your clipboard or download it as a .txt file.

Extracted by GitExtract — free GitHub repo to text converter for AI. Built by Nikandr Surkov.

Copied to clipboard!