Full Code of hypelib/Hype for AI

Showing preview only (424K chars total).
Repository: hypelib/Hype
Branch: master
Commit: e467e926da4c
Files: 42
Total size: 398.3 KB

Directory structure:
gitextract__bdmbe28/

├── .gitattributes
├── .gitignore
├── .paket/
│   ├── Paket.Restore.targets
│   └── paket.targets
├── Hype.sln
├── LICENSE.txt
├── README.md
├── Roadmap.txt
├── docs/
│   ├── .gitignore
│   ├── BuildDocs.fsx
│   └── input/
│       ├── FeedforwardNets.fsx
│       ├── HMC.fsx
│       ├── Optimization.fsx
│       ├── RecurrentNets.fsx
│       ├── Regression.fsx
│       ├── Training.fsx
│       ├── download.fsx
│       ├── files/
│       │   └── misc/
│       │       ├── style.css
│       │       ├── style_light.css
│       │       └── tips.js
│       ├── housing.data
│       ├── index.fsx
│       ├── resources/
│       │   └── Hype.pspimage
│       └── templates/
│           ├── docpage.cshtml
│           ├── reference/
│           │   ├── module.cshtml
│           │   ├── namespaces.cshtml
│           │   ├── part-members.cshtml
│           │   ├── part-nested.cshtml
│           │   └── type.cshtml
│           ├── template.cshtml
│           └── template.html
├── paket.dependencies
└── src/
    └── Hype/
        ├── AssemblyInfo.fs
        ├── Classifier.fs
        ├── Hype.fs
        ├── Hype.fsproj
        ├── Inference.fs
        ├── NLP.fs
        ├── Neural.fs
        ├── Optimize.fs
        ├── app.config
        └── paket.references

================================================
FILE CONTENTS
================================================

================================================
FILE: .gitattributes
================================================
###############################################################################
# Set default behavior to automatically normalize line endings.
###############################################################################
* text=auto

###############################################################################
# Set default behavior for command prompt diff.
#
# This is needed for earlier builds of msysgit that do not have it on by
# default for csharp files.
# Note: This is only used by the command line
###############################################################################
#*.cs     diff=csharp

###############################################################################
# Set the merge driver for project and solution files
#
# Merging from the command prompt will add diff markers to the files if there
# are conflicts (Merging from VS is not affected by the settings below, in VS
# the diff markers are never inserted). Diff markers may cause the following 
# file extensions to fail to load in VS. An alternative would be to treat
# these files as binary, so they always conflict and require user
# intervention with every merge. To do so, just uncomment the entries below
###############################################################################
#*.sln       merge=binary
#*.csproj    merge=binary
#*.vbproj    merge=binary
#*.vcxproj   merge=binary
#*.vcproj    merge=binary
#*.dbproj    merge=binary
#*.fsproj    merge=binary
#*.lsproj    merge=binary
#*.wixproj   merge=binary
#*.modelproj merge=binary
#*.sqlproj   merge=binary
#*.wwaproj   merge=binary

###############################################################################
# behavior for image files
#
# image files are treated as binary by default.
###############################################################################
#*.jpg   binary
#*.png   binary
#*.gif   binary

###############################################################################
# diff behavior for common document formats
# 
# Convert binary document formats to text before diffing them. This feature
# is only available from the command line. Turn it on by uncommenting the 
# entries below.
###############################################################################
#*.doc   diff=astextplain
#*.DOC   diff=astextplain
#*.docx  diff=astextplain
#*.DOCX  diff=astextplain
#*.dot   diff=astextplain
#*.DOT   diff=astextplain
#*.pdf   diff=astextplain
#*.PDF   diff=astextplain
#*.rtf   diff=astextplain
#*.RTF   diff=astextplain
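
# A quick way to confirm what the rules above resolve to for a given path is
# `git check-attr`. A minimal sketch in a throwaway repository (the repo and
# the file name are hypothetical, used only for illustration):
#
#     tmp=$(mktemp -d) && cd "$tmp"
#     git init -q .
#
#     # The same default rule as above: normalize line endings automatically.
#     printf '* text=auto\n' > .gitattributes
#
#     # Ask git which 'text' value a (hypothetical) path resolves to.
#     git check-attr text -- Program.cs   # prints: Program.cs: text: auto
#
# The same command works for the diff/merge attributes once the commented
# entries above are enabled, e.g. `git check-attr merge -- Hype.sln`.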


================================================
FILE: .gitignore
================================================
## Ignore Visual Studio temporary files, build results, and
## files generated by popular Visual Studio add-ons.

# User-specific files
*.suo
*.user
*.sln.docstates

# Build results

[Dd]ebug/
[Rr]elease/
x64/
build/
[Bb]in/
[Oo]bj/

# Enable "build/" folder in the NuGet Packages folder since NuGet packages use it for MSBuild targets
!packages/*/build/

# MSTest test Results
[Tt]est[Rr]esult*/
[Bb]uild[Ll]og.*

*_i.c
*_p.c
*.ilk
*.meta
*.obj
*.pch
*.pdb
*.pgc
*.pgd
*.rsp
*.sbr
*.tlb
*.tli
*.tlh
*.tmp
*.tmp_proj
*.log
*.vspscc
*.vssscc
.builds
*.pidb
*.log
*.scc

# Visual C++ cache files
ipch/
*.aps
*.ncb
*.opensdf
*.sdf
*.cachefile

# Visual Studio profiler
*.psess
*.vsp
*.vspx

# Guidance Automation Toolkit
*.gpState

# ReSharper is a .NET coding add-in
_ReSharper*/
*.[Rr]e[Ss]harper

# TeamCity is a build add-in
_TeamCity*

# DotCover is a Code Coverage Tool
*.dotCover

# NCrunch
*.ncrunch*
.*crunch*.local.xml

# Installshield output folder
[Ee]xpress/

# DocProject is a documentation generator add-in
DocProject/buildhelp/
DocProject/Help/*.HxT
DocProject/Help/*.HxC
DocProject/Help/*.hhc
DocProject/Help/*.hhk
DocProject/Help/*.hhp
DocProject/Help/Html2
DocProject/Help/html

# Click-Once directory
publish/

# Publish Web Output
*.Publish.xml

# NuGet Packages Directory
## TODO: If you have NuGet Package Restore enabled, uncomment the next line
packages/

# Windows Azure Build Output
csx
*.build.csdef

# Windows Store app package directory
AppPackages/

# Others
sql/
*.Cache
ClientBin/
[Ss]tyle[Cc]op.*
~$*
*~
*.dbmdl
*.[Pp]ublish.xml
*.pfx
*.publishsettings

# RIA/Silverlight projects
Generated_Code/

# Backup & report files from converting an old project file to a newer
# Visual Studio version. Backup files are not needed, because we have git ;-)
_UpgradeReport_Files/
Backup*/
UpgradeLog*.XML
UpgradeLog*.htm

# SQL Server files
App_Data/*.mdf
App_Data/*.ldf


#LightSwitch generated files
GeneratedArtifacts/
_Pvt_Extensions/
ModelManifest.xml

# =========================
# Windows detritus
# =========================

# Windows image file caches
Thumbs.db
ehthumbs.db

# Folder config file
Desktop.ini

# Recycle Bin used on file shares
$RECYCLE.BIN/

# Mac desktop service store files
.DS_Store
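
# When a pattern list this long misbehaves, `git check-ignore -v` reports
# which rule matched a path. A minimal sketch against two representative
# patterns from the list above (the file names are made up):
#
#     tmp=$(mktemp -d) && cd "$tmp"
#     git init -q .
#
#     # Two representative patterns from the list above.
#     printf '[Bb]in/\n*.suo\n' > .gitignore
#
#     mkdir -p bin && touch bin/out.dll app.suo
#
#     # -v prints "<source>:<line>:<pattern><TAB><path>" for each match.
#     git check-ignore -v bin/out.dll app.suo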


================================================
FILE: .paket/Paket.Restore.targets
================================================
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <!-- Prevent the dotnet template engine from parsing this file -->
  <!--/-:cnd:noEmit-->
  <PropertyGroup>
    <!-- make MSBuild track this file for incremental builds. -->
    <!-- ref https://blogs.msdn.microsoft.com/msbuild/2005/09/26/how-to-ensure-changes-to-a-custom-target-file-prompt-a-rebuild/ -->
    <MSBuildAllProjects>$(MSBuildAllProjects);$(MSBuildThisFileFullPath)</MSBuildAllProjects>

    <DetectedMSBuildVersion>$(MSBuildVersion)</DetectedMSBuildVersion>
    <DetectedMSBuildVersion Condition="'$(MSBuildVersion)' == ''">15.0.0</DetectedMSBuildVersion>
    <MSBuildSupportsHashing>false</MSBuildSupportsHashing>
    <MSBuildSupportsHashing Condition=" '$(DetectedMSBuildVersion)' &gt; '15.8.0' ">true</MSBuildSupportsHashing>
    <!-- Mark that this target file has been loaded.  -->
    <IsPaketRestoreTargetsFileLoaded>true</IsPaketRestoreTargetsFileLoaded>
    <PaketToolsPath>$(MSBuildThisFileDirectory)</PaketToolsPath>
    <PaketRootPath>$(MSBuildThisFileDirectory)..\</PaketRootPath>
    <PaketRestoreCacheFile>$(PaketRootPath)paket-files\paket.restore.cached</PaketRestoreCacheFile>
    <PaketLockFilePath>$(PaketRootPath)paket.lock</PaketLockFilePath>
    <PaketBootstrapperStyle>classic</PaketBootstrapperStyle>
    <PaketBootstrapperStyle Condition="Exists('$(PaketToolsPath)paket.bootstrapper.proj')">proj</PaketBootstrapperStyle>
    <PaketExeImage>assembly</PaketExeImage>
    <PaketExeImage Condition=" '$(PaketBootstrapperStyle)' == 'proj' ">native</PaketExeImage>
    <MonoPath Condition="'$(MonoPath)' == '' AND Exists('/Library/Frameworks/Mono.framework/Commands/mono')">/Library/Frameworks/Mono.framework/Commands/mono</MonoPath>
    <MonoPath Condition="'$(MonoPath)' == ''">mono</MonoPath>

    <!-- PaketBootStrapper  -->
    <PaketBootStrapperExePath Condition=" '$(PaketBootStrapperExePath)' == '' AND Exists('$(PaketRootPath)paket.bootstrapper.exe')">$(PaketRootPath)paket.bootstrapper.exe</PaketBootStrapperExePath>
    <PaketBootStrapperExePath Condition=" '$(PaketBootStrapperExePath)' == '' ">$(PaketToolsPath)paket.bootstrapper.exe</PaketBootStrapperExePath>
    <PaketBootStrapperExeDir Condition=" Exists('$(PaketBootStrapperExePath)') " >$([System.IO.Path]::GetDirectoryName("$(PaketBootStrapperExePath)"))\</PaketBootStrapperExeDir>

    <PaketBootStrapperCommand Condition=" '$(OS)' == 'Windows_NT' ">"$(PaketBootStrapperExePath)"</PaketBootStrapperCommand>
    <PaketBootStrapperCommand Condition=" '$(OS)' != 'Windows_NT' ">$(MonoPath) --runtime=v4.0.30319 "$(PaketBootStrapperExePath)"</PaketBootStrapperCommand>

    <!-- Disable automagic references for F# DotNet SDK -->
    <!-- This will not do anything for other project types -->
    <!-- see https://github.com/fsharp/fslang-design/blob/master/tooling/FST-1002-fsharp-in-dotnet-sdk.md -->
    <DisableImplicitFSharpCoreReference>true</DisableImplicitFSharpCoreReference>
    <DisableImplicitSystemValueTupleReference>true</DisableImplicitSystemValueTupleReference>

    <!-- Disable Paket restore under NCrunch build -->
    <PaketRestoreDisabled Condition="'$(NCrunch)' == '1'">True</PaketRestoreDisabled>

    <!-- Disable test for CLI tool completely - overrideable via properties in projects or via environment variables -->
    <PaketDisableCliTest Condition=" '$(PaketDisableCliTest)' == '' ">False</PaketDisableCliTest>

    <PaketIntermediateOutputPath Condition=" '$(PaketIntermediateOutputPath)' == '' ">$(BaseIntermediateOutputPath.TrimEnd('\').TrimEnd('\/'))</PaketIntermediateOutputPath>
  </PropertyGroup>

  <!-- Resolve how paket should be called -->
  <!-- Current priority is: local (1: repo root, 2: .paket folder) => 3: as CLI tool => as bootstrapper (4: proj Bootstrapper style, 5: BootstrapperExeDir) => 6: global path variable -->
  <Target Name="SetPaketCommand" >
    <!-- Test if paket is available in the standard locations. If so, that takes priority. Case 1/2 - non-windows specific -->
    <PropertyGroup Condition=" '$(OS)' != 'Windows_NT' ">
      <!-- no windows, try native paket as default, root => tool -->
      <PaketExePath Condition=" '$(PaketExePath)' == '' AND Exists('$(PaketRootPath)paket') ">$(PaketRootPath)paket</PaketExePath>
      <PaketExePath Condition=" '$(PaketExePath)' == '' AND Exists('$(PaketToolsPath)paket') ">$(PaketToolsPath)paket</PaketExePath>
    </PropertyGroup>

    <!-- Test if paket is available in the standard locations. If so, that takes priority. Case 2/2 - same across platforms -->
    <PropertyGroup>
      <!-- root => tool -->
      <PaketExePath Condition=" '$(PaketExePath)' == '' AND Exists('$(PaketRootPath)paket.exe') ">$(PaketRootPath)paket.exe</PaketExePath>
      <PaketExePath Condition=" '$(PaketExePath)' == '' AND Exists('$(PaketToolsPath)paket.exe') ">$(PaketToolsPath)paket.exe</PaketExePath>
    </PropertyGroup>

    <!-- If paket hasn't been found in standard locations, test for CLI tool usage. -->
    <!-- First test: Is CLI configured to be used in "dotnet-tools.json"? - can result in a false negative; only a positive outcome is reliable. -->
    <PropertyGroup Condition=" '$(PaketExePath)' == '' ">
      <_DotnetToolsJson Condition="Exists('$(PaketRootPath)/.config/dotnet-tools.json')">$([System.IO.File]::ReadAllText("$(PaketRootPath)/.config/dotnet-tools.json"))</_DotnetToolsJson>
      <_ConfigContainsPaket Condition=" '$(_DotnetToolsJson)' != ''">$(_DotnetToolsJson.Contains('"paket"'))</_ConfigContainsPaket>
      <_ConfigContainsPaket Condition=" '$(_ConfigContainsPaket)' == ''">false</_ConfigContainsPaket>
    </PropertyGroup>

    <!-- Second test: Call 'dotnet paket' and see if it returns without an error. Mute all the output. Only run if previous test failed and the test has not been disabled. -->
    <!-- WARNING: This method can lead to processes hanging forever, and should be used as little as possible. See https://github.com/fsprojects/Paket/issues/3705 for details. -->
    <Exec Condition=" '$(PaketExePath)' == '' AND !$(PaketDisableCliTest) AND !$(_ConfigContainsPaket)" Command="dotnet paket --version" IgnoreExitCode="true" StandardOutputImportance="low" StandardErrorImportance="low" >
      <Output TaskParameter="ExitCode" PropertyName="LocalPaketToolExitCode" />
    </Exec>

    <!-- If paket is installed as a CLI tool, use that. Again, only if paket hasn't already been found in standard locations. -->
    <PropertyGroup Condition=" '$(PaketExePath)' == '' AND ($(_ConfigContainsPaket) OR '$(LocalPaketToolExitCode)' == '0') ">
      <_PaketCommand>dotnet paket</_PaketCommand>
    </PropertyGroup>

    <!-- If neither local files nor the CLI tool can be found, the final attempt is searching for a bootstrapper config before falling back to the global path variable. -->
    <PropertyGroup Condition=" '$(PaketExePath)' == '' AND '$(_PaketCommand)' == '' ">
      <!-- Test for bootstrapper setup -->
      <PaketExePath Condition=" '$(PaketExePath)' == '' AND '$(PaketBootstrapperStyle)' == 'proj' ">$(PaketToolsPath)paket</PaketExePath>
      <PaketExePath Condition=" '$(PaketExePath)' == '' AND Exists('$(PaketBootStrapperExeDir)') ">$(PaketBootStrapperExeDir)paket</PaketExePath>

      <!-- If all else fails, use global path approach. -->
      <PaketExePath Condition=" '$(PaketExePath)' == ''">paket</PaketExePath>
    </PropertyGroup>

    <!-- If not using CLI, setup correct execution command. -->
    <PropertyGroup Condition=" '$(_PaketCommand)' == '' ">
      <_PaketExeExtension>$([System.IO.Path]::GetExtension("$(PaketExePath)"))</_PaketExeExtension>
      <_PaketCommand Condition=" '$(_PaketCommand)' == '' AND '$(_PaketExeExtension)' == '.dll' ">dotnet "$(PaketExePath)"</_PaketCommand>
      <_PaketCommand Condition=" '$(_PaketCommand)' == '' AND '$(OS)' != 'Windows_NT' AND '$(_PaketExeExtension)' == '.exe' ">$(MonoPath) --runtime=v4.0.30319 "$(PaketExePath)"</_PaketCommand>
      <_PaketCommand Condition=" '$(_PaketCommand)' == '' ">"$(PaketExePath)"</_PaketCommand>
    </PropertyGroup>

    <!-- The way to get a property to be available outside the target is to use this task. -->
    <CreateProperty Value="$(_PaketCommand)">
      <Output TaskParameter="Value" PropertyName="PaketCommand"/>
    </CreateProperty>

  </Target>

  <Target Name="PaketBootstrapping" Condition="Exists('$(PaketToolsPath)paket.bootstrapper.proj')">
    <MSBuild Projects="$(PaketToolsPath)paket.bootstrapper.proj" Targets="Restore" />
  </Target>

  <!-- Official workaround for https://docs.microsoft.com/en-us/visualstudio/msbuild/getfilehash-task?view=vs-2019 -->
  <UsingTask TaskName="Microsoft.Build.Tasks.GetFileHash" AssemblyName="Microsoft.Build.Tasks.Core, Version=15.1.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" Condition=" '$(MSBuildSupportsHashing)' == 'true' And '$(DetectedMSBuildVersion)' &lt; '16.0.360' " />
  <UsingTask TaskName="Microsoft.Build.Tasks.VerifyFileHash" AssemblyName="Microsoft.Build.Tasks.Core, Version=15.1.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" Condition=" '$(MSBuildSupportsHashing)' == 'true' And '$(DetectedMSBuildVersion)' &lt; '16.0.360' " />
  <Target Name="PaketRestore" Condition="'$(PaketRestoreDisabled)' != 'True'" BeforeTargets="_GenerateDotnetCliToolReferenceSpecs;_GenerateProjectRestoreGraphPerFramework;_GenerateRestoreGraphWalkPerFramework;CollectPackageReferences" DependsOnTargets="SetPaketCommand;PaketBootstrapping">

    <!-- Step 1 Check if lockfile is properly restored (if the hash of the lockfile and the cache-file match) -->
    <PropertyGroup>
      <PaketRestoreRequired>true</PaketRestoreRequired>
      <NoWarn>$(NoWarn);NU1603;NU1604;NU1605;NU1608</NoWarn>
      <CacheFilesExist>false</CacheFilesExist>
      <CacheFilesExist Condition=" Exists('$(PaketRestoreCacheFile)') And Exists('$(PaketLockFilePath)') ">true</CacheFilesExist>
    </PropertyGroup>

    <!-- Read the hash of the lockfile -->
    <GetFileHash Condition=" '$(MSBuildSupportsHashing)' == 'true' And '$(CacheFilesExist)' == 'true' " Files="$(PaketLockFilePath)" Algorithm="SHA256" HashEncoding="hex" >
      <Output TaskParameter="Hash" PropertyName="PaketRestoreLockFileHash" />
    </GetFileHash>
    <!-- Read the hash of the cache, which is json, but a very simple key value object -->
    <PropertyGroup Condition=" '$(MSBuildSupportsHashing)' == 'true' And '$(CacheFilesExist)' == 'true' ">
        <PaketRestoreCachedContents>$([System.IO.File]::ReadAllText('$(PaketRestoreCacheFile)'))</PaketRestoreCachedContents>
    </PropertyGroup>
    <ItemGroup Condition=" '$(MSBuildSupportsHashing)' == 'true' And '$(CacheFilesExist)' == 'true' ">
        <!-- Parse our simple 'paket.restore.cached' json ...-->
        <PaketRestoreCachedSplitObject Include="$([System.Text.RegularExpressions.Regex]::Split(`$(PaketRestoreCachedContents)`, `{|}|,`))"></PaketRestoreCachedSplitObject>
        <!-- Keep Key, Value ItemGroup-->
        <PaketRestoreCachedKeyValue Include="@(PaketRestoreCachedSplitObject)"
            Condition=" $([System.Text.RegularExpressions.Regex]::Split(`%(Identity)`, `&quot;: &quot;`).Length) &gt; 1 ">
          <Key>$([System.Text.RegularExpressions.Regex]::Split(`%(Identity)`, `": "`)[0].Replace(`"`, ``).Replace(` `, ``))</Key>
          <Value>$([System.Text.RegularExpressions.Regex]::Split(`%(Identity)`, `": "`)[1].Replace(`"`, ``).Replace(` `, ``))</Value>
        </PaketRestoreCachedKeyValue>
    </ItemGroup>
    <PropertyGroup Condition=" '$(MSBuildSupportsHashing)' == 'true' And '$(CacheFilesExist)' == 'true' ">
        <!-- Retrieve the hashes we are interested in -->
        <PackagesDownloadedHash Condition=" '%(PaketRestoreCachedKeyValue.Key)' == 'packagesDownloadedHash' ">%(PaketRestoreCachedKeyValue.Value)</PackagesDownloadedHash>
        <ProjectsRestoredHash Condition=" '%(PaketRestoreCachedKeyValue.Key)' == 'projectsRestoredHash' ">%(PaketRestoreCachedKeyValue.Value)</ProjectsRestoredHash>
    </PropertyGroup>

    <PropertyGroup Condition=" '$(MSBuildSupportsHashing)' == 'true' And '$(CacheFilesExist)' == 'true' ">
      <!-- If the restore file doesn't exist we need to restore, otherwise only if hashes don't match -->
      <PaketRestoreRequired>true</PaketRestoreRequired>
      <PaketRestoreRequired Condition=" '$(PaketRestoreLockFileHash)' == '$(ProjectsRestoredHash)' ">false</PaketRestoreRequired>
      <PaketRestoreRequired Condition=" '$(PaketRestoreLockFileHash)' == '' ">true</PaketRestoreRequired>
    </PropertyGroup>

	<!--
		This value should match the version in the props generated by paket
		If they differ, this means we need to do a restore in order to ensure correct dependencies
	-->
    <PropertyGroup Condition="'$(PaketPropsVersion)' != '5.185.3' ">
      <PaketRestoreRequired>true</PaketRestoreRequired>
    </PropertyGroup>

    <!-- Do a global restore if required -->
    <Warning Text="This version of MSBuild (we assume '$(DetectedMSBuildVersion)' or older) doesn't support GetFileHash, so paket fast restore is disabled." Condition=" '$(MSBuildSupportsHashing)' != 'true' " />
    <Error Text="Stop build because of PAKET_ERROR_ON_MSBUILD_EXEC and we always call the bootstrapper" Condition=" '$(PAKET_ERROR_ON_MSBUILD_EXEC)' == 'true' AND '$(PaketBootstrapperStyle)' == 'classic' AND Exists('$(PaketBootStrapperExePath)') AND !(Exists('$(PaketExePath)'))" />
    <Exec Command='$(PaketBootStrapperCommand)' Condition=" '$(PaketBootstrapperStyle)' == 'classic' AND Exists('$(PaketBootStrapperExePath)') AND !(Exists('$(PaketExePath)'))" ContinueOnError="false" />
    <Error Text="Stop build because of PAKET_ERROR_ON_MSBUILD_EXEC and we need a full restore (hashes don't match)" Condition=" '$(PAKET_ERROR_ON_MSBUILD_EXEC)' == 'true' AND '$(PaketRestoreRequired)' == 'true' AND '$(PaketDisableGlobalRestore)' != 'true'" />
    <Exec Command='$(PaketCommand) restore' Condition=" '$(PaketRestoreRequired)' == 'true' AND '$(PaketDisableGlobalRestore)' != 'true' " ContinueOnError="false" />

    <!-- Step 2 Detect project specific changes -->
    <ItemGroup>
      <MyTargetFrameworks Condition="'$(TargetFramework)' != '' " Include="$(TargetFramework)"></MyTargetFrameworks>
      <!-- Don't include all frameworks when msbuild explicitly asks for a single one -->
      <MyTargetFrameworks Condition="'$(TargetFrameworks)' != '' AND '$(TargetFramework)' == '' " Include="$(TargetFrameworks)"></MyTargetFrameworks>
      <PaketResolvedFilePaths Include="@(MyTargetFrameworks -> '$(PaketIntermediateOutputPath)\$(MSBuildProjectFile).%(Identity).paket.resolved')"></PaketResolvedFilePaths>
    </ItemGroup>

    <PropertyGroup>
      <PaketReferencesCachedFilePath>$(PaketIntermediateOutputPath)\$(MSBuildProjectFile).paket.references.cached</PaketReferencesCachedFilePath>
      <!-- MyProject.fsproj.paket.references has the highest precedence -->
      <PaketOriginalReferencesFilePath>$(MSBuildProjectFullPath).paket.references</PaketOriginalReferencesFilePath>
      <!-- MyProject.paket.references -->
      <PaketOriginalReferencesFilePath Condition=" !Exists('$(PaketOriginalReferencesFilePath)')">$(MSBuildProjectDirectory)\$(MSBuildProjectName).paket.references</PaketOriginalReferencesFilePath>
      <!-- paket.references -->
      <PaketOriginalReferencesFilePath Condition=" !Exists('$(PaketOriginalReferencesFilePath)')">$(MSBuildProjectDirectory)\paket.references</PaketOriginalReferencesFilePath>

      <DoAllResolvedFilesExist>false</DoAllResolvedFilesExist>
      <DoAllResolvedFilesExist Condition="Exists(%(PaketResolvedFilePaths.Identity))">true</DoAllResolvedFilesExist>
      <PaketRestoreRequired>true</PaketRestoreRequired>
      <PaketRestoreRequiredReason>references-file-or-cache-not-found</PaketRestoreRequiredReason>
    </PropertyGroup>

    <!-- Step 2 a Detect changes in references file -->
    <PropertyGroup Condition="Exists('$(PaketOriginalReferencesFilePath)') AND Exists('$(PaketReferencesCachedFilePath)') ">
      <PaketRestoreCachedHash>$([System.IO.File]::ReadAllText('$(PaketReferencesCachedFilePath)'))</PaketRestoreCachedHash>
      <PaketRestoreReferencesFileHash>$([System.IO.File]::ReadAllText('$(PaketOriginalReferencesFilePath)'))</PaketRestoreReferencesFileHash>
      <PaketRestoreRequiredReason>references-file</PaketRestoreRequiredReason>
      <PaketRestoreRequired Condition=" '$(PaketRestoreReferencesFileHash)' == '$(PaketRestoreCachedHash)' ">false</PaketRestoreRequired>
    </PropertyGroup>

    <PropertyGroup Condition="!Exists('$(PaketOriginalReferencesFilePath)') AND !Exists('$(PaketReferencesCachedFilePath)') ">
      <!-- If both don't exist there is nothing to do. -->
      <PaketRestoreRequired>false</PaketRestoreRequired>
    </PropertyGroup>

    <!-- Step 2 b detect relevant changes in project file (new targetframework) -->
    <PropertyGroup Condition=" '$(DoAllResolvedFilesExist)' != 'true' ">
      <PaketRestoreRequired>true</PaketRestoreRequired>
      <PaketRestoreRequiredReason>target-framework '$(TargetFramework)' or '$(TargetFrameworks)' files @(PaketResolvedFilePaths)</PaketRestoreRequiredReason>
    </PropertyGroup>

    <!-- Step 3 Restore project specific stuff if required -->
    <Message Condition=" '$(PaketRestoreRequired)' == 'true' " Importance="low" Text="Detected a change ('$(PaketRestoreRequiredReason)') in the project file '$(MSBuildProjectFullPath)', calling paket restore" />
    <Error Text="Stop build because of PAKET_ERROR_ON_MSBUILD_EXEC and we detected a change ('$(PaketRestoreRequiredReason)') in the project file '$(MSBuildProjectFullPath)'" Condition=" '$(PAKET_ERROR_ON_MSBUILD_EXEC)' == 'true' AND '$(PaketRestoreRequired)' == 'true' " />
    <Exec Command='$(PaketCommand) restore --project "$(MSBuildProjectFullPath)" --output-path "$(PaketIntermediateOutputPath)" --target-framework "$(TargetFrameworks)"' Condition=" '$(PaketRestoreRequired)' == 'true' AND '$(TargetFramework)' == '' " ContinueOnError="false" />
    <Exec Command='$(PaketCommand) restore --project "$(MSBuildProjectFullPath)" --output-path "$(PaketIntermediateOutputPath)" --target-framework "$(TargetFramework)"' Condition=" '$(PaketRestoreRequired)' == 'true' AND '$(TargetFramework)' != '' " ContinueOnError="false" />

    <!-- This shouldn't actually happen, but just to be sure. -->
    <PropertyGroup>
      <DoAllResolvedFilesExist>false</DoAllResolvedFilesExist>
      <DoAllResolvedFilesExist Condition="Exists(%(PaketResolvedFilePaths.Identity))">true</DoAllResolvedFilesExist>
    </PropertyGroup>
    <Error Condition=" '$(DoAllResolvedFilesExist)' != 'true' AND '$(ResolveNuGetPackages)' != 'False' " Text="One Paket file '@(PaketResolvedFilePaths)' is missing while restoring $(MSBuildProjectFile). Please delete 'paket-files/paket.restore.cached' and call 'paket restore'." />

    <!-- Step 4 forward all msbuild properties (PackageReference, DotNetCliToolReference) to msbuild -->
    <ReadLinesFromFile Condition="($(DesignTimeBuild) != true OR '$(PaketPropsLoaded)' != 'true') AND '@(PaketResolvedFilePaths)' != ''" File="%(PaketResolvedFilePaths.Identity)" >
      <Output TaskParameter="Lines" ItemName="PaketReferencesFileLines"/>
    </ReadLinesFromFile>

    <ItemGroup Condition="($(DesignTimeBuild) != true OR '$(PaketPropsLoaded)' != 'true') AND '@(PaketReferencesFileLines)' != '' " >
      <PaketReferencesFileLinesInfo Include="@(PaketReferencesFileLines)" >
        <Splits>$([System.String]::Copy('%(PaketReferencesFileLines.Identity)').Split(',').Length)</Splits>
        <PackageName>$([System.String]::Copy('%(PaketReferencesFileLines.Identity)').Split(',')[0])</PackageName>
        <PackageVersion>$([System.String]::Copy('%(PaketReferencesFileLines.Identity)').Split(',')[1])</PackageVersion>
        <AllPrivateAssets>$([System.String]::Copy('%(PaketReferencesFileLines.Identity)').Split(',')[4])</AllPrivateAssets>
        <CopyLocal Condition="'%(PaketReferencesFileLinesInfo.Splits)' == '6'">$([System.String]::Copy('%(PaketReferencesFileLines.Identity)').Split(',')[5])</CopyLocal>
      </PaketReferencesFileLinesInfo>
      <PackageReference Include="%(PaketReferencesFileLinesInfo.PackageName)">
        <Version>%(PaketReferencesFileLinesInfo.PackageVersion)</Version>
        <PrivateAssets Condition=" ('%(PaketReferencesFileLinesInfo.AllPrivateAssets)' == 'true') Or ('$(PackAsTool)' == 'true') ">All</PrivateAssets>
        <ExcludeAssets Condition=" '%(PaketReferencesFileLinesInfo.Splits)' == '6' And %(PaketReferencesFileLinesInfo.CopyLocal) == 'false'">runtime</ExcludeAssets>
        <ExcludeAssets Condition=" '%(PaketReferencesFileLinesInfo.Splits)' != '6' And %(PaketReferencesFileLinesInfo.AllPrivateAssets) == 'exclude'">runtime</ExcludeAssets>
        <Publish Condition=" '$(PackAsTool)' == 'true' ">true</Publish>
        <AllowExplicitVersion>true</AllowExplicitVersion>
      </PackageReference>
    </ItemGroup>

    <PropertyGroup>
      <PaketCliToolFilePath>$(PaketIntermediateOutputPath)/$(MSBuildProjectFile).paket.clitools</PaketCliToolFilePath>
    </PropertyGroup>

    <ReadLinesFromFile File="$(PaketCliToolFilePath)" >
      <Output TaskParameter="Lines" ItemName="PaketCliToolFileLines"/>
    </ReadLinesFromFile>

    <ItemGroup Condition=" '@(PaketCliToolFileLines)' != '' " >
      <PaketCliToolFileLinesInfo Include="@(PaketCliToolFileLines)" >
        <PackageName>$([System.String]::Copy('%(PaketCliToolFileLines.Identity)').Split(',')[0])</PackageName>
        <PackageVersion>$([System.String]::Copy('%(PaketCliToolFileLines.Identity)').Split(',')[1])</PackageVersion>
      </PaketCliToolFileLinesInfo>
      <DotNetCliToolReference Include="%(PaketCliToolFileLinesInfo.PackageName)">
        <Version>%(PaketCliToolFileLinesInfo.PackageVersion)</Version>
      </DotNetCliToolReference>
    </ItemGroup>

    <!-- Disabled for now until we know what to do with runtime deps - https://github.com/fsprojects/Paket/issues/2964
    <PropertyGroup>
      <RestoreConfigFile>$(PaketIntermediateOutputPath)/$(MSBuildProjectFile).NuGet.Config</RestoreConfigFile>
    </PropertyGroup> -->

  </Target>

  <Target Name="PaketDisableDirectPack" AfterTargets="_IntermediatePack" BeforeTargets="GenerateNuspec" Condition="('$(IsPackable)' == '' Or '$(IsPackable)' == 'true') And Exists('$(PaketIntermediateOutputPath)/$(MSBuildProjectFile).references')" >
    <PropertyGroup>
      <ContinuePackingAfterGeneratingNuspec>false</ContinuePackingAfterGeneratingNuspec>
    </PropertyGroup>
  </Target>

  <Target Name="PaketOverrideNuspec" DependsOnTargets="SetPaketCommand" AfterTargets="GenerateNuspec" Condition="('$(IsPackable)' == '' Or '$(IsPackable)' == 'true') And Exists('$(PaketIntermediateOutputPath)/$(MSBuildProjectFile).references')" >
    <ItemGroup>
      <_NuspecFilesNewLocation Include="$(PaketIntermediateOutputPath)\$(Configuration)\*.nuspec"/>
      <MSBuildMajorVersion Include="$(DetectedMSBuildVersion.Replace(`-`, `.`).Split(`.`)[0])" />
      <MSBuildMinorVersion Include="$(DetectedMSBuildVersion.Replace(`-`, `.`).Split(`.`)[1])" />
    </ItemGroup>

    <PropertyGroup>
      <PaketProjectFile>$(MSBuildProjectDirectory)/$(MSBuildProjectFile)</PaketProjectFile>
      <ContinuePackingAfterGeneratingNuspec>true</ContinuePackingAfterGeneratingNuspec>
      <UseMSBuild16_0_Pack>false</UseMSBuild16_0_Pack>
      <UseMSBuild16_0_Pack Condition=" '@(MSBuildMajorVersion)' >= '16' ">true</UseMSBuild16_0_Pack>
      <UseMSBuild15_9_Pack>false</UseMSBuild15_9_Pack>
      <UseMSBuild15_9_Pack Condition=" '@(MSBuildMajorVersion)' == '15' AND '@(MSBuildMinorVersion)' > '8' ">true</UseMSBuild15_9_Pack>
      <UseMSBuild15_8_Pack>false</UseMSBuild15_8_Pack>
      <UseMSBuild15_8_Pack Condition=" '$(NuGetToolVersion)' != '4.0.0' AND (! $(UseMSBuild15_9_Pack)) AND (! $(UseMSBuild16_0_Pack)) ">true</UseMSBuild15_8_Pack>
      <UseNuGet4_Pack>false</UseNuGet4_Pack>
      <UseNuGet4_Pack Condition=" (! $(UseMSBuild15_8_Pack)) AND (! $(UseMSBuild15_9_Pack)) AND (! $(UseMSBuild16_0_Pack)) ">true</UseNuGet4_Pack>
      <AdjustedNuspecOutputPath>$(PaketIntermediateOutputPath)\$(Configuration)</AdjustedNuspecOutputPath>
      <AdjustedNuspecOutputPath Condition="@(_NuspecFilesNewLocation) == ''">$(PaketIntermediateOutputPath)</AdjustedNuspecOutputPath>
    </PropertyGroup>

    <ItemGroup>
      <_NuspecFiles Include="$(AdjustedNuspecOutputPath)\*.$(PackageVersion.Split(`+`)[0]).nuspec"/>
    </ItemGroup>

    <Error Text="Error Because of PAKET_ERROR_ON_MSBUILD_EXEC (not calling fix-nuspecs)" Condition=" '$(PAKET_ERROR_ON_MSBUILD_EXEC)' == 'true' " />
    <Exec Condition="@(_NuspecFiles) != ''" Command='$(PaketCommand) fix-nuspecs files "@(_NuspecFiles)" project-file "$(PaketProjectFile)" ' />
    <Error Condition="@(_NuspecFiles) == ''" Text='Could not find nuspec files in "$(AdjustedNuspecOutputPath)" (Version: "$(PackageVersion)"), therefore we cannot call "paket fix-nuspecs" and have to error out!' />

    <ConvertToAbsolutePath Condition="@(_NuspecFiles) != ''" Paths="@(_NuspecFiles)">
      <Output TaskParameter="AbsolutePaths" PropertyName="NuspecFileAbsolutePath" />
    </ConvertToAbsolutePath>

    <!-- Call Pack -->
    <PackTask Condition="$(UseMSBuild16_0_Pack)"
              PackItem="$(PackProjectInputFile)"
              PackageFiles="@(_PackageFiles)"
              PackageFilesToExclude="@(_PackageFilesToExclude)"
              PackageVersion="$(PackageVersion)"
              PackageId="$(PackageId)"
              Title="$(Title)"
              Authors="$(Authors)"
              Description="$(Description)"
              Copyright="$(Copyright)"
              RequireLicenseAcceptance="$(PackageRequireLicenseAcceptance)"
              LicenseUrl="$(PackageLicenseUrl)"
              ProjectUrl="$(PackageProjectUrl)"
              IconUrl="$(PackageIconUrl)"
              ReleaseNotes="$(PackageReleaseNotes)"
              Tags="$(PackageTags)"
              DevelopmentDependency="$(DevelopmentDependency)"
              BuildOutputInPackage="@(_BuildOutputInPackage)"
              TargetPathsToSymbols="@(_TargetPathsToSymbols)"
              SymbolPackageFormat="$(SymbolPackageFormat)"
              TargetFrameworks="@(_TargetFrameworks)"
              AssemblyName="$(AssemblyName)"
              PackageOutputPath="$(PackageOutputAbsolutePath)"
              IncludeSymbols="$(IncludeSymbols)"
              IncludeSource="$(IncludeSource)"
              PackageTypes="$(PackageType)"
              IsTool="$(IsTool)"
              RepositoryUrl="$(RepositoryUrl)"
              RepositoryType="$(RepositoryType)"
              SourceFiles="@(_SourceFiles->Distinct())"
              NoPackageAnalysis="$(NoPackageAnalysis)"
              MinClientVersion="$(MinClientVersion)"
              Serviceable="$(Serviceable)"
              FrameworkAssemblyReferences="@(_FrameworkAssemblyReferences)"
              ContinuePackingAfterGeneratingNuspec="$(ContinuePackingAfterGeneratingNuspec)"
              NuspecOutputPath="$(AdjustedNuspecOutputPath)"
              IncludeBuildOutput="$(IncludeBuildOutput)"
              BuildOutputFolders="$(BuildOutputTargetFolder)"
              ContentTargetFolders="$(ContentTargetFolders)"
              RestoreOutputPath="$(RestoreOutputAbsolutePath)"
              NuspecFile="$(NuspecFileAbsolutePath)"
              NuspecBasePath="$(NuspecBasePath)"
              NuspecProperties="$(NuspecProperties)"
              PackageLicenseFile="$(PackageLicenseFile)"
              PackageLicenseExpression="$(PackageLicenseExpression)"
              PackageLicenseExpressionVersion="$(PackageLicenseExpressionVersion)" />

    <PackTask Condition="$(UseMSBuild15_9_Pack)"
              PackItem="$(PackProjectInputFile)"
              PackageFiles="@(_PackageFiles)"
              PackageFilesToExclude="@(_PackageFilesToExclude)"
              PackageVersion="$(PackageVersion)"
              PackageId="$(PackageId)"
              Title="$(Title)"
              Authors="$(Authors)"
              Description="$(Description)"
              Copyright="$(Copyright)"
              RequireLicenseAcceptance="$(PackageRequireLicenseAcceptance)"
              LicenseUrl="$(PackageLicenseUrl)"
              ProjectUrl="$(PackageProjectUrl)"
              IconUrl="$(PackageIconUrl)"
              ReleaseNotes="$(PackageReleaseNotes)"
              Tags="$(PackageTags)"
              DevelopmentDependency="$(DevelopmentDependency)"
              BuildOutputInPackage="@(_BuildOutputInPackage)"
              TargetPathsToSymbols="@(_TargetPathsToSymbols)"
              SymbolPackageFormat="$(SymbolPackageFormat)"
              TargetFrameworks="@(_TargetFrameworks)"
              AssemblyName="$(AssemblyName)"
              PackageOutputPath="$(PackageOutputAbsolutePath)"
              IncludeSymbols="$(IncludeSymbols)"
              IncludeSource="$(IncludeSource)"
              PackageTypes="$(PackageType)"
              IsTool="$(IsTool)"
              RepositoryUrl="$(RepositoryUrl)"
              RepositoryType="$(RepositoryType)"
              SourceFiles="@(_SourceFiles->Distinct())"
              NoPackageAnalysis="$(NoPackageAnalysis)"
              MinClientVersion="$(MinClientVersion)"
              Serviceable="$(Serviceable)"
              FrameworkAssemblyReferences="@(_FrameworkAssemblyReferences)"
              ContinuePackingAfterGeneratingNuspec="$(ContinuePackingAfterGeneratingNuspec)"
              NuspecOutputPath="$(AdjustedNuspecOutputPath)"
              IncludeBuildOutput="$(IncludeBuildOutput)"
              BuildOutputFolder="$(BuildOutputTargetFolder)"
              ContentTargetFolders="$(ContentTargetFolders)"
              RestoreOutputPath="$(RestoreOutputAbsolutePath)"
              NuspecFile="$(NuspecFileAbsolutePath)"
              NuspecBasePath="$(NuspecBasePath)"
              NuspecProperties="$(NuspecProperties)"/>

    <PackTask Condition="$(UseMSBuild15_8_Pack)"
              PackItem="$(PackProjectInputFile)"
              PackageFiles="@(_PackageFiles)"
              PackageFilesToExclude="@(_PackageFilesToExclude)"
              PackageVersion="$(PackageVersion)"
              PackageId="$(PackageId)"
              Title="$(Title)"
              Authors="$(Authors)"
              Description="$(Description)"
              Copyright="$(Copyright)"
              RequireLicenseAcceptance="$(PackageRequireLicenseAcceptance)"
              LicenseUrl="$(PackageLicenseUrl)"
              ProjectUrl="$(PackageProjectUrl)"
              IconUrl="$(PackageIconUrl)"
              ReleaseNotes="$(PackageReleaseNotes)"
              Tags="$(PackageTags)"
              DevelopmentDependency="$(DevelopmentDependency)"
              BuildOutputInPackage="@(_BuildOutputInPackage)"
              TargetPathsToSymbols="@(_TargetPathsToSymbols)"
              TargetFrameworks="@(_TargetFrameworks)"
              AssemblyName="$(AssemblyName)"
              PackageOutputPath="$(PackageOutputAbsolutePath)"
              IncludeSymbols="$(IncludeSymbols)"
              IncludeSource="$(IncludeSource)"
              PackageTypes="$(PackageType)"
              IsTool="$(IsTool)"
              RepositoryUrl="$(RepositoryUrl)"
              RepositoryType="$(RepositoryType)"
              SourceFiles="@(_SourceFiles->Distinct())"
              NoPackageAnalysis="$(NoPackageAnalysis)"
              MinClientVersion="$(MinClientVersion)"
              Serviceable="$(Serviceable)"
              FrameworkAssemblyReferences="@(_FrameworkAssemblyReferences)"
              ContinuePackingAfterGeneratingNuspec="$(ContinuePackingAfterGeneratingNuspec)"
              NuspecOutputPath="$(AdjustedNuspecOutputPath)"
              IncludeBuildOutput="$(IncludeBuildOutput)"
              BuildOutputFolder="$(BuildOutputTargetFolder)"
              ContentTargetFolders="$(ContentTargetFolders)"
              RestoreOutputPath="$(RestoreOutputAbsolutePath)"
              NuspecFile="$(NuspecFileAbsolutePath)"
              NuspecBasePath="$(NuspecBasePath)"
              NuspecProperties="$(NuspecProperties)"/>

    <PackTask Condition="$(UseNuGet4_Pack)"
              PackItem="$(PackProjectInputFile)"
              PackageFiles="@(_PackageFiles)"
              PackageFilesToExclude="@(_PackageFilesToExclude)"
              PackageVersion="$(PackageVersion)"
              PackageId="$(PackageId)"
              Title="$(Title)"
              Authors="$(Authors)"
              Description="$(Description)"
              Copyright="$(Copyright)"
              RequireLicenseAcceptance="$(PackageRequireLicenseAcceptance)"
              LicenseUrl="$(PackageLicenseUrl)"
              ProjectUrl="$(PackageProjectUrl)"
              IconUrl="$(PackageIconUrl)"
              ReleaseNotes="$(PackageReleaseNotes)"
              Tags="$(PackageTags)"
              TargetPathsToAssemblies="@(_TargetPathsToAssemblies->'%(FinalOutputPath)')"
              TargetPathsToSymbols="@(_TargetPathsToSymbols)"
              TargetFrameworks="@(_TargetFrameworks)"
              AssemblyName="$(AssemblyName)"
              PackageOutputPath="$(PackageOutputAbsolutePath)"
              IncludeSymbols="$(IncludeSymbols)"
              IncludeSource="$(IncludeSource)"
              PackageTypes="$(PackageType)"
              IsTool="$(IsTool)"
              RepositoryUrl="$(RepositoryUrl)"
              RepositoryType="$(RepositoryType)"
              SourceFiles="@(_SourceFiles->Distinct())"
              NoPackageAnalysis="$(NoPackageAnalysis)"
              MinClientVersion="$(MinClientVersion)"
              Serviceable="$(Serviceable)"
              AssemblyReferences="@(_References)"
              ContinuePackingAfterGeneratingNuspec="$(ContinuePackingAfterGeneratingNuspec)"
              NuspecOutputPath="$(AdjustedNuspecOutputPath)"
              IncludeBuildOutput="$(IncludeBuildOutput)"
              BuildOutputFolder="$(BuildOutputTargetFolder)"
              ContentTargetFolders="$(ContentTargetFolders)"
              RestoreOutputPath="$(RestoreOutputAbsolutePath)"
              NuspecFile="$(NuspecFileAbsolutePath)"
              NuspecBasePath="$(NuspecBasePath)"
              NuspecProperties="$(NuspecProperties)"/>
  </Target>
  <!--/+:cnd:noEmit-->
</Project>


================================================
FILE: .paket/paket.targets
================================================
<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <!-- Enable the restore command to run before builds -->
    <RestorePackages Condition=" '$(RestorePackages)' == '' ">true</RestorePackages>
    <!-- Download Paket.exe if it does not already exist -->
    <DownloadPaket Condition=" '$(DownloadPaket)' == '' ">true</DownloadPaket>
    <PaketToolsPath>$(MSBuildThisFileDirectory)</PaketToolsPath>
    <PaketRootPath>$(MSBuildThisFileDirectory)..\</PaketRootPath>
  </PropertyGroup>
  <PropertyGroup>
    <!-- Paket command -->
    <PaketExePath Condition=" '$(PaketExePath)' == '' ">$(PaketToolsPath)paket.exe</PaketExePath>
    <PaketBootStrapperExePath Condition=" '$(PaketBootStrapperExePath)' == '' ">$(PaketToolsPath)paket.bootstrapper.exe</PaketBootStrapperExePath>
    <PaketCommand Condition=" '$(OS)' == 'Windows_NT'">"$(PaketExePath)"</PaketCommand>
    <PaketCommand Condition=" '$(OS)' != 'Windows_NT' ">mono --runtime=v4.0.30319 "$(PaketExePath)"</PaketCommand>
    <PaketBootStrapperCommand Condition=" '$(OS)' == 'Windows_NT'">"$(PaketBootStrapperExePath)"</PaketBootStrapperCommand>
    <PaketBootStrapperCommand Condition=" '$(OS)' != 'Windows_NT' ">mono --runtime=v4.0.30319 $(PaketBootStrapperExePath)</PaketBootStrapperCommand>
    <!-- Commands -->
    <RestoreCommand>$(PaketCommand) restore</RestoreCommand>
    <DownloadPaketCommand>$(PaketBootStrapperCommand)</DownloadPaketCommand>
    <!-- We need to ensure packages are restored prior to assembly resolve -->
    <BuildDependsOn Condition="$(RestorePackages) == 'true'">RestorePackages; $(BuildDependsOn);</BuildDependsOn>
  </PropertyGroup>
  <Target Name="CheckPrerequisites">
    <!-- Raise an error if we're unable to locate paket.exe -->
    <Error Condition="'$(DownloadPaket)' != 'true' AND !Exists('$(PaketExePath)')" Text="Unable to locate '$(PaketExePath)'" />
    <MsBuild Targets="DownloadPaket" Projects="$(MSBuildThisFileFullPath)" Properties="Configuration=NOT_IMPORTANT;DownloadPaket=$(DownloadPaket)" />
  </Target>
  <Target Name="DownloadPaket">
    <Exec Command="$(DownloadPaketCommand)" Condition=" '$(DownloadPaket)' == 'true' AND !Exists('$(PaketExePath)')" />
  </Target>
  <Target Name="RestorePackages" DependsOnTargets="CheckPrerequisites">
    <Exec Command="$(RestoreCommand)" WorkingDirectory="$(PaketRootPath)" />
  </Target>
</Project>


================================================
FILE: Hype.sln
================================================

Microsoft Visual Studio Solution File, Format Version 12.00
# Visual Studio Version 16
VisualStudioVersion = 16.0.29009.5
MinimumVisualStudioVersion = 10.0.40219.1
Project("{2150E333-8FDC-42A3-9474-1A3956D46DE8}") = ".paket", ".paket", "{B7FB3383-EF19-4645-986C-72D50C08F292}"
	ProjectSection(SolutionItems) = preProject
		paket.dependencies = paket.dependencies
	EndProjectSection
EndProject
Project("{F2A71F9B-5D33-465A-A702-920D77279786}") = "Hype", "src\Hype\Hype.fsproj", "{C923664D-182E-48D5-BB30-F1505D7D28DF}"
EndProject
Project("{2150E333-8FDC-42A3-9474-1A3956D46DE8}") = "docs", "docs", "{56DA870A-0ED4-47A2-B78B-34A8D4D6AD28}"
	ProjectSection(SolutionItems) = preProject
		docs\BuildDocs.fsx = docs\BuildDocs.fsx
		docs\input\download.fsx = docs\input\download.fsx
		docs\input\FeedforwardNets.fsx = docs\input\FeedforwardNets.fsx
		docs\input\HMC.fsx = docs\input\HMC.fsx
		docs\input\index.fsx = docs\input\index.fsx
		docs\input\Optimization.fsx = docs\input\Optimization.fsx
		docs\input\RecurrentNets.fsx = docs\input\RecurrentNets.fsx
		docs\input\Regression.fsx = docs\input\Regression.fsx
		docs\input\Training.fsx = docs\input\Training.fsx
	EndProjectSection
EndProject
Global
	GlobalSection(SolutionConfigurationPlatforms) = preSolution
		Debug|Any CPU = Debug|Any CPU
		Debug|x64 = Debug|x64
		Release|Any CPU = Release|Any CPU
		Release|x64 = Release|x64
	EndGlobalSection
	GlobalSection(ProjectConfigurationPlatforms) = postSolution
		{C923664D-182E-48D5-BB30-F1505D7D28DF}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
		{C923664D-182E-48D5-BB30-F1505D7D28DF}.Debug|Any CPU.Build.0 = Debug|Any CPU
		{C923664D-182E-48D5-BB30-F1505D7D28DF}.Debug|x64.ActiveCfg = Debug|Any CPU
		{C923664D-182E-48D5-BB30-F1505D7D28DF}.Debug|x64.Build.0 = Debug|Any CPU
		{C923664D-182E-48D5-BB30-F1505D7D28DF}.Release|Any CPU.ActiveCfg = Release|Any CPU
		{C923664D-182E-48D5-BB30-F1505D7D28DF}.Release|Any CPU.Build.0 = Release|Any CPU
		{C923664D-182E-48D5-BB30-F1505D7D28DF}.Release|x64.ActiveCfg = Release|Any CPU
		{C923664D-182E-48D5-BB30-F1505D7D28DF}.Release|x64.Build.0 = Release|Any CPU
	EndGlobalSection
	GlobalSection(SolutionProperties) = preSolution
		HideSolutionNode = FALSE
	EndGlobalSection
	GlobalSection(ExtensibilityGlobals) = postSolution
		SolutionGuid = {028AF435-B43C-4E8E-8A82-4A65AF666086}
	EndGlobalSection
EndGlobal


================================================
FILE: LICENSE.txt
================================================
The MIT License (MIT)

Copyright (c) 2015, National University of Ireland Maynooth (Atilim Gunes Baydin, Barak A. Pearlmutter)

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

================================================
FILE: README.md
================================================
Hype: Compositional Machine Learning and Hyperparameter Optimization
--------------------------------------------------------------------

Hype is a proof-of-concept deep learning library, where you can perform optimization on compositional machine learning systems of many components, even when such components themselves internally perform optimization.

It is developed by [Atılım Güneş Baydin](http://www.cs.nuim.ie/~gunes/) and [Barak A. Pearlmutter](http://bcl.hamilton.ie/~barak/), at the [Brain and Computation Lab](http://www.bcl.hamilton.ie/), National University of Ireland Maynooth.

This work is supported by Science Foundation Ireland grant 09/IN.1/I2637.

Please visit the [project website](http://hypelib.github.io/Hype/) for documentation and tutorials.

Join the Gitter chat room if you would like to chat with us:

[![Join the chat at https://gitter.im/hypelib/Hype](https://badges.gitter.im/Join%20Chat.svg)](https://gitter.im/hypelib/Hype?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge)

### Project statistics

[![Issue Stats](http://issuestats.com/github/hypelib/Hype/badge/pr?style=flat-square)](http://issuestats.com/github/hypelib/Hype)
[![Issue Stats](http://issuestats.com/github/hypelib/Hype/badge/issue?style=flat-square)](http://issuestats.com/github/hypelib/Hype)

### Current build status

[![Build status](https://ci.appveyor.com/api/projects/status/w1xgcleb1x4f30c0?svg=true)](https://ci.appveyor.com/project/gbaydin/hype)

### License

Hype is released under the MIT license.


================================================
FILE: Roadmap.txt
================================================
- CUDA backend (DiffSharp)
- Example for Hamiltonian MCMC
- Probabilistic inference
- Convolutional neural networks (ideally with DiffSharp tensor)
- Saving and loading models using a standard format

- Improve code comments
- Add references to research papers where relevant

- Add ability to read and write MATLAB files (scipy.io loadmat, savemat)
- Add ability to read and write FSL nifti files for fMRI (PyMVPA2, SampleAttributes, fmri_dataset, poly_detrend, zscore)
- Add ability to read and write standard image/video formats (OpenCV, MATLAB)
- Better integration with graph libraries (box plots, bar graphs, confusion matrix plots, write to .png support)


================================================
FILE: docs/.gitignore
================================================
output/

================================================
FILE: docs/BuildDocs.fsx
================================================
//
// This file is part of
// Hype: Compositional Machine Learning and Hyperparameter Optimization
//
// Copyright (c) 2015, National University of Ireland Maynooth (Atilim Gunes Baydin, Barak A. Pearlmutter)
//
// Hype is released under the MIT license.
// (See accompanying LICENSE file.)
//
// Written by:
//
//   Atilim Gunes Baydin
//   atilimgunes.baydin@nuim.ie
//
//   Barak A. Pearlmutter
//   barak@cs.nuim.ie
//
//   Brain and Computation Lab
//   Hamilton Institute & Department of Computer Science
//   National University of Ireland Maynooth
//   Maynooth, Co. Kildare
//   Ireland
//
//   www.bcl.hamilton.ie
//

#r "../packages/FSharp.Compiler.Service/lib/net40/FSharp.Compiler.Service.dll"
#r "../packages/FSharpVSPowerTools.Core/lib/net45/FSharpVSPowerTools.Core.dll"
#r "../packages/FSharp.Formatting/lib/net40/CSharpFormat.dll"
#r "../packages/FSharp.Formatting/lib/net40/FSharp.CodeFormat.dll"
#r "../packages/FSharp.Formatting/lib/net40/FSharp.Literate.dll"
#r "../packages/FSharp.Formatting/lib/net40/FSharp.MetadataFormat.dll"
#r "../packages/FSharp.Formatting/lib/net40/FSharp.Markdown.dll"

open System.IO
open FSharp.Literate
open FSharp.MetadataFormat

//
// Setup output directory structure and copy static files
//

let source = __SOURCE_DIRECTORY__ 
let docs = Path.Combine(source, "")
let relative subdir = Path.Combine(docs, subdir)

if not (Directory.Exists(relative "output")) then
    Directory.CreateDirectory(relative "output") |> ignore
if not (Directory.Exists(relative "output/img")) then
    Directory.CreateDirectory (relative "output/img") |> ignore
if not (Directory.Exists(relative "output/misc")) then
    Directory.CreateDirectory (relative "output/misc") |> ignore
if not (Directory.Exists(relative "output/reference")) then
    Directory.CreateDirectory (relative "output/reference") |> ignore

for fileInfo in DirectoryInfo(relative "input/files/misc").EnumerateFiles() do
    fileInfo.CopyTo(Path.Combine(relative "output/misc", fileInfo.Name), true) |> ignore

for fileInfo in DirectoryInfo(relative "input/files/img").EnumerateFiles() do
    fileInfo.CopyTo(Path.Combine(relative "output/img", fileInfo.Name), true) |> ignore

//
// Generate documentation
//

let tags = ["project-name", "Hype"; "project-author", "Atılım Güneş Baydin"; "project-github", "http://github.com/hypelib/Hype"; "project-nuget", "https://www.nuget.org/packages/hype"; "root", ""]

Literate.ProcessScriptFile(relative "input/index.fsx", relative "input/templates/template.html", relative "output/index.html", replacements = tags)
Literate.ProcessScriptFile(relative "input/download.fsx", relative "input/templates/template.html", relative "output/download.html", replacements = tags)
Literate.ProcessScriptFile(relative "input/Optimization.fsx", relative "input/templates/template.html", relative "output/optimization.html", replacements = tags)
Literate.ProcessScriptFile(relative "input/Training.fsx", relative "input/templates/template.html", relative "output/training.html", replacements = tags)
Literate.ProcessScriptFile(relative "input/Regression.fsx", relative "input/templates/template.html", relative "output/regression.html", replacements = tags)
Literate.ProcessScriptFile(relative "input/FeedforwardNets.fsx", relative "input/templates/template.html", relative "output/feedforwardnets.html", replacements = tags)
Literate.ProcessScriptFile(relative "input/RecurrentNets.fsx", relative "input/templates/template.html", relative "output/recurrentnets.html", replacements = tags)
Literate.ProcessScriptFile(relative "input/HMC.fsx", relative "input/templates/template.html", relative "output/hmc.html", replacements = tags)

//
// Generate API reference
//

let library = relative "../src/Hype/bin/Debug/Hype.dll"
let layoutRoots = [relative "input/templates"; relative "input/templates/reference" ]

MetadataFormat.Generate(library, relative "output/reference", layoutRoots, tags, markDownComments = true, libDirs = [relative "../src/Hype/bin/Debug/"])


================================================
FILE: docs/input/FeedforwardNets.fsx
================================================
(*** hide ***)
#r "../../src/Hype/bin/Release/netstandard2.0/DiffSharp.dll"
#r "../../src/Hype/bin/Release/netstandard2.0/Hype.dll"
#I "../../packages/R.NET.Community/lib/net40/"
#I "../../packages/R.NET.Community.FSharp/lib/net40/"
#I "../../packages/RProvider"
#load "RProvider.fsx"
fsi.ShowDeclarationValues <- true

(**
Feedforward neural networks
===========================

In this example, we implement a softmax classifier network with several hidden layers. Also see the [regression example](regression.html) for some relevant basics.

We again demonstrate the library with the [MNIST](http://yann.lecun.com/exdb/mnist/) database, this time using the full training set of 60,000 examples to build a classifier whose 10 outputs represent the probabilities of an input image belonging to each of the ten digit classes.

### Loading the data

We load the data and form the training, validation, and test datasets. The datasets are shuffled and the input data are normalized.
*)

open Hype
open Hype.Neural
open DiffSharp.AD.Float32
open DiffSharp.Util

let MNIST = Dataset(Util.LoadMNISTPixels("C:/datasets/MNIST/train-images.idx3-ubyte", 60000),
                    Util.LoadMNISTLabels("C:/datasets/MNIST/train-labels.idx1-ubyte", 60000)).NormalizeX()

let MNISTtrain = MNIST.[..58999].Shuffle()
let MNISTvalid = MNIST.[59000..].Shuffle()

let MNISTtest = 
    Dataset(Util.LoadMNISTPixels("C:/datasets/MNIST/t10k-images.idx3-ubyte", 10000),
            Util.LoadMNISTLabels("C:/datasets/MNIST/t10k-labels.idx1-ubyte", 10000)).NormalizeX().Shuffle()

(**
<pre>
val MNISTtrain : Dataset = Hype.Dataset
   X: 784 x 59000
   Y: 1 x 59000
val MNISTvalid : Dataset = Hype.Dataset
   X: 784 x 1000
   Y: 1 x 1000
val MNISTtest : Dataset = Hype.Dataset
   X: 784 x 10000
   Y: 1 x 10000
</pre>

*)

MNISTtrain.[..5].VisualizeXColsAsImageGrid(28) |> printfn "%s"

(**

    [lang=cs]
    Hype.Dataset
       X: 784 x 6
       Y: 1 x 6
    X's columns reshaped to (28 x 28), presented in a (2 x 3) grid:
    DM : 56 x 84
                                                                                    
                                                                                    
                                                                                    
                                                  ·▴█                                   
                                                 ■■♦█·                  █■              
                                                ▪███■▪                 -██■-            
                                             ·■███♦●                    ·●██■▪          
                            -♦▪             ·████♦                         -♦█♦         
                         -♦■▪·              █■█●                             ■█·        
                      ·●██♦                 ██●·                             ██·        
             ·▴     ·●██▪                -■██▪                              ■█▪         
            ·■▪   ·▪■■▪                 ·███▪  ·▴·                         ♦█▪          
           ♦■▪  ·■■▴                    ♦███●▴▴♦██●                       ██▴           
           █   ■█·                      ■█■█■██████■                     ■█▪            
           ■█· -                        ██▴█████████▴                  ·██·             
            ●█▪                         █▪●■████■███▴                 ·███♦·            
             ·■■                        █▴●- ·    ●█·                  ●♦♦♦█▪           
               ■█·                      █●■       ▪█                       ▴█-          
               ·█●                      ███·  ·●■■██                        ■█          
               ·█●                      █●■▴▴██████●                        ██          
               ●█▴                      ♦█████████▴                        ▴█▪          
            -♦██♦                       ▪█████♦▪·                    ●     █■           
            ▴■■▴                          ▪▪▪                        █   ·■♦            
                                                                     ■♦▪♦█▪             
                                                                                    
                                                                                    
                                                                                    
                                                                                    
                                                                                    
                                                                                    
                                                                                    
                                                                                    
                                                                                    
                                                                                    
           ·■       ·♦                           ▪■♦-                  ·●●█■·           
           ██·      ♦█●                         ██■■█▪-●·              ▪■·▴●            
           ██▴      ♦██▪                       ■█-  ▴███■              █♦               
          -██       ♦██●                      ▪█·   ▪██▪              ▪█♦  ■▴           
          -██       ♦██■                      █●   ♦██▴               ♦█- ██■           
          ▴█♦       ▴██■                     ▴█  ·██■·                ♦█- ███           
          ♦█♦       -███                     ▴█·▴██♦                  ▪██■███·          
          ♦█·       ·██●                      ████▴                    ■██●■█·          
          ♦█▴    ··-███●                     ▪██♦                       ·  ♦█·          
          ♦██■■■■██████●                   ·■███                           ██           
          ▪█████████■♦█●                  ♦██■●█                           █■           
           ●██████▴-  █■                ▴██■- ●█                          ▴█▪           
           ·███♦·     ██               ♦█■▴   █●                          ██▴           
            ▪▪        ██▪             ███    ■█·                          ██▴           
                      ●██            ●████■■██-                           ██            
                      ▴██            -▪▪▪▪▪▪▪-                            ██            
                       ██                                                -██            
                       ■█                                                -█●            
                       ●█·                                               -█▪            
                       ·█                                                 █             
                                                                                    
                                                                                    

### Defining the model

We define a neural network with 3 layers: (1) a hidden layer with 300 units, followed by ReLU activation, (2) a hidden layer with 100 units, followed by ReLU activation, (3) a final layer with 10 units, followed by softmax transformation.
*)

let n = FeedForward()
n.Add(Linear(28 * 28, 300, Initializer.InitReLU))
n.Add(reLU)
n.Add(Linear(300, 100, Initializer.InitReLU))
n.Add(reLU)
n.Add(Linear(100, 10))
n.Add(fun m -> m |> DM.mapCols softmax) // Note the free inline implementation of the layer

n.ToString() |> printfn "%s"

(**
    [lang=cs]
    Hype.Neural.FeedForward
       Learnable parameters: 266610
       (0) -> (1) -> (2) -> (3) -> (4) -> (5)

       (0): Hype.Neural.Linear
       784 -> 300
       Learnable parameters: 235500
       Init: ReLU
       W   : 300 x 784
       b   : 300

       (1): Hype.Neural.Activation

       (2): Hype.Neural.Linear
       300 -> 100
       Learnable parameters: 30100
       Init: ReLU
       W   : 100 x 300
       b   : 100

       (3): Hype.Neural.Activation

       (4): Hype.Neural.Linear
       100 -> 10
       Learnable parameters: 1010
       Init: Standard
       W   : 10 x 100
       b   : 10

       (5): Hype.Neural.Activation
*)


(**

### Freely implementing transformation layers

Now let's have a closer look at how we implemented the nonlinear transformations between the linear layers. 

You might think that the instances of **reLU** in **n.Add(reLU)** above refer to a particular layer structure previously implemented as a layer module within the library. They don't. **reLU** is just a matrix-to-matrix elementwise function.

**An important thing to note** here is that the activation/transformation layers added with, for example, **n.Add(reLU)** can be **any matrix-to-matrix function that you can express in the language.** This is unlike many machine learning frameworks, where you must select a layer type that has been implemented beforehand with its (1) forward evaluation code, (2) reverse gradient code w.r.t. layer inputs, and (3) reverse gradient code w.r.t. any layer parameters. In such a setting, a new layer design requires you to add a new layer type to the system and carefully implement each of these components.

Here, because the system is based on nested AD, you can freely use any matrix-to-matrix transformation as a layer, and the forward and/or reverse AD operations of your code will be handled automatically by the underlying system. For example, you can write a layer like this: 
*)

n.Add(fun w ->
        let min = DM.Min(w)
        let range = DM.Max(w) - min
        (w - min) / range)

(** 
which will be a normalization layer, scaling the values to be between 0 and 1.

In the model above, the softmax layer is implemented by mapping the vector-to-vector **softmax** function over the columns of a matrix.

*)

n.Add(fun m -> m |> DM.mapCols softmax) 

(**
In this particular example, the output matrix has 10 rows (for the 10 target classes) and each column (a vector of size 10) is individually passed through the **softmax** function. The output matrix would have as many columns as the input matrix, representing the class probabilities of each input.
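For intuition, the same column-wise softmax can be sketched in a few lines of plain Python (an illustration of the formula, not part of Hype): each column is exponentiated and normalized, so every output column is a probability vector summing to 1.

```python
import math

def softmax(v):
    # subtract the max for numerical stability before exponentiating
    m = max(v)
    e = [math.exp(x - m) for x in v]
    s = sum(e)
    return [x / s for x in e]

def softmax_cols(matrix):
    # matrix is a list of rows; apply softmax to each column independently
    cols = list(zip(*matrix))
    out_cols = [softmax(c) for c in cols]
    return [list(r) for r in zip(*out_cols)]

scores = [[1.0, 0.0],
          [2.0, 0.0],
          [3.0, 0.0]]
probs = softmax_cols(scores)          # 3 "classes", 2 input columns
col_sums = [sum(c) for c in zip(*probs)]
```

A column of identical scores (the second column above) maps to a uniform distribution, as expected.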
*)


(**
### Weight initialization schemes

When layers with learnable weights are created, the weights are initialized using one of the following schemes. The appropriate initialization depends on the activation function immediately following the layer and takes the fan-in/fan-out of the layer into account. If no scheme is specified, the **InitStandard** scheme is used by default. These implementations are based on the machine learning literature, such as _"Glorot, Xavier, and Yoshua Bengio. "Understanding the difficulty of training deep feedforward neural networks." International conference on artificial intelligence and statistics. 2010"_.
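As a quick numerical illustration of these fan-in/fan-out formulas (a Python sketch of the standard Glorot/He-style rules, not Hype's implementation):

```python
import math, random

def init_weight(fan_in, fan_out, scheme="tanh"):
    # Glorot-style uniform ranges; "relu" uses a He-style normal variance
    if scheme == "tanh":
        r = math.sqrt(6.0 / (fan_in + fan_out))
        return random.uniform(-r, r)
    if scheme == "sigmoid":
        r = 4.0 * math.sqrt(6.0 / (fan_in + fan_out))
        return random.uniform(-r, r)
    if scheme == "relu":
        return random.gauss(0.0, math.sqrt(2.0 / fan_in))
    # "standard": uniform in +- 1/sqrt(fan_in)
    r = 1.0 / math.sqrt(fan_in)
    return random.uniform(-r, r)

# e.g. a 300 -> 100 linear layer (fan_in = 300, fan_out = 100)
w = [init_weight(300, 100, "tanh") for _ in range(1000)]
bound = math.sqrt(6.0 / 400.0)      # all tanh-initialized weights fall in +-bound
s = init_weight(100, 10, "standard")
```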

*)

type Initializer =
    | InitUniform of D * D
    | InitNormal of D * D
    | InitRBM of D
    | InitReLU
    | InitSigmoid
    | InitTanh
    | InitStandard
    | InitCustom of (int->int->D)
    override i.ToString() =
        match i with
        | InitUniform(min, max) -> sprintf "Uniform min=%A max=%A" min max
        | InitNormal(mu, sigma) -> sprintf "Normal mu=%A sigma=%A" mu sigma
        | InitRBM sigma -> sprintf "RBM sigma=%A" sigma
        | InitReLU -> "ReLU"
        | InitSigmoid -> "Sigmoid"
        | InitTanh -> "Tanh"
        | InitStandard -> "Standard"
        | InitCustom f -> "Custom"
    member i.InitDM(m, n) =
        let fanOut, fanIn = m, n
        match i with
        | InitUniform(min, max) -> Rnd.UniformDM(m, n, min, max)
        | InitNormal(mu, sigma) -> Rnd.NormalDM(m, n, mu, sigma)
        | InitRBM sigma -> Rnd.NormalDM(m, n, D 0.f, sigma)
        | InitReLU -> Rnd.NormalDM(m, n, D 0.f, sqrt (D 2.f / (float32 fanIn)))
        | InitSigmoid -> let r = D 4.f * sqrt (D 6.f / (fanIn + fanOut)) in Rnd.UniformDM(m, n, -r, r)
        | InitTanh -> let r = sqrt (D 6.f / (fanIn + fanOut)) in Rnd.UniformDM(m, n, -r, r)
        | InitStandard -> let r = (D 1.f) / sqrt (float32 fanIn) in Rnd.UniformDM(m, n, -r, r)
        | InitCustom f -> DM.init m n (fun _ _ -> f fanIn fanOut)
    member i.InitDM(m:DM) = i.InitDM(m.Rows, m.Cols)

(**
### Training

Before training, let's visualize the weights of the first layer in a grid where each row of the weight matrix of the first layer is shown as a 28-by-28 image. It is an image of random weights, as expected.
*)

let l = (n.[0] :?> Linear)
l.VisualizeWRowsAsImageGrid(28) |> printfn "%s"

(**
<pre>
Hype.Neural.Linear
    784 -> 300
    Learnable parameters: 235500
    Init: ReLU
    W's rows reshaped to (28 x 28), presented in a (17 x 18) grid:
</pre>

<div class="row">
    <div class="span6 text-center">
        <img src="img/Feedforwardnets-1.png" alt="Chart" style="width:500px;"/>
    </div>
</div><br/>

Now let's train the network with the training and validation datasets we've prepared, using RMSProp, Nesterov momentum, and cross-entropy loss.
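RMSProp keeps a decayed running average of squared gradients and scales the step size by its square root. As a standalone illustration of the standard rule on a one-dimensional quadratic (a Python sketch, not Hype's implementation):

```python
import math

# RMSProp on f(w) = w^2: keep a decayed mean of squared gradients
# and divide the learning rate by its square root.
a0, k, eps = 0.01, 0.9, 1e-8
w, cache = 5.0, 0.0
losses = []
for _ in range(2000):
    g = 2.0 * w                               # gradient of w^2
    cache = k * cache + (1.0 - k) * g * g     # decayed average of g^2
    w -= a0 * g / (math.sqrt(cache) + eps)
    losses.append(w * w)
```

The effective step is roughly constant regardless of the gradient's magnitude, which is what makes the method robust to poorly scaled objectives.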
*)

let p = {Params.Default with 
            Epochs = 2
            EarlyStopping = Early (400, 100)
            ValidationInterval = 10
            Batch = Minibatch 100
            Loss = CrossEntropyOnSoftmax
            Momentum = Nesterov (D 0.9f)
            LearningRate = RMSProp (D 0.001f, D 0.9f)}

let _, lhist = n.Train(MNISTtrain, MNISTvalid, p)


(**
<pre>
[12/11/2015 22:42:07] --- Training started
[12/11/2015 22:42:07] Parameters     : 266610
[12/11/2015 22:42:07] Iterations     : 1180
[12/11/2015 22:42:07] Epochs         : 2
[12/11/2015 22:42:07] Batches        : Minibatches of 100 (590 per epoch)
[12/11/2015 22:42:07] Training data  : 59000
[12/11/2015 22:42:07] Validation data: 1000
[12/11/2015 22:42:07] Valid. interval: 10
[12/11/2015 22:42:07] Method         : Gradient descent
[12/11/2015 22:42:07] Learning rate  : RMSProp a0 = D 0.00100000005f, k = D 0.899999976f
[12/11/2015 22:42:07] Momentum       : Nesterov D 0.899999976f
[12/11/2015 22:42:07] Loss           : Cross entropy after softmax layer
[12/11/2015 22:42:07] Regularizer    : L2 lambda = D 9.99999975e-05f
[12/11/2015 22:42:07] Gradient clip. : None
[12/11/2015 22:42:07] Early stopping : Stagnation thresh. = 400, overfit. thresh. = 100
[12/11/2015 22:42:07] Improv. thresh.: D 0.995000005f
[12/11/2015 22:42:07] Return best    : true
[12/11/2015 22:42:07] 1/2 | Batch   1/590 | D  2.383214e+000 [- ] | Valid D  2.411374e+000 [- ] | Stag:  0 Ovfit:  0
[12/11/2015 22:42:08] 1/2 | Batch  11/590 | D  6.371681e-001 [↓▼] | Valid D  6.128169e-001 [↓▼] | Stag:  0 Ovfit:  0
[12/11/2015 22:42:08] 1/2 | Batch  21/590 | D  4.729548e-001 [↓▼] | Valid D  4.779414e-001 [↓▼] | Stag:  0 Ovfit:  0
[12/11/2015 22:42:09] 1/2 | Batch  31/590 | D  4.792733e-001 [↑ ] | Valid D  3.651254e-001 [↓▼] | Stag:  0 Ovfit:  0
[12/11/2015 22:42:10] 1/2 | Batch  41/590 | D  2.977416e-001 [↓▼] | Valid D  3.680202e-001 [↑ ] | Stag: 10 Ovfit:  0
[12/11/2015 22:42:10] 1/2 | Batch  51/590 | D  4.242567e-001 [↑ ] | Valid D  3.525212e-001 [↓▼] | Stag:  0 Ovfit:  0
[12/11/2015 22:42:11] 1/2 | Batch  61/590 | D  2.464822e-001 [↓▼] | Valid D  3.365663e-001 [↓▼] | Stag:  0 Ovfit:  0
[12/11/2015 22:42:11] 1/2 | Batch  71/590 | D  6.299557e-001 [↑ ] | Valid D  3.981607e-001 [↑ ] | Stag: 10 Ovfit:  0
...
[12/11/2015 22:43:21] 2/2 | Batch 521/590 | D  1.163270e-001 [↓ ] | Valid D  2.264248e-001 [↓ ] | Stag: 50 Ovfit:  0
[12/11/2015 22:43:21] 2/2 | Batch 531/590 | D  2.169427e-001 [↑ ] | Valid D  2.203927e-001 [↓ ] | Stag: 60 Ovfit:  0
[12/11/2015 22:43:22] 2/2 | Batch 541/590 | D  2.233351e-001 [↑ ] | Valid D  2.353653e-001 [↑ ] | Stag: 70 Ovfit:  0
[12/11/2015 22:43:22] 2/2 | Batch 551/590 | D  3.425132e-001 [↑ ] | Valid D  2.559682e-001 [↑ ] | Stag: 80 Ovfit:  0
[12/11/2015 22:43:23] 2/2 | Batch 561/590 | D  2.768238e-001 [↓ ] | Valid D  2.412431e-001 [↓ ] | Stag: 90 Ovfit:  0
[12/11/2015 22:43:24] 2/2 | Batch 571/590 | D  2.550858e-001 [↓ ] | Valid D  2.726600e-001 [↑ ] | Stag:100 Ovfit:  0
[12/11/2015 22:43:24] 2/2 | Batch 581/590 | D  2.308137e-001 [↓ ] | Valid D  2.466903e-001 [↓ ] | Stag:110 Ovfit:  0
[12/11/2015 22:43:25] Duration       : 00:01:17.5011734
[12/11/2015 22:43:25] Loss initial   : D  2.383214e+000
[12/11/2015 22:43:25] Loss final     : D  1.087980e-001 (Best)
[12/11/2015 22:43:25] Loss change    : D -2.274415e+000 (-95.43 %)
[12/11/2015 22:43:25] Loss chg. / s  : D -2.934685e-002
[12/11/2015 22:43:25] Epochs / s     : 0.02580606089
[12/11/2015 22:43:25] Epochs / min   : 1.548363654
[12/11/2015 22:43:25] --- Training finished
</pre>

<div class="row">
    <div class="span6 text-center">
        <img src="img/Feedforwardnets-3.png" alt="Chart" style="width:500px;"/>
    </div>
</div><br/>
*)

(*** hide ***)
open RProvider
open RProvider.graphics
open RProvider.grDevices

let ll = lhist |> Array.map (float32>>float)

namedParams[
    "x", box ll
    "pch", box 19
    "col", box "darkblue"
    "type", box "l"
    "xlab", box "Iteration"
    "ylab", box "Loss"
    "width", box 700
    "height", box 500
    ]
|> R.plot|> ignore


(**
Now let's visualize the weights of the first layer in the grid. We see that the network has learned the problem domain.
*)

let l = (n.[0] :?> Linear)
l.VisualizeWRowsAsImageGrid(28) |> printfn "%s"

(**

<div class="row">
    <div class="span6 text-center">
        <img src="img/Feedforwardnets-2.png" alt="Chart" style="width:500px;"/>
    </div>
</div><br/>
*)

(**

### Building the softmax classifier

As explained in [regression](regression.html), we just construct an instance of **SoftmaxClassifier** with the trained neural network as its parameter. Please see the [API reference](reference/index.html) and the [source code](https://github.com/hypelib/Hype/blob/master/src/Hype/Classifier.fs) for a better understanding of how classifiers are implemented.
*)

let cc = SoftmaxClassifier(n)

(**

Testing class predictions for the first 10 elements of the MNIST test set.

*)

let pred = cc.Classify(MNISTtest.X.[*,0..9]);;
let real = MNISTtest.Yi.[0..9]

(**
<pre>
val pred : int [] = [|5; 1; 9; 2; 6; 0; 0; 5; 7; 6|]
val real : int [] = [|5; 1; 9; 2; 6; 0; 0; 5; 7; 6|]
</pre>

Let's compute the classification error for the whole MNIST test set of 10,000 examples.
*)

cc.ClassificationError(MNISTtest)

(**
<pre>
val it : float32 = 0.0502999984f
</pre>

The classification error is around 5%. It can be lowered further by training the model for more than the 2 epochs used here.

Classifying a single digit:
*)

let cls = cc.Classify(MNISTtest.X.[*,0]);;
MNISTtest.X.[*,0] |> DV.visualizeAsDM 28 |> printfn "%s"

(**
    [lang=cs]
    val cls : int = 5

    DM : 28 x 28
                            
                            
                            
                            
                            
                            ·   
                        ▴●██♦-  
                     ▴♦██■▴-    
                ♦█■■███▪·       
               ■████■-          
              ♦███▪             
             ♦██♦               
             ██●                
            ■█▪                 
            ██· -▴■●-           
           ▴██████■███-         
           ♦██♦▪    ▪█■-        
            ▪·       ▴█●        
                     -██        
                     ♦█●        
                    ■█■         
                 -●██■·         
             -▴▪■███▪           
          ███████●-             
                            
                            
                            

Classifying many digits at the same time:
*)

let clss = cc.Classify(MNISTtest.X.[*,0..4]);;
MNISTtest.[0..4].VisualizeXColsAsImageGrid(28) |> printfn "%s"

(**

    [lang=cs]
    val clss : int [] = [|5; 1; 9; 2; 6|]

    Hype.Dataset
       X: 784 x 5
       Y: 1 x 5
    X's columns reshaped to (28 x 28), presented in a (2 x 3) grid:
    DM : 56 x 84
                                                                                    
                                                                                    
                                                                                    
                                                                                    
                                                  ██♦                                   
                            ·                     ██                                    
                        ▴●██♦-                   ██▴                    -♦█▪            
                     ▴♦██■▴-                    ♦██                    ●█████●          
                ♦█■■███▪·                       ██♦                   ■███♦♦██          
               ■████■-                         ███                   ■██♦   ■█▴         
              ♦███▪                           ▴███                  ·██♦    ●██         
             ♦██♦                             ███                   ▪██     ■█■         
             ██●                             ▴██▴                   ·██·  ·♦██▴         
            ■█▪                              ███                     ███♦♦████▴         
            ██· -▴■●-                       ███♦                     ▴████████·         
           ▴██████■███-                     ███      ▴                ·-●- ■██          
           ♦██♦▪    ▪█■-                   ♦██▴                            ██■          
            ▪·       ▴█●                  ▴██♦                            -██▴          
                     -██                  ███▴                            -██·          
                     ♦█●                 ♦██▴                             ■██·          
                    ■█■                  ███                              ███           
                 -●██■·                 ♦██▴                             ▴██●           
             -▴▪■███▪                   ██♦                              ███            
          ███████●-                     ♦█                              -██■            
                                                                        -██♦            
                                                                        -██·            
                                                                                    
                                                                                    
                                                                                    
                                                                                    
                                                ▴●█♦                                    
                ●██                           -████▴                                    
              ▪████●                         ▴████                                      
             ▴██████▴                       ▴███■                                       
             ■██▪▴██▴                       ███▪                                        
            ▴██●  ▴█■                      ■██▴                                         
           ·███    ██-                   ·♦██▴                                          
           ♦██●    ▪█▪                  -███▴                                           
           ███      ██                  ███♦                                            
           ███      █♦                 ███▪                                             
           █♦·      █♦                ●██■        ▴▴▴                                   
            ·       ██                ███    -██-■█████▪                                
              -     ██                ██■   ●███████████-                               
            ·██■♦-  ██               ▴██▴  ███●-     ▪██▴                               
            ♦█████■███               ▪██  ·██-       ·██▪                               
            ■█████████               ███▪·██▴        ♦██                                
            ♦█████████♦▪             ▪██████▴      ·♦██·                                
            -███████████■●●●·         ▪███████████████▴                                 
             ■██████■■■█████▴          -▪██████████♦-                                   
             ·████■    ▴████■              ·▴▴▴▴▴▴·                                     
              -■█-       ■■▴                                                            
                                                                                    
                                                                                    
                                                                                    
                                                                                    
                                                                                    




Nested optimization of training hyperparameters
-----------------------------------------------

As we've seen in [optimization](optimization.html), nested AD allows us to apply gradient-based optimization to functions that also internally perform optimization.

This makes it possible to optimize the hyperparameters of training. We can, for example, compute the gradient of the final loss of a training procedure with respect to continuous hyperparameters of the training, such as learning rates, momentum parameters, regularization coefficients, or initialization conditions.

As an example, let's train a neural network with a learning rate schedule of 50 elements, and optimize this schedule vector with another level of optimization on top of the training.
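The idea can be sketched in a few lines of plain Python (a toy illustration, independent of Hype): an inner loop runs gradient descent on a quadratic, and an outer loop adjusts the learning rate using a finite-difference estimate of the hypergradient of the final inner loss. (Hype uses nested AD rather than finite differences; the numbers here are purely illustrative.)

```python
def inner_train(lr, steps=20):
    # inner optimization: gradient descent on f(w) = (w - 3)^2
    w = 0.0
    for _ in range(steps):
        w -= lr * 2.0 * (w - 3.0)
    return (w - 3.0) ** 2          # final inner loss

# outer optimization over the learning rate itself
lr, eps, meta_lr = 0.01, 1e-5, 1e-4
for _ in range(100):
    # central-difference estimate of d(final loss)/d(lr)
    hypergrad = (inner_train(lr + eps) - inner_train(lr - eps)) / (2.0 * eps)
    lr -= meta_lr * hypergrad
final = inner_train(lr)
```

The outer loop pushes the learning rate up from its too-small starting value, lowering the final inner loss.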
*)

let train lrschedule =
    Rnd.Seed(123)
    n.Init()

    let p = {Params.Default with
                LearningRate = Schedule lrschedule
                Loss = CrossEntropyOnSoftmax
                ValidationInterval = 1
                Silent = true
                ReturnBest = false
                Batch = Full}
    let loss, _ = n.Train(MNISTvalid.[..20], p)
    loss

let hypertrain epochs =
    let p = {Params.Default with 
                Epochs = epochs
                LearningRate = RMSProp(D 0.01f, D 0.9f)
                ValidationInterval = 1}
    let lr, _, _, _ = Optimize.Minimize(train, DV.create 50 (D 0.1f), p)
    lr

let lr = hypertrain 50

(*** hide ***)
open RProvider
open RProvider.graphics
open RProvider.grDevices

let lrlr = lr |> DV.toArray |> Array.map (float32>>float)

namedParams[
    "x", box lrlr
    "pch", box 19
    "col", box "darkblue"
    "type", box "o"
    "xlab", box "Iteration"
    "ylab", box "Learning rate"
    "width", box 700
    "height", box 500
    ]
|> R.plot|> ignore

(**
<div class="row">
    <div class="span6 text-center">
        <img src="img/Feedforwardnets-4.png" alt="Chart" style="width:500px;"/>
    </div>
</div><br/>
*)

================================================
FILE: docs/input/HMC.fsx
================================================
(*** hide ***)
#r "../../src/Hype/bin/Release/netstandard2.0/DiffSharp.dll"
#r "../../src/Hype/bin/Release/netstandard2.0/Hype.dll"
#I "../../packages/R.NET.Community/lib/net40/"
#I "../../packages/R.NET.Community.FSharp/lib/net40/"
#I "../../packages/RProvider"
#load "RProvider.fsx"
fsi.ShowDeclarationValues <- false

(**
Markov Chain Monte Carlo
========================

Documentation coming soon.
*)

================================================
FILE: docs/input/Optimization.fsx
================================================
(*** hide ***)

#r "../../src/Hype/bin/Release/netstandard2.0/DiffSharp.dll"
#r "../../src/Hype/bin/Release/netstandard2.0/Hype.dll"
#I "../../packages/R.NET.Community/lib/net40/"
#I "../../packages/R.NET.Community.FSharp/lib/net40/"
#I "../../packages/RProvider"
#load "RProvider.fsx"

(**
Optimization
============

Hype provides highly configurable and modular gradient-based optimization functionality, similar in spirit to that of many other machine learning libraries.

**Here's the novelty:** 

Thanks to nested AD, gradient-based optimization can be combined with any code, including code which internally takes derivatives of a function to produce its output. In other words, you can optimize the value of a function that is internally optimizing another function, or using derivatives for any other purpose (e.g. running particle simulations, adaptive control), up to any level. 

In such a compositional optimization setting, all arising higher-order derivatives are handled for you through **nested instantiations of forward and/or reverse AD**. You only need to write your algorithms as usual, **implementing just the regular forward computation**.

Let's explain this through a basic example from the article _"Jeffrey Mark Siskind and Barak A. Pearlmutter. Nesting forward-mode AD in a functional framework. Higher Order and Symbolic Computation 21(4):361-76, 2008. doi:10.1007/s10990-008-9037-1"_, where a parameter of a physics simulation using the gradient of an electric potential is optimized with Newton's method using the Hessian of an error, requiring third-order nesting of derivatives.

Optimizing a physics simulation
-------------------------------

Consider a charged particle traveling in a plane with position $\mathbf{x}(t)$, velocity $\dot{\mathbf{x}}(t)$, initial position $\mathbf{x}(0)=(0, 8)$, and initial velocity $\dot{\mathbf{x}}(0)=(0.75, 0)$. The particle is accelerated by an electric field formed by a pair of repulsive bodies,

$$$
   p(\mathbf{x}; w) = \| \mathbf{x} - (10, 10 - w)\|^{-1} + \| \mathbf{x} - (10, 0)\|^{-1}

where $w$ is a parameter of this simple particle simulation, adjusting the location of one of the repulsive bodies.

We can simulate the time evolution of this system by using a naive Euler ODE integration

$$$
   \begin{eqnarray*}
   \ddot{\mathbf{x}}(t) &=& \left. -\nabla_{\mathbf{x}} p(\mathbf{x}) \right|_{\mathbf{x}=\mathbf{x}(t)}\\
   \dot{\mathbf{x}}(t + \Delta t) &=& \dot{\mathbf{x}}(t) + \Delta t \ddot{\mathbf{x}}(t)\\
   \mathbf{x}(t + \Delta t) &=& \mathbf{x}(t) + \Delta t \dot{\mathbf{x}}(t)
   \end{eqnarray*}

where $\Delta t$ is an integration time step.

For a given parameter $w$, the simulation starts with $t=0$ and finishes when the particle hits the $x$-axis, at position $\mathbf{x}(t_f)$ at time $t_f$. When the particle hits the $x$-axis, we calculate an error $E(w) = x_0 (t_f)^2$, the squared horizontal distance of the particle from the origin. We then minimize this error using Newton's method, which finds the optimal value of $w$ so that the particle eventually hits the $x$-axis at the origin.

$$$
   w^{(i+1)} = w^{(i)} - \frac{E'(w^{(i)})}{E''(w^{(i)})}

In other words, the code calculating the trajectory of the particle internally computes the gradient of the electric potential $p(\mathbf{x}; w)$, and, at the same time, the final position of the trajectory $\mathbf{x}(t_f)$ is used to compute an error, and the gradient and Hessian of this error are computed during the optimization procedure.
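The Newton update itself is simple to state in isolation. As a standalone sketch (a toy one-dimensional error, not the particle code), minimizing $E(w) = (w - 2)^4$ with $w^{(i+1)} = w^{(i)} - E'(w^{(i)}) / E''(w^{(i)})$:

```python
def E(w):   return (w - 2.0) ** 4
def dE(w):  return 4.0 * (w - 2.0) ** 3      # first derivative
def d2E(w): return 12.0 * (w - 2.0) ** 2     # second derivative

w = 0.0
for _ in range(30):
    # Newton's update: w <- w - E'(w) / E''(w)
    w -= dE(w) / d2E(w)
```

In the particle simulation, the derivatives in this update are themselves derivatives of code that internally takes gradients, which is where nested AD does the work.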

Here's how it goes.
*)

open Hype
open DiffSharp.AD.Float32

let dt = D 0.1f
let x0 = toDV [0.; 8.]
let v0 = toDV [0.75; 0.]

let p w (x:DV) = (1.f / DV.norm (x - toDV [D 10.f + w * D 0.f; D 10.f - w])) 
               + (1.f / DV.norm (x - toDV [10.; 0.]))

let trajectory (w:D) = 
    (x0, v0) 
    |> Seq.unfold (fun (x, v) ->
                    let a = -grad (p w)  x
                    let v = v + dt * a
                    let x = x + dt * v
                    Some(x, (x, v)))
    |> Seq.takeWhile (fun x -> x.[1] > D 0.f)

let error (w:DV) =
    let xf = trajectory w.[0] |> Seq.last
    xf.[0] * xf.[0]

let w, l, whist, lhist = Optimize.Minimize(error, toDV [0.], 
                                            {Params.Default with 
                                                Method = Newton; 
                                                LearningRate = Constant (D 1.f)
                                                ValidationInterval = 1;
                                                Epochs = 10})

(**
<pre>
[25/12/2015 23:53:10] --- Minimization started
[25/12/2015 23:53:10] Parameters     : 1
[25/12/2015 23:53:10] Iterations     : 10
[25/12/2015 23:53:10] Valid. interval: 1
[25/12/2015 23:53:10] Method         : Exact Newton
[25/12/2015 23:53:10] Learning rate  : Constant a = D 1.0f
[25/12/2015 23:53:10] Momentum       : None
[25/12/2015 23:53:10] Gradient clip. : None
[25/12/2015 23:53:10] Early stopping : None
[25/12/2015 23:53:10] Improv. thresh.: D 0.995000005f
[25/12/2015 23:53:10] Return best    : true
[25/12/2015 23:53:10]  1/10 | D  2.535113e+000 [- ]
[25/12/2015 23:53:10]  2/10 | D  7.528733e-002 [↓▼]
[25/12/2015 23:53:10]  3/10 | D  1.592970e-002 [↓▼]
[25/12/2015 23:53:10]  4/10 | D  4.178338e-003 [↓▼]
[25/12/2015 23:53:10]  5/10 | D  1.382800e-008 [↓▼]
[25/12/2015 23:53:11]  6/10 | D  3.274181e-011 [↓▼]
[25/12/2015 23:53:11]  7/10 | D  1.151079e-012 [↓▼]
[25/12/2015 23:53:11]  8/10 | D  1.151079e-012 [- ]
[25/12/2015 23:53:11]  9/10 | D  1.151079e-012 [- ]
[25/12/2015 23:53:11] 10/10 | D  3.274181e-011 [↑ ]
[25/12/2015 23:53:11] Duration       : 00:00:00.9201285
[25/12/2015 23:53:11] Value initial  : D  2.535113e+000
[25/12/2015 23:53:11] Value final    : D  1.151079e-012 (Best)
[25/12/2015 23:53:11] Value change   : D -2.535113e+000 (-100.00 %)
[25/12/2015 23:53:11] Value chg. / s : D -2.755173e+000
[25/12/2015 23:53:11] Iter. / s      : 10.86804723
[25/12/2015 23:53:11] Iter. / min    : 652.0828341
[25/12/2015 23:53:11] --- Minimization finished

val whist : DV [] =
  [|DV [|0.0f|]; DV [|0.20767726f|]; DV [|0.17457059f|]; DV [|0.190040559f|];
    DV [|0.182180524f|]; DV [|0.182166189f|]; DV [|0.182166889f|];
    DV [|0.182166755f|]; DV [|0.182166621f|]; DV [|0.182166487f|]|]
val w : DV = DV [|0.182166889f|]
val lhist : D [] =
  [|D 2.5351131f; D 2.5351131f; D 0.0752873272f; D 0.0159297027f;
    D 0.00417833822f; D 1.38279992e-08f; D 3.27418093e-11f; D 1.15107923e-12f;
    D 1.15107923e-12f; D 1.15107923e-12f|]
val l : D = D 1.15107923e-12f
</pre>
*)

(*** hide ***)
open RProvider
open RProvider.graphics
open RProvider.grDevices

R.plot_new (namedParams [ ])

let t = trajectory (whist.[1].[0])
let tx, ty = t |> Seq.toArray |> Array.map (fun v -> v.[0] |> float32 |> float, v.[1] |> float32 |> float) |> Array.unzip

namedParams[
    "x", box tx
    "y", box ty
    "pch", box 1
    "xlab", box ""
    "ylab", box ""
    "col", box "darkblue"
    "type", box "l"
    "lty", box 4
    "width", box 700
    "height", box 500
    ]
|> R.lines |> ignore


(**
<div class="row">
    <div class="span6 text-center">
        <img src="img/Optimization-3.png" alt="Chart" style="width:500px;"/>
    </div>
</div><br/>

Optimization parameters
-----------------------
As another example, let's optimize the Beale function

$$$
   f(\mathbf{x}) = (1.5 - x_1 + x_1 x_2)^2 + (2.25 - x_1 + x_1 x_2^2)^2 + (2.625 - x_1 + x_1 x_2^3)^2

starting from $\mathbf{x} = (1, 1.5)$, using RMSProp. The optimum is at $(3, 0.5)$.
*)

let beale (x:DV) = (1.5f - x.[0] + (x.[0] * x.[1])) ** 2.f
                    + (2.25f - x.[0] + x.[0] * x.[1] ** 2.f) ** 2.f
                    + (2.625f - x.[0] + x.[0] * x.[1] ** 3.f) ** 2.f

let wopt, lopt, whist, lhist = Optimize.Minimize(beale, toDV [1.; 1.5], 
                                                    {Params.Default with 
                                                        Epochs = 3000; 
                                                        LearningRate = RMSProp (D 0.01f, D 0.9f)})

(**
<pre>
[12/11/2015 01:22:59] --- Minimization started
[12/11/2015 01:22:59] Parameters     : 2
[12/11/2015 01:22:59] Iterations     : 3000
[12/11/2015 01:22:59] Valid. interval: 10
[12/11/2015 01:22:59] Method         : Gradient descent
[12/11/2015 01:22:59] Learning rate  : RMSProp a0 = D 0.00999999978f, k = D 0.899999976f
[12/11/2015 01:22:59] Momentum       : None
[12/11/2015 01:22:59] Gradient clip. : None
[12/11/2015 01:22:59] Early stopping : None
[12/11/2015 01:22:59] Improv. thresh.: D 0.995000005f
[12/11/2015 01:22:59] Return best    : true
[12/11/2015 01:22:59]    1/3000 | D  4.125000e+001 [- ]
[12/11/2015 01:22:59]   11/3000 | D  2.655878e+001 [↓▼]
[12/11/2015 01:22:59]   21/3000 | D  2.154373e+001 [↓▼]
[12/11/2015 01:22:59]   31/3000 | D  1.841705e+001 [↓▼]
[12/11/2015 01:22:59]   41/3000 | D  1.624916e+001 [↓▼]
[12/11/2015 01:22:59]   51/3000 | D  1.465973e+001 [↓▼]
[12/11/2015 01:22:59]   61/3000 | D  1.334291e+001 [↓▼]
...
[12/11/2015 01:22:59] 2921/3000 | D  9.084024e-004 [- ]
[12/11/2015 01:22:59] 2931/3000 | D  9.084024e-004 [- ]
[12/11/2015 01:22:59] 2941/3000 | D  9.084024e-004 [- ]
[12/11/2015 01:22:59] 2951/3000 | D  9.084024e-004 [- ]
[12/11/2015 01:22:59] 2961/3000 | D  9.084024e-004 [- ]
[12/11/2015 01:22:59] 2971/3000 | D  9.084024e-004 [- ]
[12/11/2015 01:22:59] 2981/3000 | D  9.084024e-004 [- ]
[12/11/2015 01:22:59] 2991/3000 | D  9.084024e-004 [- ]
[12/11/2015 01:22:59] Duration       : 00:00:00.3142646
[12/11/2015 01:22:59] Value initial  : D  4.125000e+001
[12/11/2015 01:22:59] Value final    : D  8.948371e-004 (Best)
[12/11/2015 01:22:59] Value change   : D -4.124910e+001 (-100.00 %)
[12/11/2015 01:22:59] Value chg. / s : D -1.312560e+002
[12/11/2015 01:22:59] Iter. / s      : 9546.09587
[12/11/2015 01:22:59] Iter. / min    : 572765.7522
[12/11/2015 01:22:59] --- Minimization finished

val wopt : DV = DV [|2.99909306f; 0.50039643f|]
</pre>
*)

(*** hide ***)
open RProvider
open RProvider.graphics
open RProvider.grDevices

R.plot_new (namedParams [ ])

let ll = lhist |> Array.map (float32>>float)

namedParams[
    "x", box ll
    "pch", box 19
    "col", box "darkblue"
    "type", box "o"
    "xlab", box "Iteration"
    "ylab", box "Function value"
    "width", box 700
    "height", box 500
    ]
|> R.plot|> ignore

(**
<div class="row">
    <div class="span6 text-center">
        <img src="img/Optimization-1.png" alt="Chart" style="width:500px;"/>
    </div>
</div><br/>

*)

(*** hide ***)

let contourplot3d (f:DV->D) (xmin, xmax) (ymin, ymax) =
    let res = 100
    let xstep = ((xmax - xmin) / float res)
    let ystep = ((ymax - ymin) / float res)
    let x = [|xmin .. xstep .. xmax|]
    let y = [|ymin .. ystep .. ymax|]
    let z = Array2D.init x.Length y.Length (fun i j -> f (toDV [x.[i]; y.[j]])) |> Array2D.map (float32>>float)
    namedParams [
        "x", box x
        "y", box y
        "z", box z
        "labels", box ""
        "levels", box [|0..5..200|]]
    |> R.contour

contourplot3d beale (-4.5,4.5) (-4.5,4.5) 

let xx, yy = whist |> Array.map (fun v -> v.[0] |> float32 |> float, v.[1] |> float32 |> float) |> Array.unzip
namedParams[
    "x", box xx
    "y", box yy
    "col", box "blue"]
|> R.lines

namedParams[
    "x", box (xx |>Array.last)
    "y", box (yy |> Array.last)
    "pch", box 16
    "col", box "blue"]
|> R.points

(**
<div class="row">
    <div class="span6 text-center">
        <img src="img/Optimization-2.png" alt="Chart" style="width:500px;"/>
    </div>
</div><br/>

Each instantiation of gradient-based optimization is controlled through a collection of parameters, using the **Hype.Params** type.

If you do not supply any parameters to optimization, the default parameter set **Params.Default** is used. The default parameters look like this:

*)
module Params =
     let Default = {Epochs = 100
                    LearningRate = LearningRate.DefaultRMSProp
                    Momentum = NoMomentum
                    Loss = L2Loss
                    Regularization = Regularization.DefaultL2Reg
                    GradientClipping = NoClip
                    Method = GD
                    Batch = Full
                    EarlyStopping = NoEarly
                    ImprovementThreshold = D 0.995f
                    Silent = false
                    ReturnBest = true
                    ValidationInterval = 10
                    LoggingFunction = fun _ _ _ -> ()}

(**
If you want to change only a specific element of the parameter type, you can do so by extending the **Params.Default** value and overwriting only the parts you need to change, such as this:
*)

let p = {Params.Default with
            Epochs = 5000
            LearningRate = LearningRate.AdaGrad (D 0.001f)
            Momentum = Nesterov (D 0.9f)}

(**
### Optimization method
*)

type Method =
    | GD          // Gradient descent
    | CG          // Conjugate gradient
    | CD          // Conjugate descent
    | NonlinearCG // Nonlinear conjugate gradient
    | DaiYuanCG   // Dai & Yuan conjugate gradient
    | NewtonCG    // Newton conjugate gradient
    | Newton      // Exact Newton

(**
### Learning rate
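The **Decay** and **ExpDecay** variants below follow $a = a_0 / (1 + kt)$ and $a = a_0 e^{-kt}$ respectively. As a quick numerical check of the two formulas (a Python sketch, not part of Hype):

```python
import math

# Decay:    a = a0 / (1 + k t)
# ExpDecay: a = a0 * exp(-k t)
a0, k = 1.0, 0.1
decay     = [a0 / (1.0 + k * t) for t in range(5)]
exp_decay = [a0 * math.exp(-k * t) for t in range(5)]
```

Both start at $a_0$; the $1/t$ schedule decays more slowly than the exponential one for large $t$.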
*)

type LearningRate =
    | Constant    of D         // Constant
    | Decay       of D * D     // 1 / t decay, a = a0 / (1 + kt). Initial value, decay rate
    | ExpDecay    of D * D     // Exponential decay, a = a0 * Exp(-kt). Initial value, decay rate
    | Schedule    of DV        // Scheduled learning rate vector, its length overrides Params.Epochs
    | Backtrack   of D * D * D // Backtracking line search. Initial value, c, rho
    | StrongWolfe of D * D * D // Strong Wolfe line search. lmax, c1, c2
    | AdaGrad     of D         // Adagrad. Initial value
    | RMSProp     of D * D     // RMSProp. Initial value, decay rate
    static member DefaultConstant    = Constant (D 0.001f)
    static member DefaultDecay       = Decay (D 1.f, D 0.1f)
    static member DefaultExpDecay    = ExpDecay (D 1.f, D 0.1f)
    static member DefaultBacktrack   = Backtrack (D 1.f, D 0.0001f, D 0.5f)
    static member DefaultStrongWolfe = StrongWolfe (D 1.f, D 0.0001f, D 0.5f)
    static member DefaultAdaGrad     = AdaGrad (D 0.001f)
    static member DefaultRMSProp     = RMSProp (D 0.001f, D 0.9f)

(**
### Momentum
*)

type Momentum =
    | Momentum of D // Default momentum
    | Nesterov of D // Nesterov momentum
    | NoMomentum
    static member DefaultMomentum = Momentum (D 0.9f)
    static member DefaultNesterov = Nesterov (D 0.9f)

(**
### Gradient clipping
*)

type GradientClipping =
    | NormClip of D // Norm clipping
    | NoClip
    static member DefaultNormClip = NormClip (D 1.f)
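These options are combined in the same way, by overriding the corresponding fields of **Params.Default**. A hedged sketch using the default values defined above (the `GradientClipping` field name is an assumption based on the type above and the "Gradient clip." line in the training logs):

```fsharp
// A hypothetical combination: Nesterov momentum with norm-based
// gradient clipping, both at their default values.
let pClip = {Params.Default with
                Momentum = Momentum.DefaultNesterov
                GradientClipping = GradientClipping.DefaultNormClip}
```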

(**

Finally, looking at the [API reference](reference/index.html) and the [source code](https://github.com/hypelib/Hype/blob/master/src/Hype/Optimize.fs) of the optimization module can give you a better idea of the optimization algorithms currently implemented.
*)

================================================
FILE: docs/input/RecurrentNets.fsx
================================================
(*** hide ***)
#r "../../src/Hype/bin/Release/netstandard2.0/DiffSharp.dll"
#r "../../src/Hype/bin/Release/netstandard2.0/Hype.dll"
#I "../../packages/R.NET.Community/lib/net40/"
#I "../../packages/R.NET.Community.FSharp/lib/net40/"
#I "../../packages/RProvider"
#load "RProvider.fsx"
fsi.ShowDeclarationValues <- false

(**
Recurrent neural networks
=========================

In this example we build a recurrent neural network (RNN) for a language modeling task and train it on a short passage of text for a quick demonstration. Hype currently has three RNN models implemented as **Hype.Neural** layers, which can be combined freely with the other layer types explained on the [neural networks](feedforwardnets.html) page. **Hype.Neural.Recurrent** implements the "vanilla" RNN layer, **Hype.Neural.LSTM** implements the LSTM layer, and **Hype.Neural.GRU** implements the gated recurrent unit (GRU) layer.

### Language modeling

RNNs are well suited for constructing [language models,](https://en.wikipedia.org/wiki/Language_model) where we need to predict the probability of a word (or token) given the history of the tokens that came before it. Here, we will use an LSTM-based RNN to construct a word-level language model from a short passage of text, for a basic demonstration of usage. This model can be scaled to larger problems. State-of-the-art models of this type can require considerable computing resources and training time.

The text is from the beginning of Virgil's Aeneid, Book I.
*)

let text = "I sing of arms and the man, he who, exiled by fate, first came from the coast of Troy to Italy, and to Lavinian shores – hurled about endlessly by land and sea, by the will of the gods, by cruel Juno’s remorseless anger, long suffering also in war, until he founded a city and brought his gods to Latium: from that the Latin people came, the lords of Alba Longa, the walls of noble Rome. Muse, tell me the cause: how was she offended in her divinity, how was she grieved, the Queen of Heaven, to drive a man, noted for virtue, to endure such dangers, to face so many trials? Can there be such anger in the minds of the gods?"

(**
Hype provides a simple **Hype.NLP.Language** type for tokenizing text. You can look at the [API reference](reference/index.html) and the [source code](https://github.com/hypelib/Hype/blob/master/src/Hype/NLP.fs) for a better understanding of its usage.
*)

open Hype
open Hype.Neural
open Hype.NLP
open DiffSharp.AD.Float32
open DiffSharp.Util

let lang = Language(text)

lang.Tokens |> printfn "%A"
lang.Length |> printfn "%A"

(**
These are the tokens extracted from the text, including some of the punctuation marks. When sampling from the RNN language model, we will use the "." token to signal the end of a sentence. The punctuation marks are configurable when constructing the **Language** instance; if none are provided, a default set is used.

<pre>
[|","; "."; ":"; "?"; "Alba"; "Can"; "Heaven"; "I"; "Italy"; "Juno’s"; "Latin";
  "Latium"; "Lavinian"; "Longa"; "Muse"; "Queen"; "Rome"; "Troy"; "a"; "about";
  "also"; "and"; "anger"; "arms"; "be"; "brought"; "by"; "came"; "cause"; "city";
  "coast"; "cruel"; "dangers"; "divinity"; "drive"; "endlessly"; "endure";
  "exiled"; "face"; "fate"; "first"; "for"; "founded"; "from"; "gods"; "grieved";
  "he"; "her"; "his"; "how"; "hurled"; "in"; "land"; "long"; "lords"; "man";
  "many"; "me"; "minds"; "noble"; "noted"; "of"; "offended"; "people";
  "remorseless"; "sea"; "she"; "shores"; "sing"; "so"; "such"; "suffering";
  "tell"; "that"; "the"; "there"; "to"; "trials"; "until"; "virtue"; "walls";
  "war"; "was"; "who"; "will"; "–"|]
  
  86
</pre>
There are 86 tokens in this language instance.

Now let's transform the full text into a dataset, using the **Language** instance holding these tokens. The text will be encoded as a matrix where each column represents a word as a _one-hot_ vector.
*)

let text' = lang.EncodeOneHot(text)
text'.Visualize() |> printfn "%s"

(**
<pre>
DM : 86 x 145
</pre>

Out of these 145 words, we will construct a dataset where the inputs are the first 144 words and the target outputs are the 144 words starting with a one word shift. This means that, for each word, we want the output (the prediction) to be the following word in our text passage.
*)
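To make the one-word shift concrete, here is the same idea on a plain token array (a standalone sketch, independent of Hype):

```fsharp
let tokens  = [|"I"; "sing"; "of"; "arms"|]
let inputs  = tokens.[0 .. tokens.Length - 2]  // [|"I"; "sing"; "of"|]
let targets = tokens.[1 .. tokens.Length - 1]  // [|"sing"; "of"; "arms"|]
// For each input token, the target is the token that follows it.
```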

let data = Dataset(text'.[*, 0..(text'.Cols - 2)],
                   text'.[*, 1..(text'.Cols - 1)])

(**
<pre>
val data : Dataset = Hype.Dataset
   X: 86 x 144
   Y: 86 x 144
</pre>

RNNs, and especially the LSTM variant we will use, can make predictions that take long-term dependencies and contextual information into account. Trained on a large enough text corpus, and given enough network capacity, RNN language models can learn complex grammatical relations.

For our quick demonstration, we use a linear word embedding layer of 20 units, an LSTM of 100 units and a final linear layer of 86 units (the size of our vocabulary) followed by **softmax** activation.
*)

let dim = lang.Length // Vocabulary size, here 86

let n = FeedForward()
n.Add(Linear(dim, 20))
n.Add(LSTM(20, 100))
n.Add(Linear(100, dim))
n.Add(DM.mapCols softmax)

(**
You can also easily stack multiple RNNs on top of each other.
*)

let n = FeedForward()
n.Add(Linear(dim, 20))
n.Add(LSTM(20, 100))
n.Add(LSTM(100, 100))
n.Add(Linear(100, dim))
n.Add(DM.mapCols softmax)

(**
We will observe the performance of our RNN during training by sampling random sentences from the language model.

Remember that the final output of the network, through the softmax activation, is a vector of word probabilities. When we are sampling, we start with a word, supply this to the network, and use the resulting probabilities at the output to sample from the vocabulary where words with higher probability are more likely to be selected. We then continue by giving the network the last sampled word and repeating this until we hit an "end of sentence" token (we use "." here) or reach a limit of maximum sentence length.

This is how we would sample a sentence starting with a specific word.
*)

n.Reset()
for i = 0 to 5 do
    lang.Sample(n.Run, "I", [|"."|], 30) // Use "." as the stop token, limit maximum sentence length to 30.
    |> printfn "%s"
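Internally, each sampling step draws a token index from the categorical distribution given by the softmax output. A minimal standalone sketch of such a draw (a hypothetical helper, not Hype's actual implementation):

```fsharp
// Draw an index from a probability vector by inverting its CDF:
// pick the first index whose cumulative probability exceeds a
// uniform random number r. Assumes probs sums to one.
let sampleIndex (rnd:System.Random) (probs:float32[]) =
    let r = float32 (rnd.NextDouble())
    let mutable acc = 0.0f
    let mutable i = 0
    while i < probs.Length - 1 && acc + probs.[i] < r do
        acc <- acc + probs.[i]
        i <- i + 1
    i
```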

(**

Because the model is not trained, we get sequences of random words from the vocabulary.

<pre>
I be: she dangers Latium endlessly gods remorseless divinity tell and his offended lords trials? about war trials and anger shores so anger Alba a Alba sing her
I? came exiled – suffering shores anger came Latium people sing sing remorseless who brought war walls endlessly anger me founded his.
I – will long of in offended cruel until Queen Italy who anger lords Queen in Longa Muse who people about suffering Italy also grieved cruel hurled who me about
I endlessly city first by face, a Heaven me hurled sea such long noted she noted many sea city anger I noted remorseless cause Queen to remorseless Italy coast
I sea noted noble me minds long sing cause people in walls Italy by Longa first, for grieved sea many walls Troy came was endlessly of in Latium Latium
I and Latin of many suffering Alba Latium war.
</pre>

We set up a training cycle that runs one epoch of training followed by sampling one sentence starting with the word "I". In each epoch, we run through the whole training dataset. With a larger training corpus, we could also run the training with minibatches by stating this in the parameter set (commented out below).

Like the sample sentences above, at the beginning of training, we see mostly random orderings of words. As the training progresses, the cross-entropy loss for our dataset is decreasing and the sentences start exhibiting meaningful word patterns.
*)

for i = 0 to 1000 do
    let par = {Params.Default with
                //Batch = Minibatch 10
                LearningRate = LearningRate.RMSProp(D 0.01f, D 0.9f)
                Loss = CrossEntropyOnSoftmax
                Epochs = 1
                Silent = true       // Suppress the regular printing of training progress
                ReturnBest = false} 
    let loss, _ = Layer.Train(n, data, par)
    printfn "Epoch: %*i | Loss: %O | Sample: %s" 3 i loss (lang.Sample(n.Run, "I", [|"."|], 30))

(**

Here is a selection of sentences demonstrating the progress of training.

<pre>
Epoch:   0 | Loss: D  4.478101e+000 | Sample: I Queen drive she Alba endlessly Queen the by how tell his from grieved war her there drive people – lords coast he.
Epoch:  10 | Loss: D  4.102071e+000 | Sample: I people to,, Rome how the he of – sing fate, Muse, by,, Muse the of man Queen Latin and in her cause:
Epoch:  30 | Loss: D  3.438288e+000 | Sample: I walls long to first dangers she her, to founded to virtue sea first Can dangers a founded about Can Queen lords from sea by remorseless founded endlessly Latium
Epoch:  40 | Loss: D  2.007577e+000 | Sample: I Alba gods Alba Rome, the walls Alba Muse Rome anger me the the of the gods to who man me first founded offended endlessly until also grieved long
Epoch:  50 | Loss: D  9.753818e-001 | Sample: I sing people cruel: me the of Rome.
Epoch:  60 | Loss: D  3.944587e-001 | Sample: I sing sing Troy to so hurled endlessly by land sea, by to – hurled about by the of arms, by Juno’s such anger long also in her
Epoch:  70 | Loss: D  2.131431e-001 | Sample: I sing of and the of Longa, by Juno’s anger was in her of Heaven, to a city brought his gods to a gods to Lavinian hurled to
Epoch:  80 | Loss: D  1.895453e-001 | Sample: I sing, by will the of Rome.
Epoch:  90 | Loss: D  1.799535e-001 | Sample: I sing? there Muse the of the of the of arms by the: how she offended in the of? a, he shores hurled by land to
Epoch: 100 | Loss: D  1.733837e-001 | Sample: I sing arms the of Alba gods who, by Juno’s Rome such anger the of the of arms and, by, by from the coast Rome.
Epoch: 110 | Loss: D  1.682917e-001 | Sample: I sing Troy by, by from the of arms and, by, by from came, by Juno’s anger long in the of the of arms cruel Muse
Epoch: 120 | Loss: D  1.639529e-001 | Sample: I sing arms the of Rome.
Epoch: 130 | Loss: D  1.600647e-001 | Sample: I sing arms and, by Juno’s remorseless there and the of the of arms and, by Alba coast Troy to a – his gods by of the of
Epoch: 140 | Loss: D  1.564835e-001 | Sample: I sing arms by the of Rome.
Epoch: 150 | Loss: D  1.531392e-001 | Sample: I sing arms cruel, exiled by coast, he a city in the of the of arms.
Epoch: 160 | Loss: D  1.499920e-001 | Sample: I sing arms cruel man, by the trials arms to shores hurled endlessly by the of gods Italy, me the of Rome.
Epoch: 200 | Loss: D  1.390327e-001 | Sample: I sing arms and, by Juno’s such of the of the of arms Italy, by from the sing arms walls of the of Rome.
Epoch: 230 | Loss: D  1.322940e-001 | Sample: I sing arms the man he, tell from the of arms Italy, by fate, by the of Troy Italy, by fate first from the of the
Epoch: 260 | Loss: D  1.264137e-001 | Sample: I sing brought Muse Muse the of Heaven, by shores remorseless there he in the of arms cruel, by fate, he from the gods to Italy,
Epoch: 420 | Loss: D  1.131158e-001 | Sample: I sing of arms the of Heaven, by Juno’s remorseless hurled such in the of arms.
Epoch: 680 | Loss: D  9.938217e-002 | Sample: I of arms the man he, exiled fate, he virtue, to a? Can be such in the of the of of the of arms.
Epoch: 923 | Loss: D  9.283429e-002 | Sample: I sing of arms and the man he, by fate came from the of to Italy, by the, by Juno’s anger of Rome.
</pre>
*)


================================================
FILE: docs/input/Regression.fsx
================================================
(*** hide ***)
#r "../../src/Hype/bin/Release/netstandard2.0/DiffSharp.dll"
#r "../../src/Hype/bin/Release/netstandard2.0/Hype.dll"
#I "../../packages/R.NET.Community/lib/net40/"
#I "../../packages/R.NET.Community.FSharp/lib/net40/"
#I "../../packages/RProvider"
#load "RProvider.fsx"
fsi.ShowDeclarationValues <- true

(**
Regression
==========

In this example we implement a binary classifier based on logistic regression and train it to distinguish between the [MNIST](http://yann.lecun.com/exdb/mnist/) digits 0 and 1.

### Loading the data

First, let's start by loading the MNIST training and testing data and arranging these into training, validation, and testing sets.
*)

open Hype
open Hype.Neural
open DiffSharp.AD.Float32
open DiffSharp.Util

let MNIST = Dataset(Util.LoadMNISTPixels("C:/datasets/MNIST/train-images.idx3-ubyte", 60000),
                    Util.LoadMNISTLabels("C:/datasets/MNIST/train-labels.idx1-ubyte", 60000) |> toDV |> DM.ofDV 1).NormalizeX()



let MNISTtrain = MNIST.[..58999]
let MNISTvalid = MNIST.[59000..]

let MNISTtest = Dataset(Util.LoadMNISTPixels("C:/datasets/MNIST/t10k-images.idx3-ubyte", 10000),
                        Util.LoadMNISTLabels("C:/datasets/MNIST/t10k-labels.idx1-ubyte", 10000) |> toDV |> DM.ofDV 1).NormalizeX()

(**
We shuffle the columns of the datasets and filter them to only keep the digits of 0 and 1.
*)

let MNISTtrain01 = MNISTtrain.Shuffle().Filter(fun (x, y) -> y.[0] <= D 1.f)
let MNISTvalid01 = MNISTvalid.Shuffle().Filter(fun (x, y) -> y.[0] <= D 1.f)
let MNISTtest01 = MNISTtest.Shuffle().Filter(fun (x, y) -> y.[0] <= D 1.f)

(**
<pre>
val MNISTtrain01 : Dataset = Hype.Dataset
   X: 784 x 12465
   Y: 1 x 12465
val MNISTvalid01 : Dataset = Hype.Dataset
   X: 784 x 200
   Y: 1 x 200
val MNISTtest01 : Dataset = Hype.Dataset
   X: 784 x 2115
   Y: 1 x 2115
</pre>

We can visualize individual digits from the dataset.
*)

MNISTtrain.X.[*,9] |> DV.visualizeAsDM 28 |> printfn "%s"
MNISTtrain.Y.[*,9]

(**
    [lang=cs]
    DM : 28 x 28
                            
                            
                            
                            
                          ♦♦    
                         ▪█▪    
                        ▴██·    
                        ♦█♦     
                 ●     ·█■      
                ■█     ■█·      
                ♦█     ██·      
               ▴█■    ●█♦       
               ■█    ▪█■        
              ■█▪   ▴██-        
            -███♦▴  ♦█▪         
           ·███■██♦■█■          
          ·██■  ♦█████■         
          ♦■-    ♦████▪         
          -      ██·            
                ▪█■             
               ▴█■              
               ■█▴              
              ■█▪               
              ▴█·               
                            
                            
                            
                            

    val it : DV = DV [|4.0f|]

We can also visualize a series of digits in a grid layout.
*)

MNISTtrain.[..5].VisualizeXColsAsImageGrid(28) |> printfn "%s"

(**
    [lang=cs]
    Hype.Dataset
       X: 784 x 6
       Y: 1 x 6
    X's columns reshaped to (28 x 28), presented in a (2 x 3) grid:
    DM : 56 x 84
                                                                                    
                                                                                    
                                                                                    
                                                                                    
                                                ▪█▪                                     
                    ▴▴● ●██▴                   █████                             ■      
              -▪●█████■●██♦                   ■███■█             ·              ▴●      
            ██████████-··                    ■███♦·██▴          ▴●              ▪♦      
            ■█████♦●██                     ●██████-♦█●          ■●              █▪      
            ·▪-██♦   ▪                     ███♦-█■ ·█●          ■●             ●█▴      
               ▪█·                        ███● ·▴   ██          █●             ♦█       
               ▴█♦                       ●█■♦·      ██●        ▴█●             ■█       
                ♦█·                     ●██·        ██♦        ▪█▴            ●█■       
                 █■▪-                   ██          ██♦        ▪█           ·●██·       
                 ·███▴                 ♦█♦          ██♦        ▪█·     ▴▪▪███●██        
                   ●██▪               ·██-          ██▪        ▪██♦♦♦████♦▪·  ■█        
                    -██♦              ·█■          ▴█●          ▴●●●●●-      -█■        
                     ███              ·█■         ▴█■·                       ●█▴        
                   ▴●██♦              ·█▪        ●█●                         ●█         
                 ▪■████●              ·█■      -██▪                          ●█         
               -■████♦·               ·██▪  ·●■█■●                           ●█-        
              ■████♦·                 ·███■■███♦▴                            ●█-        
           ●■████♦·                    ♦█████■▪                              ●█▪        
         ●■█████▴                       ▴███▪                                ●█▪        
        ▴███■▴▴                                                              -█▪        
                                                                                    
                                                                                    
                                                                                    
                                                                                    
                                                                                    
                                                                                    
                                                                                    
                                                                                    
                      ▴██                                                    -▴         
                     -███                                                 ▪♦███▪        
                     ▴███                    ▪♦██-·▪                    ▪███■■█■        
                     ██■                   ·■██♦♦███●                 ▪████■  ██        
                    ■██-                   ██♦   ●██▴                -████■   ██        
                   ▪██♦                  -██●   ·██■                 ●██■●   ·██        
                   ███                  ▴██▪    ■██·                  ▴      -██        
                  ♦██▴                 ▴██●    ·██▴                          ▪██        
                 -██●                  ■█●    ♦██●                       -▴▴▴♦█♦        
                ·██♦                   ██  ▴♦████·                      ●█████■         
                ███▪                   ■█████■■█■                     ■██■●███■▴        
               ▪███                     ██■▴  ♦█▪                   ·██▴   ♦████·       
               ■██●                           ██-                  ▴██●   ♦██▴●██●      
              ███♦                           ·██                  ▴██-   ♦█■   ·●███■   
              ███·                            ██                 -██· ·●██▴      ·●●    
             ▪███                            ·██                 ■██♦■███▴              
             ■██▪                            -██                 ♦████●▴                
             ██■                              ██                  -▪▴                   
             ██■                              ■█                                        
             ♦█■                              -█♦                                       
                                               ●█●                                      
                                                ▪█                                      
                                                                                    
*)

MNISTtrain01.[..5].VisualizeXColsAsImageGrid(28) |> printfn "%s"

(**
    [lang=cs]
    Hype.Dataset
       X: 784 x 6
       Y: 1 x 6
    X's columns reshaped to (28 x 28), presented in a (2 x 3) grid:
    DM : 56 x 84
                                                                                    
                                                                                    
                                                                                    
                                                                                    
                     ▴●███-                      ·♦██                                   
                  ·▪■█████■                      ████●                   -▪██████       
                 ▪████████-                     ●████■                 ▪██████████      
               ●████▪ ●███▴                     ■████●                ■███■▪▪▴-███●     
              ████●   ▴███·                    ██████               -███■     ██████    
             ███▪      ██▴                    -██████               ■██▴      ████▴♦-   
            ■██·       ●█●                   ▪█████■               ■██        ■█▴█■     
           ●██·        ▴█●                   ██████               ▪██●         - ■█-    
          -██▪         ▴█●                   ██████               ██■            ▪█▴    
          ██■          ▴█●                  ■█████▴              ▴██-            ■█▴    
         ·██           ▴█●                 ██████-               ▴██             ██▴    
         ■█■           ▴█●                 █████▴                ▴█■             ██     
         ■█▪           ▴█●                ♦████-                 ▴█▪            ▴█▪     
         ██-           ●█●               ♦█████                  ▴█▪            ●█·     
         ██▴           ██·               █████♦                  -██           -█▪      
         ███          ●██·              ●█████·                   ██■           █-      
         -███●       ▴██●              ·█████                     ███          ■■       
           ■█████♦●●■███■              ♦█████                      ██-       -■█·       
            ▴■█████████♦               █████♦                      ●██●    -■██▪        
               ·▪▪●█♦▪▴                ▴████·                       ●████████■-         
                                                                     ▪█████▪-           
                                                                                    
                                                                                    
                                                                                    
                                                                                    
                                                                                    
                                                                                    
                                                                                    
                                                ■▴                         ·····        
                                                ▴■                       ▪■█████·       
               ♦♦♦♦♦-·                          ■█                     ▴█████████-      
             ▪████████■▪                        ██                     ███████████·     
             ■███████████■                     ▪█■                   ▴■████♦♦ ▴████     
            █████████  ♦███                    ██-                  ▴████♦▴    ■███·    
          ▪███●     ●   ·██■                   █●                  ▴████■      ♦███-    
          ██■            -██·                 ▪█▴                  ████▪        ████    
         ▪██              ██♦                 ██                   ███-         ████    
         ●█♦              ▪██                ▪██                 ·███·          ████    
         ██▴              ▴██                ●██                ·███♦          ▪████    
         ██               ♦█▪                ██●                ·███          ▪████·    
         ██-              ██                 ██·                ●███        ·●█████·    
         ♦█●             ■██                ·██-                ████·      ■██████▴     
         ▪█■            ■██-                 █♦                 ████■▴▴▴■████████▪      
         -██●         ▴███♦                 -█♦                 ■██████████████♦        
          ♦██♦-▪    ▪♦██■-                  ■█♦                 ·████████████▪▴         
           ♦███████████●                    ■█-                   -█████████▴           
            ▴■█■■■■■■·                      █■                       ▪ ▪▪               
                                            ♦▴                                          
                                                                                    
                                                                                    
                                                                                    
                                                                                    
### Defining the model

Let's now create our logistic regression model. We implement this using the **Hype.Neural** module, as a linear layer with $28 \times 28 = 784$ inputs and one output. The output of the layer is passed through the sigmoid function.

*)

let n = Neural.FeedForward()
n.Add(Linear(28 * 28, 1))
n.Add(sigmoid)
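Together, the two layers compute the logistic regression function $y = \sigma(\mathbf{w} \cdot \mathbf{x} + b)$. The same computation in scalar F# terms (a standalone illustrative sketch, not Hype code):

```fsharp
// Logistic regression on raw float32 values, for illustration only:
// a dot product plus bias, squashed through the sigmoid.
let sigmoidScalar z = 1.0f / (1.0f + exp (-z))
let predict (w:float32[]) (b:float32) (x:float32[]) =
    sigmoidScalar ((Array.map2 (*) w x |> Array.sum) + b)
```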

(**

We can visualize the initial state of the linear model weights before training. For information about weight initialization parameters, please see the [neural networks example](feedforwardnets.html).

*)

let l = (n.[0] :?> Linear)
l.VisualizeWRowsAsImageGrid(28) |> printfn "%s"

(** 

    [lang=cs]
    Hype.Neural.Linear
       784 -> 1
       Learnable parameters: 785
       Init: Standard
       W's rows reshaped to (28 x 28), presented in a (1 x 1) grid:
    DM : 28 x 28
     ▴▪●●-█▴♦♦● ·▴█● ● ▴· ●●●●▪·
    ■ █- ▴●●▪ ■♦· ■▪■▪   █  ♦■●■
    ♦■ █♦●▪●♦  ♦■   ♦     ■ ▪- ■
     ■▪ ■♦■♦ █ ▪● ♦▪▴··■█ -▴●▪▪●
    ██··▴●●█▪♦■ -·█■ ▪- ··▪·  ██
    - ▪   ♦ ▪●  ▪■█♦- ▴▪ ▴·  ▪·●
    -   ●●▴▴ ▪■ ▴█ ▪▴·▴▴·♦■■♦·■■
    ♦▴ ▪■ ▪▪▴■·■--▪♦-   ·♦▪■ ♦·●
     ·▴·♦▪♦●▪··▴·▪ ● ▪ █  ▴▪·♦▪ 
    ■ ▴ ♦█▴ -  ♦●■  █▪■●▪█■▴●--█
    ♦■   ●■▴♦ ●· █· ▴· -█-▪●■■-■
     █-·▪▴-▴█ ♦ █●·♦█▪▪●●■ -   ·
     -   █ ■♦·●▪▴♦ -▴ -  ■♦· ♦ -
    ■█ ▪-  ▪■●♦█▴-█▪■  ■♦▪█■▪■ -
    ●♦█▴♦♦ ♦   ▴▪▴▴♦-▴♦♦█ ▴ ▪·● 
     ·█▪■■█ ●· ●· -●■●··  ▴  --▴
    ·♦█▴ ♦♦■ ▴▪●▪-  · -♦●♦ ■ · ■
    ■■▪---♦■·●▴▪-▪▴· ▪●● ·♦■ ▪♦▴
    ▴ -♦●■█·█   ● ♦▪●■- ·■♦-▪▴■▴
     ●-■● ···●█▴▪ -█·▪ ♦▴    ● ●
    ·█  █▴ ·♦---■▴·█■■▴ ▴■  -  █
    - ▪  ●█·▴♦▪    ■ ▪■ ■···   ▴
    ■ ♦♦- █▪♦-- ▴ ▴ ··█▴● ■♦    
    ■·■■▪▴-·█♦●■ ▴ ♦ ♦▴■♦  ■ ●♦▪
    ·█▪- ■●▴▪▴▪ ▪  ▴▪ ·   ▪▴▴··♦
      ▪█♦■   ·♦ ■▪ ♦ ▴·●█▪· ·▪▴ 
    · ■♦▪■ ▪■● ♦  ··· ·▪█■·  ▪■●
    ●▴▪ ·■● -█●█·▪■▴ ▴▴♦  ■  ■ ▴

       b:
    DV : 1
     

### Training

Let's train the model for 10 epochs (full passes through the training data), with a minibatch size of 100, using the training and validation sets we've defined. The validation set will make sure that we're not overfitting the model.
*)

let p = {Params.Default with 
            Epochs = 10; 
            Batch = Minibatch 100; 
            EarlyStopping = EarlyStopping.DefaultEarly}

n.Train(MNISTtrain01, MNISTvalid01, p)

(**
<pre>
[12/11/2015 20:21:12] --- Training started
[12/11/2015 20:21:12] Parameters     : 785
[12/11/2015 20:21:12] Iterations     : 1240
[12/11/2015 20:21:12] Epochs         : 10
[12/11/2015 20:21:12] Batches        : Minibatches of 100 (124 per epoch)
[12/11/2015 20:21:12] Training data  : 12465
[12/11/2015 20:21:12] Validation data: 200
[12/11/2015 20:21:12] Valid. interval: 10
[12/11/2015 20:21:12] Method         : Gradient descent
[12/11/2015 20:21:12] Learning rate  : RMSProp a0 = D 0.00100000005f, k = D 0.899999976f
[12/11/2015 20:21:12] Momentum       : None
[12/11/2015 20:21:12] Loss           : L2 norm
[12/11/2015 20:21:12] Regularizer    : L2 lambda = D 9.99999975e-05f
[12/11/2015 20:21:12] Gradient clip. : None
[12/11/2015 20:21:12] Early stopping : Stagnation thresh. = 750, overfit. thresh. = 10
[12/11/2015 20:21:12] Improv. thresh.: D 0.995000005f
[12/11/2015 20:21:12] Return best    : true
[12/11/2015 20:21:12]  1/10 | Batch   1/124 | D  4.748471e-001 [- ] | Valid D  4.866381e-001 [- ] | Stag:  0 Ovfit: 0
[12/11/2015 20:21:12]  1/10 | Batch  11/124 | D  2.772053e-001 [↓▼] | Valid D  3.013612e-001 [↓▼] | Stag:  0 Ovfit: 0
[12/11/2015 20:21:12]  1/10 | Batch  21/124 | D  2.178165e-001 [↓▼] | Valid D  2.304372e-001 [↓▼] | Stag:  0 Ovfit: 0
[12/11/2015 20:21:12]  1/10 | Batch  31/124 | D  2.009703e-001 [↓▼] | Valid D  1.799015e-001 [↓▼] | Stag:  0 Ovfit: 0
[12/11/2015 20:21:12]  1/10 | Batch  41/124 | D  1.352896e-001 [↓▼] | Valid D  1.405802e-001 [↓▼] | Stag:  0 Ovfit: 0
[12/11/2015 20:21:12]  1/10 | Batch  51/124 | D  1.182899e-001 [↓▼] | Valid D  1.108390e-001 [↓▼] | Stag:  0 Ovfit: 0
[12/11/2015 20:21:12]  1/10 | Batch  61/124 | D  1.124191e-001 [↓▼] | Valid D  8.995526e-002 [↓▼] | Stag:  0 Ovfit: 0
[12/11/2015 20:21:12]  1/10 | Batch  71/124 | D  8.975799e-002 [↓▼] | Valid D  7.361954e-002 [↓▼] | Stag:  0 Ovfit: 0
[12/11/2015 20:21:12]  1/10 | Batch  81/124 | D  5.031444e-002 [↓▼] | Valid D  5.941865e-002 [↓▼] | Stag:  0 Ovfit: 0
[12/11/2015 20:21:12]  1/10 | Batch  91/124 | D  5.063754e-002 [↑ ] | Valid D  4.927430e-002 [↓▼] | Stag:  0 Ovfit: 0
[12/11/2015 20:21:12]  1/10 | Batch 101/124 | D  3.842642e-002 [↓▼] | Valid D  4.095582e-002 [↓▼] | Stag:  0 Ovfit: 0
[12/11/2015 20:21:12]  1/10 | Batch 111/124 | D  4.326219e-002 [↑ ] | Valid D  3.452797e-002 [↓▼] | Stag:  0 Ovfit: 0
[12/11/2015 20:21:12]  1/10 | Batch 121/124 | D  2.585407e-002 [↓▼] | Valid D  2.788338e-002 [↓▼] | Stag:  0 Ovfit: 0
[12/11/2015 20:21:12]  2/10 | Batch   1/124 | D  3.069563e-002 [↑ ] | Valid D  2.663207e-002 [↓▼] | Stag:  0 Ovfit: 0
[12/11/2015 20:21:12]  2/10 | Batch  11/124 | D  1.765305e-002 [↓▼] | Valid D  2.332163e-002 [↓▼] | Stag:  0 Ovfit: 0
[12/11/2015 20:21:12]  2/10 | Batch  21/124 | D  2.314118e-002 [↑ ] | Valid D  1.902804e-002 [↓▼] | Stag:  0 Ovfit: 0
[12/11/2015 20:21:12]  2/10 | Batch  31/124 | D  3.177435e-002 [↑ ] | Valid D  1.691620e-002 [↓▼] | Stag:  0 Ovfit: 0
[12/11/2015 20:21:12]  2/10 | Batch  41/124 | D  2.219648e-002 [↓ ] | Valid D  1.455527e-002 [↓▼] | Stag:  0 Ovfit: 0
[12/11/2015 20:21:12]  2/10 | Batch  51/124 | D  1.205402e-002 [↓▼] | Valid D  1.240637e-002 [↓▼] | Stag:  0 Ovfit: 0
[12/11/2015 20:21:12]  2/10 | Batch  61/124 | D  3.891717e-002 [↑ ] | Valid D  1.189688e-002 [↓▼] | Stag:  0 Ovfit: 0
[12/11/2015 20:21:12]  2/10 | Batch  71/124 | D  2.114762e-002 [↓ ] | Valid D  1.083007e-002 [↓▼] | Stag:  0 Ovfit: 0
[12/11/2015 20:21:12]  2/10 | Batch  81/124 | D  5.075417e-003 [↓▼] | Valid D  9.630994e-003 [↓▼] | Stag:  0 Ovfit: 0
[12/11/2015 20:21:12]  2/10 | Batch  91/124 | D  1.343214e-002 [↑ ] | Valid D  8.666289e-003 [↓▼] | Stag:  0 Ovfit: 0
[12/11/2015 20:21:13]  2/10 | Batch 101/124 | D  6.054885e-003 [↓ ] | Valid D  8.039203e-003 [↓▼] | Stag:  0 Ovfit: 0
[12/11/2015 20:21:13]  2/10 | Batch 111/124 | D  1.964125e-002 [↑ ] | Valid D  7.339509e-003 [↓▼] | Stag:  0 Ovfit: 0
[12/11/2015 20:21:13]  2/10 | Batch 121/124 | D  4.401092e-003 [↓▼] | Valid D  6.376633e-003 [↓▼] | Stag:  0 Ovfit: 0
[12/11/2015 20:21:13]  3/10 | Batch   1/124 | D  7.068173e-003 [↑ ] | Valid D  6.426438e-003 [↑ ] | Stag: 10 Ovfit: 0
[12/11/2015 20:21:13]  3/10 | Batch  11/124 | D  3.763680e-003 [↓▼] | Valid D  6.076077e-003 [↓▼] | Stag:  0 Ovfit: 0
[12/11/2015 20:21:13]  3/10 | Batch  21/124 | D  9.855231e-003 [↑ ] | Valid D  5.091224e-003 [↓▼] | Stag:  0 Ovfit: 0
[12/11/2015 20:21:13]  3/10 | Batch  31/124 | D  1.263964e-002 [↑ ] | Valid D  4.641499e-003 [↓▼] | Stag:  0 Ovfit: 0
[12/11/2015 20:21:13]  3/10 | Batch  41/124 | D  1.205439e-002 [↓ ] | Valid D  4.599225e-003 [↓▼] | Stag:  0 Ovfit: 0
[12/11/2015 20:21:13]  3/10 | Batch  51/124 | D  2.941387e-003 [↓▼] | Valid D  4.381890e-003 [↓▼] | Stag:  0 Ovfit: 0
[12/11/2015 20:21:13]  3/10 | Batch  61/124 | D  2.546543e-002 [↑ ] | Valid D  4.439059e-003 [↑ ] | Stag: 10 Ovfit: 0
[12/11/2015 20:21:13]  3/10 | Batch  71/124 | D  9.878366e-003 [↓ ] | Valid D  4.358966e-003 [↓▼] | Stag:  0 Ovfit: 0
[12/11/2015 20:21:13]  3/10 | Batch  81/124 | D  1.868963e-003 [↓▼] | Valid D  3.960044e-003 [↓▼] | Stag:  0 Ovfit: 0
[12/11/2015 20:21:13]  3/10 | Batch  91/124 | D  7.171181e-003 [↑ ] | Valid D  3.634899e-003 [↓▼] | Stag:  0 Ovfit: 0
[12/11/2015 20:21:13]  3/10 | Batch 101/124 | D  2.681098e-003 [↓ ] | Valid D  3.636524e-003 [↑ ] | Stag: 10 Ovfit: 0
[12/11/2015 20:21:13]  3/10 | Batch 111/124 | D  1.502046e-002 [↑ ] | Valid D  3.393996e-003 [↓▼] | Stag:  0 Ovfit: 0
[12/11/2015 20:21:13]  3/10 | Batch 121/124 | D  2.381395e-003 [↓ ] | Valid D  3.178693e-003 [↓▼] | Stag:  0 Ovfit: 0
[12/11/2015 20:21:13]  4/10 | Batch   1/124 | D  3.185510e-003 [↑ ] | Valid D  3.240891e-003 [↑ ] | Stag: 10 Ovfit: 0
[12/11/2015 20:21:13]  4/10 | Batch  11/124 | D  2.029225e-003 [↓ ] | Valid D  3.163968e-003 [↓ ] | Stag: 20 Ovfit: 0
[12/11/2015 20:21:13]  4/10 | Batch  21/124 | D  6.450378e-003 [↑ ] | Valid D  2.772849e-003 [↓▼] | Stag:  0 Ovfit: 0
[12/11/2015 20:21:13]  4/10 | Batch  31/124 | D  7.448227e-003 [↑ ] | Valid D  2.572560e-003 [↓▼] | Stag:  0 Ovfit: 0
[12/11/2015 20:21:13]  4/10 | Batch  41/124 | D  9.700718e-003 [↑ ] | Valid D  2.693694e-003 [↑ ] | Stag: 10 Ovfit: 0
[12/11/2015 20:21:13]  4/10 | Batch  51/124 | D  1.799919e-003 [↓▼] | Valid D  2.737873e-003 [↑ ] | Stag: 20 Ovfit: 1
[12/11/2015 20:21:13]  4/10 | Batch  61/124 | D  1.919956e-002 [↑ ] | Valid D  2.778393e-003 [↑ ] | Stag: 30 Ovfit: 3
[12/11/2015 20:21:13]  4/10 | Batch  71/124 | D  5.462923e-003 [↓ ] | Valid D  2.870561e-003 [↑ ] | Stag: 40 Ovfit: 3
[12/11/2015 20:21:13]  4/10 | Batch  81/124 | D  1.455469e-003 [↓▼] | Valid D  2.632472e-003 [↓ ] | Stag: 50 Ovfit: 4
[12/11/2015 20:21:14]  4/10 | Batch  91/124 | D  5.270801e-003 [↑ ] | Valid D  2.455564e-003 [↓▼] | Stag:  0 Ovfit: 0
[12/11/2015 20:21:14]  4/10 | Batch 101/124 | D  2.057914e-003 [↓ ] | Valid D  2.511977e-003 [↑ ] | Stag: 10 Ovfit: 0
[12/11/2015 20:21:14]  4/10 | Batch 111/124 | D  1.314815e-002 [↑ ] | Valid D  2.393763e-003 [↓▼] | Stag:  0 Ovfit: 0
[12/11/2015 20:21:14]  4/10 | Batch 121/124 | D  2.033168e-003 [↓ ] | Valid D  2.358985e-003 [↓▼] | Stag:  0 Ovfit: 0
[12/11/2015 20:21:14]  5/10 | Batch   1/124 | D  2.199435e-003 [↑ ] | Valid D  2.389120e-003 [↑ ] | Stag: 10 Ovfit: 0
[12/11/2015 20:21:14]  5/10 | Batch  11/124 | D  1.668178e-003 [↓ ] | Valid D  2.356529e-003 [↓ ] | Stag: 20 Ovfit: 0
[12/11/2015 20:21:14]  5/10 | Batch  21/124 | D  5.649061e-003 [↑ ] | Valid D  2.151499e-003 [↓▼] | Stag:  0 Ovfit: 0
[12/11/2015 20:21:14]  5/10 | Batch  31/124 | D  5.264180e-003 [↓ ] | Valid D  2.038927e-003 [↓▼] | Stag:  0 Ovfit: 0
[12/11/2015 20:21:14]  5/10 | Batch  41/124 | D  8.416546e-003 [↑ ] | Valid D  2.145057e-003 [↑ ] | Stag: 10 Ovfit: 0
[12/11/2015 20:21:14]  5/10 | Batch  51/124 | D  1.564733e-003 [↓ ] | Valid D  2.208556e-003 [↑ ] | Stag: 20 Ovfit: 0
[12/11/2015 20:21:14]  5/10 | Batch  61/124 | D  1.581773e-002 [↑ ] | Valid D  2.233998e-003 [↑ ] | Stag: 30 Ovfit: 0
[12/11/2015 20:21:14]  5/10 | Batch  71/124 | D  3.898179e-003 [↓ ] | Valid D  2.347554e-003 [↑ ] | Stag: 40 Ovfit: 0
[12/11/2015 20:21:14]  5/10 | Batch  81/124 | D  1.395002e-003 [↓▼] | Valid D  2.182974e-003 [↓ ] | Stag: 50 Ovfit: 1
[12/11/2015 20:21:14]  5/10 | Batch  91/124 | D  4.450763e-003 [↑ ] | Valid D  2.069927e-003 [↓ ] | Stag: 60 Ovfit: 1
[12/11/2015 20:21:14]  5/10 | Batch 101/124 | D  1.927794e-003 [↓ ] | Valid D  2.129479e-003 [↑ ] | Stag: 70 Ovfit: 1
[12/11/2015 20:21:14]  5/10 | Batch 111/124 | D  1.238949e-002 [↑ ] | Valid D  2.059099e-003 [↓ ] | Stag: 80 Ovfit: 1
[12/11/2015 20:21:14]  5/10 | Batch 121/124 | D  1.969593e-003 [↓ ] | Valid D  2.072177e-003 [↑ ] | Stag: 90 Ovfit: 1
[12/11/2015 20:21:14]  6/10 | Batch   1/124 | D  1.885590e-003 [↓ ] | Valid D  2.087292e-003 [↑ ] | Stag:100 Ovfit: 1
[12/11/2015 20:21:14]  6/10 | Batch  11/124 | D  1.577425e-003 [↓ ] | Valid D  2.074389e-003 [↓ ] | Stag:110 Ovfit: 1
[12/11/2015 20:21:14]  6/10 | Batch  21/124 | D  5.410788e-003 [↑ ] | Valid D  1.943973e-003 [↓▼] | Stag:  0 Ovfit: 0
[12/11/2015 20:21:14]  6/10 | Batch  31/124 | D  4.188792e-003 [↓ ] | Valid D  1.863442e-003 [↓▼] | Stag:  0 Ovfit: 0
[12/11/2015 20:21:14]  6/10 | Batch  41/124 | D  7.516511e-003 [↑ ] | Valid D  1.951990e-003 [↑ ] | Stag: 10 Ovfit: 0
[12/11/2015 20:21:14]  6/10 | Batch  51/124 | D  1.510475e-003 [↓ ] | Valid D  2.003860e-003 [↑ ] | Stag: 20 Ovfit: 0
[12/11/2015 20:21:14]  6/10 | Batch  61/124 | D  1.375423e-002 [↑ ] | Valid D  2.020531e-003 [↑ ] | Stag: 30 Ovfit: 0
[12/11/2015 20:21:14]  6/10 | Batch  71/124 | D  3.260145e-003 [↓ ] | Valid D  2.129138e-003 [↑ ] | Stag: 40 Ovfit: 0
[12/11/2015 20:21:15]  6/10 | Batch  81/124 | D  1.402565e-003 [↓ ] | Valid D  2.002138e-003 [↓ ] | Stag: 50 Ovfit: 0
[12/11/2015 20:21:15]  6/10 | Batch  91/124 | D  3.999386e-003 [↑ ] | Valid D  1.920336e-003 [↓ ] | Stag: 60 Ovfit: 0
[12/11/2015 20:21:15]  6/10 | Batch 101/124 | D  1.929424e-003 [↓ ] | Valid D  1.976652e-003 [↑ ] | Stag: 70 Ovfit: 0
[12/11/2015 20:21:15]  6/10 | Batch 111/124 | D  1.205915e-002 [↑ ] | Valid D  1.926643e-003 [↓ ] | Stag: 80 Ovfit: 0
[12/11/2015 20:21:15]  6/10 | Batch 121/124 | D  1.978536e-003 [↓ ] | Valid D  1.951888e-003 [↑ ] | Stag: 90 Ovfit: 0
[12/11/2015 20:21:15]  7/10 | Batch   1/124 | D  1.769614e-003 [↓ ] | Valid D  1.959661e-003 [↑ ] | Stag:100 Ovfit: 0
[12/11/2015 20:21:15]  7/10 | Batch  11/124 | D  1.555518e-003 [↓ ] | Valid D  1.955613e-003 [↓ ] | Stag:110 Ovfit: 0
[12/11/2015 20:21:15]  7/10 | Batch  21/124 | D  5.217655e-003 [↑ ] | Valid D  1.861573e-003 [↓ ] | Stag:120 Ovfit: 0
[12/11/2015 20:21:15]  7/10 | Batch  31/124 | D  3.625835e-003 [↓ ] | Valid D  1.796666e-003 [↓▼] | Stag:  0 Ovfit: 0
[12/11/2015 20:21:15]  7/10 | Batch  41/124 | D  6.929778e-003 [↑ ] | Valid D  1.872346e-003 [↑ ] | Stag: 10 Ovfit: 0
[12/11/2015 20:21:15]  7/10 | Batch  51/124 | D  1.502809e-003 [↓ ] | Valid D  1.913079e-003 [↑ ] | Stag: 20 Ovfit: 0
[12/11/2015 20:21:15]  7/10 | Batch  61/124 | D  1.241405e-002 [↑ ] | Valid D  1.924762e-003 [↑ ] | Stag: 30 Ovfit: 0
[12/11/2015 20:21:15]  7/10 | Batch  71/124 | D  2.962820e-003 [↓ ] | Valid D  2.024504e-003 [↑ ] | Stag: 40 Ovfit: 0
[12/11/2015 20:21:15]  7/10 | Batch  81/124 | D  1.421725e-003 [↓ ] | Valid D  1.919308e-003 [↓ ] | Stag: 50 Ovfit: 0
[12/11/2015 20:21:15]  7/10 | Batch  91/124 | D  3.717377e-003 [↑ ] | Valid D  1.854433e-003 [↓ ] | Stag: 60 Ovfit: 0
[12/11/2015 20:21:15]  7/10 | Batch 101/124 | D  1.973184e-003 [↓ ] | Valid D  1.907719e-003 [↑ ] | Stag: 70 Ovfit: 0
[12/11/2015 20:21:15]  7/10 | Batch 111/124 | D  1.190252e-002 [↑ ] | Valid D  1.867085e-003 [↓ ] | Stag: 80 Ovfit: 0
[12/11/2015 20:21:15]  7/10 | Batch 121/124 | D  2.006255e-003 [↓ ] | Valid D  1.894716e-003 [↑ ] | Stag: 90 Ovfit: 0
[12/11/2015 20:21:15]  8/10 | Batch   1/124 | D  1.721533e-003 [↓ ] | Valid D  1.898627e-003 [↑ ] | Stag:100 Ovfit: 0
[12/11/2015 20:21:15]  8/10 | Batch  11/124 | D  1.553262e-003 [↓ ] | Valid D  1.897926e-003 [↓ ] | Stag:110 Ovfit: 0
[12/11/2015 20:21:15]  8/10 | Batch  21/124 | D  5.004487e-003 [↑ ] | Valid D  1.823838e-003 [↓ ] | Stag:120 Ovfit: 0
[12/11/2015 20:21:15]  8/10 | Batch  31/124 | D  3.308986e-003 [↓ ] | Valid D  1.768821e-003 [↓▼] | Stag:  0 Ovfit: 0
[12/11/2015 20:21:15]  8/10 | Batch  41/124 | D  6.563510e-003 [↑ ] | Valid D  1.835302e-003 [↑ ] | Stag: 10 Ovfit: 0
[12/11/2015 20:21:15]  8/10 | Batch  51/124 | D  1.507999e-003 [↓ ] | Valid D  1.868091e-003 [↑ ] | Stag: 20 Ovfit: 0
[12/11/2015 20:21:15]  8/10 | Batch  61/124 | D  1.148601e-002 [↑ ] | Valid D  1.876653e-003 [↑ ] | Stag: 30 Ovfit: 0
[12/11/2015 20:21:16]  8/10 | Batch  71/124 | D  2.807777e-003 [↓ ] | Valid D  1.968064e-003 [↑ ] | Stag: 40 Ovfit: 0
[12/11/2015 20:21:16]  8/10 | Batch  81/124 | D  1.440011e-003 [↓ ] | Valid D  1.876611e-003 [↓ ] | Stag: 50 Ovfit: 0
[12/11/2015 20:21:16]  8/10 | Batch  91/124 | D  3.522004e-003 [↑ ] | Valid D  1.821817e-003 [↓ ] | Stag: 60 Ovfit: 0
[12/11/2015 20:21:16]  8/10 | Batch 101/124 | D  2.031282e-003 [↓ ] | Valid D  1.872902e-003 [↑ ] | Stag: 70 Ovfit: 0
[12/11/2015 20:21:16]  8/10 | Batch 111/124 | D  1.182362e-002 [↑ ] | Valid D  1.836957e-003 [↓ ] | Stag: 80 Ovfit: 0
[12/11/2015 20:21:16]  8/10 | Batch 121/124 | D  2.035742e-003 [↓ ] | Valid D  1.864137e-003 [↑ ] | Stag: 90 Ovfit: 0
[12/11/2015 20:21:16]  9/10 | Batch   1/124 | D  1.699795e-003 [↓ ] | Valid D  1.865989e-003 [↑ ] | Stag:100 Ovfit: 0
[12/11/2015 20:21:16]  9/10 | Batch  11/124 | D  1.556397e-003 [↓ ] | Valid D  1.866347e-003 [↑ ] | Stag:110 Ovfit: 0
[12/11/2015 20:21:16]  9/10 | Batch  21/124 | D  4.788828e-003 [↑ ] | Valid D  1.804229e-003 [↓ ] | Stag:120 Ovfit: 0
[12/11/2015 20:21:16]  9/10 | Batch  31/124 | D  3.119682e-003 [↓ ] | Valid D  1.756223e-003 [↓▼] | Stag:  0 Ovfit: 0
[12/11/2015 20:21:16]  9/10 | Batch  41/124 | D  6.336636e-003 [↑ ] | Valid D  1.816257e-003 [↑ ] | Stag: 10 Ovfit: 0
[12/11/2015 20:21:16]  9/10 | Batch  51/124 | D  1.516153e-003 [↓ ] | Valid D  1.843593e-003 [↑ ] | Stag: 20 Ovfit: 0
[12/11/2015 20:21:16]  9/10 | Batch  61/124 | D  1.080968e-002 [↑ ] | Valid D  1.850113e-003 [↑ ] | Stag: 30 Ovfit: 0
[12/11/2015 20:21:16]  9/10 | Batch  71/124 | D  2.720124e-003 [↓ ] | Valid D  1.934669e-003 [↑ ] | Stag: 40 Ovfit: 0
[12/11/2015 20:21:16]  9/10 | Batch  81/124 | D  1.455176e-003 [↓ ] | Valid D  1.852409e-003 [↓ ] | Stag: 50 Ovfit: 0
[12/11/2015 20:21:16]  9/10 | Batch  91/124 | D  3.375944e-003 [↑ ] | Valid D  1.804057e-003 [↓ ] | Stag: 60 Ovfit: 0
[12/11/2015 20:21:16]  9/10 | Batch 101/124 | D  2.093168e-003 [↓ ] | Valid D  1.853583e-003 [↑ ] | Stag: 70 Ovfit: 0
[12/11/2015 20:21:16]  9/10 | Batch 111/124 | D  1.178356e-002 [↑ ] | Valid D  1.820183e-003 [↓ ] | Stag: 80 Ovfit: 0
[12/11/2015 20:21:16]  9/10 | Batch 121/124 | D  2.061530e-003 [↓ ] | Valid D  1.846045e-003 [↑ ] | Stag: 90 Ovfit: 0
[12/11/2015 20:21:16] 10/10 | Batch   1/124 | D  1.689459e-003 [↓ ] | Valid D  1.846794e-003 [↑ ] | Stag:100 Ovfit: 0
[12/11/2015 20:21:16] 10/10 | Batch  11/124 | D  1.560583e-003 [↓ ] | Valid D  1.847311e-003 [↑ ] | Stag:110 Ovfit: 0
[12/11/2015 20:21:16] 10/10 | Batch  21/124 | D  4.588457e-003 [↑ ] | Valid D  1.792883e-003 [↓ ] | Stag:120 Ovfit: 0
[12/11/2015 20:21:16] 10/10 | Batch  31/124 | D  3.001853e-003 [↓ ] | Valid D  1.750141e-003 [↓ ] | Stag:130 Ovfit: 0
[12/11/2015 20:21:16] 10/10 | Batch  41/124 | D  6.195725e-003 [↑ ] | Valid D  1.805622e-003 [↑ ] | Stag:140 Ovfit: 0
[12/11/2015 20:21:16] 10/10 | Batch  51/124 | D  1.524289e-003 [↓ ] | Valid D  1.829196e-003 [↑ ] | Stag:150 Ovfit: 0
[12/11/2015 20:21:17] 10/10 | Batch  61/124 | D  1.029841e-002 [↑ ] | Valid D  1.834366e-003 [↑ ] | Stag:160 Ovfit: 0
[12/11/2015 20:21:17] 10/10 | Batch  71/124 | D  2.667856e-003 [↓ ] | Valid D  1.913492e-003 [↑ ] | Stag:170 Ovfit: 0
[12/11/2015 20:21:17] 10/10 | Batch  81/124 | D  1.467351e-003 [↓ ] | Valid D  1.837669e-003 [↓ ] | Stag:180 Ovfit: 0
[12/11/2015 20:21:17] 10/10 | Batch  91/124 | D  3.261143e-003 [↑ ] | Valid D  1.793646e-003 [↓ ] | Stag:190 Ovfit: 0
[12/11/2015 20:21:17] 10/10 | Batch 101/124 | D  2.153974e-003 [↓ ] | Valid D  1.842048e-003 [↑ ] | Stag:200 Ovfit: 0
[12/11/2015 20:21:17] 10/10 | Batch 111/124 | D  1.176465e-002 [↑ ] | Valid D  1.810117e-003 [↓ ] | Stag:210 Ovfit: 0
[12/11/2015 20:21:17] 10/10 | Batch 121/124 | D  2.082179e-003 [↓ ] | Valid D  1.834467e-003 [↑ ] | Stag:220 Ovfit: 0
[12/11/2015 20:21:17] Duration       : 00:00:05.2093910
[12/11/2015 20:21:17] Loss initial   : D  4.748471e-001
[12/11/2015 20:21:17] Loss final     : D  1.395002e-003 (Best)
[12/11/2015 20:21:17] Loss change    : D -4.734521e-001 (-99.71 %)
[12/11/2015 20:21:17] Loss chg. / s  : D -9.088434e-002
[12/11/2015 20:21:17] Epochs / s     : 1.919610181
[12/11/2015 20:21:17] Epochs / min   : 115.1766109
[12/11/2015 20:21:17] --- Training finished

</pre>

After about five seconds of training, we can see that the characteristics of the problem domain (distinguishing between the digits 0 and 1) are captured in the model weights.

*)

let l = (n.[0] :?> Linear)
l.VisualizeWRowsAsImageGrid(28) |> printfn "%s"

(**
    [lang=cs]
    Hype.Neural.Linear
       784 -> 1
       Learnable parameters: 785
       Init: Standard
       W's rows reshaped to (28 x 28), presented in a (1 x 1) grid:
    DM : 28 x 28
    ----------------------------
    ----------------------------
    ------------▴▴▴▴▴-----------
    ---------▴--▴▴▴▴▴▴-▴--------
    --------▴▴▪▴▪▪▪▴-▴▴▴▪▪▪▴----
    ------▴-▴▴▴▴▪▪▴▴-·-▴▪▪▪▪▴---
    ------▴--▴▴-▴▴--···▴▪▴▴▴▴---
    ----▴---------▴▴-· ---------
    ---------··---▴▪--·-·····---
    -------······▴▪▪▪▴-······---
    ------·····  ▴●●●▴·     ·---
    -----·· ·    ▪♦■♦▪      ·---
    -----· ·     ●■■♦▴      ·---
    -----·      ·♦██♦·      ·---
    -----·      ▴■██●       ·---
    ----·       ▪██■▪       ·---
    ----·      -●█■♦-       ·---
    ----·      ▴♦█■●·     ···---
    ----·     ·▴♦♦♦●·   ····----
    ----·    ·▴▪●●●▪·· ····--▴--
    ----······-▴▪▪▪▴--------▴---
    -----▴▴----·--▴▴-▴▴-▴▴------
    -----▴▪▪▴-· ··--▴▴▪▴▴▴▴-----
    ----▴▪▪▪▪▴-· ·-▴▴▪▴▴▴▴------
    -----▴▪▴▴▴▴·---▴▴▴▴▴--------
    ------------▴▴▴▴------------
    ----------------------------
    ----------------------------

       b:
    DV : 1
     
### Classifier

You can create classifiers by instantiating types such as **LogisticClassifier** or **SoftmaxClassifier** and passing a classification function of the form **DM->DM** to the constructor. Alternatively, you can directly pass the model we have just trained.

Please see the [API reference](reference/index.html) and the [source code](https://github.com/hypelib/Hype/blob/master/src/Hype/Classifier.fs) for a better understanding of how classifiers are implemented.

*)

let cc = LogisticClassifier(n)

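(**
Alternatively, as noted above, a classifier can be constructed from any function of type **DM->DM**. A minimal sketch, assuming the trained network's forward pass is exposed as its `Run` member:
*)

let cf = LogisticClassifier(fun x -> n.Run x)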
(**
Let's test the class predictions for 10 random elements from the MNIST test set, which, if you remember, we've filtered to have only 0s and 1s.
*)

let pred = cc.Classify(MNISTtest01.X.[*,0..9]);;
let real = MNISTtest01.Y.[*, 0..9] |> DM.toDV |> DV.toArray |> Array.map (float32>>int)

(**
<pre>
val pred : int [] = [|1; 0; 1; 0; 1; 0; 0; 1; 1; 1|]
val real : int [] = [|1; 0; 1; 0; 1; 0; 0; 1; 1; 1|]
</pre>

The classifier seems to be working well. We can compute the classification error for a given dataset.
*)

let error = cc.ClassificationError(MNISTtest01);;

(**
<pre>
val error : float32 = 0.000472813234f
</pre>

The classification error is 0.047%.
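*)

(**
As a quick sanity check, this kind of error can also be recomputed by hand as the fraction of mismatched labels, here over just the ten predictions from above (whereas **ClassificationError** runs over the whole dataset):
*)

let manualError =
    Array.map2 (fun p r -> if p = r then 0.f else 1.f) pred real
    |> Array.average

(**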

Finally, this is how you would classify single digits.
*)

let cls = cc.Classify(MNISTtest01.X.[*,0]);;
MNISTtest01.X.[*,0] |> DV.visualizeAsDM 28 |> printfn "%s"

(**
    [lang=cs]
    val cls : int = 1

    DM : 28 x 28
                            
                            
                            
                            
                ♦               
                ●♦              
                 █              
                 ■·             
                ▪█-             
                ▴█-             
                 ■♦             
                 ♦█·            
                 -█▪            
                  █▪            
                  ●▪            
                  ▪█            
                  ▪█-           
                  ▪█▴           
                  ▪█■           
                   █■           
                   ██           
                   ▪█           
                    █▴          
                    █●          
                            
                            
And this is how you would classify many digits efficiently at the same time, by running them through the model together as the columns of an input matrix.
*)

let clss = cc.Classify(MNISTtest01.X.[*,5..9]);;
MNISTtest01.[5..9].VisualizeXColsAsImageGrid(28) |> printfn "%s"

(**
    [lang=cs]
    val clss : int [] = [|0; 0; 1; 1; 1|]

    Hype.Dataset
       X: 784 x 5
       Y: 1 x 5
    X's columns reshaped to (28 x 28), presented in a (2 x 3) grid:
    DM : 56 x 84
                                                                                    
                                                                                    
                                                                                    
                                                                                    
                 ██·                                                                    
                ●███♦-                        ·████♦·                   -█▴             
                ██████■-                     ♦███████-                  ▪██·            
               ●███████■                  ♦███████████▴                 ●██·            
              ▪███● -███-                ♦█████████████■                ▴♦█·            
              ♦██▪   -██■                ████♦ ●●████████                ▪█·            
             ▪███    ·██■               ●█████·  ··██████                ▪█·            
             ■██▪    ·██■              ▪██████·    ██████                ▪█·            
            ·██■     ▪██■             -██████♦     ██████                ▪█·            
            ♦██▪     ■██■            ▪███████     ·█████♦                ●█·            
           ·███-     ■██■            ██████♦·    ♦██████·                ██·            
           ♦██■     ·███■            ██████-   ▪███████-                 ██·            
           ■██-     -███-            ██████   ●███████▴                  ██·            
           ■██·     ■██♦             ██████·♦████████■                   ██·            
           ■██·    ■███●             ████████████████                    ██             
           ■██●    ████              ██████████████-                     ██             
           ▴███· -■███-              ▴████████████▴                     ·██             
           ·█████████♦                ■████████■●                        ██             
            ▴███████♦                  ████████                         ·█♦·            
              ▪███♦·                    ●●■●●-                          ·██▴            
                                                                         █■             
                                                                                    
                                                                                    
                                                                                    
                                                                                    
                                                                                    
                                                                                    
                                                                                    
                      ·██                                                               
                      ███                       -██■                                    
                     ●██■                      ▪████-                                   
                     ♦██                       ♦███■                                    
                    ·███                      ·████■                                    
                    ■██♦                      ▴███■                                     
                   ▪██■                      -████●                                     
                   ███-                      ▴████▪                                     
                  ▪██■                      ·████♦                                      
                 -███▴                      █████·                                      
                 ♦██●                       ████♦                                       
                -███                       ▪████·                                       
                ███▴                       ■███■                                        
               -███                       ▴████·                                        
               ███●                       ■███■                                         
              ████                        ■███▪                                         
             ●███-                       -███♦                                          
            ▴███●                        ●███▪                                          
            ●██♦                         ████▪                                          
            ●██·                         ■███▴                                          
                                          ■■-                                           
                                                                                    
                                                                                    
                                                                                    
                                                                                                                
*)


================================================
FILE: docs/input/Training.fsx
================================================
(*** hide ***)
#r "../../src/Hype/bin/Release/netstandard2.0/DiffSharp.dll"
#r "../../src/Hype/bin/Release/netstandard2.0/Hype.dll"
#I "../../packages/R.NET.Community/lib/net40/"
#I "../../packages/R.NET.Community.FSharp/lib/net40/"
#I "../../packages/RProvider"
#load "RProvider.fsx"
//fsi.ShowDeclarationValues <- false
System.Environment.CurrentDirectory <- __SOURCE_DIRECTORY__

(**
Training
========

In [optimization](optimization.html), we've seen how nested AD and gradient-based optimization work together.

Training a model means optimizing its parameters to minimize a loss function or, equivalently, to maximize the likelihood of a given set of data under the model parameters. In addition to the _optimization method_, _learning rate_, _momentum_, and _gradient clipping_ parameters we've seen, this introduces parameters for the _loss function_, _regularization_, _training batches_, and _validation and early stopping_.
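
These settings are bundled together in a parameter record that is passed to the training routine. As an illustrative preview (the field and constructor names in this sketch are assumptions; see the [API reference](reference/index.html) for the exact ones):

    [lang=fsharp]
    let p = {Params.Default with
                Epochs = 100
                LearningRate = LearningRate.RMSProp(D 0.001f, D 0.9f)
                Batch = Batch.Minibatch 10}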

But let's start with the **Dataset** type, which we will use for keeping the training, validation, and test data for the training procedure.

Dataset
-------

For supervised training, data consists of pairs of input vectors $\mathbf{x}_i \in \mathbb{R}^{d_x}$ and output vectors $\mathbf{y}_i \in \mathbb{R}^{d_y}$. We represent data using the **Dataset** type, which is basically a pair of matrices 

$$$
   \begin{eqnarray*}
   \mathbf{X} &\in& \mathbb{R}^{d_x \times n}\\
   \mathbf{Y} &\in& \mathbb{R}^{d_y \times n}\\
   \end{eqnarray*}
   
holding these vectors, where $n$ is the number of input–output pairs, $d_x$ is the number of input features and $d_y$ is the number of output features. In other words, each of the $n$ columns of the matrix $\mathbf{X}$ is an input vector of length $d_x$ and each of the $n$ columns of matrix $\mathbf{Y}$ is the corresponding output vector of length $d_y$.

Keeping data in matrix form is essential for harnessing high-performance linear algebra engines tailored for your CPU or GPU. By default, Hype uses a high-performance CPU backend based on OpenBLAS for BLAS/LAPACK operations, with parallel implementations of non-BLAS operations such as elementwise functions.
*)

open Hype
open DiffSharp.AD.Float32

let x = toDM [[0; 0; 1; 1]
              [0; 1; 0; 1]]
let y = toDM [[0; 1; 1; 0]]

let XORdata = Dataset(x, y)
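
(**
Here $d_x = 2$, $d_y = 1$, and $n = 4$: each column of `x` is an input vector and the corresponding column of `y` is its target. We can check the shapes, assuming the `Rows` and `Cols` properties exposed by DiffSharp matrices and the `X` and `Y` members of **Dataset**:
*)

printfn "X: %i x %i, Y: %i x %i" XORdata.X.Rows XORdata.X.Cols XORdata.Y.Rows XORdata.Y.Cols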

(**

Hype provides several utility functions for loading data into matrices from images, delimited text files (e.g., CSV), and commonly used dataset files such as MNIST.

*)

let MNIST = Dataset(Util.LoadMNISTPixels("train-images.idx3-ubyte", 60000),
                    Util.LoadMNISTLabels("train-labels.idx1-ubyte", 60000) |> toDV |> DM.ofDV 1).NormalizeX()

let MNISTtest = Dataset(Util.LoadMNISTPixels("t10k-images.idx3-ubyte", 10000),
                        Util.LoadMNISTLabels("t10k-labels.idx1-ubyte", 10000) |> toDV |> DM.ofDV 1).NormalizeX()

(**

You can see the [API reference](reference/index.html) and the [source code](https://github.com/hypelib/Hype/blob/master/src/Hype/Hype.fs) for various ways of constructing Datasets.

Training parameters
-------------------

Let's load the housing prices dataset from the [Stanford UFLDL Tutorial](http://ufldl.stanford.edu/tutorial/supervised/LinearRegression/) and divide it into input and output pairs. We will later train a simple linear regression model on it to demonstrate the use of training parameters.

*)

let h = Util.LoadDelimited("housing.data") |> DM.Transpose
h.ToString() |> printfn "%s"

(**
<pre>
DM : 14 x 506
  0.00632    0.0273    0.0273    0.0324    0.0691    0.0299    0.0883     0.145     0.211      0.17     0.225     0.117    0.0938      0.63     0.638     0.627      1.05     0.784     0.803     0.726      1.25     0.852      1.23     0.988      0.75     0.841     0.672     0.956     0.773         1      1.13      1.35      1.39      1.15      1.61    0.0642    0.0974    0.0801     0.175    0.0276    0.0336     0.127     0.142     0.159     0.123     0.171     0.188     0.229     0.254      0.22    0.0887    0.0434    0.0536    0.0498    0.0136    0.0131    0.0206    0.0143     0.154     0.103     0.149     0.172      0.11     0.127    0.0195    0.0358    0.0438    0.0579     0.136     0.128    0.0883     0.159    0.0916     0.195     0.079    0.0951     0.102    0.0871    0.0565    0.0839    0.0411    0.0446    0.0366    0.0355    0.0506    0.0574    0.0519    0.0715    0.0566     0.053    0.0468    0.0393     0.042    0.0288    0.0429     0.122     0.115     0.121    0.0819    0.0686     0.149     0.114     0.229     0.212      0.14     0.133     0.171     0.131     0.128     0.264     0.108     0.101     0.123     0.222     0.142     0.171     0.132     0.151     0.131     0.145     0.069    0.0717     0.093      0.15    0.0985     0.169     0.387     0.259     0.325     0.881      0.34      1.19      0.59      0.33     0.976     0.558     0.323     0.352      0.25     0.545     0.291      1.63      3.32       4.1      2.78      2.38      2.16      2.37      2.33      2.73      1.66       1.5      1.13      2.15      1.41      3.54      2.45      1.22      1.34      1.43      1.27      1.46      1.83      1.52      2.24      2.92      2.01       1.8       2.3      2.45      1.21      2.31     0.139    0.0918    0.0845    0.0666    0.0702    0.0543    0.0664    0.0578    0.0659    0.0689     0.091       0.1    0.0831    0.0605     0.056    0.0788     0.126    0.0837    0.0907    0.0691    0.0866    0.0219    0.0144    0.0138    0.0401    0.0467    0.0377    0.0315 
   0.0178    0.0345    0.0218    0.0351    0.0201     0.136      0.23     0.252     0.136     0.436     0.174     0.376     0.217     0.141      0.29     0.198    0.0456    0.0701     0.111     0.114     0.358     0.408     0.624     0.615     0.315     0.527     0.382     0.412     0.298     0.442     0.537     0.463     0.575     0.331     0.448      0.33     0.521     0.512    0.0824    0.0925     0.113     0.106     0.103     0.128     0.206     0.191      0.34     0.197     0.164     0.191      0.14     0.214    0.0822     0.369    0.0482    0.0355    0.0154     0.612     0.664     0.657      0.54     0.534      0.52     0.825      0.55     0.762     0.786     0.578     0.541    0.0907     0.299     0.162     0.115     0.222    0.0564     0.096     0.105    0.0613    0.0798      0.21    0.0358    0.0371    0.0613     0.015   0.00906     0.011    0.0197    0.0387    0.0459     0.043     0.035    0.0789    0.0362    0.0827     0.082     0.129    0.0537     0.141    0.0647    0.0556    0.0442    0.0354    0.0927       0.1    0.0552    0.0548     0.075    0.0493     0.493     0.349      2.64      0.79     0.262     0.269     0.369     0.254     0.318     0.245     0.402     0.475     0.168     0.182     0.351     0.284     0.341     0.192     0.303     0.241    0.0662    0.0672    0.0454    0.0502    0.0347    0.0508    0.0374    0.0396    0.0343    0.0304    0.0331     0.055    0.0615     0.013     0.025    0.0254    0.0305    0.0311    0.0616    0.0187     0.015     0.029    0.0621    0.0795    0.0724    0.0171     0.043     0.107      8.98      3.85       5.2      4.26      4.54      3.84      3.68      4.22      3.47      4.56       3.7      3.52       4.9      5.67      6.54      9.23      8.27      1.11       8.5      9.61      5.29      9.82      3.65      7.87      8.98      5.87      9.19      7.99    0.0849      6.81      4.39       2.6      4.33      8.15      6.96      5.29      1.58      8.64      3.36      8.72      5.87      7.67      8.35      9.92 
     5.05      4.24       9.6       4.8      1.53      7.92     0.716      1.95       7.4      4.44      1.14      4.05      8.81      8.66      5.75      8.08     0.834      5.94      3.53      1.81      1.09      7.02      2.05      7.05      8.79      5.86      2.25      7.66      7.37      9.34      8.49    0.0623      6.44      5.58      3.91      1.16      4.42      5.18      3.68      9.39      2.05      9.72      5.67      9.97       2.8     0.672      6.29      9.92      9.33      7.53      6.72      5.44      5.09      8.25      9.51      4.75      4.67       8.2      7.75       6.8      4.81      3.69      6.65      5.82      7.84      3.16      3.77      4.42      5.58      3.08      4.35      4.04      3.57      4.65      8.06      6.39      4.87      5.02     0.233      4.33      5.82      5.71      5.73      2.82      2.38      3.67      5.69      4.84     0.151     0.183     0.207     0.106     0.111     0.173      0.28     0.179      0.29     0.268     0.239     0.178     0.224    0.0626    0.0453    0.0608      0.11    0.0474 
       18         0         0         0         0         0      12.5      12.5      12.5      12.5      12.5      12.5      12.5         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0        75        75         0         0         0         0         0         0         0         0         0        21        21        21        21        75        90        85       100        25        25        25        25        25        25      17.5        80        80      12.5      12.5      12.5         0         0         0         0         0         0         0         0         0         0        25        25        25        25         0         0         0         0         0         0         0         0        28        28        28         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0        45        45        45        45        45        45        60        60        80        80        80        80        95 
       95      82.5      82.5        95        95         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0        30        30        30        30        30        30        22        22        22        22        22        22        22        22        22        22        80        80        90        20        20        20        20        20        20        20        20        20        20        20        20        20        20        20        20        20        40        40        40        40        40        20        20        20        20        90        90        55        80      52.5      52.5      52.5        80        80        80         0         0         0         0         0        70        70        70        34        34        34        33        33        33        33         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0        35        35         0         0         0         0         0         0         0         0        35         0        55        55         0         0        85        80        40        40        60        60        90        80        80         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0 
        0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0 
     2.31      7.07      7.07      2.18      2.18      2.18      7.87      7.87      7.87      7.87      7.87      7.87      7.87      8.14      8.14      8.14      8.14      8.14      8.14      8.14      8.14      8.14      8.14      8.14      8.14      8.14      8.14      8.14      8.14      8.14      8.14      8.14      8.14      8.14      8.14      5.96      5.96      5.96      5.96      2.95      2.95      6.91      6.91      6.91      6.91      6.91      6.91      6.91      6.91      6.91      5.64      5.64      5.64      5.64         4      1.22      0.74      1.32      5.13      5.13      5.13      5.13      5.13      5.13      1.38      3.37      3.37      6.07      6.07      6.07      10.8      10.8      10.8      10.8      12.8      12.8      12.8      12.8      12.8      12.8      4.86      4.86      4.86      4.86      4.49      4.49      4.49      4.49      3.41      3.41      3.41      3.41        15        15        15      2.89      2.89      2.89      2.89      2.89      8.56      8.56      8.56      8.56      8.56      8.56      8.56      8.56      8.56      8.56      8.56        10        10        10        10        10        10        10        10        10      25.7      25.7      25.7      25.7      25.7      25.7      25.7      21.9      21.9      21.9      21.9      21.9      21.9      21.9      21.9      21.9      21.9      21.9      21.9      21.9      21.9      21.9      19.6      19.6      19.6      19.6      19.6      19.6      19.6      19.6      19.6      19.6      19.6      19.6      19.6      19.6      19.6      19.6      19.6      19.6      19.6      19.6      19.6      19.6      19.6      19.6      19.6      19.6      19.6      19.6      19.6      19.6      4.05      4.05      4.05      4.05      4.05      4.05      4.05      2.46      2.46      2.46      2.46      2.46      2.46      2.46      2.46      3.44      3.44      3.44      3.44      3.44      3.44      2.93      2.93      0.46      1.52      1.52      1.52      1.47 
     1.47      2.03      2.03      2.68      2.68      10.6      10.6      10.6      10.6      10.6      10.6      10.6      10.6      10.6      10.6      10.6      13.9      13.9      13.9      13.9       6.2       6.2       6.2       6.2       6.2       6.2       6.2       6.2       6.2       6.2       6.2       6.2       6.2       6.2       6.2       6.2       6.2       6.2      4.93      4.93      4.93      4.93      4.93      4.93      5.86      5.86      5.86      5.86      5.86      5.86      5.86      5.86      5.86      5.86      3.64      3.64      3.75      3.97      3.97      3.97      3.97      3.97      3.97      3.97      3.97      3.97      3.97      3.97      3.97      6.96      6.96      6.96      6.96      6.96      6.41      6.41      6.41      6.41      6.41      3.33      3.33      3.33      3.33      1.21      2.97      2.25      1.76      5.32      5.32      5.32      4.95      4.95      4.95      13.9      13.9      13.9      13.9      13.9      2.24      2.24      2.24      6.09      6.09      6.09      2.18      2.18      2.18      2.18       9.9       9.9       9.9       9.9       9.9       9.9       9.9       9.9       9.9       9.9       9.9       9.9      7.38      7.38      7.38      7.38      7.38      7.38      7.38      7.38      3.24      3.24      3.24      6.06      6.06      5.19      5.19      5.19      5.19      5.19      5.19      5.19      5.19      1.52      1.89      3.78      3.78      4.39      4.39      4.15      2.01      1.25      1.25      1.69      1.69      2.02      1.91      1.91      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1 
     18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      18.1      27.7      27.7      27.7      27.7      27.7      9.69      9.69      9.69      9.69      9.69      9.69      9.69      9.69      11.9      11.9      11.9      11.9      11.9 
        0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         1         0         0         0         0         0         0         0         0         0         1         0         1         1         0         0         0         0         1         0         1         1         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0 
        0         0         0         0         0         0         0         0         1         1         1         1         1         0         0         0         1         0         1         1         1         1         1         0         0         0         0         0         0         0         0         0         0         0         1         0         1         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         1         0         0         0         1         1         0         1         1         0         0         0         0         1         1         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         1         1         1         0         0         0         0         1         1         0         0         0         0         1         1         0         1         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0 
        0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0         0 
    0.538     0.469     0.469     0.458     0.458     0.458     0.524     0.524     0.524     0.524     0.524     0.524     0.524     0.538     0.538     0.538     0.538     0.538     0.538     0.538     0.538     0.538     0.538     0.538     0.538     0.538     0.538     0.538     0.538     0.538     0.538     0.538     0.538     0.538     0.538     0.499     0.499     0.499     0.499     0.428     0.428     0.448     0.448     0.448     0.448     0.448     0.448     0.448     0.448     0.448     0.439     0.439     0.439     0.439      0.41     0.403      0.41     0.411     0.453     0.453     0.453     0.453     0.453     0.453     0.416     0.398     0.398     0.409     0.409     0.409     0.413     0.413     0.413     0.413     0.437     0.437     0.437     0.437     0.437     0.437     0.426     0.426     0.426     0.426     0.449     0.449     0.449     0.449     0.489     0.489     0.489     0.489     0.464     0.464     0.464     0.445     0.445     0.445     0.445     0.445      0.52      0.52      0.52      0.52      0.52      0.52      0.52      0.52      0.52      0.52      0.52     0.547     0.547     0.547     0.547     0.547     0.547     0.547     0.547     0.547     0.581     0.581     0.581     0.581     0.581     0.581     0.581     0.624     0.624     0.624     0.624     0.624     0.624     0.624     0.624     0.624     0.624     0.624     0.624     0.624     0.624     0.624     0.871     0.871     0.871     0.871     0.871     0.871     0.871     0.871     0.871     0.871     0.871     0.871     0.871     0.871     0.871     0.605     0.605     0.871     0.605     0.605     0.605     0.605     0.605     0.605     0.605     0.605     0.605     0.605     0.605     0.605      0.51      0.51      0.51      0.51      0.51      0.51      0.51     0.488     0.488     0.488     0.488     0.488     0.488     0.488     0.488     0.437     0.437     0.437     0.437     0.437     0.437     0.401     0.401     0.422     0.404     0.404     0.404     0.403 
    0.403     0.415     0.415     0.416     0.416     0.489     0.489     0.489     0.489     0.489     0.489     0.489     0.489     0.489     0.489     0.489      0.55      0.55      0.55      0.55     0.507     0.507     0.507     0.507     0.504     0.504     0.504     0.504     0.504     0.504     0.504     0.504     0.507     0.507     0.507     0.507     0.507     0.507     0.428     0.428     0.428     0.428     0.428     0.428     0.431     0.431     0.431     0.431     0.431     0.431     0.431     0.431     0.431     0.431     0.392     0.392     0.394     0.647     0.647     0.647     0.647     0.647     0.647     0.647     0.647     0.647     0.647     0.575     0.575     0.464     0.464     0.464     0.464     0.464     0.447     0.447     0.447     0.447     0.447     0.443     0.443     0.443     0.443     0.401       0.4     0.389     0.385     0.405     0.405     0.405     0.411     0.411     0.411     0.437     0.437     0.437     0.437     0.437       0.4       0.4       0.4     0.433     0.433     0.433     0.472     0.472     0.472     0.472     0.544     0.544     0.544     0.544     0.544     0.544     0.544     0.544     0.544     0.544     0.544     0.544     0.493     0.493     0.493     0.493     0.493     0.493     0.493     0.493      0.46      0.46      0.46     0.438     0.438     0.515     0.515     0.515     0.515     0.515     0.515     0.515     0.515     0.442     0.518     0.484     0.484     0.442     0.442     0.429     0.435     0.429     0.429     0.411     0.411      0.41     0.413     0.413      0.77      0.77      0.77      0.77      0.77      0.77      0.77      0.77     0.718     0.718     0.718     0.631     0.631     0.631     0.631     0.631     0.668     0.668     0.668     0.671     0.671     0.671     0.671     0.671     0.671     0.671       0.7       0.7       0.7       0.7       0.7       0.7       0.7       0.7       0.7       0.7       0.7     0.693     0.693     0.693     0.693     0.693     0.693     0.693 
    0.693     0.693     0.693     0.693     0.693     0.693     0.659     0.659     0.597     0.597     0.597     0.597     0.597     0.597     0.693     0.679     0.679     0.679     0.679     0.718     0.718     0.718     0.614     0.614     0.584     0.679     0.584     0.679     0.679     0.679     0.584     0.584     0.584     0.713     0.713      0.74      0.74      0.74      0.74      0.74      0.74      0.74      0.74      0.74      0.74      0.74      0.74      0.74     0.713     0.713     0.713     0.713     0.713     0.713     0.713     0.713     0.713     0.713     0.713     0.713     0.713     0.713     0.713     0.713     0.655     0.655     0.655     0.584      0.58      0.58      0.58     0.532      0.58     0.614     0.584     0.584     0.614     0.614     0.614     0.614     0.532     0.532     0.532     0.532     0.583     0.583     0.583     0.583     0.609     0.609     0.609     0.609     0.609     0.585     0.585     0.585     0.585     0.585     0.585     0.585     0.585     0.573     0.573     0.573     0.573     0.573 
     6.58      6.42      7.19         7      7.15      6.43      6.01      6.17      5.63         6      6.38      6.01      5.89      5.95       6.1      5.83      5.94      5.99      5.46      5.73      5.57      5.97      6.14      5.81      5.92       5.6      5.81      6.05       6.5      6.67      5.71      6.07      5.95       5.7       6.1      5.93      5.84      5.85      5.97       6.6      7.02      6.77      6.17      6.21      6.07      5.68      5.79      6.03       5.4       5.6      5.96      6.12      6.51         6      5.89      7.25      6.38      6.82      6.15      5.93      5.74      5.97      6.46      6.76       7.1      6.29      5.79      5.88      5.59      5.89      6.42      5.96      6.07      6.25      6.27      6.29      6.28      6.14      6.23      5.87      6.73      6.62       6.3      6.17      6.39      6.63      6.02      6.12      7.01      7.08      6.42      6.41      6.44      6.21      6.25      6.63      6.16      8.07      7.82      7.42      6.73      6.78      6.41      6.14      6.17      5.85      5.84      6.13      6.47      6.23       6.2      6.72      5.91      6.09      6.25      5.93      6.18      6.02      5.87      5.73      5.87         6      5.96      5.86      5.88      5.99      5.61      5.69      6.43      5.64      6.46      6.33      6.37      5.82      5.76      6.34      5.94      6.45      5.86      6.15      6.17      5.02       5.4      5.47       4.9      6.13      5.63      4.93      5.19       5.6      6.12       5.4      5.01      5.71      6.13      6.15      5.27      6.94      6.07      6.51      6.25      7.49       7.8      8.38      5.85       6.1      7.93      5.88      6.32       6.4      5.88      5.88      5.57      6.42      5.86      6.55      6.02      6.32      6.86      6.98      7.77      6.14      7.16      6.56       5.6      6.15      7.83      6.78      6.56      7.19      6.95      6.74      7.18       6.8       6.6      7.88      7.29      7.11      7.27      6.98 
     7.14      6.16      7.61      7.85      8.03      5.89      6.33      5.78      6.06      5.34      5.96       5.4      5.81      6.38      5.41      6.18      5.89      6.64      5.95      6.37      6.95      6.16      6.88      6.62      8.27      8.73      8.04      7.16      7.69      6.55      5.98      7.41      8.34      8.25      6.73      6.09      6.63      7.36      6.48      6.61       6.9       6.1      6.36      6.39      5.59      5.61      6.11      6.23      6.43      6.72      6.49      6.44      6.96      8.26      6.11      5.88      7.45       8.7      7.33      6.84       7.2      7.52       8.4      7.33      7.21      5.56      7.01       8.3      7.47      5.92      5.86      6.24      6.54      7.69      6.76      6.85      7.27      6.83      6.48      6.81      7.82      6.97      7.65      7.92      7.09      6.45      6.23      6.21      6.32      6.57      6.86      7.15      6.63      6.13      6.01      6.68      6.55      5.79      6.35      7.04      6.87      6.59       6.5      6.98      7.24      6.62      7.42      6.85      6.64      5.97      4.97      6.12      6.02      6.27      6.57      5.71      5.91      5.78      6.38      6.11      6.43      6.38      6.04      5.71      6.42      6.43      6.31      6.08      5.87      6.33      6.14      5.71      6.03      6.32      6.31      6.04      5.87       5.9      6.06      5.99      5.97      7.24      6.54       6.7      6.87      6.01       5.9      6.52      6.64      6.94      6.49      6.58      5.88      6.73      5.66      5.94      6.21       6.4      6.13      6.11       6.4      6.25      5.36       5.8      8.78      3.56      4.96      3.86      4.97      6.68      7.02      6.22      5.88      4.91      4.14      7.31      6.65      6.79      6.38      6.22      6.97      6.55      5.54      5.52      4.37      5.28      4.65         5      4.88      5.39      5.71      6.05      5.04      6.19      5.89      6.47      6.41      5.75      5.45      5.85 
     5.99      6.34       6.4      5.35      5.53      5.68      4.14      5.61      5.62      6.85      5.76      6.66      4.63      5.16      4.52      6.43      6.78       5.3      5.96      6.82      6.41      6.01      5.65       6.1      5.57       5.9      5.84       6.2      6.19      6.38      6.35      6.83      6.43      6.44      6.21      6.63      6.46      6.15      5.94      5.63      5.82      6.41      6.22      6.49      5.85      6.46      6.34      6.25      6.19      6.42      6.75      6.66       6.3      7.39      6.73      6.53      5.98      5.94       6.3      6.08       6.7      6.38      6.32      6.51      6.21      5.76      5.95         6      5.93      5.71      6.17      6.23      6.44      6.98      5.43      6.16      6.48       5.3      6.19      6.23      6.24      6.75      7.06      5.76      5.87      6.31      6.11      5.91      5.45      5.41      5.09      5.98      5.98      5.71      5.93      5.67      5.39      5.79      6.02      5.57      6.03      6.59      6.12      6.98      6.79      6.03 
     65.2      78.9      61.1      45.8      54.2      58.7      66.6      96.1       100      85.9      94.3      82.9        39      61.8      84.5      56.5      29.3      81.7      36.6      69.5      98.1      89.2      91.7       100      94.1      85.7      90.3      88.8      94.4      87.3      94.1       100        82        95      96.9      68.2      61.4      41.5      30.2      21.8      15.8       2.9       6.6       6.5        40      33.8      33.3      85.5      95.3        62      45.7        63      21.1      21.4      47.6      21.9      35.7      40.5      29.2      47.2      66.2      93.4      67.8      43.4      59.5      17.8      31.1      21.4      36.8        33       6.6      17.5       7.8       6.2         6        45      74.5      45.8      53.7      36.6      33.5      70.4      32.2      46.7        48      56.1      45.1      56.8      86.3      63.1      66.1      73.9      53.6      28.9      77.3      57.8      69.6        76      36.9      62.5      79.9      71.3      85.4      87.4        90      96.7      91.9      85.2      97.1      91.2      54.4      81.6      92.9      95.4      84.2      88.2      72.5      82.6      73.1      65.2      69.7      84.1      92.9        97      95.8      88.4      95.6        96      98.8      94.7      98.9      97.7      97.9      95.4      98.4      98.2      93.5      98.4      98.2      97.9      93.6       100       100       100      97.8       100       100      95.7      93.8      94.9      97.3       100        88      98.5        96      82.6        94      97.4       100       100      92.6      90.8      98.2      93.9      91.8        93      96.2      79.2      96.1      95.2      94.6      97.3      88.5      84.1      68.7      33.1      47.2      73.4      74.4      58.4      83.3      62.2      92.2      95.6      89.8      68.8      53.6      41.1      29.1      38.9      21.5      30.8      26.3       9.9      18.8        32      34.1      36.6      38.3      15.3 
     13.9      38.4      15.7      33.2      31.9      22.3      52.5      72.7      59.1       100      92.1      88.6      53.8      32.3       9.8      42.4        56      85.1      93.8      92.4      88.5      91.3      77.7      80.8      78.3        83      86.5      79.9        17      21.4      68.1      76.9      73.3      70.4      66.5      61.5      76.5      71.6      18.5      42.2      54.3      65.1      52.9       7.8      76.5      70.2      34.9      79.2      49.1      17.5        13       8.9       6.8       8.4        32      19.1      34.2      86.9       100       100      81.8      89.4      91.5      94.5      91.6      62.8      84.6        67      52.6      61.5      42.1      16.3      58.7      51.8      32.9      42.8        49      27.6      32.1      32.2      64.5      37.2      49.7      24.8      20.8      31.9      31.5      31.3      45.6      22.9      27.9      27.7      23.4      18.4      42.3      31.1        51        58      20.1        10      47.4      40.4      18.4      17.7      41.1      58.1      71.9      70.3      82.5      76.7      37.8      52.8      90.4      82.8      87.3      77.7      83.2      71.7      67.2      58.8      52.3      54.3      49.9      74.3      40.1      14.7      28.9      43.7      25.8      17.2      32.2      28.4      23.3      38.1      38.5      34.5      46.3      59.6      37.3      45.4      58.5      49.3      59.7      56.4      28.1      48.5      52.3      27.7      29.7      34.5      44.4      35.9      18.5      36.1      21.9      19.5      97.4        91      83.4      81.3        88      91.1      96.2        89      82.9      87.9      91.4       100       100      96.8      97.5       100      89.6       100       100      97.9      93.3      98.8      96.2       100      91.9      99.1       100       100      91.2      98.1       100      89.5       100      98.9        97      82.5        97      92.6      94.7      98.8        96      98.9       100      77.8 
      100       100       100        96      85.4       100       100       100      97.9       100       100       100       100       100       100       100      90.8      89.1       100      76.5       100      95.3      87.6      85.1      70.6      95.4      59.7      78.7      78.1      95.6      86.1      94.3      74.8      87.9        95      94.6      93.3       100      87.9      93.9      92.4      97.2       100       100      96.6      94.8      96.4      96.6      98.7      98.3      92.6      98.2      91.8      99.3      94.1      86.5      87.9      80.3      83.7      84.4        90      88.4        83      89.9      65.4      48.2      84.7      94.5        71      56.7        84      90.7        75      67.6      95.4      97.4      93.6      97.3      96.7        88      64.7      74.9        77      40.3      41.9      51.9      79.8      53.2      92.7      98.3        98      98.8      83.5        54      42.6      28.8      72.9      70.6      65.3      73.5      79.7      69.1      76.7        91      89.3      80.8 
     4.09      4.97      4.97      6.06      6.06      6.06      5.56      5.95      6.08      6.59      6.35      6.23      5.45      4.71      4.46       4.5       4.5      4.26       3.8       3.8       3.8      4.01      3.98       4.1       4.4      4.45      4.68      4.45      4.45      4.24      4.23      4.18      3.99      3.79      3.76      3.36      3.38      3.93      3.85       5.4       5.4      5.72      5.72      5.72      5.72       5.1       5.1      5.69      5.87      6.09      6.81      6.81      6.81      6.81      7.32       8.7      9.19      8.32      7.81      6.93      7.23      6.82      7.23      7.98      9.22      6.61      6.61       6.5       6.5       6.5      5.29      5.29      5.29      5.29      4.25       4.5      4.05      4.09      5.01       4.5       5.4       5.4       5.4       5.4      4.78      4.44      4.43      3.75      3.42      3.41      3.09      3.09      3.67      3.67      3.62       3.5       3.5       3.5       3.5       3.5      2.78      2.86      2.71      2.71      2.42      2.11      2.21      2.12      2.43      2.55      2.78      2.68      2.35      2.55      2.26      2.46      2.73      2.75      2.48      2.76      2.26       2.2      2.09      1.94      2.01      1.99      1.76      1.79      1.81      1.98      2.12      2.27      2.33      2.47      2.35      2.11      1.97      1.85      1.67      1.67      1.61      1.44      1.32      1.41      1.35      1.42      1.52      1.46      1.53      1.53      1.62      1.59      1.61      1.62      1.75      1.75      1.74      1.88      1.76      1.77       1.8      1.97      2.04      2.16      2.42      2.28      2.05      2.43       2.1      2.26      2.43      2.39       2.6      2.65       2.7      3.13      3.55      3.32      2.92      2.83      2.74       2.6       2.7      2.85      2.99      3.28       3.2      3.79      4.57      4.57      6.48      6.48      6.48      6.22      6.22      5.65      7.31      7.31      7.31      7.65 
[... tail of docs/input/housing.data omitted: the extraction spilled the remaining whitespace-separated numeric columns of the Boston housing dataset (attribute-value runs such as RAD- and TAX-like columns) into unaligned rows; see the full download for the intact tab-separated records ...]
SYMBOL INDEX (4 symbols across 1 files)

FILE: docs/input/files/misc/tips.js
  function hideTip (line 4) | function hideTip(evt, name, unique) {
  function findPos (line 10) | function findPos(obj) {
  function hideUsingEsc (line 25) | function hideUsingEsc(e) {
  function showTip (line 30) | function showTip(evt, name, unique, owner) {
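The symbol index above names four small tooltip helpers from docs/input/files/misc/tips.js, but only a short preview of that file survives in this extraction. As a hedged sketch of one of them: a `findPos(obj)` helper of this kind typically walks the `offsetParent` chain, summing offsets to get a page-absolute position. The plain-object element shape below is an assumption made so the sketch runs outside a browser; it is not the repository's actual code.

```javascript
// Hedged sketch of a findPos-style helper (assumption, not the repo's code):
// walk the offsetParent chain, accumulating offsetLeft/offsetTop to compute
// a page-absolute [left, top] position for an element.
function findPos(obj) {
  var curleft = 0;
  var curtop = 0;
  while (obj) {
    curleft += obj.offsetLeft;
    curtop += obj.offsetTop;
    obj = obj.offsetParent; // null at the document root ends the walk
  }
  return [curleft, curtop];
}

// Example with plain objects standing in for DOM elements.
var parent = { offsetLeft: 100, offsetTop: 50, offsetParent: null };
var child = { offsetLeft: 10, offsetTop: 5, offsetParent: parent };
console.log(findPos(child)); // [110, 55]
```

In a real page the same loop runs over DOM elements; functions like `showTip` and `hideTip` would then use the computed position to place or remove the tooltip element.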
Condensed preview — 42 files, each showing path, character count, and a content snippet. Download the .json file or copy for the full structured content (436K chars).
[
  {
    "path": ".gitattributes",
    "chars": 2518,
    "preview": "###############################################################################\n# Set default behavior to automatically "
  },
  {
    "path": ".gitignore",
    "chars": 2232,
    "preview": "## Ignore Visual Studio temporary files, build results, and\n## files generated by popular Visual Studio add-ons.\n\n# User"
  },
  {
    "path": ".paket/Paket.Restore.targets",
    "chars": 34232,
    "preview": "<Project xmlns=\"http://schemas.microsoft.com/developer/msbuild/2003\">\n  <!-- Prevent dotnet template engine to parse thi"
  },
  {
    "path": ".paket/paket.targets",
    "chars": 2447,
    "preview": "<?xml version=\"1.0\" encoding=\"utf-8\"?>\n<Project ToolsVersion=\"4.0\" xmlns=\"http://schemas.microsoft.com/developer/msbuild"
  },
  {
    "path": "Hype.sln",
    "chars": 2347,
    "preview": "\nMicrosoft Visual Studio Solution File, Format Version 12.00\n# Visual Studio Version 16\nVisualStudioVersion = 16.0.2900"
  },
  {
    "path": "LICENSE.txt",
    "chars": 1150,
    "preview": "The MIT License (MIT)\n\nCopyright (c) 2015, National University of Ireland Maynooth (Atilim Gunes Baydin, Barak A. Pearlm"
  },
  {
    "path": "README.md",
    "chars": 1551,
    "preview": "Hype: Compositional Machine Learning and Hyperparameter Optimization\n---------------------------------------------------"
  },
  {
    "path": "Roadmap.txt",
    "chars": 662,
    "preview": "- CUDA backend (DiffSharp)\n- Example for Hamiltonian MCMC\n- Probabilistic inference\n- Convolutional neural networks (ide"
  },
  {
    "path": "docs/.gitignore",
    "chars": 7,
    "preview": "output/"
  },
  {
    "path": "docs/BuildDocs.fsx",
    "chars": 3995,
    "preview": "//\n// This file is part of\n// Hype: Compositional Machine Learning and Hyperparameter Optimization\n//\n// Copyright (c) "
  },
  {
    "path": "docs/input/FeedforwardNets.fsx",
    "chars": 26776,
    "preview": "(*** hide ***)\n#r \"../../src/Hype/bin/Release/netstandard2.0/DiffSharp.dll\"\n#r \"../../src/Hype/bin/Release/netstandard2"
  },
  {
    "path": "docs/input/HMC.fsx",
    "chars": 406,
    "preview": "(*** hide ***)\n#r \"../../src/Hype/bin/Release/netstandard2.0/DiffSharp.dll\"\n#r \"../../src/Hype/bin/Release/netstandard2"
  },
  {
    "path": "docs/input/Optimization.fsx",
    "chars": 14925,
    "preview": "(*** hide ***)\n\n#r \"../../src/Hype/bin/Release/netstandard2.0/DiffSharp.dll\"\n#r \"../../src/Hype/bin/Release/netstandard"
  },
  {
    "path": "docs/input/RecurrentNets.fsx",
    "chars": 11603,
    "preview": "(*** hide ***)\n#r \"../../src/Hype/bin/Release/netstandard2.0/DiffSharp.dll\"\n#r \"../../src/Hype/bin/Release/netstandard2"
  },
  {
    "path": "docs/input/Regression.fsx",
    "chars": 41374,
    "preview": "(*** hide ***)\n#r \"../../src/Hype/bin/Release/netstandard2.0/DiffSharp.dll\"\n#r \"../../src/Hype/bin/Release/netstandard2"
  },
  {
    "path": "docs/input/Training.fsx",
    "chars": 86733,
    "preview": "(*** hide ***)\n#r \"../../src/Hype/bin/Release/netstandard2.0/DiffSharp.dll\"\n#r \"../../src/Hype/bin/Release/netstandard2"
  },
  {
    "path": "docs/input/download.fsx",
    "chars": 616,
    "preview": "(**\nDownload\n========\n\nHype is tested on Linux and Windows.\n\nYou can download the source code or the binaries of the [l"
  },
  {
    "path": "docs/input/files/misc/style.css",
    "chars": 4797,
    "preview": "@import url(https://fonts.googleapis.com/css?family=Droid+Sans|Droid+Sans+Mono|Open+Sans:400,600,700);\n\n/*--------------"
  },
  {
    "path": "docs/input/files/misc/style_light.css",
    "chars": 4942,
    "preview": "@import url(https://fonts.googleapis.com/css?family=Droid+Sans|Droid+Sans+Mono|Open+Sans:400,600,700);\n\n/*--------------"
  },
  {
    "path": "docs/input/files/misc/tips.js",
    "chars": 1295,
    "preview": "var currentTip = null;\nvar currentTipElement = null;\n\nfunction hideTip(evt, name, unique) {\n    var el = document.getEle"
  },
  {
    "path": "docs/input/housing.data",
    "chars": 41241,
    "preview": "0.00632\t18.00\t2.310\t0\t0.5380\t6.5750\t65.20\t4.0900\t1\t296.0\t15.30\t396.90\t4.98\t24.00\n0.02731\t0.00\t7.070\t0\t0.4690\t6.4210\t78.9"
  },
  {
    "path": "docs/input/index.fsx",
    "chars": 5014,
    "preview": "(*** hide ***)\n#r \"../../src/Hype/bin/Debug/DiffSharp.dll\"\n#r \"../../src/Hype/bin/Debug/Hype.dll\"\nopen DiffSharp.AD.Flo"
  },
  {
    "path": "docs/input/templates/docpage.cshtml",
    "chars": 159,
    "preview": "@{\n  Layout = \"template\";\n  Title = Properties[\"page-title\"];\n  Description = Properties[\"project-summary\"];\n}\n@Propert"
  },
  {
    "path": "docs/input/templates/reference/module.cshtml",
    "chars": 3299,
    "preview": "@using FSharp.MetadataFormat\n@{\n  Layout = \"template\";\n  Title = Model.Module.Name + \" - \" + Properties[\"project-name\"];"
  },
  {
    "path": "docs/input/templates/reference/namespaces.cshtml",
    "chars": 1469,
    "preview": "@using FSharp.MetadataFormat\n@{\n  Layout = \"template\";\n  Title = \"Namespaces - \" + Properties[\"project-name\"];\n}\n\n<h1>@M"
  },
  {
    "path": "docs/input/templates/reference/part-members.cshtml",
    "chars": 1558,
    "preview": "@if (Enumerable.Count(Model.Members) > 0) {\n  <h3>@Model.Header</h3>\n  <table class=\"table table-bordered member-list\" s"
  },
  {
    "path": "docs/input/templates/reference/part-nested.cshtml",
    "chars": 1179,
    "preview": "@if (Enumerable.Count(Model.Types) > 0) {\n  <table class=\"table table-bordered type-list\" style=\"border-color:#2f2f2f\">\n"
  },
  {
    "path": "docs/input/templates/reference/type.cshtml",
    "chars": 3310,
    "preview": "@using FSharp.MetadataFormat\n@{\n  Layout = \"template\";\n  Title = Model.Type.Name + \" - \" + Properties[\"project-name\"];\n}"
  },
  {
    "path": "docs/input/templates/template.cshtml",
    "chars": 4217,
    "preview": "<!DOCTYPE html>\n<html lang=\"en\">\n  <head>\n    <meta charset=\"utf-8\">\n    <title>@Title</title>\n    <meta name=\"viewport"
  },
  {
    "path": "docs/input/templates/template.html",
    "chars": 4199,
    "preview": "<!DOCTYPE html>\n<html lang=\"en\">\n  <head>\n    <meta charset=\"utf-8\">\n    <title>{page-title}</title>\n    <meta name=\"vie"
  },
  {
    "path": "paket.dependencies",
    "chars": 454,
    "preview": "source https://api.nuget.org/v3/index.json\nframework: netstandard2.0\nredirects: on\nstorage: none\n\nnuget System.Drawing.C"
  },
  {
    "path": "src/Hype/AssemblyInfo.fs",
    "chars": 1733,
    "preview": "namespace Hype.AssemblyInfo\n\nopen System.Reflection\nopen System.Runtime.CompilerServices\nopen System.Runtime.InteropSer"
  },
  {
    "path": "src/Hype/Classifier.fs",
    "chars": 2201,
    "preview": "//\n// This file is part of\n// Hype: Compositional Machine Learning and Hyperparameter Optimization\n//\n// Copyright (c) "
  },
  {
    "path": "src/Hype/Hype.fs",
    "chars": 19588,
    "preview": "//\n// This file is part of\n// Hype: Compositional Machine Learning and Hyperparameter Optimization\n//\n// Copyright (c) "
  },
  {
    "path": "src/Hype/Hype.fsproj",
    "chars": 879,
    "preview": "<?xml version=\"1.0\" encoding=\"utf-8\"?>\n<Project Sdk=\"Microsoft.NET.Sdk\">\n  <PropertyGroup>\n    <TargetFrameworks>netsta"
  },
  {
    "path": "src/Hype/Inference.fs",
    "chars": 1570,
    "preview": "//\n// This file is part of\n// Hype: Compositional Machine Learning and Hyperparameter Optimization\n//\n// Copyright (c) "
  },
  {
    "path": "src/Hype/NLP.fs",
    "chars": 3093,
    "preview": "//\n// This file is part of\n// Hype: Compositional Machine Learning and Hyperparameter Optimization\n//\n// Copyright (c) "
  },
  {
    "path": "src/Hype/Neural.fs",
    "chars": 30169,
    "preview": "//\n// This file is part of\n// Hype: Compositional Machine Learning and Hyperparameter Optimization\n//\n// Copyright (c) "
  },
  {
    "path": "src/Hype/Optimize.fs",
    "chars": 35497,
    "preview": "//\n// This file is part of\n// Hype: Compositional Machine Learning and Hyperparameter Optimization\n//\n// Copyright (c) "
  },
  {
    "path": "src/Hype/app.config",
    "chars": 2340,
    "preview": "<?xml version=\"1.0\" encoding=\"utf-8\"?>\n<configuration>\n  <runtime>\n    \n  <assemblyBinding xmlns=\"urn:schemas-microsoft"
  },
  {
    "path": "src/Hype/paket.references",
    "chars": 43,
    "preview": "DiffSharp\nFSharp.Core\nSystem.Drawing.Common"
  }
]

// ... and 1 more file (download for full content)

About this extraction

This page contains the full source code of the hypelib/Hype GitHub repository, extracted and formatted as plain text for AI agents and large language models (LLMs). The extraction includes 42 files (398.3 KB), approximately 145.5k tokens, and a symbol index with 4 extracted functions, classes, methods, constants, and types. Use this with OpenClaw, Claude, ChatGPT, Cursor, Windsurf, or any other AI tool that accepts text input. You can copy the full output to your clipboard or download it as a .txt file.

Extracted by GitExtract — free GitHub repo to text converter for AI. Built by Nikandr Surkov.
