Repository: HarvestProfit/react-native-rectangle-scanner
Branch: master
Commit: 22fe3cfccf08
Files: 54
Total size: 176.7 KB
Directory structure:
gitextract_pw_p8jxp/
├── .eslintrc.json
├── .gitignore
├── CONTRIBUTING.md
├── LICENSE.md
├── README.md
├── RNRectangleScanner.podspec
├── android/
│   ├── .settings/
│   │   └── org.eclipse.buildship.core.prefs
│   ├── build.gradle
│   ├── gradle/
│   │   └── wrapper/
│   │       ├── gradle-wrapper.jar
│   │       └── gradle-wrapper.properties
│   ├── gradle.properties
│   ├── gradlew
│   ├── gradlew.bat
│   └── src/
│       └── main/
│           ├── AndroidManifest.xml
│           ├── java/
│           │   └── com/
│           │       └── rectanglescanner/
│           │           ├── RNRectangleScannerManager.java
│           │           ├── RNRectangleScannerModule.java
│           │           ├── RectangleScannerPackage.java
│           │           ├── helpers/
│           │           │   ├── CapturedImage.java
│           │           │   ├── ImageProcessor.java
│           │           │   ├── ImageProcessorMessage.java
│           │           │   └── Quadrilateral.java
│           │           └── views/
│           │               ├── CameraDeviceController.java
│           │               ├── MainView.java
│           │               ├── RNRectangleScannerView.java
│           │               └── RectangleDetectionController.java
│           └── res/
│               └── layout/
│                   └── activity_rectangle_scanner.xml
├── example/
│   ├── .gitignore
│   ├── App.js
│   ├── app.json
│   ├── babel.config.js
│   ├── package.json
│   └── src/
│       ├── ScanDocument/
│       │   ├── CameraControls.js
│       │   ├── DocumentScanner.js
│       │   ├── index.js
│       │   └── styles.js
│       └── useIsMultiTasking.js
├── index.js
├── ios/
│   ├── CameraDeviceController.h
│   ├── CameraDeviceController.m
│   ├── RNRectangleScanner.xcodeproj/
│   │   ├── project.pbxproj
│   │   └── xcshareddata/
│   │       └── xcschemes/
│   │           └── RNRectangleScanner.xcscheme
│   ├── RNRectangleScannerManager.h
│   ├── RNRectangleScannerManager.m
│   ├── RNRectangleScannerView.h
│   ├── RNRectangleScannerView.m
│   ├── RectangleDetectionController.h
│   └── RectangleDetectionController.m
├── package.json
├── react-native.config.js
└── src/
    ├── Filters.js
    ├── FlashAnimation.js
    ├── RectangleOverlay.js
    ├── Scanner.js
    └── index.d.ts
================================================
FILE CONTENTS
================================================
================================================
FILE: .eslintrc.json
================================================
{
  "env": {
    "jest": true
  },
  "extends": "airbnb",
  "parser": "babel-eslint",
  "rules": {
    "react/no-unescaped-entities": 0,
    "react/jsx-filename-extension": [1, { "extensions": [".js", ".jsx"] }],
    "function-paren-newline": ["error", "consistent"],
    "object-curly-newline": ["error", { "consistent": true }],
    "react/destructuring-assignment": 0,
    "jsx-a11y/accessible-emoji": 0
  }
}
================================================
FILE: .gitignore
================================================
# OSX
#
.DS_Store
# XDE
.expo/
# VSCode
.vscode/
jsconfig.json
# Xcode
#
build/
*.pbxuser
!default.pbxuser
*.mode1v3
!default.mode1v3
*.mode2v3
!default.mode2v3
*.perspectivev3
!default.perspectivev3
xcuserdata
*.xccheckout
*.moved-aside
DerivedData
*.hmap
*.ipa
*.xcuserstate
project.xcworkspace
# Android/IntelliJ
#
build/
.idea
.gradle
local.properties
*.iml
# node.js
#
node_modules/
npm-debug.log
yarn-debug.log
yarn-error.log
# BUCK
buck-out/
\.buckd/
android/app/libs
android/keystores/debug.keystore
# generated by bob
lib/
# example expo app (ignore ios and android folders)
example/ios/
example/android/
================================================
FILE: CONTRIBUTING.md
================================================
# Contributing
### Issues
When opening an issue, try to be specific. For example, if you are opening an issue relating to the build process on Android, it is helpful to include a stack trace and the Gradle version you are using.
I will usually reply to an issue within the first 24 hours or so, asking for more information or providing help. If the issue requires a code fix, this will take longer.
### Pull Requests
I'm always looking for additional help and welcome PRs! One thing to note: I am a big fan of understanding why code is being added or removed. So if you open a PR, please reference a link explaining why the change is being made (ex: Apple's docs say to do this... + link). This helps get the code merged in faster (otherwise, I will search the web and docs for the rationale behind your PR), and I think it helps other programmers too.
### Design of Code
This package is built for React developers. This means that the native code should not restrict the javascript functionality and should instead supply a robust API. For example, instead of implementing a "Focus on Point" feature in iOS and Android, we supply the javascript with an API to focus the camera. The javascript developer can then implement their own algorithm for camera focusing if they wish. **When requesting a feature or creating a PR, you should take this into account.**
================================================
FILE: LICENSE.md
================================================
Copyright (c) 2020 GitHub Inc.
Permission is hereby granted, free of charge, to any person obtaining
a copy of this software and associated documentation files (the
"Software"), to deal in the Software without restriction, including
without limitation the rights to use, copy, modify, merge, publish,
distribute, sublicense, and/or sell copies of the Software, and to
permit persons to whom the Software is furnished to do so, subject to
the following conditions:
The above copyright notice and this permission notice shall be
included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE
LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION
WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
================================================
FILE: README.md
================================================
# `react-native-rectangle-scanner`
### ⚠️ Deprecation Notice ⚠️
iOS and Android have come a long way since this package was first released. Both iOS Vision and Android's Google Play Services allow you to use their built-in document scanners, both of which are far more capable than this package (including editing the detected boundaries).
https://github.com/WebsiteBeaver/react-native-document-scanner-plugin is an NPM package that we switched to; it supports the native APIs mentioned above and is also working well in our Expo app.
I released one last version of this package that corrects a few minor issues, allowing it to work with Expo 50+ in dev client mode.

[](https://www.npmjs.com/package/react-native-rectangle-scanner)  
Live photo rectangle detection library useful for scanning documents. On capture, it returns the URIs for the original and a cropped version of the image allowing you to use the images as you want. You can additionally apply filters to adjust the visibility of text on the image (similar to the iOS document scanner filters).
- Live detection
- Perspective correction and crop of the image
- Filters
- Flash
- Orientation Changes
- Camera permission and capabilities detection
- Fully customizable UI
## Getting started
Install the library using either yarn:
```sh
yarn add react-native-rectangle-scanner
```
or npm:
```sh
npm install react-native-rectangle-scanner --save
```
You will also need to install `react-native-svg`, which is used for drawing the detected rectangle over the camera view.
### iOS Only
CocoaPods on iOS needs this extra step:
```sh
cd ios && pod install && cd ..
```
**NOTE**: you need to be targeting iOS 10 or greater. Your Podfile may need `platform :ios, '10.0'` at the top.
#### Info.plist
Add Camera permissions request:
Add the `NSCameraUsageDescription` tag, otherwise you will only see a black screen and no camera. iOS needs to know why you want to use the camera.
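For example, add the following to your `Info.plist` (the description string below is just a placeholder; write one that explains your app's actual use of the camera):

```xml
<key>NSCameraUsageDescription</key>
<string>This app uses the camera to scan documents.</string>
```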
### Android Only
If you do not already have it in your project, you must link OpenCV in your `settings.gradle` file:
```java
include ':openCVLibrary310'
project(':openCVLibrary310').projectDir = new File(rootProject.projectDir,'../node_modules/react-native-rectangle-scanner/android/openCVLibrary310')
```
#### In android/app/src/main/AndroidManifest.xml
Add Camera permissions request:
```
<uses-permission android:name="android.permission.CAMERA" />
```
## Usage
This is the most barebones usage of the scanner. It will show a fullscreen camera preview with no controls on it. Calling `this.camera.current.capture()` will trigger a capture; after the image has been captured and processed (cropped, filtered, stored/cached), the `onPictureProcessed` callback will fire.
```javascript
import React, { Component } from "react"
import Scanner from "react-native-rectangle-scanner"

class DocumentScanner extends Component {
  camera = React.createRef();

  handleOnPictureProcessed = ({ croppedImage, initialImage }) => {
    this.props.doSomethingWithCroppedImagePath(croppedImage);
    this.props.doSomethingWithOriginalImagePath(initialImage);
  }

  onCapture = () => {
    this.camera.current.capture();
  }

  render() {
    return (
      <Scanner
        onPictureProcessed={this.handleOnPictureProcessed}
        ref={this.camera}
        style={{ flex: 1 }}
      />
    );
  }
}
```
Above is a very barebones version of the scanner. Check out a full example in the [example folder](example/CompleteExample.js). That will handle device-specific things, rendering error states, camera controls for different device sizes, multitasking mode, etc. This is what I would consider the production-ready version of using this package (it's actually very similar to the component(s) that we use in production).
## Simulators
This package works on a simulator. Android has a pretty cool VR world that emulates a camera. On iOS the preview will just be a black screen, and the `onDeviceSetup` callback will return `false` for the `hasCamera` attribute so you can show a custom message like "This device doesn't have a camera".
## Properties
| Prop | Default | Type | Description |
| :-------------------------- | :-----: | :-------: | :--------------------------------------------------------- |
| filterId | `none` | `integer` | The id of the filter to use. [See More](#filters) |
| enableTorch | `false` | `bool` | If the flashlight should be turned on |
| capturedQuality | `0.5` | `float` | The jpeg quality of the output images |
| onTorchChanged | `null` | `func` | Called when the system changes the flash state |
| onRectangleDetected | `null` | `func` | Called when the system detects a rectangle on the image, sends the coordinates |
| onPictureTaken | `null` | `func` | Called after an image is captured. It hasn't been cached yet but it will send you the URIs of where it will store it |
| onPictureProcessed | `null` | `func` | Called after an image was captured and cached. It sends the URIs of where it stored the images. |
| styles | `null` | `object` | Styles the camera view (works best on fullscreen/flex: 1). |
| onErrorProcessingImage | `null` | `func` | Called if there was an error capturing the image. Includes a `message` and the paths it was trying to save if the error was failing to save the image. |
| onDeviceSetup | `null` | `func` | Called after the system sets up the camera allowing you to configure the view for different device setups. |
| androidPermission | `null` | `object or false` | ANDROID ONLY: Allows specifying the permission object on android or disabling entirely (pass `false`). |
### onDeviceSetup
This callback is really important. When you show the Scanner component, it will start setting up the camera. The `onDeviceSetup({hasCamera, permissionToUseCamera, flashIsAvailable, previewHeightPercent, previewWidthPercent})` payload contains all the details you need to configure the camera view.
`hasCamera` will notify you if the device even has a camera. iOS simulators do not have a camera for example. This gives you the chance to hide the camera preview and show an error or something.
`permissionToUseCamera` will tell you if the user has granted permission to use the camera.
`flashIsAvailable` tells you if the device has a flashlight that you can use.
`previewHeightPercent` and `previewWidthPercent` contain percentages of the portrait view that the preview takes up. This is important because on android devices, there are preset preview sizes that may or may not match the screen size. So you can't just show the preview at full screen or the preview will be stretched. See the example on how I handle this.
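As an illustration, a small helper (hypothetical, not part of the package) can turn those percentages into concrete preview dimensions; the percentages are relative to the portrait orientation of the screen:

```javascript
// Hypothetical helper: compute the on-screen preview size from the
// previewHeightPercent / previewWidthPercent values that onDeviceSetup
// reports. The percentages are relative to the portrait screen size.
function getPreviewSize(screen, { previewHeightPercent = 1, previewWidthPercent = 1 }) {
  // In portrait, the short side is the width and the long side is the height.
  const portraitWidth = Math.min(screen.width, screen.height);
  const portraitHeight = Math.max(screen.width, screen.height);
  return {
    width: portraitWidth * previewWidthPercent,
    height: portraitHeight * previewHeightPercent,
  };
}

// e.g. a 400x800 screen whose preview covers 90% of the height:
// getPreviewSize({ width: 400, height: 800 },
//                { previewHeightPercent: 0.9, previewWidthPercent: 1 })
// -> { width: 400, height: 720 }
```

You would then render the preview at this computed size (centered or letterboxed) instead of stretching it to fill the screen.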
### Torch
When changing the `enableTorch` property, the system will call the `onTorchChanged({enabled})` callback with the new state, allowing you to keep your component state in sync. Natively, the torch is turned off when the component cleans up or after an image is captured; the callback fires then as well so you can update your state.
### Rectangle Detection
Rectangle detection does NOT show up on the UI automatically. You must take the coordinates from the `onRectangleDetected({detectedRectangle})` callback and render a view that displays a rectangle over the camera view. This can be done easily with a simple SVG by importing `RectangleOverlay` from this package and feeding it the detected rectangle object.
Why not just handle it natively? Because this allows much more customization of the rectangle overlay. For example, you could black out the entire image except where the detected rectangle is. It also lets you control auto capture and UI changes on detection in javascript.
#### Auto Capture
Auto capturing is handled entirely in the `RectangleOverlay` component by simply setting its `allowDetection={true}` and `onDetectedCapture={this.captureImage}` props. See that component for documentation.
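Putting the last two sections together, a minimal sketch might look like the following. This is an assumption-laden illustration, not the package's canonical example: the `detectedRectangle` prop name on `RectangleOverlay` mirrors the `onRectangleDetected` payload, so check the component source before relying on it.

```javascript
import React from 'react';
import Scanner, { RectangleOverlay } from 'react-native-rectangle-scanner';

// Sketch: feed the coordinates from onRectangleDetected into the overlay,
// and let the overlay trigger an auto capture once a rectangle is stable.
class AutoCapturingScanner extends React.Component {
  camera = React.createRef();

  state = { detectedRectangle: null };

  captureImage = () => this.camera.current.capture();

  render() {
    return (
      <React.Fragment>
        <Scanner
          ref={this.camera}
          onRectangleDetected={({ detectedRectangle }) => this.setState({ detectedRectangle })}
          style={{ flex: 1 }}
        />
        <RectangleOverlay
          detectedRectangle={this.state.detectedRectangle}
          allowDetection
          onDetectedCapture={this.captureImage}
        />
      </React.Fragment>
    );
  }
}
```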
#### Focusing
iOS and some Android devices support `continuous focus` mode on their cameras, so we rarely need to worry about focusing the camera. There is a `focus()` function you can call on the ref which will trigger a refocus on Android devices. *This will likely be expanded in the future to support points so you can focus on a specific location.*
### Capturing An Image
To capture an image, you must create a ref to the component. This ref will allow you to call `capture()` which will trigger the capture asynchronously.
Once triggered, it will take the current detected rectangle and crop, apply filters, and transform the image to correct the perspective. It will call `onPictureTaken({croppedImage, initialImage})` containing the URIs of the cropped image and the original image. NOTE: The image still needs to be cached which can take a few ms, so loading the image will not work yet.
The picture will then start to be processed and cached. Once done, it will call `onPictureProcessed({croppedImage, initialImage})` containing the URIs of the images. This is called after the image is cached which means you can load the images into the UI.
NOTE: There are no UI changes when you capture an image. No screen flash, only a camera sound. This is intentional so you can design the experience yourself. *The easiest way is to use an animated view to flash a white screen.* You can import the `FlashAnimation` component to do this if you want.
**NOTE**: captured images are stored in the app's cache directory under the `CACHE_FOLDER_NAME`. This allows you to clear the cached images when you are done. (This is advised although these may get deleted by the system.)
**NOTE**: on iOS, it will try to correct the rotation of the image. If you are in portrait mode, but the phone is rotated to landscape, it will rotate the captured image automatically.
### Filters
Instead of allowing you to customize the contrast, saturation, etc. of the image, I prebuilt the filters. This is because the filter controls differ massively between platforms, and changing those values produces very different image outputs. Below are the available filters. Honestly, the color controls were pretty bad on Android, so the best approach there is probably to offer just Color and Black & White instead of showing all 4 (they are only slightly better than Greyscale and the original photo).
| ID | Name | Default | Description | Preview |
| -- | ------------- | ------- | -------------------------------------- | -------------------------------------------|
| 1 | Color | | Optimized for legibility with color. |  |
| 2 | Black & White | | Optimized for legibility without color |  |
| 3 | Greyscale | | A black & white version of the image |  |
| 4 | Photo | YES | Just the photo |  |
================================================
FILE: RNRectangleScanner.podspec
================================================
require 'json'
package = JSON.parse(File.read(File.join(__dir__, 'package.json')))
Pod::Spec.new do |s|
  s.name           = 'RNRectangleScanner'
  s.version        = package['version']
  s.summary        = package['description']
  s.description    = package['description']
  s.license        = package['license']
  s.author         = package['author']
  s.homepage       = 'https://github.com/HarvestProfit/react-native-rectangle-scanner'
  s.source         = { git: 'https://github.com/HarvestProfit/react-native-rectangle-scanner.git', tag: s.version }
  s.requires_arc   = true
  s.platform       = :ios, '10.0'
  s.preserve_paths = 'README.md', 'package.json', 'index.js'
  s.source_files   = 'ios/**/*.{h,m}'
  s.dependency 'React'
end
================================================
FILE: android/.settings/org.eclipse.buildship.core.prefs
================================================
connection.project.dir=
eclipse.preferences.version=1
================================================
FILE: android/build.gradle
================================================
buildscript {
    repositories {
        mavenCentral()
        google()
        maven {
            // All of React Native (JS, Obj-C sources, Android binaries) is installed from npm
            url "$rootDir/../node_modules/react-native/android"
        }
    }
    dependencies {
        classpath("com.android.tools.build:gradle:7.3.1")
    }
}

apply plugin: 'com.android.library'

android {
    compileSdkVersion 33

    defaultConfig {
        minSdkVersion 16
        targetSdkVersion 33
        versionCode 1
        versionName "1.0"
        ndk {
            abiFilters "armeabi-v7a", "x86"
        }
    }
}

repositories {
    mavenCentral()
}

dependencies {
    implementation 'org.opencv:opencv:4.9.0'
    implementation 'com.facebook.react:react-native:+'
}
================================================
FILE: android/gradle/wrapper/gradle-wrapper.properties
================================================
#Thu Aug 01 13:05:36 CDT 2024
distributionBase=GRADLE_USER_HOME
distributionPath=wrapper/dists
distributionUrl=https\://services.gradle.org/distributions/gradle-7.4-bin.zip
zipStoreBase=GRADLE_USER_HOME
zipStorePath=wrapper/dists
================================================
FILE: android/gradle.properties
================================================
# Project-wide Gradle settings.
# IDE (e.g. Android Studio) users:
# Gradle settings configured through the IDE *will override*
# any settings specified in this file.
# For more details on how to configure your build environment visit
# http://www.gradle.org/docs/current/userguide/build_environment.html
# Specifies the JVM arguments used for the daemon process.
# The setting is particularly useful for tweaking memory settings.
# Default value: -Xmx10248m -XX:MaxPermSize=256m
# org.gradle.jvmargs=-Xmx2048m -XX:MaxPermSize=512m -XX:+HeapDumpOnOutOfMemoryError -Dfile.encoding=UTF-8
# When configured, Gradle will run in incubating parallel mode.
# This option should only be used with decoupled projects. More details, visit
# http://www.gradle.org/docs/current/userguide/multi_project_builds.html#sec:decoupled_projects
# org.gradle.parallel=true
# android.useDeprecatedNdk=true
================================================
FILE: android/gradlew
================================================
#!/usr/bin/env bash
##############################################################################
##
## Gradle start up script for UN*X
##
##############################################################################
# Add default JVM options here. You can also use JAVA_OPTS and GRADLE_OPTS to pass JVM options to this script.
DEFAULT_JVM_OPTS=""
APP_NAME="Gradle"
APP_BASE_NAME=`basename "$0"`
# Use the maximum available, or set MAX_FD != -1 to use that value.
MAX_FD="maximum"
warn ( ) {
echo "$*"
}
die ( ) {
echo
echo "$*"
echo
exit 1
}
# OS specific support (must be 'true' or 'false').
cygwin=false
msys=false
darwin=false
case "`uname`" in
CYGWIN* )
cygwin=true
;;
Darwin* )
darwin=true
;;
MINGW* )
msys=true
;;
esac
# For Cygwin, ensure paths are in UNIX format before anything is touched.
if $cygwin ; then
[ -n "$JAVA_HOME" ] && JAVA_HOME=`cygpath --unix "$JAVA_HOME"`
fi
# Attempt to set APP_HOME
# Resolve links: $0 may be a link
PRG="$0"
# Need this for relative symlinks.
while [ -h "$PRG" ] ; do
ls=`ls -ld "$PRG"`
link=`expr "$ls" : '.*-> \(.*\)$'`
if expr "$link" : '/.*' > /dev/null; then
PRG="$link"
else
PRG=`dirname "$PRG"`"/$link"
fi
done
SAVED="`pwd`"
cd "`dirname \"$PRG\"`/" >&-
APP_HOME="`pwd -P`"
cd "$SAVED" >&-
CLASSPATH=$APP_HOME/gradle/wrapper/gradle-wrapper.jar
# Determine the Java command to use to start the JVM.
if [ -n "$JAVA_HOME" ] ; then
if [ -x "$JAVA_HOME/jre/sh/java" ] ; then
# IBM's JDK on AIX uses strange locations for the executables
JAVACMD="$JAVA_HOME/jre/sh/java"
else
JAVACMD="$JAVA_HOME/bin/java"
fi
if [ ! -x "$JAVACMD" ] ; then
die "ERROR: JAVA_HOME is set to an invalid directory: $JAVA_HOME
Please set the JAVA_HOME variable in your environment to match the
location of your Java installation."
fi
else
JAVACMD="java"
which java >/dev/null 2>&1 || die "ERROR: JAVA_HOME is not set and no 'java' command could be found in your PATH.
Please set the JAVA_HOME variable in your environment to match the
location of your Java installation."
fi
# Increase the maximum file descriptors if we can.
if [ "$cygwin" = "false" -a "$darwin" = "false" ] ; then
MAX_FD_LIMIT=`ulimit -H -n`
if [ $? -eq 0 ] ; then
if [ "$MAX_FD" = "maximum" -o "$MAX_FD" = "max" ] ; then
MAX_FD="$MAX_FD_LIMIT"
fi
ulimit -n $MAX_FD
if [ $? -ne 0 ] ; then
warn "Could not set maximum file descriptor limit: $MAX_FD"
fi
else
warn "Could not query maximum file descriptor limit: $MAX_FD_LIMIT"
fi
fi
# For Darwin, add options to specify how the application appears in the dock
if $darwin; then
GRADLE_OPTS="$GRADLE_OPTS \"-Xdock:name=$APP_NAME\" \"-Xdock:icon=$APP_HOME/media/gradle.icns\""
fi
# For Cygwin, switch paths to Windows format before running java
if $cygwin ; then
APP_HOME=`cygpath --path --mixed "$APP_HOME"`
CLASSPATH=`cygpath --path --mixed "$CLASSPATH"`
# We build the pattern for arguments to be converted via cygpath
ROOTDIRSRAW=`find -L / -maxdepth 1 -mindepth 1 -type d 2>/dev/null`
SEP=""
for dir in $ROOTDIRSRAW ; do
ROOTDIRS="$ROOTDIRS$SEP$dir"
SEP="|"
done
OURCYGPATTERN="(^($ROOTDIRS))"
# Add a user-defined pattern to the cygpath arguments
if [ "$GRADLE_CYGPATTERN" != "" ] ; then
OURCYGPATTERN="$OURCYGPATTERN|($GRADLE_CYGPATTERN)"
fi
# Now convert the arguments - kludge to limit ourselves to /bin/sh
i=0
for arg in "$@" ; do
CHECK=`echo "$arg"|egrep -c "$OURCYGPATTERN" -`
CHECK2=`echo "$arg"|egrep -c "^-"` ### Determine if an option
if [ $CHECK -ne 0 ] && [ $CHECK2 -eq 0 ] ; then ### Added a condition
eval `echo args$i`=`cygpath --path --ignore --mixed "$arg"`
else
eval `echo args$i`="\"$arg\""
fi
i=$((i+1))
done
case $i in
(0) set -- ;;
(1) set -- "$args0" ;;
(2) set -- "$args0" "$args1" ;;
(3) set -- "$args0" "$args1" "$args2" ;;
(4) set -- "$args0" "$args1" "$args2" "$args3" ;;
(5) set -- "$args0" "$args1" "$args2" "$args3" "$args4" ;;
(6) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" ;;
(7) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" "$args6" ;;
(8) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" "$args6" "$args7" ;;
(9) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" "$args6" "$args7" "$args8" ;;
esac
fi
# Split up the JVM_OPTS And GRADLE_OPTS values into an array, following the shell quoting and substitution rules
function splitJvmOpts() {
JVM_OPTS=("$@")
}
eval splitJvmOpts $DEFAULT_JVM_OPTS $JAVA_OPTS $GRADLE_OPTS
JVM_OPTS[${#JVM_OPTS[*]}]="-Dorg.gradle.appname=$APP_BASE_NAME"
exec "$JAVACMD" "${JVM_OPTS[@]}" -classpath "$CLASSPATH" org.gradle.wrapper.GradleWrapperMain "$@"
================================================
FILE: android/gradlew.bat
================================================
@if "%DEBUG%" == "" @echo off
@rem ##########################################################################
@rem
@rem Gradle startup script for Windows
@rem
@rem ##########################################################################
@rem Set local scope for the variables with windows NT shell
if "%OS%"=="Windows_NT" setlocal
@rem Add default JVM options here. You can also use JAVA_OPTS and GRADLE_OPTS to pass JVM options to this script.
set DEFAULT_JVM_OPTS=
set DIRNAME=%~dp0
if "%DIRNAME%" == "" set DIRNAME=.
set APP_BASE_NAME=%~n0
set APP_HOME=%DIRNAME%
@rem Find java.exe
if defined JAVA_HOME goto findJavaFromJavaHome
set JAVA_EXE=java.exe
%JAVA_EXE% -version >NUL 2>&1
if "%ERRORLEVEL%" == "0" goto init
echo.
echo ERROR: JAVA_HOME is not set and no 'java' command could be found in your PATH.
echo.
echo Please set the JAVA_HOME variable in your environment to match the
echo location of your Java installation.
goto fail
:findJavaFromJavaHome
set JAVA_HOME=%JAVA_HOME:"=%
set JAVA_EXE=%JAVA_HOME%/bin/java.exe
if exist "%JAVA_EXE%" goto init
echo.
echo ERROR: JAVA_HOME is set to an invalid directory: %JAVA_HOME%
echo.
echo Please set the JAVA_HOME variable in your environment to match the
echo location of your Java installation.
goto fail
:init
@rem Get command-line arguments, handling Windowz variants
if not "%OS%" == "Windows_NT" goto win9xME_args
if "%@eval[2+2]" == "4" goto 4NT_args
:win9xME_args
@rem Slurp the command line arguments.
set CMD_LINE_ARGS=
set _SKIP=2
:win9xME_args_slurp
if "x%~1" == "x" goto execute
set CMD_LINE_ARGS=%*
goto execute
:4NT_args
@rem Get arguments from the 4NT Shell from JP Software
set CMD_LINE_ARGS=%$
:execute
@rem Setup the command line
set CLASSPATH=%APP_HOME%\gradle\wrapper\gradle-wrapper.jar
@rem Execute Gradle
"%JAVA_EXE%" %DEFAULT_JVM_OPTS% %JAVA_OPTS% %GRADLE_OPTS% "-Dorg.gradle.appname=%APP_BASE_NAME%" -classpath "%CLASSPATH%" org.gradle.wrapper.GradleWrapperMain %CMD_LINE_ARGS%
:end
@rem End local scope for the variables with windows NT shell
if "%ERRORLEVEL%"=="0" goto mainEnd
:fail
rem Set variable GRADLE_EXIT_CONSOLE if you need the _script_ return code instead of
rem the _cmd.exe /c_ return code!
if not "" == "%GRADLE_EXIT_CONSOLE%" exit 1
exit /b 1
:mainEnd
if "%OS%"=="Windows_NT" endlocal
:omega
================================================
FILE: android/src/main/AndroidManifest.xml
================================================
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
          package="com.rectanglescanner">
  <uses-permission android:name="android.permission.SYSTEM_ALERT_WINDOW"/>
  <uses-permission android:name="android.permission.CAMERA"/>
  <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
  <uses-permission android:name="android.permission.FLASHLIGHT" />
  <uses-feature android:name="android.hardware.camera" />
  <uses-feature android:name="android.hardware.camera.autofocus" />
</manifest>
================================================
FILE: android/src/main/java/com/rectanglescanner/RNRectangleScannerManager.java
================================================
package com.rectanglescanner;
import android.app.Activity;
import com.rectanglescanner.views.MainView;
import com.facebook.react.bridge.WritableMap;
import com.facebook.react.common.MapBuilder;
import com.facebook.react.uimanager.ThemedReactContext;
import com.facebook.react.uimanager.ViewGroupManager;
import com.facebook.react.uimanager.annotations.ReactProp;
import javax.annotation.Nullable;
import java.util.Map;
/**
* Created by Jake on Jan 6, 2020.
*/
public class RNRectangleScannerManager extends ViewGroupManager<MainView> {
  private static final String REACT_CLASS = "RNRectangleScanner";
  private MainView view = null;

  @Override
  public String getName() {
    return REACT_CLASS;
  }

  @Override
  protected MainView createViewInstance(final ThemedReactContext reactContext) {
    MainView.createInstance(reactContext, (Activity) reactContext.getBaseContext());
    view = MainView.getInstance();
    return view;
  }

  // MARK: Props

  @ReactProp(name = "enableTorch", defaultBoolean = false)
  public void setEnableTorch(MainView view, Boolean enable) {
    view.setEnableTorch(enable);
  }

  @ReactProp(name = "capturedQuality", defaultDouble = 0.5)
  public void setCapturedQuality(MainView view, double quality) {
    view.setCapturedQuality(quality);
  }

  @ReactProp(name = "filterId", defaultInt = 1)
  public void setFilterId(MainView view, int filterId) {
    view.setFilterId(filterId);
  }

  // Life cycle Events

  @Override
  public @Nullable Map getExportedCustomDirectEventTypeConstants() {
    return MapBuilder.of(
      "onDeviceSetup", MapBuilder.of("registrationName", "onDeviceSetup"),
      "onPictureTaken", MapBuilder.of("registrationName", "onPictureTaken"),
      "onPictureProcessed", MapBuilder.of("registrationName", "onPictureProcessed"),
      "onErrorProcessingImage", MapBuilder.of("registrationName", "onErrorProcessingImage"),
      "onRectangleDetected", MapBuilder.of("registrationName", "onRectangleDetected"),
      "onTorchChanged", MapBuilder.of("registrationName", "onTorchChanged")
    );
  }
}
================================================
FILE: android/src/main/java/com/rectanglescanner/RNRectangleScannerModule.java
================================================
package com.rectanglescanner;
import com.rectanglescanner.views.MainView;
import com.facebook.react.bridge.ReactApplicationContext;
import com.facebook.react.bridge.ReactContextBaseJavaModule;
import com.facebook.react.bridge.ReactMethod;
/**
* Created by Jake on Jan 6, 2020.
*/
public class RNRectangleScannerModule extends ReactContextBaseJavaModule {
  public RNRectangleScannerModule(ReactApplicationContext reactContext) {
    super(reactContext);
  }

  @Override
  public String getName() {
    return "RNRectangleScannerManager";
  }

  @ReactMethod
  public void start() {
    MainView view = MainView.getInstance();
    view.startCamera();
  }

  @ReactMethod
  public void stop() {
    MainView view = MainView.getInstance();
    view.stopCamera();
  }

  @ReactMethod
  public void cleanup() {
    MainView view = MainView.getInstance();
    view.cleanupCamera();
  }

  @ReactMethod
  public void refresh() {
    MainView view = MainView.getInstance();
    view.stopCamera();
    view.startCamera();
  }

  @ReactMethod
  public void capture() {
    MainView view = MainView.getInstance();
    view.capture();
  }

  @ReactMethod
  public void focus() {
    MainView view = MainView.getInstance();
    view.focusCamera();
  }
}
================================================
FILE: android/src/main/java/com/rectanglescanner/RectangleScannerPackage.java
================================================
package com.rectanglescanner;
import com.facebook.react.ReactPackage;
import com.facebook.react.bridge.JavaScriptModule;
import com.facebook.react.bridge.NativeModule;
import com.facebook.react.bridge.ReactApplicationContext;
import com.facebook.react.uimanager.ViewManager;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;
/**
* Created by Jake on Jan 6, 2020.
*/
public class RectangleScannerPackage implements ReactPackage {
  @Override
  public List<NativeModule> createNativeModules(ReactApplicationContext reactContext) {
    return Arrays.<NativeModule>asList(
      new RNRectangleScannerModule(reactContext)
    );
  }

  @Override
  public List<ViewManager> createViewManagers(ReactApplicationContext reactContext) {
    return Arrays.<ViewManager>asList(
      new RNRectangleScannerManager()
    );
  }
}
================================================
FILE: android/src/main/java/com/rectanglescanner/helpers/CapturedImage.java
================================================
package com.rectanglescanner.helpers;
import org.opencv.core.Mat;
import org.opencv.core.Point;
import org.opencv.core.Size;
/**
* Created by Jake on Jan 6, 2020.
*/
public class CapturedImage {
public Mat original;
public Mat processed;
public Point[] previewPoints;
public Size previewSize;
public Size originalSize;
public Point[] originalPoints;
public int heightWithRatio;
public int widthWithRatio;
public CapturedImage(Mat original) {
this.original = original;
}
public Mat getProcessed() {
return processed;
}
public CapturedImage setProcessed(Mat processed) {
this.processed = processed;
return this;
}
public void release() {
if (processed != null) {
processed.release();
}
if (original != null) {
original.release();
}
}
}
================================================
FILE: android/src/main/java/com/rectanglescanner/helpers/ImageProcessor.java
================================================
package com.rectanglescanner.helpers;
import android.content.Context;
import android.content.SharedPreferences;
import android.os.Handler;
import android.os.Looper;
import android.os.Message;
import android.preference.PreferenceManager;
import android.util.Log;
import com.rectanglescanner.views.RectangleDetectionController;
import com.rectanglescanner.helpers.ImageProcessorMessage;
import com.rectanglescanner.helpers.Quadrilateral;
import com.rectanglescanner.helpers.CapturedImage;
import android.view.Surface;
import org.opencv.core.Core;
import org.opencv.core.CvType;
import org.opencv.core.Mat;
import org.opencv.core.MatOfPoint;
import org.opencv.core.MatOfPoint2f;
import org.opencv.core.Point;
import org.opencv.core.Size;
import org.opencv.imgcodecs.Imgcodecs;
import org.opencv.imgproc.Imgproc;
import android.os.Bundle;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.Comparator;
import com.facebook.react.bridge.Arguments;
/**
Created by Jake on Jan 6, 2020.
Asynchronously processes either the preview frame, to detect rectangles, or
the captured image, to crop it and apply filters.
*/
public class ImageProcessor extends Handler {
private static final String TAG = "ImageProcessor";
private final RectangleDetectionController mMainActivity;
private Quadrilateral lastDetectedRectangle = null;
public ImageProcessor(Looper looper, RectangleDetectionController mainActivity, Context context) {
super(looper);
this.mMainActivity = mainActivity;
}
/**
Receives an event message to handle async
*/
public void handleMessage(Message msg) {
if (msg.obj.getClass() == ImageProcessorMessage.class) {
ImageProcessorMessage obj = (ImageProcessorMessage) msg.obj;
String command = obj.getCommand();
Log.d(TAG, "Message Received: " + command + " - " + obj.getObj().toString());
if (command.equals("previewFrame")) {
processPreviewFrame((Mat) obj.getObj());
} else if (command.equals("pictureTaken")) {
processCapturedImage((Mat) obj.getObj());
}
}
}
/**
Detect a rectangle in the current frame from the camera video
*/
private void processPreviewFrame(Mat frame) {
rotateImageForScreen(frame);
detectRectangleInFrame(frame);
frame.release();
mMainActivity.setImageProcessorBusy(false);
}
/**
Crops and filters the captured still image, then notifies the main activity
*/
private void processCapturedImage(Mat capturedImage) {
// Mat capturedImage = Imgcodecs.imdecode(picture, Imgcodecs.IMREAD_UNCHANGED);
// picture.release();
Log.d(TAG, "processCapturedImage - imported image " + capturedImage.size().width + "x" + capturedImage.size().height);
rotateImageForScreen(capturedImage);
CapturedImage doc = cropImageToLatestQuadrilateral(capturedImage);
mMainActivity.onProcessedCapturedImage(doc);
doc.release();
capturedImage.release();
mMainActivity.setImageProcessorBusy(false);
}
/**
Detects a rectangle from the image and sets the last detected rectangle
*/
private void detectRectangleInFrame(Mat inputRgba) {
ArrayList<MatOfPoint> contours = findContours(inputRgba);
Size srcSize = inputRgba.size();
this.lastDetectedRectangle = getQuadrilateral(contours, srcSize);
Bundle data = new Bundle();
if (this.lastDetectedRectangle != null) {
Bundle quadMap = this.lastDetectedRectangle.toBundle();
data.putBundle("detectedRectangle", quadMap);
} else {
data.putBoolean("detectedRectangle", false);
}
mMainActivity.rectangleWasDetected(Arguments.fromBundle(data));
}
/**
Crops the image to the latest detected rectangle and fixes perspective
*/
private CapturedImage cropImageToLatestQuadrilateral(Mat capturedImage) {
applyFilters(capturedImage);
Mat doc;
if (this.lastDetectedRectangle != null) {
Mat croppedCapturedImage = this.lastDetectedRectangle.cropImageToRectangleSize(capturedImage);
doc = fourPointTransform(croppedCapturedImage, this.lastDetectedRectangle.getPointsForSize(croppedCapturedImage.size()));
croppedCapturedImage.release();
} else {
doc = new Mat(capturedImage.size(), CvType.CV_8UC4);
capturedImage.copyTo(doc);
}
// Rotate both mats 90 degrees counterclockwise (transpose + vertical flip)
Core.flip(doc.t(), doc, 0);
Core.flip(capturedImage.t(), capturedImage, 0);
CapturedImage sd = new CapturedImage(capturedImage);
sd.originalSize = capturedImage.size();
// Width and height are swapped here because the mats were rotated above
sd.heightWithRatio = Double.valueOf(sd.originalSize.width).intValue();
sd.widthWithRatio = Double.valueOf(sd.originalSize.height).intValue();
return sd.setProcessed(doc);
}
private Quadrilateral getQuadrilateral(ArrayList<MatOfPoint> contours, Size srcSize) {
int height = Double.valueOf(srcSize.height).intValue();
int width = Double.valueOf(srcSize.width).intValue();
Size size = new Size(width, height);
Log.i(TAG, "Size----->" + size);
for (MatOfPoint c : contours) {
MatOfPoint2f c2f = new MatOfPoint2f(c.toArray());
double peri = Imgproc.arcLength(c2f, true);
MatOfPoint2f approx = new MatOfPoint2f();
Imgproc.approxPolyDP(c2f, approx, 0.02 * peri, true);
Point[] points = approx.toArray();
// select the largest polygon with four corners
// if (points.length == 4) {
Point[] foundPoints = sortPoints(points);
if (insideArea(foundPoints, size)) {
return new Quadrilateral(c, foundPoints, new Size(srcSize.width, srcSize.height));
}
// }
}
return null;
}
private Point[] sortPoints(Point[] src) {
ArrayList<Point> srcPoints = new ArrayList<>(Arrays.asList(src));
Point[] result = { null, null, null, null };
Comparator<Point> sumComparator = new Comparator<Point>() {
@Override
public int compare(Point lhs, Point rhs) {
return Double.compare(lhs.y + lhs.x, rhs.y + rhs.x);
}
};
Comparator<Point> diffComparator = new Comparator<Point>() {
@Override
public int compare(Point lhs, Point rhs) {
return Double.compare(lhs.y - lhs.x, rhs.y - rhs.x);
}
};
// top-left corner = minimal sum
result[0] = Collections.min(srcPoints, sumComparator);
// bottom-right corner = maximal sum
result[2] = Collections.max(srcPoints, sumComparator);
// top-right corner = minimal difference
result[1] = Collections.min(srcPoints, diffComparator);
// bottom-left corner = maximal difference
result[3] = Collections.max(srcPoints, diffComparator);
return result;
}
private boolean insideArea(Point[] rp, Size size) {
int width = Double.valueOf(size.width).intValue();
int height = Double.valueOf(size.height).intValue();
int minimumSize = width / 10;
boolean isANormalShape = rp[0].x != rp[1].x && rp[1].y != rp[0].y && rp[2].y != rp[3].y && rp[3].x != rp[2].x;
boolean isBigEnough = ((rp[1].x - rp[0].x >= minimumSize) && (rp[2].x - rp[3].x >= minimumSize)
&& (rp[3].y - rp[0].y >= minimumSize) && (rp[2].y - rp[1].y >= minimumSize));
double leftOffset = rp[0].x - rp[3].x;
double rightOffset = rp[1].x - rp[2].x;
double bottomOffset = rp[0].y - rp[1].y;
double topOffset = rp[2].y - rp[3].y;
boolean isAnActualRectangle = ((leftOffset <= minimumSize && leftOffset >= -minimumSize)
&& (rightOffset <= minimumSize && rightOffset >= -minimumSize)
&& (bottomOffset <= minimumSize && bottomOffset >= -minimumSize)
&& (topOffset <= minimumSize && topOffset >= -minimumSize));
return isANormalShape && isAnActualRectangle && isBigEnough;
}
private Mat fourPointTransform(Mat src, Point[] pts) {
Point tl = pts[0];
Point tr = pts[1];
Point br = pts[2];
Point bl = pts[3];
double widthA = Math.sqrt(Math.pow(br.x - bl.x, 2) + Math.pow(br.y - bl.y, 2));
double widthB = Math.sqrt(Math.pow(tr.x - tl.x, 2) + Math.pow(tr.y - tl.y, 2));
double dw = Math.max(widthA, widthB);
int maxWidth = Double.valueOf(dw).intValue();
double heightA = Math.sqrt(Math.pow(tr.x - br.x, 2) + Math.pow(tr.y - br.y, 2));
double heightB = Math.sqrt(Math.pow(tl.x - bl.x, 2) + Math.pow(tl.y - bl.y, 2));
double dh = Math.max(heightA, heightB);
int maxHeight = Double.valueOf(dh).intValue();
Mat doc = new Mat(maxHeight, maxWidth, CvType.CV_8UC4);
Mat src_mat = new Mat(4, 1, CvType.CV_32FC2);
Mat dst_mat = new Mat(4, 1, CvType.CV_32FC2);
src_mat.put(0, 0, tl.x, tl.y, tr.x, tr.y, br.x, br.y,
bl.x, bl.y);
dst_mat.put(0, 0, 0.0, 0.0, dw, 0.0, dw, dh, 0.0, dh);
Mat m = Imgproc.getPerspectiveTransform(src_mat, dst_mat);
Imgproc.warpPerspective(src, doc, m, doc.size());
return doc;
}
private ArrayList<MatOfPoint> findContours(Mat src) {
Mat grayImage;
Mat cannedImage;
Mat resizedImage;
int height = Double.valueOf(src.size().height).intValue();
int width = Double.valueOf(src.size().width).intValue();
Size size = new Size(width, height);
resizedImage = new Mat(size, CvType.CV_8UC4);
grayImage = new Mat(size, CvType.CV_8UC4);
cannedImage = new Mat(size, CvType.CV_8UC1);
Imgproc.resize(src, resizedImage, size);
Imgproc.cvtColor(resizedImage, grayImage, Imgproc.COLOR_RGBA2GRAY, 4);
Imgproc.GaussianBlur(grayImage, grayImage, new Size(5, 5), 0);
Imgproc.Canny(grayImage, cannedImage, 80, 100, 3, false);
ArrayList<MatOfPoint> contours = new ArrayList<>();
Mat hierarchy = new Mat();
Imgproc.findContours(cannedImage, contours, hierarchy, Imgproc.RETR_TREE, Imgproc.CHAIN_APPROX_SIMPLE);
hierarchy.release();
Collections.sort(contours, new Comparator<MatOfPoint>() {
@Override
public int compare(MatOfPoint lhs, MatOfPoint rhs) {
return Double.compare(Imgproc.contourArea(rhs), Imgproc.contourArea(lhs));
}
});
resizedImage.release();
grayImage.release();
cannedImage.release();
return contours;
}
/*!
Applies filters to the image based on the set filter
*/
public void applyFilters(Mat image) {
int filterId = this.mMainActivity.getFilterId();
switch (filterId) {
case 1: {
// original image
break;
}
case 2: {
applyGreyscaleFilterToImage(image);
break;
}
case 3: {
applyColorFilterToImage(image);
break;
}
case 4: {
applyBlackAndWhiteFilterToImage(image);
break;
}
default:
// original image
}
}
/*!
Converts the image to greyscale
*/
public Mat applyGreyscaleFilterToImage(Mat image)
{
Imgproc.cvtColor(image, image, Imgproc.COLOR_RGBA2GRAY);
return image;
}
/*!
Converts the image to greyscale and slightly brightens it to approximate
black and white
*/
public Mat applyBlackAndWhiteFilterToImage(Mat image)
{
Imgproc.cvtColor(image, image, Imgproc.COLOR_RGBA2GRAY);
image.convertTo(image, -1, 1, 10);
return image;
}
/*!
Slightly enhances the color of the image by scaling the pixel values
*/
public Mat applyColorFilterToImage(Mat image)
{
image.convertTo(image, -1, 1.2, 0);
return image;
}
/*!
Rotates the image in place to match the last detected screen rotation
*/
public void rotateImageForScreen(Mat image) {
switch (this.mMainActivity.lastDetectedRotation) {
case Surface.ROTATION_90: {
// Do nothing
break;
}
case Surface.ROTATION_180: {
Core.flip(image.t(), image, 0);
break;
}
case Surface.ROTATION_270: {
Core.flip(image, image, 0);
Core.flip(image, image, 1);
break;
}
case Surface.ROTATION_0:
default: {
Core.flip(image.t(), image, 1);
break;
}
}
}
}
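Two of the geometry steps above, the sum/difference corner ordering in `sortPoints` and the output sizing in `fourPointTransform`, can be exercised without OpenCV. The following is a minimal standalone sketch; the class and method names are illustrative assumptions, not part of the library, and plain `double[] {x, y}` pairs stand in for `org.opencv.core.Point`:

```java
import java.util.Arrays;
import java.util.Comparator;

// Standalone sketch of two geometry helpers from ImageProcessor, using plain
// double[] {x, y} points instead of org.opencv.core.Point.
public class GeometrySketch {

    // Mirrors sortPoints: orders corners as {top-left, top-right, bottom-right,
    // bottom-left} using the minimal/maximal sum (x + y) and difference (y - x).
    public static double[][] sortCorners(double[][] src) {
        Comparator<double[]> sum = Comparator.comparingDouble(p -> p[0] + p[1]);
        Comparator<double[]> diff = Comparator.comparingDouble(p -> p[1] - p[0]);
        double[][] out = new double[4][];
        out[0] = Arrays.stream(src).min(sum).orElseThrow();  // top-left: minimal sum
        out[2] = Arrays.stream(src).max(sum).orElseThrow();  // bottom-right: maximal sum
        out[1] = Arrays.stream(src).min(diff).orElseThrow(); // top-right: minimal difference
        out[3] = Arrays.stream(src).max(diff).orElseThrow(); // bottom-left: maximal difference
        return out;
    }

    // Mirrors the sizing in fourPointTransform: the warped output's width and
    // height are the longer of the two opposing edge lengths.
    public static int[] warpedSize(double[][] pts) {
        double[] tl = pts[0], tr = pts[1], br = pts[2], bl = pts[3];
        double width = Math.max(Math.hypot(br[0] - bl[0], br[1] - bl[1]),  // bottom edge
                                Math.hypot(tr[0] - tl[0], tr[1] - tl[1])); // top edge
        double height = Math.max(Math.hypot(tr[0] - br[0], tr[1] - br[1]),  // right edge
                                 Math.hypot(tl[0] - bl[0], tl[1] - bl[1])); // left edge
        return new int[] { (int) width, (int) height };
    }

    public static void main(String[] args) {
        // Corners of a slightly skewed quadrilateral, deliberately out of order
        double[][] shuffled = { {400, 10}, {0, 0}, {0, 290}, {410, 300} };
        double[][] ordered = sortCorners(shuffled);
        int[] size = warpedSize(ordered);
        System.out.println(Arrays.deepToString(ordered) + " -> " + size[0] + "x" + size[1]);
    }
}
```

The sum/difference trick works because, for a roughly axis-aligned quadrilateral, the top-left corner minimizes x + y and the bottom-right maximizes it, while y - x separates the top-right (most negative) from the bottom-left (most positive).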
================================================
FILE: android/src/main/java/com/rectanglescanner/helpers/ImageProcessorMessage.java
================================================
package com.rectanglescanner.helpers;
/**
* Created by Jake on Jan 6, 2020.
*/
public class ImageProcessorMessage {
private String command;
private Object obj;
public ImageProcessorMessage(String command, Object obj) {
setObj(obj);
setCommand(command);
}
public String getCommand() {
return command;
}
public void setCommand(String command) {
this.command = command;
}
public Object getObj() {
return obj;
}
public void setObj(Object obj) {
this.obj = obj;
}
}
================================================
FILE: android/src/main/java/com/rectanglescanner/helpers/Quadrilateral.java
================================================
package com.rectanglescanner.helpers;
import org.opencv.core.MatOfPoint;
import org.opencv.core.Rect;
import org.opencv.core.Mat;
import org.opencv.core.Point;
import org.opencv.core.Size;
import android.os.Bundle;
/**
* Created by Jake on Jan 6, 2020.
* Represents the detected rectangle from an image
*/
public class Quadrilateral {
public MatOfPoint contour;
public Point[] points;
public Size sourceSize;
public Quadrilateral(MatOfPoint contour, Point[] points, Size sourceSize) {
this.contour = contour;
this.points = points;
this.sourceSize = sourceSize;
}
/**
Crops the edges of the image to the aspect ratio of the detected rectangle.
*/
public Mat cropImageToRectangleSize(Mat image) {
Size imageSize = image.size();
double rectangleRatio = this.sourceSize.height / this.sourceSize.width;
double imageRatio = imageSize.height / imageSize.width;
double cropHeight = imageSize.height;
double cropWidth = imageSize.width;
// Used to center the crop in the middle
int rectangleXCoord = 0;
int rectangleYCoord = 0;
if (imageRatio > rectangleRatio) {
// Height should be cropped
cropHeight = cropWidth * rectangleRatio;
rectangleYCoord = (int)((imageSize.height - cropHeight) / 2);
} else {
// Width should be cropped
cropWidth = cropHeight / rectangleRatio;
rectangleXCoord = (int)((imageSize.width - cropWidth) / 2);
}
Rect rectCrop = new Rect(rectangleXCoord, rectangleYCoord, (int)cropWidth, (int)cropHeight);
return new Mat(image, rectCrop);
}
/**
Returns the points of the rectangle scaled to the given size
*/
public Point[] getPointsForSize(Size outputSize) {
double scale = outputSize.height / this.sourceSize.height;
if (scale == 1) {
return this.points;
}
Point[] scaledPoints = new Point[4];
for (int i = 0; i < this.points.length; i++) {
scaledPoints[i] = this.points[i].clone();
scaledPoints[i].x *= scale;
scaledPoints[i].y *= scale;
}
return scaledPoints;
}
/**
Returns the rectangle as a bundle object.
Note: the exported corner labels differ from the sortPoints ordering
(e.g. points[0], the corner with minimal x + y, is exported as topRight).
*/
public Bundle toBundle() {
Bundle quadMap = new Bundle();
Bundle bottomLeft = new Bundle();
bottomLeft.putDouble("x", this.points[2].x);
bottomLeft.putDouble("y", this.points[2].y);
quadMap.putBundle("bottomLeft", bottomLeft);
Bundle bottomRight = new Bundle();
bottomRight.putDouble("x", this.points[1].x);
bottomRight.putDouble("y", this.points[1].y);
quadMap.putBundle("bottomRight", bottomRight);
Bundle topLeft = new Bundle();
topLeft.putDouble("x", this.points[3].x);
topLeft.putDouble("y", this.points[3].y);
quadMap.putBundle("topLeft", topLeft);
Bundle topRight = new Bundle();
topRight.putDouble("x", this.points[0].x);
topRight.putDouble("y", this.points[0].y);
quadMap.putBundle("topRight", topRight);
Bundle dimensions = new Bundle();
dimensions.putDouble("height", this.sourceSize.height);
dimensions.putDouble("width", this.sourceSize.width);
quadMap.putBundle("dimensions", dimensions);
return quadMap;
}
}
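The centered-crop arithmetic in `cropImageToRectangleSize` can be exercised without OpenCV. Below is a minimal sketch; the class and method names are illustrative assumptions, and the crop rect is returned as a plain `{x, y, width, height}` array instead of `org.opencv.core.Rect`:

```java
// Standalone sketch of the centered-crop arithmetic in cropImageToRectangleSize.
// Class and method names are illustrative assumptions, not part of the library.
public class CropMathSketch {

    // rectangleRatio is the detected rectangle's height / width.
    // Returns the crop rect as {x, y, width, height}.
    public static int[] centeredCrop(double imageWidth, double imageHeight, double rectangleRatio) {
        double imageRatio = imageHeight / imageWidth;
        double cropWidth = imageWidth;
        double cropHeight = imageHeight;
        int x = 0, y = 0;
        if (imageRatio > rectangleRatio) {
            // Image is taller than the rectangle: shrink the crop height and center it vertically
            cropHeight = cropWidth * rectangleRatio;
            y = (int) ((imageHeight - cropHeight) / 2);
        } else {
            // Image is wider than the rectangle: shrink the crop width and center it horizontally
            cropWidth = cropHeight / rectangleRatio;
            x = (int) ((imageWidth - cropWidth) / 2);
        }
        return new int[] { x, y, (int) cropWidth, (int) cropHeight };
    }

    public static void main(String[] args) {
        // A 1000x2000 capture cropped to a rectangle with height/width ratio 1.5
        int[] crop = centeredCrop(1000, 2000, 1.5);
        System.out.println("x=" + crop[0] + " y=" + crop[1] + " w=" + crop[2] + " h=" + crop[3]);
    }
}
```

Only one dimension is ever shrunk, and the offset splits the removed margin evenly, which is what keeps the crop centered.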
================================================
FILE: android/src/main/java/com/rectanglescanner/views/CameraDeviceController.java
================================================
package com.rectanglescanner.views;
import android.app.Activity;
import android.content.Context;
import android.content.pm.PackageManager;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.ImageFormat;
import android.hardware.Camera;
import android.hardware.Camera.PictureCallback;
import android.media.AudioManager;
import android.media.MediaActionSound;
import android.os.Build;
import android.util.AttributeSet;
import android.util.Log;
import android.view.Display;
import android.view.Surface;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.View;
import android.view.WindowManager;
import android.content.res.Configuration;
import android.widget.FrameLayout;
import com.rectanglescanner.R;
import com.facebook.react.bridge.WritableMap;
import com.facebook.react.bridge.WritableNativeMap;
import org.opencv.android.JavaCameraView;
import org.opencv.android.Utils;
import org.opencv.core.CvType;
import org.opencv.core.Mat;
import org.opencv.core.Size;
import org.opencv.imgproc.Imgproc;
import java.util.List;
/**
Created by Jake on Jan 6, 2020.
Handles generic camera device setup and capture
*/
public class CameraDeviceController extends JavaCameraView implements PictureCallback {
public static final String TAG = "CameraDeviceController";
protected Context mContext;
private SurfaceView mSurfaceView;
private SurfaceHolder mSurfaceHolder;
protected final boolean mBugRotate = false;
protected boolean safeToTakePicture;
protected Activity mActivity;
private PictureCallback pCallback;
protected Boolean enableTorch = false;
public int lastDetectedRotation = Surface.ROTATION_0;
protected View mView = null;
protected boolean cameraIsSetup = false;
protected boolean isStopped = true;
private WritableMap deviceConfiguration = new WritableNativeMap();
private int captureDevice = -1;
private boolean imageProcessorBusy = true;
private boolean cameraRequiresManualAutoFocus = false;
private static CameraDeviceController mThis;
public CameraDeviceController(Context context, AttributeSet attrs) {
super(context, attrs);
}
public CameraDeviceController(Context context, Integer numCam, Activity activity, FrameLayout frameLayout) {
super(context, numCam);
this.mContext = context;
this.mActivity = activity;
pCallback = this;
mView = frameLayout;
context.getSystemService(Context.LAYOUT_INFLATER_SERVICE);
}
//================================================================================
// Setters
//================================================================================
/**
Toggles the flash on the camera device
*/
public void setEnableTorch(boolean enableTorch) {
this.enableTorch = enableTorch;
if (mCamera != null) {
Camera.Parameters p = mCamera.getParameters();
p.setFlashMode(enableTorch ? Camera.Parameters.FLASH_MODE_TORCH : Camera.Parameters.FLASH_MODE_OFF);
mCamera.setParameters(p);
}
torchWasChanged(enableTorch);
}
protected void torchWasChanged(boolean torchEnabled) {}
/**
Cleans up the camera view
*/
public void cleanupCamera() {
if (mCamera != null) {
mCamera.stopPreview();
mCamera.setPreviewCallback(null);
mCamera.release();
mCamera = null;
this.cameraIsSetup = false;
}
}
/**
Stops and restarts the camera
*/
private void refreshCamera() {
stopCamera();
startCamera();
}
/**
Starts the capture session
*/
public void startCamera() {
Log.d(TAG, "Starting preview");
if (this.isStopped) {
try {
if (!this.cameraIsSetup) {
setupCameraView();
}
mCamera.setPreviewDisplay(mSurfaceHolder);
mCamera.startPreview();
mCamera.setPreviewCallback(this);
this.isStopped = false;
} catch (Exception e) {
Log.d(TAG, "Error starting preview: " + e);
}
}
}
/**
Stops the capture session
*/
public void stopCamera() {
Log.d(TAG, "Stopping preview");
if (!this.isStopped) {
try {
if (mCamera != null) {
mCamera.stopPreview();
}
this.isStopped = true;
}
catch (Exception e) {
Log.d(TAG, "Error stopping preview: " + e);
}
}
}
/**
Tells the camera to autofocus
*/
public void focusCamera() {
Log.d(TAG, "Autofocusing");
mCamera.autoFocus(null);
}
/**
Sets the device configuration flash setting
*/
public void setDeviceConfigurationFlashAvailable(boolean isAvailable) {
this.deviceConfiguration.putBoolean("flashIsAvailable", isAvailable);
}
/**
Sets the device configuration permission setting
*/
public void setDeviceConfigurationPermissionToUseCamera(boolean granted){
this.deviceConfiguration.putBoolean("permissionToUseCamera", granted);
}
/**
Sets the device configuration camera availability
*/
public void setDeviceConfigurationHasCamera(boolean isAvailable){
this.deviceConfiguration.putBoolean("hasCamera", isAvailable);
}
/**
Sets the percent size of the camera preview
*/
public void setDeviceConfigurationPreviewPercentSize(double heightPercent, double widthPercent) {
this.deviceConfiguration.putDouble("previewHeightPercent", heightPercent);
this.deviceConfiguration.putDouble("previewWidthPercent", widthPercent);
}
/**
Resets the device configuration to its initial state
*/
public void resetDeviceConfiguration()
{
this.deviceConfiguration = new WritableNativeMap();
setDeviceConfigurationFlashAvailable(false);
setDeviceConfigurationPermissionToUseCamera(false);
setDeviceConfigurationHasCamera(false);
setDeviceConfigurationPreviewPercentSize(1.0, 1.0);
}
/**
Called after the camera and session are set up. This lets you check if a
camera is found and permission is granted to use it.
*/
public void commitDeviceConfiguration() {
deviceWasSetup(this.deviceConfiguration);
}
protected void deviceWasSetup(WritableMap config) {}
//================================================================================
// Getters
//================================================================================
private int getCameraDevice() {
int cameraId = -1;
// Search for the back facing camera
// get the number of cameras
int numberOfCameras = Camera.getNumberOfCameras();
// for every camera check
for (int i = 0; i < numberOfCameras; i++) {
Camera.CameraInfo info = new Camera.CameraInfo();
Camera.getCameraInfo(i, info);
if (info.facing == Camera.CameraInfo.CAMERA_FACING_BACK) {
cameraId = i;
break;
}
cameraId = i;
}
return cameraId;
}
/**
Given a list of supported resolutions and a target aspect ratio, finds the highest
resolution that best fits the ratio, falling back to the overall highest resolution.
*/
private Camera.Size getOptimalResolution(float ratioToFitTo, List<Camera.Size> resolutionList) {
int maxPixels = 0;
int ratioMaxPixels = 0;
double bestRatioDifference = 5;
Camera.Size currentMaxRes = null;
Camera.Size ratioCurrentMaxRes = null;
for (Camera.Size r : resolutionList) {
float pictureRatio = (float) r.width / r.height;
Log.d(TAG, "supported resolution: " + r.width + "x" + r.height + " ratio: " + pictureRatio + " ratioToFitTo: " + ratioToFitTo);
int resolutionPixels = r.width * r.height;
double ratioDifference = Math.abs(ratioToFitTo - pictureRatio);
if (resolutionPixels > ratioMaxPixels && ratioDifference < bestRatioDifference) {
ratioMaxPixels = resolutionPixels;
ratioCurrentMaxRes = r;
bestRatioDifference = ratioDifference;
}
if (resolutionPixels > maxPixels) {
maxPixels = resolutionPixels;
currentMaxRes = r;
}
}
if (ratioCurrentMaxRes != null) {
Log.d(TAG, "Max supported resolution with aspect ratio: " + ratioCurrentMaxRes.width + "x"
+ ratioCurrentMaxRes.height);
return ratioCurrentMaxRes;
}
return currentMaxRes;
}
//================================================================================
// Setup
//================================================================================
/**
Creates a session for the camera device and outputs it to a preview view.
@note Called on view did load
*/
public void setupCameraView()
{
resetDeviceConfiguration();
if (mSurfaceView == null) {
mSurfaceView = mView.findViewById(R.id.surfaceView);
mSurfaceHolder = this.getHolder();
mSurfaceHolder.addCallback(this);
mSurfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
}
setupCamera();
commitDeviceConfiguration();
// [self listenForOrientationChanges];
this.cameraIsSetup = true;
}
/**
Sets up the hardware and capture session asking for permission to use the camera if needed.
*/
public void setupCamera() {
if (!setupCaptureDevice()) {
return;
}
Camera.Parameters param;
param = mCamera.getParameters();
PackageManager pm = mActivity.getPackageManager();
if (pm.hasSystemFeature(PackageManager.FEATURE_CAMERA_FLASH)) {
param.setFlashMode(Camera.Parameters.FLASH_MODE_OFF);
}
if (param.getSupportedFocusModes().contains(Camera.Parameters.FOCUS_MODE_CONTINUOUS_PICTURE)) {
param.setFocusMode(Camera.Parameters.FOCUS_MODE_CONTINUOUS_PICTURE);
} else if (param.getSupportedFocusModes().contains(Camera.Parameters.FOCUS_MODE_AUTO)) {
param.setFocusMode(Camera.Parameters.FOCUS_MODE_AUTO);
cameraRequiresManualAutoFocus = true;
}
param.setPictureFormat(ImageFormat.JPEG);
mCamera.setDisplayOrientation(getScreenRotationOnPhone());
Display display = mActivity.getWindowManager().getDefaultDisplay();
android.graphics.Point size = new android.graphics.Point();
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.JELLY_BEAN_MR1) {
display.getRealSize(size);
} else {
display.getSize(size);
}
int displayWidth = Math.min(size.y, size.x);
int displayHeight = Math.max(size.y, size.x);
float displayRatio = (float) displayHeight / displayWidth;
Camera.Size pSize = getOptimalResolution(displayRatio, getResolutionList());
param.setPreviewSize(pSize.width, pSize.height);
param.setWhiteBalance(Camera.Parameters.WHITE_BALANCE_AUTO);
float previewRatio = (float) pSize.width / pSize.height;
setDevicePreviewSize(previewRatio);
Camera.Size maxRes = getOptimalResolution(previewRatio, getPictureResolutionList());
if (maxRes != null) {
param.setPictureSize(maxRes.width, maxRes.height);
Log.d(TAG, "max supported picture resolution: " + maxRes.width + "x" + maxRes.height);
}
try {
mCamera.setParameters(param);
setDeviceConfigurationPermissionToUseCamera(true);
safeToTakePicture = true;
} catch (Exception e) {
Log.d(TAG, "failed to initialize the camera settings");
}
}
/**
Sets the surface preview ratio size. Some Android devices have a preview
aspect ratio that differs from their full screen size, so this adjusts the
preview size to keep its aspect ratio intact.
*/
public void setDevicePreviewSize(float previewRatio) {
Display display = mActivity.getWindowManager().getDefaultDisplay();
android.graphics.Point size = new android.graphics.Point();
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.JELLY_BEAN_MR1) {
display.getRealSize(size);
} else {
display.getSize(size);
}
int displayWidth = Math.min(size.y, size.x);
int displayHeight = Math.max(size.y, size.x);
float displayRatio = (float) displayHeight / displayWidth;
int previewHeight = displayHeight;
int previewWidth = displayWidth;
int sizeY = size.y;
int sizeX = size.x;
if (this.lastDetectedRotation == Surface.ROTATION_90 || this.lastDetectedRotation == Surface.ROTATION_270) {
sizeY = size.x;
sizeX = size.y;
}
if (displayRatio > previewRatio) {
// Adjust height
previewHeight = (int) ((float) sizeY / displayRatio * previewRatio);
} else if (displayRatio < previewRatio) {
// Adjust Width
previewWidth = (int) ((float) sizeX * displayRatio / previewRatio);
}
double percentOfScreenSizeHeight = (double) previewHeight / displayHeight;
double percentOfScreenSizeWidth = (double) previewWidth / displayWidth;
setDeviceConfigurationPreviewPercentSize(percentOfScreenSizeHeight, percentOfScreenSizeWidth);
}
/**
Finds a physical camera, configures it, and sets the captureDevice property to it
@return boolean if the camera was found and opened correctly
*/
public boolean setupCaptureDevice() {
this.captureDevice = getCameraDevice();
try {
mCamera = Camera.open(this.captureDevice);
} catch (RuntimeException e) {
System.err.println(e);
return false;
}
setDeviceConfigurationHasCamera(true);
PackageManager pm = mActivity.getPackageManager();
if (pm.hasSystemFeature(PackageManager.FEATURE_CAMERA_FLASH)) {
setDeviceConfigurationFlashAvailable(true);
}
return true;
}
//================================================================================
// Capture Image
//================================================================================
public void captureImageLater() {
if (this.safeToTakePicture) {
this.safeToTakePicture = false;
try {
if (cameraRequiresManualAutoFocus) {
mCamera.autoFocus(new Camera.AutoFocusCallback() {
@Override
public void onAutoFocus(boolean success, Camera camera) {
if (success) {
takePicture();
} else {
onPictureFailed();
}
}
});
} else {
takePicture();
}
} catch (Exception e) {
onPictureFailed();
}
}
}
private void takePicture() {
mCamera.takePicture(null, null, pCallback);
makeShutterSound();
}
private void onPictureFailed() {
Log.d(TAG, "failed to capture image");
mCamera.cancelAutoFocus();
this.safeToTakePicture = true;
}
/**
Responds to the captured image. Converts the JPEG data to an OpenCV Mat and calls handleCapturedImage, which can be overridden for more processing
*/
@Override
public void onPictureTaken(byte[] data, Camera camera) {
setEnableTorch(false);
this.safeToTakePicture = true;
Bitmap bitmap = BitmapFactory.decodeByteArray(data, 0, data.length);
Mat picture = new Mat();
Bitmap bmp32 = bitmap.copy(Bitmap.Config.ARGB_8888, true);
Utils.bitmapToMat(bmp32, picture);
Mat mat = new Mat();
Imgproc.cvtColor(picture, mat, Imgproc.COLOR_BGR2RGB, 4);
handleCapturedImage(mat);
}
public void handleCapturedImage(Mat capturedImage) {}
public int getScreenRotationOnPhone() {
final Display display = ((WindowManager) mContext
.getSystemService(Context.WINDOW_SERVICE)).getDefaultDisplay();
this.lastDetectedRotation = display.getRotation();
switch (this.lastDetectedRotation) {
case Surface.ROTATION_0:
return 90;
case Surface.ROTATION_90:
return 0;
case Surface.ROTATION_180:
return 270;
case Surface.ROTATION_270:
return 180;
}
return 90;
}
@Override
public void onConfigurationChanged(Configuration newConfig) {
super.onConfigurationChanged(newConfig);
mCamera.setDisplayOrientation(getScreenRotationOnPhone());
}
@Override
public void surfaceDestroyed(SurfaceHolder holder) {
cleanupCamera();
}
/**
Processes the image output from the capture session.
*/
@Override
public void onPreviewFrame(byte[] data, Camera camera) {
try {
mSurfaceView.setVisibility(SurfaceView.VISIBLE);
Camera.Size pictureSize = camera.getParameters().getPreviewSize();
Mat yuv = new Mat(new Size(pictureSize.width, pictureSize.height * 1.5), CvType.CV_8UC1);
yuv.put(0, 0, data);
Mat mat = new Mat(new Size(pictureSize.width, pictureSize.height), CvType.CV_8UC4);
Imgproc.cvtColor(yuv, mat, Imgproc.COLOR_YUV2RGBA_NV21, 4);
yuv.release();
processOutput(mat);
} catch(Exception e) {
Log.d(TAG, "Error processing preview frame: " + e);
}
}
public void processOutput(Mat image) {}
private void makeShutterSound() {
AudioManager audio = (AudioManager) mActivity.getSystemService(Context.AUDIO_SERVICE);
if (audio.getRingerMode() == AudioManager.RINGER_MODE_NORMAL) {
MediaActionSound sound = new MediaActionSound();
sound.play(MediaActionSound.SHUTTER_CLICK);
}
}
private List<Camera.Size> getResolutionList() {
return mCamera.getParameters().getSupportedPreviewSizes();
}
private List<Camera.Size> getPictureResolutionList() {
return mCamera.getParameters().getSupportedPictureSizes();
}
}
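Two sizing rules in this controller are easy to misread, so here is a minimal standalone sketch of both. Class and method names are illustrative assumptions, and plain `int[] {width, height}` pairs stand in for `android.hardware.Camera.Size`:

```java
import java.util.List;

// Standalone sketch of two sizing rules from CameraDeviceController, using
// int[] {width, height} pairs instead of android.hardware.Camera.Size.
public class CameraSizingSketch {

    // Mirrors getOptimalResolution: a size only becomes the best-fit candidate if
    // it has MORE pixels than the current candidate AND a ratio closer to the
    // target, so a large size with an acceptable ratio can beat a smaller size
    // with a perfect ratio. Falls back to the overall largest size.
    public static int[] pickResolution(float targetRatio, List<int[]> sizes) {
        int maxPixels = 0, ratioMaxPixels = 0;
        double bestRatioDifference = 5;
        int[] largest = null, bestFit = null;
        for (int[] r : sizes) {
            float ratio = (float) r[0] / r[1];
            int pixels = r[0] * r[1];
            double diff = Math.abs(targetRatio - ratio);
            if (pixels > ratioMaxPixels && diff < bestRatioDifference) {
                ratioMaxPixels = pixels;
                bestFit = r;
                bestRatioDifference = diff;
            }
            if (pixels > maxPixels) {
                maxPixels = pixels;
                largest = r;
            }
        }
        return bestFit != null ? bestFit : largest;
    }

    // Mirrors the portrait-orientation branch of setDevicePreviewSize: returns
    // {heightPercent, widthPercent} of the screen the preview should cover.
    public static double[] previewPercent(int displayWidth, int displayHeight, float previewRatio) {
        float displayRatio = (float) displayHeight / displayWidth;
        int previewHeight = displayHeight;
        int previewWidth = displayWidth;
        if (displayRatio > previewRatio) {
            // Screen is taller than the preview: letterbox vertically
            previewHeight = (int) ((float) displayHeight / displayRatio * previewRatio);
        } else if (displayRatio < previewRatio) {
            // Screen is wider than the preview: pillarbox horizontally
            previewWidth = (int) ((float) displayWidth * displayRatio / previewRatio);
        }
        return new double[] { (double) previewHeight / displayHeight,
                              (double) previewWidth / displayWidth };
    }

    public static void main(String[] args) {
        int[] best = pickResolution(16f / 9f,
                List.of(new int[]{4000, 3000}, new int[]{1920, 1080}, new int[]{3840, 2160}));
        double[] pct = previewPercent(1080, 2160, 4f / 3f);
        System.out.println(best[0] + "x" + best[1]
                + ", height%=" + pct[0] + ", width%=" + pct[1]);
    }
}
```

Note how in the example run the 12 MP 4:3 size wins over both 16:9 sizes even though the target ratio is 16:9, because the ratio-fit candidate is only replaced by sizes that also have more pixels; this mirrors the pixel-dominated behavior of the original heuristic.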
================================================
FILE: android/src/main/java/com/rectanglescanner/views/MainView.java
================================================
package com.rectanglescanner.views;
import android.app.Activity;
import android.content.Context;
import android.view.LayoutInflater;
import android.widget.FrameLayout;
import com.facebook.react.bridge.WritableMap;
import com.facebook.react.bridge.WritableNativeMap;
import com.facebook.react.bridge.ReactContext;
import com.facebook.react.uimanager.events.RCTEventEmitter;
import com.rectanglescanner.R;
public class MainView extends FrameLayout {
private RNRectangleScannerView view;
public static MainView instance = null;
public static MainView getInstance() {
return instance;
}
public static void createInstance(Context context, Activity activity) {
instance = new MainView(context, activity);
}
private MainView(Context context, Activity activity) {
super(context);
LayoutInflater lf = (LayoutInflater) context.getSystemService(Context.LAYOUT_INFLATER_SERVICE);
FrameLayout frameLayout = (FrameLayout) lf.inflate(R.layout.activity_rectangle_scanner, null);
view = new RNRectangleScannerView(context, -1, activity, frameLayout);
view.setParent(this);
addViewInLayout(view, 0, new FrameLayout.LayoutParams(LayoutParams.MATCH_PARENT, LayoutParams.MATCH_PARENT));
addViewInLayout(frameLayout, 1, view.getLayoutParams());
}
@Override
protected void onLayout(boolean changed, int l, int t, int r, int b) {
for (int i = 0; i < getChildCount(); i++) {
getChildAt(i).layout(l, t, r, b);
}
}
public void setEnableTorch(boolean enable) {
view.setEnableTorch(enable);
}
public void setCapturedQuality(double quality) {
view.setCapturedQuality(quality);
}
public void setFilterId(int filterId) {
view.setFilterId(filterId);
}
public void startCamera() {
view.startCamera();
}
public void stopCamera() {
view.stopCamera();
}
public void cleanupCamera() {
view.cleanupCamera();
}
public void capture() {
view.capture();
}
public void focusCamera() {
view.focusCamera();
}
public void deviceWasSetup(WritableMap config) {
final ReactContext context = (ReactContext) getContext();
context.getJSModule(RCTEventEmitter.class).receiveEvent(getId(), "onDeviceSetup", config);
}
public void torchWasChanged(boolean torchEnabled) {
WritableMap map = new WritableNativeMap();
map.putBoolean("enabled", torchEnabled);
final ReactContext context = (ReactContext) getContext();
context.getJSModule(RCTEventEmitter.class).receiveEvent(getId(), "onTorchChanged", map);
}
public void rectangleWasDetected(WritableMap detection) {
final ReactContext context = (ReactContext) getContext();
context.getJSModule(RCTEventEmitter.class).receiveEvent(getId(), "onRectangleDetected", detection);
}
public void pictureWasTaken(WritableMap pictureDetails) {
final ReactContext context = (ReactContext) getContext();
context.getJSModule(RCTEventEmitter.class).receiveEvent(getId(), "onPictureTaken", pictureDetails);
}
public void pictureWasProcessed(WritableMap pictureDetails) {
final ReactContext context = (ReactContext) getContext();
context.getJSModule(RCTEventEmitter.class).receiveEvent(getId(), "onPictureProcessed", pictureDetails);
}
public void pictureDidFailToProcess(WritableMap errorDetails) {
final ReactContext context = (ReactContext) getContext();
context.getJSModule(RCTEventEmitter.class).receiveEvent(getId(), "onErrorProcessingImage", errorDetails);
}
}
================================================
FILE: android/src/main/java/com/rectanglescanner/views/RNRectangleScannerView.java
================================================
package com.rectanglescanner.views;
import android.app.Activity;
import android.content.Context;
import android.util.Log;
import android.widget.FrameLayout;
import com.rectanglescanner.R;
import com.rectanglescanner.helpers.CapturedImage;
import com.facebook.react.bridge.WritableMap;
import com.facebook.react.bridge.WritableNativeMap;
import org.opencv.core.Core;
import org.opencv.core.CvType;
import org.opencv.core.Mat;
import org.opencv.core.MatOfInt;
import org.opencv.imgcodecs.Imgcodecs;
import java.util.UUID;
import java.io.File;
import java.util.ArrayList;
/**
Created by Jake on Jan 6, 2020.
Wraps up the camera and rectangle detection code into a simple interface.
Allows you to call start, stop, cleanup, and capture. It is also responsible
for determining how to cache the output images.
*/
public class RNRectangleScannerView extends RectangleDetectionController {
private String cacheFolderName = "RNRectangleScanner";
private double capturedQuality = 0.5;
//================================================================================
// Setup
//================================================================================
public RNRectangleScannerView(Context context, Integer numCam, Activity activity, FrameLayout frameLayout) {
super(context, numCam, activity, frameLayout);
}
private MainView parentView = null;
public void setParent(MainView view) {
this.parentView = view;
}
/**
Sets the jpeg quality of the output image
*/
public void setCapturedQuality(double quality) {
this.capturedQuality = quality;
}
/**
Call to capture an image
*/
public void capture() {
captureImageLater();
}
/**
Called after a picture was captured
*/
private void pictureWasTaken(WritableMap pictureDetails) {
Log.d(TAG, "picture taken");
this.parentView.pictureWasTaken(pictureDetails);
}
/**
Called after a picture was captured and finished processing
*/
private void pictureWasProcessed(WritableMap pictureDetails) {
Log.d(TAG, "picture processed");
this.parentView.pictureWasProcessed(pictureDetails);
}
/**
Called if the picture failed to be captured
*/
private void pictureDidFailToProcess(WritableMap errorDetails) {
Log.d(TAG, "picture failed to process");
this.parentView.pictureDidFailToProcess(errorDetails);
}
/**
Called after the torch/flash state was changed
*/
@Override
protected void torchWasChanged(boolean torchEnabled) {
Log.d(TAG, "torch changed");
this.parentView.torchWasChanged(torchEnabled);
}
/**
Called after the camera and session are set up. This lets you check if a
camera is found and permission is granted to use it.
*/
@Override
protected void deviceWasSetup(WritableMap config) {
Log.d(TAG, "device setup");
this.parentView.deviceWasSetup(config);
}
/**
Called after a frame is processed and a rectangle was found
*/
@Override
public void rectangleWasDetected(WritableMap detection) {
this.parentView.rectangleWasDetected(detection);
}
/**
After an image is captured and cropped, this method is called
*/
@Override
public void onProcessedCapturedImage(CapturedImage capturedImage) {
WritableMap pictureWasTakenConfig = new WritableNativeMap();
WritableMap pictureWasProcessedConfig = new WritableNativeMap();
String croppedImageFileName = null;
String originalImageFileName = null;
boolean hasCroppedImage = (capturedImage.processed != null);
try {
originalImageFileName = generateStoredFileName("O");
if (hasCroppedImage) {
croppedImageFileName = generateStoredFileName("C");
} else {
croppedImageFileName = originalImageFileName;
}
} catch(Exception e) {
WritableMap folderError = new WritableNativeMap();
folderError.putString("message", "Failed to create the cache directory");
pictureDidFailToProcess(folderError);
return;
}
pictureWasTakenConfig.putString("croppedImage", "file://" + croppedImageFileName);
pictureWasTakenConfig.putString("initialImage", "file://" + originalImageFileName);
pictureWasProcessedConfig.putString("croppedImage", "file://" + croppedImageFileName);
pictureWasProcessedConfig.putString("initialImage", "file://" + originalImageFileName);
pictureWasTaken(pictureWasTakenConfig);
if (hasCroppedImage && !this.saveToDirectory(capturedImage.processed, croppedImageFileName)) {
WritableMap fileError = new WritableNativeMap();
fileError.putString("message", "Failed to write cropped image to cache");
fileError.putString("filePath", croppedImageFileName);
pictureDidFailToProcess(fileError);
return;
}
if (!this.saveToDirectory(capturedImage.original, originalImageFileName)) {
WritableMap fileError = new WritableNativeMap();
fileError.putString("message", "Failed to write original image to cache");
fileError.putString("filePath", originalImageFileName);
pictureDidFailToProcess(fileError);
return;
}
pictureWasProcessed(pictureWasProcessedConfig);
capturedImage.release();
Log.d(TAG, "Captured Images");
}
private String generateStoredFileName(String name) throws Exception {
String folderDir = this.mContext.getCacheDir().toString();
File folder = new File( folderDir + "/" + this.cacheFolderName);
if (!folder.exists()) {
boolean result = folder.mkdirs();
if (result) {
Log.d(TAG, "wrote: created folder " + folder.getPath());
} else {
Log.d(TAG, "Not possible to create folder");
throw new Exception("Failed to create the cache directory");
}
}
return folderDir + "/" + this.cacheFolderName + "/" + name + UUID.randomUUID() + ".jpg";
}
/**
Saves a file to a folder
*/
private boolean saveToDirectory(Mat doc, String fileName) {
// Transpose then mirror to rotate the image upright before encoding
Mat transposed = doc.t();
Mat endDoc = new Mat(doc.size(), CvType.CV_8UC4);
Core.flip(transposed, endDoc, 1);
transposed.release();
ArrayList<Integer> parameters = new ArrayList<>();
parameters.add(Imgcodecs.IMWRITE_JPEG_QUALITY);
parameters.add((int) (this.capturedQuality * 100));
MatOfInt par = new MatOfInt();
par.fromList(parameters);
boolean success = Imgcodecs.imwrite(fileName, endDoc, par);
endDoc.release();
return success;
}
}
================================================
FILE: android/src/main/java/com/rectanglescanner/views/RectangleDetectionController.java
================================================
package com.rectanglescanner.views;
import android.app.Activity;
import android.content.Context;
import android.os.Build;
import android.os.HandlerThread;
import android.os.Message;
import android.util.Log;
import android.view.Display;
import android.view.WindowManager;
import android.widget.FrameLayout;
import com.rectanglescanner.R;
import com.rectanglescanner.helpers.ImageProcessor;
import com.rectanglescanner.helpers.ImageProcessorMessage;
import com.rectanglescanner.helpers.CapturedImage;
import com.facebook.react.bridge.WritableMap;
import org.opencv.android.OpenCVLoader;
import org.opencv.core.Mat;
/**
Created by Jake on Jan 6, 2020.
Takes the output from the camera device controller and attempts to detect
rectangles from the output. On capture, it will also crop the image.
*/
public class RectangleDetectionController extends CameraDeviceController {
private HandlerThread mImageThread;
private ImageProcessor mImageProcessor;
private int numberOfRectangles = 15;
private boolean imageProcessorBusy = true;
private int filterId = 1;
public void setImageProcessorBusy(boolean isBusy) {
this.imageProcessorBusy = isBusy;
}
public int getFilterId() {
return this.filterId;
}
/**
Sets the currently active filter
*/
public void setFilterId(int filterId) {
this.filterId = filterId;
}
//================================================================================
// Setup
//================================================================================
public RectangleDetectionController(Context context, Integer numCam, Activity activity, FrameLayout frameLayout) {
super(context, numCam, activity, frameLayout);
initializeImageProcessor(context);
}
/**
Sets up the image processor. It uses OpenCV so it needs to load that first
*/
private void initializeImageProcessor(Context context) {
mActivity.getWindow().addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);
if (OpenCVLoader.initLocal()) {
Log.i(TAG, "OpenCV loaded successfully");
} else {
Log.e(TAG, "OpenCV initialization failed!");
return;
}
if (mImageThread == null) {
mImageThread = new HandlerThread("Worker Thread");
mImageThread.start();
}
if (mImageProcessor == null) {
mImageProcessor = new ImageProcessor(mImageThread.getLooper(), this, mContext);
}
this.setImageProcessorBusy(false);
}
//================================================================================
// Image Detection
//================================================================================
/**
Runs on each frame as the image is pushed to the preview layer
*/
@Override
public void processOutput(Mat image) {
detectRectangleFromImageLater(image);
}
/**
Asynchronously looks for a rectangle in the given image
*/
private void detectRectangleFromImageLater(Mat image) {
if (!imageProcessorBusy) {
setImageProcessorBusy(true);
Message msg = mImageProcessor.obtainMessage();
msg.obj = new ImageProcessorMessage("previewFrame", image);
mImageProcessor.sendMessageDelayed(msg, 100);
}
}
/**
Called after a frame is processed and a rectangle was found
*/
public void rectangleWasDetected(WritableMap detection) {}
//================================================================================
// Capture Image
//================================================================================
/**
After an image is captured, this function is called and handles cropping the image
*/
@Override
public void handleCapturedImage(Mat capturedImage) {
setImageProcessorBusy(true);
Message msg = mImageProcessor.obtainMessage();
msg.obj = new ImageProcessorMessage("pictureTaken", capturedImage);
mImageProcessor.sendMessageAtFrontOfQueue(msg);
}
/**
After an image is captured and cropped, this method is called
*/
public void onProcessedCapturedImage(CapturedImage scannedDocument) {
}
}
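The busy-flag throttling above (drop preview frames while the image processor is working, clear the flag when it finishes) can be sketched in plain JavaScript. This is only an illustration; `createFrameThrottler` is an invented name, not part of this library:

```javascript
// Sketch of the busy-flag throttle used by RectangleDetectionController:
// preview frames arrive faster than rectangle detection runs, so frames
// are simply dropped while the processor is busy instead of queueing up.
function createFrameThrottler(processFrame) {
  let busy = false; // mirrors imageProcessorBusy
  return {
    // Called for every preview frame; returns true if the frame was accepted.
    onPreviewFrame(frame) {
      if (busy) return false; // drop the frame
      busy = true;
      processFrame(frame);
      return true;
    },
    // The processor calls this when it finishes, like setImageProcessorBusy(false).
    done() {
      busy = false;
    },
  };
}
```

A real capture bypasses this dropping entirely: `handleCapturedImage` posts its message at the front of the processor queue instead.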
================================================
FILE: android/src/main/res/layout/activity_rectangle_scanner.xml
================================================
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:tools="http://schemas.android.com/tools"
xmlns:opencv="http://schemas.android.com/apk/res-auto"
xmlns:app="http://schemas.android.com/apk/res-auto"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:configChanges="orientation|keyboardHidden"
android:background="#000000"
tools:context=".CameraDeviceControllerActivity">
<SurfaceView
android:id="@+id/surfaceView"
android:layout_width="fill_parent"
android:layout_height="wrap_content"
android:layout_alignParentStart="false"
android:layout_alignParentTop="true"
android:layout_alignParentLeft="true"
android:layout_alignParentBottom="true"
android:layout_alignParentRight="true"
android:background="#000000"
android:layout_alignParentEnd="true" />
</FrameLayout>
================================================
FILE: example/.gitignore
================================================
# Learn more https://docs.github.com/en/get-started/getting-started-with-git/ignoring-files
# dependencies
node_modules/
# Expo
.expo/
dist/
web-build/
# Native
*.orig.*
*.jks
*.p8
*.p12
*.key
*.mobileprovision
# Metro
.metro-health-check*
# debug
npm-debug.*
yarn-debug.*
yarn-error.*
# macOS
.DS_Store
*.pem
# local env files
.env*.local
# typescript
*.tsbuildinfo
================================================
FILE: example/App.js
================================================
import { StatusBar } from 'expo-status-bar';
import { StyleSheet, Text, View } from 'react-native';
import ScanDocument from './src/ScanDocument';
export default function App() {
return (
<View style={styles.container}>
<Text>Open up App.js to start working on your app!</Text>
<StatusBar style="auto" />
<ScanDocument />
</View>
);
}
const styles = StyleSheet.create({
container: {
flex: 1,
backgroundColor: '#fff',
alignItems: 'center',
justifyContent: 'center',
},
});
================================================
FILE: example/app.json
================================================
{
"expo": {
"name": "example",
"slug": "example",
"version": "1.0.0",
"orientation": "portrait",
"icon": "./assets/icon.png",
"userInterfaceStyle": "light",
"plugins": [
[
"expo-dev-launcher",
{
"launchMode": "most-recent"
}
]
],
"splash": {
"image": "./assets/splash.png",
"resizeMode": "contain",
"backgroundColor": "#ffffff"
},
"ios": {
"supportsTablet": true,
"bundleIdentifier": "com.example.example",
"infoPlist": {
"NSCameraUsageDescription": "Example App requires access to the camera to take pictures of documents."
}
},
"android": {
"package": "com.example.example",
"adaptiveIcon": {
"foregroundImage": "./assets/adaptive-icon.png",
"backgroundColor": "#ffffff"
}
},
"web": {
"favicon": "./assets/favicon.png"
}
}
}
================================================
FILE: example/babel.config.js
================================================
module.exports = function(api) {
api.cache(true);
return {
presets: ['babel-preset-expo'],
};
};
================================================
FILE: example/package.json
================================================
{
"name": "example",
"version": "1.0.0",
"main": "expo/AppEntry.js",
"scripts": {
"start": "expo start --dev-client --clear",
"android": "expo run:android",
"ios": "rm -f ios/.xcode.env.local && expo run:ios"
},
"dependencies": {
"expo": "^50.0.17",
"expo-dev-client": "~3.3.12",
"expo-status-bar": "~1.12.1",
"react": "18.2.0",
"react-native": "0.73.6",
"react-native-rectangle-scanner": "file:../"
},
"devDependencies": {
"@babel/core": "^7.20.0"
},
"private": true
}
================================================
FILE: example/src/ScanDocument/CameraControls.js
================================================
import React from 'react';
import { SafeAreaView, Text, TouchableOpacity, View } from 'react-native';
import { Filters } from 'react-native-rectangle-scanner';
import { styles } from './styles';
const CameraControls = ({ closeScanner, capture, isCapturing, flashIsAvailable, flashOn, setFlashOn, filterId, setFilterId }) => (
<SafeAreaView style={[styles.overlay]}>
<View style={{ flexDirection: 'row', justifyContent: 'space-around' }}>
{Filters.RECOMMENDED_PLATFORM_FILTERS.map((f) => (
<TouchableOpacity key={f.id} onPress={() => setFilterId(f.id)}>
<Text style={{ color: 'white', fontSize: 13, fontWeight: filterId === f.id ? 'bold' : 'normal' }}>{f.name}</Text>
</TouchableOpacity>
))}
</View>
<View style={styles.buttonBottomContainer}>
<View style={styles.buttonGroup}>
<TouchableOpacity
style={styles.button}
onPress={closeScanner}
activeOpacity={0.8}
>
<Text style={styles.buttonText}>Cancel</Text>
</TouchableOpacity>
</View>
<View style={[styles.cameraOutline, { opacity: isCapturing ? 0.8 : 1 }]}>
<TouchableOpacity
activeOpacity={0.8}
style={styles.cameraButton}
onPress={isCapturing ? () => null : capture}
/>
</View>
<View>
<View style={[styles.buttonActionGroup, { justifyContent: 'flex-end', marginBottom: 16 }]}>
{flashIsAvailable && (
<TouchableOpacity
style={{
borderRadius: 30,
margin: 8,
backgroundColor: flashOn ? '#FFFFFF80' : '#00000080',
alignItems: 'center',
justifyContent: 'center',
paddingTop: 7,
height: 50,
width: 50
}}
activeOpacity={0.8}
onPress={() => setFlashOn(!flashOn)}
>
<Text style={{ color: flashOn ? '#333' : '#FFF' }}>Flash: {flashOn ? 'ON' : 'OFF'}</Text>
</TouchableOpacity>
)}
</View>
</View>
</View>
</SafeAreaView>
);
export default CameraControls;
================================================
FILE: example/src/ScanDocument/DocumentScanner.js
================================================
import React, { useRef, useState } from 'react';
import { Animated, ActivityIndicator, Dimensions, Text, TouchableOpacity, View } from 'react-native';
import Scanner, { Filters, FlashAnimation, RectangleOverlay } from 'react-native-rectangle-scanner';
import { styles } from './styles';
import CameraControls from './CameraControls';
const JPEGQuality = 0.7;
const DocumentScanner = ({ closeScanner, onScannedImage }) => {
const [loadingCamera, setLoadingCamera] = useState(true);
const [cameraError, setCameraError] = useState();
const [cameraOn, setCameraOn] = useState(true);
const [flashOn, setFlashOn] = useState(false);
const [filterId, setFilterId] = useState(Filters.PLATFORM_DEFAULT_FILTER_ID);
const [flashIsAvailable, setFlashIsAvailable] = useState(false);
const [processingImage, setProcessingImage] = useState(false);
const [previewSize, setPreviewSize] = useState({});
const [detectedRectangle, setDetectedRectangle] = useState();
// const flashScreenOnCaptureAnimation = useRef(new Animated.Value(0)).current;
const cameraRef = useRef();
const capture = () => {
if (processingImage) return;
setProcessingImage(true);
cameraRef.current.capture();
// FlashAnimation.triggerSnapAnimation(flashScreenOnCaptureAnimation);
}
const onPictureProcessed = (event) => {
console.log('cropped, transformed, and added filters to captured image');
onScannedImage(event);
setProcessingImage(false);
}
const onDeviceSetup = (device) => {
setLoadingCamera(false);
setFlashIsAvailable(device.flashIsAvailable);
if (!device.hasCamera) {
setCameraError('Device does not have a camera');
setCameraOn(false);
} else if (!device.permissionToUseCamera) {
setCameraError('App does not have permission to use the camera');
setCameraOn(false);
}
const dimensions = Dimensions.get('window');
setPreviewSize({
height: `${device.previewHeightPercent * 100}%`,
width: `${device.previewWidthPercent * 100}%`,
marginTop: (1 - device.previewHeightPercent) * dimensions.height / 2,
marginLeft: (1 - device.previewWidthPercent) * dimensions.width / 2,
});
}
if (cameraOn) {
return (
<View style={{ position: 'relative', marginTop: previewSize.marginTop, marginLeft: previewSize.marginLeft, height: previewSize.height, width: previewSize.width }}>
<Scanner
onPictureTaken={() => console.log('picture captured...')}
onPictureProcessed={onPictureProcessed}
onErrorProcessingImage={(err) => console.error('Failed to capture scan', err?.message)}
enableTorch={flashOn}
filterId={filterId}
ref={cameraRef}
capturedQuality={JPEGQuality}
onRectangleDetected={(value) => setDetectedRectangle(value.detectedRectangle)}
onDeviceSetup={onDeviceSetup}
onTorchChanged={({ enabled }) => setFlashOn(enabled)}
style={styles.scanner}
/>
{!processingImage && (
<RectangleOverlay
detectedRectangle={detectedRectangle}
previewRatio={previewSize}
backgroundColor="rgba(255,181,6, 0.2)"
borderColor="rgb(255,181,6)"
borderWidth={4}
detectedBackgroundColor="rgba(255,181,6, 0.3)"
detectedBorderWidth={6}
detectedBorderColor="rgb(255,218,124)"
onDetectedCapture={capture}
allowDetection
/>
)}
{/* <FlashAnimation overlayFlashOpacity={flashScreenOnCaptureAnimation} /> */}
{loadingCamera && (
<View style={styles.overlay}>
<View style={{ flex: 1, alignItems: 'center', justifyContent: 'center' }}>
<ActivityIndicator color="white" />
<Text style={styles.loadingCameraMessage}>Loading Camera</Text>
</View>
</View>
)}
{processingImage && (
<View style={styles.overlay}>
<View style={{ flex: 1, alignItems: 'center', justifyContent: 'center' }}>
<View style={{ alignItems: 'center', justifyContent: 'center', height: 140, width: 200, borderRadius: 16, backgroundColor: 'rgba(220, 220, 220, 0.7)' }}>
<ActivityIndicator color="#333333" size="large" />
<Text style={{ color: '#333333', fontSize: 30, marginTop: 10 }}>Processing</Text>
</View>
</View>
</View>
)}
<CameraControls
closeScanner={closeScanner}
capture={capture}
isCapturing={processingImage}
flashIsAvailable={flashIsAvailable}
flashOn={flashOn}
setFlashOn={setFlashOn}
filterId={filterId}
setFilterId={setFilterId}
/>
</View>
);
}
return (
<View style={styles.cameraNotAvailableContainer}>
<View style={styles.buttonBottomContainer}>
<View style={styles.buttonGroup}>
<TouchableOpacity style={styles.button} onPress={closeScanner}>
<Text style={styles.buttonText}>Cancel</Text>
</TouchableOpacity>
</View>
</View>
<View style={styles.overlay}>
<View style={{ flex: 1, alignItems: 'center', justifyContent: 'center' }}>
<ActivityIndicator color="white" />
<Text style={styles.loadingCameraMessage}>{cameraError ? cameraError : 'Loading Camera'}</Text>
</View>
</View>
</View>
);
}
export default DocumentScanner;
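The margin math in `onDeviceSetup` above centers the native preview by splitting the unused window space evenly on both sides. Extracted as a pure function, it looks like the sketch below (`computePreviewSize` is an invented helper name, not part of the library API):

```javascript
// Computes centered preview margins from the device config reported by
// onDeviceSetup. previewHeightPercent / previewWidthPercent are the
// fractions of the window the native camera preview will occupy.
function computePreviewSize(device, windowDimensions) {
  return {
    height: `${device.previewHeightPercent * 100}%`,
    width: `${device.previewWidthPercent * 100}%`,
    // Split the leftover space in half so the preview sits in the middle.
    marginTop: ((1 - device.previewHeightPercent) * windowDimensions.height) / 2,
    marginLeft: ((1 - device.previewWidthPercent) * windowDimensions.width) / 2,
  };
}
```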
================================================
FILE: example/src/ScanDocument/index.js
================================================
import React, { useState } from 'react';
import { Button, Text } from 'react-native';
import DocumentScanner from './DocumentScanner';
import useIsMultiTasking from '../useIsMultiTasking';
import { StatusBar } from 'expo-status-bar';
const ScanDocument = () => {
const [scannerIsOn, setScannerIsOn] = useState(false);
const [scannedImage, setScannedImage] = useState();
const onScannedImage = ({ croppedImage }) => {
console.log('scanned an image!');
setScannedImage(croppedImage);
}
const isMultiTasking = useIsMultiTasking();
if (isMultiTasking) return <Text>Not allowed while multi tasking</Text>;
if (!scannerIsOn) {
if (!scannedImage) {
return <Button title="Tap to scan" onPress={() => setScannerIsOn(true)} />;
} else {
return <Text>Captured an image!</Text>
}
}
return (
<>
<StatusBar animated={true} backgroundColor="black" style="light" />
<DocumentScanner
closeScanner={() => setScannerIsOn(false)}
onScannedImage={onScannedImage}
/>
</>
)
}
export default ScanDocument;
================================================
FILE: example/src/ScanDocument/styles.js
================================================
import { StyleSheet } from "react-native";
export const styles = StyleSheet.create({
container: {
flex: 1,
backgroundColor: 'black',
},
overlay: {
flex: 1,
position: 'absolute',
top: 0,
bottom: 0,
right: 0,
left: 0,
},
buttonBottomContainer: {
position: 'absolute',
bottom: 40,
left: 25,
right: 25,
justifyContent: 'space-between',
alignItems: 'flex-end',
flexDirection: 'row',
},
buttonTopContainer: {
position: 'absolute',
top: 40,
left: 25,
right: 25,
justifyContent: 'space-between',
alignItems: 'flex-start',
flexDirection: 'row',
},
buttonContainer: {
position: 'absolute',
right: 25,
top: 25,
bottom: 25,
justifyContent: 'space-between',
alignItems: 'flex-end',
flexDirection: 'column',
},
buttonActionGroup: {
flex: 1,
justifyContent: 'space-between',
flexDirection: 'column',
},
cameraOutline: {
borderWidth: 3,
borderColor: 'white',
borderRadius: 50,
height: 70,
width: 70,
},
cameraButton: {
backgroundColor: 'white',
borderRadius: 50,
margin: 3,
flex: 1,
},
buttonGroup: {
backgroundColor: '#00000080',
borderRadius: 17,
},
button: {
alignItems: 'center',
justifyContent: 'center',
height: 70,
width: 65,
},
buttonText: {
color: 'white',
fontSize: 13,
},
buttonIcon: {
color: 'white',
fontSize: 22,
marginBottom: 3,
textAlign: 'center',
},
scanner: {
flex: 1,
},
cameraNotAvailableContainer: {
flex: 1,
justifyContent: 'center',
alignItems: 'center',
marginHorizontal: 2,
},
cameraNotAvailableText: {
color: 'white',
fontSize: 25,
textAlign: 'center',
},
loadingCameraMessage: {
marginTop: 10,
color: 'white',
fontSize: 18,
textAlign: 'center',
},
});
================================================
FILE: example/src/useIsMultiTasking.js
================================================
import { Dimensions, useWindowDimensions } from "react-native";
// Returns true when the device is in a multi-tasking view (the window width will be less than the screen width)
export default function useIsMultiTasking() {
const { width } = useWindowDimensions();
const screenWidth = Math.round(Dimensions.get('screen').width);
return width < screenWidth;
}
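The width comparison in this hook can be isolated into a plain predicate for testing. This is a sketch; `isMultiTasking` below is an invented standalone function, not the hook itself:

```javascript
// True when the app window is narrower than the physical screen, which on
// iPad indicates Split View / Slide Over multitasking. Rounding guards
// against sub-pixel dimension values.
function isMultiTasking(windowWidth, screenWidth) {
  return Math.round(windowWidth) < Math.round(screenWidth);
}
```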
================================================
FILE: index.js
================================================
import Scanner from './src/Scanner';
import RectangleOverlay from './src/RectangleOverlay';
import Filters from './src/Filters';
import FlashAnimation from './src/FlashAnimation';
export default Scanner;
export {
RectangleOverlay,
Filters,
FlashAnimation,
};
export const CACHE_FOLDER_NAME = 'RNRectangleScanner';
================================================
FILE: ios/CameraDeviceController.h
================================================
//
// CameraDeviceController.h
//
// Created by Jake Humphrey on Jan 6, 2020.
// Copyright (c) 2020 Jake Humphrey. All rights reserved.
//
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
@interface CameraDeviceController : UIView
- (void)setupCameraView;
- (void)start;
- (void)stop;
- (void)focusCamera;
@property (nonatomic,assign,getter=isTorchEnabled) BOOL enableTorch;
- (void)focusAtPoint:(CGPoint)point completionHandler:(void(^)(void))completionHandler;
- (void)captureImageLater;
- (CIImage *)processOutput:(CIImage *)image;
- (UIView *)getPreviewLayerView;
- (CGRect)getBounds;
- (void)deviceWasSetup:(NSDictionary *)config;
- (void)torchWasChanged:(BOOL)enableTorch;
- (void)handleCapturedImage:(CIImage *)capturedImage orientation: (UIImageOrientation) orientation;
- (UIImageOrientation)getOrientationForImage;
@property (nonatomic, assign) BOOL hasTakenPhoto;
@property (nonatomic, assign) BOOL forceStop;
@property (nonatomic, assign) BOOL _isStopped;
@property (nonatomic, assign) BOOL _cameraIsSetup;
@property (nonatomic, assign) BOOL _isCapturing;
@property (nonatomic, assign) UIDeviceOrientation lastDeviceOrientation;
@property (nonatomic, assign) UIInterfaceOrientation lastInterfaceOrientation;
@property (nonatomic, assign) int filterId;
@property (nonatomic,strong) EAGLContext *context;
@property (nonatomic, strong) CIContext *_coreImageContext;
@end
================================================
FILE: ios/CameraDeviceController.m
================================================
//
// CameraDeviceController.m
//
// Created by Jake Humphrey on Jan 6, 2020.
// Copyright (c) 2020 Jake Humphrey. All rights reserved.
//
#import "CameraDeviceController.h"
#import <AVFoundation/AVFoundation.h>
#import <CoreMedia/CoreMedia.h>
#import <CoreVideo/CoreVideo.h>
#import <CoreImage/CoreImage.h>
#import <ImageIO/ImageIO.h>
#import <GLKit/GLKit.h>
@interface CameraDeviceController () <AVCaptureVideoDataOutputSampleBufferDelegate, AVCapturePhotoCaptureDelegate>
/*!
@property session
@abstract
The capture session used for scanning documents.
*/
@property (nonatomic,strong) AVCaptureSession *captureSession;
/*!
@property captureDevice
@abstract
Represents the physical device that is used (back camera for example).
*/
@property (nonatomic,strong) AVCaptureDevice *captureDevice;
/*!
@property deviceInput
@abstract
Represents the input from the camera device
*/
@property (nonatomic, strong) AVCaptureDeviceInput* deviceInput;
/*!
@property cameraOutput
@abstract
Used for image capture output
*/
@property (nonatomic, strong) AVCapturePhotoOutput *cameraOutput;
@end
/*!
Handles Generic camera device setup and capture
*/
@implementation CameraDeviceController
{
GLuint _renderBuffer;
GLKView *_glkView;
NSMutableDictionary *_deviceConfiguration;
dispatch_queue_t _captureImageQueue;
}
- (void)awakeFromNib
{
[super awakeFromNib];
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(_backgroundMode) name:UIApplicationWillResignActiveNotification object:nil];
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(_foregroundMode) name:UIApplicationDidBecomeActiveNotification object:nil];
}
- (instancetype)init {
self = [super init];
_captureImageQueue = dispatch_queue_create("CaptureImageQueue",NULL);
// Keep track of the last device orientation for image orientation correction
[self deviceOrientationDidChanged];
[[UIDevice currentDevice] beginGeneratingDeviceOrientationNotifications];
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(deviceOrientationDidChanged) name:UIDeviceOrientationDidChangeNotification object:[UIDevice currentDevice]];
return self;
}
/*!
Called when the app enters the background
*/
- (void)_backgroundMode
{
self.forceStop = YES;
[self setEnableTorch: NO];
}
/*!
Called when the app enters the foreground
*/
- (void)_foregroundMode
{
self.forceStop = NO;
}
- (void)dealloc
{
[[NSNotificationCenter defaultCenter] removeObserver:self];
}
// MARK: Setters
/*!
Set device orientation. It ignores orientations like "faceDown" so that the last real orientation is used.
*/
- (void)deviceOrientationDidChanged{
_lastInterfaceOrientation = [UIApplication sharedApplication].statusBarOrientation;
UIDeviceOrientation deviceOrientation = [[UIDevice currentDevice] orientation];
// Ignore odd orientations, we only care about last real orientation
if (deviceOrientation == UIDeviceOrientationFaceUp) return;
if (deviceOrientation == UIDeviceOrientationFaceDown) return;
if (deviceOrientation == UIDeviceOrientationUnknown) return;
_lastDeviceOrientation = deviceOrientation;
}
/*!
Toggles the flash on the camera device
*/
- (void)setEnableTorch:(BOOL)enableTorch
{
_enableTorch = enableTorch;
AVCaptureDevice *device = self.captureDevice;
if ([device hasTorch] && [device hasFlash]) {
[device lockForConfiguration:nil];
if (enableTorch) {
[device setTorchMode:AVCaptureTorchModeOn];
} else {
[device setTorchMode:AVCaptureTorchModeOff];
}
[device unlockForConfiguration];
}
[self torchWasChanged:enableTorch];
}
- (void)torchWasChanged:(BOOL)enableTorch {}
/*!
Starts the capture session
*/
- (void)start
{
self._isStopped = NO;
dispatch_queue_t globalQueue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0);
dispatch_async(globalQueue, ^{
[self.captureSession startRunning];
});
[self hidePreviewLayerView:NO completion:nil];
}
/*!
Stops the capture session
*/
- (void)stop
{
self._isStopped = YES;
dispatch_queue_t globalQueue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0);
dispatch_async(globalQueue, ^{
[self.captureSession stopRunning];
});
[self hidePreviewLayerView:YES completion:nil];
}
/*!
Focuses the camera. This is a no-op since iOS continuously autofocuses. It could later be expanded
to focus on a specific point.
*/
- (void)focusCamera
{
// NOOP
}
/*!
Sets the currently active filter
*/
- (void)setFilterId:(int)filterId
{
_filterId = filterId;
}
/*!
Sets the device configuration flash setting
*/
- (void)_setDeviceConfigurationFlashAvailable: (BOOL) isAvailable{
[_deviceConfiguration setValue:isAvailable ? @TRUE : @FALSE forKey:@"flashIsAvailable"];
}
/*!
Sets the device configuration permission setting
*/
- (void)_setDeviceConfigurationPermissionToUseCamera: (BOOL) granted{
[_deviceConfiguration setValue:granted ? @TRUE : @FALSE forKey:@"permissionToUseCamera"];
}
/*!
Sets the device configuration camera availability
*/
- (void)_setDeviceConfigurationHasCamera: (BOOL) isAvailable{
[_deviceConfiguration setValue:isAvailable ? @TRUE : @FALSE forKey:@"hasCamera"];
}
/*!
Sets the initial device configuration
*/
- (void)_resetDeviceConfiguration
{
_deviceConfiguration = [[NSMutableDictionary alloc] init];
[self _setDeviceConfigurationFlashAvailable:NO];
[self _setDeviceConfigurationPermissionToUseCamera:NO ];
[self _setDeviceConfigurationHasCamera:NO];
[_deviceConfiguration setValue: @1.0 forKey: @"previewHeightPercent"];
[_deviceConfiguration setValue: @1.0 forKey: @"previewWidthPercent"];
}
/*!
Called after the camera and session are set up. This lets you check if a camera is found and permission is granted to use it.
*/
- (void)_commitDeviceConfiguration {
[self deviceWasSetup:_deviceConfiguration];
}
- (void)deviceWasSetup:(NSDictionary *)config {}
/*!
Used to hide the output capture session preview layer
*/
- (void)hidePreviewLayerView:(BOOL)hidden completion:(void(^)(void))completion
{
[UIView animateWithDuration:0.1 animations:^
{
self->_glkView.alpha = (hidden) ? 0.0 : 1.0;
}
completion:^(BOOL finished)
{
if (!completion) return;
completion();
}];
}
// MARK: Getters
/*!
@return The orientation the image should be set to
@note This will always return "right" if the device and the UI rotation match. Also, if the device is rotation locked, the device orientation will always be the same as the interface orientation.
*/
- (UIImageOrientation)getOrientationForImage
{
if (_lastInterfaceOrientation == UIInterfaceOrientationPortrait) {
if (_lastDeviceOrientation == UIDeviceOrientationLandscapeLeft) return UIImageOrientationUp;
if (_lastDeviceOrientation == UIDeviceOrientationLandscapeRight) return UIImageOrientationDown;
if (_lastDeviceOrientation == UIDeviceOrientationPortraitUpsideDown) return UIImageOrientationLeft;
}
if (_lastInterfaceOrientation == UIInterfaceOrientationPortraitUpsideDown) {
if (_lastDeviceOrientation == UIDeviceOrientationLandscapeLeft) return UIImageOrientationUp;
if (_lastDeviceOrientation == UIDeviceOrientationLandscapeRight) return UIImageOrientationDown;
if (_lastDeviceOrientation == UIDeviceOrientationPortrait) return UIImageOrientationLeft;
}
// device landscape left == interface landscape right
if (_lastInterfaceOrientation == UIInterfaceOrientationLandscapeLeft) {
if (_lastDeviceOrientation == UIDeviceOrientationLandscapeLeft) return UIImageOrientationLeft;
if (_lastDeviceOrientation == UIDeviceOrientationPortrait) return UIImageOrientationUp;
if (_lastDeviceOrientation == UIDeviceOrientationPortraitUpsideDown) return UIImageOrientationDown;
}
// device landscape right == interface landscape left
if (_lastInterfaceOrientation == UIInterfaceOrientationLandscapeRight) {
if (_lastDeviceOrientation == UIDeviceOrientationLandscapeRight) return UIImageOrientationLeft;
if (_lastDeviceOrientation == UIDeviceOrientationPortrait) return UIImageOrientationDown;
if (_lastDeviceOrientation == UIDeviceOrientationPortraitUpsideDown) return UIImageOrientationUp;
}
return UIImageOrientationRight;
}
/*!
@return The view that is used to preview the camera output
*/
- (UIView *)getPreviewLayerView
{
return _glkView;
}
- (CGRect)getBounds{
return self.bounds;
}
/*!
Gets the orientation that the image should be set to before cropping and transforming
*/
- (int)getCGImageOrientationForCaptureImage
{
switch ([UIApplication sharedApplication].statusBarOrientation) {
case UIInterfaceOrientationPortrait:
return kCGImagePropertyOrientationUp;
case UIInterfaceOrientationPortraitUpsideDown:
return kCGImagePropertyOrientationDown;
// interface landscape right == device landscape left
case UIInterfaceOrientationLandscapeRight:
return kCGImagePropertyOrientationLeft;
// interface landscape left == device landscape right
case UIInterfaceOrientationLandscapeLeft:
return kCGImagePropertyOrientationRight;
default:
return kCGImagePropertyOrientationUp;
}
}
/*!
Gets a hardware camera device.
@return A camera hardware object or nil if not found
*/
- (AVCaptureDevice *)getCameraDevice{
AVCaptureDevice* possibleDevice;
possibleDevice = [AVCaptureDevice defaultDeviceWithDeviceType:AVCaptureDeviceTypeBuiltInWideAngleCamera mediaType:AVMediaTypeVideo position:AVCaptureDevicePositionBack];
if (possibleDevice) return possibleDevice;
return nil;
}
// MARK: Setup
/*!
Creates a session for the camera device and outputs it to a preview view.
@note Called on view did load
*/
- (void)setupCameraView
{
[self createPreviewViewLayer];
[self _resetDeviceConfiguration];
[self setupCamera];
[self _commitDeviceConfiguration];
[self listenForOrientationChanges];
self._cameraIsSetup = YES;
}
/*!
Creates the preview layer view for the camera output.
@discussion
Produces a GLKView which the camera output is drawn on. There is a possibility that we could switch
to an AVCaptureVideoPreviewLayer instead. This is supposed to handle screen rotation better from what
I've seen. This is how Apple's AVCam project does it as well.
*/
- (void)createPreviewViewLayer
{
if (self.context) return;
self.context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
GLKView *view = [[GLKView alloc] initWithFrame:self.bounds];
view.autoresizingMask = UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleHeight;
view.translatesAutoresizingMaskIntoConstraints = YES;
view.context = self.context;
view.contentScaleFactor = 1.0f;
view.drawableDepthFormat = GLKViewDrawableDepthFormat24;
[self insertSubview:view atIndex:0];
_glkView = view;
glGenRenderbuffers(1, &_renderBuffer);
glBindRenderbuffer(GL_RENDERBUFFER, _renderBuffer);
self._coreImageContext = [CIContext contextWithEAGLContext:self.context];
[EAGLContext setCurrentContext:self.context];
}
/*!
Sets up the hardware and capture session asking for permission to use the camera if needed.
*/
- (void)setupCamera {
if (![self setupCaptureDevice]) return;
if (![self setupInputCaptureFromDevice]) return;
// Set up the capture session from the input
self.captureSession = [[AVCaptureSession alloc] init];
[self.captureSession beginConfiguration];
self.captureSession.sessionPreset = AVCaptureSessionPresetPhoto;
[self.captureSession addInput:self.deviceInput];
// Output session capture to queue
AVCaptureVideoDataOutput *dataOutput = [[AVCaptureVideoDataOutput alloc] init];
[dataOutput setAlwaysDiscardsLateVideoFrames:YES];
[dataOutput setVideoSettings:@{(id)kCVPixelBufferPixelFormatTypeKey:@(kCVPixelFormatType_32BGRA)}];
[dataOutput setSampleBufferDelegate:self queue:dispatch_get_main_queue()];
[self.captureSession addOutput:dataOutput];
// Output session capture to still image output
self.cameraOutput = [[AVCapturePhotoOutput alloc] init];
[self.captureSession addOutput:self.cameraOutput];
// Correct the orientation of the output
[self setVideoOrientation];
[self.captureSession commitConfiguration];
}
/*!
Finds a physical camera, configures it, and sets the captureDevice property to it
@return The captureDevice property value (If falsey, could not find a valid camera)
*/
- (AVCaptureDevice *)setupCaptureDevice{
self.captureDevice = [self getCameraDevice];
if (!self.captureDevice) return nil;
[self _setDeviceConfigurationHasCamera:YES];
[self _setDeviceConfigurationFlashAvailable:([self.captureDevice hasTorch] && [self.captureDevice hasFlash])];
// Setup camera focus mode
if ([self.captureDevice isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus])
{
[self.captureDevice lockForConfiguration:nil];
[self.captureDevice setFocusMode:AVCaptureFocusModeContinuousAutoFocus];
[self.captureDevice unlockForConfiguration];
}
return self.captureDevice;
}
/*!
Gets input from the device (will ask for permission) and sets the deviceInput property.
@return The deviceInput property value (If falsey, permission is not granted)
*/
- (AVCaptureDeviceInput *) setupInputCaptureFromDevice{
NSError *error = nil;
self.deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:self.captureDevice error:&error];
[self _setDeviceConfigurationPermissionToUseCamera:(self.deviceInput != nil)];
return self.deviceInput;
}
// MARK: Orientation
/*!
Sets the current capture session output orientation to the interface orientation
*/
- (void)setVideoOrientation {
UIInterfaceOrientation orientation = [UIApplication sharedApplication].statusBarOrientation;
AVCaptureVideoOrientation videoOrientation;
switch (orientation) {
case UIInterfaceOrientationPortrait:
videoOrientation = AVCaptureVideoOrientationPortrait;
break;
case UIInterfaceOrientationPortraitUpsideDown:
videoOrientation = AVCaptureVideoOrientationPortraitUpsideDown;
break;
case UIInterfaceOrientationLandscapeLeft:
videoOrientation = AVCaptureVideoOrientationLandscapeLeft;
break;
case UIInterfaceOrientationLandscapeRight:
videoOrientation = AVCaptureVideoOrientationLandscapeRight;
break;
default:
videoOrientation = AVCaptureVideoOrientationPortrait;
}
[[[self.captureSession.outputs firstObject].connections firstObject] setVideoOrientation:videoOrientation];
}
- (BOOL)isLandscapeOrientation:(int) orientation {
if (orientation == AVCaptureVideoOrientationLandscapeLeft || orientation == AVCaptureVideoOrientationLandscapeRight) return YES;
return NO;
}
/*!
Listens for status bar orientation changes. On change, it updates the orientation of the video preview output
*/
- (void)listenForOrientationChanges {
[[NSNotificationCenter defaultCenter] addObserver:self
selector:@selector(handleApplicationDidChangeStatusBarNotification:)
name:UIApplicationDidChangeStatusBarOrientationNotification
object:nil];
}
/*!
Responds to status bar orientation change events
*/
- (void)handleApplicationDidChangeStatusBarNotification:(NSNotification *)notification {
[self setVideoOrientation];
}
// MARK: Auto Focus
/*!
Focuses on a point of interest where the user tapped.
*/
- (void)focusAtPoint:(CGPoint)point completionHandler:(void(^)(void))completionHandler
{
AVCaptureDevice *device = self.captureDevice;
CGPoint pointOfInterest = CGPointZero;
CGSize frameSize = self.bounds.size;
pointOfInterest = CGPointMake(point.y / frameSize.height, 1.f - (point.x / frameSize.width));
if ([device isFocusPointOfInterestSupported] && [device isFocusModeSupported:AVCaptureFocusModeAutoFocus])
{
NSError *error;
if ([device lockForConfiguration:&error])
{
if ([device isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus])
{
[device setFocusMode:AVCaptureFocusModeContinuousAutoFocus];
[device setFocusPointOfInterest:pointOfInterest];
}
if([device isExposurePointOfInterestSupported] && [device isExposureModeSupported:AVCaptureExposureModeContinuousAutoExposure])
{
[device setExposurePointOfInterest:pointOfInterest];
[device setExposureMode:AVCaptureExposureModeContinuousAutoExposure];
}
[device unlockForConfiguration];
}
// Always call the completion handler, even when exposure adjustments are unsupported
completionHandler();
}
else
{
completionHandler();
}
}
// MARK: previewLayer Output
/*!
Processes the image output from the capture session.
@note Override this method to add additional processing
*/
-(CIImage *)processOutput:(CIImage *)image
{
return [self applyFilters:image];
}
-(void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
if (self.forceStop) return;
if (self._isStopped || self._isCapturing || !CMSampleBufferIsValid(sampleBuffer)) return;
CVPixelBufferRef pixelBuffer = (CVPixelBufferRef)CMSampleBufferGetImageBuffer(sampleBuffer);
CIImage *image = [CIImage imageWithCVPixelBuffer:pixelBuffer];
// Crop to fit screen
CGRect cropRect = AVMakeRectWithAspectRatioInsideRect(CGSizeMake(self.bounds.size.width, self.bounds.size.height), image.extent);
image = [image imageByCroppingToRect:cropRect];
image = [self processOutput:image];
if (self.context && self._coreImageContext)
{
[self._coreImageContext drawImage:image inRect:self.bounds fromRect:image.extent];
[self.context presentRenderbuffer:GL_RENDERBUFFER];
[_glkView setNeedsDisplay];
}
}
// MARK: Capture Image
-(void)handleCapturedImage:(CIImage *)capturedImage orientation: (UIImageOrientation) orientation {
}
/*!
Responds to the captureOutput delegate call. It applies a few filters and calls handleCapturedImage, which can be overridden for more processing
*/
-(void)captureOutput:(AVCapturePhotoOutput *)photo didFinishProcessingPhotoSampleBuffer:(CMSampleBufferRef)photoSampleBuffer previewPhotoSampleBuffer:(CMSampleBufferRef)previewPhotoSampleBuffer resolvedSettings:(AVCaptureResolvedPhotoSettings *)resolvedSettings bracketSettings:(AVCaptureBracketedStillImageSettings *)bracketSettings error:(NSError *)error
{
if (error) {
NSLog(@"error : %@", error.localizedDescription);
}
if (photoSampleBuffer) {
NSData *imageData = [AVCapturePhotoOutput JPEGPhotoDataRepresentationForJPEGSampleBuffer:photoSampleBuffer previewPhotoSampleBuffer:previewPhotoSampleBuffer];
CIImage *initialImage = [CIImage imageWithData:imageData];
initialImage = [initialImage imageByApplyingOrientation:[self getCGImageOrientationForCaptureImage]];
// Lock in the final image orientation
UIImageOrientation imageOutputOrientation = [self getOrientationForImage];
// Crop to fit screen size
CGSize screenSize = CGSizeMake(self.bounds.size.height, self.bounds.size.width);
CGRect cropRect = AVMakeRectWithAspectRatioInsideRect(screenSize, initialImage.extent);
initialImage = [initialImage imageByCroppingToRect:cropRect];
initialImage = [initialImage imageByApplyingTransform:CGAffineTransformMakeTranslation(-initialImage.extent.origin.x, -initialImage.extent.origin.y)];
[self setEnableTorch: NO];
dispatch_async(_captureImageQueue, ^{
CIImage *enhancedImage = [self applyFilters:initialImage];
self._isCapturing = NO;
[self handleCapturedImage:enhancedImage orientation: imageOutputOrientation];
});
}
}
/*!
Triggers a capture from the photo output
*/
- (void)captureImageLater
{
if (self._isCapturing) return;
self._isCapturing = YES;
AVCapturePhotoSettings *settings = [[AVCapturePhotoSettings alloc] init];
NSNumber *previewPixelType = settings.availablePreviewPhotoPixelFormatTypes.firstObject;
NSString *formatTypeKey = (NSString *)kCVPixelBufferPixelFormatTypeKey;
NSString *widthKey = (NSString *)kCVPixelBufferWidthKey;
NSString *heightKey = (NSString *)kCVPixelBufferHeightKey;
NSDictionary *previewFormat = @{formatTypeKey:previewPixelType,
widthKey:@1024,
heightKey:@768
};
settings.previewPhotoFormat = previewFormat;
[self.cameraOutput capturePhotoWithSettings:settings delegate:self];
}
// MARK: Filters
/*!
Applies filters to the CIImage based on configuration
*/
- (CIImage *)applyFilters:(CIImage *)image{
switch (self.filterId) {
case 1: return image;
case 2: return [self applyGreyScaleFilterToImage:image];
case 3: return [self applyColorFilterToImage:image];
case 4: return [self applyBlackAndWhiteFilterToImage:image];
default: return image;
}
}
/*!
Adds a greyscale filter over the image
*/
- (CIImage *)applyGreyScaleFilterToImage:(CIImage *)image
{
return [CIFilter filterWithName:@"CIColorControls" keysAndValues:kCIInputImageKey, image, kCIInputBrightnessKey, @(0), kCIInputContrastKey, @(1), kCIInputSaturationKey, @(0), nil].outputImage;
}
/*!
Adds a black and white filter that bumps up the clarity of edges
*/
- (CIImage *)applyBlackAndWhiteFilterToImage:(CIImage *)image
{
return [CIFilter filterWithName:@"CIColorControls" keysAndValues:kCIInputImageKey, image, kCIInputBrightnessKey, @(0.4), kCIInputContrastKey, @(2), kCIInputSaturationKey, @(0), nil].outputImage;
}
/*!
Adds a color filter that bumps up the clarity of edges
*/
- (CIImage *)applyColorFilterToImage:(CIImage *)image
{
return [CIFilter filterWithName:@"CIColorControls" keysAndValues:kCIInputImageKey, image, kCIInputBrightnessKey, @(0.35), kCIInputContrastKey, @(1.9), kCIInputSaturationKey, @(0.75), nil].outputImage;
}
@end
================================================
FILE: ios/RNRectangleScanner.xcodeproj/project.pbxproj
================================================
// !$*UTF8*$!
{
archiveVersion = 1;
classes = {
};
objectVersion = 51;
objects = {
/* Begin PBXBuildFile section */
23036F1023AABB5400C4A663 /* CameraDeviceController.m in Sources */ = {isa = PBXBuildFile; fileRef = 23036F0923AABB5400C4A663 /* CameraDeviceController.m */; };
23036F1123AABB5400C4A663 /* RNRectangleScannerManager.m in Sources */ = {isa = PBXBuildFile; fileRef = 23036F0A23AABB5400C4A663 /* RNRectangleScannerManager.m */; };
23036F1223AABB5400C4A663 /* RNRectangleScannerView.m in Sources */ = {isa = PBXBuildFile; fileRef = 23036F0C23AABB5400C4A663 /* RNRectangleScannerView.m */; };
23036F1323AABB5400C4A663 /* RectangleDetectionController.m in Sources */ = {isa = PBXBuildFile; fileRef = 23036F0D23AABB5400C4A663 /* RectangleDetectionController.m */; };
23036F4923AABDBE00C4A663 /* libReact.a in Frameworks */ = {isa = PBXBuildFile; fileRef = 23036F2A23AABDB000C4A663 /* libReact.a */; };
/* End PBXBuildFile section */
/* Begin PBXContainerItemProxy section */
23036F2923AABDB000C4A663 /* PBXContainerItemProxy */ = {
isa = PBXContainerItemProxy;
containerPortal = 23036F1623AABDAF00C4A663 /* React.xcodeproj */;
proxyType = 2;
remoteGlobalIDString = 83CBBA2E1A601D0E00E9B192;
remoteInfo = React;
};
23036F2B23AABDB000C4A663 /* PBXContainerItemProxy */ = {
isa = PBXContainerItemProxy;
containerPortal = 23036F1623AABDAF00C4A663 /* React.xcodeproj */;
proxyType = 2;
remoteGlobalIDString = 2D2A28131D9B038B00D4039D;
remoteInfo = "React-tvOS";
};
23036F2D23AABDB000C4A663 /* PBXContainerItemProxy */ = {
isa = PBXContainerItemProxy;
containerPortal = 23036F1623AABDAF00C4A663 /* React.xcodeproj */;
proxyType = 2;
remoteGlobalIDString = 3D3C059A1DE3340900C268FA;
remoteInfo = yoga;
};
23036F2F23AABDB000C4A663 /* PBXContainerItemProxy */ = {
isa = PBXContainerItemProxy;
containerPortal = 23036F1623AABDAF00C4A663 /* React.xcodeproj */;
proxyType = 2;
remoteGlobalIDString = 3D3C06751DE3340C00C268FA;
remoteInfo = "yoga-tvOS";
};
23036F3123AABDB000C4A663 /* PBXContainerItemProxy */ = {
isa = PBXContainerItemProxy;
containerPortal = 23036F1623AABDAF00C4A663 /* React.xcodeproj */;
proxyType = 2;
remoteGlobalIDString = 3D3CD9251DE5FBEC00167DC4;
remoteInfo = cxxreact;
};
23036F3323AABDB000C4A663 /* PBXContainerItemProxy */ = {
isa = PBXContainerItemProxy;
containerPortal = 23036F1623AABDAF00C4A663 /* React.xcodeproj */;
proxyType = 2;
remoteGlobalIDString = 3D3CD9321DE5FBEE00167DC4;
remoteInfo = "cxxreact-tvOS";
};
23036F3523AABDB000C4A663 /* PBXContainerItemProxy */ = {
isa = PBXContainerItemProxy;
containerPortal = 23036F1623AABDAF00C4A663 /* React.xcodeproj */;
proxyType = 2;
remoteGlobalIDString = EBF21BDC1FC498900052F4D5;
remoteInfo = jsinspector;
};
23036F3723AABDB000C4A663 /* PBXContainerItemProxy */ = {
isa = PBXContainerItemProxy;
containerPortal = 23036F1623AABDAF00C4A663 /* React.xcodeproj */;
proxyType = 2;
remoteGlobalIDString = EBF21BFA1FC4989A0052F4D5;
remoteInfo = "jsinspector-tvOS";
};
23036F3923AABDB000C4A663 /* PBXContainerItemProxy */ = {
isa = PBXContainerItemProxy;
containerPortal = 23036F1623AABDAF00C4A663 /* React.xcodeproj */;
proxyType = 2;
remoteGlobalIDString = 139D7ECE1E25DB7D00323FB7;
remoteInfo = "third-party";
};
23036F3B23AABDB000C4A663 /* PBXContainerItemProxy */ = {
isa = PBXContainerItemProxy;
containerPortal = 23036F1623AABDAF00C4A663 /* React.xcodeproj */;
proxyType = 2;
remoteGlobalIDString = 3D383D3C1EBD27B6005632C8;
remoteInfo = "third-party-tvOS";
};
23036F3D23AABDB000C4A663 /* PBXContainerItemProxy */ = {
isa = PBXContainerItemProxy;
containerPortal = 23036F1623AABDAF00C4A663 /* React.xcodeproj */;
proxyType = 2;
remoteGlobalIDString = 139D7E881E25C6D100323FB7;
remoteInfo = "double-conversion";
};
23036F3F23AABDB000C4A663 /* PBXContainerItemProxy */ = {
isa = PBXContainerItemProxy;
containerPortal = 23036F1623AABDAF00C4A663 /* React.xcodeproj */;
proxyType = 2;
remoteGlobalIDString = 3D383D621EBD27B9005632C8;
remoteInfo = "double-conversion-tvOS";
};
23036F4123AABDB000C4A663 /* PBXContainerItemProxy */ = {
isa = PBXContainerItemProxy;
containerPortal = 23036F1623AABDAF00C4A663 /* React.xcodeproj */;
proxyType = 2;
remoteGlobalIDString = EDEBC6D6214B3E7000DD5AC8;
remoteInfo = jsi;
};
23036F4323AABDB000C4A663 /* PBXContainerItemProxy */ = {
isa = PBXContainerItemProxy;
containerPortal = 23036F1623AABDAF00C4A663 /* React.xcodeproj */;
proxyType = 2;
remoteGlobalIDString = EDEBC73B214B45A300DD5AC8;
remoteInfo = jsiexecutor;
};
23036F4523AABDB000C4A663 /* PBXContainerItemProxy */ = {
isa = PBXContainerItemProxy;
containerPortal = 23036F1623AABDAF00C4A663 /* React.xcodeproj */;
proxyType = 2;
remoteGlobalIDString = ED296FB6214C9A0900B7C4FE;
remoteInfo = "jsi-tvOS";
};
23036F4723AABDB000C4A663 /* PBXContainerItemProxy */ = {
isa = PBXContainerItemProxy;
containerPortal = 23036F1623AABDAF00C4A663 /* React.xcodeproj */;
proxyType = 2;
remoteGlobalIDString = ED296FEE214C9CF800B7C4FE;
remoteInfo = "jsiexecutor-tvOS";
};
/* End PBXContainerItemProxy section */
/* Begin PBXCopyFilesBuildPhase section */
58B511D91A9E6C8500147676 /* CopyFiles */ = {
isa = PBXCopyFilesBuildPhase;
buildActionMask = 2147483647;
dstPath = "include/$(PRODUCT_NAME)";
dstSubfolderSpec = 16;
files = (
);
runOnlyForDeploymentPostprocessing = 0;
};
/* End PBXCopyFilesBuildPhase section */
/* Begin PBXFileReference section */
134814201AA4EA6300B7C361 /* libRNRectangleScanner.a */ = {isa = PBXFileReference; explicitFileType = archive.ar; includeInIndex = 0; path = libRNRectangleScanner.a; sourceTree = BUILT_PRODUCTS_DIR; };
23036F0823AABB5400C4A663 /* RectangleDetectionController.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = RectangleDetectionController.h; sourceTree = "<group>"; };
23036F0923AABB5400C4A663 /* CameraDeviceController.m */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.objc; path = CameraDeviceController.m; sourceTree = "<group>"; wrapsLines = 1; };
23036F0A23AABB5400C4A663 /* RNRectangleScannerManager.m */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.objc; path = RNRectangleScannerManager.m; sourceTree = "<group>"; };
23036F0B23AABB5400C4A663 /* CameraDeviceController.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = CameraDeviceController.h; sourceTree = "<group>"; };
23036F0C23AABB5400C4A663 /* RNRectangleScannerView.m */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.objc; path = RNRectangleScannerView.m; sourceTree = "<group>"; };
23036F0D23AABB5400C4A663 /* RectangleDetectionController.m */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.objc; path = RectangleDetectionController.m; sourceTree = "<group>"; };
23036F0E23AABB5400C4A663 /* RNRectangleScannerView.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = RNRectangleScannerView.h; sourceTree = "<group>"; };
23036F0F23AABB5400C4A663 /* RNRectangleScannerManager.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = RNRectangleScannerManager.h; sourceTree = "<group>"; };
23036F1623AABDAF00C4A663 /* React.xcodeproj */ = {isa = PBXFileReference; lastKnownFileType = "wrapper.pb-project"; name = React.xcodeproj; path = "../node_modules/react-native/React/React.xcodeproj"; sourceTree = "<group>"; };
/* End PBXFileReference section */
/* Begin PBXFrameworksBuildPhase section */
58B511D81A9E6C8500147676 /* Frameworks */ = {
isa = PBXFrameworksBuildPhase;
buildActionMask = 2147483647;
files = (
23036F4923AABDBE00C4A663 /* libReact.a in Frameworks */,
);
runOnlyForDeploymentPostprocessing = 0;
};
/* End PBXFrameworksBuildPhase section */
/* Begin PBXGroup section */
134814211AA4EA7D00B7C361 /* Products */ = {
isa = PBXGroup;
children = (
134814201AA4EA6300B7C361 /* libRNRectangleScanner.a */,
);
name = Products;
sourceTree = "<group>";
};
23036F1523AABDAF00C4A663 /* Frameworks */ = {
isa = PBXGroup;
children = (
23036F1623AABDAF00C4A663 /* React.xcodeproj */,
);
name = Frameworks;
sourceTree = "<group>";
};
23036F1723AABDAF00C4A663 /* Products */ = {
isa = PBXGroup;
children = (
23036F2A23AABDB000C4A663 /* libReact.a */,
23036F2C23AABDB000C4A663 /* libReact.a */,
23036F2E23AABDB000C4A663 /* libyoga.a */,
23036F3023AABDB000C4A663 /* libyoga.a */,
23036F3223AABDB000C4A663 /* libcxxreact.a */,
23036F3423AABDB000C4A663 /* libcxxreact.a */,
23036F3623AABDB000C4A663 /* libjsinspector.a */,
23036F3823AABDB000C4A663 /* libjsinspector-tvOS.a */,
23036F3A23AABDB000C4A663 /* libthird-party.a */,
23036F3C23AABDB000C4A663 /* libthird-party.a */,
23036F3E23AABDB000C4A663 /* libdouble-conversion.a */,
23036F4023AABDB000C4A663 /* libdouble-conversion.a */,
23036F4223AABDB000C4A663 /* libjsi.a */,
23036F4423AABDB000C4A663 /* libjsiexecutor.a */,
23036F4623AABDB000C4A663 /* libjsi-tvOS.a */,
23036F4823AABDB000C4A663 /* libjsiexecutor-tvOS.a */,
);
name = Products;
sourceTree = "<group>";
};
58B511D21A9E6C8500147676 = {
isa = PBXGroup;
children = (
23036F0B23AABB5400C4A663 /* CameraDeviceController.h */,
23036F0923AABB5400C4A663 /* CameraDeviceController.m */,
23036F0E23AABB5400C4A663 /* RNRectangleScannerView.h */,
23036F0C23AABB5400C4A663 /* RNRectangleScannerView.m */,
23036F0F23AABB5400C4A663 /* RNRectangleScannerManager.h */,
23036F0A23AABB5400C4A663 /* RNRectangleScannerManager.m */,
23036F0823AABB5400C4A663 /* RectangleDetectionController.h */,
23036F0D23AABB5400C4A663 /* RectangleDetectionController.m */,
134814211AA4EA7D00B7C361 /* Products */,
23036F1523AABDAF00C4A663 /* Frameworks */,
);
indentWidth = 2;
sourceTree = "<group>";
tabWidth = 2;
};
/* End PBXGroup section */
/* Begin PBXNativeTarget section */
58B511DA1A9E6C8500147676 /* RNRectangleScanner */ = {
isa = PBXNativeTarget;
buildConfigurationList = 58B511EF1A9E6C8500147676 /* Build configuration list for PBXNativeTarget "RNRectangleScanner" */;
buildPhases = (
58B511D71A9E6C8500147676 /* Sources */,
58B511D81A9E6C8500147676 /* Frameworks */,
58B511D91A9E6C8500147676 /* CopyFiles */,
);
buildRules = (
);
dependencies = (
);
name = RNRectangleScanner;
productName = RCTDataManager;
productReference = 134814201AA4EA6300B7C361 /* libRNRectangleScanner.a */;
productType = "com.apple.product-type.library.static";
};
/* End PBXNativeTarget section */
/* Begin PBXProject section */
58B511D31A9E6C8500147676 /* Project object */ = {
isa = PBXProject;
attributes = {
LastUpgradeCheck = 1130;
ORGANIZATIONNAME = HarvestProfit;
TargetAttributes = {
58B511DA1A9E6C8500147676 = {
CreatedOnToolsVersion = 6.1.1;
};
};
};
buildConfigurationList = 58B511D61A9E6C8500147676 /* Build configuration list for PBXProject "RNRectangleScanner" */;
compatibilityVersion = "Xcode 10.0";
developmentRegion = en;
hasScannedForEncodings = 0;
knownRegions = (
en,
Base,
);
mainGroup = 58B511D21A9E6C8500147676;
productRefGroup = 58B511D21A9E6C8500147676;
projectDirPath = "";
projectReferences = (
{
ProductGroup = 23036F1723AABDAF00C4A663 /* Products */;
ProjectRef = 23036F1623AABDAF00C4A663 /* React.xcodeproj */;
},
);
projectRoot = "";
targets = (
58B511DA1A9E6C8500147676 /* RNRectangleScanner */,
);
};
/* End PBXProject section */
/* Begin PBXReferenceProxy section */
23036F2A23AABDB000C4A663 /* libReact.a */ = {
isa = PBXReferenceProxy;
fileType = archive.ar;
path = libReact.a;
remoteRef = 23036F2923AABDB000C4A663 /* PBXContainerItemProxy */;
sourceTree = BUILT_PRODUCTS_DIR;
};
23036F2C23AABDB000C4A663 /* libReact.a */ = {
isa = PBXReferenceProxy;
fileType = archive.ar;
path = libReact.a;
remoteRef = 23036F2B23AABDB000C4A663 /* PBXContainerItemProxy */;
sourceTree = BUILT_PRODUCTS_DIR;
};
23036F2E23AABDB000C4A663 /* libyoga.a */ = {
isa = PBXReferenceProxy;
fileType = archive.ar;
path = libyoga.a;
remoteRef = 23036F2D23AABDB000C4A663 /* PBXContainerItemProxy */;
sourceTree = BUILT_PRODUCTS_DIR;
};
23036F3023AABDB000C4A663 /* libyoga.a */ = {
isa = PBXReferenceProxy;
fileType = archive.ar;
path = libyoga.a;
remoteRef = 23036F2F23AABDB000C4A663 /* PBXContainerItemProxy */;
sourceTree = BUILT_PRODUCTS_DIR;
};
23036F3223AABDB000C4A663 /* libcxxreact.a */ = {
isa = PBXReferenceProxy;
fileType = archive.ar;
path = libcxxreact.a;
remoteRef = 23036F3123AABDB000C4A663 /* PBXContainerItemProxy */;
sourceTree = BUILT_PRODUCTS_DIR;
};
23036F3423AABDB000C4A663 /* libcxxreact.a */ = {
isa = PBXReferenceProxy;
fileType = archive.ar;
path = libcxxreact.a;
remoteRef = 23036F3323AABDB000C4A663 /* PBXContainerItemProxy */;
sourceTree = BUILT_PRODUCTS_DIR;
};
23036F3623AABDB000C4A663 /* libjsinspector.a */ = {
isa = PBXReferenceProxy;
fileType = archive.ar;
path = libjsinspector.a;
remoteRef = 23036F3523AABDB000C4A663 /* PBXContainerItemProxy */;
sourceTree = BUILT_PRODUCTS_DIR;
};
23036F3823AABDB000C4A663 /* libjsinspector-tvOS.a */ = {
isa = PBXReferenceProxy;
fileType = archive.ar;
path = "libjsinspector-tvOS.a";
remoteRef = 23036F3723AABDB000C4A663 /* PBXContainerItemProxy */;
sourceTree = BUILT_PRODUCTS_DIR;
};
23036F3A23AABDB000C4A663 /* libthird-party.a */ = {
isa = PBXReferenceProxy;
fileType = archive.ar;
path = "libthird-party.a";
remoteRef = 23036F3923AABDB000C4A663 /* PBXContainerItemProxy */;
sourceTree = BUILT_PRODUCTS_DIR;
};
23036F3C23AABDB000C4A663 /* libthird-party.a */ = {
isa = PBXReferenceProxy;
fileType = archive.ar;
path = "libthird-party.a";
remoteRef = 23036F3B23AABDB000C4A663 /* PBXContainerItemProxy */;
sourceTree = BUILT_PRODUCTS_DIR;
};
23036F3E23AABDB000C4A663 /* libdouble-conversion.a */ = {
isa = PBXReferenceProxy;
fileType = archive.ar;
path = "libdouble-conversion.a";
remoteRef = 23036F3D23AABDB000C4A663 /* PBXContainerItemProxy */;
sourceTree = BUILT_PRODUCTS_DIR;
};
23036F4023AABDB000C4A663 /* libdouble-conversion.a */ = {
isa = PBXReferenceProxy;
fileType = archive.ar;
path = "libdouble-conversion.a";
remoteRef = 23036F3F23AABDB000C4A663 /* PBXContainerItemProxy */;
sourceTree = BUILT_PRODUCTS_DIR;
};
23036F4223AABDB000C4A663 /* libjsi.a */ = {
isa = PBXReferenceProxy;
fileType = archive.ar;
path = libjsi.a;
remoteRef = 23036F4123AABDB000C4A663 /* PBXContainerItemProxy */;
sourceTree = BUILT_PRODUCTS_DIR;
};
23036F4423AABDB000C4A663 /* libjsiexecutor.a */ = {
isa = PBXReferenceProxy;
fileType = archive.ar;
path = libjsiexecutor.a;
remoteRef = 23036F4323AABDB000C4A663 /* PBXContainerItemProxy */;
sourceTree = BUILT_PRODUCTS_DIR;
};
23036F4623AABDB000C4A663 /* libjsi-tvOS.a */ = {
isa = PBXReferenceProxy;
fileType = archive.ar;
path = "libjsi-tvOS.a";
remoteRef = 23036F4523AABDB000C4A663 /* PBXContainerItemProxy */;
sourceTree = BUILT_PRODUCTS_DIR;
};
23036F4823AABDB000C4A663 /* libjsiexecutor-tvOS.a */ = {
isa = PBXReferenceProxy;
fileType = archive.ar;
path = "libjsiexecutor-tvOS.a";
remoteRef = 23036F4723AABDB000C4A663 /* PBXContainerItemProxy */;
sourceTree = BUILT_PRODUCTS_DIR;
};
/* End PBXReferenceProxy section */
/* Begin PBXSourcesBuildPhase section */
58B511D71A9E6C8500147676 /* Sources */ = {
isa = PBXSourcesBuildPhase;
buildActionMask = 2147483647;
files = (
23036F1123AABB5400C4A663 /* RNRectangleScannerManager.m in Sources */,
23036F1323AABB5400C4A663 /* RectangleDetectionController.m in Sources */,
23036F1223AABB5400C4A663 /* RNRectangleScannerView.m in Sources */,
23036F1023AABB5400C4A663 /* CameraDeviceController.m in Sources */,
);
runOnlyForDeploymentPostprocessing = 0;
};
/* End PBXSourcesBuildPhase section */
/* Begin XCBuildConfiguration section */
58B511ED1A9E6C8500147676 /* Debug */ = {
isa = XCBuildConfiguration;
buildSettings = {
ALWAYS_SEARCH_USER_PATHS = NO;
CLANG_ANALYZER_LOCALIZABILITY_NONLOCALIZED = YES;
CLANG_CXX_LANGUAGE_STANDARD = "gnu++0x";
CLANG_CXX_LIBRARY = "libc++";
CLANG_ENABLE_MODULES = YES;
CLANG_ENABLE_OBJC_ARC = YES;
CLANG_WARN_BLOCK_CAPTURE_AUTORELEASING = YES;
CLANG_WARN_BOOL_CONVERSION = YES;
CLANG_WARN_COMMA = YES;
CLANG_WARN_CONSTANT_CONVERSION = YES;
CLANG_WARN_DEPRECATED_OBJC_IMPLEMENTATIONS = YES;
CLANG_WARN_DIRECT_OBJC_ISA_USAGE = YES_ERROR;
CLANG_WARN_EMPTY_BODY = YES;
CLANG_WARN_ENUM_CONVERSION = YES;
CLANG_WARN_INFINITE_RECURSION = YES;
CLANG_WARN_INT_CONVERSION = YES;
CLANG_WARN_NON_LITERAL_NULL_CONVERSION = YES;
CLANG_WARN_OBJC_IMPLICIT_RETAIN_SELF = YES;
CLANG_WARN_OBJC_LITERAL_CONVERSION = YES;
CLANG_WARN_OBJC_ROOT_CLASS = YES_ERROR;
CLANG_WARN_RANGE_LOOP_ANALYSIS = YES;
CLANG_WARN_STRICT_PROTOTYPES = YES;
CLANG_WARN_SUSPICIOUS_MOVE = YES;
CLANG_WARN_UNREACHABLE_CODE = YES;
CLANG_WARN__DUPLICATE_METHOD_MATCH = YES;
COPY_PHASE_STRIP = NO;
ENABLE_STRICT_OBJC_MSGSEND = YES;
ENABLE_TESTABILITY = YES;
GCC_C_LANGUAGE_STANDARD = gnu99;
GCC_DYNAMIC_NO_PIC = NO;
GCC_NO_COMMON_BLOCKS = YES;
GCC_OPTIMIZATION_LEVEL = 0;
GCC_PREPROCESSOR_DEFINITIONS = (
"DEBUG=1",
"$(inherited)",
);
GCC_SYMBOLS_PRIVATE_EXTERN = NO;
GCC_WARN_64_TO_32_BIT_CONVERSION = YES;
GCC_WARN_ABOUT_RETURN_TYPE = YES_ERROR;
GCC_WARN_UNDECLARED_SELECTOR = YES;
GCC_WARN_UNINITIALIZED_AUTOS = YES_AGGRESSIVE;
GCC_WARN_UNUSED_FUNCTION = YES;
GCC_WARN_UNUSED_VARIABLE = YES;
IPHONEOS_DEPLOYMENT_TARGET = 10.0;
MTL_ENABLE_DEBUG_INFO = YES;
ONLY_ACTIVE_ARCH = YES;
SDKROOT = iphoneos;
};
name = Debug;
};
58B511EE1A9E6C8500147676 /* Release */ = {
isa = XCBuildConfiguration;
buildSettings = {
ALWAYS_SEARCH_USER_PATHS = NO;
CLANG_ANALYZER_LOCALIZABILITY_NONLOCALIZED = YES;
CLANG_CXX_LANGUAGE_STANDARD = "gnu++0x";
CLANG_CXX_LIBRARY = "libc++";
CLANG_ENABLE_MODULES = YES;
CLANG_ENABLE_OBJC_ARC = YES;
CLANG_WARN_BLOCK_CAPTURE_AUTORELEASING = YES;
CLANG_WARN_BOOL_CONVERSION = YES;
CLANG_WARN_COMMA = YES;
CLANG_WARN_CONSTANT_CONVERSION = YES;
CLANG_WARN_DEPRECATED_OBJC_IMPLEMENTATIONS = YES;
CLANG_WARN_DIRECT_OBJC_ISA_USAGE = YES_ERROR;
CLANG_WARN_EMPTY_BODY = YES;
CLANG_WARN_ENUM_CONVERSION = YES;
CLANG_WARN_INFINITE_RECURSION = YES;
CLANG_WARN_INT_CONVERSION = YES;
CLANG_WARN_NON_LITERAL_NULL_CONVERSION = YES;
CLANG_WARN_OBJC_IMPLICIT_RETAIN_SELF = YES;
CLANG_WARN_OBJC_LITERAL_CONVERSION = YES;
CLANG_WARN_OBJC_ROOT_CLASS = YES_ERROR;
CLANG_WARN_RANGE_LOOP_ANALYSIS = YES;
CLANG_WARN_STRICT_PROTOTYPES = YES;
CLANG_WARN_SUSPICIOUS_MOVE = YES;
CLANG_WARN_UNREACHABLE_CODE = YES;
CLANG_WARN__DUPLICATE_METHOD_MATCH = YES;
COPY_PHASE_STRIP = YES;
ENABLE_NS_ASSERTIONS = NO;
ENABLE_STRICT_OBJC_MSGSEND = YES;
GCC_C_LANGUAGE_STANDARD = gnu99;
GCC_NO_COMMON_BLOCKS = YES;
GCC_WARN_64_TO_32_BIT_CONVERSION = YES;
GCC_WARN_ABOUT_RETURN_TYPE = YES_ERROR;
GCC_WARN_UNDECLARED_SELECTOR = YES;
GCC_WARN_UNINITIALIZED_AUTOS = YES_AGGRESSIVE;
GCC_WARN_UNUSED_FUNCTION = YES;
GCC_WARN_UNUSED_VARIABLE = YES;
IPHONEOS_DEPLOYMENT_TARGET = 10.0;
MTL_ENABLE_DEBUG_INFO = NO;
SDKROOT = iphoneos;
VALIDATE_PRODUCT = YES;
};
name = Release;
};
58B511F01A9E6C8500147676 /* Debug */ = {
isa = XCBuildConfiguration;
buildSettings = {
HEADER_SEARCH_PATHS = (
"$(inherited)",
/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/include,
"$(SRCROOT)/../../../React/**",
"$(SRCROOT)/../../react-native/React/**",
);
LIBRARY_SEARCH_PATHS = "$(inherited)";
OTHER_LDFLAGS = "-ObjC";
PRODUCT_NAME = RNRectangleScanner;
SKIP_INSTALL = YES;
};
name = Debug;
};
58B511F11A9E6C8500147676 /* Release */ = {
isa = XCBuildConfiguration;
buildSettings = {
HEADER_SEARCH_PATHS = (
"$(inherited)",
/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/include,
"$(SRCROOT)/../../../React/**",
"$(SRCROOT)/../../react-native/React/**",
);
LIBRARY_SEARCH_PATHS = "$(inherited)";
OTHER_LDFLAGS = "-ObjC";
PRODUCT_NAME = RNRectangleScanner;
SKIP_INSTALL = YES;
};
name = Release;
};
/* End XCBuildConfiguration section */
/* Begin XCConfigurationList section */
58B511D61A9E6C8500147676 /* Build configuration list for PBXProject "RNRectangleScanner" */ = {
isa = XCConfigurationList;
buildConfigurations = (
58B511ED1A9E6C8500147676 /* Debug */,
58B511EE1A9E6C8500147676 /* Release */,
);
defaultConfigurationIsVisible = 0;
defaultConfigurationName = Release;
};
58B511EF1A9E6C8500147676 /* Build configuration list for PBXNativeTarget "RNRectangleScanner" */ = {
isa = XCConfigurationList;
buildConfigurations = (
58B511F01A9E6C8500147676 /* Debug */,
58B511F11A9E6C8500147676 /* Release */,
);
defaultConfigurationIsVisible = 0;
defaultConfigurationName = Release;
};
/* End XCConfigurationList section */
};
rootObject = 58B511D31A9E6C8500147676 /* Project object */;
}
================================================
FILE: ios/RNRectangleScanner.xcodeproj/xcshareddata/xcschemes/RNRectangleScanner.xcscheme
================================================
<?xml version="1.0" encoding="UTF-8"?>
<Scheme
LastUpgradeVersion = "1130"
version = "1.3">
<BuildAction
parallelizeBuildables = "YES"
buildImplicitDependencies = "YES">
<BuildActionEntries>
<BuildActionEntry
buildForTesting = "YES"
buildForRunning = "YES"
buildForProfiling = "YES"
buildForArchiving = "YES"
buildForAnalyzing = "YES">
<BuildableReference
BuildableIdentifier = "primary"
BlueprintIdentifier = "58B511DA1A9E6C8500147676"
BuildableName = "libRNRectangleScanner.a"
BlueprintName = "RNRectangleScanner"
ReferencedContainer = "container:RNRectangleScanner.xcodeproj">
</BuildableReference>
</BuildActionEntry>
</BuildActionEntries>
</BuildAction>
<TestAction
buildConfiguration = "Debug"
selectedDebuggerIdentifier = "Xcode.DebuggerFoundation.Debugger.LLDB"
selectedLauncherIdentifier = "Xcode.DebuggerFoundation.Launcher.LLDB"
shouldUseLaunchSchemeArgsEnv = "YES">
<Testables>
</Testables>
</TestAction>
<LaunchAction
buildConfiguration = "Debug"
selectedDebuggerIdentifier = "Xcode.DebuggerFoundation.Debugger.LLDB"
selectedLauncherIdentifier = "Xcode.DebuggerFoundation.Launcher.LLDB"
launchStyle = "0"
useCustomWorkingDirectory = "NO"
ignoresPersistentStateOnLaunch = "NO"
debugDocumentVersioning = "YES"
debugServiceExtension = "internal"
allowLocationSimulation = "YES">
</LaunchAction>
<ProfileAction
buildConfiguration = "Release"
shouldUseLaunchSchemeArgsEnv = "YES"
savedToolIdentifier = ""
useCustomWorkingDirectory = "NO"
debugDocumentVersioning = "YES">
<MacroExpansion>
<BuildableReference
BuildableIdentifier = "primary"
BlueprintIdentifier = "58B511DA1A9E6C8500147676"
BuildableName = "libRNRectangleScanner.a"
BlueprintName = "RNRectangleScanner"
ReferencedContainer = "container:RNRectangleScanner.xcodeproj">
</BuildableReference>
</MacroExpansion>
</ProfileAction>
<AnalyzeAction
buildConfiguration = "Debug">
</AnalyzeAction>
<ArchiveAction
buildConfiguration = "Release"
revealArchiveInOrganizer = "YES">
</ArchiveAction>
</Scheme>
================================================
FILE: ios/RNRectangleScannerManager.h
================================================
//
// RNRectangleScannerManager.h
//
// Created by Jake Humphrey on Jan 6, 2020.
// Copyright (c) 2020 Jake Humphrey. All rights reserved.
//
#import <React/RCTBridgeModule.h>
#import <React/RCTViewManager.h>
@interface RNRectangleScannerManager : RCTViewManager <RCTBridgeModule>
@end
================================================
FILE: ios/RNRectangleScannerManager.m
================================================
//
// RNRectangleScannerManager.m
//
// Created by Jake Humphrey on Jan 6, 2020.
// Copyright (c) 2020 Jake Humphrey. All rights reserved.
//
#import "RNRectangleScannerManager.h"
#import "RNRectangleScannerView.h"
@interface RNRectangleScannerManager()
@property (strong, nonatomic) RNRectangleScannerView *scannerView;
@end
/*!
The React view manager. Exports props and methods to React.
*/
@implementation RNRectangleScannerManager
- (dispatch_queue_t)methodQueue
{
return dispatch_get_main_queue();
}
RCT_EXPORT_MODULE(RNRectangleScannerManager)
/*!
Turns the torch (flash light) on or off
*/
RCT_EXPORT_VIEW_PROPERTY(enableTorch, BOOL)
/*!
The JPEG image quality of the final captured image (defaults to 0.5)
*/
RCT_EXPORT_VIEW_PROPERTY(capturedQuality, float)
/*!
Determines what filter id to use (1, 2, 3, or 4)
*/
RCT_EXPORT_VIEW_PROPERTY(filterId, int)
// MARK: Life cycle Actions
/*!
Starts the camera. If not setup, it will set up the device as well.
*/
RCT_EXPORT_METHOD(start) {
[_scannerView startCamera];
}
/*!
Stops the camera without performing any cleanup actions.
*/
RCT_EXPORT_METHOD(stop) {
[_scannerView stopCamera];
}
/*!
Focuses the camera
*/
RCT_EXPORT_METHOD(focusCamera) {
[_scannerView focusCamera];
}
/*!
Cleans up the camera session and any related resources
*/
RCT_EXPORT_METHOD(cleanup) {
[_scannerView cleanup];
}
/*!
Stops the camera, reinitializes everything, and starts the camera.
*/
RCT_EXPORT_METHOD(refresh) {
[_scannerView cleanup];
[_scannerView startCamera];
}
/*!
Starts taking a picture. This triggers the picture taken and picture processed events.
*/
RCT_EXPORT_METHOD(capture) {
[_scannerView capture];
}
// MARK: Life cycle Events
/*!
Called when the device is setup, the event contains information about permissions and camera capabilities
*/
RCT_EXPORT_VIEW_PROPERTY(onDeviceSetup, RCTDirectEventBlock)
/*!
Called when the frame is captured. This is before any processing and is only available in memory.
*/
RCT_EXPORT_VIEW_PROPERTY(onPictureTaken, RCTDirectEventBlock)
/*!
Called when the captured frame is processed and saved to the temp file directory.
*/
RCT_EXPORT_VIEW_PROPERTY(onPictureProcessed, RCTDirectEventBlock)
/*!
Called when a rectangle is detected
*/
RCT_EXPORT_VIEW_PROPERTY(onErrorProcessingImage, RCTDirectEventBlock)
/*!
Called when a rectangle is detected
*/
RCT_EXPORT_VIEW_PROPERTY(onRectangleDetected, RCTDirectEventBlock)
/*!
Called when the flash is turned off or on
*/
RCT_EXPORT_VIEW_PROPERTY(onTorchChanged, RCTDirectEventBlock)
- (UIView*) view {
_scannerView = [[RNRectangleScannerView alloc] init];
return _scannerView;
}
@end
================================================
FILE: ios/RNRectangleScannerView.h
================================================
//
// RNRectangleScannerView.h
//
// Created by Jake Humphrey on Jan 6, 2020.
// Copyright (c) 2020 Jake Humphrey. All rights reserved.
//
#import "RectangleDetectionController.h"
#import <React/RCTViewManager.h>
@interface RNRectangleScannerView : RectangleDetectionController
@property (nonatomic, copy) RCTDirectEventBlock onDeviceSetup;
@property (nonatomic, copy) RCTDirectEventBlock onTorchChanged;
@property (nonatomic, copy) RCTDirectEventBlock onPictureTaken;
@property (nonatomic, copy) RCTDirectEventBlock onPictureProcessed;
@property (nonatomic, copy) RCTDirectEventBlock onErrorProcessingImage;
@property (nonatomic, copy) RCTDirectEventBlock onRectangleDetected;
@property (nonatomic, assign) float capturedQuality;
- (void) capture;
- (void) startCamera;
- (void) stopCamera;
- (void) cleanup;
@end
================================================
FILE: ios/RNRectangleScannerView.m
================================================
//
// RNRectangleScannerView.m
//
// Created by Jake Humphrey on Jan 6, 2020.
// Copyright (c) 2020 Jake Humphrey. All rights reserved.
//
#import "RNRectangleScannerView.h"
/*!
Wraps up the camera and rectangle detection code into a simple interface. Allows you to call start, stop, cleanup, and capture. It is also responsible for determining how to cache the output images.
*/
@implementation RNRectangleScannerView
- (instancetype)init {
self = [super init];
if (self) {
[self setEnableBorderDetection:YES];
self._isStopped = TRUE;
self._cameraIsSetup = FALSE;
self.capturedQuality = 0.5;
}
return self;
}
/*!
Used to start the camera if it is stopped
*/
- (void) startCamera {
if ([self _isStopped]) {
if (![self _cameraIsSetup]) {
[self setupCameraView];
}
[self start];
}
}
/*!
Used to stop the camera if it is running
*/
- (void) stopCamera {
if (![self _isStopped]) [self stop];
}
/*!
Turns off the torch and stops the camera.
*/
- (void) cleanup {
[self setEnableTorch: NO];
[self stopCamera];
self._cameraIsSetup = NO;
}
/*!
Called after the camera and session are set up. This lets you check if a camera is found and permission is granted to use it.
*/
- (void)deviceWasSetup:(NSDictionary *)config {
[super deviceWasSetup:config];
if (self.onDeviceSetup) {
self.onDeviceSetup(config);
}
}
/*!
Called after the torch state is changed
*/
- (void)torchWasChanged:(BOOL)torchEnabled {
if (self.onTorchChanged) {
self.onTorchChanged(@{@"enabled": torchEnabled ? @TRUE : @FALSE});
}
}
/*!
Called when a rectangle is detected in the preview frame.
*/
- (void)rectangleWasDetected:(NSDictionary *)detection {
[super rectangleWasDetected:detection];
if (self.onRectangleDetected) {
self.onRectangleDetected(detection);
}
}
- (void)onErrorOfImageProcessor:(NSDictionary*) errorBody {
if (self.onErrorProcessingImage) {
self.onErrorProcessingImage(errorBody);
}
}
/*!
After capture, the image is stored and sent to the event handler
*/
-(void)onProcessedCapturedImage:(UIImage *)croppedImage initialImage: (UIImage *) initialImage lastRectangleFeature: (CIRectangleFeature *) lastRectangleFeature {
NSString *dir = [NSSearchPathForDirectoriesInDomains(NSCachesDirectory, NSUserDomainMask, YES) firstObject];
NSString *storageFolder = @"RNRectangleScanner";
dir = [dir stringByAppendingPathComponent:storageFolder];
NSFileManager *fileManager= [NSFileManager defaultManager];
NSError *error = nil;
if(![fileManager createDirectoryAtPath:dir withIntermediateDirectories:YES attributes:nil error:&error]) {
// An error has occurred, do something to handle it
NSLog(@"Failed to create directory \"%@\". Error: %@", dir, error);
[self onErrorOfImageProcessor:@{@"message": @"Failed to create the cache directory"}];
return;
}
NSString *croppedFilePath = [dir stringByAppendingPathComponent:[NSString stringWithFormat:@"C%i.jpeg",(int)[NSDate date].timeIntervalSince1970]];
NSString *initialFilePath = [dir stringByAppendingPathComponent:[NSString stringWithFormat:@"O%i.jpeg",(int)[NSDate date].timeIntervalSince1970]];
bool hasCroppedImage = (croppedImage != nil);
if (!hasCroppedImage) {
croppedFilePath = initialFilePath;
}
if (self.onPictureTaken) {
self.onPictureTaken(@{
@"croppedImage": croppedFilePath,
@"initialImage": initialFilePath
});
}
float quality = 0.5;
if (self.capturedQuality) {
quality = self.capturedQuality;
}
@autoreleasepool {
if (hasCroppedImage) {
NSData *croppedImageData = UIImageJPEGRepresentation(croppedImage, quality);
if (![croppedImageData writeToFile:croppedFilePath atomically:YES]) {
NSMutableDictionary *errorBody = [[NSMutableDictionary alloc] init];
[errorBody setValue:@"Failed to write cropped image to cache" forKey:@"message"];
[errorBody setValue:croppedFilePath forKey:@"filePath"];
[self onErrorOfImageProcessor:errorBody];
return;
}
}
NSData *initialImageData = UIImageJPEGRepresentation(initialImage, quality);
if (![initialImageData writeToFile:initialFilePath atomically:YES]) {
NSMutableDictionary *errorBody = [[NSMutableDictionary alloc] init];
[errorBody setValue:@"Failed to write original image to cache" forKey:@"message"];
[errorBody setValue:initialFilePath forKey:@"filePath"];
[self onErrorOfImageProcessor:errorBody];
return;
}
if (self.onPictureProcessed) {
self.onPictureProcessed(@{
@"croppedImage": croppedFilePath,
@"initialImage": initialFilePath
});
}
}
}
/*!
Captures the current frame and sends the processed image(s) to the on picture taken callback.
*/
- (void) capture {
[self captureImageLater];
}
@end
================================================
FILE: ios/RectangleDetectionController.h
================================================
//
// RectangleDetectionController.h
//
// Created by Jake Humphrey on Jan 6, 2020.
// Copyright (c) 2020 Jake Humphrey. All rights reserved.
//
#import <UIKit/UIKit.h>
#import "CameraDeviceController.h"
#import <React/RCTViewManager.h>
@interface RectangleDetectionController : CameraDeviceController
- (void)setupCameraView;
- (void)start;
- (void)stop;
- (void)rectangleWasDetected:(NSDictionary *)detection;
-(void)onProcessedCapturedImage:(UIImage *)croppedImage initialImage: (UIImage *) initialImage lastRectangleFeature: (CIRectangleFeature *) lastRectangleFeature;
- (void)handleCapturedImage:(CIImage *)capturedImage orientation:(UIImageOrientation)orientation;
@property (nonatomic,assign,getter=isBorderDetectionEnabled) BOOL enableBorderDetection;
- (CIImage *)processOutput:(CIImage *)image;
@property (nonatomic, assign) NSInteger detectionRefreshRateInMS;
@end
================================================
FILE: ios/RectangleDetectionController.m
================================================
//
// RectangleDetectionController.m
//
// Created by Jake Humphrey on Jan 6, 2020.
// Copyright (c) 2020 Jake Humphrey. All rights reserved.
//
#import "RectangleDetectionController.h"
#import <AVFoundation/AVFoundation.h>
#import <CoreMedia/CoreMedia.h>
#import <CoreVideo/CoreVideo.h>
#import <CoreImage/CoreImage.h>
#import <ImageIO/ImageIO.h>
@interface RectangleDetectionController ()
@property (nonatomic, assign) float lastDetectionRate;
@end
/*!
Takes the output from the camera device controller and attempts to detect rectangles from the output. On capture,
it will also crop the image.
*/
@implementation RectangleDetectionController
{
CGFloat _imageDedectionConfidence;
NSTimer *_borderDetectTimeKeeper;
BOOL _borderDetectFrame;
CIRectangleFeature *_borderDetectLastRectangleFeature;
CGRect _borderDetectLastRectangleBounds;
dispatch_queue_t _rectangleDetectionQueue;
}
- (instancetype)init {
self = [super init];
_rectangleDetectionQueue = dispatch_queue_create("RectangleDetectionQueue",NULL);
return self;
}
// MARK: Setters
/*!
Turns on the image detection. Once turned on, the next frame displayed on the preview layer will get scanned for a
rectangle, then border detection will turn off again.
*/
- (void)enableBorderDetectFrame
{
_borderDetectFrame = YES;
}
// MARK: Camera
/*!
Before setting up the camera, set the detection confidence to 0
*/
- (void)setupCameraView
{
_imageDedectionConfidence = 0.0;
[super setupCameraView];
}
/*!
Starts a capture sequence and starts a timer that will enable border detection at the set refresh rate.
*/
- (void)start
{
[super start];
float detectionRefreshRate = 20;
CGFloat detectionRefreshRateInSec = detectionRefreshRate/100;
_borderDetectTimeKeeper = [NSTimer scheduledTimerWithTimeInterval:detectionRefreshRateInSec target:self selector:@selector(enableBorderDetectFrame) userInfo:nil repeats:YES];
}
/*!
Stops the capture session and stops the timer
*/
- (void)stop
{
[super stop];
[_borderDetectTimeKeeper invalidate];
}
/*!
Runs on each frame as the image is pushed to the preview layer
*/
-(CIImage *)processOutput:(CIImage *)image
{
if (self.isBorderDetectionEnabled)
{
if (_borderDetectLastRectangleFeature) {
_imageDedectionConfidence += .5;
} else {
_imageDedectionConfidence = 0.0f;
}
if (_borderDetectFrame) {
[self detectRectangleFromImageLater:image];
_borderDetectFrame = NO;
}
}
return [super processOutput:image];
}
/*!
Asynchronously looks for a rectangle in the given image
*/
- (void)detectRectangleFromImageLater:(CIImage *)image {
dispatch_async(_rectangleDetectionQueue, ^{
@autoreleasepool {
@try {
// need to convert the CI image to a CG image before use, otherwise there can be some unexpected behaviour on some devices
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cgDetectionImage = [context createCGImage:image fromRect:image.extent];
CIImage *detectionImage = [CIImage imageWithCGImage:cgDetectionImage];
detectionImage = [detectionImage imageByApplyingOrientation:kCGImagePropertyOrientationLeft];
self->_borderDetectLastRectangleFeature = [self biggestRectangleInRectangles:[[self highAccuracyRectangleDetector] featuresInImage:detectionImage] image:detectionImage];
self->_borderDetectLastRectangleBounds = detectionImage.extent;
if (self->_borderDetectLastRectangleFeature) {
NSDictionary *rectangleCoordinates = [self computeRectangle:self->_borderDetectLastRectangleFeature forImage: detectionImage];
[self rectangleWasDetected:@{
@"detectedRectangle": rectangleCoordinates,
}];
} else {
[self rectangleWasDetected:@{
@"detectedRectangle": @FALSE,
}];
}
CGImageRelease(cgDetectionImage);
}
@catch (NSException * e) {
NSLog(@"Failed to parse image: %@", e);
}
}
});
}
- (void)rectangleWasDetected:(NSDictionary *)detection {}
// MARK: Capture
/*!
After an image is captured and cropped, this method is called
*/
-(void)onProcessedCapturedImage:(UIImage *)croppedImage initialImage: (UIImage *) initialImage lastRectangleFeature: (CIRectangleFeature *) lastRectangleFeature {
}
/*!
After an image is captured, this function is called and handles cropping the image
*/
-(void)handleCapturedImage:(CIImage *)capturedImage orientation: (UIImageOrientation) orientation{
if (self.isBorderDetectionEnabled && isRectangleDetectionConfidenceHighEnough(self->_imageDedectionConfidence) &&
self->_borderDetectLastRectangleFeature)
{
CIImage *croppedImage = [self correctPerspectiveForImage:capturedImage withFeatures:self->_borderDetectLastRectangleFeature fromBounds:self->_borderDetectLastRectangleBounds];
// need to convert the CI image to a CG image before use, otherwise there can be some unexpected behaviour on some devices
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef croppedref = [context createCGImage:croppedImage fromRect:croppedImage.extent];
UIImage *image = [UIImage imageWithCGImage:croppedref scale: 1.0 orientation:orientation];
CGImageRef capturedref = [context createCGImage:capturedImage fromRect:capturedImage.extent];
UIImage *initialImage = [UIImage imageWithCGImage:capturedref scale: 1.0 orientation:orientation];
[self onProcessedCapturedImage:image initialImage: initialImage lastRectangleFeature: self->_borderDetectLastRectangleFeature];
CGImageRelease(croppedref);
CGImageRelease(capturedref);
} else {
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef capturedref = [context createCGImage:capturedImage fromRect:capturedImage.extent];
UIImage *initialImage = [UIImage imageWithCGImage:capturedref scale: 1.0 orientation:orientation];
[self onProcessedCapturedImage:nil initialImage: initialImage lastRectangleFeature: nil];
CGImageRelease(capturedref);
}
}
/*!
Crops the image for the given coordinates, correcting its perspective.
*/
- (CIImage *)correctPerspectiveForImage:(CIImage *)image withFeatures:(CIRectangleFeature *)rectangleFeature fromBounds:(CGRect)bounds
{
float xScale = image.extent.size.width / bounds.size.width;
float yScale = image.extent.size.height / bounds.size.height;
NSMutableDictionary *rectangleCoordinates = [NSMutableDictionary new];
CGPoint newLeft = CGPointMake(rectangleFeature.topLeft.x * xScale, rectangleFeature.topLeft.y * yScale);
CGPoint newRight = CGPointMake(rectangleFeature.topRight.x * xScale, rectangleFeature.topRight.y * yScale);
CGPoint newBottomLeft = CGPointMake(rectangleFeature.bottomLeft.x * xScale, rectangleFeature.bottomLeft.y * yScale);
CGPoint newBottomRight = CGPointMake(rectangleFeature.bottomRight.x * xScale, rectangleFeature.bottomRight.y * yScale);
rectangleCoordinates[@"inputTopLeft"] = [CIVector vectorWithCGPoint:newLeft];
rectangleCoordinates[@"inputTopRight"] = [CIVector vectorWithCGPoint:newRight];
rectangleCoordinates[@"inputBottomLeft"] = [CIVector vectorWithCGPoint:newBottomLeft];
rectangleCoordinates[@"inputBottomRight"] = [CIVector vectorWithCGPoint:newBottomRight];
return [image imageByApplyingFilter:@"CIPerspectiveCorrection" withInputParameters:rectangleCoordinates];
}
// MARK: Rectangle Detection
/*!
Returns a shared rectangle detector that an image can be run through to find rectangles
*/
- (CIDetector *)highAccuracyRectangleDetector
{
static CIDetector *detector = nil;
static dispatch_once_t onceToken;
dispatch_once(&onceToken, ^
{
detector = [CIDetector detectorOfType:CIDetectorTypeRectangle context:nil options:@{CIDetectorAccuracy : CIDetectorAccuracyHigh, CIDetectorReturnSubFeatures: @(YES) }];
});
return detector;
}
/*!
Finds the best fitting rectangle from the list of rectangles found in the image
*/
- (CIRectangleFeature *)biggestRectangleInRectangles:(NSArray *)rectangles image:(CIImage *)image
{
if (![rectangles count]) return nil;
float halfPerimeterValue = 0;
CIRectangleFeature *biggestRectangle = [rectangles firstObject];
for (CIRectangleFeature *rect in rectangles) {
CGPoint p1 = rect.topLeft;
CGPoint p2 = rect.topRight;
CGFloat width = hypotf(p1.x - p2.x, p1.y - p2.y);
CGPoint p3 = rect.topLeft;
CGPoint p4 = rect.bottomLeft;
CGFloat height = hypotf(p3.x - p4.x, p3.y - p4.y);
CGFloat currentHalfPerimeterValue = height + width;
if (halfPerimeterValue < currentHalfPerimeterValue) {
halfPerimeterValue = currentHalfPerimeterValue;
biggestRectangle = rect;
}
}
return biggestRectangle;
}
/*!
Maps the coordinates to the correct orientation. This could potentially be cleaned up and removed if the orientation were set on the input image.
*/
- (NSDictionary *) computeRectangle: (CIRectangleFeature *) rectangle forImage: (CIImage *) image {
CGRect imageBounds = image.extent;
if (!rectangle) return nil;
return @{
@"bottomLeft": @{
@"y": @(rectangle.topLeft.x),
@"x": @(rectangle.topLeft.y)
},
@"bottomRight": @{
@"y": @(rectangle.topRight.x),
@"x": @(rectangle.topRight.y)
},
@"topLeft": @{
@"y": @(rectangle.bottomLeft.x),
@"x": @(rectangle.bottomLeft.y)
},
@"topRight": @{
@"y": @(rectangle.bottomRight.x),
@"x": @(rectangle.bottomRight.y)
},
@"dimensions": @{@"height": @(imageBounds.size.width), @"width": @(imageBounds.size.height)}
};
}
/*!
Checks if the confidence of the current rectangle is above a threshold. The higher the confidence, the more likely the rectangle is the desired object to be scanned.
*/
BOOL isRectangleDetectionConfidenceHighEnough(float confidence)
{
return (confidence > 1.0);
}
@end
================================================
FILE: package.json
================================================
{
"name": "react-native-rectangle-scanner",
"version": "1.2.0",
"description": "Scan documents, automatic border detection, automatic crop, image filters",
"main": "index.js",
"files": [
"/android",
"!/android/build",
"/ios",
"/src",
"/*.podspec",
"react-native.config.js"
],
"types": "src/index.d.ts",
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1"
},
"publishConfig": {
"registry": "https://registry.npmjs.org/"
},
"keywords": [
"react-native",
"react-native-component",
"react-native-scanner",
"react-native-document-scanner",
"scanner",
"document",
"ios",
"android",
"camera"
],
"repository": {
"type": "git",
"url": "https://github.com/HarvestProfit/react-native-rectangle-scanner"
},
"author": "Jake Humphrey (https://github.com/humphreyja)",
"license": "MIT",
"bugs": {
"url": "https://github.com/HarvestProfit/react-native-rectangle-scanner/issues"
},
"homepage": "https://github.com/HarvestProfit/react-native-rectangle-scanner#readme",
"devDependencies": {
"@babel/core": "^7.4.3",
"@babel/plugin-proposal-class-properties": "^7.4.0",
"@babel/plugin-transform-flow-strip-types": "^7.4.0",
"@babel/plugin-transform-runtime": "^7.4.3",
"babel-eslint": "^10.0.1",
"babel-plugin-inline-import": "^3.0.0",
"eslint": "^5.16.0",
"eslint-config-airbnb": "^17.1.0",
"eslint-plugin-import": "^2.16.0",
"eslint-plugin-jsx-a11y": "^6.2.1",
"eslint-plugin-react": "^7.12.4",
"prop-types": "^15.7.2",
"react": "16.8.3",
"react-dom": "16.8.3",
"react-native": "~0.59.8",
"react-native-svg": "^10.0.0"
},
"peerDependencies": {
"react": "*",
"react-native": "*",
"react-native-svg": "*"
},
"jest": {
"preset": "react-native",
"modulePathIgnorePatterns": [
"<rootDir>/example/node_modules",
"<rootDir>/lib/"
]
}
}
================================================
FILE: react-native.config.js
================================================
module.exports = {
dependency: {
platforms: {
android: {
packageImportPath: 'import com.rectanglescanner.RectangleScannerPackage;',
packageInstance: 'new RectangleScannerPackage()',
},
},
},
};
================================================
FILE: src/Filters.js
================================================
import { Platform } from 'react-native';
const PHOTO_FILTER = { id: 1, name: 'Photo' };
const GREYSCALE_FILTER = { id: 2, name: 'Greyscale' };
const COLOR_FILTER = { id: 3, name: 'Color' };
const BLACK_AND_WHITE_FILTER = { id: 4, name: 'Black & White' };
const RECOMMENDED_PLATFORM_FILTERS = [
COLOR_FILTER,
BLACK_AND_WHITE_FILTER,
];
let PLATFORM_DEFAULT_FILTER_ID = COLOR_FILTER.id;
// On Android the color and black and white are too similar to
// the original and greyscale to justify showing all 4 filters
if (Platform.OS === 'ios') {
RECOMMENDED_PLATFORM_FILTERS.push(GREYSCALE_FILTER);
RECOMMENDED_PLATFORM_FILTERS.push(PHOTO_FILTER);
PLATFORM_DEFAULT_FILTER_ID = PHOTO_FILTER.id;
}
export default {
PHOTO_FILTER,
GREYSCALE_FILTER,
COLOR_FILTER,
BLACK_AND_WHITE_FILTER,
RECOMMENDED_PLATFORM_FILTERS,
PLATFORM_DEFAULT_FILTER_ID,
};
================================================
FILE: src/FlashAnimation.js
================================================
import { PropTypes } from 'prop-types';
import React, { Component } from 'react';
import { Animated, StyleSheet } from 'react-native';
const styles = StyleSheet.create({
flashOverlay: {
flex: 1,
position: 'absolute',
top: 0,
bottom: 0,
right: 0,
left: 0,
backgroundColor: 'white',
},
});
export default class FlashAnimation extends Component {
static propTypes = {
overlayFlashOpacity: PropTypes.object.isRequired,
}
static triggerSnapAnimation(overlayFlashOpacity) {
Animated.sequence([
Animated.timing(overlayFlashOpacity, { toValue: 0.2, duration: 100 }),
Animated.timing(overlayFlashOpacity, { toValue: 0, duration: 50 }),
Animated.timing(overlayFlashOpacity, { toValue: 0.6, delay: 100, duration: 120 }),
Animated.timing(overlayFlashOpacity, { toValue: 0, duration: 90 }),
]).start();
}
render() {
return (
<Animated.View style={{ ...styles.flashOverlay, opacity: this.props.overlayFlashOpacity }} />
);
}
}
================================================
FILE: src/RectangleOverlay.js
================================================
import { PropTypes } from 'prop-types';
import React, { Component } from 'react';
import { Dimensions, View } from 'react-native';
import { Svg, Path } from 'react-native-svg';
function getDifferenceBetweenRectangles(firstRectangle, secondRectangle) {
const topRightXDiff = Math.abs(firstRectangle.topRight.x - secondRectangle.topRight.x);
const topRightYDiff = Math.abs(firstRectangle.topRight.y - secondRectangle.topRight.y);
const topLeftXDiff = Math.abs(firstRectangle.topLeft.x - secondRectangle.topLeft.x);
const topLeftYDiff = Math.abs(firstRectangle.topLeft.y - secondRectangle.topLeft.y);
const bottomRightXDiff = Math.abs(firstRectangle.bottomRight.x - secondRectangle.bottomRight.x);
const bottomRightYDiff = Math.abs(firstRectangle.bottomRight.y - secondRectangle.bottomRight.y);
const bottomLeftXDiff = Math.abs(firstRectangle.bottomLeft.x - secondRectangle.bottomLeft.x);
const bottomLeftYDiff = Math.abs(firstRectangle.bottomLeft.y - secondRectangle.bottomLeft.y);
return (
topRightXDiff + topRightYDiff
+ topLeftXDiff + topLeftYDiff
+ bottomRightXDiff + bottomRightYDiff
+ bottomLeftXDiff + bottomLeftYDiff
);
}
export default class RectangleOverlay extends Component {
static propTypes = {
// The rectangle from the scanner native component
detectedRectangle: PropTypes.oneOfType([PropTypes.shape({
topRight: PropTypes.shape({ x: PropTypes.number, y: PropTypes.number }),
topLeft: PropTypes.shape({ x: PropTypes.number, y: PropTypes.number }),
bottomRight: PropTypes.shape({ x: PropTypes.number, y: PropTypes.number }),
bottomLeft: PropTypes.shape({ x: PropTypes.number, y: PropTypes.number }),
dimensions: PropTypes.shape({ height: PropTypes.number, width: PropTypes.number }),
}), PropTypes.bool]),
// The preview ratio from the scanner native component (or just 100%x100%)
previewRatio: PropTypes.shape({
height: PropTypes.number,
width: PropTypes.number,
}),
backgroundColor: PropTypes.string, // The background fill of the rectangle overlay
borderColor: PropTypes.string, // The border color of the rectangle overlay
borderWidth: PropTypes.number, // The border width of the rectangle overlay
allowDetection: PropTypes.bool, // Enables comparing the current rectangle against the previous one
onDetectedCapture: PropTypes.func, // Called once a stable rectangle has been detected
detectedBackgroundColor: PropTypes.string, // Background fill of the overlay once the desired rectangle is detected
detectedBorderColor: PropTypes.string, // Border color of the overlay once the desired rectangle is detected
detectedBorderWidth: PropTypes.number, // Border width of the overlay once the desired rectangle is detected
rectangleDifferenceAllowance: PropTypes.number, // Maximum total point-to-point difference allowed between consecutive rectangles
detectionCountBeforeCapture: PropTypes.number, // Number of consecutive similar rectangles before onDetectedCapture is called
detectionCountBeforeUIChange: PropTypes.number, // Number of consecutive similar rectangles before the detected styles are applied
};
static defaultProps = {
rectangleDifferenceAllowance: 50,
detectionCountBeforeCapture: 8,
detectionCountBeforeUIChange: 3,
detectedRectangle: false,
backgroundColor: 'rgba(255,203,6, 0.3)',
borderColor: 'rgb(255,203,6)',
borderWidth: 5,
detectedBackgroundColor: null,
detectedBorderColor: null,
detectedBorderWidth: null,
previewRatio: {
height: 1,
width: 1,
},
onDetectedCapture: null,
allowDetection: false,
};
static getDerivedStateFromProps(newProps, oldState) {
if (newProps.allowDetection && newProps.detectedRectangle && oldState.lastRectangle) {
const newRectangle = newProps.detectedRectangle;
const oldRectangle = oldState.lastRectangle;
const diff = getDifferenceBetweenRectangles(newRectangle, oldRectangle);
let detectionCount = oldState.detectionCount + 1;
if (diff > newProps.rectangleDifferenceAllowance) detectionCount = 0;
return {
lastRectangle: newRectangle,
detectionCount,
};
}
return {
lastRectangle: newProps.detectedRectangle,
detectionCount: 0,
};
}
constructor(props) {
super(props);
this.state = {
lastRectangle: null,
detectionCount: 0,
};
}
componentDidUpdate() {
if (!this.props.onDetectedCapture) return;
if (!this.foundRectangle(this.props.detectionCountBeforeCapture)) return;
this.setState({
lastRectangle: null,
detectionCount: 0,
});
this.props.onDetectedCapture();
}
foundRectangle(detectionCount) {
if (this.state.detectionCount < detectionCount) return false;
return true;
}
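The state logic above can be modeled as a pure update rule: a frame within the allowance of the previous one increments the detection count, while a larger jump (or a missing previous rectangle) resets it to zero. A simplified sketch of that rule (it ignores the `allowDetection` flag, and all frame values are hypothetical):

```javascript
// Simplified sketch of the detection-count rule in getDerivedStateFromProps above.
const cornerDistance = (a, b) => ['topRight', 'topLeft', 'bottomRight', 'bottomLeft']
  .reduce((sum, c) => sum + Math.abs(a[c].x - b[c].x) + Math.abs(a[c].y - b[c].y), 0);

function nextDetectionState(state, rectangle, allowance = 50) {
  if (rectangle && state.lastRectangle) {
    const diff = cornerDistance(rectangle, state.lastRectangle);
    return {
      lastRectangle: rectangle,
      detectionCount: diff > allowance ? 0 : state.detectionCount + 1,
    };
  }
  // First frame, or no rectangle in view: start counting from scratch.
  return { lastRectangle: rectangle, detectionCount: 0 };
}

// Hypothetical frames: a 100x150 rectangle whose x coordinates drift by `dx`.
const corner = (x, y) => ({ x, y });
const rect = (dx) => ({
  topRight: corner(100 + dx, 0),
  topLeft: corner(dx, 0),
  bottomRight: corner(100 + dx, 150),
  bottomLeft: corner(dx, 150),
});

let stable = { lastRectangle: null, detectionCount: 0 };
[rect(0), rect(1), rect(2), rect(3)].forEach((r) => {
  stable = nextDetectionState(stable, r);
});
console.log(stable.detectionCount); // 3 -- three consecutive frames within the allowance

let jumpy = { lastRectangle: null, detectionCount: 0 };
[rect(0), rect(1), rect(2), rect(200)].forEach((r) => {
  jumpy = nextDetectionState(jumpy, r);
});
console.log(jumpy.detectionCount); // 0 -- the large jump (diff 792) reset the count
```

Once the count reaches `detectionCountBeforeUIChange` the detected styles kick in, and at `detectionCountBeforeCapture` the `onDetectedCapture` callback fires.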
render() {
const {
previewRatio,
detectedRectangle,
backgroundColor,
borderColor,
borderWidth,
detectedBackgroundColor,
detectedBorderColor,
detectedBorderWidth,
} = this.props;
if (!detectedRectangle) return null;
const {
topRight,
topLeft,
bottomRight,
bottomLeft,
dimensions,
} = detectedRectangle;
const deviceWindow = Dimensions.get('window');
const commands = [];
const plotCoordNode = (cmds, point, svgCMD) => { cmds.push(`${svgCMD}${point.x},${point.y} `); };
plotCoordNode(commands, topLeft, 'M');
plotCoordNode(commands, bottomLeft, 'L');
plotCoordNode(commands, bottomRight, 'L');
plotCoordNode(commands, topRight, 'L');
commands.push('Z');
const d = commands.join(' ');
let stroke = borderColor;
let fill = backgroundColor;
let strokeWidth = borderWidth;
// switch to the detected styles once enough similar rectangles have been seen
if (this.foundRectangle(this.props.detectionCountBeforeUIChange)) {
stroke = detectedBorderColor || borderColor;
fill = detectedBackgroundColor || backgroundColor;
strokeWidth = detectedBorderWidth || borderWidth;
}
return (
<View style={{ position: 'absolute', top: 0, bottom: 0, right: 0, left: 0, backgroundColor: 'rgba(0,0,0,0)' }}>
<Svg height={deviceWindow.height * previewRatio.height} width={deviceWindow.width * previewRatio.width} viewBox={`0 0 ${dimensions.width} ${dimensions.height}`}>
<Path
d={d}
style={{ fill, stroke, strokeWidth, strokeLinejoin: 'round', strokeLinecap: 'round' }}
/>
</Svg>
</View>
);
}
}
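The render method above strings the four corners into a single SVG path: `M` moves to the first corner, `L` draws a line to each remaining corner, and `Z` closes the shape. A standalone sketch of that path construction (corner values are hypothetical):

```javascript
// Sketch of the SVG path string built in RectangleOverlay's render() above.
function rectangleToPath({ topLeft, bottomLeft, bottomRight, topRight }) {
  const commands = [];
  const plot = (point, cmd) => commands.push(`${cmd}${point.x},${point.y} `);
  plot(topLeft, 'M'); // move to the first corner
  plot(bottomLeft, 'L'); // line to each remaining corner in order
  plot(bottomRight, 'L');
  plot(topRight, 'L');
  commands.push('Z'); // close the path back to the start
  return commands.join(' ');
}

const d = rectangleToPath({
  topLeft: { x: 10, y: 20 },
  bottomLeft: { x: 12, y: 180 },
  bottomRight: { x: 160, y: 175 },
  topRight: { x: 158, y: 22 },
});
console.log(d); // "M10,20  L12,180  L160,175  L158,22  Z"
```

The resulting string is handed to `react-native-svg`'s `Path` as its `d` prop, with the `Svg` viewBox set to the detection dimensions so the path scales to the preview.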
================================================
FILE: src/Scanner.js
================================================
import { PropTypes } from 'prop-types';
import React from 'react';
import {
NativeModules,
Platform,
requireNativeComponent,
PermissionsAndroid,
} from 'react-native';
const RNRectangleScanner = requireNativeComponent('RNRectangleScanner');
const CameraManager = NativeModules.RNRectangleScannerManager || {};
class Scanner extends React.Component {
static propTypes = {
onPictureTaken: PropTypes.func,
onPictureProcessed: PropTypes.func,
capturedQuality: PropTypes.number,
onDeviceSetup: PropTypes.func,
onRectangleDetected: PropTypes.func,
onTorchChanged: PropTypes.func,
onErrorProcessingImage: PropTypes.func,
androidPermission: PropTypes.oneOfType([PropTypes.bool, PropTypes.shape({
title: PropTypes.string,
message: PropTypes.string,
buttonNegative: PropTypes.string,
buttonPositive: PropTypes.string,
})]),
};
static defaultProps = {
onTorchChanged: null,
onPictureTaken: null,
onPictureProcessed: null,
onDeviceSetup: null,
onRectangleDetected: null,
onErrorProcessingImage: null,
capturedQuality: 0.5,
androidPermission: {
title: 'Permission to Access the Camera?',
message: 'Allows you to scan documents',
buttonNegative: "Don't Allow",
buttonPositive: 'OK',
},
}
componentDidMount() {
if (Platform.OS === 'android') {
this.askForAndroidCameraForPermission(this.start);
} else {
this.start();
}
}
componentWillUnmount() {
if (CameraManager.cleanup) CameraManager.cleanup();
}
getImageQuality() {
if (!this.props.capturedQuality) return 0.8;
if (this.props.capturedQuality > 1) return 1;
if (this.props.capturedQuality < 0.1) return 0.1;
return this.props.capturedQuality;
}
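The method above clamps `capturedQuality` into the range [0.1, 1], substituting 0.8 for any falsy value. A standalone sketch of that clamping:

```javascript
// Sketch of the capturedQuality clamping performed by getImageQuality() above.
function clampQuality(capturedQuality) {
  if (!capturedQuality) return 0.8; // falsy (unset or 0) falls back to 0.8
  if (capturedQuality > 1) return 1;
  if (capturedQuality < 0.1) return 0.1;
  return capturedQuality;
}

console.log(clampQuality(0.5)); // 0.5 (already in range)
console.log(clampQuality(2)); // 1
console.log(clampQuality(0.01)); // 0.1
console.log(clampQuality(null)); // 0.8
```

The clamped value is what actually reaches the native component, regardless of what the caller passed.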
sendOnPictureTakenEvent = (event) => {
if (!this.props.onPictureTaken) return null;
return this.props.onPictureTaken(event.nativeEvent);
}
sendOnPictureProcessedEvent = (event) => {
if (!this.props.onPictureProcessed) return null;
return this.props.onPictureProcessed(event.nativeEvent);
}
sendOnErrorProcessingImage = (event) => {
if (!this.props.onErrorProcessingImage) return null;
return this.props.onErrorProcessingImage(event.nativeEvent);
}
sendOnRectangleDetectedEvent = (event) => {
if (!this.props.onRectangleDetected) return null;
let detectionPayload = event.nativeEvent;
if (detectionPayload && detectionPayload.detectedRectangle === 0) {
detectionPayload = {
...detectionPayload,
detectedRectangle: false,
};
}
return this.props.onRectangleDetected(detectionPayload);
}
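The handler above normalizes the native payload: the native side can report `detectedRectangle` as `0` when no rectangle is in view, and that is rewritten to `false` before reaching the JS callback. A sketch of that normalization (the payload shape is assumed from the code above):

```javascript
// Sketch of the payload normalization in sendOnRectangleDetectedEvent above:
// a native detectedRectangle of 0 (nothing found) becomes `false`.
function normalizeDetection(payload) {
  if (payload && payload.detectedRectangle === 0) {
    return { ...payload, detectedRectangle: false };
  }
  return payload;
}

console.log(normalizeDetection({ detectedRectangle: 0 }).detectedRectangle); // false
console.log(normalizeDetection({ detectedRectangle: { topLeft: { x: 1, y: 2 } } }).detectedRectangle.topLeft.x); // 1
```

This is why consumers (such as `RectangleOverlay`) can treat `detectedRectangle` as either a rectangle shape or a boolean.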
sendOnDeviceSetupEvent = (event) => {
if (!this.props.onDeviceSetup) return null;
return this.props.onDeviceSetup(event.nativeEvent);
}
sendOnTorchChangedEvent = (event) => {
if (!this.props.onTorchChanged) return null;
return this.props.onTorchChanged(event.nativeEvent);
}
askForAndroidCameraForPermission = async (onComplete) => {
if (!this.props.androidPermission) {
if (onComplete) onComplete();
return;
}
try {
const granted = await PermissionsAndroid.request(
PermissionsAndroid.PERMISSIONS.CAMERA,
this.props.androidPermission,
);
if (granted === PermissionsAndroid.RESULTS.GRANTED) {
if (onComplete) onComplete();
} else {
this.sendOnDeviceSetupEvent({
nativeEvent: { permissionToUseCamera: false, hasCamera: true },
});
}
} catch (err) {
this.sendOnDeviceSetupEvent({
nativeEvent: {
permissionToUseCamera: false, hasCamera: false,
},
});
}
}
start = () => {
setTimeout(() => {
CameraManager.start();
}, 10);
}
// eslint-disable-next-line
capture() { CameraManager.capture(); }
// eslint-disable-next-line
refresh() { CameraManager.refresh(); }
// eslint-disable-next-line
focus() { CameraManager.focus(); }
render() {
return (
<RNRectangleScanner
{...this.props}
onPictureTaken={this.sendOnPictureTakenEvent}
onPictureProcessed={this.sendOnPictureProcessedEvent}
onErrorProcessingImage={this.sendOnErrorProcessingImage}
onRectangleDetected={this.sendOnRectangleDetectedEvent}
onDeviceSetup={this.sendOnDeviceSetupEvent}
onTorchChanged={this.sendOnTorchChangedEvent}
capturedQuality={this.getImageQuality()}
/>
);
}
}
export default Scanner;
================================================
FILE: src/index.d.ts
================================================
declare module 'react-native-rectangle-scanner' {
import { ComponentClass } from 'react';
import { ViewProps, Animated } from 'react-native';
export interface PictureCallbackProps {
croppedImage: string,
initialImage: string,
}
export interface DeviceSetupCallbackProps {
hasCamera: boolean,
permissionToUseCamera: boolean,
flashIsAvailable: boolean,
previewHeightPercent: number,
previewWidthPercent: number,
}
export interface Coordinate {
x: number,
y: number,
}
export interface DetectedRectangle {
bottomLeft: Coordinate,
bottomRight: Coordinate,
topLeft: Coordinate,
topRight: Coordinate,
dimensions: {
height: number,
width: number,
}
}
export interface TorchCallbackProps {
enabled: boolean
}
export interface Filter {
id: number,
name: string
}
export interface AndroidPermissionObject {
title: string,
message: string,
buttonNegative: string,
buttonPositive: string,
}
export interface ScannerComponentProps extends ViewProps {
onPictureTaken?: (args: PictureCallbackProps) => void,
onPictureProcessed?: (args: PictureCallbackProps) => void,
onDeviceSetup?: (args: DeviceSetupCallbackProps) => void,
onRectangleDetected?: (args: { detectedRectangle: DetectedRectangle | false }) => void,
onTorchChanged?: (args: TorchCallbackProps) => void,
onErrorProcessingImage?: (args: PictureCallbackProps) => void,
filterId?: number,
enableTorch?: boolean,
capturedQuality?: number,
styles?: object,
androidPermission?: AndroidPermissionObject | boolean,
}
export interface RectangleOverlayComponentProps extends ViewProps {
detectedRectangle?: DetectedRectangle | false,
previewRatio?: { height: number, width: number },
backgroundColor?: string,
borderColor?: string,
borderWidth?: number,
detectedBackgroundColor?: string,
detectedBorderColor?: string,
detectedBorderWidth?: number,
rectangleDifferenceAllowance?: number,
detectionCountBeforeCapture?: number,
detectionCountBeforeUIChange?: number,
allowDetection?: boolean,
onDetectedCapture?: () => void,
}
export interface FlashAnimationComponentProps extends ViewProps {
overlayFlashOpacity: Animated.Value,
}
const Scanner: ComponentClass<ScannerComponentProps>;
export const RectangleOverlay: ComponentClass<RectangleOverlayComponentProps>;
export const FlashAnimation: ComponentClass<FlashAnimationComponentProps>;
export const Filters: {
PHOTO_FILTER: Filter,
GREYSCALE_FILTER: Filter,
COLOR_FILTER: Filter,
BLACK_AND_WHITE_FILTER: Filter,
RECOMMENDED_PLATFORM_FILTERS: Filter[],
PLATFORM_DEFAULT_FILTER_ID: number,
}
export default Scanner;
}
"chars": 427,
"preview": "import { Dimensions, useWindowDimensions } from \"react-native\";\n\n// return true when the device goes into multi-tasking "
},
{
"path": "index.js",
"chars": 322,
"preview": "import Scanner from './src/Scanner';\nimport RectangleOverlay from './src/RectangleOverlay';\nimport Filters from './src/F"
},
{
"path": "ios/CameraDeviceController.h",
"chars": 1407,
"preview": "//\n// CameraDeviceController.h\n//\n// Created by Jake Humphrey on Jan 6, 2020.\n// Copyright (c) 2020 Jake Humphrey. Al"
},
{
"path": "ios/CameraDeviceController.m",
"chars": 21871,
"preview": "//\n// CameraDeviceController.m\n//\n// Created by Jake Humphrey on Jan 6, 2020.\n// Copyright (c) 2020 Jake Humphrey. Al"
},
{
"path": "ios/RNRectangleScanner.xcodeproj/project.pbxproj",
"chars": 21948,
"preview": "// !$*UTF8*$!\n{\n\tarchiveVersion = 1;\n\tclasses = {\n\t};\n\tobjectVersion = 51;\n\tobjects = {\n\n/* Begin PBXBuildFile section *"
},
{
"path": "ios/RNRectangleScanner.xcodeproj/xcshareddata/xcschemes/RNRectangleScanner.xcscheme",
"chars": 2437,
"preview": "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n<Scheme\n LastUpgradeVersion = \"1130\"\n version = \"1.3\">\n <BuildAction\n "
},
{
"path": "ios/RNRectangleScannerManager.h",
"chars": 293,
"preview": "//\n// RNRectangleScannerManager.h\n//\n// Created by Jake Humphrey on Jan 6, 2020.\n// Copyright (c) 2020 Jake Humphrey."
},
{
"path": "ios/RNRectangleScannerManager.m",
"chars": 2644,
"preview": "//\n// RNRectangleScannerManager.m\n//\n// Created by Jake Humphrey on Jan 6, 2020.\n// Copyright (c) 2020 Jake Humphrey."
},
{
"path": "ios/RNRectangleScannerView.h",
"chars": 825,
"preview": "//\n// RNRectangleScannerView.h\n//\n// Created by Jake Humphrey on Jan 6, 2020.\n// Copyright (c) 2020 Jake Humphrey. Al"
},
{
"path": "ios/RNRectangleScannerView.m",
"chars": 4904,
"preview": "//\n// RNRectangleScannerView.m\n//\n// Created by Jake Humphrey on Jan 6, 2020.\n// Copyright (c) 2020 Jake Humphrey. Al"
},
{
"path": "ios/RectangleDetectionController.h",
"chars": 887,
"preview": "//\n// RectangleDetectionController.h\n//\n// Created by Jake Humphrey on Jan 6, 2020.\n// Copyright (c) 2020 Jake Humphr"
},
{
"path": "ios/RectangleDetectionController.m",
"chars": 9872,
"preview": "//\n// RectangleDetectionController.m\n//\n// Created by Jake Humphrey on Jan 6, 2020.\n// Copyright (c) 2020 Jake Humphr"
},
{
"path": "package.json",
"chars": 1957,
"preview": "{\n \"name\": \"react-native-rectangle-scanner\",\n \"version\": \"1.2.0\",\n \"description\": \"Scan documents, automatic border d"
},
{
"path": "react-native.config.js",
"chars": 234,
"preview": "module.exports = {\n dependency: {\n platforms: {\n android: {\n packageImportPath: 'import com.rectanglesca"
},
{
"path": "src/Filters.js",
"chars": 864,
"preview": "import { Platform } from 'react-native';\n\nconst PHOTO_FILTER = { id: 1, name: 'Photo' };\nconst GREYSCALE_FILTER = { id: "
},
{
"path": "src/FlashAnimation.js",
"chars": 1011,
"preview": "import { PropTypes } from 'prop-types';\nimport React, { Component } from 'react';\nimport { Animated, StyleSheet } from '"
},
{
"path": "src/RectangleOverlay.js",
"chars": 6563,
"preview": "import { PropTypes } from 'prop-types';\nimport React, { Component } from 'react';\nimport { Dimensions, View } from 'reac"
},
{
"path": "src/Scanner.js",
"chars": 4562,
"preview": "import { PropTypes } from 'prop-types';\nimport React from 'react';\nimport {\n NativeModules,\n Platform,\n requireNative"
},
{
"path": "src/index.d.ts",
"chars": 2782,
"preview": "\ndeclare module 'react-native-rectangle-scanner' {\n import { ComponentClass } from 'react';\n import { ViewProps, Anima"
}
]
// ... and 1 more file (content not included)
About this extraction
This document contains the full source code of the HarvestProfit/react-native-rectangle-scanner GitHub repository, extracted and formatted as plain text: 54 files (176.7 KB), approximately 45.7k tokens, and a symbol index of 169 extracted functions, classes, methods, constants, and types.