Repository: janten/dpt-rp1-py
Branch: master
Commit: 4a64c2a1ffb4
Files: 17
Total size: 98.5 KB
Directory structure:
gitextract_3f7ws_ss/
├── .github/
│ └── workflows/
│ └── python-publish.yml
├── .gitignore
├── LICENSE
├── MANIFEST.in
├── README.md
├── docs/
│ └── linux-ethernet-over-usb.md
├── dptrp1/
│ ├── __init__.py
│ ├── cli/
│ │ ├── __init__.py
│ │ ├── dptmount.py
│ │ └── dptrp1.py
│ ├── dptrp1.py
│ └── pyDH.py
├── samples/
│ ├── wifi_2.5G.json
│ ├── wifi_5G.json
│ └── wifi_del_2.5G.json
├── setup.json
└── setup.py
================================================
FILE CONTENTS
================================================
================================================
FILE: .github/workflows/python-publish.yml
================================================
# This workflow will upload a Python Package using Twine when a release is created
# For more information see: https://help.github.com/en/actions/language-and-framework-guides/using-python-with-github-actions#publishing-to-package-registries
# This workflow uses actions that are not certified by GitHub.
# They are provided by a third-party and are governed by
# separate terms of service, privacy policy, and support
# documentation.
name: Upload Python Package
on:
workflow_dispatch:
release:
types: [published]
jobs:
deploy:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- name: Set up Python
uses: actions/setup-python@v2
with:
python-version: '3.x'
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install build
- name: Build package
run: python -m build
- name: Publish package
uses: pypa/gh-action-pypi-publish@27b31702a0e7fc50959f5ad993c78deac1bdfc29
with:
user: __token__
password: ${{ secrets.PYPI_API_TOKEN }}
================================================
FILE: .gitignore
================================================
certs/
# Created by https://www.gitignore.io/api/node,macos,python,virtualenv
### macOS ###
*.DS_Store
.AppleDouble
.LSOverride
# Icon must end with two \r
Icon
# Thumbnails
._*
# Files that might appear in the root of a volume
.DocumentRevisions-V100
.fseventsd
.Spotlight-V100
.TemporaryItems
.Trashes
.VolumeIcon.icns
.com.apple.timemachine.donotpresent
# Directories potentially created on remote AFP share
.AppleDB
.AppleDesktop
Network Trash Folder
Temporary Items
.apdisk
### Node ###
# Logs
logs
*.log
npm-debug.log*
yarn-debug.log*
yarn-error.log*
# Runtime data
pids
*.pid
*.seed
*.pid.lock
# Directory for instrumented libs generated by jscoverage/JSCover
lib-cov
# Coverage directory used by tools like istanbul
coverage
# nyc test coverage
.nyc_output
# Grunt intermediate storage (http://gruntjs.com/creating-plugins#storing-task-files)
.grunt
# Bower dependency directory (https://bower.io/)
bower_components
# node-waf configuration
.lock-wscript
# Compiled binary addons (http://nodejs.org/api/addons.html)
build/Release
# Dependency directories
node_modules/
jspm_packages/
# Typescript v1 declaration files
typings/
# Optional npm cache directory
.npm
# Optional eslint cache
.eslintcache
# Optional REPL history
.node_repl_history
# Output of 'npm pack'
*.tgz
# Yarn Integrity file
.yarn-integrity
# dotenv environment variables file
.env
### Python ###
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class
# C extensions
*.so
# Distribution / packaging
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
*.egg-info/
.installed.cfg
*.egg
# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec
# Installer logs
pip-log.txt
pip-delete-this-directory.txt
# Unit test / coverage reports
htmlcov/
.tox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
.hypothesis/
# Translations
*.mo
*.pot
# Django stuff:
local_settings.py
# Flask stuff:
instance/
.webassets-cache
# Scrapy stuff:
.scrapy
# Sphinx documentation
docs/_build/
# PyBuilder
target/
# Jupyter Notebook
.ipynb_checkpoints
# pyenv
.python-version
# celery beat schedule file
celerybeat-schedule
# SageMath parsed files
*.sage.py
# Environments
.venv
env/
venv/
ENV/
env.bak/
venv.bak/
# Spyder project settings
.spyderproject
.spyproject
# Rope project settings
.ropeproject
# mkdocs documentation
/site
# mypy
.mypy_cache/
### VirtualEnv ###
# Virtualenv
# http://iamzed.com/2009/05/07/a-primer-on-virtualenv/
# [Bb]in
[Ii]nclude
[Ll]ib
[Ll]ib64
[Ll]ocal
[Mm]an
[Ss]cripts
[Tt]cl
pyvenv.cfg
pip-selfcheck.json
# End of https://www.gitignore.io/api/node,macos,python,virtualenv
================================================
FILE: LICENSE
================================================
MIT License
Copyright (c) 2018 Jan-Gerd Tenberge
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
================================================
FILE: MANIFEST.in
================================================
include setup.json
================================================
FILE: README.md
================================================
# dpt-rp1-py
Python script to manage electronic paper devices made by Sony (Digital Paper, DPT-RP1, DPT-CP1) or Fujitsu (Quaderno) without the Digital Paper App. This repository includes a Python library and a command line utility to manage documents on the reader. Tested on Windows, Linux, and macOS. Should also work for Sony's other digital paper readers.
Throughout this document, _reader_ or _device_ refers to your Digital Paper device.
## Installation
We now have a proper Python package, so you may just run:
```
pip3 install dpt-rp1-py
```
Installing the package also installs the command line utilities `dptrp1` and `dptmount`. To install the library from source, clone this repository, then run `python3 setup.py install` or `pip3 install .` from the root directory. To install as a developer, use `python3 setup.py develop` (see [the setuptools docs](http://setuptools.readthedocs.io/en/latest/setuptools.html#development-mode)) and work on the source as usual.
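The same functionality is available programmatically. A minimal sketch, assuming a device that has already been registered and whose credentials sit at the default paths used by `dptrp1 register`; the IP address is a placeholder, and the `DigitalPaper(address, serial)` constructor call mirrors how `dptmount` creates the object:

```python
import os

def default_auth_paths():
    """Default locations where `dptrp1 register` stores the credentials."""
    return (
        os.path.expanduser("~/.config/dpt/deviceid.dat"),
        os.path.expanduser("~/.config/dpt/privatekey.dat"),
    )

def list_documents(addr="192.168.0.200"):
    # Imported lazily so default_auth_paths() works even without the package.
    from dptrp1.dptrp1 import DigitalPaper
    client_id_file, key_file = default_auth_paths()
    d = DigitalPaper(addr, None)  # (address, serial), as in dptmount.py
    with open(client_id_file) as fh:
        client_id = fh.readline().strip()
    with open(key_file, "rb") as fh:
        key = fh.read()
    d.authenticate(client_id, key)
    for entry in d.list_documents():
        print(entry["entry_path"])

if __name__ == "__main__":
    list_documents()
```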
## Using the command line utility
The command line utility requires a connection to the reader via WiFi, Bluetooth, or USB. The USB connection works on Windows and macOS but may require extra setup on Linux (see _docs/linux-ethernet-over-usb.md_).
To see if you can successfully connect to the reader, try the command `dptrp1 list-documents`. If you have Sony's Digital Paper App installed, this should work without any further configuration. If it fails, register your reader using `dptrp1 register`.
### Basic usage
Below are some basic usage examples for the utility. Text following a dollar sign is a command as entered on the command line on macOS or Linux. Your paths may look slightly different on Windows.
#### Registering the device
This command pairs the command line utility with your reader. You only need to run it once. Keep the device nearby; you will need to read a code from its display and enter it.
```
$ dptrp1 register
Discovering Digital Paper for 30 seconds…
Found Digital Paper with serial number 500XXXX
Cleaning up...
<Response [204]>
Requesting PIN...
Encoding nonce...
Please enter the PIN shown on the DPT-RP1:
```
#### Listing all documents on the device
```
$ dptrp1 list-documents
Document/Note/Graph_20171022.pdf
Document/Work/Scans/Contract.pdf
Document/Papers/svetachov2010.pdf
Document/Papers/sporns2012.pdf
```
#### Getting general usage instructions
```
$ dptrp1 -h
usage: dptrp1 [-h] [--client-id CLIENT_ID] [--key KEY] [--addr ADDR]
[--serial SERIAL] [--yes] [--quiet]
{copy-document,[...],wifi-scan}
[command_args [command_args ...]]
Remote control for Sony DPT-RP1
positional arguments:
{copy-document,[...],wifi-scan}
Command to run
command_args Arguments for the command
optional arguments:
-h, --help show this help message and exit
--client-id CLIENT_ID
File containing the device's client id
--key KEY File containing the device's private key
--addr ADDR Hostname or IP address of the device. Disables auto
discovery.
--serial SERIAL Device serial number for auto discovery. Auto
discovery only works for some minutes after the
Digital Paper's Wi-Fi setting is switched on.
--yes, -y Automatically answer yes to confirmation prompts, for
running non-interactively.
--quiet, -q Suppress informative messages.
```
#### Getting help for the upload command
```
$ dptrp1 help upload
Usage: dptrp1 upload <local_path> [<remote_path>]
Upload a local document to the reader.
Will upload to Document/ if only the local path is specified.
```
#### Uploading a document to the reader
```
$ dptrp1 upload ~/Desktop/scan.pdf
```
#### Opening the second page of a document on the reader
```
$ dptrp1 display-document Document/scan.pdf 2
```
#### Connecting to a WiFi network
This command requires the path to a WiFi configuration file as a parameter. Look at the [sample configuration](https://github.com/janten/dpt-rp1-py/blob/master/samples/wifi_2.5G.json) file and put your network name in the _ssid_ field and your password into the _passwd_ field. You can generally leave the other fields unchanged.
```
$ dptrp1 wifi-add config.json
```
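If you prefer to generate the configuration file instead of editing the sample by hand, the following sketch writes one with Python. The field names mirror the keyword arguments the CLI passes to `configure_wifi()`; the SSID and password are placeholders you must replace:

```python
import json

# Wi-Fi configuration for `dptrp1 wifi-add`; ssid and passwd are placeholders.
config = {
    "ssid": "MyNetwork",      # your network name
    "security": "psk",        # pre-shared key (WPA/WPA2)
    "passwd": "MyPassword",   # your network password
    "dhcp": "true",
    "static_address": "",
    "gateway": "",
    "network_mask": "",
    "dns1": "",
    "dns2": "",
    "proxy": "false",
}

with open("config.json", "w") as f:
    json.dump(config, f, indent=2)
```

Then pass the file to the utility with `dptrp1 wifi-add config.json`.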
### Supported commands
You can get a list of the implemented commands by running `dptrp1` with no additional arguments. The most important commands for everyday use are _register_, _help_, _upload_, _download_, and _sync_.
You can get additional information about a specific command by calling `dptrp1 help <command>`, e.g. `dptrp1 help sync`.
Note that the root path for DPT-RP1 is always `Document/`, which is misleadingly displayed as "System Storage" on the device. To download a document called _file.pdf_ from a folder called _Articles_ of the DPT-RP1, the correct command is `dptrp1 download Document/Articles/file.pdf`.
### Registering the DPT-RP1
The DPT-RP1 uses SSL encryption to communicate with the computer. This requires registering the DPT-RP1 with the computer, which yields two credentials: the client ID and the private key. If you have used Sony's Digital Paper App on the same computer, the utility will automatically try to use the existing credentials. If you do not have the Digital Paper App, use the _register_ command.
#### Registering without the Digital Paper App
If you want to use a WiFi connection, make sure that the reader and your computer are connected to the same WiFi network. Some versions of the DPT-RP1 do not allow you to connect to a WiFi network from the device itself. In this case, use Bluetooth or USB first to configure the WiFi network (using the _wifi-add_ command) or update the firmware (using _update-firmware_).
The tool can generally figure out the correct IP address of the device automatically, but you may also specify it with the `--addr <address>` option. If you're on WiFi, go to _Wi-Fi Settings_ on the device and tap the connected network to see the device's address. If you use a Bluetooth connection, it's likely _172.25.47.1_. You can also try the hostname _digitalpaper.local_. Use the _register_ command as shown below, substituting the IP address of the device.
```
dptrp1 --addr 10.0.0.1 register
```
If you get an error, wait a few seconds and try again. Sometimes it takes two or three tries to work.
## Mounting as a file system
This repository also contains a `dptmount` script to mount the Digital Paper as a userspace (FUSE) file system. This tool has additional requirements.
- On macOS, install osxfuse (e.g. with `brew cask install osxfuse`).
- On Linux, you may need to install libfuse.
### How to use
Create a YAML file with configuration details at _~/.config/dpt-rp1.conf_. You must specify either an address (with `addr`) or a device ID (with `serial`). All entries must be strings; in particular, the serial number must be wrapped in quotation marks.
```
dptrp1:
addr: 192.168.0.200
serial: "50040222"
client-id: ~/.config/dpt/deviceid.dat
key: ~/.config/dpt/privatekey.dat
```
If you registered with the `dptrp1 register` command, the client ID is stored at _$HOME/.config/dpt/deviceid.dat_ and the key at _$HOME/.config/dpt/privatekey.dat_. Mount the Digital Paper to a directory with `dptmount --config ~/.config/dpt-rp1.conf /mnt/mountpoint`.
#### Finding the private key and client ID on Windows
If you have already registered on Windows, the Digital Paper app stores the files in _Users/{username}/AppData/Roaming/Sony Corporation/Digital Paper App/_. You'll need the files _deviceid.dat_ and _privatekey.dat_.
#### Finding the private key and client ID on macOS
If you have already registered on macOS, the Digital Paper app stores the files in _$HOME/Library/Application Support/Sony Corporation/Digital Paper App/_. You'll need the files _deviceid.dat_ and _privatekey.dat_.
#### What works
* Reading files
* Moving files (both rename and move to different folder)
* Uploading new files
* Deleting files and folders
#### What does not work
* There is currently no caching, so operations can be slow: every read or write has to download from or upload to the device. On the other hand, this avoids having to resolve conflicts when a document has changed both on the Digital Paper and in a local cache directory.
================================================
FILE: docs/linux-ethernet-over-usb.md
================================================
# Accessing the DPT-RP1 over USB in Linux
To use the DPT-RP1 through the USB cable, you need to perform two steps:
1. Switch the USB mode for DPT-RP1 to Ethernet-over-USB.
2. Determine the IPv6 link-local address for `digitalpaper.local` using mDNS.
## Switching the USB mode to Ethernet-over-USB
When the DPT-RP1 is plugged into a USB port, it appears as a USB CDC ACM device (i.e. a serial port), usually at `/dev/ttyACM0`.
By sending a sequence of bytes to this serial port, the DPT-RP1 mode can be switched to Ethernet-over-USB.
The DPT-RP1 supports two protocols for Ethernet-over-USB: remote NDIS (RNDIS) for Windows machines, and USB CDC/ECM for Macs. Linux supports both of these modes.
You only need to enable one of these modes.
### Activating RNDIS mode
To activate RNDIS mode, send the following Python byte sequence to `/dev/ttyACM0` using [pyserial](https://pythonhosted.org/pyserial/) for example.
b"\x01\x00\x00\x01\x00\x00\x00\x01\x00\x04"
Check the output of `dmesg` to verify this worked:
rndis_host 2-1:1.0 usb0: register 'rndis_host' at usb-0000:00:14.0-1, RNDIS device, xx:xx:xx:xx:xx:xx
where `xx:xx:xx:xx:xx:xx` is the Ethernet address for the DPT-RP1.
### Activating CDC/ECM mode
To activate CDC/ECM mode, send the following alternative Python byte sequence:
b"\x01\x00\x00\x01\x00\x00\x00\x01\x01\x04"
The `dmesg` command will show:
cdc_ether 2-1:1.0 usb0: register 'cdc_ether' at usb-0000:00:14.0-1, CDC Ethernet Device, xx:xx:xx:xx:xx:xx
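The two byte sequences above can be sent with [pyserial](https://pythonhosted.org/pyserial/), for example. A minimal sketch; the device path `/dev/ttyACM0` matches the text but may differ on your system, and pyserial must be installed (`pip3 install pyserial`):

```python
# Mode-switch byte sequences from the sections above.
RNDIS = b"\x01\x00\x00\x01\x00\x00\x00\x01\x00\x04"    # remote NDIS (Windows-style)
CDC_ECM = b"\x01\x00\x00\x01\x00\x00\x00\x01\x01\x04"  # USB CDC/ECM (Mac-style)

def switch_usb_mode(sequence, port="/dev/ttyACM0"):
    """Write one of the sequences to the DPT-RP1's serial device."""
    import serial  # imported here so the constants are usable without pyserial
    with serial.Serial(port) as ser:
        ser.write(sequence)

if __name__ == "__main__":
    switch_usb_mode(RNDIS)  # or switch_usb_mode(CDC_ECM)
```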
## Deactivating DHCP on the new Ethernet device
If you're using DHCP to obtain addresses, you should disable it for the DPT-RP1, since the DPT-RP1 does not run a DHCP server.
For example, if you're using Network Manager, change the IPv4 settings on the DPT-RP1 Ethernet device to 'Link-Local Only' instead of 'Automatic'. This will assign your end of the Ethernet link an IPv4 link-local address in the 169.254.0.0/16 range.
When using Network Manager, also make sure that in the 'Ethernet' tab, 'device' is set to the interface name, not the MAC address. This will help Network Manager to restore the settings when connecting next time.
## Determining the address for DPT-RP1
The DPT-RP1 uses an IPv6 link-local address when in Ethernet-over-USB mode. You can determine this address with an mDNS resolver such as `avahi`.
$ avahi-resolve -n digitalpaper.local
digitalpaper.local fe80::xxxx:xxxx:xxxx:xxxx
Although this returns the IPv6 link-local address, at least on my system, this address is incomplete. IPv6 link-local addresses need a scope identifier which identifies the network interface (i.e. link). On my system, the DPT-RP1 Ethernet device appears as `usb0` (from the output of `ifconfig`), and therefore the full address is:
fe80::xxxx:xxxx:xxxx:xxxx%usb0
The full URI for the DPT-RP1 would be:
https://[fe80::xxxx:xxxx:xxxx:xxxx%usb0]:8443/...
This syntax is accepted by urllib3 v1.22 and above.
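A small helper can assemble that URI from the resolved address and the interface name. A sketch; the address and interface below are placeholders, and the bracket-plus-scope syntax follows the example above:

```python
def device_uri(link_local, interface, port=8443):
    """Build the base HTTPS URI for a device at an IPv6 link-local address.

    `interface` is the scope identifier, i.e. the network interface name
    (e.g. usb0) that carries the link.
    """
    return "https://[{}%{}]:{}".format(link_local, interface, port)

# Example with a placeholder address:
print(device_uri("fe80::1", "usb0"))  # https://[fe80::1%usb0]:8443
```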
# Accessing the Fujitsu Quaderno Gen 2 over USB in Linux
The instructions in this guide also work for the second-generation Quaderno, with the exception of the last part, finding the IPv6 link-local address: instead of `digitalpaper.local`, the Quaderno's name is `Android.local`.
$ avahi-resolve -n Android.local
Android.local fe80::xxxx:xxxx:xxxx:xxxx
Another way to find the device IP address if you don't know the name is to run `avahi-browse`:
$ avahi-browse -avr
= usb0 IPv6 Digital Paper FMVDP41 _dp_fujitsu._tcp local
hostname = [Android.local]
address = [fe80::xxxx:xxxx:xxxx:xxxx]
port = [8080]
txt = []
================================================
FILE: dptrp1/__init__.py
================================================
================================================
FILE: dptrp1/cli/__init__.py
================================================
================================================
FILE: dptrp1/cli/dptmount.py
================================================
#!/usr/bin/env python3
"""
Usage
-----
> dptmount /mnt/mymountpoint
Config file
------------
A simple yaml such as
> dptrp1:
> client-id: ~/.config/dpt/deviceid.dat
> key: ~/.config/dpt/privatekey.dat
> addr: 192.168.0.200
Todo
----
* Main thing is to allow for writing/uploading
* Also, a reasonable and robust caching is needed
* Rename/Move should be possible in the near future
Author
------
Juan Grigera <juan@grigera.com.ar>
upload functionality by Jochen Schroeder <cycomanic@gmail.com>
"""
# debian-dependency: python3-fusepy
# pip3 install fusepy
import os
import sys
import errno
import time
import calendar
import yaml
import io
from errno import ENOENT, EACCES
from stat import S_IFDIR, S_IFLNK, S_IFREG
import logging
logger = logging.getLogger("dptmount")
try:
from fuse import FUSE, FuseOSError, Operations, LoggingMixIn
except ModuleNotFoundError:
from fusepy import FUSE, FuseOSError, Operations, LoggingMixIn
from dptrp1.dptrp1 import DigitalPaper, find_auth_files
import anytree
class FileHandle(object):
def __init__(self, fs, local_path, new=False):
self.fs = fs
dpath, fname = os.path.split(local_path)
self.parent = self.fs._map_local_remote(dpath)
self.remote_path = os.path.join(self.parent.remote_path, fname)
if new:
self.status = "clean"
else:
node = self.fs._map_local_remote(local_path)
assert self.remote_path == node.item['entry_path']
self.status = "unread"
self.data = bytearray()
def read(self, length, offset):
if self.status == "unread":
logger.info('Downloading %s', self.remote_path)
self.status = "clean"
self.data = self.fs.dpt.download(self.remote_path)
return self.data[offset:offset + length]
def write(self, buf, offset):
self.status = "dirty"
# Only overwrite len(buf) bytes; self.data[offset:] = buf would truncate the rest
self.data[offset:offset + len(buf)] = buf
return len(buf)
def flush(self):
if self.status != "dirty":
return
stream = io.BytesIO(self.data)
self.fs.dpt.upload(stream, self.remote_path)
# XXX do we sometimes need to remove an old node?
self.fs._add_remote_path_to_tree(self.parent, self.remote_path) # TBI
self.status = "clean"
class DptTablet(LoggingMixIn, Operations):
def __init__(
self,
dpt_ip_address=None,
dpt_serial_number=None,
dpt_key=None,
dpt_client_id=None,
uid=None,
gid=None,
):
self.dpt_ip_address = dpt_ip_address
self.dpt_serial_number = dpt_serial_number
self.dpt_key = os.path.expanduser(dpt_key)
self.dpt_client_id = os.path.expanduser(dpt_client_id)
self.uid = uid
self.gid = gid
self.__authenticate__()
# Create root node
self.__init_empty_tree()
# Cache this for the session
logger.info("Loading initial document list")
self._load_document_list()
logger.debug(anytree.RenderTree(self.root))
self.handle = {}
self.files = {}
self.fd = 0
def __init_empty_tree(self):
# Create root node
self.now = time.time()
self.root = anytree.Node('Document', item = None, localpath='/',
remote_path="Document",
lstat=dict(st_mode=(S_IFDIR | 0o755),
st_ctime=self.now,
st_mtime=self.now,
st_atime=self.now,
st_nlink=2), )
def __authenticate__(self):
self.dpt = DigitalPaper(self.dpt_ip_address, self.dpt_serial_number)
with open(self.dpt_client_id) as fh:
client_id = fh.readline().strip()
with open(self.dpt_key, "rb") as fh:
key = fh.read()
self.dpt.authenticate(client_id, key)
def _remove_node(self, node):
node.parent = None
del node
def _add_node_to_tree(self, parent, item):
return anytree.Node(
item["entry_name"],
parent=parent,
item=item,
remote_path=item["entry_path"],
lstat=self._get_lstat(item),
localpath=os.path.join(parent.localpath, item["entry_name"]),
)
def _add_remote_path_to_tree(self, parent, remote_path):
item = self.dpt._resolve_object_by_path(remote_path)
return self._add_node_to_tree(parent, item)
def _load_document_list(self):
# TODO maybe some smarter caching?
self._recurse_load_document_list(self.root)
def _recurse_load_document_list(self, parent):
parentnodepath = "/".join([str(node.name) for node in parent.path])
for item in self.dpt.list_objects_in_folder(parentnodepath):
node = self._add_node_to_tree(parent, item)
if item["entry_type"] == "folder":
self._recurse_load_document_list(node)
def _get_lstat(self, item):
if "reading_date" in item:
atime = calendar.timegm(
time.strptime(item["reading_date"], "%Y-%m-%dT%H:%M:%SZ")
)
else:
# access time = now if never read...
atime = self.now
lstat = dict(
st_atime=atime,
st_gid=self.gid,
st_uid=self.uid,
st_ctime=calendar.timegm(
time.strptime(item["created_date"], "%Y-%m-%dT%H:%M:%SZ")
),
)
# usual thing for directories is st_link keeps number of subdirectories
if item["entry_type"] == "folder":
lstat["st_nlink"] = 2
# todo: increment nlink in parent dir
lstat["st_mode"] = S_IFDIR | 0o755
lstat["st_mtime"] = self.now
else:
lstat["st_mode"] = S_IFREG | 0o644
lstat["st_mtime"] = calendar.timegm(
time.strptime(item["modified_date"], "%Y-%m-%dT%H:%M:%SZ")
)
lstat["st_nlink"] = 1
lstat["st_size"] = int(item["file_size"])
#'st_inot': item['entry_id'], 'entry_id': 'fe13e1df-1cfe-4fe3-9e83-3e12e78b8a47',
# 'entry_name': '10.1017.pdf', 'entry_path': 'Document/10.1017.pdf', 'entry_type': 'document',
# 'file_revision': 'a21ea4b1c368.2.0',
# 'is_new': 'false', 'mime_type': 'application/pdf',
# 'title': 'untitled', 'total_page': '4'}
return lstat
def _map_local_remote(self, full_local):
return anytree.search.find(
self.root, filter_=lambda node: node.localpath == full_local
)
def _is_read_only_flags(self, flags):
# from pcachefs
access_flags = os.O_RDONLY | os.O_WRONLY | os.O_RDWR
return flags & access_flags == os.O_RDONLY
# Filesystem methods
# ==================
def chmod(self, path, mode):
# TODO: should support chown/chmod
return 0
def chown(self, path, uid, gid):
# TODO: should support chown/chmod
return 0
def getattr(self, path, fh=None):
if path in self.files:
return self.files[path]
node = self._map_local_remote(path)
if node is None:
raise FuseOSError(ENOENT)
return node.lstat
def readdir(self, path, fh):
node = self._map_local_remote(path)
entries = node.children
dirents = [".", ".."]
dirents.extend([e.name for e in entries])
logger.debug(dirents)
return dirents
def unlink(self, path):
node = self._map_local_remote(path)
remote_path = node.remote_path
data = self.dpt.delete_document(node.remote_path)
self._remove_node(node)
return 0
# Directory creation
# ============
def rmdir(self, path):
node = self._map_local_remote(path)
self.dpt.delete_folder(node.remote_path)
self._remove_node(node)
return 0
def mkdir(self, path, mode):
ppath, dirname = os.path.split(path)
parent = self._map_local_remote(ppath)
remote_path = os.path.join(parent.remote_path, dirname)
self.dpt.new_folder(remote_path)
node = self._add_remote_path_to_tree(parent, remote_path)
return 0
# File methods
# ============
def open(self, path, flags):
if not self._is_read_only_flags(flags):
raise FuseOSError(EACCES)
self.fd += 1
self.handle[self.fd] = FileHandle(self, path, new=False)
logger.info('file handle %d opened' % self.fd)
return self.fd
def release(self, path, fh):
# TODO: something is going wrong with releasing the file handles for new created docs
logger.info("file handle %d closed" % fh)
node = self._map_local_remote(path)
del self.handle[fh]
return 0
def read(self, path, length, offset, fh):
return self.handle[fh].read(length, offset)
def rename(self, oldpath, newpath):
old_node = self._map_local_remote(oldpath)
new_folder, fname = os.path.split(newpath)
new_folder_node = self._map_local_remote(new_folder)
newpath = os.path.join(new_folder_node.remote_path, fname)
self.dpt.rename_document(old_node.remote_path, newpath)
self._remove_node(old_node)
self._add_remote_path_to_tree(new_folder_node, newpath)
def create(self, path, mode, fi=None):
#TODO: check if files is necessary
logger.debug("create path {}".format(path))
self.files[path] = dict(
st_mode=(S_IFREG | mode),
st_nlink=1,
st_size=0,
st_ctime=time.time(),
st_mtime=time.time(),
st_atime=time.time(),
)
self.fd += 1
self.handle[self.fd] = FileHandle(self, path, new=True)
return self.fd
def write(self, path, buf, offset, fh):
return self.handle[fh].write(buf, offset)
def flush(self, path, fh):
self.handle[fh].flush()
self.files.pop(path, None)
YAML_CONFIG_PATH = os.path.expanduser("~/.config/dpt-rp1.conf")
def main():
import argparse
parser = argparse.ArgumentParser()
parser.add_argument("mountpoint")
parser.add_argument(
"--config",
default=YAML_CONFIG_PATH,
help="config file, default is %s" % YAML_CONFIG_PATH,
)
parser.add_argument("--verbose", action="store_true", help="Enable verbose logging")
parser.add_argument(
"--logfile",
default=False,
help="Log to a file (default: log to standard output)",
)
parser.add_argument("--big_writes", default=True, help="Enable the FUSE big_writes option")
args = parser.parse_args()
kwarg = ["big_writes"]
kwargs = {}
for k in kwarg:
kwargs[k] = getattr(args, k)
# Set up logging
if args.logfile is False:
logging.basicConfig()
else:
logging.basicConfig(filename=args.logfile)
if args.verbose:
logging.getLogger().setLevel(logging.DEBUG)
else:
logging.getLogger().setLevel(logging.INFO)
# Read YAML config if found
if os.path.isfile(args.config):
config = yaml.safe_load(open(args.config, "r"))
else:
print("Config file not found")
sys.exit(-1)
# config
dpt_client_id, dpt_key = find_auth_files()
cfgargs = config["dptrp1"]
params = dict(
dpt_ip_address=cfgargs.get("addr", None),
dpt_serial_number=cfgargs.get("serial", None),
dpt_client_id=cfgargs.get("client-id", dpt_client_id),
dpt_key=cfgargs.get("key", dpt_key),
uid=os.getuid(),
gid=os.getgid(),
)
tablet = DptTablet(**params)
fuse = FUSE(
tablet,
args.mountpoint,
foreground=True,
allow_other=False,
nothreads=True,
**kwargs
)
if __name__ == "__main__":
main()
================================================
FILE: dptrp1/cli/dptrp1.py
================================================
#!/usr/bin/env python3
# coding=utf-8
import argparse
import inspect
import json
import sys
import os
import re
from pathlib import Path
from dptrp1.dptrp1 import DigitalPaper, find_auth_files, get_default_auth_files
ROOT_FOLDER = 'Document'
def do_screenshot(d, filename):
"""
Take a screenshot of the device's screen and save it to the given local path.
"""
pic = d.take_screenshot()
with open(filename, "wb") as f:
f.write(pic)
def do_list_templates(d):
data = d.list_templates()
for d in data:
print(d["template_name"])
def do_list_documents(d):
data = d.list_documents()
for d in data:
print(d["entry_path"])
def do_list_folders(d, *remote_paths):
data = d.list_all()
for d in data:
if d["entry_type"] == "folder":
print(d["entry_path"] + "/")
def do_move_document(d, old_path, new_path):
d.move_file(old_path, new_path)
def do_copy_document(d, old_path, new_path):
d.copy_file(old_path, new_path)
def do_upload(d, local_path, remote_path=""):
"""
Upload a local document to the reader.
Will upload to Document/ if only the local path is specified.
"""
if not remote_path:
remote_path = ROOT_FOLDER + "/" + os.path.basename(local_path)
d.upload_file(local_path, add_prefix(remote_path))
def do_upload_template(d, local_path, template_name=''):
"""
Upload a local document as a template for the reader.
The template name will be set as the file name if
only the local path is specified.
"""
if not template_name:
template_name = os.path.basename(local_path)
with open(local_path, 'rb') as f:
d.upload_template(f, template_name)
def do_download(d, remote_path, local_path):
"""
Download a document from the reader to your computer.
"""
data = d.download(remote_path)
if os.path.isdir(local_path):
# re.sub returns a new string, so its result must be used; join handles the separator
local_path = os.path.join(local_path, os.path.basename(remote_path))
with open(local_path, "wb") as f:
f.write(data)
def do_list_document_info(d, remote_path=''):
"""
Print metadata about a document on the device.
If no path is given, information is printed for every document on the device.
"""
if not remote_path:
infos = d.list_all()
for info in infos:
print(info["entry_path"])
for key in info:
print(" - " + key + ": " + info[key])
else:
info = d.list_document_info(add_prefix(remote_path))
print(info["entry_path"])
for key in info:
print(" - " + key + ": " + info[key])
def do_display_document(d, remote_path, page=1):
"""
Displays the given document on the reader.
The path must be a valid path on the device.
To display a local document, upload it first.
Optionally pass a page number to open a specific page, number 1 being the front page.
Will show the first page if the page parameter is omitted.
Example: dptrp1 display-document Document/Magazines/Comic.pdf 5
"""
info = d.list_document_info(add_prefix(remote_path))
d.display_document(info["entry_id"], page)
def do_update_firmware(d, local_path):
with open(local_path, "rb") as fwfh:
d.update_firmware(fwfh)
def add_prefix(remote_path: str) -> str:
return remote_path if remote_path.startswith(ROOT_FOLDER) else f'{ROOT_FOLDER}/{remote_path}'
def do_delete_document(d, remote_path):
d.delete_document(add_prefix(remote_path))
def do_delete_template(d, remote_path):
d.delete_template(remote_path)
def do_delete_folder(d, remote_path):
d.delete_folder(add_prefix(remote_path))
def do_sync(d, local_path, remote_path="Document"):
"""
Synchronize all PDF documents between a local path (on your PC) and a
remote path (on the DPT). Older documents will be overwritten by newer ones
without any additional warning. Also synchronizes the time and date on the
reader to the computer's time and date.
Example: dptrp1 sync ~/Dropbox/Papers Document/Papers
"""
d.sync(local_path, remote_path)
def do_new_folder(d, remote_path):
d.new_folder(add_prefix(remote_path))
def do_wifi_list(d):
data = d.wifi_list()
print(json.dumps(data, indent=2))
def do_wifi_scan(d):
data = d.wifi_scan()
print(json.dumps(data, indent=2))
def do_wifi(d):
print(d.wifi_enabled()["value"])
def do_wifi_enable(d):
print(d.enable_wifi())
def do_wifi_disable(d):
print(d.disable_wifi())
def do_add_wifi(d, cfg_file=""):
try:
cfg = json.load(open(cfg_file))
except json.JSONDecodeError:
quit("JSONDecodeError: Check the contents of %s" % cfg_file)
except FileNotFoundError:
quit("File Not Found: %s" % cfg_file)
if not cfg:
print(
d.configure_wifi(
ssid="vecna2",
security="psk",
passwd="elijah is a cat",
dhcp="true",
static_address="",
gateway="",
network_mask="",
dns1="",
dns2="",
proxy="false",
)
)
else:
print(d.configure_wifi(**cfg))
def do_delete_wifi(d, cfg_file=""):
    try:
        with open(cfg_file) as f:
            cfg = json.load(f)
    except JSONDecodeError:
        quit("JSONDecodeError: Check the contents of %s" % cfg_file)
    except FileNotFoundError:
        quit("File Not Found: %s" % cfg_file)
    if not cfg:
        # No configuration file given: fall back to hard-coded example values.
        print(d.delete_wifi(ssid="vecna2", security="psk"))
else:
print(d.delete_wifi(**cfg))
def do_register(d, key_file, id_file):
_, key, device_id = d.register()
with open(key_file, "w") as f:
f.write(key)
with open(id_file, "w") as f:
f.write(device_id)
def format_parameter(parameter):
desc = ""
if parameter.default != inspect.Parameter.empty:
desc += "["
desc += "<{}>".format(parameter.name)
if parameter.default != inspect.Parameter.empty:
desc += " = " + str(parameter.default) + "]"
return desc
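format_parameter renders each optional command argument in brackets together with its default value; required arguments are shown bare. A quick self-contained demonstration (the demo handler below is hypothetical):

```python
import inspect

def format_parameter(parameter):
    # Wrap optional parameters in brackets and append their default value.
    desc = ""
    if parameter.default != inspect.Parameter.empty:
        desc += "["
    desc += "<{}>".format(parameter.name)
    if parameter.default != inspect.Parameter.empty:
        desc += " = " + str(parameter.default) + "]"
    return desc

def demo(d, remote_path, page=1):  # hypothetical command handler
    pass

# The first parameter (the device handle) is skipped, as in do_help().
params = list(inspect.signature(demo).parameters.values())
print([format_parameter(p) for p in params[1:]])
# → ['<remote_path>', '[<page> = 1]']
```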
def do_help(command):
"""
Print additional information about a command, if available.
"""
    try:
        args = list(inspect.signature(commands[command]).parameters.values())
        args = [format_parameter(x) for x in args[1:]]
        print()
        print(" Usage:", sys.argv[0], command, *args)
    except (KeyError, ValueError, TypeError):
        # Signature introspection is best-effort; fall through to the docstring.
        pass
print(commands[command].__doc__)
def do_get_config(d, path):
"""
Saves the current device configuration to the given path.
The configuration will be saved as a JSON file compatible with the set-configuration command.
"""
config = d.get_config()
with open(path, "w") as file:
json.dump(config, file, indent=4, sort_keys=True)
def do_set_config(d, path):
"""
Reads the JSON-encoded configuration file and applies the configuration to the device.
Use get-configuration first to read the current configuration.
"""
with open(path) as file:
config = json.load(file)
d.set_config(config)
commands = {
"screenshot": do_screenshot,
"list-documents": do_list_documents,
    "list-templates": do_list_templates,
"document-info": do_list_document_info,
"upload": do_upload,
    "upload-template": do_upload_template,
"download": do_download,
"delete": do_delete_document,
"delete-folder": do_delete_folder,
"delete-template": do_delete_template,
"new-folder": do_new_folder,
"move-document": do_move_document,
"copy-document": do_copy_document,
"list-folders": do_list_folders,
"wifi-list": do_wifi_list,
"wifi-scan": do_wifi_scan,
"wifi-add": do_add_wifi,
"wifi-del": do_delete_wifi,
"wifi": do_wifi,
"wifi-enable": do_wifi_enable,
"wifi-disable": do_wifi_disable,
"register": do_register,
"update-firmware": do_update_firmware,
"sync": do_sync,
"help": do_help,
"display-document": do_display_document,
"get-configuration": do_get_config,
"set-configuration": do_set_config,
}
def build_parser():
p = argparse.ArgumentParser(description="Remote control for Sony DPT-RP1")
p.add_argument(
"--client-id", help="File containing the device's client id", default=None
)
p.add_argument(
"--key", help="File containing the device's private key", default=None
)
p.add_argument(
"--addr",
help="Hostname or IP address of the device. Disables auto discovery.",
default=None,
)
p.add_argument(
"--serial",
        help="Device serial number for auto discovery. Auto discovery only works for a few minutes after the Digital Paper's Wi-Fi setting is switched on.",
default=None,
)
p.add_argument(
"--yes",
"-y",
help="Automatically answer yes to confirmation prompts, for running non-interactively.",
action="store_true",
dest="assume_yes",
default=False,
)
p.add_argument(
"--quiet",
"-q",
help="Suppress informative messages.",
action="store_true",
dest="quiet",
default=False,
)
p.add_argument("command", help="Command to run", choices=sorted(commands.keys()))
p.add_argument("command_args", help="Arguments for the command", nargs="*")
return p
def main():
args = build_parser().parse_args()
    if args.command == "help":
# Help is available without a device
commands[args.command](*args.command_args)
return
dp = DigitalPaper(
addr=args.addr, id=args.serial, assume_yes=args.assume_yes, quiet=args.quiet
)
if args.command == "register":
# When registering the device, we default to storing auth files in our own configuration directory
default_deviceid, default_privatekey = get_default_auth_files()
do_register(
dp, args.key or default_privatekey, args.client_id or default_deviceid
)
return
# When connecting to a device, we default to looking for auth files in
# both our own configuration directory and in Sony's paths
found_deviceid, found_privatekey = find_auth_files()
if not args.key:
args.key = found_privatekey
if not args.client_id:
args.client_id = found_deviceid
if not os.path.exists(args.key) or not os.path.exists(args.client_id):
print("Could not read device identifier and private key.")
print("Please use command 'register' first:")
print()
print(" {} register".format(sys.argv[0]))
print()
exit(1)
with open(args.client_id) as fh:
client_id = fh.readline().strip()
with open(args.key, "rb") as fh:
key = fh.read()
dp.authenticate(client_id, key)
try:
commands[args.command](dp, *args.command_args)
except Exception as e:
        print("An error occurred:", e, file=sys.stderr)
print("For help, call:", sys.argv[0], "help", args.command)
sys.exit(1)
if __name__ == "__main__":
main()
================================================
FILE: dptrp1/dptrp1.py
================================================
#!/usr/bin/env python3
import os
import sys
import uuid
import time
import base64
import httpsig
import urllib3
import requests
import functools
import unicodedata
import pickle
import shutil
from tqdm import tqdm
from glob import glob
from urllib.parse import quote_plus
from dptrp1.pyDH import DiffieHellman
from datetime import datetime, timezone
from pbkdf2 import PBKDF2
from Crypto.Hash import SHA256
from Crypto.Hash.HMAC import HMAC
from Crypto.Cipher import AES
from Crypto.PublicKey import RSA
from pathlib import Path
from collections import defaultdict
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)
def get_default_auth_files():
"""Get the default path where the authentication files for connecting to DPT-RP1 are stored"""
config_path = os.path.join(os.path.expanduser("~"), ".config", "dpt")
os.makedirs(config_path, exist_ok=True)
deviceid = os.path.join(config_path, "deviceid.dat")
privatekey = os.path.join(config_path, "privatekey.dat")
return deviceid, privatekey
def find_auth_files():
"""Search for authentication files for connecting to DPT-RP1, both in default path and in paths from Sony's Digital Paper App"""
deviceid, privatekey = get_default_auth_files()
if not os.path.exists(deviceid) or not os.path.exists(privatekey):
# Could not find our own auth-files. Let's see if we can find any auth files created by Sony's Digital Paper App
search_paths = [
os.path.join(
os.path.expanduser("~"),
"Library/Application Support/Sony Corporation/Digital Paper App",
), # Mac
os.path.join(
os.path.expanduser("~"),
"AppData/Roaming/Sony Corporation/Digital Paper App",
), # Windows
]
for path in search_paths:
# Recursively look for deviceid.dat and privatekey.dat in any sub-folders of the search paths
deviceid_matches = glob(
os.path.join(path, "**/deviceid.dat"), recursive=True
)
privatekey_matches = glob(
os.path.join(path, "**/privatekey.dat"), recursive=True
)
if deviceid_matches and privatekey_matches:
# Found a match. Selecting the first file from each for now.
# This might not be correct if the user has several devices with their own keys. Should ideally be configurable
deviceid = deviceid_matches[0]
privatekey = privatekey_matches[0]
break
return deviceid, privatekey
class DigitalPaperException(Exception):
pass
class ResolveObjectFailed(DigitalPaperException):
pass
class LookUpDPT:
def __init__(self, quiet=False):
import threading
self.addr = None
self.id = None
self.lock = threading.Lock()
self.quiet = quiet
def update_service(self, zeroconf, service_type, name):
pass
def add_service(self, zeroconf, type, name):
info = zeroconf.get_service_info(type, name)
import ipaddress
addr = ipaddress.IPv4Address(info.addresses[0])
info = requests.get(
"http://{}:{}/register/information".format(addr, info.port)
).json()
if not self.id:
self.id = info["serial_number"]
if not self.quiet:
print("Found Digital Paper with serial number {}".format(self.id))
print("To discover only this specific device, call:")
print()
print(
" {} --serial {} {}".format(
sys.argv[0], self.id, " ".join(sys.argv[1:])
)
)
print()
if info["serial_number"] == self.id:
self.addr = str(addr)
self.lock.release()
def find(self, id, timeout=30):
from zeroconf import ServiceBrowser, Zeroconf
if not self.quiet:
print("Discovering Digital Paper for {} seconds…".format(timeout))
sys.stdout.flush()
self.id = id
zc = Zeroconf()
self.lock.acquire()
ServiceBrowser(zc, ["_digitalpaper._tcp.local.", "_dp_fujitsu._tcp.local."], self)
wait = self.lock.acquire(timeout=timeout) or (self.addr is not None)
zc.close()
if not wait:
            print("Failed to discover a device within {} seconds".format(timeout))
return None
else:
if not self.quiet:
print("Found digital paper at", self.addr)
print("To skip the discovery process (and this message), call:")
print()
print(
" {} --addr {} {}".format(
sys.argv[0], self.addr, " ".join(sys.argv[1:])
)
)
print()
return self.addr
class DigitalPaper:
def __init__(self, addr=None, id=None, assume_yes=False, quiet=False):
if addr:
self.addr = addr
if id:
print(
"Ignoring serial number since address is set. Remove --serial {} from call to silence this message.".format(
id
)
)
else:
lookup = LookUpDPT(quiet=quiet)
self.addr = lookup.find(id)
        self.session = requests.Session()
        self.session.verify = False  # disable SSL certificate verification
        self.assume_yes = assume_yes  # whether to disable interactive prompts (currently only in sync())
        self.folder_list = []  # cache used by list_folders()
@property
def base_url(self):
if self.addr and ":" in self.addr and self.addr[0] != "[":
port = ""
else:
port = ":8443"
return "https://" + self.addr + port
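The port selection in base_url above appends the default port 8443 unless the address already carries one. The same logic as a free function, exercised in isolation:

```python
def base_url(addr):
    # Append the default port 8443 unless the address already contains a port
    # (a colon that is not part of a bracketed IPv6 literal).
    if addr and ":" in addr and addr[0] != "[":
        port = ""
    else:
        port = ":8443"
    return "https://" + addr + port

print(base_url("192.168.0.107"))      # → https://192.168.0.107:8443
print(base_url("192.168.0.107:443"))  # → https://192.168.0.107:443
print(base_url("[fe80::1]"))          # → https://[fe80::1]:8443
```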
### Authentication
def register(self):
"""
Gets authentication info from a DPT-RP1. You can call this BEFORE
DigitalPaper.authenticate()
Returns (ca, priv_key, client_id):
- ca: a PEM-encoded X.509 server certificate, issued by the CA
on the device
- priv_key: a PEM-encoded 2048-bit RSA private key
- client_id: the client id
"""
register_pin_url = "/register/pin"
register_hash_url = "/register/hash"
register_ca_url = "/register/ca"
register_url = "/register"
register_cleanup_url = "/register/cleanup"
print("Cleaning up...")
r = self._reg_endpoint_request("PUT", register_cleanup_url)
print(r)
print("Requesting PIN...")
r = self._reg_endpoint_request("POST", register_pin_url)
m1 = r.json()
n1 = base64.b64decode(m1["a"])
mac = base64.b64decode(m1["b"])
yb = base64.b64decode(m1["c"])
yb = int.from_bytes(yb, "big")
n2 = os.urandom(16) # random nonce
dh = DiffieHellman()
ya = dh.gen_public_key()
ya = b"\x00" + ya.to_bytes(256, "big")
zz = dh.gen_shared_key(yb)
zz = zz.to_bytes(256, "big")
yb = yb.to_bytes(256, "big")
derivedKey = PBKDF2(
passphrase=zz, salt=n1 + mac + n2, iterations=10000, digestmodule=SHA256
).read(48)
authKey = derivedKey[:32]
keyWrapKey = derivedKey[32:]
hmac = HMAC(authKey, digestmod=SHA256)
hmac.update(n1 + mac + yb + n1 + n2 + mac + ya)
m2hmac = hmac.digest()
m2 = dict(
a=base64.b64encode(n1).decode("utf-8"),
b=base64.b64encode(n2).decode("utf-8"),
c=base64.b64encode(mac).decode("utf-8"),
d=base64.b64encode(ya).decode("utf-8"),
e=base64.b64encode(m2hmac).decode("utf-8"),
)
print("Encoding nonce...")
r = self._reg_endpoint_request("POST", register_hash_url, data=m2)
m3 = r.json()
if base64.b64decode(m3.get("a", "")) != n2:
print("Nonce N2 doesn't match")
return
eHash = base64.b64decode(m3["b"])
m3hmac = base64.b64decode(m3["e"])
hmac = HMAC(authKey, digestmod=SHA256)
hmac.update(n1 + n2 + mac + ya + m2hmac + n2 + eHash)
if m3hmac != hmac.digest():
print("M3 HMAC doesn't match")
return
pin = input("Please enter the PIN shown on the DPT-RP1: ")
hmac = HMAC(authKey, digestmod=SHA256)
hmac.update(pin.encode())
psk = hmac.digest()
rs = os.urandom(16) # random nonce
hmac = HMAC(authKey, digestmod=SHA256)
hmac.update(rs + psk + yb + ya)
rHash = hmac.digest()
wrappedRs = wrap(rs, authKey, keyWrapKey)
hmac = HMAC(authKey, digestmod=SHA256)
hmac.update(n2 + eHash + m3hmac + n1 + rHash + wrappedRs)
m4hmac = hmac.digest()
m4 = dict(
a=base64.b64encode(n1).decode("utf-8"),
b=base64.b64encode(rHash).decode("utf-8"),
d=base64.b64encode(wrappedRs).decode("utf-8"),
e=base64.b64encode(m4hmac).decode("utf-8"),
)
print("Getting certificate from device CA...")
r = self._reg_endpoint_request("POST", register_ca_url, data=m4)
print(r)
m5 = r.json()
if base64.b64decode(m5["a"]) != n2:
print("Nonce N2 doesn't match")
return
wrappedEsCert = base64.b64decode(m5["d"])
m5hmac = base64.b64decode(m5["e"])
hmac = HMAC(authKey, digestmod=SHA256)
hmac.update(n1 + rHash + wrappedRs + m4hmac + n2 + wrappedEsCert)
if hmac.digest() != m5hmac:
print("HMAC doesn't match!")
return
esCert = unwrap(wrappedEsCert, authKey, keyWrapKey)
es = esCert[:16]
cert = esCert[16:]
hmac = HMAC(authKey, digestmod=SHA256)
hmac.update(es + psk + yb + ya)
if hmac.digest() != eHash:
print("eHash does not match!")
return
# print("Certificate: ")
# print(cert)
print("Generating RSA2048 keys")
new_key = RSA.generate(2048, e=65537)
# with open("key.pem", 'wb') as f:
# f.write(new_key.exportKey("PEM"))
keyPubC = new_key.publickey().exportKey("PEM")
selfDeviceId = str(uuid.uuid4())
print("Device ID: " + selfDeviceId)
selfDeviceId = selfDeviceId.encode()
# with open("client_id.txt", 'wb') as f:
# f.write(selfDeviceId)
wrappedDIDKPUBC = wrap(selfDeviceId + keyPubC, authKey, keyWrapKey)
hmac = HMAC(authKey, digestmod=SHA256)
hmac.update(n2 + wrappedEsCert + m5hmac + n1 + wrappedDIDKPUBC)
m6hmac = hmac.digest()
m6 = dict(
a=base64.b64encode(n1).decode("utf-8"),
d=base64.b64encode(wrappedDIDKPUBC).decode("utf-8"),
e=base64.b64encode(m6hmac).decode("utf-8"),
)
print("Registering device...")
r = self._reg_endpoint_request("POST", register_url, data=m6)
print(r)
print("Cleaning up...")
r = self._reg_endpoint_request("PUT", register_cleanup_url)
print(r)
return (
cert.decode("utf-8"),
new_key.exportKey("PEM").decode("utf-8"),
selfDeviceId.decode("utf-8"),
)
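The key-derivation step in register() stretches the Diffie-Hellman shared secret into 48 bytes with PBKDF2-HMAC-SHA256 and splits the result into a 32-byte authentication key and a 16-byte key-wrap key. The same derivation can be sketched with the standard library's hashlib; the inputs below are made-up placeholders, not real protocol values:

```python
import hashlib

# Hypothetical stand-ins for the Diffie-Hellman shared secret zz and the
# n1 + mac + n2 salt exchanged during registration.
zz = b"\x01" * 256
salt = b"\x02" * 48

# PBKDF2-HMAC-SHA256, 10000 iterations, 48 bytes of output; equivalent to
# PBKDF2(passphrase=zz, salt=salt, iterations=10000, digestmodule=SHA256).read(48)
derived = hashlib.pbkdf2_hmac("sha256", zz, salt, 10000, dklen=48)
authKey, keyWrapKey = derived[:32], derived[32:]
print(len(authKey), len(keyWrapKey))  # → 32 16
```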
def authenticate(self, client_id, key):
sig_maker = httpsig.Signer(secret=key, algorithm="rsa-sha256")
nonce = self._get_nonce(client_id)
signed_nonce = sig_maker.sign(nonce)
data = {"client_id": client_id, "nonce_signed": signed_nonce}
r = self._put_endpoint("/auth", data=data)
# cookiejar cannot parse the cookie format used by the tablet,
# so we have to set it manually.
_, credentials = r.headers["Set-Cookie"].split("; ")[0].split("=")
self.session.cookies["Credentials"] = credentials
return r
### File management
def list_templates(self):
data = self._get_endpoint('/viewer/configs/note_templates').json()
return data['template_list']
def list_documents(self):
data = self.traverse_folder_recursively("Document")
return data
def list_all(self):
data = self._get_endpoint("/documents2?entry_type=all").json()
return data["entry_list"]
def list_objects_in_folder(self, remote_path):
remote_id = self._get_object_id(remote_path)
entries = self.list_folder_entries_by_id(remote_id)
return entries
def list_folder_entries_by_id(self, folder_id):
response = self._get_endpoint(f"/folders/{folder_id}/entries")
return response.json()["entry_list"]
def traverse_folder(self, remote_path, fields=[]):
# In most cases, the request overhead of traversing folders is larger than the overhead of
# requesting all info. So let's just request all info and filter for remote_path on our side
if fields:
field_query = "&fields=" + ",".join(fields)
else:
field_query = ""
        entry_data = self._get_endpoint(
            "/documents2?entry_type=all" + field_query
        ).json()
if entry_data.get("count") != len(entry_data.get("entry_list", [])):
# The device seems to not want to return more than 1300 items in the entry_list, meaning that we will miss entries if the device
        # has more files/folders than this. Luckily, it can easily be detected by comparing the number of entries with the count.
# Perhaps there is some way to request the remaining entries from the same endpoint through some form of pagination,
# but we do not know how. Let's fall back to the slower recursive traversal
print("Warning: Fast folder traversal did not work. Falling back to slower, recursive folder traversal.")
return self.traverse_folder_recursively(remote_path)
all_entries = entry_data["entry_list"]
return list(
filter(lambda e: e["entry_path"].startswith(remote_path), all_entries)
)
def traverse_folder_recursively(self, remote_path):
# This is the old recursive implementation of traverse_folder.
# It is slower because the main overhead when communicating with the DPT-RP1 is the request latency,
# and this recursive implementation makes one request per folder. However, the faster implementation
# above fails when there are more than 1300 items, in which case we fall back to this older implementation
        def traverse(obj):
            if obj["entry_type"] == "document":
                return [obj]
            children = self._get_endpoint(
                "/folders/{remote_id}/entries2".format(remote_id=obj["entry_id"])
            ).json()["entry_list"]
            return [obj] + functools.reduce(
                lambda acc, c: traverse(c) + acc, children[::-1], []
            )
return traverse(self._resolve_object_by_path(remote_path))
def list_document_info(self, remote_path):
remote_info = self._resolve_object_by_path(remote_path)
return remote_info
def download(self, remote_path):
remote_id = self._get_object_id(remote_path)
path = "/documents/{remote_id}/file".format(remote_id=remote_id)
response = self._get_endpoint(path)
return response.content
def delete_document(self, remote_path):
try:
remote_id = self._get_object_id(remote_path)
except ResolveObjectFailed as e:
# Path not found
return
self.delete_document_by_id(remote_id)
    def delete_template(self, template_name):
        template_list = self.list_templates()
        for t in template_list:
            if t["template_name"] == template_name:
                remote_id = t["note_template_id"]
                self.delete_template_by_id(remote_id)
def display_document(self, document_id, page=1):
info = {"document_id": document_id, "page": page}
r = self._put_endpoint("/viewer/controls/open2", data=info)
def delete_folder(self, remote_path):
try:
remote_id = self._get_object_id(remote_path)
except ResolveObjectFailed as e:
# Path not found
return
self.delete_folder_by_id(remote_id)
def delete_document_by_id(self, doc_id):
self._delete_endpoint(f"/documents/{doc_id}")
def delete_folder_by_id(self, folder_id):
self._delete_endpoint(f"/folders/{folder_id}")
def delete_template_by_id(self, template_id):
self._delete_endpoint(f"/viewer/configs/note_templates/{template_id}")
def upload_template(self, fh, remote_path):
filename = os.path.basename(remote_path)
info = {
"templateName": filename,
"document_source": ""
}
r = self._post_endpoint("/viewer/configs/note_templates", data=info)
doc = r.json()
doc_url = "/viewer/configs/note_templates/{}/file".format(doc["note_template_id"])
        # requests sends the third tuple element as the part's content type
        files = {"file": (quote_plus(filename), fh, "application/octet-stream")}
self._put_endpoint(doc_url, files=files)
def upload(self, fh, remote_path):
filename = os.path.basename(remote_path)
try:
# If there exists a document in the specified remote path, overwrite it.
doc_id = self._get_object_id(remote_path)
except ResolveObjectFailed as e:
remote_directory = os.path.dirname(remote_path)
self.new_folder(remote_directory)
directory_id = self._get_object_id(remote_directory)
info = {
"file_name": filename,
"parent_folder_id": directory_id,
"document_source": "",
}
r = self._post_endpoint("/documents2", data=info)
doc = r.json()
doc_id = doc["document_id"]
doc_url = "/documents/{doc_id}/file".format(doc_id=doc_id)
        # requests sends the third tuple element as the part's content type
        files = {"file": (quote_plus(filename), fh, "application/octet-stream")}
self._put_endpoint(doc_url, files=files)
def new_folder(self, remote_path):
folder_name = os.path.basename(remote_path)
remote_directory = os.path.dirname(remote_path)
if not remote_directory:
return
if not self.path_exists(remote_directory):
self.new_folder(remote_directory)
directory_id = self._get_object_id(remote_directory)
info = {"folder_name": folder_name, "parent_folder_id": directory_id}
r = self._post_endpoint("/folders2", data=info)
def list_folders(self):
if not self.folder_list:
data = self.list_all()
for d in data:
if d["entry_type"] == "folder":
self.folder_list.append(d["entry_path"])
return self.folder_list
def download_file(self, remote_path, local_path):
local_folder = os.path.dirname(local_path)
# Make sure that local_folder exists so that we can write data there.
# If local_path is just a filename, local_folder will be '', and
# we won't need to create any directories.
if local_folder != "":
os.makedirs(os.path.dirname(local_path), exist_ok=True)
data = self.download(remote_path)
with open(local_path, "wb") as f:
f.write(data)
def upload_file(self, local_path, remote_path):
if self.path_is_folder(remote_path):
local_filename = os.path.basename(local_path)
remote_path = os.path.join(remote_path, local_filename)
with open(local_path, "rb") as f:
self.upload(f, remote_path)
def path_is_folder(self, remote_path):
remote_filename = os.path.basename(remote_path)
if not remote_filename:
# Always a folder if path ends in slash.
# Folder may not exist in this case!
return True
try:
remote_obj = self._resolve_object_by_path(remote_path)
if remote_obj["entry_type"] == "folder":
return True
except ResolveObjectFailed:
pass
return False
def path_exists(self, remote_path):
try:
remote_id = self._get_object_id(remote_path)
except ResolveObjectFailed as e:
return False
return True
def sync(self, local_folder, remote_folder):
checkpoint_info = self.load_checkpoint(local_folder)
self.set_datetime()
self.new_folder(remote_folder)
print("Looking for changes on device... ", end="", flush=True)
remote_info = self.traverse_folder_recursively(remote_folder)
print("done")
        # Syncing will require different comparisons between local and remote paths.
        # Let's normalize them to ensure stable comparisons,
# both with respect to unicode normalization and with respect to
# directory separator symbols.
def normalize_path(path):
return unicodedata.normalize("NFC", path).replace(os.sep, "/")
# Create a defaultdict of defaultdict
# so that we can save data to it with two indexes, without having to manually create
# nested dictionaries.
# Will contain:
# file_data[<filename>][<checkpoint/remote/local>_time] = <timestamp>
file_data = defaultdict(lambda: defaultdict(lambda: None))
# When it comes to folders, we want to handle them separately and not care about creation/deletion.
# We therefore use a slightly different data structure:
# folder_data[<filename>][checkpoint/remote/local_exists] = True/False
folder_data = defaultdict(lambda: defaultdict(lambda: False))
        # Then we will go through changes locally, remotely, and in checkpoint, and save all modification times to the same
# data structure for easy comparison, and the same with folders.
# The checkpoint and remote_info contain the same data-structure, because the checkpoint is simply a dump of
# remote_info at a previous point in time. Therefore, we use the same code to look through both of them:
for location_info, location in [
(checkpoint_info, "checkpoint"),
(remote_info, "remote"),
]:
for f in location_info:
path = normalize_path(f["entry_path"])
if path.startswith(remote_folder):
if f["entry_type"] == "document":
modification_time = datetime.strptime(
f["modified_date"], "%Y-%m-%dT%H:%M:%SZ"
)
file_data[path][f"{location}_time"] = modification_time
elif f["entry_type"] == "folder":
folder_data[path][f"{location}_exists"] = True
print("Looking for local changes... ", end="", flush=True)
# Recursively traverse the local path looking for PDF files.
# Use relatively low-level os.scandir()-api instead of a higher-level api such as glob.glob()
# because os.scandir() gives access to mtime without having to perform an additional syscall on Windows,
# leading to much faster scanning times on Windows
def traverse_local_folder(path):
# Let's store to folder_data that this folder exists
relative_path = Path(path).relative_to(local_folder)
remote_path = normalize_path(
(Path(remote_folder) / relative_path).as_posix()
)
folder_data[remote_path]["local_exists"] = True
# And recursively go through all items inside of the folder
for entry in os.scandir(path):
if entry.is_dir():
traverse_local_folder(entry.path)
# Only handle PDF files, ignore files starting with a dot.
elif entry.name.lower().endswith(".pdf") and not entry.name.startswith(
"."
):
relative_path = Path(entry.path).relative_to(local_folder)
remote_path = normalize_path(
(Path(remote_folder) / relative_path).as_posix()
)
modification_time = datetime.utcfromtimestamp(entry.stat().st_mtime)
file_data[remote_path]["local_time"] = modification_time
traverse_local_folder(local_folder)
print("done")
# Let's loop through the data structure
# to create list of actions to take
to_download = []
to_delete_local = []
to_upload = []
to_delete_remote = []
missing_checkpoint_files = []
for filename, data in file_data.items():
if data["checkpoint_time"] is None:
if data["remote_time"] and data["local_time"]:
# File exists both on device and locally, but not in checkpoint.
# Corrupt or missing checkpoint?
# The safest bet is to assume that the two files are identical, and not sync in either directions.
missing_checkpoint_files.append(filename)
continue
if data["remote_time"]:
# File only exists on remote, so it's new and should be downloaded
to_download.append(filename)
continue
if data["local_time"]:
                # File only exists locally, so it's new and should be uploaded
to_upload.append(filename)
continue
# If we get to here, file exists in checkpoint
modified_local = (
data["local_time"] and data["local_time"] > data["checkpoint_time"]
)
modified_remote = (
data["remote_time"] and data["remote_time"] > data["checkpoint_time"]
)
deleted_local = data["local_time"] is None
deleted_remote = data["remote_time"] is None
if modified_local and modified_remote:
            print(
                f"Warning, sync conflict! {filename} is changed both locally and remotely."
            )
if data["local_time"] > data["remote_time"]:
print("Local change is newer and will take precedence.")
to_upload.append(filename)
else:
print("Remote change is newer and will take precedence.")
to_download.append(filename)
elif modified_local:
to_upload.append(filename)
elif modified_remote:
to_download.append(filename)
elif deleted_local:
to_delete_remote.append(filename)
elif deleted_remote:
to_delete_local.append(filename)
if missing_checkpoint_files:
print(
"\nWarning: The following files exist both locally and on the DPT, but do not seem to have been synchronized using this tool:"
)
        # Only print the first max_print filenames to avoid completely flooding
        # stdout with unusable information if missing metadata means that this
        # happens to all files in the user's library.
        max_print = 20
print("\t" + "\n\t".join(missing_checkpoint_files[:max_print]))
if len(missing_checkpoint_files) > max_print:
print(
f"\t... and {len(missing_checkpoint_files)-max_print} additional files"
)
print("The files will be assumed to be identical.\n")
# Just syncing the files will automatically create the necessary folders to store the given files, but it won't sync empty folders,
# or folder deletion. Therefore, let's go through the folder_data as well, to see which additional folder operations need to be performed:
folders_to_delete_remote = []
folders_to_delete_local = []
folders_to_create_remote = []
folders_to_create_local = []
for foldername, data in folder_data.items():
# data contains information about whether the given foldername exists locally, remotely, and in the checkpoint.
# In addition, we plan to upload/download some files, in which case we won't need to manually create the folders.
            # So let's update data to describe the expected situation after uploading/downloading those files, to decide which additional
# folder operations need to be performed.
data["remote_exists"] = data["remote_exists"] or any(
[f.startswith(foldername) for f in to_upload]
)
data["local_exists"] = data["local_exists"] or any(
[f.startswith(foldername) for f in to_download]
)
            # Depending on whether the folder exists in remote/local/checkpoint, let's decide whether to create/delete the folder from remote/local.
create_remote = (
data["local_exists"]
and (not data["checkpoint_exists"])
and (not data["remote_exists"])
)
create_local = (
data["remote_exists"]
and (not data["checkpoint_exists"])
and (not data["local_exists"])
)
delete_remote = (
(not data["local_exists"])
and data["checkpoint_exists"]
and data["remote_exists"]
)
delete_local = (
(not data["remote_exists"])
and data["checkpoint_exists"]
and data["local_exists"]
)
if create_remote:
folders_to_create_remote.append(foldername)
if create_local:
folders_to_create_local.append(foldername)
if delete_remote:
folders_to_delete_remote.append(foldername)
if delete_local:
folders_to_delete_local.append(foldername)
# If a folder structure is deleted, let's sort the deletion so that we always select the innermost, empty, folder first.
folders_to_delete_remote.sort(reverse=True)
folders_to_delete_local.sort(reverse=True)
print("")
print("Ready to sync")
print("")
actions = [
(to_delete_local + folders_to_delete_local, "DELETED locally"),
(to_delete_remote + folders_to_delete_remote, "DELETED from device"),
(to_upload + folders_to_create_remote, "UPLOADED to device"),
(to_download + folders_to_create_local, "DOWNLOADED from device"),
]
for file_list, description in actions:
if file_list:
print(f"{len(file_list):4d} files will be {description}")
if not (
to_delete_local
or to_delete_remote
or to_upload
or to_download
or folders_to_delete_local
or folders_to_delete_remote
or folders_to_create_local
or folders_to_create_remote
):
print("All files are in sync. Exiting.")
return
        # Confirm that the user actually wants to perform the actions that
# have been prepared.
print("")
confirm = ""
while not (confirm in ("y", "yes") or self.assume_yes):
            confirm = input("Proceed (y/n/?)? ")
if confirm in ("n", "no"):
return
if confirm in ("?", "list", "l"):
for file_list, description in actions:
if file_list:
print("")
print(f"The following files will be {description}:")
print("\t" + "\n\t".join(file_list))
print("")
# Syncing can potentially take some time, so let's display a progress bar
# to give the user some idea about the progress.
# Calling print() will interfere with the progress bar, so all print calls
# are replaced by tqdm.write() while the progress bar is in use
progress_bar = tqdm(
total=len(to_delete_local)
+ len(to_delete_remote)
+ len(to_upload)
+ len(to_download),
desc="Synchronizing",
unit="files",
)
# Apply changes in remote to local
for remote_path in to_download:
relative_path = Path(remote_path).relative_to(remote_folder)
local_path = Path(local_folder) / relative_path
tqdm.write("⇣ " + str(remote_path))
self.download_file(remote_path, local_path)
remote_time = (
file_data[remote_path]["remote_time"]
.replace(tzinfo=timezone.utc)
.astimezone(tz=None)
)
mod_time = time.mktime(remote_time.timetuple())
os.utime(local_path, (mod_time, mod_time))
progress_bar.update()
for remote_path in to_delete_local:
relative_path = Path(remote_path).relative_to(remote_folder)
local_path = Path(local_folder) / relative_path
if os.path.exists(local_path):
tqdm.write("X " + str(local_path))
os.remove(local_path)
progress_bar.update()
for remote_path in folders_to_delete_local:
relative_path = Path(remote_path).relative_to(remote_folder)
local_path = Path(local_folder) / relative_path
if os.path.exists(local_path):
tqdm.write("X " + str(local_path))
                try:
                    os.rmdir(local_path)
                except OSError as e:
                    # ENOTEMPTY: errno 39 on Linux, 41 on Windows, 66 on macOS/BSD
                    if e.errno in (39, 41, 66):
                        tqdm.write(
                            f"WARNING: The folder {local_path} is not empty and will not be deleted."
                        )
                    else:
                        raise
progress_bar.update()
for remote_path in folders_to_create_local:
relative_path = Path(remote_path).relative_to(remote_folder)
local_path = Path(local_folder) / relative_path
tqdm.write("⇣ " + str(remote_path))
os.makedirs(local_path, exist_ok=True)
progress_bar.update()
# Apply changes in local to remote
for remote_file in to_delete_remote:
if self.path_exists(remote_file):
tqdm.write("X " + str(remote_file))
self.delete_document(remote_file)
progress_bar.update()
for remote_deletion_folder in folders_to_delete_remote:
if self.path_exists(remote_deletion_folder):
tqdm.write("X " + str(remote_deletion_folder))
self.delete_folder(remote_deletion_folder)
progress_bar.update()
for remote_path in to_upload:
relative_path = Path(remote_path).relative_to(remote_folder)
local_path = Path(local_folder) / relative_path
tqdm.write("⇡ " + str(local_path))
self.upload_file(local_path, remote_path)
progress_bar.update()
for remote_path in folders_to_create_remote:
relative_path = Path(remote_path).relative_to(remote_folder)
local_path = Path(local_folder) / relative_path
tqdm.write("⇡ " + str(local_path))
self.new_folder(remote_path)
progress_bar.update()
progress_bar.close()
print("Refreshing file information... ", end="", flush=True)
remote_info = self.traverse_folder(
remote_folder, fields=["entry_path", "modified_date", "entry_type"]
)
self.sync_checkpoint(local_folder, remote_info)
print("done")
def load_checkpoint(self, local_folder):
checkpoint_file = os.path.join(local_folder, ".sync")
if not os.path.exists(checkpoint_file):
return []
with open(checkpoint_file, "rb") as f:
return pickle.load(f)
def sync_checkpoint(self, local_folder, doclist):
checkpoint_file = os.path.join(local_folder, ".sync")
with open(checkpoint_file, "wb") as f:
pickle.dump(doclist, f)
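The checkpoint pair above persists the last-seen remote document list with pickle so the next sync run can tell deletions apart from new files. A minimal sketch of that round-trip, with illustrative helper names mirroring load_checkpoint/sync_checkpoint:

```python
import os
import pickle
import tempfile

def save_checkpoint(local_folder, doclist):
    # Pickle the document list to a hidden ".sync" file in the local folder.
    with open(os.path.join(local_folder, ".sync"), "wb") as f:
        pickle.dump(doclist, f)

def load_checkpoint(local_folder):
    # First sync: no checkpoint recorded yet, so report an empty list.
    path = os.path.join(local_folder, ".sync")
    if not os.path.exists(path):
        return []
    with open(path, "rb") as f:
        return pickle.load(f)

with tempfile.TemporaryDirectory() as d:
    docs = [{"entry_path": "Document/a.pdf", "entry_type": "document"}]
    save_checkpoint(d, docs)
    roundtrip = load_checkpoint(d)

print(roundtrip == docs)  # → True
```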
def _copy_move_data(self, file_id, folder_id, new_filename=None):
data = {"parent_folder_id": folder_id}
if new_filename is not None:
data["file_name"] = new_filename
return data
def copy_file_to_folder_by_id(self, file_id, folder_id, new_filename=None):
"""
Copies a file with given file_id to a folder with given folder_id.
If new_filename is given, rename the file.
"""
data = self._copy_move_data(file_id, folder_id, new_filename)
return self._post_endpoint(f"/documents/{file_id}/copy", data=data)
def move_file_to_folder_by_id(self, file_id, folder_id, new_filename=None):
"""
Moves a file with given file_id to a folder with given folder_id.
If new_filename is given, rename the file.
"""
data = self._copy_move_data(file_id, folder_id, new_filename)
return self._put_endpoint(f"/documents/{file_id}", data=data)
def _copy_move_find_ids(self, old_path, new_path):
old_id = self._get_object_id(old_path)
new_filename = None
try: # find out whether new_path is a filename or folder
new_folder_id = self._get_object_id(new_path)
except ResolveObjectFailed:
new_filename = os.path.basename(new_path)
new_folder = os.path.dirname(new_path)
new_folder_id = self._get_object_id(new_folder)
return old_id, new_folder_id, new_filename
def copy_file(self, old_path, new_path):
"""
Copies a file with given path to a new path.
"""
old_id, new_folder_id, new_filename = self._copy_move_find_ids(
old_path, new_path
)
self.copy_file_to_folder_by_id(old_id, new_folder_id, new_filename)
def move_file(self, old_path, new_path):
"""
Moves a file with given path to a new path.
"""
old_id, new_folder_id, new_filename = self._copy_move_find_ids(
old_path, new_path
)
return self.move_file_to_folder_by_id(old_id, new_folder_id, new_filename)
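_copy_move_find_ids decides whether the destination is an existing folder or a new filename: it first tries to resolve the full path on the device and, if that fails, splits off the last component. The split itself is plain stdlib; a quick illustration with a made-up path:

```python
import os

# Illustrative only: when the destination does not resolve to an
# existing folder, the last component becomes the new filename and
# the rest becomes the target folder.
new_path = "Document/Papers/renamed.pdf"
new_filename = os.path.basename(new_path)
new_folder = os.path.dirname(new_path)

print(new_filename)  # renamed.pdf
print(new_folder)    # Document/Papers
```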
### Wifi
def wifi_list(self):
data = self._get_endpoint("/system/configs/wifi_accesspoints").json()
for ap in data["aplist"]:
ap["ssid"] = base64.b64decode(ap["ssid"]).decode("utf-8", errors="replace")
return data["aplist"]
def wifi_scan(self):
data = self._post_endpoint("/system/controls/wifi_accesspoints/scan").json()
for ap in data["aplist"]:
ap["ssid"] = base64.b64decode(ap["ssid"]).decode("utf-8", errors="replace")
return data["aplist"]
def configure_wifi(
self,
ssid,
security,
passwd,
dhcp,
static_address,
gateway,
network_mask,
dns1,
dns2,
proxy,
):
# cnf = {
# "ssid": base64.b64encode(b'YYY').decode('utf-8'),
# "security": "nonsec", # psk, nonsec, XXX
# # "passwd": "XXX",
# "dhcp": "false",
# "static_address": "172.20.123.4",
# "gateway": "172.20.123.160",
# "network_mask": "24",
# "dns1": "172.20.123.160",
# "dns2": "",
# "proxy": "false"
# }
conf = dict(
ssid=base64.b64encode(ssid.encode()).decode("utf-8"),
security=security,
passwd=passwd,
dhcp=dhcp,
static_address=static_address,
gateway=gateway,
network_mask=network_mask,
dns1=dns1,
dns2=dns2,
proxy=proxy,
)
return self._put_endpoint(
"/system/controls/wifi_accesspoints/register", data=conf
)
def delete_wifi(self, ssid, security):
url = "/system/configs/wifi_accesspoints/{ssid}/{security}".format(
ssid=ssid, security=security
)
# Note: unlike configure_wifi, the ssid goes into the URL unencoded, not base64-encoded.
return self._delete_endpoint(url)
def wifi_enabled(self):
return self._get_endpoint("/system/configs/wifi").json()
def enable_wifi(self):
return self._put_endpoint("/system/configs/wifi", data={"value": "on"})
def disable_wifi(self):
return self._put_endpoint("/system/configs/wifi", data={"value": "off"})
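The wifi endpoints above exchange SSIDs base64-encoded: configure_wifi encodes before sending, and wifi_list/wifi_scan decode what the device returns. A minimal round-trip with a made-up SSID:

```python
import base64

ssid = "MyHomeNetwork"  # illustrative SSID

# What configure_wifi sends to the device:
encoded = base64.b64encode(ssid.encode()).decode("utf-8")
# What wifi_list/wifi_scan do with the device's response:
decoded = base64.b64decode(encoded).decode("utf-8", errors="replace")

print(encoded)           # TXlIb21lTmV0d29yaw==
print(decoded == ssid)   # True
```

errors="replace" keeps the decode from raising on access points that broadcast non-UTF-8 SSIDs.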
### Configuration
def get_config(self):
"""
Returns the current configuration.
Return value will be a dictionary of dictionaries.
"""
data = self._get_endpoint("/system/configs/").json()
return data
def set_config(self, config):
"""
Update the device configuration.
Input uses the same format that get_config() returns.
"""
for key, setting in config.items():
data = self._put_endpoint("/system/configs/" + key, data=setting)
def get_timeout(self):
data = self._get_endpoint("/system/configs/timeout_to_standby").json()
return data["value"]
def set_timeout(self, value):
data = self._put_endpoint(
"/system/configs/timeout_to_standby", data={"value": value}
)
def get_date_format(self):
data = self._get_endpoint("/system/configs/date_format").json()
return data["value"]
def set_date_format(self, value):
data = self._put_endpoint("/system/configs/date_format", data={"value": value})
def get_time_format(self):
data = self._get_endpoint("/system/configs/time_format").json()
return data["value"]
def set_time_format(self, value):
data = self._put_endpoint("/system/configs/time_format", data={"value": value})
def get_timezone(self):
data = self._get_endpoint("/system/configs/timezone").json()
return data["value"]
def set_timezone(self, value):
data = self._put_endpoint("/system/configs/timezone", data={"value": value})
def get_owner(self):
data = self._get_endpoint("/system/configs/owner").json()
return data["value"]
def set_owner(self, value):
data = self._put_endpoint("/system/configs/owner", data={"value": value})
### System info
def get_storage(self):
data = self._get_endpoint("/system/status/storage").json()
return data
def get_firmware_version(self):
data = self._get_endpoint("/system/status/firmware_version").json()
return data["value"]
def get_api_version(self):
resp = self._reg_endpoint_request("GET", "/api_version")
return resp.json()["value"]
def get_mac_address(self):
data = self._get_endpoint("/system/status/mac_address").json()
return data["value"]
def get_battery(self):
data = self._get_endpoint("/system/status/battery").json()
return data
def get_info(self):
data = self._get_endpoint("/register/information").json()
return data
def set_datetime(self):
now = datetime.utcnow().replace(microsecond=0).isoformat() + "Z"
self._put_endpoint("/system/configs/datetime", data={"value": now})
### Etc
def take_screenshot(self):
# Or "{base_url}/system/controls/screen_shot" for a PNG image.
# _get_endpoint() takes no params argument, so the query string goes into the path.
r = self._get_endpoint("/system/controls/screen_shot2?query=jpeg")
return r.content
def ping(self):
"""
Returns True if we are authenticated.
"""
r = self._get_endpoint("/ping")
return r.ok
### Update firmware
def update_firmware(self, fwfh):
filename = "FwUpdater.pkg"
fw_url = "/system/controls/update_firmware/file"
files = {"file": (quote_plus(filename), fwfh, "application/octet-stream")}  # (filename, fileobj, content_type)
# TODO: add file transferring feedback
self._put_endpoint(fw_url, files=files)
precheck_msg = self._get_endpoint(
"/system/controls/update_firmware/precheck"
).json()
battery_check = precheck_msg.get("battery", "not ok")
uploaded_image_check = precheck_msg.get("image_file", "not ok")
print("* battery check: {}".format(battery_check))
print("* uploaded image check: {}".format(uploaded_image_check))
for key in precheck_msg:
if not (key == "battery" or key == "image_file"):
print(
"! Found unrecognized key-value pair: ({0}, {1})".format(
key, precheck_msg[key]
)
)
if battery_check == "ok" and uploaded_image_check == "ok":
# TODO: add check if status is 204
self._put_endpoint("/system/controls/update_firmware")
### Utility
def _reg_endpoint_request(self, method, endpoint, data=None, files=None):
base_url = "http://{addr}:8080".format(addr=self.addr)
req = requests.Request(method, base_url, json=data, files=files)
prep = self.session.prepare_request(req)
prep.url = prep.url.replace('%25', '%')
# modifying the prepared request, so that the "endpoint" part of
# the URL will not be modified by urllib.
prep.url += endpoint.lstrip("/")
return self.session.send(prep)
def _endpoint_request(self, method, endpoint, data=None, files=None):
req = requests.Request(method, self.base_url, json=data, files=files)
prep = self.session.prepare_request(req)
prep.url = prep.url.replace('%25', '%')
# modifying the prepared request, so that the "endpoint" part of
# the URL will not be modified by urllib.
prep.url += endpoint.lstrip("/")
return self.session.send(prep)
def _get_endpoint(self, endpoint=""):
return self._endpoint_request("GET", endpoint)
def _put_endpoint(self, endpoint="", data={}, files=None):
return self._endpoint_request("PUT", endpoint, data, files)
def _post_endpoint(self, endpoint="", data={}):
return self._endpoint_request("POST", endpoint, data)
def _delete_endpoint(self, endpoint="", data={}):
return self._endpoint_request("DELETE", endpoint, data)
def _get_nonce(self, client_id):
r = self._get_endpoint(f"/auth/nonce/{client_id}")
return r.json()["nonce"]
def _resolve_object_by_path(self, path):
enc_path = quote_plus(path)
url = f"/resolve/entry/path/{enc_path}"
resp = self._get_endpoint(url)
if not resp.ok:
raise ResolveObjectFailed(path, resp.json()["message"])
return resp.json()
def _get_object_id(self, path):
return self._resolve_object_by_path(path)["entry_id"]
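_resolve_object_by_path percent-encodes the whole device path with quote_plus so it fits into a single URL segment: slashes become %2F and spaces become +. A quick look at what it produces for an illustrative path:

```python
from urllib.parse import quote_plus

remote_path = "Document/My Notes/note 1.pdf"  # illustrative path
enc = quote_plus(remote_path)
print(enc)  # Document%2FMy+Notes%2Fnote+1.pdf
```

This is also why the endpoint is appended to an already-prepared request URL above: letting requests re-encode the path would turn every % into %25.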
# crypto helpers
def wrap(data, authKey, keyWrapKey):
# Compute an 8-byte key-wrap authenticator (KWA): the truncated
# HMAC-SHA256 of the payload under the authentication key.
hmac = HMAC(authKey, digestmod=SHA256)
hmac.update(data)
kwa = hmac.digest()[:8]
# Pad payload + KWA to the AES block size, encrypt with AES-CBC
# under a random IV, and append the IV so unwrap() can recover it.
iv = os.urandom(16)
cipher = AES.new(keyWrapKey, AES.MODE_CBC, iv)
wrapped = cipher.encrypt(pad(data + kwa))
wrapped = wrapped + iv
return wrapped
# from https://gist.github.com/adoc/8550490
def pad(bytestring, k=16):
"""
Pad an input bytestring according to PKCS#7
"""
l = len(bytestring)
val = k - (l % k)
return bytestring + bytearray([val] * val)
def unwrap(data, authKey, keyWrapKey):
# The IV was appended to the ciphertext by wrap(); split it off,
# decrypt, and strip the PKCS#7 padding.
iv = data[-16:]
cipher = AES.new(keyWrapKey, AES.MODE_CBC, iv)
unwrapped = cipher.decrypt(data[:-16])
unwrapped = unpad(unwrapped)
# Verify the trailing 8-byte key-wrap authenticator against a fresh HMAC.
kwa = unwrapped[-8:]
unwrapped = unwrapped[:-8]
hmac = HMAC(authKey, digestmod=SHA256)
hmac.update(unwrapped)
local_kwa = hmac.digest()[:8]
if kwa != local_kwa:
# Integrity check failed; note that the data is still returned to the caller.
print("Unwrapped kwa does not match")
return unwrapped
def unpad(bytestring, k=16):
"""
Remove PKCS#7 padding from a bytestring.
"""
val = bytestring[-1]
if val > k:
raise ValueError("Input is not padded or padding is corrupt")
l = len(bytestring) - val
return bytestring[:l]
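pad/unpad above implement PKCS#7: append N copies of the byte value N, where N is the distance to the next 16-byte boundary. A self-contained round-trip (the two helpers are restated here so the sketch runs on its own):

```python
def pad(bs, k=16):
    # PKCS#7: append `val` copies of the byte `val`.
    val = k - (len(bs) % k)
    return bs + bytes([val] * val)

def unpad(bs, k=16):
    # The last byte tells us how many padding bytes to strip.
    val = bs[-1]
    if val > k:
        raise ValueError("Input is not padded or padding is corrupt")
    return bs[:-val]

msg = b"hello world"           # 11 bytes -> 5 padding bytes of b"\x05"
padded = pad(msg)
print(len(padded))             # 16
print(unpad(padded) == msg)    # True
```

Note that an input already a multiple of 16 bytes gains a full extra block of padding, which is what makes unpad unambiguous.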
================================================
FILE: dptrp1/pyDH.py
================================================
# Apache License
# Version 2.0, January 2004
# Copyright 2015 Amirali Sanatinia
""" Pure Python Diffie Hellman implementation """
import os
import binascii
import hashlib
# RFC 3526 - More Modular Exponential (MODP) Diffie-Hellman groups for
# Internet Key Exchange (IKE) https://tools.ietf.org/html/rfc3526
primes = {
# 1536-bit
5: {
"prime": 0xFFFFFFFFFFFFFFFFC90FDAA22168C234C4C6628B80DC1CD129024E088A67CC74020BBEA63B139B22514A08798E3404DDEF9519B3CD3A431B302B0A6DF25F14374FE1356D6D51C245E485B576625E7EC6F44C42E9A637ED6B0BFF5CB6F406B7EDEE386BFB5A899FA5AE9F24117C4B1FE649286651ECE45B3DC2007CB8A163BF0598DA48361C55D39A69163FA8FD24CF5F83655D23DCA3AD961C62F356208552BB9ED529077096966D670C354E4ABC9804F1746C08CA237327FFFFFFFFFFFFFFFF,
"generator": 2,
},
# 2048-bit
14: {
"prime": 0xFFFFFFFFFFFFFFFFC90FDAA22168C234C4C6628B80DC1CD129024E088A67CC74020BBEA63B139B22514A08798E3404DDEF9519B3CD3A431B302B0A6DF25F14374FE1356D6D51C245E485B576625E7EC6F44C42E9A637ED6B0BFF5CB6F406B7EDEE386BFB5A899FA5AE9F24117C4B1FE649286651ECE45B3DC2007CB8A163BF0598DA48361C55D39A69163FA8FD24CF5F83655D23DCA3AD961C62F356208552BB9ED529077096966D670C354E4ABC9804F1746C08CA18217C32905E462E36CE3BE39E772C180E86039B2783A2EC07A28FB5C55DF06F4C52C9DE2BCBF6955817183995497CEA956AE515D2261898FA051015728E5A8AACAA68FFFFFFFFFFFFFFFF,
"generator": 2,
},
# 3072-bit
15: {
"prime": 0xFFFFFFFFFFFFFFFFC90FDAA22168C234C4C6628B80DC1CD129024E088A67CC74020BBEA63B139B22514A08798E3404DDEF9519B3CD3A431B302B0A6DF25F14374FE1356D6D51C245E485B576625E7EC6F44C42E9A637ED6B0BFF5CB6F406B7EDEE386BFB5A899FA5AE9F24117C4B1FE649286651ECE45B3DC2007CB8A163BF0598DA48361C55D39A69163FA8FD24CF5F83655D23DCA3AD961C62F356208552BB9ED529077096966D670C354E4ABC9804F1746C08CA18217C32905E462E36CE3BE39E772C180E86039B2783A2EC07A28FB5C55DF06F4C52C9DE2BCBF6955817183995497CEA956AE515D2261898FA051015728E5A8AAAC42DAD33170D04507A33A85521ABDF1CBA64ECFB850458DBEF0A8AEA71575D060C7DB3970F85A6E1E4C7ABF5AE8CDB0933D71E8C94E04A25619DCEE3D2261AD2EE6BF12FFA06D98A0864D87602733EC86A64521F2B18177B200CBBE117577A615D6C770988C0BAD946E208E24FA074E5AB3143DB5BFCE0FD108E4B82D120A93AD2CAFFFFFFFFFFFFFFFF,
"generator": 2,
},
# 4096-bit
16: {
"prime": 0xFFFFFFFFFFFFFFFFC90FDAA22168C234C4C6628B80DC1CD129024E088A67CC74020BBEA63B139B22514A08798E3404DDEF9519B3CD3A431B302B0A6DF25F14374FE1356D6D51C245E485B576625E7EC6F44C42E9A637ED6B0BFF5CB6F406B7EDEE386BFB5A899FA5AE9F24117C4B1FE649286651ECE45B3DC2007CB8A163BF0598DA48361C55D39A69163FA8FD24CF5F83655D23DCA3AD961C62F356208552BB9ED529077096966D670C354E4ABC9804F1746C08CA18217C32905E462E36CE3BE39E772C180E86039B2783A2EC07A28FB5C55DF06F4C52C9DE2BCBF6955817183995497CEA956AE515D2261898FA051015728E5A8AAAC42DAD33170D04507A33A85521ABDF1CBA64ECFB850458DBEF0A8AEA71575D060C7DB3970F85A6E1E4C7ABF5AE8CDB0933D71E8C94E04A25619DCEE3D2261AD2EE6BF12FFA06D98A0864D87602733EC86A64521F2B18177B200CBBE117577A615D6C770988C0BAD946E208E24FA074E5AB3143DB5BFCE0FD108E4B82D120A92108011A723C12A787E6D788719A10BDBA5B2699C327186AF4E23C1A946834B6150BDA2583E9CA2AD44CE8DBBBC2DB04DE8EF92E8EFC141FBECAA6287C59474E6BC05D99B2964FA090C3A2233BA186515BE7ED1F612970CEE2D7AFB81BDD762170481CD0069127D5B05AA993B4EA988D8FDDC186FFB7DC90A6C08F4DF435C934063199FFFFFFFFFFFFFFFF,
"generator": 2,
},
# 6144-bit
17: {
"prime": 0xFFFFFFFFFFFFFFFFC90FDAA22168C234C4C6628B80DC1CD129024E088A67CC74020BBEA63B139B22514A08798E3404DDEF9519B3CD3A431B302B0A6DF25F14374FE1356D6D51C245E485B576625E7EC6F44C42E9A637ED6B0BFF5CB6F406B7EDEE386BFB5A899FA5AE9F24117C4B1FE649286651ECE45B3DC2007CB8A163BF0598DA48361C55D39A69163FA8FD24CF5F83655D23DCA3AD961C62F356208552BB9ED529077096966D670C354E4ABC9804F1746C08CA18217C32905E462E36CE3BE39E772C180E86039B2783A2EC07A28FB5C55DF06F4C52C9DE2BCBF6955817183995497CEA956AE515D2261898FA051015728E5A8AAAC42DAD33170D04507A33A85521ABDF1CBA64ECFB850458DBEF0A8AEA71575D060C7DB3970F85A6E1E4C7ABF5AE8CDB0933D71E8C94E04A25619DCEE3D2261AD2EE6BF12FFA06D98A0864D87602733EC86A64521F2B18177B200CBBE117577A615D6C770988C0BAD946E208E24FA074E5AB3143DB5BFCE0FD108E4B82D120A92108011A723C12A787E6D788719A10BDBA5B2699C327186AF4E23C1A946834B6150BDA2583E9CA2AD44CE8DBBBC2DB04DE8EF92E8EFC141FBECAA6287C59474E6BC05D99B2964FA090C3A2233BA186515BE7ED1F612970CEE2D7AFB81BDD762170481CD0069127D5B05AA993B4EA988D8FDDC186FFB7DC90A6C08F4DF435C93402849236C3FAB4D27C7026C1D4DCB2602646DEC9751E763DBA37BDF8FF9406AD9E530EE5DB382F413001AEB06A53ED9027D831179727B0865A8918DA3EDBEBCF9B14ED44CE6CBACED4BB1BDB7F1447E6CC254B332051512BD7AF426FB8F401378CD2BF5983CA01C64B92ECF032EA15D1721D03F482D7CE6E74FEF6D55E702F46980C82B5A84031900B1C9E59E7C97FBEC7E8F323A97A7E36CC88BE0F1D45B7FF585AC54BD407B22B4154AACC8F6D7EBF48E1D814CC5ED20F8037E0A79715EEF29BE32806A1D58BB7C5DA76F550AA3D8A1FBFF0EB19CCB1A313D55CDA56C9EC2EF29632387FE8D76E3C0468043E8F663F4860EE12BF2D5B0B7474D6E694F91E6DCC4024FFFFFFFFFFFFFFFF,
"generator": 2,
},
# 8192-bit
18: {
"prime": 0xFFFFFFFFFFFFFFFFC90FDAA22168C234C4C6628B80DC1CD129024E088A67CC74020BBEA63B139B22514A08798E3404DDEF9519B3CD3A431B302B0A6DF25F14374FE1356D6D51C245E485B576625E7EC6F44C42E9A637ED6B0BFF5CB6F406B7EDEE386BFB5A899FA5AE9F24117C4B1FE649286651ECE45B3DC2007CB8A163BF0598DA48361C55D39A69163FA8FD24CF5F83655D23DCA3AD961C62F356208552BB9ED529077096966D670C354E4ABC9804F1746C08CA18217C32905E462E36CE3BE39E772C180E86039B2783A2EC07A28FB5C55DF06F4C52C9DE2BCBF6955817183995497CEA956AE515D2261898FA051015728E5A8AAAC42DAD33170D04507A33A85521ABDF1CBA64ECFB850458DBEF0A8AEA71575D060C7DB3970F85A6E1E4C7ABF5AE8CDB0933D71E8C94E04A25619DCEE3D2261AD2EE6BF12FFA06D98A0864D87602733EC86A64521F2B18177B200CBBE117577A615D6C770988C0BAD946E208E24FA074E5AB3143DB5BFCE0FD108E4B82D120A92108011A723C12A787E6D788719A10BDBA5B2699C327186AF4E23C1A946834B6150BDA2583E9CA2AD44CE8DBBBC2DB04DE8EF92E8EFC141FBECAA6287C59474E6BC05D99B2964FA090C3A2233BA186515BE7ED1F612970CEE2D7AFB81BDD762170481CD0069127D5B05AA993B4EA988D8FDDC186FFB7DC90A6C08F4DF435C93402849236C3FAB4D27C7026C1D4DCB2602646DEC9751E763DBA37BDF8FF9406AD9E530EE5DB382F413001AEB06A53ED9027D831179727B0865A8918DA3EDBEBCF9B14ED44CE6CBACED4BB1BDB7F1447E6CC254B332051512BD7AF426FB8F401378CD2BF5983CA01C64B92ECF032EA15D1721D03F482D7CE6E74FEF6D55E702F46980C82B5A84031900B1C9E59E7C97FBEC7E8F323A97A7E36CC88BE0F1D45B7FF585AC54BD407B22B4154AACC8F6D7EBF48E1D814CC5ED20F8037E0A79715EEF29BE32806A1D58BB7C5DA76F550AA3D8A1FBFF0EB19CCB1A313D55CDA56C9EC2EF29632387FE8D76E3C0468043E8F663F4860EE12BF2D5B0B7474D6E694F91E6DBE115974A3926F12FEE5E438777CB6A932DF8CD8BEC4D073B931BA3BC832B68D9DD300741FA7BF8AFC47ED2576F6936BA424663AAB639C5AE4F5683423B4742BF1C978238F16CBE39D652DE3FDB8BEFC848AD922222E04A4037C0713EB57A81A23F0C73473FC646CEA306B4BCBC8862F8385DDFA9D4B7FA2C087E879683303ED5BDD3A062B3CF5B3A278A66D2A13F83F44F82DDF310EE074AB6A364597E899A0255DC164F31CC50846851DF9AB48195DED7EA1B1D510BD7EE74D73FAF36BC31ECFA268359046F4EB879F924009438B481C6CD7889A002ED5EE382BC9190DA6FC026E479558E4475677E9AA9E3050E2765694DFC81F56E880B96E7160C980DD98EDD3DFFFFFFFFFFFFFFFF,
"generator": 2,
},
}
class DiffieHellman:
""" Class to represent the Diffie-Hellman key exchange protocol """
# Current minimum recommendation is 2048 bit.
def __init__(self, group=14):
if group in primes:
self.p = primes[group]["prime"]
self.g = primes[group]["generator"]
else:
raise Exception("Group not supported")
self.__a = int(binascii.hexlify(os.urandom(32)), base=16)
def get_private_key(self):
""" Return the private key (a) """
return self.__a
def gen_public_key(self):
""" Return A, A = g ^ a mod p """
# calculate G^a mod p
return pow(self.g, self.__a, self.p)
def check_other_public_key(self, other_contribution):
# check if the other public key is valid based on NIST SP800-56
# 2 <= g^b <= p-2 and Lagrange for safe primes (g^bq)=1, q=(p-1)/2
if 2 <= other_contribution <= self.p - 2:
if pow(other_contribution, (self.p - 1) // 2, self.p) == 1:
return True
return False
def gen_shared_key(self, other_contribution):
""" Return g ^ ab mod p """
# calculate the shared key G^ab mod p
if self.check_other_public_key(other_contribution):
self.shared_key = pow(other_contribution, self.__a, self.p)
return self.shared_key
# return hashlib.sha256(str(self.shared_key).encode()).digest()
else:
raise Exception("Bad public key from other party")
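The exchange DiffieHellman implements can be traced end-to-end with a toy safe prime (never use parameters this small in practice; real exchanges use the RFC 3526 groups above). Each side raises the other's public value to its own private exponent and both land on g^(ab) mod p:

```python
# Toy parameters for illustration only: p = 23 is a safe prime
# (q = 11) and g = 5 is a generator of the full group mod 23.
p, g = 23, 5

a, b = 6, 15                       # the two parties' private keys
A = pow(g, a, p)                   # Alice's public value, g^a mod p
B = pow(g, b, p)                   # Bob's public value,   g^b mod p

shared_alice = pow(B, a, p)        # (g^b)^a = g^(ab) mod p
shared_bob = pow(A, b, p)          # (g^a)^b = g^(ab) mod p
print(shared_alice == shared_bob)  # → True
```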
================================================
FILE: samples/wifi_2.5G.json
================================================
{
"security": "psk",
"ssid": "Roxy123",
"passwd": "Lucinda",
"dhcp": "true",
"static_address": "",
"gateway": "",
"network_mask": "",
"dns1": "",
"dns2": "",
"proxy": "false"
}
================================================
FILE: samples/wifi_5G.json
================================================
{
"security": "psk",
"ssid": "BadBomb",
"passwd": "TickingLikeAHotSamosa",
"dhcp": "true",
"static_address": "",
"gateway": "",
"network_mask": "",
"dns1": "",
"dns2": "",
"proxy": "false"
}
================================================
FILE: samples/wifi_del_2.5G.json
================================================
{
"security": "psk",
"ssid": "Roxy123"
}
================================================
FILE: setup.json
================================================
{
"name": "dpt-rp1-py",
"version": "0.1.20",
"description": "Python package to manage a Sony DPT-RP1",
"license": "MIT",
"authors": [
"Jan-Gerd Tenberge",
"Cuihtlauac Alvarado",
"Juan Grigera",
"Yunjae Lee",
"Kazuhiko Sakaguchi",
"Yanzi Zhu",
"Sreepathi Pai",
"Jochen Schroeder",
"Alexander Fuchs",
"Xiang Ji",
"Håkon J. D. Johnsen"
],
"emails": [
"jan-gerd.tenberge@uni-muenster.de",
"cuihtlauac.alvarado@orange.com",
"juan@grigera.com.ar",
"lyj7694@gmail.com",
"pi8027@gmail.com",
"zhuyanzi@gmail.com",
"sree314@gmail.com",
"jochen.schroeder@chalmers.se",
"alex.fu27@gmail.com",
"hi@xiangji.me",
"hakon.j.d.johnsen@ntnu.no"
],
"namespace_packages": [
],
"packages": [
"dptrp1",
"dptrp1.cli"
],
"install_requires": [
"httpsig>=1.1.2",
"requests>=2.18.4",
"pbkdf2>=1.3",
"urllib3>=1.22",
"pyyaml",
"anytree",
"fusepy",
"zeroconf>=0.29.0",
"tqdm",
"setuptools"
],
"entry_points": {
"console_scripts": [
"dptrp1=dptrp1.cli.dptrp1:main",
"dptmount=dptrp1.cli.dptmount:main"
]
},
"classifiers": [
"Development Status :: 3 - Alpha",
"Programming Language :: Python :: 3"
]
}
================================================
FILE: setup.py
================================================
#!/usr/bin/env python
# coding=utf-8
import os
import sys
import json
import setuptools
from setuptools.command.test import test as TestCommand
DIRECTORY = os.path.dirname(os.path.realpath(__file__))
SETUP_JSON = None
try:
with open(os.path.join(DIRECTORY, "setup.json"), "r") as f:
SETUP_JSON = json.load(f)
except Exception as e:
print(
"! Error loading setup.json file in the same directory as setup.py.\n"
+ " Check your installation."
)
print(" Exception: {}".format(e))
sys.exit(1)
def readme():
with open(os.path.join(DIRECTORY, "README.md"), encoding='utf-8') as f:
return f.read()
class PyTest(TestCommand):
user_options = [("pytest-args=", "a", "Arguments to pass to py.test")]
def initialize_options(self):
TestCommand.initialize_options(self)
self.pytest_args = []
def run_tests(self):
# imported here, because outside the eggs aren't loaded
import pytest
errno = pytest.main(self.pytest_args)
sys.exit(errno)
setuptools.setup(
name=SETUP_JSON["name"],
version=SETUP_JSON["version"],
author=", ".join(SETUP_JSON["authors"]),
author_email=", ".join(SETUP_JSON["emails"]),
description=SETUP_JSON["description"],
long_description=readme(),
long_description_content_type="text/markdown",
license=SETUP_JSON["license"],
keywords="",
url=None,
namespace_packages=SETUP_JSON["namespace_packages"],
packages=SETUP_JSON["packages"],
install_requires=SETUP_JSON["install_requires"],
tests_require=["pytest"],
cmdclass={"test": PyTest},
entry_points=SETUP_JSON["entry_points"],
classifiers=SETUP_JSON["classifiers"],
include_package_data=True,
zip_safe=False,
)
SYMBOL INDEX (163 symbols across 5 files)
FILE: dptrp1/cli/dptmount.py
class FileHandle (line 59) | class FileHandle(object):
method __init__ (line 61) | def __init__(self, fs, local_path, new=False):
method read (line 74) | def read(self, length, offset):
method write (line 81) | def write(self, buf, offset):
method flush (line 86) | def flush(self):
class DptTablet (line 95) | class DptTablet(LoggingMixIn, Operations):
method __init__ (line 96) | def __init__(
method __init_empty_tree (line 125) | def __init_empty_tree(self):
method __authenticate__ (line 136) | def __authenticate__(self):
method _remove_node (line 147) | def _remove_node(self, node):
method _add_node_to_tree (line 151) | def _add_node_to_tree(self, parent, item):
method _add_remote_path_to_tree (line 161) | def _add_remote_path_to_tree(self, parent, remote_path):
method _load_document_list (line 165) | def _load_document_list(self):
method _recurse_load_document_list (line 169) | def _recurse_load_document_list(self, parent):
method _get_lstat (line 177) | def _get_lstat(self, item):
method _map_local_remote (line 217) | def _map_local_remote(self, full_local):
method _is_read_only_flags (line 222) | def _is_read_only_flags(self, flags):
method chmod (line 229) | def chmod(self, path, mode):
method chown (line 233) | def chown(self, path, uid, gid):
method getattr (line 237) | def getattr(self, path, fh=None):
method readdir (line 245) | def readdir(self, path, fh):
method unlink (line 254) | def unlink(self, path):
method rmdir (line 263) | def rmdir(self, path):
method mkdir (line 269) | def mkdir(self, path, mode):
method open (line 279) | def open(self, path, flags):
method release (line 287) | def release(self, path, fh):
method read (line 294) | def read(self, path, length, offset, fh):
method rename (line 297) | def rename(self, oldpath, newpath):
method create (line 306) | def create(self, path, mode, fi=None):
method write (line 322) | def write(self, path, buf, offset, fh):
method flush (line 325) | def flush(self, path, fh):
function main (line 332) | def main():
FILE: dptrp1/cli/dptrp1.py
function do_screenshot (line 16) | def do_screenshot(d, filename):
function do_list_templates (line 24) | def do_list_templates(d):
function do_list_documents (line 29) | def do_list_documents(d):
function do_list_folders (line 35) | def do_list_folders(d, *remote_paths):
function do_move_document (line 42) | def do_move_document(d, old_path, new_path):
function do_copy_document (line 46) | def do_copy_document(d, old_path, new_path):
function do_upload (line 50) | def do_upload(d, local_path, remote_path=""):
function do_upload_template (line 59) | def do_upload_template(d, local_path, template_name=''):
function do_download (line 70) | def do_download(d, remote_path, local_path):
function do_list_document_info (line 84) | def do_list_document_info(d, remote_path=''):
function do_display_document (line 102) | def do_display_document(d, remote_path, page=1):
function do_update_firmware (line 116) | def do_update_firmware(d, local_path):
function add_prefix (line 120) | def add_prefix(remote_path: str) -> str:
function do_delete_document (line 123) | def do_delete_document(d, remote_path):
function do_delete_template (line 126) | def do_delete_template(d,remote_path):
function do_delete_folder (line 129) | def do_delete_folder(d, remote_path):
function do_sync (line 133) | def do_sync(d, local_path, remote_path="Document"):
function do_new_folder (line 145) | def do_new_folder(d, remote_path):
function do_wifi_list (line 149) | def do_wifi_list(d):
function do_wifi_scan (line 154) | def do_wifi_scan(d):
function do_wifi (line 159) | def do_wifi(d):
function do_wifi_enable (line 163) | def do_wifi_enable(d):
function do_wifi_disable (line 167) | def do_wifi_disable(d):
function do_add_wifi (line 171) | def do_add_wifi(d, cfg_file=""):
function do_delete_wifi (line 197) | def do_delete_wifi(d, cfg_file=""):
function do_register (line 210) | def do_register(d, key_file, id_file):
function format_parameter (line 220) | def format_parameter(parameter):
function do_help (line 230) | def do_help(command):
function do_get_config (line 244) | def do_get_config(d, path):
function do_set_config (line 254) | def do_set_config(d, path):
function build_parser (line 296) | def build_parser():
function main (line 335) | def main():
FILE: dptrp1/dptrp1.py
function get_default_auth_files (line 30) | def get_default_auth_files():
function find_auth_files (line 40) | def find_auth_files():
class DigitalPaperException (line 76) | class DigitalPaperException(Exception):
class ResolveObjectFailed (line 80) | class ResolveObjectFailed(DigitalPaperException):
class LookUpDPT (line 84) | class LookUpDPT:
method __init__ (line 85) | def __init__(self, quiet=False):
method update_service (line 93) | def update_service(self, zeroconf, service_type, name):
method add_service (line 96) | def add_service(self, zeroconf, type, name):
method find (line 120) | def find(self, id, timeout=30):
class DigitalPaper (line 149) | class DigitalPaper:
method __init__ (line 150) | def __init__(self, addr=None, id=None, assume_yes=False, quiet=False):
method base_url (line 168) | def base_url(self):
method register (line 178) | def register(self):
method authenticate (line 350) | def authenticate(self, client_id, key):
method list_templates (line 363) | def list_templates(self):
method list_documents (line 367) | def list_documents(self):
method list_all (line 371) | def list_all(self):
method list_objects_in_folder (line 375) | def list_objects_in_folder(self, remote_path):
method list_folder_entries_by_id (line 380) | def list_folder_entries_by_id(self, folder_id):
method traverse_folder (line 384) | def traverse_folder(self, remote_path, fields=[]):
method traverse_folder_recursively (line 409) | def traverse_folder_recursively(self, remote_path):
method list_document_info (line 425) | def list_document_info(self, remote_path):
method download (line 429) | def download(self, remote_path):
method delete_document (line 436) | def delete_document(self, remote_path):
method delete_template (line 444) | def delete_template(self,template_name):
method display_document (line 451) | def display_document(self, document_id, page=1):
method delete_folder (line 455) | def delete_folder(self, remote_path):
method delete_document_by_id (line 463) | def delete_document_by_id(self, doc_id):
method delete_folder_by_id (line 466) | def delete_folder_by_id(self, folder_id):
method delete_template_by_id (line 469) | def delete_template_by_id(self, template_id):
method upload_template (line 472) | def upload_template(self, fh, remote_path):
method upload (line 485) | def upload(self, fh, remote_path):
method new_folder (line 509) | def new_folder(self, remote_path):
method list_folders (line 521) | def list_folders(self):
method download_file (line 529) | def download_file(self, remote_path, local_path):
method upload_file (line 540) | def upload_file(self, local_path, remote_path):
method path_is_folder (line 547) | def path_is_folder(self, remote_path):
method path_exists (line 561) | def path_exists(self, remote_path):
method sync (line 568) | def sync(self, local_folder, remote_folder):
method load_checkpoint (line 906) | def load_checkpoint(self, local_folder):
method sync_checkpoint (line 913) | def sync_checkpoint(self, local_folder, doclist):
method _copy_move_data (line 918) | def _copy_move_data(self, file_id, folder_id, new_filename=None):
method copy_file_to_folder_by_id (line 924) | def copy_file_to_folder_by_id(self, file_id, folder_id, new_filename=N...
method move_file_to_folder_by_id (line 932) | def move_file_to_folder_by_id(self, file_id, folder_id, new_filename=N...
method _copy_move_find_ids (line 940) | def _copy_move_find_ids(self, old_path, new_path):
method copy_file (line 953) | def copy_file(self, old_path, new_path):
method move_file (line 962) | def move_file(self, old_path, new_path):
method wifi_list (line 972) | def wifi_list(self):
method wifi_scan (line 978) | def wifi_scan(self):
method configure_wifi (line 984) | def configure_wifi(
method delete_wifi (line 1029) | def delete_wifi(self, ssid, security):
method wifi_enabled (line 1036) | def wifi_enabled(self):
method enable_wifi (line 1039) | def enable_wifi(self):
method disable_wifi (line 1042) | def disable_wifi(self):
method get_config (line 1047) | def get_config(self):
method set_config (line 1055) | def set_config(self, config):
method get_timeout (line 1063) | def get_timeout(self):
method set_timeout (line 1067) | def set_timeout(self, value):
method get_date_format (line 1072) | def get_date_format(self):
method set_date_format (line 1076) | def set_date_format(self, value):
method get_time_format (line 1079) | def get_time_format(self):
method set_time_format (line 1083) | def set_time_format(self, value):
method get_timezone (line 1086) | def get_timezone(self):
method set_timezone (line 1090) | def set_timezone(self, value):
method get_owner (line 1093) | def get_owner(self):
method set_owner (line 1097) | def set_owner(self, value):
method get_storage (line 1102) | def get_storage(self):
method get_firmware_version (line 1106) | def get_firmware_version(self):
method get_api_version (line 1110) | def get_api_version(self):
method get_mac_address (line 1114) | def get_mac_address(self):
method get_battery (line 1118) | def get_battery(self):
method get_info (line 1122) | def get_info(self):
method set_datetime (line 1126) | def set_datetime(self):
method take_screenshot (line 1132) | def take_screenshot(self):
method ping (line 1137) | def ping(self):
method update_firmware (line 1146) | def update_firmware(self, fwfh):
method _reg_endpoint_request (line 1175) | def _reg_endpoint_request(self, method, endpoint, data=None, files=None):
method _endpoint_request (line 1185) | def _endpoint_request(self, method, endpoint, data=None, files=None):
method _get_endpoint (line 1194) | def _get_endpoint(self, endpoint=""):
method _put_endpoint (line 1197) | def _put_endpoint(self, endpoint="", data={}, files=None):
method _post_endpoint (line 1200) | def _post_endpoint(self, endpoint="", data={}):
method _delete_endpoint (line 1203) | def _delete_endpoint(self, endpoint="", data={}):
method _get_nonce (line 1206) | def _get_nonce(self, client_id):
method _resolve_object_by_path (line 1210) | def _resolve_object_by_path(self, path):
method _get_object_id (line 1218) | def _get_object_id(self, path):
function wrap (line 1223) | def wrap(data, authKey, keyWrapKey):
function pad (line 1236) | def pad(bytestring, k=16):
function unwrap (line 1246) | def unwrap(data, authKey, keyWrapKey):
function unpad (line 1265) | def unpad(bytestring, k=16):
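The `pad`/`unpad` helpers take a byte string and a block size `k=16`, which matches PKCS#7-style block padding as used around AES. A minimal sketch under that assumption (the repository's exact implementation may differ in detail):

```python
def pad(bytestring: bytes, k: int = 16) -> bytes:
    # PKCS#7: append n bytes each holding the value n, where
    # n = k - (len % k); a full extra block is added when the
    # input length is already a multiple of k.
    n = k - (len(bytestring) % k)
    return bytestring + bytes([n]) * n


def unpad(bytestring: bytes, k: int = 16) -> bytes:
    # Read the declared pad length from the final byte and verify
    # every pad byte matches before stripping the padding.
    n = bytestring[-1]
    if not 1 <= n <= k or bytestring[-n:] != bytes([n]) * n:
        raise ValueError("invalid padding")
    return bytestring[:-n]
```

`wrap` and `unwrap` (lines 1223 and 1246) then combine this padding with the `authKey`/`keyWrapKey` pair during device registration; their AES details are device-protocol specific and not reproduced here.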
FILE: dptrp1/pyDH.py
class DiffieHellman (line 48) | class DiffieHellman:
method __init__ (line 52) | def __init__(self, group=14):
method get_private_key (line 61) | def get_private_key(self):
method gen_public_key (line 65) | def gen_public_key(self):
method check_other_public_key (line 70) | def check_other_public_key(self, other_contribution):
method gen_shared_key (line 79) | def gen_shared_key(self, other_contribution):
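The `DiffieHellman` class defaults to `group=14`, the 2048-bit RFC 3526 MODP group, and supplies the shared secret for the device-registration handshake. A toy sketch of the same exchange, mirroring the indexed method names but with deliberately tiny, insecure parameters and SHA-256 as a stand-in key derivation (the library's exact derivation may differ):

```python
import hashlib
import secrets

# Toy parameters for illustration only -- the real class uses the
# 2048-bit RFC 3526 group 14 prime. A 23-element group is NOT secure.
P = 23  # small prime (toy)
G = 5   # generator (toy)


class ToyDiffieHellman:
    """Minimal mirror of the pyDH.DiffieHellman interface."""

    def __init__(self):
        # Random private exponent in 2 .. P-1.
        self._private = secrets.randbelow(P - 2) + 2

    def gen_public_key(self):
        return pow(G, self._private, P)

    def gen_shared_key(self, other_contribution):
        # Both sides compute g^(ab) mod p, then hash it down
        # to a fixed-size key.
        shared = pow(other_contribution, self._private, P)
        return hashlib.sha256(str(shared).encode()).hexdigest()
```

`check_other_public_key` in the real class additionally validates that the peer's contribution lies in the legal range before the shared key is derived.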
FILE: setup.py
function readme (line 25) | def readme():
class PyTest (line 30) | class PyTest(TestCommand):
method initialize_options (line 33) | def initialize_options(self):
method run_tests (line 37) | def run_tests(self):
Condensed preview — 17 files, each showing path, character count, and a content snippet.
[
{
"path": ".github/workflows/python-publish.yml",
"chars": 1084,
"preview": "# This workflow will upload a Python Package using Twine when a release is created\n# For more information see: https://h"
},
{
"path": ".gitignore",
"chars": 2813,
"preview": "certs/\n\n# Created by https://www.gitignore.io/api/node,macos,python,virtualenv\n\n### macOS ###\n*.DS_Store\n.AppleDouble\n.L"
},
{
"path": "LICENSE",
"chars": 1074,
"preview": "MIT License\n\nCopyright (c) 2018 Jan-Gerd Tenberge\n\nPermission is hereby granted, free of charge, to any person obtaining"
},
{
"path": "MANIFEST.in",
"chars": 18,
"preview": "include setup.json"
},
{
"path": "README.md",
"chars": 8503,
"preview": "# dpt-rp1-py\nPython script to manage electronic paper devices made by Sony (Digital Paper, DPT-RP1, DPT-CP1) or Fujitsu "
},
{
"path": "docs/linux-ethernet-over-usb.md",
"chars": 3590,
"preview": "# Accessing the DPT-RP1 over USB in Linux\n\nTo use the DPT-RP1 through the USB cable, you need to perform two steps:\n\n 1"
},
{
"path": "dptrp1/__init__.py",
"chars": 0,
"preview": ""
},
{
"path": "dptrp1/cli/__init__.py",
"chars": 0,
"preview": ""
},
{
"path": "dptrp1/cli/dptmount.py",
"chars": 11957,
"preview": "#!/usr/bin/env python3\n\"\"\"\nUsage\n-----\n\n> dptmount /mnt/mymountpoint\n\n\nConfig file\n------------\n\nA simple yaml such as\n\n"
},
{
"path": "dptrp1/cli/dptrp1.py",
"chars": 11113,
"preview": "#!/usr/bin/env python3\n# coding=utf-8\n\nimport argparse\nimport inspect\nimport json\nimport sys\nimport os\nimport re\n\nfrom p"
},
{
"path": "dptrp1/dptrp1.py",
"chars": 48460,
"preview": "#!/usr/bin/env python3\nimport os\nimport sys\nimport uuid\nimport time\nimport base64\nimport httpsig\nimport urllib3\nimport r"
},
{
"path": "dptrp1/pyDH.py",
"chars": 8608,
"preview": "# \t\t\t Apache License\n# Version 2.0, January 2004\n# Copyright 2015 Amirali Sanatinia\n\n\"\"\" Pure Python Diffi"
},
{
"path": "samples/wifi_2.5G.json",
"chars": 197,
"preview": "{\n \"security\": \"psk\",\n \"ssid\": \"Roxy123\",\n \"passwd\": \"Lucinda\",\n \"dhcp\": \"true\",\n \"static_address\": \"\",\n \"gateway\""
},
{
"path": "samples/wifi_5G.json",
"chars": 211,
"preview": "{\n \"security\": \"psk\",\n \"ssid\": \"BadBomb\",\n \"passwd\": \"TickingLikeAHotSamosa\",\n \"dhcp\": \"true\",\n \"static_address\": \""
},
{
"path": "samples/wifi_del_2.5G.json",
"chars": 49,
"preview": "{\n \"security\": \"psk\",\n \"ssid\": \"Roxy123\"\n}\n"
},
{
"path": "setup.json",
"chars": 1472,
"preview": "{\n \"name\": \"dpt-rp1-py\",\n \"version\": \"0.1.20\",\n \"description\": \"Python package to manage a Sony DPT-RP1\",\n \""
},
{
"path": "setup.py",
"chars": 1765,
"preview": "#!/usr/bin/env python\n# coding=utf-8\n\nimport os\nimport sys\nimport json\nimport setuptools\nfrom setuptools.command.test im"
}
]
About this extraction
This page contains the full source code of the janten/dpt-rp1-py GitHub repository, formatted as plain text. The extraction includes 17 files (98.5 KB, approximately 26.0k tokens) and a symbol index with 163 extracted functions, classes, methods, constants, and types. Extracted by GitExtract.