Showing preview only (241K chars total). Download the full file or copy to clipboard to get everything.
Repository: gicait/geoserver-rest
Branch: master
Commit: 10bef2ce0b8b
Files: 48
Total size: 227.6 KB
Directory structure:
gitextract_o9tiakx3/
├── .github/
│ └── workflows/
│ ├── python-publish.yml
│ └── python-test.yml
├── .gitignore
├── .pre-commit-config.yaml
├── .readthedocs.yaml
├── CODE_OF_CONDUCT.md
├── CONTRIBUTING.md
├── DEV_DOCS.md
├── LICENSE
├── README.md
├── conda-recipe/
│ ├── bld.bat
│ ├── build.sh
│ └── meta.yaml
├── docs/
│ ├── Makefile
│ ├── make.bat
│ ├── requirements-docs.txt
│ └── source/
│ ├── about.rst
│ ├── acknowledgements.rst
│ ├── advanced_uses.rst
│ ├── change_log.rst
│ ├── conf.py
│ ├── contribution.rst
│ ├── geo.rst
│ ├── how_to_use.rst
│ ├── index.rst
│ ├── installation.rst
│ └── license.rst
├── geo/
│ ├── Calculation_gdal.py
│ ├── Geoserver.py
│ ├── Style.py
│ ├── __init__.py
│ ├── __version__.py
│ └── supports.py
├── requirements.txt
├── requirements_dev.txt
├── requirements_style.txt
├── setup.cfg
├── setup.py
└── tests/
├── .env_template
├── __init__.py
├── common.py
├── data/
│ ├── countries-test.gpkg
│ ├── sample_geotiff.tif
│ └── tos_O1_2001-2002.nc
├── docker-compose.yaml
├── test_geoserver.py
├── test_layergroup.py
└── test_path_fix.py
================================================
FILE CONTENTS
================================================
================================================
FILE: .github/workflows/python-publish.yml
================================================
# This workflow will upload a Python Package using Twine when a release is created
# For more information see: https://help.github.com/en/actions/language-and-framework-guides/using-python-with-github-actions#publishing-to-package-registries
# This workflow uses actions that are not certified by GitHub.
# They are provided by a third-party and are governed by
# separate terms of service, privacy policy, and support
# documentation.
name: Publish geoserver-rest to PyPI / GitHub
on:
push:
tags:
- "v*"
permissions:
contents: read
jobs:
deploy:
runs-on: ubuntu-22.04
steps:
- uses: actions/checkout@v3
- name: Set up Python
uses: actions/setup-python@v3
with:
python-version: "3.x"
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install build
- name: Build package
run: python -m build
- name: Publish package
uses: pypa/gh-action-pypi-publish@27b31702a0e7fc50959f5ad993c78deac1bdfc29
with:
user: __token__
password: ${{ secrets.PYPI_API_TOKEN }}
================================================
FILE: .github/workflows/python-test.yml
================================================
name: Run tests
on:
push:
branches: [master]
pull_request:
branches: [master]
# schedule:
# - cron: "0 0 * * *"
permissions:
contents: read
jobs:
test-ubuntu:
runs-on: ubuntu-22.04
strategy:
matrix:
python-version: ['3.8', '3.9', '3.10', '3.11', '3.12']
steps:
- uses: actions/checkout@v4
- name: Set up Python
uses: actions/setup-python@v5.1.0
with:
cache: 'pip'
python-version: ${{ matrix.python-version }}
- name: Install dependencies
run: |
sudo apt-add-repository ppa:ubuntugis/ppa
sudo apt-get update
sudo apt-get install gdal-bin libgdal-dev
python -m pip install --upgrade pip
python -m pip install GDAL==`gdal-config --version`
python -m pip install -r requirements_dev.txt
- name: Set up docker-compose
run: |
docker compose -f tests/docker-compose.yaml up -d
sleep 60 # Geoserver takes quite a long time to boot up and there is no healthcheck
- name: Test with pytest
uses: dariocurr/pytest-summary@main
with:
paths: tests/test_geoserver.py
env:
DB_HOST: postgis
- name: Upload test summary
uses: actions/upload-artifact@v4
with:
name: test-summary-linux
path: test-summary-linux.md
if: always()
#test-windows:
#
# runs-on: windows-latest
#
# steps:
# - uses: actions/checkout@v4
#
# - name: Set up Python 3.10
# uses: actions/setup-python@v3
# with:
# python-version: '3.10'
#
# - name: Set up PostGIS
# run: |
#
# # Install PostGIS (PostgreSQL comes on the GitHub Actions runner by default but lacks PostGIS control files and dependencies)
# netsh advfirewall firewall show rule name="Allow Localhost 5432"
# Invoke-WebRequest -Uri "http://download.osgeo.org/postgis/windows/pg14/postgis-bundle-pg14x64-setup-3.4.1-1.exe" -OutFile "postgis-installer.exe"
# Start-Process "postgis-installer.exe" -ArgumentList "/S /D=C:\Program Files\PostgreSQL\14" -Wait -NoNewWindow
# & "C:\Program Files\PostgreSQL\14\bin\pg_ctl.exe" -D "C:\Program Files\PostgreSQL\14\data" start
# & "C:\Program Files\PostgreSQL\14\bin\psql.exe" -U postgres -c "CREATE DATABASE geodb;"
# & "C:\Program Files\PostgreSQL\14\bin\psql.exe" -U postgres -c "CREATE USER geodb_user WITH ENCRYPTED PASSWORD 'geodb_pass';"
# & "C:\Program Files\PostgreSQL\14\bin\psql.exe" -U postgres -c "GRANT ALL PRIVILEGES ON DATABASE geodb TO geodb_user;"
# & "C:\Program Files\PostgreSQL\14\bin\psql.exe" -U postgres -d geodb -c "CREATE EXTENSION postgis;"
#
# - name: Set up Tomcat/GeoServer
# run: |
#
# # Configure firewall to allow connectiosn to localhost port 8080 (might not be necessary)
# netsh advfirewall firewall add rule name="Allow Localhost 5432" dir=in action=allow protocol=TCP localport=5432
#
# # Download and install Apache Tomcat (need version 9 because the JRE on the GitHub Actions runner is incompatible with 10+)
# curl -L https://dlcdn.apache.org/tomcat/tomcat-9/v9.0.88/bin/apache-tomcat-9.0.88-windows-x64.zip -o tomcat.zip
# Expand-Archive -Path tomcat.zip -DestinationPath "C:\"
# $directory = Get-ChildItem -Path "C:\" -Directory | Where-Object { $_.Name -like "*apache-tomcat*" } | Select-Object -First 1
# if ($directory) { Rename-Item -Path $directory.FullName -NewName "Tomcat" }
#
# # Download and install Geoserver, then move it to the Tomcat directory
# # Version-locked to 2.22.0 beacause of Java 8
# curl -L https://sourceforge.net/projects/geoserver/files/GeoServer/2.22.0/geoserver-2.22.0-war.zip/download -o geoserver.zip
# Expand-Archive -Path geoserver.zip -DestinationPath "C:\GeoServer"
# cp C:\GeoServer\geoserver.war C:\Tomcat\webapps\geoserver.war
#
# # Set env vars for Tomcat and run it (it takes a little while, so wait 30 seconds after)
# $env:CATALINA_BASE = "C:\Tomcat"
# $env:CATALINA_HOME = "C:\Tomcat"
# $env:CATALINA_TMPDIR = "C:\Tomcat\temp"
# C:\Tomcat\bin\startup.bat
# Start-Sleep -Seconds 30
#
# shell: pwsh
#
# - name: Install Miniconda
# run: |
# curl -O https://repo.anaconda.com/miniconda/Miniconda3-py39_4.10.3-Windows-x86_64.exe
# Start-Process -FilePath "Miniconda3-py39_4.10.3-Windows-x86_64.exe" -ArgumentList '/InstallationType=JustMe /RegisterPython=0 /S /D="%UserProfile%\Miniconda3"' -Wait -NoNewWindow
# & "$env:UserProfile\Miniconda3\Scripts\conda" init powershell
# shell: pwsh
#
# - name: Configure Conda environment
# run: |
# $env:PATH = "$env:UserProfile\Miniconda3;$env:UserProfile\Miniconda3\Scripts;$env:UserProfile\Miniconda3\Library\bin;$env:PATH"
# conda update conda -y
# conda create -n geospatial python=3.10 -y
# conda activate geospatial
# conda install -c conda-forge gdal>=3.4.1 -y
# python -m pip install --upgrade pip
# pip install -r requirements_dev.txt
# shell: pwsh
#
# #- name: Test with pytest
# # uses: dariocurr/pytest-summary@main
# # with:
# # paths: tests/test_geoserver.py
# # env:
# # DB_HOST: postgis
#
# - name: Test with pytest
# run: |
# conda activate geospatial
# pytest tests/test_geoserver.py
#
# - name: Upload test summary
# uses: actions/upload-artifact@v3
# with:
# name: test-summary-windows
# path: test-summary-windows.md
# if: always()
================================================
FILE: .gitignore
================================================
style.sld
FunctionsToImplement.py
record.txt
package_test.py
test.py
.idea/
# Created by https://www.toptal.com/developers/gitignore/api/python
# Edit at https://www.toptal.com/developers/gitignore?templates=python
### Python ###
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class
# C extensions
*.so
record.txt
FunctionsToImplement.py
# Distribution / packaging
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
pip-wheel-metadata/
share/python-wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST
# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec
# Installer logs
pip-log.txt
pip-delete-this-directory.txt
# Unit test / coverage reports
htmlcov/
.tox/
.nox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
*.py,cover
.hypothesis/
.pytest_cache/
pytestdebug.log
# Translations
*.mo
*.pot
# Django stuff:
*.log
local_settings.py
db.sqlite3
db.sqlite3-journal
# Flask stuff:
instance/
.webassets-cache
# Scrapy stuff:
.scrapy
# Sphinx documentation
docs/_build/
doc/_build/
# PyBuilder
target/
# Jupyter Notebook
.ipynb_checkpoints
test.ipynb
# IPython
profile_default/
ipython_config.py
# pyenv
.python-version
# pipenv
# According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.
# However, in case of collaboration, if having platform-specific dependencies or dependencies
# having no cross-platform support, pipenv may install dependencies that don't work, or not
# install all needed dependencies.
#Pipfile.lock
# PEP 582; used by e.g. github.com/David-OConnor/pyflow
__pypackages__/
# Celery stuff
celerybeat-schedule
celerybeat.pid
# SageMath parsed files
*.sage.py
# Environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/
.vscode/
.direnv/
venv/
# Spyder project settings
.spyderproject
.spyproject
# Rope project settings
.ropeproject
# mkdocs documentation
/site
# mypy
.mypy_cache/
.dmypy.json
dmypy.json
# Pyre type checker
.pyre/
# pytype static type analyzer
.pytype/
# End of https://www.toptal.com/developers/gitignore/api/python
================================================
FILE: .pre-commit-config.yaml
================================================
default_language_version:
python: python3
repos:
- repo: https://github.com/asottile/pyupgrade
rev: v3.3.1
hooks:
- id: pyupgrade
language_version: python3
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v4.4.0
hooks:
- id: check-yaml
language_version: python3
exclude: conda-recipe/meta.yaml
- id: debug-statements
language_version: python3
- id: end-of-file-fixer
language_version: python3
exclude: .ipynb|.txt|.sld
- id: trailing-whitespace
language_version: python3
exclude: .txt|.sld
- repo: https://github.com/ambv/black
rev: 22.12.0
hooks:
- id: black
args: ["--target-version", "py36"]
- repo: https://github.com/timothycrosley/isort
rev: 5.12.0
hooks:
- id: isort
args: ["--profile", "black", "--filter-files"]
- repo: https://github.com/pycqa/flake8
rev: 6.0.0
hooks:
- id: flake8
args: ['--config=setup.cfg']
- repo: https://github.com/pycqa/pydocstyle
rev: 6.2.1
hooks:
- id: pydocstyle
language_version: python3
args: ['--convention=numpy', '--match="(?!test_).*\.py"']
- repo: meta
hooks:
- id: check-hooks-apply
- id: check-useless-excludes
ci:
autoupdate_schedule: quarterly
skip: []
submodules: false
================================================
FILE: .readthedocs.yaml
================================================
# Read the Docs configuration file for Sphinx projects
# See https://docs.readthedocs.io/en/stable/config-file/v2.html for details
# Required
version: 2
# Set the OS, Python version and other tools you might need
build:
os: ubuntu-22.04
tools:
python: "3.12"
# You can also specify other tool versions:
# nodejs: "20"
# rust: "1.70"
# golang: "1.20"
# Build documentation in the "docs/" directory with Sphinx
sphinx:
configuration: docs/source/conf.py
# You can configure Sphinx to use a different builder, for instance use the dirhtml builder for simpler URLs
# builder: "dirhtml"
# Fail on all warnings to avoid broken references
# fail_on_warning: true
# Optionally build your docs in additional formats such as PDF and ePub
# formats:
# - pdf
# - epub
# Optional but recommended, declare the Python requirements required
# to build your documentation
# See https://docs.readthedocs.io/en/stable/guides/reproducible-builds.html
python:
install:
- requirements: docs/requirements-docs.txt
================================================
FILE: CODE_OF_CONDUCT.md
================================================
# Contributor Covenant Code of Conduct
## Our Pledge
In the interest of fostering an open and welcoming environment, we as
contributors and maintainers pledge to making participation in our project and
our community a harassment-free experience for everyone, regardless of age, body
size, disability, ethnicity, sex characteristics, gender identity and expression,
level of experience, education, socio-economic status, nationality, personal
appearance, race, religion, or sexual identity and orientation.
## Our Standards
Examples of behavior that contributes to creating a positive environment
include:
* Using welcoming and inclusive language
* Being respectful of differing viewpoints and experiences
* Gracefully accepting constructive criticism
* Focusing on what is best for the community
* Showing empathy towards other community members
Examples of unacceptable behavior by participants include:
* The use of sexualized language or imagery and unwelcome sexual attention or
advances
* Trolling, insulting/derogatory comments, and personal or political attacks
* Public or private harassment
* Publishing others' private information, such as a physical or electronic
address, without explicit permission
* Other conduct which could reasonably be considered inappropriate in a
professional setting
## Our Responsibilities
Project maintainers are responsible for clarifying the standards of acceptable
behavior and are expected to take appropriate and fair corrective action in
response to any instances of unacceptable behavior.
Project maintainers have the right and responsibility to remove, edit, or
reject comments, commits, code, wiki edits, issues, and other contributions
that are not aligned to this Code of Conduct, or to ban temporarily or
permanently any contributor for other behaviors that they deem inappropriate,
threatening, offensive, or harmful.
## Scope
This Code of Conduct applies both within project spaces and in public spaces
when an individual is representing the project or its community. Examples of
representing a project or community include using an official project e-mail
address, posting via an official social media account, or acting as an appointed
representative at an online or offline event. Representation of a project may be
further defined and clarified by project maintainers.
## Enforcement
Instances of abusive, harassing, or otherwise unacceptable behavior may be
reported by contacting the project team at iamtekson@gmail.com. All
complaints will be reviewed and investigated and will result in a response that
is deemed necessary and appropriate to the circumstances. The project team is
obligated to maintain confidentiality with regard to the reporter of an incident.
Further details of specific enforcement policies may be posted separately.
Project maintainers who do not follow or enforce the Code of Conduct in good
faith may face temporary or permanent repercussions as determined by other
members of the project's leadership.
## Attribution
This Code of Conduct is adapted from the [Contributor Covenant][homepage], version 1.4,
available at https://www.contributor-covenant.org/version/1/4/code-of-conduct.html
[homepage]: https://www.contributor-covenant.org
For answers to common questions about this code of conduct, see
https://www.contributor-covenant.org/faq
================================================
FILE: CONTRIBUTING.md
================================================
# Contributing Guidelines
Geoserver-rest is an open source library written in python and contributors are needed to keep this library moving forward. Any kind of contributions are welcome.
# Guidelines
1. Please use the request library for the http request.
2. One feature per pull request.
3. Please add the update about your PR on the [change log documentation](https://github.com/gicait/geoserver-rest/blob/master/docs/source/change_log.rst#master-branch) as well.
================================================
FILE: DEV_DOCS.md
================================================
# Documentation
### Publishing the python package
For publishing the python package, follow following steps,
1. Install twine `pip install twine`
2. Generate packages and create dist folder by running setup.py file `python setup.py bdist_wheel sdist`
3. Push to PyPI `twine upload dist/*`
### Pre-commit Reference
To install the additional libraries for dev (testing, formatting, etc.) run `pip install --editable .[dev]`
[`pre-commit`](https://pre-commit.com/) is a pretty standard tooling-helper for automating the formatting of code (`isort`, `black`, `end-of-file-fixer`, `trailing-space-fixer`, etc.) and evaluating any potential issues (`flake8`). This essentially makes all code contributions look the same no matter what in order to favour readability/maintainability.
Installing and running `pre-commit` is fairly automatic. After installing the requirements-dev.txt, run:
```bash
pre-commit install
```
and the environments will be managed automatically. Any calls to git commit will run the checks. If something changes, you need to simply run git commit a second time and it should be good.
Some checks will require changes (e.g. imports of `pdb` are a violation, unused imports, unused initialized objects, etc.). If these are needed by design you can do either of the following:
To commit the files and ignore the checks (not great as checks will fail for future commits and for others):
```bash
git commit --no-verify -m "my message"
```
If you want the violation to be ignored, place a \_\_# noqa (precisely: two empty spaces, #, one empty space, noqa) next to the affected line.
================================================
FILE: LICENSE
================================================
MIT License
Copyright (c) 2020, Geoinformatics Center, Asian Institute of Technology
Copyright (c) 2020, Tek Kshetri
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
================================================
FILE: README.md
================================================
# Python wrapper for GeoServer REST API
[](https://pepy.tech/project/geoserver-rest)
[](https://opensource.org/licenses/MIT)
[](https://github.com/psf/black)
[](https://flake8.pycqa.org/)
[](https://pycqa.github.io/isort/)
[](https://github.com/pre-commit/pre-commit)
## Full documentation
The documentation for this project is moved here: [https://geoserver-rest.readthedocs.io/](https://geoserver-rest.readthedocs.io/).
## Overview
The `geoserver-rest` package is useful for the management of geospatial data in [GeoServer](http://geoserver.org/). The package is useful for the creating, updating and deleting geoserver workspaces, stores, layers, and style files.
## Installation
```python
conda install -c conda-forge geoserver-rest
```
For the `pip` installation, check the [official documentation of geoserver-rest](https://geoserver-rest.readthedocs.io/en/latest/installation.html)
## Some examples
Please check the [https://geoserver-rest.readthedocs.io/](https://geoserver-rest.readthedocs.io/) for full documentation.
```python
# Import the library
from geo.Geoserver import Geoserver
# Initialize the library
geo = Geoserver('http://127.0.0.1:8080/geoserver', username='admin', password='geoserver')
# For creating workspace
geo.create_workspace(workspace='demo')
# For uploading raster data to the geoserver
geo.create_coveragestore(layer_name='layer1', path=r'path\to\raster\file.tif', workspace='demo')
# For creating postGIS connection and publish postGIS table
geo.create_featurestore(store_name='geo_data', workspace='demo', db='postgres', host='localhost', pg_user='postgres',
pg_password='admin')
geo.publish_featurestore(workspace='demo', store_name='geo_data', pg_table='geodata_table_name')
# For uploading SLD file and connect it with layer
geo.upload_style(path=r'path\to\sld\file.sld', workspace='demo')
geo.publish_style(layer_name='geoserver_layer_name', style_name='sld_file_name', workspace='demo')
# delete workspace
geo.delete_workspace(workspace='demo')
# delete layer
geo.delete_layer(layer_name='layer1', workspace='demo')
# delete style file
geo.delete_style(style_name='style1', workspace='demo')
```
To create the dynamic style you need to install the extra dependencies as, `gdal`, `seaborn` and `matplotlib`. The below functions will be only available after installing the full package `geoserver-rest[all]` or `geoserver-rest[style]`,
```python
# For creating the style file for raster data dynamically and connect it with layer
geo.create_coveragestyle(raster_path=r'path\to\raster\file.tiff', style_name='style_1', workspace='demo',
color_ramp='RdYiGn')
geo.publish_style(layer_name='geoserver_layer_name', style_name='raster_file_name', workspace='demo')
# For creating outline featurestyle
geo.create_outline_featurestyle(style_name='style1', color='#ff0000')
```
## Contribution
Geoserver-rest is the open source library written in python and contributors are needed to keep this library moving forward. Any kind of contributions are welcome. Here are the basic rule for the new contributors:
1. Please use the request library for the http request.
2. One feature per pull request (If the PR is huge, you need to create a issue and discuss).
3. Please add the update about your PR on the [change log documentation](https://github.com/gicait/geoserver-rest/blob/master/docs/source/change_log.rst#master-branch) as well.
## Citation
Full paper is available here: https://doi.org/10.5194/isprs-archives-XLVI-4-W2-2021-91-2021
```
@Article{isprs-archives-XLVI-4-W2-2021-91-2021,
AUTHOR = {Tek Bahadur Kshetri, Angsana Chaksana and Shraddha Sharma},
TITLE = {THE ROLE OF OPEN-SOURCE PYTHON PACKAGE GEOSERVER-REST IN WEB-GIS DEVELOPMENT},
JOURNAL = {The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences},
VOLUME = {XLVI-4/W2-2021},
YEAR = {2021},
PAGES = {91--96},
URL = {https://www.int-arch-photogramm-remote-sens-spatial-inf-sci.net/XLVI-4-W2-2021/91/2021/},
DOI = {10.5194/isprs-archives-XLVI-4-W2-2021-91-2021}
}
```
================================================
FILE: conda-recipe/bld.bat
================================================
cd %RECIPE_DIR%\..
%PYTHON% setup.py install --single-version-externally-managed --record=record.txt
================================================
FILE: conda-recipe/build.sh
================================================
$PYTHON setup.py install
================================================
FILE: conda-recipe/meta.yaml
================================================
{% set data = load_setup_py_data() %}
package:
name: "geoserver-rest"
version: {{ data.get('version') }}
build:
# entry_points:
# - anaconda = binstar_client.scripts.cli:main
# - binstar = binstar_client.scripts.cli:main
# - conda-server = binstar_client.scripts.cli:main
source:
path: ./../
requirements:
build:
- python
- setuptools
run:
- python
- setuptools
- gdal
- seaborn
- requests
- pygments
- matplotlib
- xmltodict
about:
home: https://github.com/gicait/geoserver-rest
license: MIT
license_familY: MIT
license_file: LICENSE
summary: "The package for management of geospatial data in GeoServer"
extra:
recipe-maintainers:
- iamtekson
================================================
FILE: docs/Makefile
================================================
# Minimal makefile for Sphinx documentation
#
# You can set these variables from the command line, and also
# from the environment for the first two.
SPHINXOPTS ?=
SPHINXBUILD ?= sphinx-build
SOURCEDIR = source
BUILDDIR = build
# Put it first so that "make" without argument is like "make help".
help:
@$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
.PHONY: help Makefile
# Catch-all target: route all unknown targets to Sphinx using the new
# "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS).
%: Makefile
@$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
================================================
FILE: docs/make.bat
================================================
@ECHO OFF
pushd %~dp0
REM Command file for Sphinx documentation
if "%SPHINXBUILD%" == "" (
set SPHINXBUILD=sphinx-build
)
set SOURCEDIR=source
set BUILDDIR=build
if "%1" == "" goto help
%SPHINXBUILD% >NUL 2>NUL
if errorlevel 9009 (
echo.
echo.The 'sphinx-build' command was not found. Make sure you have Sphinx
echo.installed, then set the SPHINXBUILD environment variable to point
echo.to the full path of the 'sphinx-build' executable. Alternatively you
echo.may add the Sphinx directory to PATH.
echo.
echo.If you don't have Sphinx installed, grab it from
echo.http://sphinx-doc.org/
exit /b 1
)
%SPHINXBUILD% -M %1 %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O%
goto end
:help
%SPHINXBUILD% -M help %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O%
:end
popd
================================================
FILE: docs/requirements-docs.txt
================================================
jinja2
sphinx
sphinx_rtd_theme
readthedocs-sphinx-ext
m2r
mistune
sphinxcontrib-httpdomain
sphinxcontrib-openapi
geopandas
seaborn
requests
xmltodict
================================================
FILE: docs/source/about.rst
================================================
About geoserver-rest!
=====================
What is geoserver-rest?
^^^^^^^^^^^^^^^^^^^^^^^
The geoserver-rest package is useful for the management of geospatial data in `GeoServer <http://geoserver.org/>`_.
This package is useful for creating, updating and deleting geoserver workspaces, stores, layers, and style files.
For a live example of geoserver-rest in action, check out the video tutorial on geoserver-rest below:
.. raw:: html
<iframe width="560" height="315" src="https://www.youtube.com/embed/nXvzmbGukeE" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>
Current version
---------------
Current version: v1.5.1
================================================
FILE: docs/source/acknowledgements.rst
================================================
Acknowledgements
================
Created and managed by `Tek Bahadur Kshetri <https://github.com/iamtekson>`_ for the activities of the Geoinformatics Center of Asian Institute of Technology, Thailand.
================================================
FILE: docs/source/advanced_uses.rst
================================================
Advanced uses for automation
============================
The following code will first convert all the ``.rst`` data format inside ``C:\Users\gic\Desktop\etlIa\`` folder, into ``tiff`` format and then upload all the ``tiff`` files to the GeoServer.
.. code-block:: python3
from geo.Geoserver import Geoserver
from osgeo import gdal
import glob
import os
geo = Geoserver('http://localhost:8080/geoserver', username='admin', password='geoserver')
rst_files = glob.glob(r'C:\Users\gic\Desktop\etlIa\*.rst')
geo.create_workspace('geonode')
for rst in rst_files:
file_name = os.path.basename(rst)
src = gdal.Open(rst)
tiff = r'C:\Users\tek\Desktop\try\{}'.format(file_name)
gdal.Translate(tiff, src)
geo.create_coveragestore(layer_name=file_name, path=tiff, workspace='geonode') #, overwrite=True
The following code will upload all the ``tiff`` files (with extension .tiff or .tif) located in ``data/landuse`` to the GeoServer.
.. code-block:: python3
from geo.Geoserver import Geoserver
import glob
import os
geo = Geoserver('http://localhost:8080/geoserver', username='admin', password='geoserver')
geo.create_workspace('test')
tiff_files = glob.glob('data/landuse/*.tiff') + glob.glob('data/landuse/*.tif')
for tiff in tiff_files:
file_name = os.path.basename(tiff)
# Removing extension for layer name
temp = os.path.splitext(file_name)
layer_name = temp[0]
# Will overwrite layer if it exists
geo.create_coveragestore(layer_name=layer_name, path=tiff, workspace='test')
print(file_name + ' uploaded to geoserver')
================================================
FILE: docs/source/change_log.rst
================================================
Change Log
=============
``Master branch``
^^^^^^^^^^^^^^^^^
* New method `create_gpkg_datastore`
* Bugfixes for `add_layer_to_layergroup` and `remove_layer_from_layergroup`
* New method `remove_layer_from_layergroup`
``[v2.4.1 - 2023-01-14]``
^^^^^^^^^^^^^^^^^^^^^^^^^^
* New method `add_layer_to_layergroup` (see issue `#102 <https://github.com/gicait/geoserver-rest/issues/102>`)
* Allow deletion of layergroups from workspaces (see issue `#100 <https://github.com/gicait/geoserver-rest/issues/100>`) and add unittests for the layergroup methods.
* Fix json-bug in create_coveragestore method (see issue `#86 <https://github.com/gicait/geoserver-rest/issues/86>`)
``[v2.4.0 - 2023-01-10]``
^^^^^^^^^^^^^^^^^^^^^^^^^^
* Fix the import issue, close `#76 <https://github.com/gicait/geoserver-rest/issues/76>`_
* Removed the ``rest`` from Geoserver class URL, Revert back to previous state. Close `#77 <https://github.com/gicait/geoserver-rest/issues/76>`_
* Add Optional Parameter for ``title`` to ``publish_featurestore`` function
* Update on git hooks `#94 <https://github.com/gicait/geoserver-rest/pull/94>`_, `#92 <https://github.com/gicait/geoserver-rest/pull/92>`_
* Exception handeling in a better way `#93 <https://github.com/gicait/geoserver-rest/pull/93>`_
``[v2.3.0 - 2022-05-06]``
^^^^^^^^^^^^^^^^^^^^^^^^^^
* Params for `delete_workspace` and `delete_style` changed to `{recursive: true}`
* Added methods to use REST API for user/group service CRUD operations.
* Removed ``key_column`` parameter and added ``srid`` parameter (coordinate system of the layer, default is 4326) from ``publish_featurestore_sqlview`` function
* Solved the Bug `#73 <https://github.com/gicait/geoserver-rest/issues/73>`_ and `#69 <https://github.com/gicait/geoserver-rest/issues/69>`_
* ``create_layergroup`` function added
* ``update_layergroup`` function added
* ``delete_layergroup`` function added
* Added layer and workspace checks to layergroup methods
``[V2.1.2 - 2021-10-14]``
^^^^^^^^^^^^^^^^^^^^^^^^^
* ``create_featurestore`` bug on Expose primary key fixed close #56.
* ``create_featurestore`` will now support all the options from geoserver.
``[V2.0.0 - 2021-08-14]``
^^^^^^^^^^^^^^^^^^^^^^^^^^
* Expose primary key option for datastore in ``create_featurestore`` function
* Time dimention support for the coverage store
* Bug fixing for the ``.tiff`` datastore
* Added the request.content to error messages in order to get more information about error
``[V2.0.0 - 2021-05-28]``
^^^^^^^^^^^^^^^^^^^^^^^^^^
* Fully replaced the `pycurl <http://pycurl.io/>`_ dependency with `request` and `psycopg2 <https://www.psycopg.org/>`_
* Dropped the PostgreSQL functionalities (deleted ``geo/Postgres.py`` file). I think the functionalities of PostgreSQL is outside the scope of this library. So I initiated the seperated library `postgres-helper <https://postgres-helper.readthedocs.io/en/latest/>`_
* Documentation adjustments
* The ``overwrite`` options removed from ``create_coveragestore``, ``create_coveragestyle`` and other style functions
``[V1.6.0 - 2021-04-13]``
^^^^^^^^^^^^^^^^^^^^^^^^^^
* Documentation adjustments (bunch of Sphinx-docs formatting fixes and English corrections)
* Add black and pre-commit
* ``create_coveragestore`` function parameter name ``lyr_name`` changed to ``layer_name``
* Pytest examples, docstrings and typed calls added
``[V1.5.2] - 2020-11-03``
^^^^^^^^^^^^^^^^^^^^^^^^^
* **1. create_datastore** This function can create the datastore from `.shp`, `.gpkg`, WFS url and directory containing `.shp`.
* **2. create_shp_datastore** This function will be useful for uploading the shapefile and publishing the shapefile as a layer. This function will upload the data to the geoserver ``data_dir`` in ``h2`` database structure and publish it as a layer.
* Update on docs
================================================
FILE: docs/source/conf.py
================================================
# Configuration file for the Sphinx documentation builder.
#
# This file only contains a selection of the most common options. For a full
# list see the documentation:
# https://www.sphinx-doc.org/en/master/usage/configuration.html
# -- Path setup --------------------------------------------------------------
# If extensions (or modules to document with autodoc) are in another directory,
# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
#
import os
import sys
sys.path.insert(0, os.path.abspath("../../"))
# -- Project information -----------------------------------------------------
project = "geoserver-rest"
copyright = "2021, Tek Bahadur Kshetri"
author = "Tek Bahadur Kshetri"
# The full version, including alpha/beta/rc tags
release = "2.5.1"
# -- General configuration ---------------------------------------------------
# Add any Sphinx extension module names here, as strings. They can be
# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
# ones.
extensions = ["sphinx_rtd_theme", "sphinx.ext.autodoc"]
# Add any paths that contain templates here, relative to this directory.
templates_path = ["_templates"]
# List of patterns, relative to source directory, that match files and
# directories to ignore when looking for source files.
# This pattern also affects html_static_path and html_extra_path.
exclude_patterns = []
autosummary_generate = True
# -- Options for HTML output -------------------------------------------------
# The theme to use for HTML and HTML Help pages. See the documentation for
# a list of builtin themes.
#
html_theme = "sphinx_rtd_theme"
# Add any paths that contain custom static files (such as style sheets) here,
# relative to this directory. They are copied after the builtin static files,
# so a file named "default.css" will overwrite the builtin "default.css".
html_static_path = ["_static"]
================================================
FILE: docs/source/contribution.rst
================================================
Contribution
=============
Geoserver-rest is an open source library written in python and contributors are needed to keep this library moving forward. Any kind of contributions are welcome.
================================================
FILE: docs/source/geo.rst
================================================
Python API Reference
====================
**Disclamer**: The documentation in the code was initially written by people, but was then passed through an AI large
language model (specifically ChatGPT-4o) to fill in gaps and correct minor mistakes. The results were also validated by
a person, but it is possible that an AI "hallucination" has occurred that was not caught and resulted in an incorrect
documentation. Please `report an issue <https://github.com/gicait/geoserver-rest/issues>`_ you find one.
The ``Geoserver`` class
------------------------
.. automodule:: geo.Geoserver
:members:
:undoc-members:
:show-inheritance:
The ``Style`` functions
------------------------
.. automodule:: geo.Style
:members:
:undoc-members:
:show-inheritance:
================================================
FILE: docs/source/how_to_use.rst
================================================
How to use
===========
This library is built for getting, creating, updating and deleting workspaces, coveragestores, featurestores, and styles. Some examples are shown below.
Getting started with `geoserver-rest`
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
This following step is used to initialize the library. It takes parameters as geoserver url, username, password.
.. code-block:: python
from geo.Geoserver import Geoserver
geo = Geoserver('http://127.0.0.1:8080/geoserver', username='admin', password='geoserver')
Creating workspaces
-------------------
.. code-block:: python
geo.create_workspace(workspace='demo')
Creating coveragestores
-----------------------
It is helpful for publishing the raster data to the geoserver. Here if you don't pass the ``lyr_name`` parameter, it will take the raster file name as the layer name.
.. code-block:: python
geo.create_coveragestore(layer_name='layer1', path=r'path\to\raster\file.tif', workspace='demo')
.. note::
If your raster is not loading correctly, please make sure you assign the coordinate system for your raster file.
If the ``layer_name`` already exists in geoserver, it will automatically overwrite the previous one.
Creating and publishing featurestores and featurestore layers
-------------------------------------------------------------
.. _create_featurestore:
It is used for connecting the ``PostGIS`` with geoserver and publish this as a layer. It is only useful for vector data. The postgres connection parameters should be passed as the parameters. For publishing the PostGIS tables, the ``pg_table`` parameter represent the table name in postgres
.. code-block:: python3
geo.create_featurestore(store_name='geo_data', workspace='demo', db='postgres', host='localhost', pg_user='postgres', pg_password='admin')
geo.publish_featurestore(workspace='demo', store_name='geo_data', pg_table='geodata_table_name')
The new function ``publish_featurestore_sqlview`` is available from geoserver-rest version ``1.3.0``. The function can be run by using following command,
.. code-block:: python3
sql = 'SELECT name, id, geom FROM post_gis_table_name'
geo.publish_featurestore_sqlview(store_name='geo_data', name='view_name', sql=sql, key_column='name', workspace='demo')
Creating and publishing shapefile datastore layers
--------------------------------------------------
The ``create_shp datastore`` function will be useful for uploading the shapefile and publishing the shapefile as a layer. This function will upload the data to the geoserver ``data_dir`` in ``h2`` database structure and publish it as a layer. The layer name will be same as the shapefile name.
.. code-block:: python3
geo.create_shp_datastore(path=r'path/to/zipped/shp/file.zip', store_name='store', workspace='demo')
Creating and publishing datastore layers
----------------------------------------
The ``create_datastore`` function will create the datastore for the specific data. After creating the datastore, you need to publish it as a layer by using ``publish_featurestore`` function. It can take the following type of data path:
1. Path to shapefile (`.shp`) file;
2. Path to GeoPackage (`.gpkg`) file;
3. WFS url (e.g. http://localhost:8080/geoserver/wfs?request=GetCapabilities) or;
4. Directory containing shapefiles.
If you have PostGIS datastore, please use :ref:`create_featurestore <create_featurestore>` function.
.. code-block:: python3
geo.create_datastore(name="ds", path=r'path/to/shp/file_name.shp', workspace='demo')
geo.publish_featurestore(workspace='demo', store_name='ds', pg_table='file_name')
If your data is coming from ``WFS`` url, then use this,
.. code-block:: python3
geo.create_datastore(name="ds", path='http://localhost:8080/geoserver/wfs?request=GetCapabilities', workspace='demo')
geo.publish_featurestore(workspace='demo', store_name='ds', pg_table='wfs_layer_name')
Creating Layer Groups
-------------------------------
A layer group is a grouping of layers and styles that can be accessed as a single layer in a WMS GetMap request.
Layer groups can be created either inside a workspace, or globally without a workspace.
You can create a layer group from layers that have been uploaded previously with the ``create_layergroup`` method.
.. code-block:: python3
# create a new layergroup from 2 existing layers
geo.create_layergroup(
name = "my_fancy_layergroup",
mode = "single",
title = "My Fancy Layergroup Title",
abstract_text = "This is a very fancy Layergroup",
layers = ["fancy_layer_1", "fancy_layer_2"],
workspace = "my_space", #None if you want to create a layergroup outside the workspace
keywords = ["list", "of", "keywords"]
)
# add another layer
geo.add_layer_to_layergroup(
layergroup_name = "my_fancy_layergroup",
layergroup_workspace = "my_space",
layer_name = "superfancy_layer",
layer_workspace = "my_space"
)
# remove a layer
geo.remove_layer_from_layergroup(
layergroup_name = "my_fancy_layergroup",
layergroup_workspace = "my_space",
layer_name = "superfancy_layer",
layer_workspace = "my_space"
)
Uploading and publishing styles
-------------------------------
**WARNING:** As of version 2.9.0, the required dependency ``gdal``, ``matplotlib`` and ``seaborn`` was converted into an optional dependency. Fresh installations of this library will require that you then install ``gdal``, ``matplotlib`` and ``seaborn`` yourself with ``pip install gdal matplotlib seaborn``.
It is used for uploading ``SLD`` files and publish style. If the style name already exists, you can pass the parameter ``overwrite=True`` to overwrite it. The name of the style will be name of the uploaded file name.
Before uploading ``SLD`` file, please check the version of your sld file. By default the version of sld will be ``1.0.0``. As I noticed, by default the QGIS will provide the .sld file of version ``1.0.0`` for raster data version ``1.1.0`` for vector data.
.. code-block:: python3
geo.upload_style(path=r'path\to\sld\file.sld', workspace='demo')
geo.publish_style(layer_name='geoserver_layer_name', style_name='sld_file_name', workspace='demo')
Creating and applying dynamic styles based on the raster coverages
------------------------------------------------------------------
**WARNING:** As of version 2.9.0, the required dependency ``gdal`` was converted into an optional dependency. Fresh installations of this library will require that you then install ``gdal`` yourself with ``pip install gdal``.
It is used to create the style file for raster data. You can get the ``color_ramp`` name from `matplotlib colormaps <https://matplotlib.org/3.3.0/tutorials/colors/colormaps.html>`_. By default ``color_ramp='RdYlGn'`` (red to green color ramp).
.. code-block:: python
geo.create_coveragestyle(raster_path=r'path\to\raster\file.tiff', style_name='style_1', workspace='demo', color_ramp='RdBu_r')
geo.publish_style(layer_name='geoserver_layer_name', style_name='raster_file_name', workspace='demo')
.. note::
If you have your own custom color and the custom label, you can pass the ``values:color`` pair as below to generate the map with dynamic legend.
.. code-block:: python
c_ramp = {
'label 1 value': '#ffff55',
'label 2 value': '#505050',
'label 3 value': '#404040',
'label 4 value': '#333333'
}
geo.create_coveragestyle(raster_path=r'path\to\raster\file.tiff',
style_name='style_2',
workspace='demo',
color_ramp=c_ramp,
cmap_type='values')
# you can also pass this list of color if you have your custom colors for the ``color_ramp``
'''
geo.create_coveragestyle(raster_path=r'path\to\raster\file.tiff',
style_name='style_3',
workspace='demo',
color_ramp=[#ffffff, #453422, #f0f0f0, #aaaaaa],
cmap_type='values')
'''
geo.publish_style(layer_name='geoserver_layer_name', style_name='raster_file_name', workspace='demo')
For generating the style for ``classified raster``, you can pass the another parameter called ``cmap_type='values'`` as,
.. code-block:: python
geo.create_coveragestyle(raster_path=r'path\to\raster\file.tiff', style_name='style_1', workspace='demo', color_ramp='RdYiGn', cmap_type='values')
.. list-table:: Options for ``create_coveragestyle``
:widths: 15 15 15 55
:header-rows: 1
* - Option
- Type
- Default
- Description
* - style_name
- string
- file_name
- This is optional field. If you don't pass the style_name parameter, then it will take the raster file name as the default name of style in geoserver
* - raster_path
- path
- None
- path to the raster file (Required)
* - workspace
- string
- None
- The name of the workspace. Optional field. It will take the default workspace of geoserver if nothing is provided
* - color_ramp
- string, list, dict
- RdYiGn
- The color ramp name. The name of the color ramp can be found here in `matplotlib colormaps <https://matplotlib.org/3.3.0/tutorials/colors/colormaps.html>`_
* - overwrite
- boolean
- False
- For overwriting the previous style file in geoserver
Creating feature styles
-----------------------
**WARNING:** As of version 2.9.0, the required dependency ``gdal``, ``matplotlib`` and ``seaborn`` was converted into an optional dependency. Fresh installations of this library will require that you then install ``gdal``, ``matplotlib`` and ``seaborn`` yourself with ``pip install gdal matplotlib seaborn``.
It is used for creating the style for ``point``, ``line`` and ``polygon`` dynamically. It currently supports three different types of feature styles:
1. ``Outline featurestyle``: For creating the style which have only boundary color but not the fill style
2. ``Catagorized featurestyle``: For creating catagorized dataset
3. ``Classified featurestyle``: Classify the input data and style it: (For now, it only supports polygon geometry)
.. code-block:: python
geo.create_outline_featurestyle(style_name='new_style', color="#3579b1", geom_type='polygon', workspace='demo')
geo.create_catagorized_featurestyle(style_name='name_of_style', column_name='name_of_column', column_distinct_values=[1,2,3,4,5,6,7], workspace='demo')
geo.create_classified_featurestyle(style_name='name_of_style', column_name='name_of_column', column_distinct_values=[1,2,3,4,5,6,7], workspace='demo')
.. note::
* The ``geom_type`` must be either ``point``, ``line`` or ``polygon``.
* The ``color_ramp`` name can be obtained from `matplotlib colormaps <https://matplotlib.org/3.3.0/tutorials/colors/colormaps.html>`_.
The options for creating categorized/classified `featurestyles` are as follows,
.. list-table:: Options for ``create_catagorized_featurestyle`` and ``create_classified_featurestyle``
:widths: 15 15 15 55
:header-rows: 1
* - Option
- Type
- Default
- Description
* - style_name
- string
- file_name
- This is optional field. If you don't pass the style_name parameter, then it will take the raster file name as the default name of style in geoserver
* - column_name
- string
- None
- The name of the column, based on which the style will be generated
* - column_distinct_values
- list/array
- None
- The column distinct values based on which the style will be applied/classified. This option is only available for ``create_classified_featurestyle``
* - workspace
- string
- None
- The name of the workspace. Optional field. It will take the default workspace of geoserver if nothing is provided
* - color_ramp
- string
- RdYiGn
- The color ramp name. The name of the color ramp can be found here in `matplotlib colormaps <https://matplotlib.org/3.3.0/tutorials/colors/colormaps.html>`_
* - geom_type
- string
- polygon
- The geometry type, available options are ``point``, ``line`` or ``polygon``
* - outline_color
- color hex value
- '#3579b1'
- The outline color of the polygon/line
* - overwrite
- boolean
- False
- For overwriting the previous style file in geoserver
Deletion requests examples
^^^^^^^^^^^^^^^^^^^^^^^^^^
.. code-block:: python
# delete workspace
geo.delete_workspace(workspace='demo')
# delete layer
geo.delete_layer(layer_name='agri_final_proj', workspace='demo')
# delete feature store, i.e. remove postgresql connection
geo.delete_featurestore(featurestore_name='ftry', workspace='demo')
# delete coveragestore, i.e. delete raster store
geo.delete_coveragestore(coveragestore_name='agri_final_proj', workspace='demo')
# delete style file
geo.delete_style(style_name='kamal2', workspace='demo')
Some get request examples
^^^^^^^^^^^^^^^^^^^^^^^^^
.. code-block:: python
# get geoserver version
version = geo.get_version()
print(version)
# get ststem info
status = geo.get_status()
system_status = geo.get_system_status()
# get workspace
workspace = geo.get_workspace(workspace='workspace_name')
# get default workspace
dw = geo.get_default_wokspace()
# get all the workspaces
workspaces = geo.get_workspaces()
# get datastore
datastore = geo.get_datastore(store_name='store')
# get all the datastores
datastores = geo.get_datastores()
# get coveragestore
cs = geo.get_coveragestore(coveragestore_name='cs')
# get all the coveragestores
css = geo.get_coveragestores()
# get layer
layer = geo.get_layer(layer_name='layer_name')
# get all the layers
layers = geo.get_layers()
# get layergroup
layergroup = geo.get_layergroup('layergroup_name')
# get all the layers
layergroups = geo.get_layergroups()
# get style
style = geo.get_style(style_name='style_name')
# get all the styles
styles = geo.get_styles()
# get featuretypes
featuretypes = geo.get_featuretypes(store_name='store_name')
# get feature attribute
fa = geo.get_feature_attribute(feature_type_name='ftn', workspace='ws', store_name='sn')
# get feature store
fs = geo.get_featurestore(store_name='sn', workspace='ws')
Special functions
^^^^^^^^^^^^^^^^^
.. code-block:: python
# Reloads the GeoServer catalog and configuration from disk. This operation is used in cases where an external tool has modified the on-disk configuration. This operation will also force GeoServer to drop any internal caches and reconnect to all data stores.
geo.reload()
# Resets all store, raster, and schema caches. This operation is used to force GeoServer to drop all caches and store connections and reconnect to each of them the next time they are needed by a request. This is useful in case the stores themselves cache some information about the data structures they manage that may have changed in the meantime.
geo.reset()
# set default workspace
geo.set_default_workspace(workspace='workspace_name')
Global parameters for most functions
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
The following parameters are common to most functions/methods:
* ``workspace``: If workspace is not provided, the function will take the ``default`` workspace.
* ``overwrite``: This parameter takes only the boolean value. In most of the create method, the ``overwrite`` parameter is available. The default value is ``False``. But if you set it to True, the method will be in update mode.
================================================
FILE: docs/source/index.rst
================================================
.. geoserver-rest documentation master file, created by
sphinx-quickstart on Wed Mar 10 14:46:38 2021.
You can adapt this file completely to your liking, but it should at least
contain the root `toctree` directive.
Welcome to geoserver-rest documentation!
========================================
.. toctree::
:maxdepth: 2
about
installation
how_to_use
advanced_uses
change_log
license
contribution
acknowledgements
geo
Checkout the video tutorial on geoserver-rest below,
.. raw:: html
<iframe width="560" height="315" src="https://www.youtube.com/embed/nXvzmbGukeE" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>
================================================
FILE: docs/source/installation.rst
================================================
Installation
=============
.. warning::
As of version 2.9.0, the required dependency ``gdal``, ``matplotlib`` and ``seaborn`` was converted into an optional dependency. Fresh installations of this library will require that you then install ``gdal``, ``matplotlib`` and ``seaborn`` yourself with ``pip install gdal matplotlib seaborn``.
Conda installation
^^^^^^^^^^^^^^^^^^
The ``geoserver-rest`` can be installed from either ``conda-forge`` channel as below:
.. code-block:: shell
$ conda install -c conda-forge geoserver-rest
Pip installation
^^^^^^^^^^^^^^^^
The ``geoserver-rest`` library can be installed using ``pip`` as below:
.. code-block:: shell
$ pip install geoserver-rest
But best way to get all the functationality is to install the optional dependencies as well:
.. code-block:: shell
$ pip install geoserver-rest[all]
If you want to install the geoserver-rest library with the optional dependencies (this will be useful if you are planning to create dynamic style files based on your dataset. Explore ``create_coveragestyle``, ``upload_style`` etc functions), you need to install the following dependencies first:
* `GDAL <https://gdal.org/>`_
* `matplotlib <https://matplotlib.org/>`_
* `seaborn <https://seaborn.pydata.org/>`_
Dependencies installation in Windows
------------------------------------
.. warning::
As of March 2022, ``pipwin`` has been deprecated and is no longer maintained. Do not use this method.
For Windows, the ``gdal`` dependency can be complex to install. There are a handful of ways to install ``gdal`` in Windows.
One way is install the wheel directly from the `Geospatial library wheels for Python Windows <https://github.com/cgohlke/geospatial-wheels>`_ releases page. Be sure to select the wheel for your system from the latest release and install it using pip install command:
.. code-block:: shell
# For Python3.10 on Windows 64-bit systems
$ pip.exe install https://github.com/cgohlke/geospatial-wheels/releases/download/<release_version>/GDAL-3.7.1-cp310-cp310-win_amd64.whl
$ pip.exe install seaborn matplotlib
Another way is to use the GDAL network installer binary package available at: `OSGeo4W <https://trac.osgeo.org/osgeo4w/>`_.
macOS installation
------------------
For macOS, we suggest using the `homebrew` package manager to install ``gdal``. Once ``homebrew`` is installed, ``gdal`` can be installed using following method:
.. code-block:: shell
$ brew update
$ brew install gdal
$ pip3 install pygdal=="$(gdalinfo --version | awk '{print $2}' | sed s'/.$//')"
Linux installation
------------------
For Ubuntu specifically, we suggest installing ``gdal`` from the ``ubuntugis`` PPA:
.. code-block:: shell
$ sudo add-apt-repository ppa:ubuntugis/ppa
$ sudo apt update -y
$ sudo apt upgrade -y
$ sudo apt install gdal-bin libgdal-dev
For other versions of Linux, simply use your package manager to install ``gdal``.
.. code-block:: shell
# Debian, Mint, etc.
$ sudo apt install gdal-bin libgdal-dev
# Fedora, RHEL, etc.
$ sudo yum install gdal gdal-devel
# Arch, Manjaro, etc.
$ sudo pacman -S gdal
# Void Linux
$ sudo xbps-install -S libgdal libgdal-devel
Now the ``pygdal`` and ``geoserver-rest`` libraries can be installed using ``pip``:
.. code-block:: shell
$ pip install pygdal=="$(gdal-config --version).*"
$ pip install geoserver-rest
================================================
FILE: docs/source/license.rst
================================================
License
=========
MIT License
^^^^^^^^^^^^
Copyright (c) 2020, Geoinformatics Center, Asian Institute of Technology
Copyright (c) 2020, Tek Kshetri
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
================================================
FILE: geo/Calculation_gdal.py
================================================
import os
try:
from osgeo import gdal # noqa
except ImportError:
import gdal # noqa
except ImportError:
raise ImportError("Package `gdal` is required to run this function. Install it with `pip install gdal`.")
def raster_value(path: str) -> dict:
file = os.path.basename(path)
file_format = os.path.splitext(file)[1]
file_name = file.split(".")[0]
valid_extension = [".tif", ".tiff", ".gtiff"]
if file_format.lower() in valid_extension:
try:
gtif = gdal.Open(path)
srcband = gtif.GetRasterBand(1)
srcband.ComputeStatistics(0)
n = srcband.GetMaximum() - srcband.GetMinimum() + 1
n = int(n)
min_value = srcband.GetMinimum()
max_value = srcband.GetMaximum()
result = {
"N": n,
"min": min_value,
"max": max_value,
"file_name": file_name,
}
return result
except Exception as error:
print("An error occured when computing band statistics: ", error)
raise error
else:
print("sorry, file format is incorrect")
================================================
FILE: geo/Geoserver.py
================================================
# inbuilt libraries
import os
from typing import List, Optional, Set, Union, Dict, Iterable, Any
from pathlib import Path
# third-party libraries
import requests
from xmltodict import parse, unparse
# custom functions
from .supports import prepare_zip_file, is_valid_xml, is_surrounded_by_quotes
def _parse_request_options(request_options: Dict[str, Any]):
"""
Parse request options.
Parameters
----------
request_options : dict
The request options to parse.
Returns
-------
dict
The parsed request options.
"""
return request_options if request_options is not None else {}
# Custom exceptions.
class GeoserverException(Exception):
"""
Custom exception for Geoserver errors.
Parameters
----------
status : int
The status code of the error.
message : str
The error message.
"""
def __init__(self, status, message):
self.status = status
self.message = message
super().__init__(f"Status : {self.status} - {self.message}")
# call back class for reading the data
class DataProvider:
"""
Data provider for reading data.
Parameters
----------
data : str
The data to be read.
"""
def __init__(self, data):
self.data = data
self.finished = False
def read_cb(self, size):
"""
Read callback.
Parameters
----------
size : int
The size of the data to read.
Returns
-------
str
The read data.
"""
assert len(self.data) <= size
if not self.finished:
self.finished = True
return self.data
else:
# Nothing more to read
return ""
# callback class for reading the files
class FileReader:
"""
File reader for reading files.
Parameters
----------
fp : file object
The file object to read from.
"""
def __init__(self, fp):
self.fp = fp
def read_callback(self, size):
"""
Read callback.
Parameters
----------
size : int
The size of the data to read.
Returns
-------
str
The read data.
"""
return self.fp.read(size)
class Geoserver:
"""
Geoserver class to interact with GeoServer REST API.
Attributes
----------
service_url : str
The URL for the GeoServer instance.
username : str
Login name for session.
password: str
Password for session.
request_options : dict
Additional parameters to be sent with each request.
"""
def __init__(
self,
service_url: str = "http://localhost:8080/geoserver", # default deployment url during installation
username: str = "admin", # default username during geoserver installation
password: str = "geoserver", # default password during geoserver installation
request_options: Dict[
str, Any
] = None, # additional parameters to be sent with each request
):
self.service_url = service_url
self.username = username
self.password = password
self.request_options = request_options if request_options is not None else {}
def _requests(self, method: str, url: str, **kwargs) -> requests.Response:
"""
Convenience wrapper to the requests library which automatically handles the authentication, as well as additional options to be passed to each request.
Parameters
----------
method : str
Which method to use (`get`, `post`, `put`, `delete`)
url : str
URL to which to make the request
kwargs : dict
Additional arguments to pass to the request.
Returns
-------
requests.Response
The response object.
"""
if method.lower() == "post":
return requests.post(
url,
auth=(self.username, self.password),
**kwargs,
**self.request_options,
)
elif method.lower() == "get":
return requests.get(
url,
auth=(self.username, self.password),
**kwargs,
**self.request_options,
)
elif method.lower() == "put":
return requests.put(
url,
auth=(self.username, self.password),
**kwargs,
**self.request_options,
)
elif method.lower() == "delete":
return requests.delete(
url,
auth=(self.username, self.password),
**kwargs,
**self.request_options,
)
else:
raise Exception("unsupported http method name.")
# _______________________________________________________________________________________________
#
# GEOSERVER AND SERVER SPECIFIC METHODS
# _______________________________________________________________________________________________
#
def get_manifest(self):
"""
Returns the manifest of the GeoServer. The manifest is a JSON of all the loaded JARs on the GeoServer server.
Returns
-------
dict
The manifest of the GeoServer.
"""
url = "{}/rest/about/manifest.json".format(self.service_url)
r = self._requests("get", url)
if r.status_code == 200:
return r.json()
else:
raise GeoserverException(r.status_code, r.content)
def get_version(self):
"""
Returns the version of the GeoServer as JSON. It contains only the details of the high level components: GeoServer, GeoTools, and GeoWebCache.
Returns
-------
dict
The version information of the GeoServer.
"""
url = "{}/rest/about/version.json".format(self.service_url)
r = self._requests("get", url)
if r.status_code == 200:
return r.json()
else:
raise GeoserverException(r.status_code, r.content)
def get_status(self):
"""
Returns the status of the GeoServer. It shows the status details of all installed and configured modules.
Returns
-------
dict
The status of the GeoServer.
"""
url = "{}/rest/about/status.json".format(self.service_url)
r = self._requests("get", url)
if r.status_code == 200:
return r.json()
else:
raise GeoserverException(r.status_code, r.content)
def get_system_status(self):
"""
Returns the system status of the GeoServer. It returns a list of system-level information. Major operating systems (Linux, Windows, and MacOS) are supported out of the box.
Returns
-------
dict
The system status of the GeoServer.
"""
url = "{}/rest/about/system-status.json".format(self.service_url)
r = self._requests("get", url)
if r.status_code == 200:
return r.json()
else:
raise GeoserverException(r.status_code, r.content)
def reload(self):
"""
Reloads the GeoServer catalog and configuration from disk.
This operation is used in cases where an external tool has modified the on-disk configuration. This operation will also force GeoServer to drop any internal caches and reconnect to all data stores.
Returns
-------
str
The status code of the reload operation.
"""
url = "{}/rest/reload".format(self.service_url)
r = self._requests("post", url)
if r.status_code == 200:
return "Status code: {}".format(r.status_code)
else:
raise GeoserverException(r.status_code, r.content)
def reset(self):
"""
Resets all store, raster, and schema caches. This operation is used to force GeoServer to drop all caches and store connections and reconnect to each of them the next time they are needed by a request. This is useful in case the stores themselves cache some information about the data structures they manage that may have changed in the meantime.
Returns
-------
str
The status code of the reset operation.
"""
url = "{}/rest/reset".format(self.service_url)
r = self._requests("post", url)
if r.status_code == 200:
return "Status code: {}".format(r.status_code)
else:
raise GeoserverException(r.status_code, r.content)
# _______________________________________________________________________________________________
#
# WORKSPACES
# _______________________________________________________________________________________________
#
def get_default_workspace(self):
"""
Returns the default workspace.
Returns
-------
dict
The default workspace.
"""
url = "{}/rest/workspaces/default".format(self.service_url)
r = self._requests("get", url)
if r.status_code == 200:
return r.json()
else:
raise GeoserverException(r.status_code, r.content)
def get_workspace(self, workspace):
"""
Get the name of a workspace if it exists.
Parameters
----------
workspace : str
The name of the workspace.
Returns
-------
dict
The workspace information.
"""
url = "{}/rest/workspaces/{}.json".format(self.service_url, workspace)
r = self._requests("get", url, params={"recurse": "true"})
if r.status_code == 200:
return r.json()
else:
raise GeoserverException(r.status_code, r.content)
def get_workspaces(self):
"""
Returns all the workspaces.
Returns
-------
dict
All the workspaces.
"""
url = "{}/rest/workspaces".format(self.service_url)
r = self._requests("get", url)
if r.status_code == 200:
return r.json()
else:
raise GeoserverException(r.status_code, r.content)
def set_default_workspace(self, workspace: str):
"""
Set the default workspace.
Parameters
----------
workspace : str
The name of the workspace to set as default.
Returns
-------
str
The status code of the operation.
"""
url = "{}/rest/workspaces/default".format(self.service_url)
data = "<workspace><name>{}</name></workspace>".format(workspace)
r = self._requests("put", url, data=data, headers={"content-type": "text/xml"})
if r.status_code == 200:
return "Status code: {}, default workspace {} set!".format(
r.status_code, workspace
)
else:
raise GeoserverException(r.status_code, r.content)
def create_workspace(self, workspace: str):
"""
Create a new workspace in GeoServer. The GeoServer workspace URL will be the same as the name of the workspace.
Parameters
----------
workspace : str
The name of the workspace to create.
Returns
-------
str
The status code and message of the operation.
"""
url = "{}/rest/workspaces".format(self.service_url)
data = "<workspace><name>{}</name></workspace>".format(workspace)
headers = {"content-type": "text/xml"}
r = self._requests("post", url, data=data, headers=headers)
if r.status_code == 201:
return "{} Workspace {} created!".format(r.status_code, workspace)
else:
raise GeoserverException(r.status_code, r.content)
def delete_workspace(self, workspace: str):
"""
Delete a workspace.
Parameters
----------
workspace : str
The name of the workspace to delete.
Returns
-------
str
The status code and message of the operation.
"""
payload = {"recurse": "true"}
url = "{}/rest/workspaces/{}".format(self.service_url, workspace)
r = self._requests("delete", url, params=payload)
if r.status_code == 200:
return "Status code: {}, delete workspace".format(r.status_code)
else:
raise GeoserverException(r.status_code, r.content)
# _______________________________________________________________________________________________
#
# DATASTORES
# _______________________________________________________________________________________________
#
def get_datastore(self, store_name: str, workspace: Optional[str] = None):
"""
Return the data store in a given workspace. If workspace is not provided, it will take the default workspace.
Parameters
----------
store_name : str
The name of the data store.
workspace : str, optional
The name of the workspace.
Returns
-------
dict
The data store information.
"""
if workspace is None:
workspace = "default"
url = "{}/rest/workspaces/{}/datastores/{}".format(
self.service_url, workspace, store_name
)
r = self._requests("get", url)
if r.status_code == 200:
return r.json()
else:
raise GeoserverException(r.status_code, r.content)
def get_datastores(self, workspace: Optional[str] = None):
"""
List all data stores in a workspace. If workspace is not provided, it will list all the datastores inside the default workspace.
Parameters
----------
workspace : str, optional
The name of the workspace.
Returns
-------
dict
The list of data stores.
"""
if workspace is None:
workspace = "default"
url = "{}/rest/workspaces/{}/datastores.json".format(
self.service_url, workspace
)
r = self._requests("get", url)
if r.status_code == 200:
return r.json()
else:
raise GeoserverException(r.status_code, r.content)
# _______________________________________________________________________________________________
#
# COVERAGE STORES
# _______________________________________________________________________________________________
#
def get_coveragestore(
self, coveragestore_name: str, workspace: Optional[str] = None
):
"""
Returns the store name if it exists.
Parameters
----------
coveragestore_name : str
The name of the coverage store.
workspace : str, optional
The name of the workspace.
Returns
-------
dict
The coverage store information.
"""
payload = {"recurse": "true"}
if workspace is None:
workspace = "default"
url = "{}/rest/workspaces/{}/coveragestores/{}.json".format(
self.service_url, workspace, coveragestore_name
)
r = self._requests(method="get", url=url, params=payload)
if r.status_code == 200:
return r.json()
else:
raise GeoserverException(r.status_code, r.content)
def get_coveragestores(self, workspace: str = None):
"""
Returns all the coverage stores inside a specific workspace.
Parameters
----------
workspace : str, optional
The name of the workspace.
Returns
-------
dict
The list of coverage stores.
"""
if workspace is None:
workspace = "default"
url = "{}/rest/workspaces/{}/coveragestores".format(self.service_url, workspace)
r = self._requests("get", url)
if r.status_code == 200:
return r.json()
else:
raise GeoserverException(r.status_code, r.content)
def create_coverage(
self,
workspace: str,
coveragestore: str,
coverage_name: str = None,
navite_name: str = None,
coverage_title: str = None,
):
"""
Create a coverage in a coveragestore.The coveragestore must already exist.
Parameters
----------
workspace: The workspace name
coveragestore: The coveragestore name
coverage_name: The coverage name
native_Name: The coverage's source name
coverage_title: The coverage title
Returns
-------
str
The coverage name.
"""
if not workspace:
raise ValueError("Workspace is required")
if not coveragestore:
raise ValueError("Coveragestore is required")
navite_name = navite_name or coveragestore
coverage_name = coverage_name or navite_name
coverage_title = coverage_title.replace(" ", "_") if coverage_title else None
body = {
"coverage": {
"name": coverage_name,
"nativeName": navite_name,
"title": coverage_title,
"store": {"name": coveragestore},
}
}
url = f"{self.service_url}/rest/workspaces/{workspace}/coverages"
content_type = "application/json"
r = self._requests(
method="post", url=url, json=body, headers={"content-type": content_type}
)
if r.status_code == 201:
return r.text
raise GeoserverException(r.status_code, r.content)
def create_coveragestore(
self,
path,
workspace: Optional[str] = None,
layer_name: Optional[str] = None,
file_type: str = "GeoTIFF",
content_type: str = "image/tiff",
method: str = "file",
):
"""
Creates the coverage store; Data will be uploaded to the server.
Parameters
----------
path : str
The path to the file.
workspace : str, optional
The name of the workspace.
layer_name : str, optional
The name of the coverage store. If not provided, parsed from the file name.
file_type : str
The type of the file.
content_type : str
The content type of the file.
method : str
file | url | external | remote
Returns
-------
dict
The response from the server.
Notes
-----
the path to the file and file_type indicating it is a geotiff, arcgrid or other raster type
"""
if path is None:
raise Exception("You must provide the full path to the raster")
if workspace is None:
workspace = "default"
if layer_name is None:
layer_name = os.path.basename(path)
f = layer_name.split(".")
if len(f) > 0:
layer_name = f[0]
file_type = file_type.lower()
if file_type == "netcdf":
# files such as netcdf contain multiple layers, which means a single coverage name cannot be specified.
url = "{0}/rest/workspaces/{1}/coveragestores/{2}/{3}.{4}".format(
self.service_url, workspace, layer_name, method, file_type
)
else:
url = "{0}/rest/workspaces/{1}/coveragestores/{2}/{3}.{4}?coverageName={2}".format(
self.service_url, workspace, layer_name, method, file_type
)
if method == "file":
headers = {"content-type": content_type, "Accept": "application/json"}
with open(path, "rb") as f:
r = self._requests(method="put", url=url, data=f, headers=headers)
else:
headers = {"content-type": "text/plain", "Accept": "application/json"}
r = self._requests(method="put", url=url, data=path, headers=headers)
if r.status_code == 201:
return r.json()
else:
raise GeoserverException(r.status_code, r.content)
def publish_time_dimension_to_coveragestore(
self,
store_name: Optional[str] = None,
workspace: Optional[str] = None,
presentation: Optional[str] = "LIST",
units: Optional[str] = "ISO8601",
default_value: Optional[str] = "MINIMUM",
content_type: str = "application/xml; charset=UTF-8",
):
"""
Create time dimension in coverage store to publish time series in GeoServer.
Parameters
----------
store_name : str, optional
The name of the coverage store.
workspace : str, optional
The name of the workspace.
presentation : str, optional
The presentation style.
units : str, optional
The units of the time dimension.
default_value : str, optional
The default value of the time dimension.
content_type : str
The content type of the request.
Returns
-------
dict
The response from the server.
Notes
-----
More about time support in geoserver WMS you can read here:
https://docs.geoserver.org/master/en/user/services/wms/time.html
"""
url = "{0}/rest/workspaces/{1}/coveragestores/{2}/coverages/{2}".format(
self.service_url, workspace, store_name
)
headers = {"content-type": content_type}
time_dimension_data = (
"<coverage>"
"<enabled>true</enabled>"
"<metadata>"
"<entry key='time'>"
"<dimensionInfo>"
"<enabled>true</enabled>"
"<presentation>{}</presentation>"
"<units>{}</units>"
"<defaultValue>"
"<strategy>{}</strategy>"
"</defaultValue>"
"</dimensionInfo>"
"</entry>"
"</metadata>"
"</coverage>".format(presentation, units, default_value)
)
r = self._requests(
method="put", url=url, data=time_dimension_data, headers=headers
)
if r.status_code in [200, 201]:
return r.json()
else:
raise GeoserverException(r.status_code, r.content)
# _______________________________________________________________________________________________
#
# LAYERS
# _______________________________________________________________________________________________
#
def get_layer(self, layer_name: str, workspace: Optional[str] = None):
"""
Returns the layer by layer name.
Parameters
----------
layer_name : str
The name of the layer.
workspace : str, optional
The name of the workspace.
Returns
-------
dict
The layer information.
"""
url = "{}/rest/layers/{}".format(self.service_url, layer_name)
if workspace is not None:
url = "{}/rest/workspaces/{}/layers/{}".format(
self.service_url, workspace, layer_name
)
r = self._requests("get", url)
if r.status_code == 200:
return r.json()
else:
raise GeoserverException(r.status_code, r.content)
def get_layers(self, workspace: Optional[str] = None):
"""
Get all the layers from GeoServer. If workspace is None, it will list all the layers from GeoServer.
Parameters
----------
workspace : str, optional
The name of the workspace.
Returns
-------
dict
The list of layers.
"""
url = "{}/rest/layers".format(self.service_url)
if workspace is not None:
url = "{}/rest/workspaces/{}/layers".format(self.service_url, workspace)
r = self._requests("get", url)
if r.status_code == 200:
return r.json()
else:
raise GeoserverException(r.status_code, r.content)
def delete_layer(self, layer_name: str, workspace: Optional[str] = None):
"""
Delete a layer.
Parameters
----------
layer_name : str
The name of the layer to delete.
workspace : str, optional
The name of the workspace.
Returns
-------
str
The status code and message of the operation.
"""
payload = {"recurse": "true"}
url = "{}/rest/workspaces/{}/layers/{}".format(
self.service_url, workspace, layer_name
)
if workspace is None:
url = "{}/rest/layers/{}".format(self.service_url, layer_name)
r = self._requests(method="delete", url=url, params=payload)
if r.status_code == 200:
return "Status code: {}, delete layer".format(r.status_code)
else:
raise GeoserverException(r.status_code, r.content)
# _______________________________________________________________________________________________
#
# LAYER GROUPS
# _______________________________________________________________________________________________
#
def get_layergroups(self, workspace: Optional[str] = None):
"""
Returns all the layer groups from GeoServer. If workspace is None, it will list all the layer groups from GeoServer.
Parameters
----------
workspace : str, optional
The name of the workspace.
Returns
-------
dict
The list of layer groups.
Notes
-----
If workspace is None, it will list all the layer groups from geoserver.
"""
url = "{}/rest/layergroups".format(self.service_url)
if workspace is not None:
url = "{}/rest/workspaces/{}/layergroups".format(
self.service_url, workspace
)
r = self._requests("get", url)
if r.status_code == 200:
return r.json()
else:
raise GeoserverException(r.status_code, r.content)
def get_layergroup(self, layer_name: str, workspace: Optional[str] = None):
"""
Returns the layer group by layer group name.
Parameters
----------
layer_name : str
The name of the layer group.
workspace : str, optional
The name of the workspace.
Returns
-------
dict
The layer group information.
"""
url = "{}/rest/layergroups/{}".format(self.service_url, layer_name)
if workspace is not None:
url = "{}/rest/workspaces/{}/layergroups/{}".format(
self.service_url, workspace, layer_name
)
r = self._requests("get", url)
if r.status_code == 200:
return r.json()
else:
raise GeoserverException(r.status_code, r.content)
def create_layergroup(
self,
name: str = "geoserver-rest-layergroup",
mode: str = "single",
title: str = "geoserver-rest layer group",
abstract_text: str = "A new layergroup created with geoserver-rest python package",
layers: List[str] = [],
workspace: Optional[str] = None,
formats: str = "html",
metadata: List[dict] = [],
keywords: List[str] = [],
) -> str:
"""
Creates the Layergroup.
Parameters
----------
name : str
The name of the layer group.
mode : str
The mode of the layer group.
title : str
The title of the layer group.
abstract_text : str
The abstract text of the layer group.
layers : list
The list of layers in the layer group.
workspace : str, optional
The name of the workspace.
formats : str, optional
The format of the layer group.
metadata : list, optional
The metadata of the layer group.
keywords : list, optional
The keywords of the layer group.
Returns
-------
str
The URL of the created layer group.
Notes
-----
title is a human readable text for the layergroup
abstract_text is a long text, like a brief info about the layergroup
workspace is Optional(Global Layergroups don't need workspace).A layergroup can exist without a workspace.
"""
assert isinstance(name, str), "Name must be of type String:''"
assert isinstance(mode, str), "Mode must be of type String:''"
assert isinstance(title, str), "Title must be of type String:''"
assert isinstance(abstract_text, str), "Abstract text must be of type String:''"
assert isinstance(formats, str), "Format must be of type String:''"
assert isinstance(
metadata, list
), "Metadata must be of type List of dict:[{'about':'geoserver rest data metadata','content_url':'link to content url'}]"
assert isinstance(
keywords, list
), "Keywords must be of type List:['keyword1','keyword2'...]"
assert isinstance(
layers, list
), "Layers must be of type List:['layer1','layer2'...]"
if workspace:
assert isinstance(workspace, str), "Workspace must be of type String:''"
# check if the workspace is valid in GeoServer
if self.get_workspace(workspace) is None:
raise Exception("Workspace is not valid in GeoServer Instance")
supported_modes: Set = {
"single",
"opaque",
"named",
"container",
"eo",
}
supported_formats: Set = {"html", "json", "xml"}
if mode.lower() != "single" and mode.lower() not in supported_modes:
raise Exception(
f"Mode not supported. Acceptable modes are : {supported_modes}"
)
if formats.lower() != "html" and formats.lower() not in supported_formats:
raise Exception(
f"Format not supported. Acceptable formats are : {supported_formats}"
)
# check if it already exist in GeoServer
try:
existing_layergroup = self.get_layergroup(name, workspace=workspace)
except GeoserverException:
existing_layergroup = None
if existing_layergroup is not None:
raise Exception(f"Layergroup: {name} already exist in GeoServer instance")
if len(layers) == 0:
raise Exception("No layer provided!")
else:
for layer in layers:
# check if it is valid in geoserver
try:
# Layer check
self.get_layer(
layer_name=layer,
workspace=workspace if workspace is not None else None,
)
except GeoserverException:
try:
# Layer group check
self.get_layergroup(
layer_name=layer,
workspace=workspace if workspace is not None else None,
)
except GeoserverException:
raise Exception(
f"Layer: {layer} is not a valid layer in the GeoServer instance"
)
skeleton = ""
if workspace:
skeleton += f"<workspace><name>{workspace}</name></workspace>"
# metadata structure = [{about:"",content_url:""},{...}]
metadata_xml_list = []
if len(metadata) >= 1:
for meta in metadata:
metadata_about = meta.get("about")
metadata_content_url = meta.get("content_url")
metadata_xml_list.append(
f"""
<metadataLink>
<type>text/plain</type>
<about>{metadata_about}</about>
<metadataType>ISO19115:2003</metadataType>
<content>{metadata_content_url}</content>
</metadataLink>
"""
)
metadata_xml = f"<metadataLinks>{''.join(['{}'] * len(metadata_xml_list)).format(*metadata_xml_list)}</metadataLinks>"
skeleton += metadata_xml
layers_xml_list: List[str] = []
for layer in layers:
published_type = "layer"
try:
# Layer check
self.get_layer(
layer_name=layer,
workspace=workspace if workspace is not None else None,
)
except GeoserverException: # It's a layer group
published_type = "layerGroup"
layers_xml_list.append(
f"""<published type="{published_type}">
<name>{layer}</name>
<link>{self.service_url}/layers/{layer}.xml</link>
</published>
"""
)
layers_xml: str = (
f"<publishables>{''.join(['{}'] * len(layers)).format(*layers_xml_list)}</publishables>"
)
skeleton += layers_xml
if len(keywords) >= 1:
keyword_xml_list: List[str] = [
f"<keyword>{keyword}</keyword>" for keyword in keywords
]
keywords_xml: str = (
f"<keywords>{''.join(['{}'] * len(keywords)).format(*keyword_xml_list)}</keywords>"
)
skeleton += keywords_xml
data = f"""
<layerGroup>
<name>{name}</name>
<mode>{mode}</mode>
<title>{title}</title>
<abstractTxt>{abstract_text}</abstractTxt>
{skeleton}
</layerGroup>
"""
url = f"{self.service_url}/rest/layergroups/"
r = self._requests(
method="post", url=url, data=data, headers={"content-type": "text/xml"}
)
if r.status_code == 201:
layergroup_url = f"{self.service_url}/rest/layergroups/{name}.{formats}"
return f"layergroup created successfully! Layergroup link: {layergroup_url}"
else:
raise GeoserverException(r.status_code, r.content)
def update_layergroup(
self,
layergroup_name,
title: Optional[str] = None,
abstract_text: Optional[str] = None,
formats: str = "html",
metadata: List[dict] = [],
keywords: List[str] = [],
) -> str:
"""
Updates a Layergroup.
Parameters
----------
layergroup_name: str
The name of the layergroup to update.
title : str, optional
The new title for the layergroup.
abstract_text : str, optional
The new abstract text for the layergroup.
formats : str, optional
The format of the response. Default is "html".
metadata : list of dict, optional
List of metadata entries where each entry is a dictionary with "about" and "content_url" keys.
keywords : list of str, optional
List of keywords associated with the layergroup.
Returns
-------
str
A success message indicating that the layergroup was updated.
Raises
------
GeoserverException
If there is an issue updating the layergroup.
"""
# check if layergroup is valid in Geoserver
if self.get_layergroup(layer_name=layergroup_name) is None:
raise Exception(
f"Layer group: {layergroup_name} is not a valid layer group in the Geoserver instance"
)
if title is not None:
assert isinstance(title, str), "Title must be of type String:''"
if abstract_text is not None:
assert isinstance(
abstract_text, str
), "Abstract text must be of type String:''"
assert isinstance(formats, str), "Format must be of type String:''"
assert isinstance(
metadata, list
), "Metadata must be of type List of dict:[{'about':'geoserver rest data metadata','content_url':'lint to content url'}]"
assert isinstance(
keywords, list
), "Keywords must be of type List:['keyword1','keyword2'...]"
supported_formats: Set = {"html", "json", "xml"}
if formats.lower() != "html" and formats.lower() not in supported_formats:
raise Exception(
f"Format not supported. Acceptable formats are : {supported_formats}"
)
skeleton = ""
if title:
skeleton += f"<title>{title}</title>"
if abstract_text:
skeleton += f"<abstractTxt>{abstract_text}</abstractTxt>"
metadata_xml_list = []
if len(metadata) >= 1:
for meta in metadata:
metadata_about = meta.get("about")
metadata_content_url = meta.get("content_url")
metadata_xml_list.append(
f"""
<metadataLink>
<type>text/plain</type>
<about>{metadata_about}</about>
<metadataType>ISO19115:2003</metadataType>
<content>{metadata_content_url}</content>
</metadataLink>
"""
)
metadata_xml = f"<metadataLinks>{''.join(['{}'] * len(metadata_xml_list)).format(*metadata_xml_list)}</metadataLinks>"
skeleton += metadata_xml
if len(keywords) >= 1:
keyword_xml_list: List[str] = [
f"<keyword>{keyword}</keyword>" for keyword in keywords
]
keywords_xml: str = (
f"<keywords>{''.join(['{}'] * len(keyword_xml_list)).format(*keyword_xml_list)}</keywords>"
)
skeleton += keywords_xml
data = f"""
<layerGroup>
{skeleton}
</layerGroup>
"""
url = f"{self.service_url}/rest/layergroups/{layergroup_name}"
r = self._requests(
method="put",
url=url,
data=data,
headers={"content-type": "text/xml", "accept": "application/xml"},
)
if r.status_code == 200:
layergroup_url = (
f"{self.service_url}/rest/layergroups/{layergroup_name}.{formats}"
)
return f"layergroup updated successfully! Layergroup link: {layergroup_url}"
else:
raise GeoserverException(r.status_code, r.content)
def delete_layergroup(
self, layergroup_name: str, workspace: Optional[str] = None
) -> str:
"""
Delete a layer group from the geoserver and raise an exception
in case the layer group does not exist, or the geoserver is unavailable.
Parameters
----------
layergroup_name: str
The name of the layer group to be deleted.
workspace: str, optional
The workspace the layergroup is located in.
Returns
-------
str
A success message indicating that the layer group was deleted.
Raises
------
GeoserverException
If there is an issue deleting the layergroup.
"""
# raises an exception in case the layer group doesn't exist
self.get_layergroup(layer_name=layergroup_name, workspace=workspace)
if workspace is None:
url = f"{self.service_url}/rest/layergroups/{layergroup_name}"
else:
url = f"{self.service_url}/rest/workspaces/{workspace}/layergroups/{layergroup_name}"
r = self._requests(url=url, method="delete")
if r.status_code == 200:
return "Layer group deleted successfully"
else:
raise GeoserverException(r.status_code, r.content)
def add_layer_to_layergroup(
self,
layer_name: str,
layer_workspace: str,
layergroup_name: str,
layergroup_workspace: str = None,
) -> None:
"""
Add the specified layer to an existing layer group and raise an exception if
either the layer or layergroup doesn't exist, or the geoserver is unavailable.
Parameters
----------
layer_name: str
The name of the layer.
layer_workspace: str
The workspace the layer is located in.
layergroup_workspace: str, optional
The workspace the layergroup is located in.
layergroup_name: str
The name of the layer group.
layergroup_workspace: str, optional
The workspace the layergroup is located in.
Returns
-------
None
Raises
------
GeoserverException
If there is an issue adding the layer to the layergroup.
"""
layergroup_info = self.get_layergroup(
layer_name=layergroup_name, workspace=layergroup_workspace
)
layer_info = self.get_layer(layer_name=layer_name, workspace=layer_workspace)
# build list of existing publishables & styles
publishables = layergroup_info["layerGroup"]["publishables"]["published"]
if not isinstance(publishables, list): # only 1 layer up to now
publishables = [publishables]
styles = layergroup_info["layerGroup"]["styles"]["style"]
if not isinstance(styles, list): # only 1 layer up to now
styles = [styles]
# add publishable & style for the new layer
new_pub = {
"name": f"{layer_workspace}:{layer_name}",
"href": f"{self.service_url}/rest/workspaces/{layer_workspace}/layers/{layer_name}.json",
}
publishables.append(new_pub)
new_style = layer_info["layer"]["defaultStyle"]
styles.append(new_style)
data = self._layergroup_definition_from_layers_and_styles(
publishables=publishables, styles=styles
)
if layergroup_workspace is None:
url = f"{self.service_url}/rest/layergroups/{layergroup_name}"
else:
url = f"{self.service_url}/rest/workspaces/{layergroup_workspace}/layergroups/{layergroup_name}"
r = self._requests(
method="put",
url=url,
data=data,
headers={"content-type": "text/xml", "accept": "application/xml"},
)
if r.status_code == 200:
return
else:
raise GeoserverException(r.status_code, r.content)
def remove_layer_from_layergroup(
self,
layer_name: str,
layer_workspace: str,
layergroup_name: str,
layergroup_workspace: str = None,
) -> None:
"""
Add remove the specified layer from an existing layer group and raise an exception if
either the layer or layergroup doesn't exist, or the geoserver is unavailable.
Parameters
----------
layer_name: str
The name of the layer.
layer_workspace: str
The workspace the layer is located in.
layergroup_workspace: str, optional
The workspace the layergroup is located in.
layergroup_name: str
The name of the layer group.
layergroup_workspace: str, optional
The workspace the layergroup is located in.
Returns
-------
None
Raises
------
GeoserverException
If there is an issue removing the layer from the layergroup.
"""
layergroup_info = self.get_layergroup(
layer_name=layergroup_name, workspace=layergroup_workspace
)
# build list of existing publishables & styles
publishables = layergroup_info["layerGroup"]["publishables"]["published"]
if not isinstance(publishables, list): # only 1 layer up to now
publishables = [publishables]
styles = layergroup_info["layerGroup"]["styles"]["style"]
if not isinstance(styles, list): # only 1 layer up to now
styles = [styles]
layer_to_remove = f"{layer_workspace}:{layer_name}"
revised_set_of_publishables_and_styles = [
(pub, style)
for (pub, style) in zip(
layergroup_info["layerGroup"]["publishables"]["published"],
layergroup_info["layerGroup"]["styles"]["style"],
)
if pub["name"] != layer_to_remove
]
revised_set_of_publishables = list(
map(list, zip(*revised_set_of_publishables_and_styles))
)[0]
revised_set_of_styles = list(
map(list, zip(*revised_set_of_publishables_and_styles))
)[1]
xml_payload = self._layergroup_definition_from_layers_and_styles(
publishables=revised_set_of_publishables, styles=revised_set_of_styles
)
if layergroup_workspace is None:
url = f"{self.service_url}/rest/layergroups/{layergroup_name}"
else:
url = f"{self.service_url}/rest/workspaces/{layergroup_workspace}/layergroups/{layergroup_name}"
r = self._requests(
method="put",
url=url,
data=xml_payload,
headers={"content-type": "text/xml", "accept": "application/xml"},
)
if r.status_code == 200:
return
else:
raise GeoserverException(r.status_code, r.content)
def _layergroup_definition_from_layers_and_styles(
self, publishables: list, styles: list
) -> str:
"""
Helper function for add_layer_to_layergroup and remove_layer_from_layergroup.
Parameters
----------
publishables: list
List of publishable layers.
styles: list
List of styles associated with the publishable layers.
Returns
-------
str
Formatted XML request body for PUT layergroup.
"""
# the get_layergroup method may return an empty string for style;
# so we get the default styles for each layer with no style information in the layergroup
if len(styles) == 1:
index = [0]
else:
index = range(len(styles))
for ix, this_style, this_layer in zip(index, styles, publishables):
if this_style == "":
this_layer_info = self.get_layer(
layer_name=this_layer["name"].split(":")[1],
workspace=this_layer["name"].split(":")[0],
)
styles[ix] = {
"name": this_layer_info["layer"]["defaultStyle"]["name"],
"href": this_layer_info["layer"]["defaultStyle"]["href"],
}
# build xml structure
layer_skeleton = ""
style_skeleton = ""
for publishable in publishables:
layer_str = f"""
<published type="layer">
<name>{publishable['name']}</name>
<link>{publishable['href']}</link>
</published>
"""
layer_skeleton += layer_str
for style in styles:
style_str = f"""
<style>
<name>{style['name']}</name>
<link>{style['href']}</link>
</style>
"""
style_skeleton += style_str
data = f"""
<layerGroup>
<publishables>
{layer_skeleton}
</publishables>
<styles>
{style_skeleton}
</styles>
</layerGroup>
"""
return data
# _______________________________________________________________________________________________
#
# STYLES
# _______________________________________________________________________________________________
#
def get_style(self, style_name, workspace: Optional[str] = None):
"""
Returns the style by style name.
Parameters
----------
style_name: str
The name of the style.
workspace: str, optional
The workspace the style is located in.
Returns
-------
dict
A dictionary representation of the style.
Raises
------
GeoserverException
If there is an issue retrieving the style.
"""
url = "{}/rest/styles/{}.json".format(self.service_url, style_name)
if workspace is not None:
url = "{}/rest/workspaces/{}/styles/{}.json".format(
self.service_url, workspace, style_name
)
r = self._requests("get", url)
if r.status_code == 200:
return r.json()
else:
raise GeoserverException(r.status_code, r.content)
def get_styles(self, workspace: Optional[str] = None):
"""
Returns all loaded styles from geoserver.
Parameters
----------
workspace: str, optional
The workspace to filter the styles by.
Returns
-------
dict
A dictionary containing all the styles.
Raises
------
GeoserverException
If there is an issue retrieving the styles.
"""
url = "{}/rest/styles.json".format(self.service_url)
if workspace is not None:
url = "{}/rest/workspaces/{}/styles.json".format(
self.service_url, workspace
)
r = self._requests("get", url)
if r.status_code == 200:
return r.json()
else:
raise GeoserverException(r.status_code, r.content)
def upload_style(
self,
path: str,
name: Optional[str] = None,
workspace: Optional[str] = None,
sld_version: str = "1.0.0",
):
"""
Uploads a style file to geoserver.
Parameters
----------
path : str
Path to the style file or XML string.
name : str, optional
The name of the style. If None, the name is parsed from the file name.
workspace : str, optional
The workspace to upload the style to.
sld_version : str, optional
The version of the SLD. Default is "1.0.0".
Returns
-------
int
The status code of the request.
Raises
------
GeoserverException
If there is an issue uploading the style.
Notes
-----
The name of the style file will be, sld_name:workspace
This function will create the style file in a specified workspace.
`path` can either be the path to the SLD file itself, or a string containing valid XML to be used for the style
Inputs: path to the sld_file or the contents of an SLD file itself, workspace,
"""
if name is None:
name = os.path.basename(path)
f = name.split(".")
if len(f) > 0:
name = f[0]
if is_valid_xml(path):
# path is actually just the xml itself
xml = path
elif Path(path).exists():
# path is pointing to an existing file
with open(path, "rb") as f:
xml = f.read()
else:
# path is non-existing file or not valid xml
raise ValueError(
"`path` must be either a path to a style file, or a valid XML string."
)
headers = {"content-type": "text/xml"}
url = "{}/rest/workspaces/{}/styles".format(self.service_url, workspace)
sld_content_type = "application/vnd.ogc.sld+xml"
if sld_version == "1.1.0" or sld_version == "1.1":
sld_content_type = "application/vnd.ogc.se+xml"
header_sld = {"content-type": sld_content_type}
if workspace is None:
# workspace = "default"
url = "{}/rest/styles".format(self.service_url)
style_xml = "<style><name>{}</name><filename>{}</filename></style>".format(
name, name + ".sld"
)
r = self._requests(method="post", url=url, data=style_xml, headers=headers)
if r.status_code == 201:
r_sld = self._requests(
method="put", url=url + "/" + name, data=xml, headers=header_sld
)
if r_sld.status_code == 200:
return r_sld.status_code
else:
raise GeoserverException(r_sld.status_code, r_sld.content)
else:
raise GeoserverException(r.status_code, r.content)
def create_coveragestyle(
self,
raster_path: str,
style_name: Optional[str] = None,
workspace: str = None,
color_ramp: str = "RdYlGn_r",
cmap_type: str = "ramp",
number_of_classes: int = 5,
opacity: float = 1,
):
"""
Dynamically create style for raster.
Parameters
----------
raster_path : str
Path to the raster file.
style_name : str, optional
The name of the style. If None, the name is parsed from the raster file name.
workspace : str
The workspace to create the style in.
color_ramp : str
The color ramp to use.
cmap_type : str
The type of color map.
number_of_classes : int
The number of classes.
opacity : float
The opacity of the style.
Returns
-------
int
The status code of the request.
Raises
------
GeoserverException
If there is an issue creating the style.
Notes
-----
The name of the style file will be, rasterName:workspace
This function will dynamically create the style file for raster.
Inputs: name of file, workspace, cmap_type (two options: values, range), ncolors: determines the number of class, min for minimum value of the raster, max for the max value of raster
"""
from .Calculation_gdal import raster_value
from .Style import coverage_style_xml
raster = raster_value(raster_path)
min_value = raster["min"]
max_value = raster["max"]
if style_name is None:
style_name = raster["file_name"]
coverage_style_xml(
color_ramp,
style_name,
cmap_type,
min_value,
max_value,
number_of_classes,
opacity,
)
style_xml = "<style><name>{}</name><filename>{}</filename></style>".format(
style_name, style_name + ".sld"
)
if style_name is None:
style_name = os.path.basename(raster_path)
f = style_name.split(".")
if len(f) > 0:
style_name = f[0]
headers = {"content-type": "text/xml"}
url = "{}/rest/workspaces/{}/styles".format(self.service_url, workspace)
sld_content_type = "application/vnd.ogc.sld+xml"
header_sld = {"content-type": sld_content_type}
if workspace is None:
url = "{}/rest/styles".format(self.service_url)
r = self._requests(
"post",
url,
data=style_xml,
headers=headers,
)
if r.status_code == 201:
with open("style.sld", "rb") as f:
r_sld = self._requests(
method="put",
url=url + "/" + style_name,
data=f.read(),
headers=header_sld,
)
os.remove("style.sld")
if r_sld.status_code == 200:
return r_sld.status_code
else:
raise GeoserverException(r_sld.status_code, r_sld.content)
else:
raise GeoserverException(r.status_code, r.content)
def create_catagorized_featurestyle(
self,
style_name: str,
column_name: str,
column_distinct_values,
workspace: str = None,
color_ramp: str = "tab20",
geom_type: str = "polygon",
):
"""
Dynamically create categorized style for postgis geometry,
Parameters
----------
style_name : str
The name of the style.
column_name : str
The column name to base the style on.
column_distinct_values
The distinct values in the column.
workspace : str
The workspace to create the style in.
color_ramp : str
The color ramp to use.
geom_type : str
The geometry type (point, line, polygon).
Returns
-------
int
The status code of the request.
Raises
------
GeoserverException
If there is an issue creating the style.
Notes
-----
The data type must be point, line or polygon
Inputs: column_name (based on which column style should be generated), workspace,
color_or_ramp (color should be provided in hex code or the color ramp name, geom_type(point, line, polygon), outline_color(hex_color))
"""
from .Style import catagorize_xml
catagorize_xml(column_name, column_distinct_values, color_ramp, geom_type)
style_xml = "<style><name>{}</name><filename>{}</filename></style>".format(
style_name, style_name + ".sld"
)
headers = {"content-type": "text/xml"}
url = "{}/rest/workspaces/{}/styles".format(self.service_url, workspace)
sld_content_type = "application/vnd.ogc.sld+xml"
header_sld = {"content-type": sld_content_type}
if workspace is None:
url = "{}/rest/styles".format(self.service_url)
r = self._requests(
"post",
url,
data=style_xml,
headers=headers,
)
if r.status_code == 201:
with open("style.sld", "rb") as f:
r_sld = self._requests(
"put",
url + "/" + style_name,
data=f.read(),
headers=header_sld,
)
os.remove("style.sld")
if r_sld.status_code == 200:
return r_sld.status_code
else:
raise GeoserverException(r_sld.status_code, r_sld.content)
else:
raise GeoserverException(r.status_code, r.content)
def create_outline_featurestyle(
self,
style_name: str,
color: str = "#3579b1",
width: str = "2",
geom_type: str = "polygon",
workspace: Optional[str] = None,
):
"""
Dynamically creates the outline style for postgis geometry
Parameters
----------
style_name : str
The name of the style.
color : str
The color of the outline.
geom_type : str
The geometry type (point, line, polygon).
workspace : str, optional
The workspace to create the style in.
Returns
-------
int
The status code of the request.
Raises
------
GeoserverException
If there is an issue creating the style.
Notes
-----
The geometry type must be point, line or polygon
Inputs: style_name (name of the style file in geoserver), workspace, color (style color)
"""
from .Style import outline_only_xml
outline_only_xml(color, width, geom_type)
style_xml = "<style><name>{}</name><filename>{}</filename></style>".format(
style_name, style_name + ".sld"
)
headers = {"content-type": "text/xml"}
url = "{}/rest/workspaces/{}/styles".format(self.service_url, workspace)
sld_content_type = "application/vnd.ogc.sld+xml"
header_sld = {"content-type": sld_content_type}
if workspace is None:
url = "{}/rest/styles".format(self.service_url)
r = self._requests(
"post",
url,
data=style_xml,
headers=headers,
)
if r.status_code == 201:
with open("style.sld", "rb") as f:
r_sld = self._requests(
"put",
url + "/" + style_name,
data=f.read(),
headers=header_sld,
)
os.remove("style.sld")
if r_sld.status_code == 200:
return r_sld.status_code
else:
raise GeoserverException(r_sld.status_code, r_sld.content)
else:
raise GeoserverException(r.status_code, r.content)
def create_classified_featurestyle(
self,
style_name: str,
column_name: str,
column_distinct_values,
workspace: Optional[str] = None,
color_ramp: str = "tab20",
geom_type: str = "polygon",
# outline_color: str = "#3579b1",
):
"""
Dynamically creates the classified style for postgis geometries.
Parameters
----------
style_name : str
The name of the style.
column_name : str
The column name to base the style on.
column_distinct_values
The distinct values in the column.
workspace : str, optional
The workspace to create the style in.
color_ramp : str
The color ramp to use.
geom_type : str
The geometry type (point, line, polygon).
Returns
-------
int
The status code of the request.
Raises
------
GeoserverException
If there is an issue creating the style.
Notes
-----
The data type must be point, line or polygon
Inputs: column_name (based on which column style should be generated), workspace,
color_or_ramp (color should be provided in hex code or the color ramp name, geom_type(point, line, polygon), outline_color(hex_color))
"""
from .Style import classified_xml
classified_xml(
style_name,
column_name,
column_distinct_values,
color_ramp,
geom_type,
)
style_xml = "<style><name>{}</name><filename>{}</filename></style>".format(
column_name, column_name + ".sld"
)
headers = {"content-type": "text/xml"}
url = "{}/rest/workspaces/{}/styles".format(self.service_url, workspace)
sld_content_type = "application/vnd.ogc.sld+xml"
header_sld = {"content-type": sld_content_type}
if workspace is None:
url = "{}/rest/styles".format(self.service_url)
r = self._requests(
"post",
url,
data=style_xml,
headers=headers,
)
if r.status_code == 201:
with open("style.sld", "rb") as f:
r_sld = self._requests(
"put",
url + "/" + style_name,
data=f.read(),
headers=header_sld,
)
os.remove("style.sld")
if r_sld.status_code == 200:
return r_sld.status_code
else:
raise GeoserverException(r_sld.status_code, r_sld.content)
else:
raise GeoserverException(r.status_code, r.content)
def publish_style(
self,
layer_name: str,
style_name: str,
workspace: str,
):
"""
Publish a raster file to geoserver.
Parameters
----------
layer_name : str
The name of the layer.
style_name : str
The name of the style.
workspace : str
The workspace the layer is located in.
Returns
-------
int
The status code of the request.
Raises
------
GeoserverException
If there is an issue publishing the style.
Notes
-----
The coverage store will be created automatically as the same name as the raster layer name.
input parameters: the parameters connecting geoserver (user,password, url and workspace name),
the path to the file and file_type indicating it is a geotiff, arcgrid or other raster type.
"""
headers = {"content-type": "text/xml"}
url = "{}/rest/layers/{}:{}".format(self.service_url, workspace, layer_name)
style_xml = (
"<layer><defaultStyle><name>{}</name></defaultStyle></layer>".format(
style_name
)
)
r = self._requests(
"put",
url,
data=style_xml,
headers=headers,
)
if r.status_code == 200:
return r.status_code
else:
raise GeoserverException(r.status_code, r.content)
def delete_style(self, style_name: str, workspace: Optional[str] = None):
"""
Delete a style from the geoserver.
Parameters
----------
style_name : str
The name of the style.
workspace : str, optional
The workspace the style is located in.
Returns
-------
str
A success message indicating that the style was deleted.
Raises
------
GeoserverException
If there is an issue deleting the style.
"""
payload = {"recurse": "true"}
url = "{}/rest/workspaces/{}/styles/{}".format(
self.service_url, workspace, style_name
)
if workspace is None:
url = "{}/rest/styles/{}".format(self.service_url, style_name)
r = self._requests("delete", url, params=payload)
if r.status_code == 200:
return "Status code: {}, delete style".format(r.status_code)
else:
raise GeoserverException(r.status_code, r.content)
# _______________________________________________________________________________________________
#
# FEATURES AND DATASTORES
# _______________________________________________________________________________________________
#
def create_featurestore(
self,
store_name: str,
workspace: Optional[str] = None,
db: str = "postgres",
host: str = "localhost",
port: int = 5432,
schema: str = "public",
pg_user: str = "postgres",
pg_password: str = "admin",
overwrite: bool = False,
expose_primary_keys: str = "false",
description: Optional[str] = None,
evictor_run_periodicity: Optional[int] = 300,
max_open_prepared_statements: Optional[int] = 50,
encode_functions: Optional[str] = "false",
primary_key_metadata_table: Optional[str] = None,
batch_insert_size: Optional[int] = 1,
preparedstatements: Optional[str] = "false",
loose_bbox: Optional[str] = "true",
estimated_extends: Optional[str] = "true",
fetch_size: Optional[int] = 1000,
validate_connections: Optional[str] = "true",
support_on_the_fly_geometry_simplification: Optional[str] = "true",
connection_timeout: Optional[int] = 20,
create_database: Optional[str] = "false",
min_connections: Optional[int] = 1,
max_connections: Optional[int] = 10,
evictor_tests_per_run: Optional[int] = 3,
test_while_idle: Optional[str] = "true",
max_connection_idle_time: Optional[int] = 300,
):
"""
Create PostGIS store for connecting postgres with geoserver.
Parameters
----------
store_name : str
The name of the feature store.
workspace : str, optional
The workspace to create the feature store in.
db : str
The database type. Default is "postgres".
host : str
The database host. Default is "localhost".
port : int
The database port. Default is 5432.
schema : str
The database schema. Default is "public".
pg_user : str
The database user. Default is "postgres".
pg_password : str
The database password. Default is "admin".
overwrite : bool
Whether to overwrite the existing feature store.
expose_primary_keys : str
Whether to expose primary keys. Default is "false".
description : str, optional
The description of the feature store.
evictor_run_periodicity : int, optional
The periodicity of the evictor run.
max_open_prepared_statements : int, optional
The maximum number of open prepared statements.
encode_functions : str, optional
Whether to encode functions. Default is "false".
primary_key_metadata_table : str, optional
The primary key metadata table.
batch_insert_size : int, optional
The batch insert size. Default is 1.
preparedstatements : str, optional
Whether to use prepared statements. Default is "false".
loose_bbox : str, optional
Whether to use loose bounding boxes. Default is "true".
estimated_extends : str, optional
Whether to use estimated extends. Default is "true".
fetch_size : int, optional
The fetch size. Default is 1000.
validate_connections : str, optional
Whether to validate connections. Default is "true".
support_on_the_fly_geometry_simplification : str, optional
Whether to support on-the-fly geometry simplification. Default is "true".
connection_timeout : int, optional
The connection timeout. Default is 20.
create_database : str, optional
Whether to create the database. Default is "false".
min_connections : int, optional
The minimum number of connections. Default is 1.
max_connections : int, optional
The maximum number of connections. Default is 10.
evictor_tests_per_run : int, optional
The number of evictor tests per run.
test_while_idle : str, optional
Whether to test while idle. Default is "true".
max_connection_idle_time : int, optional
The maximum connection idle time. Default is 300.
Returns
-------
str
A success message indicating that the feature store was created/updated.
Raises
------
GeoserverException
If there is an issue creating/updating the feature store.
Notes
-----
After creating feature store, you need to publish it. See the layer publish guidline here: https://geoserver-rest.readthedocs.io/en/latest/how_to_use.html#creating-and-publishing-featurestores-and-featurestore-layers
"""
url = "{}/rest/workspaces/{}/datastores".format(self.service_url, workspace)
headers = {"content-type": "text/xml"}
database_connection = """
<dataStore>
<name>{}</name>
<description>{}</description>
<connectionParameters>
<entry key="Expose primary keys">{}</entry>
<entry key="host">{}</entry>
<entry key="port">{}</entry>
<entry key="user">{}</entry>
<entry key="passwd">{}</entry>
<entry key="dbtype">postgis</entry>
<entry key="schema">{}</entry>
<entry key="database">{}</entry>
<entry key="Evictor run periodicity">{}</entry>
<entry key="Max open prepared statements">{}</entry>
<entry key="encode functions">{}</entry>
<entry key="Primary key metadata table">{}</entry>
<entry key="Batch insert size">{}</entry>
<entry key="preparedStatements">{}</entry>
<entry key="Estimated extends">{}</entry>
<entry key="fetch size">{}</entry>
<entry key="validate connections">{}</entry>
<entry key="Support on the fly geometry simplification">{}</entry>
<entry key="Connection timeout">{}</entry>
<entry key="create database">{}</entry>
<entry key="min connections">{}</entry>
<entry key="max connections">{}</entry>
<entry key="Evictor tests per run">{}</entry>
<entry key="Test while idle">{}</entry>
<entry key="Max connection idle time">{}</entry>
<entry key="Loose bbox">{}</entry>
</connectionParameters>
</dataStore>
""".format(
store_name,
description,
expose_primary_keys,
host,
port,
pg_user,
pg_password,
schema,
db,
evictor_run_periodicity,
max_open_prepared_statements,
encode_functions,
primary_key_metadata_table,
batch_insert_size,
preparedstatements,
estimated_extends,
fetch_size,
validate_connections,
support_on_the_fly_geometry_simplification,
connection_timeout,
create_database,
min_connections,
max_connections,
evictor_tests_per_run,
test_while_idle,
max_connection_idle_time,
loose_bbox,
)
if overwrite:
url = "{}/rest/workspaces/{}/datastores/{}".format(
self.service_url, workspace, store_name
)
r = self._requests(
"put",
url,
data=database_connection,
headers=headers,
)
else:
r = self._requests(
"post",
url,
data=database_connection,
headers=headers,
)
if r.status_code in [200, 201]:
return "Featurestore created/updated successfully"
else:
raise GeoserverException(r.status_code, r.content)
def create_datastore(
self,
name: str,
path: str,
workspace: Optional[str] = None,
overwrite: bool = False,
force_absolute_path: bool = True,
):
"""
Create a datastore within the GeoServer.
Parameters
----------
name : str
Name of datastore to be created. After creating the datastore, you need to publish it by using publish_featurestore function.
path : str
Path to shapefile (.shp) file, GeoPackage (.gpkg) file, WFS url
(e.g. http://localhost:8080/geoserver/wfs?request=GetCapabilities) or directory containing shapefiles.
workspace : str, optional
The workspace to create the datastore in. Default is "default".
overwrite : bool
Whether to overwrite the existing datastore.
force_absolute_path : bool, optional
Whether to force absolute paths (legacy behavior). Default is True.
Set to False to convert absolute paths to relative paths for portability.
Returns
-------
str
A success message indicating that the datastore was created/updated.
Raises
------
GeoserverException
If there is an issue creating/updating the datastore.
Notes
-----
If you have PostGIS datastore, please use create_featurestore function
"""
if workspace is None:
workspace = "default"
if path is None:
raise Exception("You must provide a full path to the data")
# Handle HTTP URLs (WFS endpoints)
if path.startswith("http://") or path.startswith("https://"):
data_url = "<GET_CAPABILITIES_URL>{}</GET_CAPABILITIES_URL>".format(path)
else:
# Handle file paths with inline path conversion
if not force_absolute_path and os.path.isabs(path):
# Convert absolute path to relative path inline
filename = os.path.basename(path)
relative_path = f"data/{workspace}/{filename}"
data_url = "<url>file:{}</url>".format(relative_path)
else:
# Use path as-is (could be relative or absolute)
data_url = "<url>file:{}</url>".format(path)
data = "<dataStore><name>{}</name><connectionParameters>{}</connectionParameters></dataStore>".format(
name, data_url
)
headers = {"content-type": "text/xml"}
if overwrite:
url = "{}/rest/workspaces/{}/datastores/{}".format(
self.service_url, workspace, name
)
r = self._requests("put", url, data=data, headers=headers)
else:
url = "{}/rest/workspaces/{}/datastores".format(self.service_url, workspace)
r = self._requests(method="post", url=url, data=data, headers=headers)
if r.status_code in [200, 201]:
return "Data store created/updated successfully"
else:
raise GeoserverException(r.status_code, r.content)
def create_shp_datastore(
self,
path: str,
store_name: Optional[str] = None,
workspace: Optional[str] = None,
file_extension: str = "shp",
):
"""
Create datastore for a shapefile.
Parameters
----------
path : str
Path to the zipped shapefile (.shp).
store_name : str, optional
Name of store to be created. If None, parses from the filename stem.
workspace: str, optional
Name of workspace to be used. Default: "default".
file_extension : str
The file extension of the shapefile. Default is "shp".
Returns
-------
str
A success message indicating that the shapefile datastore was created.
Raises
------
GeoserverException
If there is an issue creating the shapefile datastore.
Notes
-----
The layer name will be assigned according to the shp name
"""
if path is None:
raise Exception("You must provide a full path to shapefile")
if workspace is None:
workspace = "default"
if store_name is None:
store_name = os.path.basename(path)
f = store_name.split(".")
if len(f) > 0:
store_name = f[0]
headers = {
"Content-type": "application/zip",
"Accept": "application/xml",
}
if isinstance(path, dict):
path = prepare_zip_file(store_name, path)
url = "{0}/rest/workspaces/{1}/datastores/{2}/file.{3}?filename={2}&update=overwrite".format(
self.service_url, workspace, store_name, file_extension
)
with open(path, "rb") as f:
r = self._requests("put", url, data=f.read(), headers=headers)
if r.status_code in [200, 201, 202]:
return "The shapefile datastore created successfully!"
else:
raise GeoserverException(r.status_code, r.content)
def create_gpkg_datastore(
self,
path: str,
store_name: Optional[str] = None,
workspace: Optional[str] = None,
file_extension: str = "gpkg",
):
"""
Create datastore for a geopackage.
Parameters
----------
path : str
Path to the geopackage file.
store_name : str, optional
Name of store to be created. If None, parses from the filename.
workspace: str, optional
Name of workspace to be used. Default: "default".
file_extension : str
The file extension of the geopackage. Default is "gpkg".
Returns
-------
str
A success message indicating that the geopackage datastore was created.
Raises
------
GeoserverException
If there is an issue creating the geopackage datastore.
Notes
-----
The layer name will be assigned according to the layer name in the geopackage.
If the layer already exist it will be updated.
"""
if path is None:
raise Exception("You must provide a full path to shapefile")
if workspace is None:
workspace = "default"
if store_name is None:
store_name = os.path.basename(path)
f = store_name.split(".")
if len(f) > 0:
store_name = f[0]
headers = {
"Content-type": "application/x-sqlite3",
"Accept": "application/json",
}
url = "{0}/rest/workspaces/{1}/datastores/{2}/file.{3}?filename={2}".format(
self.service_url, workspace, store_name, file_extension
)
with open(path, "rb") as f:
r = self._requests("put", url, data=f.read(), headers=headers)
if r.status_code in [200, 201, 202]:
return "The geopackage datastore created successfully!"
else:
raise GeoserverException(r.status_code, r.content)
def publish_featurestore(
self,
store_name: str,
pg_table: str,
workspace: Optional[str] = None,
title: Optional[str] = None,
advertised: Optional[bool] = True,
abstract: Optional[str] = None,
keywords: Optional[List[str]] = None,
cqlfilter: Optional[str] = None,
) -> int:
"""
Publish a featurestore to geoserver.
Parameters
----------
store_name : str
The name of the featurestore.
pg_table : str
The name of the PostgreSQL table.
workspace : str, optional
The workspace to publish the featurestore in. Default is "default".
title : str, optional
The title of the featurestore. If None, the table name is used.
advertised : bool, optional
Whether to advertise the featurestore. Default is True.
abstract : str, optional
The abstract of the featurestore.
keywords : list of str, optional
List of keywords associated with the featurestore.
cqlfilter : str, optional
The CQL filter for the featurestore.
Returns
-------
int
The status code of the request.
Raises
------
GeoserverException
If there is an issue publishing the featurestore.
Notes
-----
Only user for postgis vector data
input parameters: specify the name of the table in the postgis database to be published, specify the store,workspace name, and the Geoserver user name, password and URL
"""
if workspace is None:
workspace = "default"
if title is None:
title = pg_table
url = "{}/rest/workspaces/{}/datastores/{}/featuretypes/".format(
self.service_url, workspace, store_name
)
abstract_xml = f"<abstract>{abstract}</abstract>" if abstract else ""
keywords_xml = ""
if keywords:
keywords_xml = "<keywords>"
for keyword in keywords:
keywords_xml += f"<string>{keyword}</string>"
keywords_xml += "</keywords>"
cqlfilter_xml = f"<cqlFilter>{cqlfilter}</cqlFilter>" if cqlfilter else ""
layer_xml = f"""<featureType>
<name>{pg_table}</name>
<title>{title}</title>
<advertised>{advertised}</advertised>
{abstract_xml}
{keywords_xml}
{cqlfilter_xml}
</featureType>"""
headers = {"content-type": "text/xml"}
r = self._requests("post", url, data=layer_xml, headers=headers)
if r.status_code == 201:
return r.status_code
else:
raise GeoserverException(r.status_code, r.content)
def edit_featuretype(
self,
store_name: str,
workspace: Optional[str],
pg_table: str,
name: str,
title: str,
abstract: Optional[str] = None,
keywords: Optional[List[str]] = None,
recalculate: Optional[str] = None,
) -> int:
"""
Edit a featuretype in the geoserver.
Parameters
----------
recalculate : str, optional
Recalculate param. Can be: empty string, nativebbox and nativebbox,latlonbbox.
store_name : str
The name of the feature store.
workspace : str, optional
The workspace of the feature store.
pg_table : str
The name of the PostgreSQL table.
name : str
The name of the feature type.
title : str
The title of the feature type.
abstract : str, optional
The abstract of the feature type.
keywords : list of str, optional
List of keywords associated with the feature type.
Returns
-------
int
The status code of the request.
Raises
------
GeoserverException
If there is an issue editing the feature type.
"""
if workspace is None:
workspace = "default"
recalculate_param = f"?recalculate={recalculate}" if recalculate else ""
url = "{}/rest/workspaces/{}/datastores/{}/featuretypes/{}.xml{}".format(
self.service_url, workspace, store_name, pg_table, recalculate_param
)
# Create XML for abstract and keywords
abstract_xml = f"<abstract>{abstract}</abstract>" if abstract else ""
keywords_xml = ""
if keywords:
keywords_xml = "<keywords>"
for keyword in keywords:
keywords_xml += f"<string>{keyword}</string>"
keywords_xml += "</keywords>"
layer_xml = f"""<featureType>
<name>{name}</name>
<title>{title}</title>
{abstract_xml}{keywords_xml}
</featureType>"""
headers = {"content-type": "text/xml"}
r = self._requests("put", url, data=layer_xml, headers=headers)
if r.status_code == 200:
return r.status_code
else:
raise GeoserverException(r.status_code, r.content)
def publish_featurestore_sqlview(
self,
name: str,
store_name: str,
sql: str,
parameters: Optional[Iterable[Dict]] = None,
key_column: Optional[str] = None,
geom_name: str = "geom",
geom_type: str = "Geometry",
srid: Optional[int] = 4326,
workspace: Optional[str] = None,
) -> int:
"""
Publishes an SQL query as a layer, optionally with parameters.
Parameters
----------
name : str
The name of the SQL view.
store_name : str
The name of the feature store.
sql : str
The SQL query.
parameters : iterable of dict, optional
List of parameters for the SQL query.
key_column : str, optional
The key column.
geom_name : str, optional
The name of the geometry column.
geom_type : str, optional
The type of the geometry column.
srid : int, optional
The spatial reference ID. Default is 4326.
workspace : str, optional
The workspace to publish the SQL view in. Default is "default".
Returns
-------
int
The status code of the request.
Raises
------
GeoserverException
If there is an issue publishing the SQL view.
Notes
-----
With regards to SQL view parameters, it is advised to read the relevant section from the geoserver docs:
https://docs.geoserver.org/main/en/user/data/database/sqlview.html#parameterizing-sql-views
An integer-based parameter must have a default value
You should be VERY careful with the `regexp_validator`, as it can open you to SQL injection attacks. If you do
not supply one for a parameter, it will use the geoserver default `^[\w\d\s]+$`.
The `parameters` iterable must contain dictionaries with this structure:
```json
{
"name": "<name of parameter (required)>"
"regexpValidator": "<string containing regex validator> (optional)"
"defaultValue" : "<default value of parameter if not specified (required only for non-string parameters)>"
}
```
"""
if workspace is None:
workspace = "default"
# issue #87
if key_column is not None:
key_column_xml = """<keyColumn>{}</keyColumn>""".format(key_column)
else:
key_column_xml = """"""
parameters_xml = ""
if parameters is not None:
for parameter in parameters:
# non-string parameters MUST have a default value supplied
if (
not is_surrounded_by_quotes(sql, parameter["name"])
and not "defaultValue" in parameter
):
raise ValueError(
f"Parameter `{parameter['name']}` appears to be a non-string in the supplied query"
", but does not have a default value specified. You must supply a default value "
"for non-string parameters using the `defaultValue` key."
)
param_name = parameter.get("name", "")
default_value = parameter.get("defaultValue", "")
regexp_validator = parameter.get("regexpValidator", r"^[\w\d\s]+$")
parameters_xml += f"""
<parameter>
<name>{param_name}</name>
<defaultValue>{default_value}</defaultValue>
<regexpValidator>{regexp_validator}</regexpValidator>
</parameter>\n
""".strip()
layer_xml = """<featureType>
<name>{0}</name>
<enabled>true</enabled>
<namespace>
<name>{4}</name>
</namespace>
<title>{0}</title>
<srs>EPSG:{5}</srs>
<metadata>
<entry key="JDBC_VIRTUAL_TABLE">
<virtualTable>
<name>{0}</name>
<sql>{1}</sql>
<escapeSql>true</escapeSql>
<geometry>
<name>{2}</name>
<type>{3}</type>
<srid>{5}</srid>
</geometry>{6}
{7}
</virtualTable>
</entry>
</metadata>
</featureType>""".format(
name,
sql,
geom_name,
geom_type,
workspace,
srid,
key_column_xml,
parameters_xml,
)
# rest API url
url = "{}/rest/workspaces/{}/datastores/{}/featuretypes".format(
self.service_url, workspace, store_name
)
# headers
headers = {"content-type": "text/xml"}
# request
r = self._requests("post", url, data=layer_xml, headers=headers)
if r.status_code == 201:
return r.status_code
else:
raise GeoserverException(r.status_code, r.content)
def get_featuretypes(
self, workspace: str = None, store_name: str = None
) -> List[str]:
"""
Get feature types from the geoserver.
Parameters
----------
workspace : str
The workspace to get the feature types from.
store_name : str
The name of the feature store.
Returns
-------
list of str
A list of feature types.
Raises
------
GeoserverException
If there is an issue getting the feature types.
"""
url = "{}/rest/workspaces/{}/datastores/{}/featuretypes.json".format(
self.service_url, workspace, store_name
)
r = self._requests("get", url)
if r.status_code == 200:
r_dict = r.json()
features = [i["name"] for i in r_dict["featureTypes"]["featureType"]]
return features
else:
raise GeoserverException(r.status_code, r.content)
def get_feature_attribute(
self, feature_type_name: str, workspace: str, store_name: str
) -> List[str]:
"""
Get feature attributes from the geoserver.
Parameters
----------
feature_type_name : str
The name of the feature type.
workspace : str
The workspace of the feature store.
store_name : str
The name of the feature store.
Returns
-------
list of str
A list of feature attributes.
Raises
------
GeoserverException
If there is an issue getting the feature attributes.
"""
url = "{}/rest/workspaces/{}/datastores/{}/featuretypes/{}.json".format(
self.service_url, workspace, store_name, feature_type_name
)
r = self._requests("get", url)
if r.status_code == 200:
r_dict = r.json()
attribute = [
i["name"] for i in r_dict["featureType"]["attributes"]["attribute"]
]
return attribute
else:
raise GeoserverException(r.status_code, r.content)
def get_featurestore(self, store_name: str, workspace: str) -> dict:
"""
Get a featurestore from the geoserver.
Parameters
----------
store_name : str
The name of the feature store.
workspace : str
The workspace of the feature store.
Returns
-------
dict
A dictionary representation of the feature store.
Raises
------
GeoserverException
If there is an issue getting the feature store.
"""
url = "{}/rest/workspaces/{}/datastores/{}".format(
self.service_url, workspace, store_name
)
r = self._requests("get", url)
if r.status_code == 200:
r_dict = r.json()
return r_dict["dataStore"]
else:
raise GeoserverException(r.status_code, r.content)
def delete_featurestore(
self, featurestore_name: str, workspace: Optional[str] = None
) -> str:
"""
Delete a featurestore from the geoserver.
Parameters
----------
featurestore_name : str
The name of the featurestore.
workspace : str, optional
The workspace of the featurestore.
Returns
-------
str
A success message indicating that the featurestore was deleted.
Raises
------
GeoserverException
If there is an issue deleting the featurestore.
"""
payload = {"recurse": "true"}
url = "{}/rest/workspaces/{}/datastores/{}".format(
self.service_url, workspace, featurestore_name
)
if workspace is None:
url = "{}/datastores/{}".format(self.service_url, featurestore_name)
r = self._requests("delete", url, params=payload)
if r.status_code == 200:
return "Status code: {}, delete featurestore".format(r.status_code)
else:
raise GeoserverException(r.status_code, r.content)
def delete_coveragestore(
self, coveragestore_name: str, workspace: Optional[str] = None
) -> str:
"""
Delete a coveragestore from the geoserver.
Parameters
----------
coveragestore_name : str
The name of the coveragestore.
workspace : str, optional
The workspace of the coveragestore.
Returns
-------
str
A success message indicating that the coveragestore was deleted.
Raises
------
GeoserverException
If there is an issue deleting the coveragestore.
"""
payload = {"recurse": "true"}
url = "{}/rest/workspaces/{}/coveragestores/{}".format(
self.service_url, workspace, coveragestore_name
)
if workspace is None:
url = "{}/rest/coveragestores/{}".format(
self.service_url, coveragestore_name
)
r = self._requests("delete", url, params=payload)
if r.status_code == 200:
return "Coverage store deleted successfully"
else:
raise GeoserverException(r.status_code, r.content)
# _______________________________________________________________________________________________
#
# USERS AND USERGROUPS
# _______________________________________________________________________________________________
#
def get_all_users(self, service=None) -> dict:
"""
Query all users in the provided user/group service, else default user/group service is queried.
Parameters
----------
service: str, optional
The user/group service to query.
Returns
-------
dict
A dictionary containing all users.
Raises
------
GeoserverException
If there is an issue getting the users.
"""
url = "{}/rest/security/usergroup/".format(self.service_url)
if service is None:
url += "users/"
else:
url += "service/{}/users/".format(service)
headers = {"accept": "application/xml"}
r = self._requests("get", url, headers=headers)
if r.status_code == 200:
return parse(r.content)
else:
raise GeoserverException(r.status_code, r.content)
def create_user(
self, username: str, password: str, enabled: bool = True, service=None
) -> str:
"""
Add a new user to the provided user/group service.
Parameters
----------
username : str
The username of the new user.
password: str
The password of the new user.
enabled: bool
Whether the new user is enabled.
service : str, optional
The user/group service to add the user to.
Returns
-------
str
A success message indicating that the user was created.
Raises
------
GeoserverException
If there is an issue creating the user.
"""
url = "{}/rest/security/usergroup/".format(self.service_url)
if service is None:
url += "users/"
else:
url += "service/{}/users/".format(service)
data = "<user><userName>{}</userName><password>{}</password><enabled>{}</enabled></user>".format(
username, password, str(enabled).lower()
)
headers = {"content-type": "text/xml", "accept": "application/json"}
r = self._requests("post", url, data=data, headers=headers)
if r.status_code == 201:
return "User created successfully"
else:
raise GeoserverException(r.status_code, r.content)
def modify_user(
self, username: str, new_name=None, new_password=None, enable=None, service=None
) -> str:
"""
Modifies a user in the provided user/group service.
Parameters
----------
username : str
The username of the user to modify.
new_name : str, optional
The new username.
new_password : str, optional
The new password.
enable : bool, optional
Whether the user is enabled.
service : str, optional
The user/group service to modify the user in.
Returns
-------
str
A success message indicating that the user was modified.
Raises
------
GeoserverException
If there is an issue modifying the user.
"""
url = "{}/rest/security/usergroup/".format(self.service_url)
if service is None:
url += "user/{}".format(username)
else:
url += "service/{}/user/{}".format(service, username)
modifications = dict()
if new_name is not None:
modifications["userName"] = new_name
if new_password is not None:
modifications["password"] = new_password
if enable is not None:
modifications["enabled"] = enable
data = unparse({"user": modifications})
print(url, data)
headers = {"content-type": "text/xml", "accept": "application/json"}
r = self._requests("post", url, data=data, headers=headers)
if r.status_code == 200:
return "User modified successfully"
else:
raise GeoserverException(r.status_code, r.content)
def delete_user(self, username: str, service=None) -> str:
"""
Deletes user from the provided user/group service.
Parameters
----------
username : str
The username of the user to delete.
service : str, optional
The user/group service to delete the user from.
Returns
-------
str
A success message indicating that the user was deleted.
Raises
------
GeoserverException
If there is an issue deleting the user.
"""
url = "{}/rest/security/usergroup/".format(self.service_url)
if service is None:
url += "user/{}".format(username)
else:
url += "service/{}/user/{}".format(service, username)
headers = {"accept": "application/json"}
r = self._requests("delete", url, headers=headers)
if r.status_code == 200:
return "User deleted successfully"
else:
raise GeoserverException(r.status_code, r.content)
def get_all_usergroups(self, service=None) -> dict:
"""
Queries all the groups in the given user/group service.
Parameters
----------
service : str, optional
The user/group service to query.
Returns
-------
dict
A dictionary containing all user groups.
Raises
------
GeoserverException
If there is an issue getting the user groups.
"""
url = "{}/rest/security/usergroup/".format(self.service_url)
if service is None:
url += "groups/"
else:
url += "service/{}/groups/".format(service)
r = self._requests("get", url)
if r.status_code == 200:
return parse(r.content)
else:
raise GeoserverException(r.status_code, r.content)
def create_usergroup(self, group: str, service=None) -> str:
"""
Add a new usergroup to the provided user/group service.
Parameters
----------
group : str
The name of the user group.
service : str, optional
The user/group service to add the user group to.
Returns
-------
str
A success message indicating that the user group was created.
Raises
------
GeoserverException
If there is an issue creating the user group.
"""
url = "{}/rest/security/usergroup/".format(self.service_url)
if service is None:
url += "group/{}".format(group)
else:
url += "service/{}/group/{}".format(service, group)
r = self._requests("post", url)
if r.status_code == 201:
return "Group created successfully"
else:
raise GeoserverException(r.status_code, r.content)
def delete_usergroup(self, group: str, service=None) -> str:
"""
Deletes given usergroup from provided user/group service.
Parameters
----------
group : str
The name of the user group to delete.
service : str, optional
The user/group service to delete the user group from.
Returns
-------
str
A success message indicating that the user group was deleted.
Raises
------
GeoserverException
If there is an issue deleting the user group.
"""
url = "{}/rest/security/usergroup/".format(self.service_url)
if service is None:
url += "group/{}".format(group)
else:
url += "service/{}/group/{}".format(service, group)
r = self._requests("delete", url)
if r.status_code == 200:
return "Group deleted successfully"
else:
raise GeoserverException(r.status_code, r.content)
# _______________________________________________________________________________________________
#
# SERVICES
# _______________________________________________________________________________________________
#
def update_service(self, service: str, **kwargs):
"""
Update selected service's options.
Parameters
----------
service : str
Type of service (e.g., wms, wfs)
kwargs : dict
Options to be modified (e.g., maxRenderingTime=600)
Returns
-------
str
A success message indicating that the options were updated.
Raises
------
GeoserverException
If there is an issue updating the service's options.
"""
url = "{}/rest/services/{}/settings".format(self.service_url, service)
headers = {"content-type": "text/xml"}
data = ""
for key, value in kwargs.items():
data += "<{}><{}>{}</{}></{}>".format(service, key, value, key, service)
r = self._requests("put", url, data=data, headers=headers)
if r.status_code == 200:
return "Service's option updated successfully"
else:
raise GeoserverException(r.status_code, r.content)
================================================
FILE: geo/Style.py
================================================
# inbuilt libraries
from typing import Dict, Iterable, List, Union
# third-party libraries
import seaborn as sns
from matplotlib.colors import rgb2hex
def coverage_style_colormapentry(
color_ramp: Union[List, Dict, Iterable],
min_value: float,
max_value: float,
number_of_classes: int = None,
):
"""
Parameters
----------
color_ramp
min_value
max_value
number_of_classes
Returns
-------
Notes
-----
This is the core function for controlling the layers styles
The color_ramp can be list or dict or touple or str
min, max will be dynamically calculated value from raster
number_of_classes will be available in map legend
"""
style_append = ""
n = len(color_ramp)
if isinstance(color_ramp, list):
if n != number_of_classes:
number_of_classes = n
interval = (max_value - min_value) / (number_of_classes - 1)
for i, color in enumerate(color_ramp):
value = min_value + interval * i
value = round(value, 1)
style_append += (
'<sld:ColorMapEntry color="{}" label="{}" quantity="{}"/>'.format(
color, value, value
)
)
elif isinstance(color_ramp, dict):
if n != number_of_classes:
number_of_classes = n
interval = (max_value - min_value) / (number_of_classes - 1)
for name, color, i in zip(color_ramp.keys(), color_ramp.values(), range(n)):
value = min_value + interval * i
style_append += (
'<sld:ColorMapEntry color="{}" label=" {}" quantity="{}"/>'.format(
color, name, value
)
)
else:
for i, color in enumerate(color_ramp):
interval = (max_value - min_value) / (number_of_classes - 1)
value = min_value + interval * i
style_append += (
'<sld:ColorMapEntry color="{}" label="{}" quantity="{}"/>'.format(
color, value, value
)
)
return style_append
def coverage_style_xml(
color_ramp, style_name, cmap_type, min_value, max_value, number_of_classes, opacity
):
min_max_difference = max_value - min_value
style_append = ""
interval = min_max_difference / (number_of_classes - 1) # noqa
# The main style of the coverage style
if isinstance(color_ramp, str):
palette = sns.color_palette(color_ramp, int(number_of_classes))
color_ramp = [rgb2hex(i) for i in palette]
style_append += coverage_style_colormapentry(
color_ramp, min_value, max_value, number_of_classes
)
style = """
<StyledLayerDescriptor xmlns="http://www.opengis.net/sld" xmlns:gml="http://www.opengis.net/gml" version="1.0.0" xmlns:ogc="http://www.opengis.net/ogc" xmlns:sld="http://www.opengis.net/sld">
<UserLayer>
<sld:LayerFeatureConstraints>
<sld:FeatureTypeConstraint/>
</sld:LayerFeatureConstraints>
<sld:UserStyle>
<sld:Name>{2}</sld:Name>
<sld:FeatureTypeStyle>
<sld:Rule>
<sld:RasterSymbolizer>
<sld:Opacity>{3}</sld:Opacity>
<sld:ChannelSelection>
<sld:GrayChannel>
<sld:SourceChannelName>1</sld:SourceChannelName>
</sld:GrayChannel>
</sld:ChannelSelection>
<sld:ColorMap type="{0}">
{1}
</sld:ColorMap>
</sld:RasterSymbolizer>
</sld:Rule>
</sld:FeatureTypeStyle>
</sld:UserStyle>
</UserLayer>
</StyledLayerDescriptor>
""".format(
cmap_type, style_append, style_name, opacity
)
with open("style.sld", "w") as f:
f.write(style)
def outline_only_xml(color, width, geom_type="polygon"):
if geom_type == "point":
symbolizer = """
<PointSymbolizer>
<Graphic>
<Mark>
<WellKnownName>circle</WellKnownName>
<Fill>
<CssParameter name="fill">{}</CssParameter>
</Fill>
</Mark>
<Size>8</Size>
</Graphic>
</PointSymbolizer>
""".format(
color
)
elif geom_type == "line":
symbolizer = """
<LineSymbolizer>
<Stroke>
<CssParameter name="stroke">{}</CssParameter>
<CssParameter name="stroke-width"{}</CssParameter>
</Stroke>
</LineSymbolizer>
""".format(
color, width
)
elif geom_type == "polygon":
symbolizer = """
<PolygonSymbolizer>
<Stroke>
<CssParameter name="stroke">{}</CssParameter>
<CssParameter name="stroke-width">{}</CssParameter>
</Stroke>
</PolygonSymbolizer>
""".format(
color, width
)
else:
print("Error: Invalid geometry type")
return
style = """
<StyledLayerDescriptor xmlns="http://www.opengis.net/sld" xmlns:ogc="http://www.opengis.net/ogc" xmlns:se="http://www.opengis.net/se" xmlns:xlink="http://www.w3.org/1999/xlink" xsi:schemaLocation="http://www.opengis.net/sld http://schemas.opengis.net/sld/1.1.0/StyledLayerDescriptor.xsd" version="1.1.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<NamedLayer>
<se:Name>Layer name</se:Name>
<UserStyle>
<se:Name>Layer name</se:Name>
<se:FeatureTypeStyle>
<se:Rule>
<se:Name>Single symbol</se:Name>
{}
</se:Rule>
</se:FeatureTypeStyle>
</UserStyle>
</NamedLayer>
</StyledLayerDescriptor>
""".format(
symbolizer
)
with open("style.sld", "w") as f:
f.write(style)
def catagorize_xml(
column_name: str,
values: List[float],
color_ramp: str = None,
geom_type: str = "polygon",
):
n = len(values)
palette = sns.color_palette(color_ramp, int(n))
palette_hex = [rgb2hex(i) for i in palette]
rule = ""
for value, color in zip(values, palette_hex):
if geom_type == "point":
rule += """
<Rule>
<Name>{0}</Name>
<Title>{1}</Title>
<ogc:Filter>
<ogc:PropertyIsEqualTo>
<ogc:PropertyName>{0}</ogc:PropertyName>
<ogc:Literal>{1}</ogc:Literal>
</ogc:PropertyIsEqualTo>
</ogc:Filter>
<PointSymbolizer>
<Graphic>
<Mark>
<WellKnownName>circle</WellKnownName>
<Fill>
<CssParameter name="fill">{2}</CssParameter>
</Fill>
</Mark>
<Size>5</Size>
</Graphic>
</PointSymbolizer>
</Rule>
""".format(
column_name, value, color
)
elif geom_type == "line":
rule += """
<Rule>
<Name>{1}</Name>
<ogc:Filter>
<ogc:PropertyIsEqualTo>
<ogc:PropertyName>{0}</ogc:PropertyName>
<ogc:Literal>{1}</ogc:Literal>
</ogc:PropertyIsEqualTo>
</ogc:Filter>
<LineSymbolizer>
<Stroke>
<CssParameter name="stroke">{2}</CssParameter>
<CssParameter name="stroke-width">1</CssParameter>
</Stroke>
</LineSymbolizer>
</Rule>
""".format(
column_name, value, color
)
elif geom_type == "polygon":
rule += """
<Rule>
<Name>{0}</Name>
<Title>{1}</Title>
<ogc:Filter>
<ogc:PropertyIsEqualTo>
<ogc:PropertyName>{0}</ogc:PropertyName>
<ogc:Literal>{1}</ogc:Literal>
</ogc:PropertyIsEqualTo>
</ogc:Filter>
<PolygonSymbolizer>
<Fill>
<CssParameter name="fill">{2}</CssParameter>
</Fill>
<Stroke>
<CssParameter name="stroke">{3}</CssParameter>
<CssParameter name="stroke-width">0.5</CssParameter>
</Stroke>
</PolygonSymbolizer>
</Rule>
""".format(
column_name, value, color, "#000000"
)
else:
print("Error: Invalid geometry type")
return
style = """
<StyledLayerDescriptor xmlns="http://www.opengis.net/sld" xmlns:ogc="http://www.opengis.net/ogc" xmlns:se="http://www.opengis.net/se" xmlns:xlink="http://www.w3.org/1999/xlink" xsi:schemaLocation="http://www.opengis.net/sld http://schemas.opengis.net/sld/1.1.0/StyledLayerDescriptor.xsd" version="1.1.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<NamedLayer>
<se:Name>Layer name</se:Name>
<UserStyle>
<se:Name>Layer name</se:Name>
<FeatureTypeStyle>
{}
</FeatureTypeStyle>
</UserStyle>
</NamedLayer>
</StyledLayerDescriptor>
""".format(
rule
)
with open("style.sld", "w") as f:
f.write(style)
def classified_xml(
style_name: str,
column_name: str,
values: List[float],
color_ramp: str = None,
geom_type: str = "polygon",
):
max_value = max(values)
min_value = min(values)
diff = max_value - min_value
n = 5
interval = diff / 5
palette = sns.color_palette(color_ramp, int(n))
palette_hex = [rgb2hex(i) for i in palette]
# interval = N/4
# color_values = [{value: color} for value, color in zip(values, palette_hex)]
# print(color_values)
rule = ""
for i, color in enumerate(palette_hex):
print(i)
rule += """
<se:Rule>
<se:Name>{1}</se:Name>
<se:Description>
<se:Title>{4}</se:Title>
</se:Description>
<ogc:Filter xmlns:ogc="http://www.opengis.net/ogc">
<ogc:And>
<ogc:PropertyIsGreaterThan>
<ogc:PropertyName>{0}</ogc:PropertyName>
<ogc:Literal>{5}</ogc:Literal>
</ogc:PropertyIsGreaterThan>
<ogc:PropertyIsLessThanOrEqualTo>
<ogc:PropertyName>{0}</ogc:PropertyName>
<ogc:Literal>{4}</ogc:Literal>
</ogc:PropertyIsLessThanOrEqualTo>
</ogc:And>
</ogc:Filter>
<se:PolygonSymbolizer>
<se:Fill>
<se:SvgParameter name="fill">{2}</se:SvgParameter>
</se:Fill>
<se:Stroke>
<se:SvgParameter name="stroke">{3}</se:SvgParameter>
<se:SvgParameter name="stroke-width">1</se:SvgParameter>
<se:SvgParameter name="stroke-linejoin">bevel</se:SvgParameter>
</se:Stroke>
</se:PolygonSymbolizer>
</se:Rule>
""".format(
column_name,
style_name,
color,
"#000000",
min_value + interval * i,
min_value + interval * (i + 1),
)
style = """
<StyledLayerDescriptor xmlns="http://www.opengis.net/sld" xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:ogc="http://www.opengis.net/ogc" version="1.1.0" xmlns:se="http://www.opengis.net/se" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.opengis.net/sld http://schemas.opengis.net/sld/1.1.0/StyledLayerDescriptor.xsd">
<NamedLayer>
<se:Name>{0}</se:Name>
<UserStyle>
<se:Name>{0}</se:Name>
<se:FeatureTypeStyle>
{1}
</se:FeatureTypeStyle>
</UserStyle>
</NamedLayer>
</StyledLayerDescriptor>
""".format(
style_name, rule
)
with open("style.sld", "w") as f:
f.write(style)
================================================
FILE: geo/__init__.py
================================================
================================================
FILE: geo/__version__.py
================================================
# This information is located in its own file so that it can be loaded
# without importing the main package when its dependencies are not installed.
# See: https://packaging.python.org/guides/single-sourcing-package-version
__author__ = "Tek Kshetri"
__email__ = "iamtekson@gmail.com"
__version__ = "2.10.0"
================================================
FILE: geo/supports.py
================================================
import os
import re
from tempfile import mkstemp
from typing import Dict
from zipfile import ZipFile
import xml.etree.ElementTree as ET
def prepare_zip_file(name: str, data: Dict) -> str:
"""Creates a zip file from
GeoServer's REST API uses ZIP archives as containers for file formats such
as Shapefile and WorldImage which include several 'boxcar' files alongside
the main data. In such archives, GeoServer assumes that all of the relevant
files will have the same base name and appropriate extensions, and live in
the root of the ZIP archive. This method produces a zip file that matches
these expectations, based on a basename, and a dict of extensions to paths or
file-like objects. The client code is responsible for deleting the zip
archive when it's done.
Parameters
----------
name : name of files
data : dict
Returns
-------
str
"""
fd, path = mkstemp()
zip_file = ZipFile(path, "w", allowZip64=True)
print(fd, path, zip_file, data)
for ext, stream in data.items():
fname = "{}.{}".format(name, ext)
if isinstance(stream, str):
zip_file.write(stream, fname)
else:
zip_file.writestr(fname, stream.read())
zip_file.close()
os.close(fd)
return path
def is_valid_xml(xml_string: str) -> bool:
"""
Returns True if string is valid XML, false otherwise
Parameters
----------
xml_string : string containing xml
Returns
-------
bool
"""
try:
# Attempt to parse the XML string
ET.fromstring(xml_string)
return True
except ET.ParseError:
return False
def is_surrounded_by_quotes(text, param):
# The regex pattern searches for '%foo%' surrounded by single quotes.
# It uses \'%foo%\' to match '%foo%' literally, including the single quotes.
pattern = rf"\'%{param}%\'"
# re.search() searches the string for the first location where the regex pattern produces a match.
# If a match is found, re.search() returns a match object. Otherwise, it returns None.
match = re.search(pattern, text)
# Return True if a match is found, False otherwise.
return bool(match)
================================================
FILE: requirements.txt
================================================
pygments
requests
xmltodict
================================================
FILE: requirements_dev.txt
================================================
-r requirements.txt
pytest
black
flake8
sphinx>=1.7
sphinx-rtd-theme>=2.0
pre-commit
environs
ddt
sqlalchemy>=2.0.29
psycopg2>=2.9.9
gdal>=3.4.1 # Note: in the automated test pipeline, this will be replaced by whatever is the output of `gdal-config --version`
seaborn>=0.13.2
ddt>=1.7.1
xmltodict>=0.13.0
================================================
FILE: requirements_style.txt
================================================
matplotlib
seaborn
gdal
================================================
FILE: setup.cfg
================================================
[flake8]
ignore =
C901
E203
E231
E266
E501
F401
F403
W503
W504
max-line-length = 88
max-complexity = 12
exclude =
.git,
__pycache__,
docs/source/conf.py,
build,
dist,
src,
.eggs,
[isort]
profile = black
[tool:pytest]
norecursedirs = src .git bin conda-recipe joss
python_files = test_*.py
================================================
FILE: setup.py
================================================
import os
from typing import Dict
from setuptools import setup
HERE = os.path.abspath(os.path.dirname(__file__))
about = dict()
with open(os.path.join(HERE, "geo", "__version__.py")) as f:
exec(f.read(), about)
with open("README.md") as fh:
long_description = fh.read()
setup(
name="geoserver-rest",
version=about["__version__"],
author=about["__author__"],
author_email=about["__email__"],
description="Package for GeoServer rest API",
py_modules=["geoserver-rest-python"],
# package_dir={'':'src'},
license="MIT License",
long_description=long_description,
long_description_content_type="text/markdown",
url="https://github.com/iamtekson/geoserver-rest-python",
packages=["geo"],
keywords=[
"geoserver-rest-python",
"geoserver rest",
"python geoserver",
"geoserver api",
"api",
"rest geoserver",
"python",
"geoserver python",
"geoserver rest",
],
classifiers=[
"Programming Language :: Python :: 3",
"Intended Audience :: Developers",
"Intended Audience :: Science/Research",
"License :: OSI Approved :: MIT License",
"Operating System :: OS Independent",
],
install_requires=[
"pygments",
"requests",
"xmltodict",
],
extras_require={"dev": ["pytest", "black", "flake8", "sphinx>=1.7", "pre-commit"], 'style': ['matplotlib', 'seaborn', 'gdal'], 'all': ['matplotlib', 'seaborn', 'gdal']},
python_requires=">=3.6",
)
================================================
FILE: tests/.env_template
================================================
# Login information for geoserver
GEOSERVER_URL="http://localhost:8080/geoserver"
GEOSERVER_USER="admin"
GEOSERVER_PASSWORD="geoserver"
================================================
FILE: tests/__init__.py
================================================
================================================
FILE: tests/common.py
================================================
import os
from geo.Geoserver import Geoserver
GEO_URL = os.getenv("GEO_URL", "http://localhost:8080/geoserver") # relative to test machine
geo = Geoserver(GEO_URL, username=os.getenv("GEO_USER", "admin"), password=os.getenv("GEO_PASS", "geoserver"))
postgis_params = {
"host": os.getenv("DB_HOST", "localhost"), # relative to the geoserver instance
"port": os.getenv("DB_PORT", "5432"), # relative to the geoserver instance
"db": os.getenv("DB_NAME", "geodb"),
"pg_user": os.getenv("DB_USER", "geodb_user"),
"pg_password": os.getenv("DB_PASS", "geodb_pass")
}
# in case you are using docker or something, and the location of the database is different relative to your host machine
postgis_params_local_override = {
"host": os.getenv("DB_HOST_LOCAL", "localhost"), # relative to the test machine
"port": os.getenv("DB_PORT_LOCAL", "5432"), # relative to the test machine
}
postgis_params_local = {**postgis_params, **postgis_params_local_override}
================================================
FILE: tests/docker-compose.yaml
================================================
version: '3.8'
services:
postgis:
image: postgis/postgis:latest
environment:
POSTGRES_DB: geodb
POSTGRES_USER: geodb_user
POSTGRES_PASSWORD: geodb_pass
ports:
- "0.0.0.0:5432:5432"
geoserver:
image: kartoza/geoserver:latest
environment:
- GEOSERVER_ADMIN_PASSWORD=geoserver
- GEOSERVER_ADMIN_USER=admin
- SAMPLE_DATA=true
ports:
- "0.0.0.0:8080:8080"
================================================
FILE: tests/test_geoserver.py
================================================
import os
import pathlib
import requests
import pytest
import sqlalchemy as sa
from geo.Style import catagorize_xml, classified_xml
from geo.Geoserver import GeoserverException, Geoserver
from .common import GEO_URL, geo, postgis_params, postgis_params_local
HERE = pathlib.Path(__file__).parent.resolve()
class TestCustomRequestParameters:
def test_custom_request_parameters(self):
"""
Tests that a custom request parameter is properly applied when spcecified
It's a bit kludgy, we check that if we specify a timeout of 0, then requests raises a ValueError, which is the
intended behaviour for that library given that option. That proves that the request_options are getting passed
properly for any given request
"""
geo = Geoserver(
GEO_URL,
username=os.getenv("GEO_USER", "admin"),
password=os.getenv("GEO_PASS", "geoserver"),
request_options={"timeout": 0},
)
url = "{}/rest/about/manifest.json".format(geo.service_url)
with pytest.raises(ValueError):
geo._requests("get", url)
class TestGeoserverMethods:
def test_get_manifest(self):
"""
Tests that the manifest endpoint returns the proper dictionary
"""
response = geo.get_manifest()
assert len(response["about"]) > 0
def test_get_version(self):
"""
Tests that the version endpoint returns a dictionary containing at least one resource called `GeoServer`
"""
response = geo.get_version()
assert "GeoServer" in [
resource["@name"] for resource in response["about"]["resource"]
]
def test_get_status(self):
"""
Tests that the status endpoint returns a dictionary containing a key called `status`
"""
response = geo.get_status()
# NOT A TYPO! Geoserver returns a key called exactly `statuss`
assert "statuss" in response.keys()
def test_get_system_status(self):
"""
Tests that the status endpoint returns a dictionary containing a key called `metric`
"""
response = geo.get_system_status()
assert "metrics" in response.keys()
def test_reload(self):
"""
Tests that the reload endpoint returns the string `Status code: 200`
"""
response = geo.reload()
assert response == "Status code: 200"
def test_reset(self):
"""
Tests that the reset endpoint returns the string `Status code: 200`
"""
response = geo.reset()
assert response == "Status code: 200"
class TestWorkspace:
def test_get_default_workspace(self):
response = geo.get_default_workspace()
# Assuming that we are using the kartoza/geoserver docker image, which uses `ne` as the default workspace
assert response["workspace"]["name"] == "ne"
def test_get_workspace(self):
response = geo.get_workspace("ne")
assert response["workspace"]["name"] == "ne"
def test_get_workspaces(self):
response = geo.get_workspaces()
# Assuming that we are using the kartoza/geoserver docker image, which uses the following as workspaces
expected_workspace_names = sorted(
["cite", "it.geosolutions", "ne", "nurc", "sde", "sf", "tiger", "topp"]
)
for expected_workspace_name in expected_workspace_names:
assert expected_workspace_name in [
ws["name"] for ws in response["workspaces"]["workspace"]
]
def test_set_default_workspace(self):
try:
geo.set_default_workspace("cite")
response = geo.get_default_workspace()
assert response["workspace"]["name"] == "cite"
finally:
# Assuming that we are using the kartoza/geoserver docker image, which uses `ne` as the default workspace
geo.set_default_workspace("ne")
@pytest.mark.skip(reason="Only setup for local testing.")
class TestRequest:
def test_information(self):
geo.get_version()
geo.get_manifest()
geo.get_status()
geo.get_system_status()
def test_datastore_create(self):
a = geo.create_shp_datastore(
r"C:\Program Files (x86)\GeoServer 2.15.1\data_dir\data\demo\C_Jamoat\C_Jamoat.zip",
store_name="111",
)
# assert a == "something we expect"
print(a)
geo.get_layer("jamoat-db", workspace="demo")
geo.get_datastore("111", "demo")
geo.get_style(
"hazard_exp",
workspace="geoinformatics_center",
)
a = geo.get_styles()
# assert a == "something we expect"
a = geo.create_datastore(
"datastore4",
r"http://localhost:8080/geoserver/wfs?request=GetCapabilities",
workspace="demo",
overwrite=True,
)
# assert a == "something we expect"
a = geo.create_shp_datastore(
r"C:\Users\tek\Desktop\try\geoserver-rest\data\A_Admin_boundaries\A_Country\A_Country.zip",
"aaa",
"default",
)
# assert a == "something we expect"
print(a)
geo.publish_featurestore("datastore2", "admin_units", workspace="demo")
class TestCoverages:
def setup_method(self):
self.workspace_name = "test_workspace"
self.coveragestore = "tos"
self.coverage_name = "tos_test"
self.coverage_title = "tos test title"
self.path = f"{HERE}/data/tos_O1_2001-2002.nc"
self.type = "NetCDF"
try:
geo.create_workspace(self.workspace_name)
except:
geo.delete_workspace(self.workspace_name)
geo.create_workspace(self.workspace_name)
geo.create_coveragestore(
path=self.path,
workspace=self.workspace_name,
layer_name=self.coveragestore,
file_type=self.type,
content_type="application/x-netcdf",
method="file",
)
def teardown_method(self):
geo.delete_coveragestore(
coveragestore_name=self.coveragestore, workspace=self.workspace_name
)
geo.delete_workspace(self.workspace_name)
@pytest.mark.skip(reason="Only setup for local testing.")
def test_coverage(self):
geo.create_coveragestore(
r"C:\Users\tek\Desktop\try\geoserver-rest\data\C_EAR\a_Agriculture\agri_final_proj.tif",
workspace="demo",
lyr_name="name_try",
overwrite=False,
)
geo.upload_style(
r"C:\Users\tek\Desktop\try_sld.sld", sld_version="1.1.0", workspace="try"
)
geo.publish_style("agri_final_proj", "dem", "demo")
color_ramp1 = {"value1": "#ffff55", "value2": "#505050", "value3": "#404040"}
geo.create_coveragestyle(
style_name="demo",
raster_path=r"C:\Users\tek\Desktop\try\geoserver-rest\data\flood_alert.tif",
workspace="demo",
color_ramp=color_ramp1,
cmap_type="values",
overwrite=True,
)
@pytest.mark.skip(reason="Only setup for local testing.")
def test_create_coverage(self):
resp = geo.create_coverage(
workspace=self.workspace_name,
coveragestore=self.coveragestore,
coverage_name=self.coverage_name,
coverage_title=self.coverage_title,
)
assert resp == self.coverage_name
# @pytest.mark.skip(reason="Only setup for local testing.")
class TestFeatures:
def test_featurestore(self):
"""
Tests that you can publish an existing table as a layer
"""
table_name = "test_table"
workspace_name = "test_ws"
featurestore_name = "test_ds"
# set up DB and create a table with a feature inside
DB_HOST = postgis_params_local["host"]
DB_PORT = postgis_params_local["port"]
DB_PASS = postgis_params_local["pg_password"]
DB_USER = postgis_params_local["pg_user"]
DB_NAME = postgis_params_local["db"]
engine = sa.create_engine(
f"postgresql://{DB_USER}:{DB_PASS}@{DB_HOST}:{DB_PORT}/{DB_NAME}",
echo=False,
)
with engine.connect() as conn:
conn.execute(sa.text(f"drop table if exists {table_name};"))
conn.execute(
sa.text(
f"create table {table_name} (id integer primary key, foo text, geom geometry);"
)
)
conn.execute(
sa.text(
f"insert into {table_name} (id, foo, geom) values (0, 'bar', ST_MakePoint(0, 0, 4326));"
)
)
conn.commit()
try:
geo.create_workspace(workspace_name)
geo.create_featurestore(
workspace=workspace_name, store_name=featurestore_name, **postgis_params
)
geo.publish_featurestore(
store_name=featurestore_name,
pg_table=table_name,
workspace=workspace_name,
)
wfs_query = (
f"{GEO_URL}/{workspace_name}/ows?"
"service=WFS&"
"version=1.0.0&"
"request=GetFeature&"
f"typeName={workspace_name}%3A{table_name}&"
"outputFormat=application%2Fjson"
)
r = requ
gitextract_o9tiakx3/
├── .github/
│ └── workflows/
│ ├── python-publish.yml
│ └── python-test.yml
├── .gitignore
├── .pre-commit-config.yaml
├── .readthedocs.yaml
├── CODE_OF_CONDUCT.md
├── CONTRIBUTING.md
├── DEV_DOCS.md
├── LICENSE
├── README.md
├── conda-recipe/
│ ├── bld.bat
│ ├── build.sh
│ └── meta.yaml
├── docs/
│ ├── Makefile
│ ├── make.bat
│ ├── requirements-docs.txt
│ └── source/
│ ├── about.rst
│ ├── acknowledgements.rst
│ ├── advanced_uses.rst
│ ├── change_log.rst
│ ├── conf.py
│ ├── contribution.rst
│ ├── geo.rst
│ ├── how_to_use.rst
│ ├── index.rst
│ ├── installation.rst
│ └── license.rst
├── geo/
│ ├── Calculation_gdal.py
│ ├── Geoserver.py
│ ├── Style.py
│ ├── __init__.py
│ ├── __version__.py
│ └── supports.py
├── requirements.txt
├── requirements_dev.txt
├── requirements_style.txt
├── setup.cfg
├── setup.py
└── tests/
├── .env_template
├── __init__.py
├── common.py
├── data/
│ ├── countries-test.gpkg
│ ├── sample_geotiff.tif
│ └── tos_O1_2001-2002.nc
├── docker-compose.yaml
├── test_geoserver.py
├── test_layergroup.py
└── test_path_fix.py
SYMBOL INDEX (154 symbols across 7 files)
FILE: geo/Calculation_gdal.py
function raster_value (line 11) | def raster_value(path: str) -> dict:
FILE: geo/Geoserver.py
function _parse_request_options (line 14) | def _parse_request_options(request_options: Dict[str, Any]):
class GeoserverException (line 32) | class GeoserverException(Exception):
method __init__ (line 44) | def __init__(self, status, message):
class DataProvider (line 51) | class DataProvider:
method __init__ (line 61) | def __init__(self, data):
method read_cb (line 65) | def read_cb(self, size):
class FileReader (line 89) | class FileReader:
method __init__ (line 99) | def __init__(self, fp):
method read_callback (line 102) | def read_callback(self, size):
class Geoserver (line 119) | class Geoserver:
method __init__ (line 135) | def __init__(
method _requests (line 149) | def _requests(self, method: str, url: str, **kwargs) -> requests.Respo...
method get_manifest (line 205) | def get_manifest(self):
method get_version (line 221) | def get_version(self):
method get_status (line 237) | def get_status(self):
method get_system_status (line 253) | def get_system_status(self):
method reload (line 269) | def reload(self):
method reset (line 287) | def reset(self):
method get_default_workspace (line 309) | def get_default_workspace(self):
method get_workspace (line 326) | def get_workspace(self, workspace):
method get_workspaces (line 348) | def get_workspaces(self):
method set_default_workspace (line 365) | def set_default_workspace(self, workspace: str):
method create_workspace (line 391) | def create_workspace(self, workspace: str):
method delete_workspace (line 415) | def delete_workspace(self, workspace: str):
method get_datastore (line 445) | def get_datastore(self, store_name: str, workspace: Optional[str] = No...
method get_datastores (line 475) | def get_datastores(self, workspace: Optional[str] = None):
method get_coveragestore (line 507) | def get_coveragestore(
method get_coveragestores (line 538) | def get_coveragestores(self, workspace: str = None):
method create_coverage (line 562) | def create_coverage(
method create_coveragestore (line 615) | def create_coveragestore(
method publish_time_dimension_to_coveragestore (line 687) | def publish_time_dimension_to_coveragestore(
method get_layer (line 762) | def get_layer(self, layer_name: str, workspace: Optional[str] = None):
method get_layers (line 790) | def get_layers(self, workspace: Optional[str] = None):
method delete_layer (line 814) | def delete_layer(self, layer_name: str, workspace: Optional[str] = None):
method get_layergroups (line 849) | def get_layergroups(self, workspace: Optional[str] = None):
method get_layergroup (line 879) | def get_layergroup(self, layer_name: str, workspace: Optional[str] = N...
method create_layergroup (line 906) | def create_layergroup(
method update_layergroup (line 1106) | def update_layergroup(
method delete_layergroup (line 1228) | def delete_layergroup(
method add_layer_to_layergroup (line 1266) | def add_layer_to_layergroup(
method remove_layer_from_layergroup (line 1343) | def remove_layer_from_layergroup(
method _layergroup_definition_from_layers_and_styles (line 1427) | def _layergroup_definition_from_layers_and_styles(
method get_style (line 1504) | def get_style(self, style_name, workspace: Optional[str] = None):
method get_styles (line 1538) | def get_styles(self, workspace: Optional[str] = None):
method upload_style (line 1569) | def upload_style(
method create_coveragestyle (line 1658) | def create_coveragestyle(
method create_catagorized_featurestyle (line 1764) | def create_catagorized_featurestyle(
method create_outline_featurestyle (line 1848) | def create_outline_featurestyle(
method create_classified_featurestyle (line 1925) | def create_classified_featurestyle(
method publish_style (line 2015) | def publish_style(
method delete_style (line 2068) | def delete_style(self, style_name: str, workspace: Optional[str] = None):
method create_featurestore (line 2109) | def create_featurestore(
method create_datastore (line 2310) | def create_datastore(
method create_shp_datastore (line 2390) | def create_shp_datastore(
method create_gpkg_datastore (line 2456) | def create_gpkg_datastore(
method publish_featurestore (line 2521) | def publish_featurestore(
method edit_featuretype (line 2604) | def edit_featuretype(
method publish_featurestore_sqlview (line 2679) | def publish_featurestore_sqlview(
method get_featuretypes (line 2832) | def get_featuretypes(
method get_feature_attribute (line 2866) | def get_feature_attribute(
method get_featurestore (line 2904) | def get_featurestore(self, store_name: str, workspace: str) -> dict:
method delete_featurestore (line 2935) | def delete_featurestore(
method delete_coveragestore (line 2971) | def delete_coveragestore(
method get_all_users (line 3017) | def get_all_users(self, service=None) -> dict:
method create_user (line 3050) | def create_user(
method modify_user (line 3094) | def modify_user(
method delete_user (line 3147) | def delete_user(self, username: str, service=None) -> str:
method get_all_usergroups (line 3182) | def get_all_usergroups(self, service=None) -> dict:
method create_usergroup (line 3214) | def create_usergroup(self, group: str, service=None) -> str:
method delete_usergroup (line 3247) | def delete_usergroup(self, group: str, service=None) -> str:
method update_service (line 3287) | def update_service(self, service: str, **kwargs):
FILE: geo/Style.py
function coverage_style_colormapentry (line 9) | def coverage_style_colormapentry(
function coverage_style_xml (line 84) | def coverage_style_xml(
function outline_only_xml (line 134) | def outline_only_xml(color, width, geom_type="polygon"):
function catagorize_xml (line 203) | def catagorize_xml(
function classified_xml (line 312) | def classified_xml(
FILE: geo/supports.py
function prepare_zip_file (line 9) | def prepare_zip_file(name: str, data: Dict) -> str:
function is_valid_xml (line 44) | def is_valid_xml(xml_string: str) -> bool:
function is_surrounded_by_quotes (line 66) | def is_surrounded_by_quotes(text, param):
FILE: tests/test_geoserver.py
class TestCustomRequestParameters (line 16) | class TestCustomRequestParameters:
method test_custom_request_parameters (line 18) | def test_custom_request_parameters(self):
class TestGeoserverMethods (line 38) | class TestGeoserverMethods:
method test_get_manifest (line 40) | def test_get_manifest(self):
method test_get_version (line 48) | def test_get_version(self):
method test_get_status (line 58) | def test_get_status(self):
method test_get_system_status (line 67) | def test_get_system_status(self):
method test_reload (line 75) | def test_reload(self):
method test_reset (line 83) | def test_reset(self):
class TestWorkspace (line 92) | class TestWorkspace:
method test_get_default_workspace (line 94) | def test_get_default_workspace(self):
method test_get_workspace (line 100) | def test_get_workspace(self):
method test_get_workspaces (line 105) | def test_get_workspaces(self):
method test_set_default_workspace (line 117) | def test_set_default_workspace(self):
class TestRequest (line 129) | class TestRequest:
method test_information (line 130) | def test_information(self):
method test_datastore_create (line 136) | def test_datastore_create(self):
class TestCoverages (line 171) | class TestCoverages:
method setup_method (line 173) | def setup_method(self):
method teardown_method (line 194) | def teardown_method(self):
method test_coverage (line 201) | def test_coverage(self):
method test_create_coverage (line 223) | def test_create_coverage(self):
class TestFeatures (line 234) | class TestFeatures:
method test_featurestore (line 236) | def test_featurestore(self):
method test_sql_featurestore (line 300) | def test_sql_featurestore(self):
method test_parameterized_sql_featurestore (line 342) | def test_parameterized_sql_featurestore(self):
method test_parameterized_sql_featurestore_regexp_validator (line 395) | def test_parameterized_sql_featurestore_regexp_validator(self):
method test_parameterized_sql_featurestore_fails_when_integer_parameter_has_no_default_value (line 465) | def test_parameterized_sql_featurestore_fails_when_integer_parameter_h...
class TestStyles (line 516) | class TestStyles:
method test_styles (line 517) | def test_styles(self):
class TestCreateGeopackageDatastore (line 530) | class TestCreateGeopackageDatastore:
method test_create_geopackage_datastore_from_file (line 532) | def test_create_geopackage_datastore_from_file(self):
class TestUploadStyles (line 541) | class TestUploadStyles:
method test_upload_style_from_file (line 543) | def test_upload_style_from_file(self):
method test_upload_style_from_malformed_file_fails (line 554) | def test_upload_style_from_malformed_file_fails(self):
method test_upload_style_from_xml (line 569) | def test_upload_style_from_xml(self):
method test_upload_style_from_malformed_xml_fails (line 581) | def test_upload_style_from_malformed_xml_fails(self):
class TestPostGres (line 596) | class TestPostGres:
method test_postgres (line 601) | def test_postgres(self):
class TestDeletion (line 616) | class TestDeletion:
method test_delete (line 619) | def test_delete(self):
class TestOther (line 627) | class TestOther:
method test_classified_xml (line 628) | def test_classified_xml(self):
class TestCoveragestore (line 632) | class TestCoveragestore:
method setup_method (line 633) | def setup_method(self):
method teardown_method (line 645) | def teardown_method(self):
method _verify_coveragestore (line 648) | def _verify_coveragestore(self, response):
method _test_create_coveragestore (line 662) | def _test_create_coveragestore(self, method, path=None):
method test_create_coveragestore_using_file_method (line 682) | def test_create_coveragestore_using_file_method(self):
method test_create_coveragestore_using_external_method (line 689) | def test_create_coveragestore_using_external_method(self):
method test_create_coveragestore_using_url_method (line 696) | def test_create_coveragestore_using_url_method(self):
FILE: tests/test_layergroup.py
class TestLayerGroup (line 13) | class TestLayerGroup(unittest.TestCase):
method setUpClass (line 36) | def setUpClass(cls):
method tearDownClass (line 84) | def tearDownClass(cls):
method setUp (line 90) | def setUp(self):
method tearDown (line 96) | def tearDown(self):
method test_get_layergroup_that_doesnt_exist (line 125) | def test_get_layergroup_that_doesnt_exist(self, layergroup_name, works...
method test_delete_layergroup_that_doesnt_exist (line 139) | def test_delete_layergroup_that_doesnt_exist(self, layergroup_name, wo...
method test_create_and_get_and_delete_layergroup (line 167) | def test_create_and_get_and_delete_layergroup(self, name, mode, title,...
method test_add_layer_to_layergroup (line 233) | def test_add_layer_to_layergroup(self, workspace):
method test_add_layer_to_layergroup_that_doesnt_exist (line 292) | def test_add_layer_to_layergroup_that_doesnt_exist(self):
method test_add_layer_that_doesnt_exist_to_layergroup (line 303) | def test_add_layer_that_doesnt_exist_to_layergroup(self):
method test_remove_layer_from_layergroup (line 325) | def test_remove_layer_from_layergroup(self, workspace):
method test_remove_layer_from_layergroup_that_doesnt_exist (line 355) | def test_remove_layer_from_layergroup_that_doesnt_exist(self):
method test_remove_layer_that_doesnt_exist_from_layergroup (line 366) | def test_remove_layer_that_doesnt_exist_from_layergroup(self):
FILE: tests/test_path_fix.py
class TestPathConversion (line 12) | class TestPathConversion(unittest.TestCase):
method setUp (line 15) | def setUp(self):
method test_absolute_path_conversion (line 28) | def test_absolute_path_conversion(self):
method test_relative_paths_unchanged (line 58) | def test_relative_paths_unchanged(self):
method test_http_urls_unchanged (line 80) | def test_http_urls_unchanged(self):
method test_force_absolute_path_parameter (line 101) | def test_force_absolute_path_parameter(self):
method test_create_datastore_path_conversion (line 127) | def test_create_datastore_path_conversion(self):
method test_http_url_handling (line 160) | def test_http_url_handling(self):
function test_path_conversion_demo (line 174) | def test_path_conversion_demo():
Condensed preview — 48 files, each showing path, character count, and a content snippet. Download the .json file or copy for the full structured content (246K chars).
[
{
"path": ".github/workflows/python-publish.yml",
"chars": 1136,
"preview": "# This workflow will upload a Python Package using Twine when a release is created\n# For more information see: https://h"
},
{
"path": ".github/workflows/python-test.yml",
"chars": 5775,
"preview": "name: Run tests\n\non:\n push:\n branches: [master]\n pull_request:\n branches: [master]\n # schedule:\n # - cron: \""
},
{
"path": ".gitignore",
"chars": 2234,
"preview": "style.sld\nFunctionsToImplement.py\nrecord.txt\npackage_test.py\ntest.py\n.idea/\n\n# Created by https://www.toptal.com/develop"
},
{
"path": ".pre-commit-config.yaml",
"chars": 1391,
"preview": "default_language_version:\n python: python3\n\nrepos:\n- repo: https://github.com/asottile/pyupgrade\n rev: v3.3.1\n "
},
{
"path": ".readthedocs.yaml",
"chars": 1040,
"preview": "# Read the Docs configuration file for Sphinx projects\n# See https://docs.readthedocs.io/en/stable/config-file/v2.html f"
},
{
"path": "CODE_OF_CONDUCT.md",
"chars": 3351,
"preview": "# Contributor Covenant Code of Conduct\n\n## Our Pledge\n\nIn the interest of fostering an open and welcoming environment, w"
},
{
"path": "CONTRIBUTING.md",
"chars": 471,
"preview": "# Contributing Guidelines\n\nGeoserver-rest is an open source library written in python and contributors are needed to kee"
},
{
"path": "DEV_DOCS.md",
"chars": 1609,
"preview": "# Documentation\n\n### Publishing the python package\n\nFor publishing the python package, follow following steps,\n\n1. Insta"
},
{
"path": "LICENSE",
"chars": 1142,
"preview": "MIT License\n\nCopyright (c) 2020, Geoinformatics Center, Asian Institute of Technology\nCopyright (c) 2020, Tek Kshetri\n\nP"
},
{
"path": "README.md",
"chars": 4604,
"preview": "# Python wrapper for GeoServer REST API\n\n[](https://pepy.tech/projec"
},
{
"path": "conda-recipe/bld.bat",
"chars": 101,
"preview": "cd %RECIPE_DIR%\\..\n%PYTHON% setup.py install --single-version-externally-managed --record=record.txt\n"
},
{
"path": "conda-recipe/build.sh",
"chars": 25,
"preview": "$PYTHON setup.py install\n"
},
{
"path": "conda-recipe/meta.yaml",
"chars": 730,
"preview": "{% set data = load_setup_py_data() %}\npackage:\n name: \"geoserver-rest\"\n version: {{ data.get('version') }}\n\nbuild:\n #"
},
{
"path": "docs/Makefile",
"chars": 638,
"preview": "# Minimal makefile for Sphinx documentation\n#\n\n# You can set these variables from the command line, and also\n# from the "
},
{
"path": "docs/make.bat",
"chars": 764,
"preview": "@ECHO OFF\n\npushd %~dp0\n\nREM Command file for Sphinx documentation\n\nif \"%SPHINXBUILD%\" == \"\" (\n\tset SPHINXBUILD=sphinx-bu"
},
{
"path": "docs/requirements-docs.txt",
"chars": 150,
"preview": "jinja2\nsphinx\nsphinx_rtd_theme\nreadthedocs-sphinx-ext\nm2r\nmistune\nsphinxcontrib-httpdomain\nsphinxcontrib-openapi\ngeopand"
},
{
"path": "docs/source/about.rst",
"chars": 726,
"preview": "About geoserver-rest!\n=====================\n\nWhat is geoserver-rest?\n^^^^^^^^^^^^^^^^^^^^^^^\n\nThe geoserver-rest package"
},
{
"path": "docs/source/acknowledgements.rst",
"chars": 204,
"preview": "Acknowledgements\n================\n\nCreated and managed by `Tek Bahadur Kshetri <https://github.com/iamtekson>`_ for the "
},
{
"path": "docs/source/advanced_uses.rst",
"chars": 1690,
"preview": "Advanced uses for automation\n============================\n\nThe following code will first convert all the ``.rst`` data f"
},
{
"path": "docs/source/change_log.rst",
"chars": 3820,
"preview": "Change Log\n=============\n\n``Master branch``\n^^^^^^^^^^^^^^^^^\n* New method `create_gpkg_datastore`\n* Bugfixes for `add_l"
},
{
"path": "docs/source/conf.py",
"chars": 1985,
"preview": "# Configuration file for the Sphinx documentation builder.\n#\n# This file only contains a selection of the most common op"
},
{
"path": "docs/source/contribution.rst",
"chars": 191,
"preview": "Contribution\n=============\n\nGeoserver-rest is an open source library written in python and contributors are needed to ke"
},
{
"path": "docs/source/geo.rst",
"chars": 771,
"preview": "Python API Reference\n====================\n\n**Disclamer**: The documentation in the code was initially written by people,"
},
{
"path": "docs/source/how_to_use.rst",
"chars": 15943,
"preview": "How to use\n===========\n\nThis library is built for getting, creating, updating and deleting workspaces, coveragestores, f"
},
{
"path": "docs/source/index.rst",
"chars": 757,
"preview": ".. geoserver-rest documentation master file, created by\n sphinx-quickstart on Wed Mar 10 14:46:38 2021.\n You can ada"
},
{
"path": "docs/source/installation.rst",
"chars": 3442,
"preview": "Installation\n=============\n\n.. warning::\n As of version 2.9.0, the required dependency ``gdal``, ``matplotlib`` and `"
},
{
"path": "docs/source/license.rst",
"chars": 1176,
"preview": "License\n=========\n\nMIT License\n^^^^^^^^^^^^\n\n\nCopyright (c) 2020, Geoinformatics Center, Asian Institute of Technology\n\n"
},
{
"path": "geo/Calculation_gdal.py",
"chars": 1177,
"preview": "import os\n\ntry:\n from osgeo import gdal # noqa\nexcept ImportError:\n import gdal # noqa\nexcept ImportError:\n r"
},
{
"path": "geo/Geoserver.py",
"chars": 108480,
"preview": "# inbuilt libraries\nimport os\nfrom typing import List, Optional, Set, Union, Dict, Iterable, Any\nfrom pathlib import Pat"
},
{
"path": "geo/Style.py",
"chars": 13183,
"preview": "# inbuilt libraries\nfrom typing import Dict, Iterable, List, Union\n\n# third-party libraries\nimport seaborn as sns\nfrom m"
},
{
"path": "geo/__init__.py",
"chars": 1,
"preview": "\n"
},
{
"path": "geo/__version__.py",
"chars": 309,
"preview": "# This information is located in its own file so that it can be loaded\n# without importing the main package when its dep"
},
{
"path": "geo/supports.py",
"chars": 2229,
"preview": "import os\nimport re\nfrom tempfile import mkstemp\nfrom typing import Dict\nfrom zipfile import ZipFile\nimport xml.etree.El"
},
{
"path": "requirements.txt",
"chars": 27,
"preview": "pygments\nrequests\nxmltodict"
},
{
"path": "requirements_dev.txt",
"chars": 307,
"preview": "-r requirements.txt\n\npytest\nblack\nflake8\nsphinx>=1.7\nsphinx-rtd-theme>=2.0\npre-commit\nenvirons\nddt\nsqlalchemy>=2.0.29\nps"
},
{
"path": "requirements_style.txt",
"chars": 23,
"preview": "matplotlib\nseaborn\ngdal"
},
{
"path": "setup.cfg",
"chars": 307,
"preview": "[flake8]\nignore =\n\tC901\n\tE203\n\tE231\n\tE266\n\tE501\n\tF401\n\tF403\n\tW503\n\tW504\nmax-line-length = 88\nmax-complexity = 12\nexclude"
},
{
"path": "setup.py",
"chars": 1547,
"preview": "import os\nfrom typing import Dict\n\nfrom setuptools import setup\n\nHERE = os.path.abspath(os.path.dirname(__file__))\n\nabou"
},
{
"path": "tests/.env_template",
"chars": 136,
"preview": "# Login information for geoserver\nGEOSERVER_URL=\"http://localhost:8080/geoserver\"\nGEOSERVER_USER=\"admin\"\nGEOSERVER_PASSW"
},
{
"path": "tests/__init__.py",
"chars": 0,
"preview": ""
},
{
"path": "tests/common.py",
"chars": 985,
"preview": "import os\n\nfrom geo.Geoserver import Geoserver\n\nGEO_URL = os.getenv(\"GEO_URL\", \"http://localhost:8080/geoserver\") # rel"
},
{
"path": "tests/docker-compose.yaml",
"chars": 430,
"preview": "version: '3.8'\n\nservices:\n\n postgis:\n image: postgis/postgis:latest\n environment:\n POSTGRES_DB: geodb\n "
},
{
"path": "tests/test_geoserver.py",
"chars": 24311,
"preview": "import os\nimport pathlib\n\nimport requests\nimport pytest\nimport sqlalchemy as sa\n\nfrom geo.Style import catagorize_xml, c"
},
{
"path": "tests/test_layergroup.py",
"chars": 12688,
"preview": "import os\nimport unittest\nfrom environs import Env\nimport pytest\n\nfrom unittest.mock import MagicMock, patch #allows rep"
},
{
"path": "tests/test_path_fix.py",
"chars": 11105,
"preview": "#!/usr/bin/env python3\n\"\"\"\nTest script to demonstrate the absolute path fix for create_datastore function.\n\"\"\"\n\nimport o"
}
]
// ... and 3 more files (download for full content)
About this extraction
This page contains the full source code of the gicait/geoserver-rest GitHub repository, extracted and formatted as plain text for AI agents and large language models (LLMs). The extraction includes 48 files (227.6 KB), approximately 52.7k tokens, and a symbol index with 154 extracted functions, classes, methods, constants, and types. Use this with OpenClaw, Claude, ChatGPT, Cursor, Windsurf, or any other AI tool that accepts text input. You can copy the full output to your clipboard or download it as a .txt file.
Extracted by GitExtract — free GitHub repo to text converter for AI. Built by Nikandr Surkov.