Repository: vatlab/sos-notebook
Branch: master
Commit: a770bbdf1069
Files: 115
Total size: 1.1 MB
Directory structure:
gitextract_658abc0k/
├── .appveyor.yml
├── .github/
│ ├── linters/
│ │ └── .python-lint
│ └── workflows/
│ ├── pylint.yml
│ ├── pytest.yml
│ └── python-publish.yml
├── .gitignore
├── .pre-commit-config.yaml
├── .travis.yml
├── CLAUDE.md
├── CONTRIBUTING.md
├── LICENSE
├── MANIFEST.in
├── README.md
├── development/
│ ├── README.md
│ ├── docker-compose.yml
│ ├── eg_sshd/
│ │ ├── .ssh/
│ │ │ ├── id_rsa
│ │ │ ├── id_rsa.pub
│ │ │ └── known_hosts
│ │ └── Dockerfile
│ ├── install_sos_notebook.sh
│ └── sos_notebook_test/
│ ├── .ssh/
│ │ ├── id_rsa
│ │ ├── id_rsa.pub
│ │ └── known_hosts
│ └── Dockerfile
├── pyproject.toml
├── setup.py.old
├── src/
│ └── sos_notebook/
│ ├── __init__.py
│ ├── _version.py
│ ├── comm_manager.py
│ ├── completer.py
│ ├── converter.py
│ ├── inspector.py
│ ├── install.py
│ ├── install_sos_notebook.sh
│ ├── kernel.py
│ ├── magics.py
│ ├── step_executor.py
│ ├── subkernel.py
│ ├── templates/
│ │ ├── README.md
│ │ ├── sos-cm/
│ │ │ ├── conf.json
│ │ │ ├── index.html.j2
│ │ │ ├── parts/
│ │ │ │ ├── cm.tpl
│ │ │ │ └── sos-mode.js
│ │ │ └── sos-cm.html.j2
│ │ ├── sos-cm-toc/
│ │ │ ├── conf.json
│ │ │ ├── index.html.j2
│ │ │ ├── parts/
│ │ │ │ └── toc.tpl
│ │ │ └── sos-cm-toc.html.j2
│ │ ├── sos-full/
│ │ │ ├── conf.json
│ │ │ ├── index.html.j2
│ │ │ ├── parts/
│ │ │ │ ├── preview.tpl
│ │ │ │ └── sos_style.tpl
│ │ │ └── sos-full.html.j2
│ │ ├── sos-full-toc/
│ │ │ ├── conf.json
│ │ │ ├── index.html.j2
│ │ │ ├── parts/
│ │ │ │ └── toc.tpl
│ │ │ └── sos-full-toc.html.j2
│ │ ├── sos-lab-cm/
│ │ │ ├── conf.json
│ │ │ ├── index.html.j2
│ │ │ ├── parts/
│ │ │ │ ├── cm.tpl
│ │ │ │ └── sos-mode.js
│ │ │ └── sos-lab-cm.html.j2
│ │ ├── sos-lab-full/
│ │ │ ├── conf.json
│ │ │ ├── index.html.j2
│ │ │ ├── parts/
│ │ │ │ ├── preview.tpl
│ │ │ │ └── sos_style.tpl
│ │ │ ├── sos-lab-full.html.j2
│ │ │ └── static/
│ │ │ ├── index.css
│ │ │ ├── theme-dark.css
│ │ │ └── theme-light.css
│ │ ├── sos-lab-report-only/
│ │ │ ├── conf.json
│ │ │ ├── index.html.j2
│ │ │ └── sos-lab-report-only.html.j2
│ │ ├── sos-markdown/
│ │ │ ├── conf.json
│ │ │ ├── index.md.j2
│ │ │ └── sos-markdown.md.j2
│ │ ├── sos-report/
│ │ │ ├── conf.json
│ │ │ ├── index.html.j2
│ │ │ ├── parts/
│ │ │ │ └── control_panel.tpl
│ │ │ └── sos-report.html.j2
│ │ ├── sos-report-only/
│ │ │ ├── conf.json
│ │ │ ├── index.html.j2
│ │ │ └── sos-report-only.html.j2
│ │ ├── sos-report-only-toc/
│ │ │ ├── conf.json
│ │ │ ├── index.html.j2
│ │ │ ├── parts/
│ │ │ │ └── toc.tpl
│ │ │ └── sos-report-only-toc.html.j2
│ │ ├── sos-report-toc/
│ │ │ ├── conf.json
│ │ │ ├── index.html.j2
│ │ │ ├── parts/
│ │ │ │ └── toc.tpl
│ │ │ └── sos-report-toc.html.j2
│ │ ├── sos-report-toc-v2/
│ │ │ ├── conf.json
│ │ │ ├── index.html.j2
│ │ │ ├── parts/
│ │ │ │ └── toc.tpl
│ │ │ └── sos-report-toc-v2.html.j2
│ │ ├── sos-report-v1/
│ │ │ ├── conf.json
│ │ │ ├── index.html.j2
│ │ │ ├── parts/
│ │ │ │ └── control_panel_v1.tpl
│ │ │ └── sos-report-v1.html.j2
│ │ └── sos-report-v2/
│ │ ├── conf.json
│ │ ├── index.html.j2
│ │ ├── parts/
│ │ │ └── control_panel.tpl
│ │ └── sos-report-v2.html.j2
│ ├── test_utils.py
│ └── workflow_executor.py
├── tasks.py
└── test/
├── __init__.py
├── build_test_docker.sh
├── conftest.py
├── sample_notebook.ipynb
├── sample_papermill_notebook.ipynb
├── sample_workflow.ipynb
├── test_convert.py
├── test_magics.py
└── test_workflow.py
================================================
FILE CONTENTS
================================================
================================================
FILE: .appveyor.yml
================================================
version: 1.0.{build}
# docker support
#image: Visual Studio 2017
#init:
# - ps: iex ((new-object net.webclient).DownloadString('https://raw.githubusercontent.com/appveyor/ci/master/scripts/enable-rdp.ps1'))
branches:
only:
- master
skip_tags: true
max_jobs: 100
build: none
clone_folder: c:\projects\sos
clone_depth: 50
shallow_clone: false
environment:
matrix:
- PYTHON: "C:\\Miniconda36-x64"
PYTHON_VERSION: 3.8
install:
- set PATH=%PYTHON%;%PYTHON%\\Scripts;%PATH%
- conda config --set always_yes yes --set changeps1 no
- conda update -q conda
# Useful for debugging any issues with conda
- conda info -a
- conda install -c conda-forge feather-format r-base r-irkernel
#
#
# add docker
#- pip install docker
# packages required by SoS
- pip install spyder jedi notebook nbconvert nbformat pyyaml psutil tqdm matplotlib
- pip install fasteners pygments ipython ptpython networkx pydot pydotplus nose selenium
# https://github.com/jupyter/jupyter/issues/150
- pip install entrypoints markdown
- pip install jupyter wand numpy pandas papermill sos-papermill
# install github version of sos
- pip install git+https://github.com/vatlab/sos.git
- pip install pytest
- pip install . -U
- python -m sos_notebook.install
- pip install sos-python sos-r
test_script:
- cd test
- pytest -x -v
#on_finish:
# - ps: $blockRdp = $true; iex ((new-object net.webclient).DownloadString('https://raw.githubusercontent.com/appveyor/ci/master/scripts/enable-rdp.ps1'))
notifications:
- provider: Email
to:
- ben.bob@gmail.com
on_build_status_changed: true
================================================
FILE: .github/linters/.python-lint
================================================
[MASTER]
# A comma-separated list of package or module names from where C extensions may
# be loaded. Extensions are loading into the active Python interpreter and may
# run arbitrary code.
extension-pkg-whitelist=
# Specify a score threshold to be exceeded before program exits with error.
fail-under=10.0
# Add files or directories to the blacklist. They should be base names, not
# paths.
ignore=development
# Add files or directories matching the regex patterns to the blacklist. The
# regex matches against base names, not paths.
ignore-patterns=\d{4}_.*?.py
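A quick sanity check (not part of the pylint configuration itself) of how the `ignore-patterns` regex above behaves; the filenames are hypothetical examples, and pylint matches the pattern against base names, not paths:

```python
import re

# Pattern copied from the ignore-patterns option above.
PATTERN = re.compile(r"\d{4}_.*?.py")

# Base names starting with four digits and an underscore are ignored.
print(bool(PATTERN.match("0001_initial.py")))  # matches -> ignored
print(bool(PATTERN.match("kernel.py")))        # no match -> linted
```

Note the unescaped `.` before `py`: as written, the pattern would also match a name like `0001_xpy`, which is likely harmless for this repository.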
# Python code to execute, usually for sys.path manipulation such as
# pygtk.require().
#init-hook=
# Use multiple processes to speed up Pylint. Specifying 0 will auto-detect the
# number of processors available to use.
jobs=1
# Control the amount of potential inferred values when inferring a single
# object. This can help the performance when dealing with large functions or
# complex, nested conditions.
limit-inference-results=100
# List of plugins (as comma separated values of python module names) to load,
# usually to register additional checkers.
# these two plugins are good to have but super-linter does not support it.
#
# pylint_django, pylint_celery
#
load-plugins=
# Pickle collected data for later comparisons.
persistent=yes
# When enabled, pylint would attempt to guess common misconfiguration and emit
# user-friendly hints instead of false-positive error messages.
# suggestion-mode=yes # removed: unrecognized in newer pylint
# Allow loading of arbitrary C extensions. Extensions are imported into the
# active Python interpreter and may run arbitrary code.
unsafe-load-any-extension=no
[MESSAGES CONTROL]
# Only show warnings with the listed confidence levels. Leave empty to show
# all. Valid levels: HIGH, INFERENCE, INFERENCE_FAILURE, UNDEFINED.
confidence=
# Disable the message, report, category or checker with the given id(s). You
# can either give multiple identifiers separated by comma (,) or put this
# option multiple times (only on the command line, not in the configuration
# file where it should appear only once). You can also use "--disable=all" to
# disable everything first and then reenable specific checks. For example, if
# you want to run only the similarities checker, you can use "--disable=all
# --enable=similarities". If you want to run only the classes checker, but have
# no Warning level messages displayed, use "--disable=all --enable=classes
# --disable=W".
disable=abstract-method,
arguments-differ,
arguments-renamed,
attribute-defined-outside-init,
bad-inline-option,
bare-except,
broad-except,
chained-comparison,
consider-iterating-dictionary,
consider-using-dict-items,
consider-using-generator,
consider-using-get,
consider-using-in,
consider-using-f-string,
consider-using-set-comprehension,
consider-using-with,
cyclic-import,
dangerous-default-value,
deprecated-method,
deprecated-pragma,
duplicate-code,
eval-used,
exec-used,
file-ignored,
fixme,
global-statement,
import-error,
import-outside-toplevel,
import-self,
inconsistent-return-statements,
invalid-name,
invalid-unary-operand-type,
line-too-long,
locally-disabled,
logging-format-interpolation,
logging-fstring-interpolation,
logging-not-lazy,
missing-class-docstring,
missing-function-docstring,
missing-module-docstring,
no-member,
no-name-in-module,
# no-self-use, # moved to optional extension in pylint 2.14
no-value-for-parameter,
pointless-statement,
protected-access,
raw-checker-failed,
redefined-argument-from-local,
redefined-builtin,
redefined-outer-name,
reimported,
self-assigning-variable,
self-cls-assignment,
signature-differs,
simplifiable-if-expression,
super-init-not-called,
suppressed-message,
too-few-public-methods,
too-many-ancestors,
too-many-arguments,
too-many-boolean-expressions,
too-many-branches,
too-many-instance-attributes,
too-many-lines,
too-many-locals,
too-many-nested-blocks,
too-many-public-methods,
too-many-return-statements,
too-many-statements,
try-except-raise,
undefined-loop-variable,
undefined-variable,
unnecessary-comprehension,
unnecessary-lambda,
unneeded-not,
unreachable,
unspecified-encoding,
unsubscriptable-object,
unused-argument,
unused-import,
use-a-generator,
use-symbolic-message-instead,
useless-object-inheritance,
useless-super-delegation,
useless-suppression,
useless-with-lock,
wildcard-import,
assignment-from-none,
broad-exception-raised,
global-variable-not-assigned,
invalid-overridden-method,
no-else-return,
possibly-used-before-assignment,
stop-iteration-return,
too-many-positional-arguments,
ungrouped-imports,
unnecessary-dunder-call,
unused-variable,
used-before-assignment,
useless-return,
trailing-whitespace,
no-else-break,
unnecessary-pass
# Enable the message, report, category or checker with the given id(s). You can
# either give multiple identifiers separated by comma (,) or put this option
# multiple times (only on the command line, not in the configuration file where
# it should appear only once). See also the "--disable" option for examples.
enable=c-extension-no-member
[REPORTS]
# Python expression which should return a score less than or equal to 10. You
# have access to the variables 'error', 'warning', 'refactor', and 'convention'
# which contain the number of messages in each category, as well as 'statement'
# which is the total number of statements analyzed. This score is used by the
# global evaluation report (RP0004).
evaluation=10.0 - ((float(5 * error + warning + refactor + convention) / statement) * 10)
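The evaluation expression above is plain arithmetic and can be reproduced directly in Python; this is an illustrative sketch, not code from this repository:

```python
def pylint_score(error, warning, refactor, convention, statement):
    """Mirror the evaluation expression configured above: each error
    weighs five times as much as a warning, refactor, or convention."""
    return 10.0 - ((float(5 * error + warning + refactor + convention) / statement) * 10)

# A clean 100-statement module scores a perfect 10.0.
print(pylint_score(0, 0, 0, 0, 100))  # 10.0
# One error and two warnings over 100 statements: 10 - (7/100)*10 = 9.3
print(pylint_score(1, 2, 0, 0, 100))  # 9.3
```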
# Template used to display messages. This is a python new-style format string
# used to format the message information. See doc for all details.
#msg-template=
# Set the output format. Available formats are text, parseable, colorized, json
# and msvs (visual studio). You can also give a reporter class, e.g.
# mypackage.mymodule.MyReporterClass.
output-format=text
# Tells whether to display a full report or only the messages.
reports=no
# Activate the evaluation score.
score=yes
[REFACTORING]
# Maximum number of nested blocks for function / method body
max-nested-blocks=5
# Complete name of functions that never returns. When checking for
# inconsistent-return-statements if a never returning function is called then
# it will be considered as an explicit return statement and no message will be
# printed.
never-returning-functions=sys.exit
[LOGGING]
# The type of string formatting that logging methods do. `old` means using %
# formatting, `new` is for `{}` formatting.
logging-format-style=old
# Logging modules to check that the string format arguments are in logging
# function parameter format.
logging-modules=logging
[SPELLING]
# Limits count of emitted suggestions for spelling mistakes.
max-spelling-suggestions=4
# Spelling dictionary name. Available dictionaries: none. To make it work,
# install the python-enchant package.
spelling-dict=
# List of comma separated words that should not be checked.
spelling-ignore-words=
# A path to a file that contains the private dictionary; one word per line.
spelling-private-dict-file=
# Tells whether to store unknown words to the private dictionary (see the
# --spelling-private-dict-file option) instead of raising a message.
spelling-store-unknown-words=no
[MISCELLANEOUS]
# List of note tags to take in consideration, separated by a comma.
notes=FIXME,
XXX,
TODO
# Regular expression of note tags to take in consideration.
#notes-rgx=
[TYPECHECK]
# List of decorators that produce context managers, such as
# contextlib.contextmanager. Add to this list to register other decorators that
# produce valid context managers.
contextmanager-decorators=contextlib.contextmanager
# List of members which are set dynamically and missed by pylint inference
# system, and so shouldn't trigger E1101 when accessed. Python regular
# expressions are accepted.
generated-members=REQUEST,acl_users,aq_parent,"[a-zA-Z]+_set{1,2}",save,delete
# Tells whether missing members accessed in mixin class should be ignored. A
# mixin class is detected if its name ends with "mixin" (case insensitive).
ignore-mixin-members=yes
# Tells whether to warn about missing members when the owner of the attribute
# is inferred to be None.
ignore-none=yes
# This flag controls whether pylint should warn about no-member and similar
# checks whenever an opaque object is returned when inferring. The inference
# can return multiple potential results while evaluating a Python object, but
# some branches might not be evaluated, which results in partial inference. In
# that case, it might be useful to still emit no-member and other checks for
# the rest of the inferred objects.
ignore-on-opaque-inference=yes
# List of class names for which member attributes should not be checked (useful
# for classes with dynamically set attributes). This supports the use of
# qualified names.
ignored-classes=optparse.Values,thread._local,_thread._local
# List of module names for which member attributes should not be checked
# (useful for modules/projects where namespaces are manipulated during runtime
# and thus existing member attributes cannot be deduced by static analysis). It
# supports qualified module names, as well as Unix pattern matching.
ignored-modules=
# Show a hint with possible names when a member name was not found. The aspect
# of finding the hint is based on edit distance.
missing-member-hint=yes
# The minimum edit distance a name should have in order to be considered a
# similar match for a missing member name.
missing-member-hint-distance=1
# The total number of similar names that should be taken in consideration when
# showing a hint for a missing member.
missing-member-max-choices=1
# List of decorators that change the signature of a decorated function.
signature-mutators=
[VARIABLES]
# List of additional names supposed to be defined in builtins. Remember that
# you should avoid defining new builtins when possible.
additional-builtins=
# Tells whether unused global variables should be treated as a violation.
allow-global-unused-variables=yes
# List of strings which can identify a callback function by name. A callback
# name must start or end with one of those strings.
callbacks=cb_,
_cb
# A regular expression matching the name of dummy variables (i.e. expected to
# not be used).
dummy-variables-rgx=_+$|(_[a-zA-Z0-9_]*[a-zA-Z0-9]+?$)|dummy|^ignored_|^unused_
# Argument names that match this expression will be ignored. Default to name
# with leading underscore.
ignored-argument-names=_.*|^ignored_|^unused_
# Tells whether we should check for unused import in __init__ files.
init-import=no
# List of qualified module names which can have objects that can redefine
# builtins.
redefining-builtins-modules=six.moves,past.builtins,future.builtins,builtins,io
[FORMAT]
# Expected format of line ending, e.g. empty (any line ending), LF or CRLF.
expected-line-ending-format=
# Regexp for a line that is allowed to be longer than the limit.
ignore-long-lines=^\s*(# )?<?https?://\S+>?$
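As an illustration (again, not part of the config file), the `ignore-long-lines` regex above exempts lines consisting solely of a URL, optionally behind a comment marker; the sample lines below are hypothetical:

```python
import re

# Pattern copied from the ignore-long-lines option above.
URL_LINE = re.compile(r"^\s*(# )?<?https?://\S+>?$")

# A bare URL comment line is exempt from the line-length limit.
print(bool(URL_LINE.match("    # https://github.com/vatlab/sos-notebook")))  # exempt
# A code line that merely contains a URL is still length-checked.
print(bool(URL_LINE.match("url = 'https://example.com'  # docs")))           # not exempt
```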
# Number of spaces of indent required inside a hanging or continued line.
indent-after-paren=4
# String used as indentation unit. This is usually " " (4 spaces) or "\t" (1
# tab).
indent-string=' '
# Maximum number of characters on a single line.
max-line-length=120
# Maximum number of lines in a module.
max-module-lines=1000
# Allow the body of a class to be on the same line as the declaration if body
# contains single statement.
single-line-class-stmt=no
# Allow the body of an if to be on the same line as the test if there is no
# else.
single-line-if-stmt=no
[SIMILARITIES]
# Ignore comments when computing similarities.
ignore-comments=yes
# Ignore docstrings when computing similarities.
ignore-docstrings=yes
# Ignore imports when computing similarities.
ignore-imports=no
# Minimum lines number of a similarity.
min-similarity-lines=4
[BASIC]
# Naming style matching correct argument names.
argument-naming-style=snake_case
# Regular expression matching correct argument names. Overrides argument-
# naming-style.
#argument-rgx=
# Naming style matching correct attribute names.
attr-naming-style=snake_case
# Regular expression matching correct attribute names. Overrides attr-naming-
# style.
#attr-rgx=
# Bad variable names which should always be refused, separated by a comma.
bad-names=foo,
bar,
baz,
toto,
tutu,
tata
# Bad variable names regexes, separated by a comma. If names match any regex,
# they will always be refused
bad-names-rgxs=
# Naming style matching correct class attribute names.
class-attribute-naming-style=any
# Regular expression matching correct class attribute names. Overrides class-
# attribute-naming-style.
#class-attribute-rgx=
# Naming style matching correct class names.
class-naming-style=PascalCase
# Regular expression matching correct class names. Overrides class-naming-
# style.
#class-rgx=
# Naming style matching correct constant names.
const-naming-style=UPPER_CASE
# Regular expression matching correct constant names. Overrides const-naming-
# style.
#const-rgx=
# Minimum line length for functions/classes that require docstrings, shorter
# ones are exempt.
docstring-min-length=-1
# Naming style matching correct function names.
function-naming-style=snake_case
# Regular expression matching correct function names. Overrides function-
# naming-style.
#function-rgx=
# Good variable names which should always be accepted, separated by a comma.
good-names=i,
j,
k,
ex,
Run,
_
# Good variable names regexes, separated by a comma. If names match any regex,
# they will always be accepted
good-names-rgxs=
# Include a hint for the correct naming format with invalid-name.
include-naming-hint=no
# Naming style matching correct inline iteration names.
inlinevar-naming-style=any
# Regular expression matching correct inline iteration names. Overrides
# inlinevar-naming-style.
#inlinevar-rgx=
# Naming style matching correct method names.
method-naming-style=snake_case
# Regular expression matching correct method names. Overrides method-naming-
# style.
#method-rgx=
# Naming style matching correct module names.
module-naming-style=snake_case
# Regular expression matching correct module names. Overrides module-naming-
# style.
#module-rgx=
# Colon-delimited sets of names that determine each other's naming style when
# the name regexes allow several styles.
name-group=
# Regular expression which should only match function or class names that do
# not require a docstring.
no-docstring-rgx=^_
# List of decorators that produce properties, such as abc.abstractproperty. Add
# to this list to register other decorators that produce valid properties.
# These decorators are taken in consideration only for invalid-name.
property-classes=abc.abstractproperty
# Naming style matching correct variable names.
variable-naming-style=snake_case
# Regular expression matching correct variable names. Overrides variable-
# naming-style.
#variable-rgx=
[STRING]
# This flag controls whether inconsistent-quotes generates a warning when the
# character used as a quote delimiter is used inconsistently within a module.
check-quote-consistency=no
# This flag controls whether the implicit-str-concat should generate a warning
# on implicit string concatenation in sequences defined over several lines.
check-str-concat-over-line-jumps=no
[IMPORTS]
# List of modules that can be imported at any level, not just the top level
# one.
allow-any-import-level=
# Allow wildcard imports from modules that define __all__.
allow-wildcard-with-all=no
# Analyse import fallback blocks. This can be used to support both Python 2 and
# 3 compatible code, which means that the block might have code that exists
# only in one or another interpreter, leading to false positives when analysed.
analyse-fallback-blocks=no
# Deprecated modules which should not be used, separated by a comma.
deprecated-modules=optparse,tkinter.tix
# Create a graph of external dependencies in the given file (report RP0402 must
# not be disabled).
ext-import-graph=
# Create a graph of every (i.e. internal and external) dependencies in the
# given file (report RP0402 must not be disabled).
import-graph=
# Create a graph of internal dependencies in the given file (report RP0402 must
# not be disabled).
int-import-graph=
# Force import order to recognize a module as part of the standard
# compatibility libraries.
known-standard-library=
# Force import order to recognize a module as part of a third party library.
known-third-party=enchant
# Couples of modules and preferred modules, separated by a comma.
preferred-modules=
[CLASSES]
# List of method names used to declare (i.e. assign) instance attributes.
defining-attr-methods=__init__,
__new__,
setUp,
__post_init__
# List of member names, which should be excluded from the protected access
# warning.
exclude-protected=_asdict,
_fields,
_replace,
_source,
_make
# List of valid names for the first argument in a class method.
valid-classmethod-first-arg=cls
# List of valid names for the first argument in a metaclass class method.
valid-metaclass-classmethod-first-arg=cls
[DESIGN]
# Maximum number of arguments for function / method.
max-args=5
# Maximum number of attributes for a class (see R0902).
max-attributes=7
# Maximum number of boolean expressions in an if statement (see R0916).
max-bool-expr=5
# Maximum number of branch for function / method body.
max-branches=12
# Maximum number of locals for function / method body.
max-locals=15
# Maximum number of parents for a class (see R0901).
max-parents=13
# Maximum number of public methods for a class (see R0904).
max-public-methods=20
# Maximum number of return / yield for function / method body.
max-returns=6
# Maximum number of statements in function / method body.
max-statements=50
# Minimum number of public methods for a class (see R0903).
min-public-methods=2
[EXCEPTIONS]
# Exceptions that will emit a warning when being caught. Defaults to
# "BaseException, Exception".
# overgeneral-exceptions=BaseException,
# Exception
================================================
FILE: .github/workflows/pylint.yml
================================================
name: Pylint
on: [push, pull_request]
jobs:
pylint:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Set up Python 3.12
uses: actions/setup-python@v5
with:
python-version: "3.12"
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install pylint
pip install -e .
- name: Analysing the code with pylint
run: |
python -m pylint --rcfile .github/linters/.python-lint src
================================================
FILE: .github/workflows/pytest.yml
================================================
name: Tests
on: [push, pull_request]
jobs:
test:
runs-on: ubuntu-latest
strategy:
matrix:
python-version: ["3.10", "3.12"]
fail-fast: false
steps:
- uses: actions/checkout@v4
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v5
with:
python-version: ${{ matrix.python-version }}
- name: Set up R
uses: r-lib/actions/setup-r@v2
- name: Install R packages
run: |
Rscript -e 'install.packages(c("IRkernel", "arrow"), repos="https://cloud.r-project.org")'
Rscript -e 'IRkernel::installspec()'
- name: Install Python dependencies
run: |
python -m pip install --upgrade pip
pip install -e .[dev]
pip install sos-r sos-python sos-bash
pip install bash_kernel
python -m bash_kernel.install
python -m sos_notebook.install
- name: Run tests
run: |
pytest test/ -v --tb=short -x
================================================
FILE: .github/workflows/python-publish.yml
================================================
# This workflow will upload a Python Package to PyPI when a release is created
name: Upload Python Package
on:
release:
types: [created]
jobs:
deploy:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Set up Python
uses: actions/setup-python@v5
with:
python-version: '3.x'
- name: Install build dependencies
run: |
python -m pip install --upgrade pip
pip install build twine
- name: Build package
run: python -m build
- name: Publish to PyPI
env:
TWINE_USERNAME: __token__
TWINE_PASSWORD: ${{ secrets.PYPI_API_TOKEN }}
run: twine upload dist/*
================================================
FILE: .gitignore
================================================
.sos
*.swp
__pycache__
tmp*
build
dist
*egg-info
.DS_Store
_site/
.ipynb_checkpoints
*.pem
*.pyc
.ipython/
.jupyter/
.viminfo
geckodriver.log
.bash_history
.cache/
.local/
================================================
FILE: .pre-commit-config.yaml
================================================
# See https://pre-commit.com for more information
# See https://pre-commit.com/hooks.html for more hooks
repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v2.4.0
hooks:
- id: trailing-whitespace
- id: end-of-file-fixer
- id: check-yaml
- id: check-added-large-files
- id: flake8
args: ["--ignore=E501,W504"]
- repo: https://github.com/pre-commit/mirrors-yapf
rev: ''
hooks:
- id: yapf
args: [--style, "{based_on_style:chromium,indent_width:4}"]
================================================
FILE: .travis.yml
================================================
dist: trusty
group: edge
os:
- linux
# travis does not support python on osx yet (https://github.com/travis-ci/travis-ci/issues/4729)
language: python
python:
- "3.8"
addons:
chrome: stable
before_install:
# - sudo apt-get update
# - sudo apt-get -y -o Dpkg::Options::="--force-confnew" install docker-ce
# - wget https://repo.continuum.io/miniconda/Miniconda3-4.5.11-Linux-x86_64.sh -O miniconda.sh
# - bash miniconda.sh -b -p $HOME/miniconda
# - export PATH="$HOME/miniconda/bin:$PATH"
# - hash -r
# - conda config --set always_yes yes --set changeps1 no
# - conda update -q conda
# #- conda info -a
# - pip install docker rq pyyaml psutil tqdm nose fasteners pygments networkx pydot pydotplus
# - pip install entrypoints jupyter coverage codacy-coverage pytest pytest-cov python-coveralls
# - conda install -q pandas numpy
# - conda install -c r r-essentials r-feather
# - conda install -c conda-forge feather-format
# # SoS Notebook
# - pip install jedi notebook nbconvert nbformat pyyaml psutil tqdm scipy markdown matplotlib
# - sudo apt-get install libmagickwand-dev libmagickcore5-extra graphviz
# - pip install pygments ipython wand graphviz
# - pip install git+https://github.com/vatlab/sos.git
# - pip install git+https://github.com/vatlab/sos-bash.git
# - pip install git+https://github.com/vatlab/sos-python.git
# - pip install git+https://github.com/vatlab/sos-r.git
# - pip install selenium
# - google-chrome-stable --headless --disable-gpu --remote-debugging-port=9222 http://localhost &
# - wget https://chromedriver.storage.googleapis.com/73.0.3683.20/chromedriver_linux64.zip -P ~/
# - unzip ~/chromedriver_linux64.zip -d ~/
# - rm ~/chromedriver_linux64.zip
# - sudo mv -f ~/chromedriver /usr/local/share/
# - sudo chmod +x /usr/local/share/chromedriver
# - sudo ln -s /usr/local/share/chromedriver /usr/local/bin/chromedriver
# - "export DISPLAY=:99.0"
# - "sh -e /etc/init.d/xvfb start"
# - sleep 3
sudo: required
services:
- docker
install:
- docker network create sosnet
- docker pull mdabioinfo/sos_notebook_test:latest
- docker pull mdabioinfo/eg_sshd:latest
- cd development
- export COMPOSE_PROJECT_NAME=sosnotebook
- docker-compose up -d
- cd ..
- docker cp . sosnotebook_sos-notebook_1:/home/jovyan
- docker exec -u root sosnotebook_sos-notebook_1 sh ./development/install_sos_notebook.sh
before_script:
# - cd test
# - sh build_test_docker.sh
script:
#- docker exec -u root sosnotebook_sos-notebook_1 pytest ./test -x -v --cov sos_notebook --cov-report=term-missing
- docker exec -u root sosnotebook_sos-notebook_1 mkdir -p /home/jovyan/.sos
- docker exec -u root sosnotebook_sos-notebook_1 mkdir -p /home/jovyan/.local
- docker exec -u root sosnotebook_sos-notebook_1 chown -R jovyan:users /home/jovyan/.sos/
- docker exec -u root sosnotebook_sos-notebook_1 chown -R jovyan:users /home/jovyan/.local/
- docker exec -u root sosnotebook_sos-notebook_1 chown -R jovyan:users /home/jovyan/test/
- docker exec sosnotebook_sos-notebook_1 bash -c 'cd test && pytest -v'
after_success:
- coverage combine
- coveralls
notifications:
email:
recipients:
- ben.bob@gmail.com
on_success: never
on_failure: always
================================================
FILE: CLAUDE.md
================================================
# CLAUDE.md
This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.
## Development Commands
**Invoke Tasks (Recommended):**
- `invoke --list` - Show all available development tasks
- `invoke dev-setup` - Set up complete development environment with uv
- `invoke venv-create` - Create virtual environment with uv
- `invoke uv-sync` - Sync dependencies using uv
- `invoke uv-lock` - Update uv.lock file
- `invoke check` - Run all quality checks (format, lint, test)
- `invoke format` - Format code with ruff
- `invoke format --check` - Check code formatting without changes
- `invoke lint` - Run linting with ruff
- `invoke lint --fix` - Run linting with auto-fix
- `invoke test` - Run tests with pytest
- `invoke test --verbose` - Run tests with verbose output
- `invoke test --coverage` - Run tests with coverage report
- `invoke build` - Build source and wheel distributions
- `invoke build --clean` - Clean build artifacts and rebuild
- `invoke clean` - Clean build artifacts and caches
- `invoke install` - Install package in development mode
- `invoke release-check` - Run comprehensive pre-release checks
- `invoke test-docker` - Run full test suite in Docker (CI environment)
**uv Virtual Environment Management:**
- `uv venv` - Create virtual environment
- `uv sync` - Install dependencies from pyproject.toml
- `uv sync --dev` - Install with development dependencies
- `uv add <package>` - Add runtime dependency
- `uv add --dev <package>` - Add development dependency
- `uv remove <package>` - Remove dependency
- `uv lock` - Update dependency lock file
- `uv pip install -e .` - Install package in development mode
- `source .venv/bin/activate` - Activate virtual environment
**Direct Commands:**
- `pytest -v` - Run all tests (executed in Docker container)
- `docker exec sosnotebook_sos-notebook_1 bash -c 'cd test && pytest -v'` - Full test run in CI environment
**Code Quality:**
- `pre-commit run --all-files` - Run code formatting and linting (ruff)
- `ruff check` - Run linting
- `ruff check --fix` - Run linting with auto-fix
- `ruff format` - Format code
- `ruff format --check` - Check code formatting
**Development Environment:**
- Uses Docker for testing - see `development/docker-compose.yml`
- `docker-compose build --no-cache` - Rebuild test images
- `docker network create sosnet` - Create Docker network for testing
**Build System:**
- Uses modern `pyproject.toml` configuration (PEP 517/518)
- `python -m build` - Build source and wheel distributions
- `python -m build --sdist` - Build source distribution only
- `python -m build --wheel` - Build wheel distribution only
- `pip install -e .` - Install in development mode
- Package entry points defined in pyproject.toml for SoS converters
- Old `setup.py` kept as `setup.py.old` for reference
## Architecture Overview
SoS Notebook is a Jupyter kernel that enables multi-language workflows within a single notebook. The architecture consists of several key components:
**Core Kernel System:**
- `kernel.py` - Main `SoS_Kernel` class extending `IPythonKernel`, handles cell execution and communication
- `subkernel.py` - `Subkernels` class manages multiple language kernels (R, Bash, Python, etc.)
- `comm_manager.py` - `SoSCommManager` handles inter-kernel communication and data exchange
**Language Integration:**
- Language modules (sos-bash, sos-r, etc.) provide language-specific data type understanding
- `magics.py` - SoS-specific Jupyter magic commands for workflow control
- `completer.py` - Tab completion for SoS syntax and cross-language variables
**Workflow Execution:**
- `step_executor.py` - Executes individual workflow steps
- `workflow_executor.py` - Orchestrates complete workflows, includes `NotebookLoggingHandler`
- Supports both interactive execution and batch workflow processing
**Conversion System:**
- `converter.py` - Multiple converters for different formats:
- `ScriptToNotebookConverter` (sos-ipynb)
- `NotebookToScriptConverter` (ipynb-sos)
- `NotebookToHTMLConverter` (ipynb-html)
- `NotebookToPDFConverter` (ipynb-pdf)
- `NotebookToMarkdownConverter` (ipynb-md)
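These converters are registered under the `sos_converters` entry-point group in `pyproject.toml`, so tools can discover them at runtime without importing the package eagerly. A minimal sketch of that discovery using only the standard library (the helper name `discover_converters` is illustrative, not part of the codebase):

```python
from importlib.metadata import entry_points

def discover_converters(group: str = "sos_converters") -> dict:
    """Return a mapping of entry-point names to their load targets."""
    eps = entry_points()
    # Python 3.10+ exposes .select(); 3.8/3.9 return a dict keyed by group
    selected = eps.select(group=group) if hasattr(eps, "select") else eps.get(group, [])
    return {ep.name: ep.value for ep in selected}

if __name__ == "__main__":
    # With sos-notebook installed, this lists e.g. ipynb-html, ipynb-pdf, ...
    for name, target in discover_converters().items():
        print(f"{name} -> {target}")
```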
**Testing Strategy:**
- Integration tests in Docker containers simulate real Jupyter environment
- Tests cover frontend interaction, magic commands, conversions, and workflows
- Sample notebooks in `test/` directory provide test scenarios
**Key Dependencies:**
- Requires Python ≥3.8 (per `pyproject.toml`); built on the Jupyter ecosystem (jupyter_client, ipykernel, nbformat)
- Core SoS package (sos>=0.22.0) provides workflow engine
- pandas/numpy for data handling, psutil for system monitoring
**Data Exchange:**
The system enables seamless data transfer between kernels through the SoS variable system, supporting dataframes, matrices, and other structured data types across R, Python, Bash, and other supported languages.
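To make the idea concrete, here is a toy sketch of the kind of translation a language module performs when handing a Python value to R. Real modules such as sos-r handle dataframes, encodings, and many edge cases far more carefully; the function name `py_to_r` and its scope are purely illustrative:

```python
def py_to_r(obj):
    """Render a simple Python value as R source text (illustrative only)."""
    # bool must be checked before int, since bool is a subclass of int
    if isinstance(obj, bool):
        return "TRUE" if obj else "FALSE"
    if isinstance(obj, (int, float)):
        return repr(obj)
    if isinstance(obj, str):
        return '"' + obj.replace('"', '\\"') + '"'
    if isinstance(obj, list):
        return "c(" + ", ".join(py_to_r(x) for x in obj) + ")"
    if isinstance(obj, dict):
        # assumes keys are valid R names; a real module would sanitize them
        return "list(" + ", ".join(f"{k}={py_to_r(v)}" for k, v in obj.items()) + ")"
    raise TypeError(f"no R equivalent for {type(obj).__name__}")

print(py_to_r({"x": [1, 2, 3], "label": "demo"}))
```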
================================================
FILE: CONTRIBUTING.md
================================================
# Contributing to SoS Notebook
Thank you for your interest in contributing to SoS Notebook! This document provides comprehensive guidelines for setting up your development environment, making changes, and submitting contributions.
## Table of Contents
- [Prerequisites](#prerequisites)
- [Development Setup](#development-setup)
- [Development Workflow](#development-workflow)
- [Code Quality](#code-quality)
- [Testing](#testing)
- [Building and Releasing](#building-and-releasing)
- [Submitting Changes](#submitting-changes)
- [Project Structure](#project-structure)
- [Troubleshooting](#troubleshooting)
## Prerequisites
### Required Software
1. **Python 3.8+** - SoS Notebook requires Python 3.8 or later
2. **[uv](https://github.com/astral-sh/uv)** - Fast Python package installer and resolver
```bash
# Install uv (recommended method)
curl -LsSf https://astral.sh/uv/install.sh | sh
# Or via pip
pip install uv
# Or via homebrew (macOS)
brew install uv
```
3. **Git** - For version control
4. **Docker** (optional) - For running the full test suite
### Verify Installation
```bash
python --version # Should be 3.8+
uv --version # Should be 0.4.0+
git --version
```
## Development Setup
### 1. Clone the Repository
```bash
git clone https://github.com/vatlab/sos-notebook.git
cd sos-notebook
```
### 2. Set Up Development Environment
We use `invoke` for task automation. The `dev-setup` command will:
- Create a virtual environment with uv
- Install the package in development mode
- Install all development dependencies
- Set up pre-commit hooks
```bash
# One-command setup
invoke dev-setup
# Activate the virtual environment
source .venv/bin/activate
```
### 3. Manual Setup (Alternative)
If you prefer manual setup:
```bash
# Create virtual environment
uv venv
# Activate virtual environment
source .venv/bin/activate # Linux/macOS
# or
.venv\Scripts\activate # Windows
# Install package in development mode with dependencies
uv pip install -e .
uv sync --dev
# Install pre-commit hooks
pre-commit install
```
### 4. Verify Setup
```bash
# List available development tasks
invoke --list
# Run a quick check
invoke format --check
invoke lint
```
## Development Workflow
### Daily Development Commands
```bash
# Activate virtual environment (if not already active)
source .venv/bin/activate
# Make your changes...
# Format code
invoke format
# Check and fix linting issues
invoke lint --fix
# Run tests
invoke test
# Run all quality checks
invoke check
```
### Task Automation with Invoke
We use [invoke](http://www.pyinvoke.org/) for development task automation. All tasks are defined in `tasks.py`.
#### Core Tasks
```bash
# Environment management
invoke dev-setup # Complete development setup
invoke venv-create # Create virtual environment
invoke uv-sync # Sync dependencies
invoke uv-lock # Update lock file
# Code quality
invoke format # Format code with ruff
invoke format --check # Check formatting without changes
invoke lint # Run linting
invoke lint --fix # Run linting with auto-fix
invoke check # Run all quality checks (format, lint, test)
# Testing
invoke test # Run tests (skips Docker/selenium tests)
invoke test --verbose # Verbose test output
invoke test --coverage # Run tests with coverage report
invoke test-docker # Full test suite in Docker (CI environment)
# Building
invoke build # Build source and wheel distributions
invoke build --clean # Clean and rebuild
invoke clean # Clean build artifacts
invoke install # Install package in development mode
# Release
invoke release-check # Comprehensive pre-release checks
```
#### Advanced Tasks
```bash
# Run specific test paths
invoke test --path="test/test_magics.py"
# Generate coverage report
invoke test --coverage
# Clean everything
invoke clean --all
```
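Under the hood, a composite task like `invoke check` simply chains shell commands and stops at the first failure. A standalone sketch of that behavior using only the standard library (the real logic lives in `tasks.py`; `run_checks` is an illustrative name, not the actual task implementation):

```python
import subprocess

def run_checks(commands=None):
    """Run each command in order; return the first non-zero exit code, else 0."""
    commands = commands or [
        ["ruff", "format", "--check"],
        ["ruff", "check"],
        ["pytest", "-q"],
    ]
    for cmd in commands:
        print("$", " ".join(cmd))
        rc = subprocess.run(cmd, check=False).returncode
        if rc != 0:
            return rc  # stop at the first failing check
    return 0
```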
## Code Quality
### Code Formatting and Linting
We use [ruff](https://github.com/astral-sh/ruff) for both code formatting and linting. Ruff is extremely fast and replaces multiple tools (yapf, flake8, isort, etc.).
#### Configuration
Ruff is configured in `pyproject.toml`:
```toml
[tool.ruff]
target-version = "py38"
line-length = 88

[tool.ruff.lint]
select = ["E", "W", "F", "I", "B", "C4", "UP"]
ignore = ["E501"]  # line too long (handled by formatter)

[tool.ruff.format]
quote-style = "double"
indent-style = "space"
```
#### Running Code Quality Checks
```bash
# Format code
invoke format
# Check formatting without making changes
invoke format --check
# Run linting
invoke lint
# Auto-fix linting issues
invoke lint --fix
# Run all quality checks
invoke check
```
### Pre-commit Hooks
Pre-commit hooks are automatically installed during `invoke dev-setup`. They run on every commit to ensure code quality:
- Code formatting with ruff
- Linting with ruff
- Basic file checks (trailing whitespace, file size, etc.)
To run pre-commit manually:
```bash
pre-commit run --all-files
```
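The hooks themselves are declared in `.pre-commit-config.yaml` at the repository root. A representative configuration using the official ruff and pre-commit hook repositories (the `rev` pins below are illustrative; check the repository file for the actual versions in use):

```yaml
repos:
  - repo: https://github.com/astral-sh/ruff-pre-commit
    rev: v0.4.4   # illustrative pin
    hooks:
      - id: ruff
        args: [--fix]
      - id: ruff-format
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v4.5.0   # illustrative pin
    hooks:
      - id: trailing-whitespace
      - id: end-of-file-fixer
      - id: check-added-large-files
```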
## Testing
### Test Structure
- `test/` - Main test directory
- `test/test_*.py` - Unit and integration tests
- Docker-based integration tests simulate real Jupyter environments
### Running Tests
```bash
# Quick tests (skip Docker/selenium dependencies)
invoke test
# Verbose output
invoke test --verbose
# Run with coverage
invoke test --coverage
# Run specific test file
invoke test --path="test/test_magics.py"
# Full test suite in Docker (as run in CI)
invoke test-docker
```
### Test Dependencies
Some tests require additional dependencies:
- **Selenium** - For frontend integration tests
- **Docker** - For containerized testing environment
- **ImageMagick** - For image processing tests
These are automatically skipped if not available, with appropriate skip messages.
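The skip pattern boils down to probing for the optional module before the test body runs. A minimal, self-contained sketch using `unittest` (the project's tests follow pytest conventions, but pytest collects `unittest`-style skips the same way, so the mechanism is identical):

```python
import importlib.util
import unittest

# Probe for the optional dependency without importing it
HAS_SELENIUM = importlib.util.find_spec("selenium") is not None

class OptionalDependencyTest(unittest.TestCase):
    @unittest.skipUnless(HAS_SELENIUM, "selenium not installed")
    def test_needs_selenium(self):
        # This body only runs when selenium is importable
        import selenium
        self.assertTrue(hasattr(selenium, "__version__"))
```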
### Writing Tests
- Follow pytest conventions
- Use the `NotebookTest` base class for kernel tests
- Mock external dependencies when possible
- Add integration tests for new features
## Building and Releasing
### Modern Build System
SoS Notebook uses a modern Python build system based on:
- `pyproject.toml` for project configuration (PEP 517/518)
- `uv` for dependency management
- `build` module for creating distributions
### Building Distributions
```bash
# Build both source and wheel distributions
invoke build
# Clean build artifacts first
invoke build --clean
# Manual build (alternative)
uv run python -m build
```
### Release Preparation
Before releasing:
```bash
# Run comprehensive pre-release checks
invoke release-check
# This runs:
# 1. All quality checks (format, lint, test)
# 2. Clean build
# 3. Distribution verification
```
### Version Management
1. Update version in `pyproject.toml`
2. Update version in `src/sos_notebook/_version.py`
3. Update `CHANGELOG.md` (if exists)
4. Run `invoke release-check`
5. Create git tag and push
## Submitting Changes
### Pull Request Process
1. **Fork the repository** on GitHub
2. **Create a feature branch** from `master`:
```bash
git checkout -b feature/your-feature-name
```
3. **Make your changes** following the development workflow
4. **Run quality checks**:
```bash
invoke check
```
5. **Commit your changes** with descriptive messages:
```bash
git add .
git commit -m "Add feature: description of changes"
```
6. **Push to your fork**:
```bash
git push origin feature/your-feature-name
```
7. **Create a Pull Request** on GitHub
### Commit Message Guidelines
- Use clear, descriptive commit messages
- Start with a verb in present tense ("Add", "Fix", "Update", "Remove")
- Limit first line to 72 characters
- Reference issues when applicable: "Fix #123: description"
### Pull Request Guidelines
- **Title**: Clear, descriptive title
- **Description**: Explain what changes were made and why
- **Testing**: Describe how the changes were tested
- **Documentation**: Update documentation if needed
- **Breaking Changes**: Clearly mark any breaking changes
## Project Structure
```
sos-notebook/
├── src/sos_notebook/        # Main package source
│   ├── __init__.py
│   ├── kernel.py            # Main SoS kernel
│   ├── subkernel.py         # Multi-language kernel management
│   ├── magics.py            # Jupyter magic commands
│   ├── converter.py         # Notebook conversion utilities
│   └── ...
├── test/                    # Test suite
│   ├── test_kernel.py
│   ├── test_magics.py
│   └── ...
├── development/             # Docker development environment
├── tasks.py                 # Invoke task definitions
├── pyproject.toml           # Project configuration
├── uv.lock                  # Dependency lock file
├── README.md                # Project overview
├── CONTRIBUTING.md          # This file
└── CLAUDE.md                # Claude Code development guide
```
### Key Files
- **`pyproject.toml`** - Modern Python project configuration
- **`tasks.py`** - Development task automation with invoke
- **`uv.lock`** - Locked dependency versions for reproducible builds
- **`CLAUDE.md`** - Development guidance for Claude Code AI assistant
## Troubleshooting
### Common Issues
#### Virtual Environment Issues
```bash
# Remove and recreate virtual environment
rm -rf .venv
invoke venv-create
source .venv/bin/activate
invoke uv-sync
```
#### Dependency Issues
```bash
# Update dependencies
invoke uv-sync
# Regenerate lock file
invoke uv-lock
```
#### Import Errors
```bash
# Reinstall package in development mode
uv pip install -e .
```
#### Test Failures
```bash
# Run tests with verbose output
invoke test --verbose
# Run specific test file
invoke test --path="test/test_specific.py"
# Skip Docker tests if Docker isn't available
invoke test # Docker tests are skipped automatically
```
### Performance Issues
- **Slow dependency resolution**: uv is much faster than pip, but if you experience issues, try clearing cache: `uv cache clean`
- **Slow tests**: Use `invoke test` instead of `invoke test-docker` for faster iteration
### Getting Help
1. **Check existing issues**: [GitHub Issues](https://github.com/vatlab/sos-notebook/issues)
2. **Search documentation**: [SoS Documentation](https://vatlab.github.io/sos-docs/)
3. **Create a new issue**: Include:
- Python version (`python --version`)
- uv version (`uv --version`)
- Operating system
- Full error message
- Steps to reproduce
## Development Philosophy
- **Modern tooling**: We use the latest and fastest Python development tools
- **Developer experience**: Commands should be simple and fast
- **Code quality**: Automated formatting and linting
- **Testing**: Comprehensive tests including Docker-based integration tests
- **Documentation**: Clear documentation for users and developers
Thank you for contributing to SoS Notebook! 🎉
================================================
FILE: LICENSE
================================================
BSD 3-Clause License
Copyright (c) Bo Peng and the University of Texas MD Anderson Cancer Center
All rights reserved.
Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:
* Redistributions of source code must retain the above copyright notice, this
list of conditions and the following disclaimer.
* Redistributions in binary form must reproduce the above copyright notice,
this list of conditions and the following disclaimer in the documentation
and/or other materials provided with the distribution.
* Neither the name of the copyright holder nor the names of its
contributors may be used to endorse or promote products derived from
this software without specific prior written permission.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
================================================
FILE: MANIFEST.in
================================================
include LICENSE
include README.md
include pyproject.toml
recursive-include src *.py *.js *.tpl *.png *.css *.j2 *.json
recursive-include test *.vim *.py *.js
prune */.sos
global-exclude *~
global-exclude *.bak
global-exclude *.pyc
global-exclude *.pyo
global-exclude .ipynb_checkpoints
================================================
FILE: README.md
================================================
[](https://anaconda.org/conda-forge/sos-notebook)
[](https://badge.fury.io/py/sos-notebook)
[](https://zenodo.org/badge/latestdoi/105826659)
[](https://travis-ci.org/vatlab/sos-notebook)
[](https://ci.appveyor.com/project/BoPeng/sos-notebook/branch/master)
## ⚠️ Deprecation Notice
**The classic Jupyter notebook interface support for SoS Notebook has been deprecated.**
This package no longer provides JavaScript extensions or frontend functionality for the classic Jupyter notebook interface. All frontend features have been migrated to JupyterLab and are available through [jupyterlab-sos](https://github.com/vatlab/jupyterlab-sos).
**This package is still required** as a backend dependency for jupyterlab-sos and contains the SoS kernel, magics, notebook converters (HTML, PDF, Markdown), and other core functionality. However, **for the best SoS experience, please use JupyterLab with the jupyterlab-sos extension**.
For more information, see the [JupyterLab SoS extension](https://github.com/vatlab/jupyterlab-sos).
# SoS Notebook
SoS Notebook is a [Jupyter](https://jupyter.org/) kernel that allows the use of multiple kernels in one Jupyter notebook. Using language modules that understand datatypes of underlying languages (modules [sos-bash](https://github.com/vatlab/sos-bash), [sos-r](https://github.com/vatlab/sos-r), [sos-matlab](https://github.com/vatlab/sos-matlab), etc), SoS Notebook allows data exchange among live kernels of supported languages.
SoS Notebook also extends the Jupyter frontend and adds a console panel for the execution of scratch commands and display of intermediate results and progress information, and a number of shortcuts and magics to facilitate interactive data analysis. All these features have been ported to JupyterLab, either in the sos extension [jupyterlab-sos](https://github.com/vatlab/jupyterlab-sos) or contributed to JupyterLab as core features.
SoS Notebook also serves as the IDE for the [SoS Workflow](https://github.com/vatlab/sos) that allows the development and execution of workflows from Jupyter notebooks. This not only allows easy translation of scripts developed for interactive data analysis to workflows running in containers and remote systems, but also allows the creation of scientific workflows in a format with narratives, sample input and output.
SoS Notebook is part of the SoS suite of tools. Please refer to the [SoS Homepage](http://vatlab.github.io/SoS/) for details about SoS, and [this page](https://vatlab.github.io/sos-docs/notebook.html#content) for documentations and examples on SoS Notebook. If a language that you are using is not yet supported by SoS, please [submit a ticket](https://github.com/vatlab/sos-notebook/issues), or consider adding a language module by yourself following the guideline [here](https://vatlab.github.io/sos-docs/doc/user_guide/language_module.html).
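As a quick illustration, cross-kernel data exchange is driven by cell magics such as `%use` (switch the cell to a subkernel) and `%get` (copy a variable from another kernel); see the SoS magic reference in the documentation linked above for the full set:

```
# Cell 1 (SoS/Python): create a dataframe
import pandas as pd
df = pd.DataFrame({"x": [1, 2, 3]})

# Cell 2 (R subkernel): fetch df from the SoS kernel as a data.frame
%get df
head(df)
```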
## Installation
### For Users
Install from PyPI:
```bash
pip install sos-notebook
```
Install from conda-forge:
```bash
conda install -c conda-forge sos-notebook
```
### For Developers
SoS Notebook uses modern Python packaging and development tools. See [CONTRIBUTING.md](CONTRIBUTING.md) for detailed development setup instructions.
Quick development setup:
```bash
# Prerequisites: Python 3.8+ and uv (https://github.com/astral-sh/uv)
git clone https://github.com/vatlab/sos-notebook.git
cd sos-notebook
invoke dev-setup # Sets up virtual environment and dependencies
source .venv/bin/activate
```
## Development
This project uses modern Python development tools:
- **[uv](https://github.com/astral-sh/uv)** for fast dependency management and virtual environments
- **[ruff](https://github.com/astral-sh/ruff)** for linting and code formatting
- **[invoke](http://www.pyinvoke.org/)** for task automation
- **[pytest](https://pytest.org/)** for testing
- **Modern build system** with `pyproject.toml` (PEP 517/518)
### Quick Commands
```bash
# Set up development environment
invoke dev-setup
# Run quality checks
invoke check # Run all checks (format, lint, test)
invoke format # Format code with ruff
invoke lint --fix # Lint and auto-fix issues
invoke test # Run tests
# Build and release
invoke build # Build distributions
invoke release-check # Comprehensive pre-release checks
```
For detailed contribution guidelines, see [CONTRIBUTING.md](CONTRIBUTING.md).
## Testing
The project includes comprehensive tests that run in Docker containers to simulate real Jupyter environments:
```bash
invoke test # Quick tests (skip Docker/selenium)
invoke test-docker # Full test suite in Docker (as in CI)
```
## License
This project is licensed under the 3-clause BSD License.
================================================
FILE: development/README.md
================================================
1. Update docker images:

   docker-compose build --no-cache

2. Push docker images to Docker Hub:

   docker push mdabioinfo/eg_sshd
   docker push mdabioinfo/sos_notebook_test
================================================
FILE: development/docker-compose.yml
================================================
version: '3'
services:
  sos-notebook:
    build:
      context: ./sos_notebook_test
    image: mdabioinfo/sos_notebook_test:latest
    restart: "no"
    shm_size: 8gb
    networks:
      - sosnet
  sos:
    build:
      context: ./eg_sshd
    image: mdabioinfo/eg_sshd:latest
    hostname: sos
    restart: always
    networks:
      - sosnet
networks:
  sosnet:
    external: true
================================================
FILE: development/eg_sshd/.ssh/id_rsa
================================================
-----BEGIN RSA PRIVATE KEY-----
MIIEpAIBAAKCAQEAwOw86WfZeC8AAkhgfr0PMCqEtwTXqK4q8hOuvruQa9TN/dcG
kUFukzfJ0XAUz4anw16PMnwfOuvwk9ya5Hz1UwGjcu+l2AxSWeG+KJkpbBtnOls9
ELrE+7pwOurVolP1Qevf6y2vfJNGN78uNuiupp/hJ5sSWqashMoqqqsT7631d+9C
PdrzOWZd3TL1CFl8uNkCWt8i7vtPhRycrS9Wkpmdu1CflJbgab9vfn02jeY4Cx2z
vToQz3AMmt3QnuKf2DEHC+QeLDayF0TgOhuqLvjl1gP+Hm4GWWvHCiPJBQ4idMm0
Z+ujR4uafuIIZ5u3N3O6z45SDi2JorqzpYmsywIDAQABAoIBAQCMSHv2YQxyZwLD
pit8nS85IAHHL589yf/ybTuI98yJjIGJTl05LHIiXNPFFpIbYVgGKXFJDZaL+trC
Ogzrjq25AR0AS6C1nCgZsZvb25uSP87tUUDzNExem3BWd0KHOjPCDqmRUnQjytep
W7xYMxQkl2darFlJT59tI7Coz6O8iOUTYdTJtKfaQEC4zE5oTQ3AxqaWpiPc4i9/
/kEGYDbFitz6id7zWBS6cYGmvIhsCUwVNMany+R0QFGP6AqSv0knp4vvd1IMzDUO
q1mt8OyZ/ZKkaw8sA2UACkdxnp+iejNiE64tmJVGxPf5J4FuNoU25Ecwm9cnvGPv
MlOiYVMBAoGBAP7gvztmnrhYkhTZB7CfcyRlH1pWRfznLlQU3E2U094/rnjck/av
atdPcko/UoMjwCZ4dVgZFBkVYuPhlELPiHfs2OGEcSry+GhWgguZoSCKsE+vaZuN
kA75Eb2GEQCklN1LX+0x2cfux0CbYUqODoT1VNXJpwpqaSWiQXgR5giBAoGBAMHF
qonSbizig/VsVhM8smtl+3dPFkpl/fulo4+Xuu29YxYDtR1quUeaPQH2Zx+lQJmb
+o4Jp1P/9+4eulXYn/nCLkOFRaKNb7i0rWfHaqkgjUIQVSSSwUOR14ivvdW1oBKN
Gvsqy5psxrM2dDHphGkwh8EpGwov2/Fo5bsU1a9LAoGBAJfJVnlUms9kB9MckKTR
wGt7QVm2KUX8ky2FotEdAbPIrunRStjNDM6exIyM+2GXx9XhRNirTrnFb7gQXhAP
sdDhnyNmkVKnkeHpKtcnrbpIfclmyHjXrGQOVk9M6RE98l17huwmFPEpNUY3gpA4
21K5G8WZqr3cMzQzVdPgrOKBAoGAcX4F2b1ffHibk3aFn7TQR6kutP2kb6T3Mpoc
h3D2MmLXk0BOp1En/fEvxGN+mQFgKdg601CCKeflXhmvR7KeWFnMYQ3A8Glow0VH
v14EcdS4B7arN8Wg3qOgGtXcGTzM6bCt2eiB4gvOAY9mVQmR3U5oZNFfngLUDrxC
ueWFFqsCgYBFJe38Kby6SpaXpaWDF9rNE8a4QkYXi3N7TDUUYePJ4LbDhfff1oty
6x8SkbFqR0Bb4NQa1fRCozjhlFqQgZvRHkDNqRaTPUNZ1AtTBElVCgWNQKPFL1s7
bUSgkIBztPFI7wGUvUViAFQsFOPSgArkQq7feGHBHrzY/Xu6aXxb4g==
-----END RSA PRIVATE KEY-----
================================================
FILE: development/eg_sshd/.ssh/id_rsa.pub
================================================
ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDA7DzpZ9l4LwACSGB+vQ8wKoS3BNeoriryE66+u5Br1M391waRQW6TN8nRcBTPhqfDXo8yfB866/CT3JrkfPVTAaNy76XYDFJZ4b4omSlsG2c6Wz0QusT7unA66tWiU/VB69/rLa98k0Y3vy426K6mn+EnmxJapqyEyiqqqxPvrfV370I92vM5Zl3dMvUIWXy42QJa3yLu+0+FHJytL1aSmZ27UJ+UluBpv29+fTaN5jgLHbO9OhDPcAya3dCe4p/YMQcL5B4sNrIXROA6G6ou+OXWA/4ebgZZa8cKI8kFDiJ0ybRn66NHi5p+4ghnm7c3c7rPjlIOLYmiurOliazL root@9740cb7bc090
================================================
FILE: development/eg_sshd/.ssh/known_hosts
================================================
|1|ifI8r7TTuuavRoHPCWi/zRhT7Xg=|cJEU9IuPFryWuUqX7WDDGFBSwxU= ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBV5pS+Um9ke91Rmq1E83GnlS2EdvwpELdU563V8O9Dc9/DMmliPoaWO/oLLW5tz1y2hu9e7ISZytvaeHUnTVg8=
|1|yCnH65n1ZVw+qPTMEqT56AEZT4M=|2pj5jSP74KW6M1eQZYG0j8V2B8s= ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBV5pS+Um9ke91Rmq1E83GnlS2EdvwpELdU563V8O9Dc9/DMmliPoaWO/oLLW5tz1y2hu9e7ISZytvaeHUnTVg8=
|1|bTROh2skcuquL75pSr4wX75zvJE=|nLhgDUKf0rdGtDyc0DkYeb4eId8= ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBV5pS+Um9ke91Rmq1E83GnlS2EdvwpELdU563V8O9Dc9/DMmliPoaWO/oLLW5tz1y2hu9e7ISZytvaeHUnTVg8=
|1|ecfY0BlbNYKAPb3UsJLmuDx20oQ=|3JKsJ85O666utGbMe0BTGw0sG98= ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBV5pS+Um9ke91Rmq1E83GnlS2EdvwpELdU563V8O9Dc9/DMmliPoaWO/oLLW5tz1y2hu9e7ISZytvaeHUnTVg8=
================================================
FILE: development/eg_sshd/Dockerfile
================================================
FROM python:3.6
RUN apt-get update && apt-get install -y openssh-server rsync task-spooler
RUN mkdir /var/run/sshd
RUN echo 'root:screencast' | chpasswd
RUN sed -i 's/PermitRootLogin prohibit-password/PermitRootLogin yes/' /etc/ssh/sshd_config
# SSH login fix. Otherwise user is kicked off after login
RUN sed 's@session\s*required\s*pam_loginuid.so@session optional pam_loginuid.so@g' -i /etc/pam.d/sshd
ENV NOTVISIBLE "in users profile"
RUN echo "export VISIBLE=now" >> /etc/profile
RUN [ -d /root/.ssh ] || mkdir -p /root/.ssh
# install sos on the remote host
RUN pip install spyder jedi notebook nbconvert nbformat pyyaml psutil tqdm
RUN pip install fasteners pygments ipython ptpython networkx pydotplus
RUN SHA=$(git ls-remote https://github.com/vatlab/sos.git -t master)
RUN SHA=$SHA git clone http://github.com/vatlab/sos sos
RUN cd sos && pip install . -U
RUN echo "export TS_SLOTS=10" >> /root/.bash_profile
COPY ./.ssh/id_rsa.pub /root/.ssh/authorized_keys
EXPOSE 22
CMD ["/usr/sbin/sshd", "-D"]
================================================
FILE: development/install_sos_notebook.sh
================================================
#!/usr/bin/bash
pip install . -U
python -m sos_notebook.install
# jupyter notebook
================================================
FILE: development/sos_notebook_test/.ssh/id_rsa
================================================
-----BEGIN RSA PRIVATE KEY-----
MIIEpAIBAAKCAQEAwOw86WfZeC8AAkhgfr0PMCqEtwTXqK4q8hOuvruQa9TN/dcG
kUFukzfJ0XAUz4anw16PMnwfOuvwk9ya5Hz1UwGjcu+l2AxSWeG+KJkpbBtnOls9
ELrE+7pwOurVolP1Qevf6y2vfJNGN78uNuiupp/hJ5sSWqashMoqqqsT7631d+9C
PdrzOWZd3TL1CFl8uNkCWt8i7vtPhRycrS9Wkpmdu1CflJbgab9vfn02jeY4Cx2z
vToQz3AMmt3QnuKf2DEHC+QeLDayF0TgOhuqLvjl1gP+Hm4GWWvHCiPJBQ4idMm0
Z+ujR4uafuIIZ5u3N3O6z45SDi2JorqzpYmsywIDAQABAoIBAQCMSHv2YQxyZwLD
pit8nS85IAHHL589yf/ybTuI98yJjIGJTl05LHIiXNPFFpIbYVgGKXFJDZaL+trC
Ogzrjq25AR0AS6C1nCgZsZvb25uSP87tUUDzNExem3BWd0KHOjPCDqmRUnQjytep
W7xYMxQkl2darFlJT59tI7Coz6O8iOUTYdTJtKfaQEC4zE5oTQ3AxqaWpiPc4i9/
/kEGYDbFitz6id7zWBS6cYGmvIhsCUwVNMany+R0QFGP6AqSv0knp4vvd1IMzDUO
q1mt8OyZ/ZKkaw8sA2UACkdxnp+iejNiE64tmJVGxPf5J4FuNoU25Ecwm9cnvGPv
MlOiYVMBAoGBAP7gvztmnrhYkhTZB7CfcyRlH1pWRfznLlQU3E2U094/rnjck/av
atdPcko/UoMjwCZ4dVgZFBkVYuPhlELPiHfs2OGEcSry+GhWgguZoSCKsE+vaZuN
kA75Eb2GEQCklN1LX+0x2cfux0CbYUqODoT1VNXJpwpqaSWiQXgR5giBAoGBAMHF
qonSbizig/VsVhM8smtl+3dPFkpl/fulo4+Xuu29YxYDtR1quUeaPQH2Zx+lQJmb
+o4Jp1P/9+4eulXYn/nCLkOFRaKNb7i0rWfHaqkgjUIQVSSSwUOR14ivvdW1oBKN
Gvsqy5psxrM2dDHphGkwh8EpGwov2/Fo5bsU1a9LAoGBAJfJVnlUms9kB9MckKTR
wGt7QVm2KUX8ky2FotEdAbPIrunRStjNDM6exIyM+2GXx9XhRNirTrnFb7gQXhAP
sdDhnyNmkVKnkeHpKtcnrbpIfclmyHjXrGQOVk9M6RE98l17huwmFPEpNUY3gpA4
21K5G8WZqr3cMzQzVdPgrOKBAoGAcX4F2b1ffHibk3aFn7TQR6kutP2kb6T3Mpoc
h3D2MmLXk0BOp1En/fEvxGN+mQFgKdg601CCKeflXhmvR7KeWFnMYQ3A8Glow0VH
v14EcdS4B7arN8Wg3qOgGtXcGTzM6bCt2eiB4gvOAY9mVQmR3U5oZNFfngLUDrxC
ueWFFqsCgYBFJe38Kby6SpaXpaWDF9rNE8a4QkYXi3N7TDUUYePJ4LbDhfff1oty
6x8SkbFqR0Bb4NQa1fRCozjhlFqQgZvRHkDNqRaTPUNZ1AtTBElVCgWNQKPFL1s7
bUSgkIBztPFI7wGUvUViAFQsFOPSgArkQq7feGHBHrzY/Xu6aXxb4g==
-----END RSA PRIVATE KEY-----
================================================
FILE: development/sos_notebook_test/.ssh/id_rsa.pub
================================================
ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDA7DzpZ9l4LwACSGB+vQ8wKoS3BNeoriryE66+u5Br1M391waRQW6TN8nRcBTPhqfDXo8yfB866/CT3JrkfPVTAaNy76XYDFJZ4b4omSlsG2c6Wz0QusT7unA66tWiU/VB69/rLa98k0Y3vy426K6mn+EnmxJapqyEyiqqqxPvrfV370I92vM5Zl3dMvUIWXy42QJa3yLu+0+FHJytL1aSmZ27UJ+UluBpv29+fTaN5jgLHbO9OhDPcAya3dCe4p/YMQcL5B4sNrIXROA6G6ou+OXWA/4ebgZZa8cKI8kFDiJ0ybRn66NHi5p+4ghnm7c3c7rPjlIOLYmiurOliazL root@9740cb7bc090
================================================
FILE: development/sos_notebook_test/.ssh/known_hosts
================================================
|1|ifI8r7TTuuavRoHPCWi/zRhT7Xg=|cJEU9IuPFryWuUqX7WDDGFBSwxU= ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBV5pS+Um9ke91Rmq1E83GnlS2EdvwpELdU563V8O9Dc9/DMmliPoaWO/oLLW5tz1y2hu9e7ISZytvaeHUnTVg8=
|1|yCnH65n1ZVw+qPTMEqT56AEZT4M=|2pj5jSP74KW6M1eQZYG0j8V2B8s= ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBV5pS+Um9ke91Rmq1E83GnlS2EdvwpELdU563V8O9Dc9/DMmliPoaWO/oLLW5tz1y2hu9e7ISZytvaeHUnTVg8=
|1|bTROh2skcuquL75pSr4wX75zvJE=|nLhgDUKf0rdGtDyc0DkYeb4eId8= ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBV5pS+Um9ke91Rmq1E83GnlS2EdvwpELdU563V8O9Dc9/DMmliPoaWO/oLLW5tz1y2hu9e7ISZytvaeHUnTVg8=
|1|ecfY0BlbNYKAPb3UsJLmuDx20oQ=|3JKsJ85O666utGbMe0BTGw0sG98= ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBV5pS+Um9ke91Rmq1E83GnlS2EdvwpELdU563V8O9Dc9/DMmliPoaWO/oLLW5tz1y2hu9e7ISZytvaeHUnTVg8=
================================================
FILE: development/sos_notebook_test/Dockerfile
================================================
# Copyright (c) Bo Peng and the University of Texas MD Anderson Cancer Center
# Distributed under the terms of the 3-clause BSD License.
# tag created in March 2019
FROM jupyter/r-notebook:83ed2c63671f
MAINTAINER Bo Peng <Bo.Peng@bcm.edu>
USER root
# Tools
RUN apt-get update && apt-get install -y graphviz texlive-xetex texlive-latex-recommended texlive-latex-extra texlive-fonts-recommended libssl1.0.0 libssl-dev libappindicator3-1 libxtst6 libgmp3-dev software-properties-common rsync ssh
USER jovyan
RUN pip install bash_kernel selenium nose
RUN python -m bash_kernel.install --user
RUN pip install markdown wand graphviz imageio pillow nbformat coverage codacy-coverage pytest pytest-cov python-coveralls
RUN conda install -y feather-format -c conda-forge
RUN conda install -c r r-arrow r-dplyr
## trigger rerun for sos updates
RUN DUMMY=$(git ls-remote https://github.com/vatlab/sos.git -t master)
RUN DUMMY=${DUMMY} pip install git+https://github.com/vatlab/sos.git
RUN pip install sos-r sos-python sos-bash --upgrade
RUN pip install ipython -U
USER root
RUN curl https://dl.google.com/linux/direct/google-chrome-stable_current_amd64.deb -o /chrome.deb
RUN dpkg -i /chrome.deb || apt-get install -yf
RUN rm /chrome.deb
RUN wget -q "https://chromedriver.storage.googleapis.com/76.0.3809.126/chromedriver_linux64.zip" -O /tmp/chromedriver.zip \
&& unzip /tmp/chromedriver.zip -d /usr/bin/ \
&& rm /tmp/chromedriver.zip
ENV DISPLAY=:99
RUN ln -s /usr/bin/chromedriver && chmod 777 /usr/bin/chromedriver
RUN chmod 777 /home/jovyan/.local/share/jupyter/
COPY ./.ssh /root/.ssh
RUN chmod 700 /root/.ssh
RUN chmod 600 /root/.ssh/*
COPY ./.ssh /home/jovyan/.ssh
RUN chmod 700 /home/jovyan/.ssh
RUN chmod 600 /home/jovyan/.ssh/*
USER jovyan
================================================
FILE: pyproject.toml
================================================
[build-system]
requires = ["setuptools>=61.0", "wheel"]
build-backend = "setuptools.build_meta"
[project]
name = "sos-notebook"
version = "0.24.7"
description = "Script of Scripts (SoS) kernel and backend for Jupyter - backend dependency for jupyterlab-sos"
readme = "README.md"
requires-python = ">=3.8"
license = {text = "3-clause BSD"}
authors = [
    {name = "Bo Peng", email = "Bo.Peng@bcm.edu"},
]
maintainers = [
    {name = "Bo Peng", email = "Bo.Peng@bcm.edu"},
]
keywords = ["jupyter", "notebook", "workflow", "reproducible-research", "multi-language"]
classifiers = [
    "Development Status :: 4 - Beta",
    "Environment :: Console",
    "License :: OSI Approved :: BSD License",
    "Natural Language :: English",
    "Operating System :: POSIX :: Linux",
    "Operating System :: MacOS :: MacOS X",
    "Operating System :: Microsoft :: Windows",
    "Intended Audience :: Information Technology",
    "Intended Audience :: Science/Research",
    "Programming Language :: Python :: 3 :: Only",
    "Programming Language :: Python :: 3.8",
    "Programming Language :: Python :: 3.9",
    "Programming Language :: Python :: Implementation :: CPython",
]
dependencies = [
    "jupyter_client>=8.0.0",
    "jupyter_core",
    "sos>=0.22.0",
    "nbformat",
    "nbconvert>=6.0.1",
    "ipython",
    "ipykernel>=6.18.0", # 6.18.0 and newer has a new comm manager interface
    "jinja2", # https://github.com/spatialaudio/nbsphinx/issues/563
    "notebook>=5.0.0",
    "tabulate",
    "pandas",
    "numpy",
    "psutil",
]
[project.optional-dependencies]
dev = [
    "invoke>=2.0",
    "pre-commit",
    "pytest>=6.0",
    "pytest-cov",
    "ruff",
    "build",
]
[project.urls]
Homepage = "https://github.com/vatlab/SOS"
Repository = "https://github.com/vatlab/sos-notebook"
Documentation = "https://vatlab.github.io/sos-docs/notebook.html"
"Bug Tracker" = "https://github.com/vatlab/sos-notebook/issues"
[project.entry-points.sos_converters]
"sos-ipynb" = "sos_notebook.converter:ScriptToNotebookConverter"
"ipynb-sos" = "sos_notebook.converter:NotebookToScriptConverter"
"ipynb-html" = "sos_notebook.converter:NotebookToHTMLConverter"
"ipynb-pdf" = "sos_notebook.converter:NotebookToPDFConverter"
"ipynb-md" = "sos_notebook.converter:NotebookToMarkdownConverter"
"ipynb-ipynb" = "sos_notebook.converter:NotebookToNotebookConverter"
[tool.setuptools]
zip-safe = false
[tool.setuptools.packages.find]
where = ["src"]
[tool.setuptools.package-data]
"*" = ["*.js", "*.tpl", "*.png", "*.css", "*.j2", "*.json"]
[tool.uv]
dev-dependencies = [
"invoke>=2.0",
"pre-commit",
"pytest>=6.0",
"pytest-cov",
"ruff",
"build",
"mypy>=1.14.1",
"types-psutil>=6.1.0.20241221",
"pandas-stubs>=2.0.2.230605",
"types-tabulate>=0.9.0.20241207",
"papermill>=2.6.0",
"sos-papermill>=0.2.1",
"testpath>=0.6.0",
]
[tool.ruff]
target-version = "py38"
line-length = 88
exclude = [
".git",
".venv",
"__pycache__",
"build",
"dist",
"*.egg-info",
]
[tool.ruff.lint]
select = [
"E", # pycodestyle errors
"W", # pycodestyle warnings
"F", # pyflakes
"I", # isort
"B", # flake8-bugbear
"C4", # flake8-comprehensions
"UP", # pyupgrade
]
ignore = [
"E501", # line too long (handled by formatter)
]
[tool.ruff.format]
quote-style = "double"
indent-style = "space"
skip-magic-trailing-comma = false
line-ending = "auto"
[tool.mypy]
python_version = "3.9"
warn_return_any = true
warn_unused_configs = true
disallow_untyped_defs = false
ignore_missing_imports = true
no_implicit_optional = true
warn_redundant_casts = true
warn_unused_ignores = true
warn_no_return = true
check_untyped_defs = false
# Ignore specific modules that don't have type hints
[[tool.mypy.overrides]]
module = [
"sos.*",
"selenium.*",
"ipykernel.*",
"papermill.*",
"IPython.*",
]
ignore_missing_imports = true
ignore_errors = true
================================================
FILE: setup.py.old
================================================
#!/usr/bin/env python
#
# Copyright (c) Bo Peng and the University of Texas MD Anderson Cancer Center
# Distributed under the terms of the 3-clause BSD License.
import os
import sys
from setuptools import find_packages, setup
_py_ver = sys.version_info
if _py_ver.major == 2 or (_py_ver.major == 3 and
(_py_ver.minor, _py_ver.micro) < (6, 0)):
raise SystemError(
'sos-notebook requires Python 3.6 or higher. Please upgrade your Python {}.{}.{}.'
.format(_py_ver.major, _py_ver.minor, _py_ver.micro))
# obtain version of SoS
with open('src/sos_notebook/_version.py') as version:
for line in version:
if line.startswith('__version__'):
__version__ = eval(line.split('=')[1])
break
kernel_json = {
"argv": ["python", "-m", "sos_notebook.kernel", "-f", "{connection_file}"],
"display_name": "SoS",
"language": "sos",
}
CURRENT_DIR = os.path.abspath(os.path.dirname(__file__))
def get_long_description():
with open(os.path.join(CURRENT_DIR, "README.md"), "r") as ld_file:
return ld_file.read()
setup(
name="sos-notebook",
version=__version__,
description='Script of Scripts (SoS): an interactive, cross-platform, and cross-language workflow system for reproducible data analysis',
long_description=get_long_description(),
long_description_content_type="text/markdown",
author='Bo Peng',
url='https://github.com/vatlab/SOS',
author_email='Bo.Peng@bcm.edu',
maintainer='Bo Peng',
maintainer_email='Bo.Peng@bcm.edu',
license='3-clause BSD',
include_package_data=True,
classifiers=[
'Development Status :: 4 - Beta',
'Environment :: Console',
'License :: OSI Approved :: BSD License',
'Natural Language :: English',
'Operating System :: POSIX :: Linux',
'Operating System :: MacOS :: MacOS X',
'Operating System :: Microsoft :: Windows',
'Intended Audience :: Information Technology',
'Intended Audience :: Science/Research',
'Programming Language :: Python :: 3 :: Only',
'Programming Language :: Python :: 3.7',
'Programming Language :: Python :: 3.8',
'Programming Language :: Python :: 3.9',
'Programming Language :: Python :: Implementation :: CPython',
],
zip_safe=False,
packages=find_packages('src'),
package_dir={'': 'src'},
python_requires='>=3.7',
install_requires=[
'jupyter_client>=8.0.0',
'jupyter_core',
'sos>=0.22.0',
'nbformat',
'nbconvert>=6.0.1',
'ipython',
# 6.18.0 and newer has a new comm manager interface
'ipykernel>=6.18.0',
# https://github.com/spatialaudio/nbsphinx/issues/563
'jinja2',
'notebook>=5.0.0',
'tabulate',
# 'markdown',
'pandas',
'numpy',
# 'selenium',
# 'requests',
# 'pytest',
'psutil',
],
entry_points='''
[sos_converters]
sos-ipynb = sos_notebook.converter:ScriptToNotebookConverter
ipynb-sos = sos_notebook.converter:NotebookToScriptConverter
ipynb-html = sos_notebook.converter:NotebookToHTMLConverter
ipynb-pdf = sos_notebook.converter:NotebookToPDFConverter
ipynb-md = sos_notebook.converter:NotebookToMarkdownConverter
ipynb-ipynb = sos_notebook.converter:NotebookToNotebookConverter
''')
================================================
FILE: src/sos_notebook/__init__.py
================================================
#!/usr/bin/env python3
#
# Copyright (c) Bo Peng and the University of Texas MD Anderson Cancer Center
# Distributed under the terms of the 3-clause BSD License.
from ._version import __version__
# suppress warning
_ = __version__
================================================
FILE: src/sos_notebook/_version.py
================================================
#!/usr/bin/env python3
#
# Copyright (c) Bo Peng and the University of Texas MD Anderson Cancer Center
# Distributed under the terms of the 3-clause BSD License.
import sys
__all__ = ["__version__", "SOS_FULL_VERSION"]
_py_ver = sys.version_info
if _py_ver.major == 2 or (
_py_ver.major == 3 and (_py_ver.minor, _py_ver.micro) < (7, 0)
):
raise SystemError(
f"SOS requires Python 3.7 or higher. Please upgrade your Python {_py_ver.major}.{_py_ver.minor}.{_py_ver.micro}."
)
# version of the SoS language
__sos_version__ = "1.0"
# version of the sos command
__version__ = "0.24.7"
__py_version__ = f"{_py_ver.major}.{_py_ver.minor}.{_py_ver.micro}"
#
SOS_FULL_VERSION = (
f"{__version__} for Python {_py_ver.major}.{_py_ver.minor}.{_py_ver.micro}"
)
SOS_COPYRIGHT = f"""SoS {__version__} : Copyright (c) 2016 Bo Peng"""
SOS_CONTACT = """Please visit http://github.com/vatlab/SOS for more information."""
================================================
FILE: src/sos_notebook/comm_manager.py
================================================
"""Base class to manage comms"""
# Copyright (c) IPython Development Team.
# Distributed under the terms of the Modified BSD License.
import logging
import time
import traitlets
import traitlets.config
from ipykernel.comm.manager import CommManager
logger = logging.getLogger("soskernel.comm")
class CommProxyHandler:
def __init__(self, KC, sos_kernel):
self._KC = KC
self._sos_kernel = sos_kernel
def handle_msg(self, msg):
self._KC.shell_channel.send(msg)
# wait for subkernel to handle
comm_msg_started = False
comm_msg_ended = False
while not (comm_msg_started and comm_msg_ended):
while self._KC.iopub_channel.msg_ready():
sub_msg = self._KC.iopub_channel.get_msg()
if sub_msg["header"]["msg_type"] == "status":
if sub_msg["content"]["execution_state"] == "busy":
comm_msg_started = True
elif (
comm_msg_started
and sub_msg["content"]["execution_state"] == "idle"
):
comm_msg_ended = True
continue
self._sos_kernel.session.send(self._sos_kernel.iopub_socket, sub_msg)
time.sleep(0.001)
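The busy/idle bracketing in `handle_msg` above can be illustrated against a simulated iopub stream (hypothetical message dicts; the real handler drains `KC.iopub_channel` and forwards non-status messages to the SoS kernel's iopub socket):

```python
# Simulated iopub messages; real ones arrive on a jupyter_client channel.
msgs = [
    {"header": {"msg_type": "status"}, "content": {"execution_state": "busy"}},
    {"header": {"msg_type": "comm_msg"}, "content": {"data": {}}},
    {"header": {"msg_type": "status"}, "content": {"execution_state": "idle"}},
]

forwarded = []
started = ended = False
for msg in msgs:
    if msg["header"]["msg_type"] == "status":
        state = msg["content"]["execution_state"]
        if state == "busy":
            started = True          # subkernel began handling the comm msg
        elif started and state == "idle":
            ended = True            # subkernel finished; stop waiting
        continue                    # status messages are not forwarded
    forwarded.append(msg)           # everything else goes to the frontend
```

The loop terminates only after seeing a busy status followed by an idle status, which brackets the subkernel's handling of the forwarded comm message.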
class SoSCommManager(CommManager):
"""This comm manager will replace the system default comm manager.
When a comm is requested, it will return a `CommProxyHandler` instead
of a real comm if the comm is created by the subkerel.
"""
kernel = traitlets.Instance("sos_notebook.kernel.SoS_Kernel")
def __init__(self, **kwargs):
super().__init__(**kwargs)
self._forwarders = {}
def register_subcomm(self, comm_id, KC, sos_kernel):
self._forwarders[comm_id] = CommProxyHandler(KC, sos_kernel)
def get_comm(self, comm_id):
try:
return self.comms[comm_id]
except Exception:
if comm_id in self._forwarders:
# self._sos_kernel.start_forwarding_ioPub()
return self._forwarders[comm_id]
self.log.warning("No such comm: %s", comm_id)
if self.log.isEnabledFor(logging.DEBUG):
# don't create the list of keys if debug messages aren't enabled
self.log.debug("Current comms: %s", list(self.comms.keys()))
================================================
FILE: src/sos_notebook/completer.py
================================================
#!/usr/bin/env python3
#
# Copyright (c) Bo Peng and the University of Texas MD Anderson Cancer Center
# Distributed under the terms of the 3-clause BSD License.
import glob
import os
import rlcompleter
from sos.utils import env
from .magics import SoS_Magics
def last_valid(line):
text = line
for char in (" ", "\t", '"', "'", "=", "("):
if text.endswith(char):
text = ""
elif char in text:
text = text.rsplit(char, 1)[-1]
return text
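The helper above repeatedly strips everything up to the last separator, leaving only the trailing token that completion should operate on. A stand-alone sketch (re-implemented here for illustration; the authoritative version is in `completer.py`):

```python
# Re-implementation of last_valid() for illustration only.
def last_valid(line):
    text = line
    for char in (" ", "\t", '"', "'", "=", "("):
        if text.endswith(char):
            # a trailing separator means a fresh token is being started
            text = ""
        elif char in text:
            # keep only what follows the last occurrence of the separator
            text = text.rsplit(char, 1)[-1]
    return text

print(last_valid("%use R"))        # -> R
print(last_valid("open('/tmp/f"))  # -> /tmp/f
```

The completers downstream (magics, paths, Python names) all match against this trailing token rather than the whole line.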
class SoS_MagicsCompleter:
def __init__(self, kernel):
self.kernel = kernel
def get_completions(self, line):
text = last_valid(line)
if not text.strip():
if line.startswith("%get"):
return text, [
x
for x in env.sos_dict.keys()
if x not in self.kernel.original_keys and not x.startswith("_")
]
if any(line.startswith(x) for x in ("%use", "%with", "%shutdown")):
return text, ["SoS"] + list(self.kernel.supported_languages.keys())
return None
if text.startswith("%") and line.startswith(text):
return text, [
"%" + x + " " for x in SoS_Magics.names if x.startswith(text[1:])
]
if any(line.startswith(x) for x in ("%use", "%with", "%shutdown")):
return text, [
x for x in self.kernel.supported_languages.keys() if x.startswith(text)
]
if line.startswith("%get "):
return text, [
x
for x in env.sos_dict.keys()
if x.startswith(text)
and x not in self.kernel.original_keys
and not x.startswith("_")
]
return None
class SoS_PathCompleter:
"""PathCompleter.. The problem with ptpython's path completor is that
it only matched 'text_before_cursor', which would not match cases such
as %cd ~/, which we will need."""
def __init__(self):
pass
def get_completions(self, line):
text = last_valid(line)
if not text.strip():
return text, glob.glob("*")
matches = glob.glob(os.path.expanduser(text) + "*")
if (
len(matches) == 1
and matches[0] == os.path.expanduser(text)
and os.path.isdir(os.path.expanduser(text))
):
return text, glob.glob(os.path.expanduser(text) + "/*")
return text, matches
class PythonCompleter:
def __init__(self):
pass
def get_completions(self, line):
text = last_valid(line)
completer = rlcompleter.Completer(env.sos_dict._dict)
return text, completer.global_matches(text)
class SoS_Completer:
def __init__(self, kernel):
self.completers = [
SoS_MagicsCompleter(kernel),
SoS_PathCompleter(),
PythonCompleter(),
]
def complete_text(self, code, cursor_pos=None):
if cursor_pos is None:
cursor_pos = len(code)
# get current line before cursor
doc = code[:cursor_pos].rpartition("\n")[2]
for c in self.completers:
matched = c.get_completions(doc)
if matched is None:
continue
if isinstance(matched, tuple):
if matched[1]:
return matched
else:
raise RuntimeError(f"Unrecognized completer return type {matched}")
# No match
return "", []
================================================
FILE: src/sos_notebook/converter.py
================================================
#!/usr/bin/env python3
#
# Copyright (c) Bo Peng and the University of Texas MD Anderson Cancer Center
# Distributed under the terms of the 3-clause BSD License.
import argparse
import os
import sys
import tempfile
import time
from importlib import metadata
from io import StringIO
import nbformat
from nbconvert.exporters import Exporter
from nbformat.v4 import new_code_cell, new_markdown_cell, new_notebook
from sos.syntax import SOS_SECTION_HEADER
from sos.utils import env
def execute_sos_notebook(
input_notebook, output_notebook=None, return_content=True, parameters=None
):
# execute input notebook
# if input_notebook is a string, it will be loaded. Otherwise it should be a notebook object
# if output_notebook is a string, it will be used as output filename. Otherwise
# the notebook will be returned.
if parameters is None:
parameters = {}
try:
from papermill.execute import execute_notebook
except ImportError as e:
raise RuntimeError(
"Please install papermill for the use of option --execute."
) from e
if not any(
entrypoint.name == "sos"
for entrypoint in metadata.entry_points(group="papermill.engine")
):
raise RuntimeError(
"Please install sos-papermill for the use of option --execute."
)
if isinstance(input_notebook, str):
input_file = input_notebook
else:
input_file = tempfile.NamedTemporaryFile(
prefix="__tmp_input_nb", dir=os.getcwd(), suffix=".ipynb", delete=False
).name
with open(input_file, "w") as notebook_file:
nbformat.write(input_notebook, notebook_file, 4)
if output_notebook is None:
output_file = tempfile.NamedTemporaryFile(
prefix="__tmp_output_nb", dir=os.getcwd(), suffix=".ipynb", delete=False
).name
else:
output_file = output_notebook
execute_notebook(
input_path=input_file,
output_path=output_file,
engine_name="sos",
kernel_name="sos",
parameters=parameters,
)
if os.path.basename(input_file).startswith("__tmp_input_nb"):
try:
os.remove(input_file)
except Exception as e:
env.logger.warning(
f"Failed to remove temporary input file {input_file}: {e}"
)
if os.path.basename(output_file).startswith("__tmp_output_nb") and return_content:
new_nb = nbformat.read(output_file, nbformat.NO_CONVERT)
try:
os.remove(output_file)
except Exception as e:
env.logger.warning(
f"Failed to remove temporary output file {output_file}: {e}"
)
return new_nb
return output_file
# This class cannot be defined in .kernel because doing so causes a
# problem where unittest cannot resolve __main__
class SoS_Exporter(Exporter):
def __init__(self, config=None, **kwargs):
self.output_extension = ".sos"
self.output_mimetype = "text/x-sos"
Exporter.__init__(self, config, **kwargs)
def content_from_notebook_cell(self, cell, fh, idx=0):
# in non-all mode, markdown cells are ignored because they can be mistakenly
# treated as markdown content of an action or script #806
if cell.cell_type != "code":
return
#
# in all-content mode, code cells of all kernels are exported, with a
# comment recording the kernel of each cell
fh.write(
f"# cell {idx + 1}, kernel={cell.metadata['kernel']}\n{cell.source}\n\n"
)
return idx
def workflow_from_notebook_cell(self, cell, fh, idx=0):
# in non-all mode, markdown cells are ignored because they can be mistakenly
# treated as markdown content of an action or script #806
if cell.cell_type != "code":
return
#
# Non-sos code cells are also ignored
if "kernel" in cell.metadata and cell.metadata["kernel"] not in (
"sos",
"SoS",
None,
):
return
lines = cell.source.split("\n")
valid_cell = False
for idx, line in enumerate(lines):
if valid_cell or (line.startswith("%include") or line.startswith("%from")):
fh.write(line + "\n")
elif SOS_SECTION_HEADER.match(line):
valid_cell = True
# look retrospectively for comments
c = idx - 1
comment = ""
while c >= 0 and lines[c].startswith("#"):
comment = lines[c] + "\n" + comment
c -= 1
fh.write(comment + line + "\n")
# other content, namely non-%include lines before the section header, is ignored
if valid_cell:
fh.write("\n")
return idx
def from_notebook_node(self, nb, resources, **kwargs):
#
cells = nb.cells
with StringIO() as fh:
fh.write("#!/usr/bin/env sos-runner\n")
fh.write("#fileformat=SOS1.0\n\n")
for idx, cell in enumerate(cells):
if "all_content" in kwargs and kwargs["all_content"]:
self.content_from_notebook_cell(cell, fh, idx)
else:
self.workflow_from_notebook_cell(cell, fh, idx)
content = fh.getvalue()
resources["output_extension"] = ".sos"
return content, resources
#
# Converter to Notebook
#
class ScriptToNotebookConverter:
def get_parser(self):
parser = argparse.ArgumentParser(
"sos convert FILE.sos FILE._ipynb (or --to ipynb)",
description="""Convert a sos script to Jupyter notebook (.ipynb)
so that it can be opened by Jupyter notebook.""",
)
return parser
def convert(self, script_file, notebook_file, args=None, unknown_args=None):
"""
Convert a SoS script to a Jupyter notebook (.ipynb) so that it can be
opened in Jupyter.
"""
if unknown_args:
raise ValueError(f"Unrecognized parameter {unknown_args}")
cells = []
cell_count = 1
cell_type = "code"
metainfo = {}
content = []
def add_cell(cells, content, cell_type, cell_count, metainfo):
# if a section consists entirely of report lines, emit it as a markdown cell
if not content:
return
if cell_type not in ("code", "markdown"):
env.logger.warning(f"Unrecognized cell type {cell_type}, code assumed.")
if cell_type == "markdown" and any(
x.strip() and not x.startswith("#! ") for x in content
):
env.logger.warning(
"Markdown lines do not start with #!; code cell assumed."
)
cell_type = "code"
#
if cell_type == "markdown":
cells.append(
new_markdown_cell(
source="".join([x[3:] for x in content]).strip(),
metadata=metainfo,
)
)
else:
cells.append(
new_code_cell(
# remove any trailing blank lines...
source="".join(content).strip(),
execution_count=cell_count,
metadata=metainfo,
)
)
with open(script_file) as script:
first_block = True
for line in script:
if line.startswith("#") and first_block:
if line.startswith("#!"):
continue
if line.startswith("#fileformat="):
if not line[12:].startswith("SOS"):
raise RuntimeError(
f"{script_file} is not a SoS script according to #fileformat line."
)
continue
first_block = False
mo = SOS_SECTION_HEADER.match(line)
if mo:
# get rid of empty content
if not any(x.strip() for x in content):
content = []
if content:
# the comment should be absorbed into the next section
i = len(content) - 1
while i >= 0 and content[i].startswith("#"):
i -= 1
# i points to the last non-comment line
if i >= 0:
add_cell(
cells, content[: i + 1], cell_type, cell_count, metainfo
)
content = content[i + 1 :]
cell_type = "code"
cell_count += 1
metainfo = {"kernel": "SoS"}
content += [line]
continue
if line.startswith("#!"):
if cell_type == "markdown":
content.append(line)
continue
# get rid of empty content
if not any(x.strip() for x in content):
content = []
if content:
add_cell(cells, content, cell_type, cell_count, metainfo)
cell_type = "markdown"
cell_count += 1
content = [line]
continue
# other cases
content.append(line)
#
if content and any(x.strip() for x in content):
add_cell(cells, content, cell_type, cell_count, metainfo)
#
nb = new_notebook(
cells=cells,
metadata={
"kernelspec": {"display_name": "SoS", "language": "sos", "name": "sos"},
"language_info": {
"codemirror_mode": "sos",
"file_extension": ".sos",
"mimetype": "text/x-sos",
"name": "sos",
"pygments_lexer": "python",
"nbconvert_exporter": "sos_notebook.converter.SoS_Exporter",
},
"sos": {"kernels": [["SoS", "sos", "", ""]]},
},
)
if not notebook_file:
nbformat.write(nb, sys.stdout, 4)
else:
with open(notebook_file, "w") as notebook:
nbformat.write(nb, notebook, 4)
env.logger.info(f"Jupyter notebook saved to {notebook_file}")
# if err:
# raise RuntimeError(repr(err))
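The `#!` convention used by the converter above (markdown content is stored in a .sos script as lines with a `#! ` prefix, which `add_cell` strips via `x[3:]`) can be illustrated stand-alone:

```python
# Minimal sketch of the round-trip convention for markdown content in a
# .sos script: lines prefixed with "#! " become markdown cell source.
script_lines = [
    "#! # Analysis\n",
    "#! This step filters the data.\n",
]
# strip the three-character "#! " prefix from each line, as add_cell() does
markdown_source = "".join(line[3:] for line in script_lines).strip()
print(markdown_source)
# -> # Analysis
#    This step filters the data.
```

Conversely, `NotebookToScriptConverter` would write markdown cells back out with the same prefix, so the script remains a valid SoS workflow in batch mode.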
#
# notebook to sos script
#
class NotebookToScriptConverter:
def get_parser(self):
parser = argparse.ArgumentParser(
"sos convert FILE.ipynb FILE.sos (or --to sos)",
description="""Export Jupyter notebook with a SoS kernel to a
.sos file. The cells are presented in the .sos file as
cell structure lines, which will be ignored if executed
in batch mode """,
)
parser.add_argument(
"-a",
"--all",
action="store_true",
help="""If set, export all cells to the output file, which
does not have to be a valid sos workflow.""",
)
return parser
def convert(self, notebook_file, sos_file, args=None, unknown_args=None):
"""
Convert a Jupyter notebook to SoS format.
"""
if unknown_args:
raise ValueError(f"Unrecognized parameter {unknown_args}")
exporter = SoS_Exporter()
notebook = nbformat.read(notebook_file, nbformat.NO_CONVERT)
output, _ = exporter.from_notebook_node(
notebook, {}, all_content=args.all if args else False
)
if not sos_file:
sys.stdout.write(output)
elif isinstance(sos_file, str):
with open(sos_file, "w") as sosfile:
sosfile.write(output)
env.logger.info(f"SoS script saved to {sos_file}")
else:
sos_file.write(output)
#
# notebook to HTML
#
def get_template_args():
return [
"--TemplateExporter.extra_template_basedirs",
os.path.join(f"{os.path.split(os.path.abspath(__file__))[0]}", "templates"),
]
def export_notebook(
exporter_class, to_format, notebook_file, output_file, unknown_args=None, view=False
):
import subprocess
if not os.path.isfile(notebook_file):
raise RuntimeError(f"{notebook_file} does not exist")
if not output_file:
tmp = tempfile.NamedTemporaryFile(delete=False, suffix="." + to_format).name
tmp_stderr = tempfile.NamedTemporaryFile(
delete=False, suffix="." + to_format
).name
with open(tmp_stderr, "w") as err:
ret = subprocess.call(
[
"jupyter",
"nbconvert",
notebook_file,
"--to",
to_format,
"--output",
tmp,
]
+ ([] if unknown_args is None else unknown_args),
stderr=err,
)
with open(tmp_stderr) as err:
err_msg = err.read()
if ret != 0:
env.logger.error(err_msg)
raise RuntimeError(
f"Failed to convert {notebook_file} to {to_format} format"
)
# identify output files
dest_file = err_msg.rsplit()[-1]
if not os.path.isfile(dest_file):
env.logger.error(err_msg)
raise RuntimeError("Failed to get converted file.")
if view:
import webbrowser
url = f"file://{os.path.abspath(dest_file)}"
env.logger.info(f"Viewing {url} in a browser")
webbrowser.open(url, new=2)
# allow browser some time to process the file before this process removes it
time.sleep(2)
else:
with open(dest_file, "rb") as tfile:
sys.stdout.buffer.write(tfile.read())
try:
os.remove(tmp)
except Exception:
pass
else:
ret = subprocess.call(
[
"jupyter",
"nbconvert",
os.path.abspath(notebook_file),
"--to",
to_format,
"--output",
os.path.abspath(output_file),
]
+ ([] if unknown_args is None else unknown_args)
)
if ret != 0:
raise RuntimeError(
f"Failed to convert {notebook_file} to {to_format} format"
)
env.logger.info(f"Output saved to {output_file}")
def _is_int(value):
"""Use casting to check if value can convert to an `int`."""
try:
int(value)
except ValueError:
return False
else:
return True
def _is_float(value):
"""Use casting to check if value can convert to a `float`."""
try:
float(value)
except ValueError:
return False
else:
return True
def parse_papermill_parameters(values):
parameters = {}
for value in values:
if "=" not in value:
parameters[value] = True
continue
k, v = value.split("=", 1)
if v == "True":
parameters[k] = True
elif v == "False":
parameters[k] = False
elif value == "None":
parameters[k] = None
elif _is_int(v):
parameters[k] = int(v)
elif _is_float(v):
parameters[k] = float(v)
else:
parameters[k] = v
return parameters
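The type-guessing rules above can be sketched stand-alone (re-implemented here for illustration; note that `k=None` is intended to map to Python `None`, and a bare `name` without `=` means `True`):

```python
# Illustrative re-implementation of the name=value type guessing used by
# parse_papermill_parameters(); not imported from sos_notebook.
def parse_params(values):
    params = {}
    for item in values:
        if "=" not in item:
            params[item] = True        # bare `name` means True
            continue
        k, v = item.split("=", 1)
        if v == "True":
            params[k] = True
        elif v == "False":
            params[k] = False
        elif v == "None":
            params[k] = None
        else:
            # try int, then float, then fall back to string
            for cast in (int, float):
                try:
                    params[k] = cast(v)
                    break
                except ValueError:
                    continue
            else:
                params[k] = v
    return params

print(parse_params(["cutoff=0.05", "n=10", "debug", "label=run1"]))
# -> {'cutoff': 0.05, 'n': 10, 'debug': True, 'label': 'run1'}
```

These parsed values are what the `--execute` options of the converters pass through to papermill as notebook parameters.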
class NotebookToHTMLConverter:
def get_parser(self):
parser = argparse.ArgumentParser(
"sos convert FILE.ipynb FILE.html (or --to html)",
description="""Export Jupyter notebook with a SoS kernel to a
.html file. Additional command line arguments are passed directly to
command "jupyter nbconvert --to html" so please refer to nbconvert manual for
available options.""",
)
parser.add_argument(
"--template",
help="""Template to export Jupyter notebook with sos kernel. SoS provides a number
of templates, with sos-report displays markdown cells and only output of cells with
prominent tag, and a control panel to control the display of the rest of the content
""",
)
parser.add_argument(
"-e",
"--execute",
nargs="*",
help="""Execute the workflow using sos-papermill before exporting to HTML format.
One or more parameters are acceptable and should be specified as name=value,
where the type of value will be automatically guessed. An exception to this
rule is that `name` without `=` will be treated as value True.""",
)
parser.add_argument(
"-a",
"--all",
action="store_true",
help="""If specified, save content of all cells to .sos file.""",
)
parser.add_argument(
"-v",
"--view",
action="store_true",
help="""Open the output file in a broswer. In case no html file is specified,
this option will display the HTML file in a browser, instead of writing its
content to standard output.""",
)
return parser
def convert(self, notebook_file, output_file, sargs=None, unknown_args=None):
from nbconvert.exporters.html import HTMLExporter
if unknown_args is None:
unknown_args = []
if sargs.template:
template_path, template_name = os.path.split(sargs.template)
if template_path == "":
unknown_args = (
get_template_args() + ["--template", template_name] + unknown_args
)
else:
unknown_args = (
get_template_args()
+ [
"--TemplateExporter.extra_template_basedirs",
template_path,
"--template",
template_name,
]
+ unknown_args
)
if sargs.execute is not None:
notebook_file = execute_sos_notebook(
notebook_file,
return_content=False,
parameters=parse_papermill_parameters(sargs.execute),
)
export_notebook(
HTMLExporter,
"html",
notebook_file,
output_file,
unknown_args,
view=sargs.view,
)
if os.path.basename(notebook_file).startswith("__tmp_output_nb"):
try:
os.remove(notebook_file)
except Exception as e:
env.logger.warning(
f"Failed to remove temporary output file {notebook_file}: {e}"
)
#
# Notebook to PDF
#
class NotebookToPDFConverter:
def get_parser(self):
parser = argparse.ArgumentParser(
"sos convert FILE.ipynb FILE.pdf (or --to pdf)",
description="""Export Jupyter notebook with a SoS kernel to a
.pdf file. Additional command line arguments are passed directly to
command "jupyter nbconvert --to pdf" so please refer to nbconvert manual for
available options.""",
)
parser.add_argument(
"-e",
"--execute",
nargs="*",
help="""Execute the workflow using sos-papermill before exporting to PDF format.
One or more parameters are acceptable and should be specified as name=value,
where the type of value will be automatically guessed. An exception to this
rule is that `name` without `=` will be treated as value True.""",
)
parser.add_argument(
"--template",
help="""Template to export Jupyter notebook with sos kernel. SoS provides a number
of templates, with sos-report displays markdown cells and only output of cells with
prominent tag, and a control panel to control the display of the rest of the content
""",
)
return parser
def convert(self, notebook_file, output_file, sargs=None, unknown_args=None):
from nbconvert.exporters.pdf import PDFExporter
if sargs.execute is not None:
notebook_file = execute_sos_notebook(
notebook_file,
return_content=False,
parameters=parse_papermill_parameters(sargs.execute),
)
if unknown_args is None:
unknown_args = []
if sargs.template:
template_path, template_name = os.path.split(sargs.template)
if template_path == "":
unknown_args = (
get_template_args() + ["--template", template_name] + unknown_args
)
else:
unknown_args = (
get_template_args()
+ [
"--TemplateExporter.extra_template_basedirs",
template_path,
"--template",
template_name,
]
+ unknown_args
)
# jupyter convert will add extension to output file...
if output_file is not None and output_file.endswith(".pdf"):
output_file = output_file[:-4]
export_notebook(
PDFExporter,
"pdf",
notebook_file,
output_file,
get_template_args() + unknown_args,
)
if os.path.basename(notebook_file).startswith("__tmp_output_nb"):
try:
os.remove(notebook_file)
except Exception as e:
env.logger.warning(
f"Failed to remove temporary output file {notebook_file}: {e}"
)
#
# Notebook to Markdown
#
class NotebookToMarkdownConverter:
def get_parser(self):
parser = argparse.ArgumentParser(
"sos convert FILE.ipynb FILE.md (or --to md)",
description="""Export Jupyter notebook with a SoS kernel to a
markdown file. Additional command line arguments are passed directly to
command "jupyter nbconvert --to markdown" so please refer to nbconvert manual for
available options.""",
)
parser.add_argument(
"-e",
"--execute",
nargs="*",
help="""Execute the workflow using sos-papermill before exporting to markdown format.
One or more parameters are acceptable and should be specified as name=value,
where the type of value will be automatically guessed. An exception to this
rule is that `name` without `=` will be treated as value True.""",
)
return parser
def convert(self, notebook_file, output_file, sargs=None, unknown_args=None):
from nbconvert.exporters.markdown import MarkdownExporter
if sargs.execute is not None:
notebook_file = execute_sos_notebook(
notebook_file,
return_content=False,
parameters=parse_papermill_parameters(sargs.execute),
)
export_notebook(
MarkdownExporter,
"markdown",
notebook_file,
output_file,
get_template_args() + ["--template", "sos-markdown"] + unknown_args,
)
if os.path.basename(notebook_file).startswith("__tmp_output_nb"):
try:
os.remove(notebook_file)
except Exception as e:
env.logger.warning(
f"Failed to remove temporary output file {notebook_file}: {e}"
)
#
# Notebook to Notebook
#
class NotebookToNotebookConverter:
def get_parser(self):
parser = argparse.ArgumentParser(
"sos convert FILE.ipynb FILE.ipynb (or --to ipynb)",
description="""Export a Jupyter notebook with a non-SoS kernel to a SoS notebook
with SoS kernel, or from a SoS notebook to a regular notebook with specified kernel,
or execute a SoS notebook.""",
)
parser.add_argument(
"-k",
"--kernel",
help="""Kernel for the destination notebook. The default kernel is
SoS which converts a non-SoS notebook to SoS Notebook. If another kernel is specified,
this command will remove cell-specific kernel information and convert a SoS Notebook
to regular notebook with specified kernel.""",
)
parser.add_argument(
"--python3-to-sos",
action="store_true",
help="""Convert python3 cells to SoS.""",
)
parser.add_argument(
"--execute",
nargs="*",
help="""Execute the workflow using sos-papermill. One or more parameters
are acceptable and should be specified as name=value,
where the type of value will be automatically guessed. An exception to this
rule is that `name` without `=` will be treated as value True.""",
)
parser.add_argument(
"--inplace",
action="store_true",
help="""Overwrite input notebook with the output.""",
)
return parser
def nonSoS_to_SoS_notebook(self, notebook, args):
"""Converting a nonSoS notebook to SoS notebook by adding kernel metadata"""
# get the kernel of the notebook
# the language name, e.g. 'R'; the kernelspec also has a separate 'display_name'
lan_name = notebook["metadata"]["kernelspec"]["language"]
if lan_name == "python":
lan_name = "Python3"
# the kernel name, e.g. 'ir'
kernel_name = notebook["metadata"]["kernelspec"]["name"]
# if it is already a SoS notebook, do nothing.
if kernel_name == "sos":
if args.inplace:
return
env.logger.warning(
"Notebook is already using the sos kernel. No conversion is needed."
)
return notebook
# convert to?
if kernel_name == "python3" and args.python3_to_sos:
to_lan = "SoS"
to_kernel = "sos"
else:
to_lan = lan_name
to_kernel = kernel_name
#
cells = []
for cell in notebook.cells:
if cell.cell_type == "code":
cell.metadata["kernel"] = to_lan
cells.append(cell)
#
# new header
kernels = [["SoS", "sos", "", "", ""]]
if to_lan != "SoS":
kernels += [[to_lan, to_kernel, "", "", ""]]
metadata = {
"kernelspec": {"display_name": "SoS", "language": "sos", "name": "sos"},
"language_info": {
"file_extension": ".sos",
"mimetype": "text/x-sos",
"name": "sos",
"pygments_lexer": "python",
"nbconvert_exporter": "sos_notebook.converter.SoS_Exporter",
},
"sos": {"kernels": kernels},
}
return new_notebook(cells=cells, metadata=metadata)
def SoS_to_nonSoS_notebook(self, notebook, args):
kernel_name = notebook["metadata"]["kernelspec"]["name"]
if kernel_name != "sos":
raise ValueError(
f"Cannot convert a notebook with kernel {kernel_name} to a notebook with kernel {args.kernel}"
)
all_subkernels = [
x[1] for x in notebook["metadata"]["sos"]["kernels"] if x[1] != "sos"
]
kinfo = [
x for x in notebook["metadata"]["sos"]["kernels"] if x[1] == args.kernel
]
if not kinfo:
if args.kernel == "python3":
# converting SoS cells to python3, should be more or less ok
kinfo = [
[
"Python3",
"python3",
"Python3",
"",
{"name": "ipython", "version": 3},
]
]
else:
raise ValueError(
f"Specified kernel {args.kernel} is not one of the subkernels ({', '.join(all_subkernels)}) used in the SoS notebook. "
)
if len(all_subkernels) > 1:
env.logger.warning(
f"More than one subkernels ({', '.join(all_subkernels)}) are used in the SoS notebook. They will all be considered as {args.kernel} cells."
)
# from SoS to args.kernel, we will need to first strip the cell-level kernel info
cells = []
for cell in notebook.cells:
if cell.cell_type == "code":
cell.metadata.pop("kernel")
cells.append(cell)
# NOTE: we do not have enough information to restore language_info,
# which contains things such as the codemirror mode and mimetype. However,
# Jupyter should be able to open the notebook and retrieve such information
# from the kernel, and language_info will be written when the notebook is
# saved again.
metadata = {
"kernelspec": {
"display_name": kinfo[0][0],
"language": kinfo[0][2],
"name": kinfo[0][1],
}
}
return new_notebook(cells=cells, metadata=metadata)
def convert(self, notebook_file, output_file, sargs=None, unknown_args=None):
notebook = nbformat.read(notebook_file, nbformat.NO_CONVERT)
kernel_name = notebook["metadata"]["kernelspec"]["name"]
nb = None
# what are we supposed to do?
if kernel_name == "sos" and sargs.kernel and sargs.kernel not in ("sos", "SoS"):
# sos => nonSoS
if sargs.execute is not None:
notebook = execute_sos_notebook(
notebook_file, parameters=parse_papermill_parameters(sargs.execute)
)
nb = self.SoS_to_nonSoS_notebook(notebook, sargs)
elif kernel_name == "sos" and not sargs.kernel:
if sargs.execute is not None:
if output_file and notebook_file != output_file:
execute_sos_notebook(
notebook_file,
output_file,
parameters=parse_papermill_parameters(sargs.execute),
)
env.logger.info(f"Jupyter notebook saved to {output_file}")
return
nb = execute_sos_notebook(
notebook_file, parameters=parse_papermill_parameters(sargs.execute)
)
# nonSoS => SoS
elif kernel_name != "sos" and sargs.kernel in ("sos", "SoS", None):
nb = self.nonSoS_to_SoS_notebook(notebook, sargs)
if sargs.execute is not None:
nb = execute_sos_notebook(
nb, parameters=parse_papermill_parameters(sargs.execute)
)
if nb is None:
# nothing to do (e.g. sos -> sos) without --execute
return
if sargs.inplace:
with open(notebook_file, "w") as new_nb:
nbformat.write(nb, new_nb, 4)
env.logger.info(f"Jupyter notebook saved to {notebook_file}")
elif not output_file:
nbformat.write(nb, sys.stdout, 4)
else:
with open(output_file, "w") as new_nb:
nbformat.write(nb, new_nb, 4)
env.logger.info(f"Jupyter notebook saved to {output_file}")
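The `--execute` help text near the top of this section describes parameters in `name=value` form, with the value's type guessed automatically and a bare `name` treated as True (the actual parsing is done by `parse_papermill_parameters`, defined elsewhere in this module). A minimal, standalone sketch of that parsing convention, under the assumption that type guessing follows Python literal syntax:

```python
import ast

def parse_kv_parameters(args):
    """Parse ['a=1', 'b=text', 'flag'] into {'a': 1, 'b': 'text', 'flag': True}.

    Values are type-guessed with ast.literal_eval, falling back to str;
    a bare name without '=' is treated as True.
    """
    params = {}
    for arg in args:
        if "=" not in arg:
            # bare name without '=' is considered True
            params[arg] = True
            continue
        name, value = arg.split("=", 1)
        try:
            # guess the type: numbers, lists, dicts, booleans, strings with quotes
            params[name] = ast.literal_eval(value)
        except (ValueError, SyntaxError):
            # not a Python literal; keep as plain string
            params[name] = value
    return params
```

For example, `parse_kv_parameters(["cutoff=0.05", "title=demo", "verbose"])` yields an int-free dict with a float, a string, and a True flag.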
================================================
FILE: src/sos_notebook/inspector.py
================================================
#!/usr/bin/env python3
#
# Copyright (c) Bo Peng and the University of Texas MD Anderson Cancer Center
# Distributed under the terms of the 3-clause BSD License.
import pydoc
from sos.syntax import SOS_USAGES
from sos.utils import env
from .magics import SoS_Magics
class SoS_VariableInspector:
def __init__(self, kernel):
self.kernel = kernel
self.preview_magic = kernel.magics.get("preview")
def inspect(self, name, line, pos):
try:
obj_desc, preview = self.preview_magic.preview_var(name, style=None)
if preview is None:
return {}
format_dict, _ = preview
if "text/plain" in format_dict:
return format_dict
return {"text/plain": f"{repr(env.sos_dict['name'])} ({obj_desc})"}
except Exception:
return {}
class SoS_SyntaxInspector:
def __init__(self, kernel):
self.kernel = kernel
def inspect(self, name, line, pos):
if line.startswith("%") and name in SoS_Magics.names and pos <= len(name) + 1:
try:
magic = SoS_Magics(self.kernel).get(name)
parser = magic.get_parser()
return {"text/plain": parser.format_help()}
except Exception as e:
return {"text/plain": f"Magic %{name}: {e}"}
if line.startswith(name + ":") and pos <= len(name):
if self.kernel.original_keys is None:
self.kernel._reset_dict()
# input: etc
if name in SOS_USAGES:
return {"text/plain": SOS_USAGES[name]}
if name in env.sos_dict:
# action?
return {
"text/plain": pydoc.render_doc(
env.sos_dict[name], title="%s", renderer=pydoc.plaintext
),
"text/html": pydoc.render_doc(
env.sos_dict[name], title="%s", renderer=pydoc.html
),
}
return {}
class SoS_Inspector:
def __init__(self, kernel):
self.inspectors = [
SoS_SyntaxInspector(kernel),
SoS_VariableInspector(kernel),
]
def inspect(self, name, line, pos):
for c in self.inspectors:
try:
data = c.inspect(name, line, pos)
if data:
return data
except Exception:
continue
# No match
return {}
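`SoS_Inspector` above chains inspectors and returns the first non-empty result, swallowing failures from individual inspectors. The pattern in isolation, with stub inspector classes (the names here are illustrative, not part of sos-notebook):

```python
class FailingInspector:
    def inspect(self, name, line, pos):
        raise RuntimeError("inspection failed")  # swallowed by the chain

class EmptyInspector:
    def inspect(self, name, line, pos):
        return {}  # falsy result: the chain moves on to the next inspector

class PlainTextInspector:
    def inspect(self, name, line, pos):
        return {"text/plain": f"info about {name}"}

class InspectorChain:
    def __init__(self, inspectors):
        self.inspectors = inspectors

    def inspect(self, name, line, pos):
        # return the first non-empty result; ignore inspector errors
        for c in self.inspectors:
            try:
                data = c.inspect(name, line, pos)
                if data:
                    return data
            except Exception:
                continue
        return {}

chain = InspectorChain([FailingInspector(), EmptyInspector(), PlainTextInspector()])
print(chain.inspect("x", "x = 1", 0))  # {'text/plain': 'info about x'}
```

The ordering matters: the syntax inspector runs before the variable inspector, so magic and directive help takes priority over variable previews.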
================================================
FILE: src/sos_notebook/install.py
================================================
#!/usr/bin/env python
#
# Copyright (c) Bo Peng and the University of Texas MD Anderson Cancer Center
# Distributed under the terms of the 3-clause BSD License.
import argparse
import json
import os
import shutil
import sys
from IPython.utils.tempdir import TemporaryDirectory
from jupyter_client.kernelspec import KernelSpecManager
kernel_json = {
"argv": [sys.executable, "-m", "sos_notebook.kernel", "-f", "{connection_file}"],
"display_name": "SoS",
"language": "sos",
}
def _is_root():
try:
return os.geteuid() == 0
except AttributeError:
return False # assume not an admin on non-Unix platforms
def get_install_sos_kernel_spec_parser():
parser = argparse.ArgumentParser(description="Install KernelSpec for sos Kernel")
prefix_locations = parser.add_mutually_exclusive_group()
prefix_locations.add_argument(
"--user", help="Install KernelSpec in user homedirectory", action="store_true"
)
prefix_locations.add_argument(
"--sys-prefix",
help="Install KernelSpec in sys.prefix. Useful in conda / virtualenv",
action="store_true",
dest="sys_prefix",
)
prefix_locations.add_argument(
"--prefix", help="Install KernelSpec in this prefix", default=None
)
return parser
def install_sos_kernel_spec(user, prefix):
with TemporaryDirectory() as td:
os.chmod(td, 0o755) # Starts off as 700, not user readable
with open(os.path.join(td, "kernel.json"), "w") as f:
json.dump(kernel_json, f, sort_keys=True)
shutil.copy(
os.path.join(os.path.split(__file__)[0], "logo-64x64.png"),
os.path.join(td, "logo-64x64.png"),
)
KS = KernelSpecManager()
KS.install_kernel_spec(td, "sos", user=user, prefix=prefix)
destination = KS._get_destination_dir("sos", user=user, prefix=prefix)
print(f"sos jupyter kernel spec is installed to {destination}")
def main():
parser = get_install_sos_kernel_spec_parser()
args = parser.parse_args()
user = False
prefix = None
if args.sys_prefix:
prefix = sys.prefix
elif args.prefix:
prefix = args.prefix
elif args.user or not _is_root():
user = True
install_sos_kernel_spec(user, prefix)
if __name__ == "__main__":
main()
================================================
FILE: src/sos_notebook/install_sos_notebook.sh
================================================
#!/usr/bin/bash
pip install . -U
python -m sos_notebook.install
================================================
FILE: src/sos_notebook/kernel.py
================================================
#!/usr/bin/env python3
#
# Copyright (c) Bo Peng and the University of Texas MD Anderson Cancer Center
# Distributed under the terms of the 3-clause BSD License.
import asyncio
import atexit
import contextlib
import inspect
import logging
import os
import pprint
import subprocess
import sys
import threading
import time
from collections import defaultdict
from importlib import metadata
from textwrap import dedent
import comm
import pandas as pd
from ipykernel._version import version_info as ipykernel_version_info
from ipykernel.ipkernel import IPythonKernel
from IPython.utils.tokenutil import line_at_cursor, token_at_cursor
from jupyter_client import manager
from sos._version import __sos_version__, __version__
from sos.eval import SoS_eval, interpolate
from sos.executor_utils import prepare_env
from sos.syntax import SOS_DIRECTIVE, SOS_SECTION_HEADER
from sos.utils import env, load_config_files, short_repr
from ._version import __version__ as __notebook_version__
from .comm_manager import SoSCommManager
from .completer import SoS_Completer
from .inspector import SoS_Inspector
from .magics import SoS_Magics
from .subkernel import Subkernels
from .workflow_executor import (
NotebookLoggingHandler,
execute_scratch_cell,
run_sos_workflow,
start_controller,
)
class FlushableStringIO:
def __init__(self, kernel, name, *args, **kwargs):
self.kernel = kernel
self.name = name
def write(self, content):
if content.startswith("HINT: "):
content = content.splitlines()
hint_line = content[0][6:].strip()
content = "\n".join(content[1:])
self.kernel.send_response(
self.kernel.iopub_socket,
"display_data",
{
"metadata": {},
"data": {"text/html": f'<div class="sos_hint">{hint_line}</div>'},
},
)
if content:
if self.kernel._meta["capture_result"] is not None:
self.kernel._meta["capture_result"].append(
("stream", {"name": self.name, "text": content})
)
if self.kernel._meta["render_result"] is False:
self.kernel.send_response(
self.kernel.iopub_socket,
"stream",
{"name": self.name, "text": content},
)
def flush(self):
pass
__all__ = ["SoS_Kernel"]
# translate a message to transient_display_data message
def make_transient_msg(msg_type, content):
if msg_type == "display_data":
return {
"data": content.get("data", {}),
"metadata": content.get("metadata", {}),
}
if msg_type == "stream":
if content["name"] == "stdout":
return {
"data": {
"text/plain": content["text"],
"application/vnd.jupyter.stdout": content["text"],
},
"metadata": {},
}
return {
"data": {
"text/plain": content["text"],
"application/vnd.jupyter.stderr": content["text"],
},
"metadata": {},
}
raise ValueError(
f"failed to translate message {msg_type} to transient_display_data message"
)
class SoS_Kernel(IPythonKernel):
implementation = "SOS"
implementation_version = __version__
language = "sos"
language_version = __sos_version__
language_info = {
"mimetype": "text/x-sos",
"name": "sos",
"file_extension": ".sos",
"pygments_lexer": "sos",
"codemirror_mode": "sos",
"nbconvert_exporter": "sos_notebook.converter.SoS_Exporter",
}
banner = "SoS kernel - script of scripts"
def get_supported_languages(self):
if self._supported_languages is not None:
return self._supported_languages
group = "sos_languages"
self._supported_languages = {}
for entrypoint in metadata.entry_points(group=group):
# Grab the function that is the actual plugin.
name = entrypoint.name
env.log_to_file("KERNEL", f"Found registered language {name}")
try:
plugin = entrypoint.load()
self._supported_languages[name] = plugin
except Exception as e:
env.log_to_file(
"KERNEL", f"Failed to load registered language {name}: {e}"
)
self._failed_languages[name] = e
return self._supported_languages
supported_languages = property(lambda self: self.get_supported_languages())
def get_kernel_list(self):
if not hasattr(self, "_subkernels"):
self._subkernels = Subkernels(self)
# sort kernel list by name to avoid unnecessary change of .ipynb files
return self._subkernels
subkernels = property(lambda self: self.get_kernel_list())
def get_completer(self):
if self._completer is None:
self._completer = SoS_Completer(self)
return self._completer
completer = property(lambda self: self.get_completer())
def get_inspector(self):
if self._inspector is None:
self._inspector = SoS_Inspector(self)
return self._inspector
inspector = property(lambda self: self.get_inspector())
def __init__(self, **kwargs):
super().__init__(**kwargs)
self.options = ""
self.kernel = "SoS"
# a dictionary of started kernels, with the format of
#
# 'R': ['ir', 'sos.R.sos_R', '#FFEEAABB']
#
# Note that:
#
# 'R' is the displayed name of the kernel.
# 'ir' is the kernel name.
# 'sos.R.sos_R' is the language module.
# '#FFEEAABB' is the background color
#
env.log_to_file(
"KERNEL",
f"Starting SoS Kernel version {__notebook_version__} with SoS {__version__}",
)
self.kernels = {}
self._shutting_down = False
atexit.register(self._atexit_shutdown)
# self.shell = InteractiveShell.instance()
self.format_obj = self.shell.display_formatter.format
self._meta = {
"workflow": "",
"workflow_mode": False,
"render_result": False,
"capture_result": None,
"cell_id": "0",
"notebook_name": "",
"notebook_path": "",
"use_panel": True,
"use_iopub": False,
"default_kernel": "SoS",
"cell_kernel": "SoS",
"batch_mode": False,
}
self._debug_mode = False
self._supported_languages = None
self._completer = None
self._inspector = None
self._real_execution_count = 1
self._execution_count = 1
self.frontend_comm = None
self.frontend_comm_cache = []
#
self.comm_manager = comm.get_comm_manager()
# remove the old comm_manager
self.shell.configurables.pop()
self.shell.configurables.append(self.comm_manager)
for msg_type in ["comm_open", "comm_msg", "comm_close"]:
self.shell_handlers[msg_type] = getattr(self.comm_manager, msg_type)
self.comm_manager.register_target("sos_comm", self.sos_comm)
self.my_tasks = {}
self.magics = SoS_Magics(self)
self._failed_languages = {}
# make enable_gui a no-op so that %matplotlib magics work by default #77
self.shell.enable_gui = lambda gui: None
self.editor_kernel = "sos"
# initialize env
prepare_env("")
self.original_keys = set(env.sos_dict._dict.keys()) | {
"SOS_VERSION",
"CONFIG",
"step_name",
"__builtins__",
"input",
"output",
"depends",
}
env.logger.handlers = [
x for x in env.logger.handlers if not isinstance(x, logging.StreamHandler)
]
env.logger.addHandler(
NotebookLoggingHandler(
{
0: logging.ERROR,
1: logging.WARNING,
2: logging.INFO,
3: logging.DEBUG,
4: logging.DEBUG,
None: logging.INFO,
}[env.verbosity],
kernel=self,
)
)
env.logger.print = lambda cell_id, msg, *args: (
self.send_response(
self.iopub_socket, "stream", {"name": "stdout", "text": msg}
)
if self._meta["batch_mode"]
else self.send_frontend_msg("print", [cell_id, msg])
)
self.controller = None
cell_id = property(lambda self: self._meta["cell_id"])
_workflow_mode = property(lambda self: self._meta["workflow_mode"])
def sos_comm(self, comm, msg):
# record frontend_comm to send messages
self.frontend_comm = comm
@comm.on_msg
def handle_frontend_msg(msg):
content = msg["content"]["data"]
# log_to_file(msg)
for k, v in content.items():
if k == "list-kernel":
if v:
self.subkernels.update(v)
self.subkernels.notify_frontend()
elif k == "set-editor-kernel":
self.editor_kernel = v
elif k == "cancel-workflow":
from .workflow_executor import cancel_workflow
cancel_workflow(v[0], self)
elif k == "execute-workflow":
from .workflow_executor import execute_pending_workflow
execute_pending_workflow(v, self)
elif k == "update-task-status":
if not isinstance(v, list):
env.log_to_file(
"KERNEL",
f"Failed to parse message for update-task-status {v}",
)
continue
# split by host ...
host_status = defaultdict(list)
for name in v:
try:
tqu, tid, _ = name.rsplit("_", 2)
except Exception as e:
env.log_to_file(
"KERNEL", f"Failed to parse task ID {name}: {e}"
)
# incorrect ID...
continue
host_status[tqu].append(tid)
# log_to_file(host_status)
#
from sos.hosts import Host
for tqu, tids in host_status.items():
try:
h = Host(tqu, start_engine=True)
except Exception as e:
env.log_to_file(
"KERNEL", f"Failed to connect to host {tqu}: {e}"
)
for tid in tids:
self.send_frontend_msg(
"task_status",
{
"task_id": tid,
"queue": tqu,
"status": "missing",
"duration": "",
},
)
continue
for tid, tst, tdt in h._task_engine.monitor_tasks(tids):
self.send_frontend_msg(
"task_status",
{
"task_id": tid,
"queue": tqu,
"status": tst,
"duration": tdt,
},
)
self.send_frontend_msg("update-duration", {})
elif k == "paste-table":
try:
from tabulate import tabulate
df = pd.read_clipboard()
tbl = tabulate(df, headers="keys", tablefmt="pipe")
self.send_frontend_msg("paste-table", tbl)
except Exception as e:
self.send_frontend_msg(
"alert", f"Failed to paste clipboard as table: {e}"
)
elif k == "notebook-version":
# the frontend sends its version; we currently ignore it and reply
# with the version of sos-notebook
self.send_frontend_msg("notebook-version", __notebook_version__)
else:
# this somehow does not work
self.warn(f"Unknown message {k}: {v}")
def notify_error(self, e):
msg = {
"status": "error",
"ename": e.__class__.__name__,
"evalue": str(e),
"traceback": [f"\033[91m{e}\033[0m"],
"execution_count": self._execution_count,
}
if self._meta["suppress_error"]:
self.send_response(
self.iopub_socket,
"stream",
{"name": "stderr", "text": f"{msg['ename']}: {msg['evalue']}"},
)
else:
self.send_response(self.iopub_socket, "error", msg)
return msg
def send_frontend_msg(self, msg_type, msg=None):
# if comm is never created by frontend, the kernel is in test mode without frontend
if msg_type in ("display_data", "stream"):
if self._meta["use_panel"] is False or self._meta["cell_id"] == -1:
self.send_response(
self.iopub_socket, msg_type, {} if msg is None else msg
)
elif self._meta["use_iopub"]:
self.send_response(
self.iopub_socket,
"transient_display_data",
make_transient_msg(msg_type, msg),
)
elif self.frontend_comm:
if self.frontend_comm_cache:
for mt, mg in self.frontend_comm_cache:
self.frontend_comm.send(
make_transient_msg(mt, mg),
{"msg_type": "transient_display_data"},
)
self.frontend_comm_cache = []
self.frontend_comm.send(
make_transient_msg(msg_type, msg),
{"msg_type": "transient_display_data"},
)
elif self._meta["batch_mode"]:
env.log_to_file(
"MESSAGE",
f"frontend message of type {msg_type} is sent in batch mode.",
)
else:
self.frontend_comm_cache.append([msg_type, msg])
env.log_to_file(
"MESSAGE",
f"fronten not ready or broken. Message of type {msg_type} is cached",
)
elif self.frontend_comm:
if self.frontend_comm_cache:
for mt, mg in self.frontend_comm_cache:
self.frontend_comm.send({} if mg is None else mg, {"msg_type": mt})
self.frontend_comm_cache = []
self.frontend_comm.send({} if msg is None else msg, {"msg_type": msg_type})
elif self._meta["batch_mode"]:
env.log_to_file(
"MESSAGE", f"frontend message of type {msg_type} is sent in batch mode."
)
else:
# frontend_comm is not ready
self.frontend_comm_cache.append([msg_type, msg])
env.log_to_file(
"MESSAGE",
f"fronten not ready or broken. Message of type {msg_type} is cached",
)
@contextlib.contextmanager
def redirect_sos_io(self):
save_stdout = sys.stdout
save_stderr = sys.stderr
sys.stdout = FlushableStringIO(self, "stdout")
sys.stderr = FlushableStringIO(self, "stderr")
yield
sys.stdout = save_stdout
sys.stderr = save_stderr
async def get_vars_from(self, items, from_kernel=None, explicit=False, as_var=None):
if as_var is not None:
if not isinstance(as_var, str):
self.warn("Option --as should be a string.")
return
if len(items) > 1:
self.warn("Only one expression is allowed when option --as is used")
return
if from_kernel is None or from_kernel.lower() == "sos":
# Feature removed #253
# automatically get all variables whose names start with 'sos'
# default_items = [
# x for x in env.sos_dict.keys()
# if x.startswith('sos') and x not in self.original_keys
# ]
if not items:
return
for item in items:
if item not in env.sos_dict:
self.warn(f"Variable {item} does not exist")
return
kinfo = self.subkernels.find(self.kernel)
if kinfo.language in self.supported_languages:
lan = self.supported_languages[kinfo.language]
try:
get_vars_func = lan(self, kinfo.kernel).get_vars
args = inspect.getfullargspec(get_vars_func).args
if "as_var" in args:
await get_vars_func(items, as_var=as_var)
else:
if as_var is not None:
self.warn(
f"Subkernel {kinfo.language} does not support option --as"
)
await get_vars_func(items)
except Exception as e:
self.warn(f"Failed to get variable: {e}\n")
return
elif self.kernel == "SoS":
self.warn(
"Magic %get without option --kernel can only be executed by subkernels"
)
return
else:
if explicit:
self.warn(
f"Magic %get failed because the language module for {self.kernel} is not properly installed. Please install it according to language specific instructions on the Running SoS section of the SoS homepage and restart Jupyter server."
)
return
elif self.kernel.lower() == "sos":
# if another kernel is specified and the current kernel is sos
# we get from subkernel
try:
await self.switch_kernel(from_kernel)
await self.put_vars_to(items, as_var=as_var)
except Exception as e:
self.warn(f"Failed to get {', '.join(items)} from {from_kernel}: {e}")
finally:
await self.switch_kernel("SoS")
else:
# if another kernel is specified, we should try to let that kernel pass
# the variables to this one directly
try:
my_kernel = self.kernel
await self.switch_kernel(from_kernel)
# put stuff to sos or my_kernel directly
await self.put_vars_to(
items, to_kernel=my_kernel, explicit=explicit, as_var=as_var
)
except Exception as e:
self.warn(f"Failed to get {', '.join(items)} from {from_kernel}: {e}")
finally:
# then switch back
await self.switch_kernel(my_kernel)
async def put_vars_to(self, items, to_kernel=None, explicit=False, as_var=None):
if not items:
return
if as_var is not None:
if not isinstance(as_var, str):
self.warn("Option --as should be a string.")
return
if len(items) > 1:
self.warn("Only one expression is allowed when option --as is used")
return
if self.kernel.lower() == "sos":
if to_kernel is None:
self.warn(
"Magic %put without option --kernel can only be executed by subkernels"
)
return
# if another kernel is specified and the current kernel is sos
try:
# switch to kernel and bring in items
await self.switch_kernel(to_kernel, in_vars=items, as_var=as_var)
except Exception as e:
self.warn(f"Failed to put {', '.join(items)} to {to_kernel}: {e}")
finally:
# switch back
await self.switch_kernel("SoS")
else:
# put to sos kernel or another kernel
kinfo = self.subkernels.find(self.kernel)
if kinfo.language not in self.supported_languages:
if explicit:
self.warn(f"Subkernel {self.kernel} does not support magic %put.")
return
#
lan = self.supported_languages[kinfo.language]
# pass language name to to_kernel
try:
put_vars_func = lan(self, kinfo.kernel).put_vars
args = inspect.getfullargspec(put_vars_func).args
to_kernel_name = (
self.subkernels.find(to_kernel).language if to_kernel else "SoS"
)
if "as_var" in args:
objects = put_vars_func(
items, to_kernel=to_kernel_name, as_var=as_var
)
else:
objects = put_vars_func(items, to_kernel=to_kernel_name)
except Exception as e:
# errors from the subkernel do not matter here
env.log_to_file(
"MAGIC", f"Failed to call put_var({items}) from {kinfo.kernel}: {e}"
)
objects = {}
if isinstance(objects, dict):
# returns a SOS dictionary
try:
# if the variable is passing through SoS, let us try to restore variables in SoS
if to_kernel is not None:
missing_vars = [
x for x in objects.keys() if x not in env.sos_dict
]
existing_vars = {
x: env.sos_dict[x]
for x in objects.keys()
if x in env.sos_dict
}
env.sos_dict.update(objects)
except Exception as e:
self.warn(f"Failed to put {', '.join(items)} to {to_kernel}: {e}")
return
if to_kernel is None:
return
# if another kernel is specified and the current kernel is not sos
# we need to first put to sos then to another kernel
my_kernel = self.kernel
try:
# switch to the destination kernel and bring in vars
await self.switch_kernel(
to_kernel, in_vars=[as_var] if as_var else items
)
except Exception as e:
self.warn(f"Failed to put {', '.join(items)} to {to_kernel}: {e}")
finally:
# switch back to the original kernel
await self.switch_kernel(my_kernel)
# restore sos_dict to avoid bypassing effect #252
for missing_var in missing_vars:
env.sos_dict.pop(missing_var)
env.sos_dict.update(existing_vars)
elif isinstance(objects, str):
# a statement that will be executed in the destination kernel
if to_kernel is None or to_kernel == "SoS":
# evaluate in SoS; this should rarely happen because the
# subkernel should return a dictionary for the SoS kernel
try:
exec(objects, env.sos_dict._dict)
except Exception as e:
self.warn(f"Failed to put variables {items} to SoS kernel: {e}")
return
try:
my_kernel = self.kernel
# switch to the destination kernel
await self.switch_kernel(to_kernel)
# execute the statement to pass variables directly to destination kernel
await self.run_cell(objects, True, False)
except Exception as e:
self.warn(f"Failed to put {', '.join(items)} to {to_kernel}: {e}")
finally:
# switch back to the original kernel
await self.switch_kernel(my_kernel)
else:
self.warn(
f"Unrecognized return value of type {objects.__class__.__name__} for magic %put"
)
async def expand_text_in(self, text, sigil=None, kernel="SoS"):
"""
Expand a piece of (markdown) text in specified kernel, used by
magic %expand
"""
if not text:
return ""
if sigil is None:
sigil = "{ }"
if sigil.count(" ") != 1:
raise ValueError(
f'Invalid interpolation delimiter "{sigil}": should be in the format of "L R"'
)
if sigil.split(" ")[0] not in text:
return text
if not kernel or kernel.lower() == "sos":
if sigil != "{ }":
from sos.parser import replace_sigil
text = replace_sigil(text, sigil)
return interpolate(text, local_dict=env.sos_dict._dict)
# check if the language supports expand protocol
kinfo = self.subkernels.find(kernel)
if kinfo.language not in self.supported_languages:
self.warn(f"Subkernel {kernel} does not support magic %expand --in")
return text
lan = self.supported_languages[kinfo.language](self, kinfo.kernel)
if not hasattr(lan, "expand"):
self.warn(f"Subkernel {kernel} does not support magic %expand --in")
return text
orig_kernel = self.kernel
try:
await self.switch_kernel(kernel)
return lan.expand(text, sigil)
except Exception as e:
self.warn(
f"Failed to expand {text} with sigin {sigil} in kernel {kernel}: {e}"
)
return text
finally:
await self.switch_kernel(orig_kernel)
def do_is_complete(self, code):
"""check if new line is in order"""
code = code.strip()
if not code:
return {"status": "complete", "indent": ""}
env.log_to_file("MESSAGE", f'Checking is_complete of "{code}"')
lines = code.split("\n")
# first let us remove "valid" magics
while any(line.startswith("%") or line.startswith("!") for line in lines):
for idx, line in enumerate(lines):
if line.startswith("%") or line.startswith("!"):
# a magic line ending with \ continues on the next line;
# if it is the last line, the cell is incomplete
if line.endswith("\\"):
if idx == len(lines) - 1:
return {"status": "incomplete", "indent": ""}
lines[idx] = lines[idx][:-1] + lines[idx + 1]
lines[idx + 1] = ""
else:
# valid, complete, ignore
lines[idx] = ""
if self.kernel == "SoS":
for idx, line in enumerate(lines):
# remove header
if SOS_SECTION_HEADER.match(line):
lines[idx] = ""
# remove input stuff?
if SOS_DIRECTIVE.match(line):
if any(
line.startswith(x)
for x in ("input:", "output:", "depends:", "parameter:")
):
# directive; remove the keyword and keep the value
lines[idx] = lines[idx].split(":", 1)[-1]
elif idx == len(lines) - 1:
# sh: with no script, incomplete
return {"status": "incomplete", "indent": " "}
else:
# remove the rest of them because they are embedded script
for i in range(idx, len(lines)):
lines[i] = ""
# check the rest if it is ok
try:
from IPython.core.inputtransformer2 import TransformerManager as ipf
except ImportError:
from IPython.core.inputsplitter import InputSplitter as ipf
code = "\n".join(lines) + "\n\n"
res = ipf().check_complete(code)
env.log_to_file("MESSAGE", f"SoS kernel returns {res} for code {code}")
return {"status": res[0], "indent": res[1]}
# non-SoS kernels
try:
cell_kernel = self.subkernels.find(self.editor_kernel)
if cell_kernel.name not in self.kernels:
orig_kernel = self.kernel
try:
# switch to start the new kernel
asyncio.run(self.switch_kernel(cell_kernel.name))
finally:
asyncio.run(self.switch_kernel(orig_kernel))
KC = self.kernels[cell_kernel.name][1]
# clear the shell channel
while KC.shell_channel.msg_ready():
KC.shell_channel.get_msg()
code = "\n".join(lines)
KC.is_complete(code)
msg = KC.shell_channel.get_msg()
if msg["header"]["msg_type"] == "is_complete_reply":
env.log_to_file(
"MESSAGE",
f"{self.kernel} kernel returns {msg['content']} for code {code}",
)
return msg["content"]
raise RuntimeError(
f"is_complete_reply not obtained: {msg['header']['msg_type']} {msg['content']} returned instead"
)
except Exception as e:
env.logger.debug(f"Completion fail with exception: {e}")
return {"status": "incomplete", "indent": ""}
def do_inspect(self, code, cursor_pos, detail_level=0):
if self.editor_kernel.lower() == "sos":
line, offset = line_at_cursor(code, cursor_pos)
name = token_at_cursor(code, cursor_pos)
data = self.inspector.inspect(name, line, cursor_pos - offset)
return {
"status": "ok",
"metadata": {},
"found": True if data else False,
"data": data,
}
cell_kernel = self.subkernels.find(self.editor_kernel)
try:
_, KC = self.kernels[cell_kernel.name]
except Exception as e:
env.log_to_file(
"KERNEL", f"Failed to get subkernels {cell_kernel.name}: {e}"
)
KC = self.KC
try:
KC.inspect(code, cursor_pos)
while KC.shell_channel.msg_ready():
msg = KC.shell_channel.get_msg()
if msg["header"]["msg_type"] == "inspect_reply":
return msg["content"]
# other messages; we do not know what is going on, but
# we should not wait forever and deadlock here
env.log_to_file(
"MESSAGE",
f"inspect_reply not obtained: {msg['header']['msg_type']} {msg['content']} returned instead",
)
break
except Exception as e:
env.log_to_file("KERNEL", f"Completion fail with exception: {e}")
async def do_complete(self, code, cursor_pos):
if self.editor_kernel.lower() == "sos":
text, matches = self.completer.complete_text(code, cursor_pos)
return {
"matches": matches,
"cursor_end": cursor_pos,
"cursor_start": cursor_pos - len(text),
"metadata": {},
"status": "ok",
}
try:
cell_kernel = self.subkernels.find(self.editor_kernel)
if cell_kernel.name not in self.kernels:
orig_kernel = self.kernel
try:
# switch to start the new kernel
await self.switch_kernel(cell_kernel.name)
finally:
await self.switch_kernel(orig_kernel)
KC = self.kernels[cell_kernel.name][1]
# clear the shell channel
while KC.shell_channel.msg_ready():
KC.shell_channel.get_msg()
KC.complete(code, cursor_pos)
msg = KC.shell_channel.get_msg()
if msg["header"]["msg_type"] == "complete_reply":
return msg["content"]
raise RuntimeError(
f"complete_reply not obtained: {msg['header']['msg_type']} {msg['content']} returned instead"
)
except Exception as e:
env.logger.debug(f"Completion fail with exception: {e}")
return {
"matches": [],
"cursor_end": cursor_pos,
"cursor_start": cursor_pos,
"metadata": {},
"status": "error",
}
def warn(self, message):
message = str(message).rstrip() + "\n"
if message.strip():
self.send_response(
self.iopub_socket, "stream", {"name": "stderr", "text": message}
)
async def run_cell(self, code, silent, store_history, on_error=None):
#
if not self.KM.is_alive():
self.send_response(
self.iopub_socket,
"stream",
{"name": "stdout", "text": f'Restarting kernel "{self.kernel}"\n'},
)
self.KM.restart_kernel(now=False)
self.KC = self.KM.client()
# flush stale replies, which could have been ignored, due to missed heartbeats
while self.KC.shell_channel.msg_ready():
self.KC.shell_channel.get_msg()
# executing code in another kernel.
# https://github.com/ipython/ipykernel/blob/604ee892623cca29eb495933eb5aa26bd166c7ff/ipykernel/inprocess/client.py#L94
content = {
"code": code,
"silent": silent,
"store_history": store_history,
"user_expressions": {},
"allow_stdin": False,
}
msg = self.KC.session.msg("execute_request", content)
# use the msg_id of the sos kernel for the subkernel to make sure that messages sent
# from the subkernel have the correct msg_id in their parent_header so that they can be
# displayed directly in the notebook (without using self._parent_header)
if ipykernel_version_info[0] >= 6:
msg["msg_id"] = self.get_parent()["header"]["msg_id"]
else:
msg["msg_id"] = self._parent_header["header"]["msg_id"]
msg["header"]["msg_id"] = msg["msg_id"]
self.KC.shell_channel.send(msg)
# first thing is wait for any side effects (output, stdin, etc.)
iopub_started = False
iopub_ended = False
shell_ended = False
res = None
while not (iopub_started and iopub_ended and shell_ended):
try:
# display intermediate print statements, etc.
while self.KC.stdin_channel.msg_ready():
sub_msg = self.KC.stdin_channel.get_msg()
env.log_to_file(
"MESSAGE",
f"MSG TYPE {sub_msg['header']['msg_type']} CONTENT\n {pprint.pformat(sub_msg)}",
)
if sub_msg["header"]["msg_type"] != "input_request":
self.session.send(self.stdin_socket, sub_msg)
else:
content = sub_msg["content"]
if content["password"]:
res = self.getpass(prompt=content["prompt"])
else:
res = self.raw_input(prompt=content["prompt"])
self.KC.input(res)
while self.KC.iopub_channel.msg_ready():
sub_msg = self.KC.iopub_channel.get_msg()
msg_type = sub_msg["header"]["msg_type"]
env.log_to_file(
"MESSAGE",
f"IOPUB MSG TYPE {sub_msg['header']['msg_type']} CONTENT \n {pprint.pformat(sub_msg)}",
)
if msg_type == "status":
if sub_msg["content"]["execution_state"] == "busy":
iopub_started = True
elif (
iopub_started
and sub_msg["content"]["execution_state"] == "idle"
):
iopub_ended = True
continue
if msg_type in ("execute_input", "execute_result"):
# override execution count with the master count,
# not sure if it is needed
sub_msg["content"]["execution_count"] = self._execution_count
#
if msg_type in [
"display_data",
"stream",
"execute_result",
"update_display_data",
"error",
]:
if self._meta["capture_result"] is not None:
self._meta["capture_result"].append(
(msg_type, sub_msg["content"])
)
if msg_type == "execute_result" or (
not silent and self._meta["render_result"] is False
):
if msg_type == "error" and self._meta["suppress_error"]:
self.send_response(
self.iopub_socket,
"stream",
{
"name": "stderr",
"text": f"{sub_msg['content']['ename']}: {sub_msg['content']['evalue']}",
},
)
else:
self.session.send(self.iopub_socket, sub_msg)
else:
# if the subkernel tried to create a customized comm
if msg_type == "comm_open":
self.comm_manager.register_subcomm(
sub_msg["content"]["comm_id"], self.KC, self
)
self.session.send(self.iopub_socket, sub_msg)
if self.KC.shell_channel.msg_ready():
# now get the real result
reply = self.KC.get_shell_msg()
reply["content"]["execution_count"] = self._execution_count
env.log_to_file("MESSAGE", f"GET SHELL MSG {pprint.pformat(reply)}")
res = reply["content"]
shell_ended = True
time.sleep(0.001)
except KeyboardInterrupt:
self.KM.interrupt_kernel()
return res
def get_info_of_subkernels(self):
from jupyter_client.kernelspec import KernelSpecManager
km = KernelSpecManager()
available_subkernels = """<table>
<tr>
<th>Subkernel</th>
<th>Kernel Name</th>
<th>Language</th>
<th>Language Module</th>
<th style="text-align:left">Interpreter</th>
</tr>"""
for sk in self.subkernels.kernel_list():
spec = km.get_kernel_spec(sk.kernel)
if sk.name in ("SoS", "Markdown"):
lan_module = ""
elif sk.language in self.supported_languages:
lan_module = f"<code>{self.supported_languages[sk.language].__module__.split('.')[0]}</code>"
else:
lan_module = '<font style="color:red">Unavailable</font>'
available_subkernels += f"""\
<tr>
<td>{sk.name}</td>
<td><code>{sk.kernel}</code></td>
<td>{spec.language}</td>
<td>{lan_module}</td>
<td style="text-align:left"><code>{spec.argv[0]}</code></td>
</tr>"""
available_subkernels += "</table>"
return available_subkernels
async def switch_kernel(
self,
kernel,
in_vars=None,
kernel_name=None,
language=None,
color=None,
as_var=None,
):
# switching to a non-sos kernel
if not kernel:
kinfo = self.subkernels.find(self.kernel)
self.send_response(
self.iopub_socket,
"display_data",
{"metadata": {}, "data": {"text/html": self.get_info_of_subkernels()}},
)
return
kinfo = self.subkernels.find(kernel, kernel_name, language, color)
if kinfo.name == self.kernel:
return
if kinfo.name == "SoS":
# non-SoS to SoS
if in_vars:
await self.put_vars_to(in_vars, as_var=as_var)
self.kernel = "SoS"
elif self.kernel != "SoS":
# Non-SoS to Non-SoS
await self.switch_kernel("SoS", in_vars)
await self.switch_kernel(kinfo.name, in_vars)
else:
# SoS to non-SoS
env.log_to_file("KERNEL", f"Switch from {self.kernel} to {kinfo.name}")
# case when self.kernel == 'sos', kernel != 'sos'
# to a subkernel
new_kernel = False
if kinfo.name not in self.kernels:
# start a new kernel
try:
env.log_to_file("KERNEL", f"Starting subkernel {kinfo.name}")
self.kernels[kinfo.name] = manager.start_new_kernel(
startup_timeout=30, kernel_name=kinfo.kernel, cwd=os.getcwd()
)
new_kernel = True
except Exception:
env.log_to_file(
"KERNEL",
f"Failed to start kernel {kinfo.kernel}. Trying again...",
)
                # try to get the error message
import tempfile
with tempfile.TemporaryFile() as ferr:
try:
# this should fail, but sometimes the second attempt will succeed #282
self.kernels[kinfo.name] = manager.start_new_kernel(
startup_timeout=60,
kernel_name=kinfo.kernel,
cwd=os.getcwd(),
stdout=subprocess.DEVNULL,
stderr=ferr,
)
new_kernel = True
env.log_to_file(
"KERNEL",
f"Kernel {kinfo.kernel} started with the second attempt.",
)
except Exception as e:
ferr.seek(0)
raise RuntimeError(
f'Failed to start kernel "{kernel}". {e}\nError Message:\n{ferr.read().decode()}'
) from e
self.KM, self.KC = self.kernels[kinfo.name]
self.kernel = kinfo.name
if new_kernel and not kinfo.codemirror_mode:
self.KC.kernel_info()
kinfo.codemirror_mode = self.KC.get_shell_msg(timeout=10)["content"][
"language_info"
].get("codemirror_mode", "")
self.subkernels.notify_frontend()
if new_kernel and kinfo.language in self.supported_languages:
lan_module = self.supported_languages[kinfo.language](
self, kinfo.kernel
)
init_stmts = lan_module.init_statements
if hasattr(lan_module, "__version__"):
module_version = f" (version {lan_module.__version__})"
else:
module_version = " (version unavailable)"
env.log_to_file(
"KERNEL",
f"Loading language module for kernel {kinfo.name}{module_version}",
)
if init_stmts:
await self.run_cell(init_stmts, True, False)
# passing
if in_vars:
await self.get_vars_from(in_vars, as_var=as_var)
def shutdown_kernel(self, kernel, restart=False):
kernel = self.subkernels.find(kernel).name
if kernel == "SoS":
# cannot restart myself ...
self.warn("Cannot restart SoS kernel from within SoS.")
elif kernel:
if kernel not in self.kernels:
self.send_response(
self.iopub_socket,
"stream",
{"name": "stdout", "text": f"{kernel} is not running"},
)
elif restart:
orig_kernel = self.kernel
try:
# shutdown
self.shutdown_kernel(kernel)
# switch back to kernel (start a new one)
asyncio.run(self.switch_kernel(kernel))
finally:
# finally switch to starting kernel
asyncio.run(self.switch_kernel(orig_kernel))
else:
# shutdown
if self.kernel == kernel:
asyncio.run(self.switch_kernel("SoS"))
try:
self.kernels[kernel][0].shutdown_kernel(restart=False)
except Exception as e:
self.warn(f"Failed to shutdown kernel {kernel}: {e}\n")
finally:
self.kernels.pop(kernel)
else:
self.send_response(
self.iopub_socket,
"stream",
{
"name": "stdout",
"text": "Specify one of the kernels to shutdown: SoS{}\n".format(
"".join(f", {x}" for x in self.kernels)
),
},
)
# stop_controller(self.controller)
# def get_response(self, statement, msg_types, name=None):
# return asyncio.run(self._async_get_response(statement, msg_types, name))
def get_response(self, statement, msg_types, name=None):
# get response of statement of specific msg types.
while self.KC.shell_channel.msg_ready():
self.KC.shell_channel.get_msg()
while self.KC.iopub_channel.msg_ready():
sub_msg = self.KC.iopub_channel.get_msg()
if sub_msg["header"]["msg_type"] != "status":
env.log_to_file(
"MESSAGE",
f"Overflow message in iopub {sub_msg['header']['msg_type']} {sub_msg['content']}",
)
responses = []
self.KC.execute(statement, silent=False, store_history=False)
# first thing is wait for any side effects (output, stdin, etc.)
iopub_started = False
iopub_ended = False
shell_ended = False
while not (iopub_started and iopub_ended and shell_ended):
# display intermediate print statements, etc.
while self.KC.iopub_channel.msg_ready():
sub_msg = self.KC.iopub_channel.get_msg()
msg_type = sub_msg["header"]["msg_type"]
env.log_to_file("MESSAGE", f"Received {msg_type} {sub_msg['content']}")
if msg_type == "status":
if sub_msg["content"]["execution_state"] == "busy":
iopub_started = True
elif (
iopub_started
and sub_msg["content"]["execution_state"] == "idle"
):
iopub_ended = True
continue
if msg_type in msg_types and (
name is None
or sub_msg["content"].get("name", None) in name
or any(x in name for x in sub_msg["content"].keys())
):
env.log_to_file(
"MESSAGE", f"Capture response: {msg_type}: {sub_msg['content']}"
)
responses.append([msg_type, sub_msg["content"]])
else:
env.log_to_file(
"MESSAGE", f"Non-response: {msg_type}: {sub_msg['content']}"
)
#
                    # we ignore the messages we are not interested in.
#
# self.send_response(
# self.iopub_socket, msg_type, sub_msg['content'])
if self.KC.shell_channel.msg_ready():
# now get the real result
reply = self.KC.get_shell_msg()
env.log_to_file("MESSAGE", f"GET SHELL MSG {reply}")
shell_ended = True
time.sleep(0.001)
if not responses:
env.log_to_file(
"MESSAGE",
f"Failed to get a response from message type {msg_types} for the execution of {statement}",
)
return responses
def run_sos_code(self, code, silent):
code = dedent(code)
with self.redirect_sos_io():
try:
if self._workflow_mode:
res = run_sos_workflow(
code=code,
raw_args=self.options,
kernel=self,
run_in_queue=self._workflow_mode == "nowait"
and not self._meta["batch_mode"],
)
else:
res = execute_scratch_cell(
code=code, raw_args=self.options, kernel=self
)
self.send_result(res, silent)
            except Exception:
                sys.stderr.flush()
                sys.stdout.flush()
                raise
except KeyboardInterrupt:
# this only occurs when the executor does not capture the signal.
self.warn("KeyboardInterrupt\n")
return {"status": "abort", "execution_count": self._execution_count}
finally:
sys.stderr.flush()
sys.stdout.flush()
def render_result(self, res):
if not self._meta["render_result"]:
return res
if not isinstance(res, str):
self.warn(
f"Cannot render result {short_repr(res)} in type {res.__class__.__name__} as {self._meta['render_result']}."
)
else:
# import the object from IPython.display
mod = __import__("IPython.display")
if not hasattr(mod.display, self._meta["render_result"]):
self.warn(f"Unrecognized render format {self._meta['render_result']}")
else:
func = getattr(mod.display, self._meta["render_result"])
res = func(res)
return res
def send_result(self, res, silent=False):
# this is Ok, send result back
if not silent and res is not None:
format_dict, md_dict = self.format_obj(self.render_result(res))
if self._meta["capture_result"] is not None:
self._meta["capture_result"].append(("execute_result", format_dict))
env.log_to_file(
"MESSAGE", f"IOPUB execute_result with content {format_dict}"
)
self.send_response(
self.iopub_socket,
"execute_result",
{
"execution_count": self._execution_count,
"data": format_dict,
"metadata": md_dict,
},
)
def init_metadata(self, metadata):
super().init_metadata(metadata)
env.log_to_file("KERNEL", f"GOT METADATA {metadata}")
if "sos" in metadata["metadata"]:
# jupyterlab-sos sends meta data through metadata
meta = metadata["metadata"]["sos"]
elif "sos" in metadata["content"]:
            # classic Jupyter does not use metadata but allows additional fields
            # in content
meta = metadata["content"]["sos"]
else:
            # if there is no sos metadata, the execution was likely started from a
            # test suite, so just use default values
self._meta = {
"workflow": "",
"workflow_mode": False,
"render_result": False,
"capture_result": None,
"cell_id": "0",
"notebook_name": "",
"notebook_path": "",
"use_panel": False,
"use_iopub": False,
"default_kernel": self.kernel,
"cell_kernel": self.kernel,
"batch_mode": False,
"suppress_error": False,
}
return self._meta
env.log_to_file("KERNEL", f"Meta info: {meta}")
self._meta = {
"workflow": meta["workflow"] if "workflow" in meta else "",
"workflow_mode": False,
"render_result": False,
"capture_result": None,
"cell_id": meta["cell_id"] if "cell_id" in meta else "0",
"notebook_path": meta["path"] if "path" in meta else "Untitled.ipynb",
"use_panel": "use_panel" in meta and meta["use_panel"] is True,
"use_iopub": "use_iopub" in meta and meta["use_iopub"] is True,
"default_kernel": meta["default_kernel"]
if "default_kernel" in meta
else "SoS",
"cell_kernel": meta["cell_kernel"]
if "cell_kernel" in meta
else (meta["default_kernel"] if "default_kernel" in meta else "SoS"),
"batch_mode": meta.get("batch_mode", False),
"suppress_error": False,
}
# remove path and extension
self._meta["notebook_name"] = os.path.basename(
self._meta["notebook_path"]
).rsplit(".", 1)[0]
if "list_kernel" in meta and meta["list_kernel"]:
# https://github.com/jupyter/help/issues/153#issuecomment-289026056
#
            # when the frontend is refreshed, the cached comm would be lost and
            # communication would be discontinued. However, a kernel-list
            # request would be sent by the new connection, so we reset the
            # frontend_comm to re-connect to the frontend.
self.comm_manager.register_target("sos_comm", self.sos_comm)
return self._meta
async def do_execute(
self, code, silent, store_history=True, user_expressions=None, allow_stdin=True
):
env.log_to_file("KERNEL", f"execute: {code}")
if not self.controller:
self.controller = start_controller(self)
# load basic configuration each time in case user modifies the configuration during
# runs. This is not very efficient but should not matter much during interactive
# data analysis
try:
load_config_files()
except Exception as e:
self.warn(f"Failed to load configuration files: {e}")
self._forward_input(allow_stdin)
# switch to global default kernel
try:
if (
self.subkernels.find(self._meta["default_kernel"]).name
!= self.subkernels.find(self.kernel).name
):
await self.switch_kernel(self._meta["default_kernel"])
# evaluate user expression
except Exception as e:
return self.notify_error(e)
# switch to cell kernel
try:
if (
self.subkernels.find(self._meta["cell_kernel"]).name
!= self.subkernels.find(self.kernel).name
):
await self.switch_kernel(self._meta["cell_kernel"])
except Exception as e:
return self.notify_error(e)
# execute with cell kernel
try:
ret = await self._do_execute(
code=code,
silent=silent,
store_history=store_history,
user_expressions=user_expressions,
allow_stdin=allow_stdin,
)
except Exception as e:
return self.notify_error(e)
if ret is None:
ret = {
"status": "ok",
"payload": [],
"user_expressions": {},
"execution_count": self._execution_count,
}
out = {}
for key, expr in (user_expressions or {}).items():
try:
# value = self.shell._format_user_obj(SoS_eval(expr))
value = SoS_eval(expr)
value = self.shell._format_user_obj(value)
except Exception as e:
self.warn(f"Failed to evaluate user expression {expr}: {e}")
value = self.shell._user_obj_error()
out[key] = value
ret["user_expressions"] = out
#
if not silent and store_history:
self._real_execution_count += 1
self._execution_count = self._real_execution_count
# make sure post_executed is triggered after the completion of all cell content
self.shell.user_ns.update(env.sos_dict._dict)
# trigger post processing of object and display matplotlib figures
self.shell.events.trigger("post_execute")
# tell the frontend the kernel for the "next" cell
return ret
async def _do_execute(
self, code, silent, store_history=True, user_expressions=None, allow_stdin=True
):
# handles windows/unix newline
code = "\n".join(code.splitlines()) + "\n"
if code == "import os\n_pid = os.getpid()":
            # this is a special probing command from vim-ipython. Let us handle it
            # specially so that vim-ipython can get the pid.
return
for magic in self.magics.values():
if magic.match(code):
return await magic.apply(
code, silent, store_history, user_expressions, allow_stdin
)
if self.kernel != "SoS":
# handle string interpolation before sending to the underlying kernel
if self._meta["cell_id"] != "0" and not self._meta["batch_mode"]:
self.send_frontend_msg(
"cell-kernel", [self._meta["cell_id"], self.kernel]
)
if code is None or not code.strip():
return
try:
# We remove leading new line in case that users have a SoS
# magic and a cell magic, separated by newline.
# issue #58 and #33
return await self.run_cell(code.lstrip(), silent, store_history)
except KeyboardInterrupt:
self.warn(
"KeyboardInterrupt. This will only be captured if the subkernel failed to process the signal.\n"
)
return {"status": "abort", "execution_count": self._execution_count}
else:
            # if the cell starts with comments or empty lines, remove them
lines = code.splitlines()
empties = [x.startswith("#") or not x.strip() for x in lines]
if not self._meta["batch_mode"]:
self.send_frontend_msg("cell-kernel", [self._meta["cell_id"], "SoS"])
if all(empties):
return {
"status": "ok",
"payload": [],
"user_expressions": {},
"execution_count": self._execution_count,
}
idx = empties.index(False)
if idx != 0 and (lines[idx].startswith("%") or lines[idx].startswith("!")):
# not start from empty, but might have magic etc
return await self._do_execute(
"\n".join(lines[idx:]) + "\n",
silent,
store_history,
user_expressions,
allow_stdin,
)
# if there is no more empty, magic etc, enter workflow mode
# run sos
try:
self.run_sos_code(code, silent)
return {
"status": "ok",
"payload": [],
"user_expressions": {},
"execution_count": self._execution_count,
}
except Exception as e:
return self.notify_error(e)
finally:
# even if something goes wrong, we clear output so that the "preview"
# will not be viewed by a later step.
env.sos_dict.pop("input", None)
env.sos_dict.pop("output", None)
def do_shutdown(self, restart):
if self._shutting_down:
return
self._shutting_down = True
try:
for name, (km, _) in self.kernels.items():
try:
km.shutdown_kernel(restart=restart)
except Exception as e:
self.warn(f"Failed to shutdown kernel {name}: {e}")
finally:
if not restart:
self.kernels.clear()
self._shutting_down = False
def _atexit_shutdown(self):
"""Ensure all subkernels are shut down when the process exits.
This is more reliable than __del__ which is not guaranteed to be
called during interpreter shutdown, especially with reference cycles.
"""
self.do_shutdown(False)
def __del__(self):
self.do_shutdown(False)
# there can only be one comm manager in an ipykernel process
_comm_lock = threading.Lock()
_comm_manager = None
def _get_comm_manager(*args, **kwargs):
    """Return the process-wide SoSCommManager, creating it on first use."""
global _comm_manager # noqa
if _comm_manager is None:
with _comm_lock:
if _comm_manager is None:
_comm_manager = SoSCommManager(*args, **kwargs)
return _comm_manager
comm.get_comm_manager = _get_comm_manager
if __name__ == "__main__":
from ipykernel.kernelapp import IPKernelApp
IPKernelApp.launch_instance(kernel_class=SoS_Kernel)
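The polling loops in `run_cell()` and `get_response()` above both implement the same Jupyter completion protocol: an execution only counts as finished after the iopub channel has transitioned busy -> idle *and* the shell channel has delivered its reply. The following is a minimal sketch of that state machine using toy in-memory channels (the `ToyChannel` and `drain_execution` names are illustrative, not part of the codebase or of jupyter_client):

```python
# Toy illustration of the busy -> idle + execute_reply completion protocol.
# The channel objects are in-memory stand-ins, not real jupyter_client classes.
from collections import deque


class ToyChannel:
    """In-memory stand-in for a jupyter_client channel."""

    def __init__(self, messages):
        self._queue = deque(messages)

    def msg_ready(self):
        return bool(self._queue)

    def get_msg(self):
        return self._queue.popleft()


def drain_execution(iopub, shell, msg_types=("stream", "execute_result")):
    """Collect interesting iopub messages until busy->idle and a shell reply are seen."""
    iopub_started = iopub_ended = shell_ended = False
    responses = []
    while not (iopub_started and iopub_ended and shell_ended):
        while iopub.msg_ready():
            msg = iopub.get_msg()
            msg_type = msg["header"]["msg_type"]
            if msg_type == "status":
                state = msg["content"]["execution_state"]
                if state == "busy":
                    iopub_started = True
                elif iopub_started and state == "idle":
                    iopub_ended = True
                continue
            if msg_type in msg_types:
                responses.append((msg_type, msg["content"]))
        if shell.msg_ready():
            shell.get_msg()  # the execute_reply; real code inspects its content
            shell_ended = True
    return responses
```

The real loops additionally forward stdin requests, re-stamp `msg_id`, and sleep briefly between polls; the toy version terminates immediately because all messages are pre-queued.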
================================================
FILE: src/sos_notebook/magics.py
================================================
import argparse
import builtins
import copy
import fnmatch
import os
import pydoc
import re
import shlex
import subprocess
import sys
from collections import OrderedDict
from collections.abc import Sequence, Sized
from io import StringIO
from types import ModuleType
import pandas as pd
from IPython.core.error import UsageError
from IPython.lib.clipboard import (
ClipboardEmpty,
osx_clipboard_get,
tkinter_clipboard_get,
)
from jupyter_client import find_connection_file
from sos._version import __version__
from sos.eval import interpolate
from sos.syntax import SOS_SECTION_HEADER
from sos.targets import path
from sos.utils import env, load_config_files, pexpect_run, pretty_size, short_repr
class SoS_Magic:
name = "BaseMagic"
def __init__(self, kernel):
self.sos_kernel = kernel
self.pattern = re.compile(rf"%{self.name}(\s|$)")
def _interpolate_text(self, text, quiet=False):
# interpolate command
try:
new_text = interpolate(text, local_dict=env.sos_dict._dict)
if new_text != text and not quiet:
self.sos_kernel.send_response(
self.sos_kernel.iopub_socket,
"display_data",
{
"metadata": {},
"data": {
"text/html": f'<div class="sos_hint">> {new_text.strip() + "<br>"}</div>'
},
},
)
return new_text
except Exception as e:
self.sos_kernel.warn(f"Failed to interpolate {short_repr(text)}: {e}\n")
return None
def get_magic_and_code(self, code, warn_remaining=False):
if code.startswith("%") or code.startswith("!"):
lines = re.split(r"(?<!\\)\n", code, maxsplit=1)
# remove lines joint by \
lines[0] = lines[0].replace("\\\n", "")
else:
lines = code.split("\n", 1)
pieces = self._interpolate_text(lines[0], quiet=False).strip().split(None, 1)
if len(pieces) == 2:
command_line = pieces[1]
else:
command_line = ""
remaining_code = lines[1] if len(lines) > 1 else ""
if warn_remaining and remaining_code.strip():
self.sos_kernel.warn(f"Statement {short_repr(remaining_code)} ignored")
return command_line, remaining_code
def match(self, code):
return self.pattern.match(code)
def run_shell_command(self, cmd):
        # run the shell command with output redirected to the notebook
if not cmd:
return
try:
with self.sos_kernel.redirect_sos_io():
pexpect_run(
cmd,
shell=True,
win_width=40 if self.sos_kernel._meta["cell_id"] == "" else 80,
)
except Exception as e:
self.sos_kernel.warn(e)
async def apply(self, code, silent, store_history, user_expressions, allow_stdin):
raise RuntimeError(f"Unimplemented magic {self.name}")
def _parse_error(self, msg):
self.sos_kernel.warn(msg)
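The pattern compiled in `SoS_Magic.__init__` (`rf"%{self.name}(\s|$)"`) requires the magic name to be followed by whitespace or end-of-string, so a shorter magic name never matches the prefix of a longer one. A standalone sketch of that matching rule (the `is_magic` helper is illustrative, not part of the codebase):

```python
# Demonstrates the magic-matching rule: the name must be followed by
# whitespace or end-of-string, so "%cd" does not match "%cdef".
import re


def is_magic(code, name):
    return bool(re.match(rf"%{name}(\s|$)", code))
```

Because `re.match` anchors at the start of the string, an indented `%cd` also does not match, which is consistent with magics having to appear at the beginning of a cell or line.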
class Command_Magic(SoS_Magic):
name = "!"
def match(self, code):
return code.startswith("!")
async def apply(self, code, silent, store_history, user_expressions, allow_stdin):
options, remaining_code = self.get_magic_and_code(code, False)
self.run_shell_command(code.split(" ")[0][1:] + " " + options)
return await self.sos_kernel._do_execute(
remaining_code, silent, store_history, user_expressions, allow_stdin
)
class Capture_Magic(SoS_Magic):
name = "capture"
def __init__(self, kernel):
super().__init__(kernel)
def get_parser(self):
parser = argparse.ArgumentParser(
prog="%capture",
description="""Capture output from a subkernel as variable in SoS""",
)
parser.add_argument(
"msg_type",
nargs="?",
default="raw",
choices=["stdout", "stderr", "text", "markdown", "html", "raw", "error"],
help="""Message type to capture. In terms of Jupyter message types,
"stdout" refers to "stream" message with "stdout" type, "stderr"
refers to "stream" message with "stderr" type, "text", "markdown"
and "html" refers to "display_data" or "execute_result" messages
with "text/plain", "text/markdown" and "text/html" type respectively,
and 'error' refers to "evalue" of "error" messages.
If no value or "raw" is specified, all returned messages will be
            returned in a list format, and will be displayed in the console panel.
This will help you determine the right type to capture.""",
)
parser.add_argument(
"--as",
dest="as_type",
default="text",
nargs="?",
choices=("text", "json", "csv", "tsv"),
            help="""How to interpret the captured text. This is only applicable to stdout, stderr and
text message type where the text from cell output will be collected. If this
option is given, SoS will try to parse the text as json, csv (comma separated text),
tsv (tab separated text), and store text (from text), Pandas DataFrame
(from csv or tsv), dict or other types (from json) to the variable.""",
)
grp = parser.add_mutually_exclusive_group(required=False)
grp.add_argument(
"-t",
"--to",
dest="__to__",
metavar="VAR",
            help="""Name of variable to which the captured content will be saved. If no variable is
specified, the return value will be saved to variable "__captured" and be displayed
at the side panel. """,
)
grp.add_argument(
"-a",
"--append",
dest="__append__",
metavar="VAR",
            help="""Name of variable to which the captured content will be appended.
            This option is equivalent to --to if VAR does not exist. If VAR exists
            and has the same type as the new content (str, dict, or DataFrame), the
            new content will be merged into VAR: str (concatenation), dict (update),
            or DataFrame (row concatenation). If VAR is a list, the new content
            will be appended to the end of the list.""",
)
parser.error = self._parse_error
return parser
async def apply(self, code, silent, store_history, user_expressions, allow_stdin):
options, remaining_code = self.get_magic_and_code(code, False)
parser = self.get_parser()
try:
args = parser.parse_args(shlex.split(options))
except SystemExit:
return
try:
self.sos_kernel._meta["capture_result"] = []
return await self.sos_kernel._do_execute(
remaining_code, silent, store_history, user_expressions, allow_stdin
)
finally:
# parse capture_result
content = ""
if args.msg_type == "stdout":
for msg in self.sos_kernel._meta["capture_result"]:
if msg[0] == "stream" and msg[1]["name"] == "stdout":
content += msg[1]["text"]
elif args.msg_type == "stderr":
for msg in self.sos_kernel._meta["capture_result"]:
if msg[0] == "stream" and msg[1]["name"] == "stderr":
content += msg[1]["text"]
elif args.msg_type == "text":
for msg in self.sos_kernel._meta["capture_result"]:
if (
msg[0] == "display_data"
and "data" in msg[1]
and "text/plain" in msg[1]["data"]
):
content += msg[1]["data"]["text/plain"]
elif args.msg_type == "markdown":
for msg in self.sos_kernel._meta["capture_result"]:
if (
msg[0] == "display_data"
and "data" in msg[1]
and "text/markdown" in msg[1]["data"]
):
content += msg[1]["data"]["text/markdown"]
elif args.msg_type == "html":
for msg in self.sos_kernel._meta["capture_result"]:
if (
msg[0] == "display_data"
and "data" in msg[1]
and "text/html" in msg[1]["data"]
):
content += msg[1]["data"]["text/html"]
elif args.msg_type == "error":
for msg in self.sos_kernel._meta["capture_result"]:
if msg[0] == "error" and "evalue" in msg[1]:
content += msg[1]["evalue"]
else:
args.as_type = "raw"
content = self.sos_kernel._meta["capture_result"]
env.log_to_file(
"MAGIC", f"Captured {self.sos_kernel._meta['capture_result'][:40]}"
)
if not args.as_type or args.as_type == "text":
if not isinstance(content, str):
self.sos_kernel.warn(
"Option --as is only available for message types stdout, stderr, and text."
)
elif args.as_type == "json":
import json
try:
if isinstance(content, str):
content = json.loads(content)
else:
self.sos_kernel.warn(
"Option --as is only available for message types stdout, stderr, and text."
)
except Exception as e:
self.sos_kernel.warn(
f"Failed to capture output in JSON format, text returned: {e}"
)
elif args.as_type == "csv":
try:
if isinstance(content, str):
with StringIO(content) as ifile:
content = pd.read_csv(ifile)
else:
self.sos_kernel.warn(
"Option --as is only available for message types stdout, stderr, and text."
)
except Exception as e:
self.sos_kernel.warn(
f"Failed to capture output in {args.as_type} format, text returned: {e}"
)
elif args.as_type == "tsv":
try:
if isinstance(content, str):
with StringIO(content) as ifile:
content = pd.read_csv(ifile, sep="\t")
else:
self.sos_kernel.warn(
"Option --as is only available for message types stdout, stderr, and text."
)
except Exception as e:
self.sos_kernel.warn(
f"Failed to capture output in {args.as_type} format, text returned: {e}"
)
#
if args.__to__ and not args.__to__.isidentifier():
self.sos_kernel.warn(f"Invalid variable name {args.__to__}")
self.sos_kernel._meta["capture_result"] = None
elif args.__append__ and not args.__append__.isidentifier():
self.sos_kernel.warn(f"Invalid variable name {args.__append__}")
self.sos_kernel._meta["capture_result"] = None
elif args.__to__:
env.sos_dict.set(args.__to__, content)
elif args.__append__:
if args.__append__ not in env.sos_dict:
env.sos_dict.set(args.__append__, content)
elif isinstance(env.sos_dict[args.__append__], str):
if isinstance(content, str):
env.sos_dict[args.__append__] += content
else:
self.sos_kernel.warn(
f"Cannot append new content of type {type(content).__name__} to {args.__append__} of type {type(env.sos_dict[args.__append__]).__name__}"
)
elif isinstance(env.sos_dict[args.__append__], dict):
if isinstance(content, dict):
env.sos_dict[args.__append__].update(content)
else:
self.sos_kernel.warn(
f"Cannot append new content of type {type(content).__name__} to {args.__append__} of type {type(env.sos_dict[args.__append__]).__name__}"
)
                elif isinstance(env.sos_dict[args.__append__], pd.DataFrame):
                    if isinstance(content, pd.DataFrame):
                        # DataFrame.append was removed in pandas 2.0; use pd.concat
                        env.sos_dict.set(
                            args.__append__,
                            pd.concat([env.sos_dict[args.__append__], content]),
                        )
else:
self.sos_kernel.warn(
f"Cannot append new content of type {type(content).__name__} to {args.__append__} of type {type(env.sos_dict[args.__append__]).__name__}"
)
elif isinstance(env.sos_dict[args.__append__], list):
env.sos_dict[args.__append__].append(content)
else:
self.sos_kernel.warn(
f"Cannot append new content of type {type(content).__name__} to {args.__append__} of type {type(env.sos_dict[args.__append__]).__name__}"
)
else:
env.sos_dict.set("__captured", content)
import pprint
self.sos_kernel.send_frontend_msg(
"display_data",
{
"metadata": {},
"data": {
"text/html": '<div class="sos_hint">Cell output captured to variable __captured with content</div>'
},
},
)
self.sos_kernel.send_frontend_msg(
"display_data",
{"metadata": {}, "data": {"text/plain": pprint.pformat(content)}},
)
self.sos_kernel._meta["capture_result"] = None
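The `--append` branch above merges captured content by type. A condensed sketch of those merge rules, using `pd.concat` since `DataFrame.append` was removed in pandas 2.0 (the `append_capture` function name is illustrative, not part of the codebase):

```python
# Sketch of the %capture --append merge semantics: str concatenates,
# dict updates, DataFrame rows are concatenated, and a list target
# absorbs any new content as a single element.
import pandas as pd


def append_capture(existing, content):
    if isinstance(existing, str) and isinstance(content, str):
        return existing + content
    if isinstance(existing, dict) and isinstance(content, dict):
        merged = dict(existing)
        merged.update(content)
        return merged
    if isinstance(existing, pd.DataFrame) and isinstance(content, pd.DataFrame):
        return pd.concat([existing, content], ignore_index=True)
    if isinstance(existing, list):
        return existing + [content]
    raise TypeError(
        f"Cannot append {type(content).__name__} to {type(existing).__name__}")
```

Unlike the in-place code above, this sketch returns a new object in every branch, which makes the rules easier to test in isolation.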
class Cd_Magic(SoS_Magic):
name = "cd"
def __init__(self, kernel):
super().__init__(kernel)
def get_parser(self):
parser = argparse.ArgumentParser(
prog="%cd", description="""change directory of SoS and all subkernels."""
)
parser.add_argument("dir", help="""destination directory""")
parser.error = self._parse_error
return parser
async def handle_magic_cd(self, option):
if not option:
return
to_dir = option.strip()
try:
os.chdir(path(to_dir))
self.sos_kernel.send_response(
self.sos_kernel.iopub_socket,
"stream",
{"name": "stdout", "text": os.getcwd()},
)
except Exception as e:
self.sos_kernel.warn(
f"Failed to change dir to {os.path.expanduser(to_dir)}: {e}"
)
return
#
cur_kernel = self.sos_kernel.kernel
try:
for kernel in self.sos_kernel.kernels.keys():
if kernel not in self.sos_kernel.supported_languages:
self.sos_kernel.warn(
f"Current directory of kernel {kernel} is not changed: unsupported language"
)
continue
lan = self.sos_kernel.supported_languages[kernel]
if hasattr(lan, "cd_command"):
try:
await self.sos_kernel.switch_kernel(kernel)
cmd = interpolate(lan.cd_command, {"dir": str(path(to_dir))})
await self.sos_kernel.run_cell(
cmd,
True,
False,
on_error=f"Failed to execute {cmd} in {kernel}",
)
except Exception as e:
self.sos_kernel.warn(
f"Current directory of kernel {kernel} is not changed: {e}"
)
else:
self.sos_kernel.warn(
f"Current directory of kernel {kernel} is not changed: cd_command not defined"
)
finally:
await self.sos_kernel.switch_kernel(cur_kernel)
async def apply(self, code, silent, store_history, user_expressions, allow_stdin):
options, remaining_code = self.get_magic_and_code(code, False)
parser = self.get_parser()
try:
args = parser.parse_args(shlex.split(options))
except SystemExit:
return
await self.handle_magic_cd(args.dir)
return await self.sos_kernel._do_execute(
remaining_code, silent, store_history, user_expressions, allow_stdin
)
class Clear_Magic(SoS_Magic):
name = "clear"
def __init__(self, kernel):
super().__init__(kernel)
async def apply(self, code, silent, store_history, user_expressions, allow_stdin):
self.sos_kernel.warn("Magic %clear is deprecated.")
_, remaining_code = self.get_magic_and_code(code, False)
return await self.sos_kernel._do_execute(
remaining_code, silent, store_history, user_expressions, allow_stdin
)
class ConnectInfo_Magic(SoS_Magic):
name = "connectinfo"
def __init__(self, kernel):
super().__init__(kernel)
async def apply(self, code, silent, store_history, user_expressions, allow_stdin):
_, remaining_code = self.get_magic_and_code(code, False)
cfile = find_connection_file()
with open(cfile) as conn:
conn_info = conn.read()
self.sos_kernel.send_response(
self.sos_kernel.iopub_socket,
"stream",
{
"name": "stdout",
"text": f"Connection file: {cfile}\n{conn_info}",
},
)
return await self.sos_kernel._do_execute(
remaining_code, silent, store_history, user_expressions, allow_stdin
)
class Convert_Magic(SoS_Magic):
name = "convert"
def __init__(self, kernel):
super().__init__(kernel)
def get_parser(self):
parser = argparse.ArgumentParser(
prog="%convert",
description="""Convert the current notebook to another format.""",
)
parser.add_argument(
"filename",
nargs="?",
            help="""Filename of the saved report or script. Defaults to the notebook
            name with the file extension determined by option --to.""",
)
parser.add_argument(
"-t",
"--to",
dest="__to__",
choices=["sos", "html"],
help="""Destination format, default to html.""",
)
parser.add_argument(
"-a",
"--all",
action="store_true",
help="""Convert all cells, not only sos workflow cells, to .sos file. The result
might not be a valid .sos file.""",
)
parser.add_argument(
"-f",
"--force",
action="store_true",
help="""If destination file already exists, overwrite it.""",
)
parser.add_argument(
"--template",
default="default-sos-template",
help="""Template to generate HTML output. The default template is a
template defined by configuration key default-sos-template, or
sos-report-toc if such a key does not exist.""
SYMBOL INDEX (393 symbols across 18 files)
FILE: src/sos_notebook/comm_manager.py
class CommProxyHandler (line 16) | class CommProxyHandler:
method __init__ (line 17) | def __init__(self, KC, sos_kernel):
method handle_msg (line 21) | def handle_msg(self, msg):
class SoSCommManager (line 42) | class SoSCommManager(CommManager):
method __init__ (line 50) | def __init__(self, **kwargs):
method register_subcomm (line 54) | def register_subcomm(self, comm_id, KC, sos_kernel):
method get_comm (line 57) | def get_comm(self, comm_id):
FILE: src/sos_notebook/completer.py
function last_valid (line 15) | def last_valid(line):
class SoS_MagicsCompleter (line 25) | class SoS_MagicsCompleter:
method __init__ (line 26) | def __init__(self, kernel):
method get_completions (line 29) | def get_completions(self, line):
class SoS_PathCompleter (line 61) | class SoS_PathCompleter:
method __init__ (line 66) | def __init__(self):
method get_completions (line 69) | def get_completions(self, line):
class PythonCompleter (line 84) | class PythonCompleter:
method __init__ (line 85) | def __init__(self):
method get_completions (line 88) | def get_completions(self, line):
class SoS_Completer (line 95) | class SoS_Completer:
method __init__ (line 96) | def __init__(self, kernel):
method complete_text (line 103) | def complete_text(self, code, cursor_pos=None):
FILE: src/sos_notebook/converter.py
function execute_sos_notebook (line 21) | def execute_sos_notebook(
class SoS_Exporter (line 91) | class SoS_Exporter(Exporter):
method __init__ (line 92) | def __init__(self, config=None, **kwargs):
method content_from_notebook_cell (line 97) | def content_from_notebook_cell(self, cell, fh, idx=0):
method workflow_from_notebook_cell (line 109) | def workflow_from_notebook_cell(self, cell, fh, idx=0):
method from_notebook_node (line 141) | def from_notebook_node(self, nb, resources, **kwargs):
class ScriptToNotebookConverter (line 162) | class ScriptToNotebookConverter:
method get_parser (line 163) | def get_parser(self):
method convert (line 171) | def convert(self, script_file, notebook_file, args=None, unknown_args=...
class NotebookToScriptConverter (line 306) | class NotebookToScriptConverter:
method get_parser (line 307) | def get_parser(self):
method convert (line 324) | def convert(self, notebook_file, sos_file, args=None, unknown_args=None):
function get_template_args (line 350) | def get_template_args():
function export_notebook (line 357) | def export_notebook(
function _is_int (line 431) | def _is_int(value):
function _is_float (line 441) | def _is_float(value):
function parse_papermill_parameters (line 451) | def parse_papermill_parameters(values):
class NotebookToHTMLConverter (line 473) | class NotebookToHTMLConverter:
method get_parser (line 474) | def get_parser(self):
method convert (line 514) | def convert(self, notebook_file, output_file, sargs=None, unknown_args...
class NotebookToPDFConverter (line 567) | class NotebookToPDFConverter:
method get_parser (line 568) | def get_parser(self):
method convert (line 594) | def convert(self, notebook_file, output_file, sargs=None, unknown_args...
class NotebookToMarkdownConverter (line 649) | class NotebookToMarkdownConverter:
method get_parser (line 650) | def get_parser(self):
method convert (line 669) | def convert(self, notebook_file, output_file, sargs=None, unknown_args...
class NotebookToNotebookConverter (line 701) | class NotebookToNotebookConverter:
method get_parser (line 702) | def get_parser(self):
method nonSoS_to_SoS_notebook (line 737) | def nonSoS_to_SoS_notebook(self, notebook, args):
method SoS_to_nonSoS_notebook (line 787) | def SoS_to_nonSoS_notebook(self, notebook, args):
method convert (line 843) | def convert(self, notebook_file, output_file, sargs=None, unknown_args...
FILE: src/sos_notebook/inspector.py
class SoS_VariableInspector (line 14) | class SoS_VariableInspector:
method __init__ (line 15) | def __init__(self, kernel):
method inspect (line 19) | def inspect(self, name, line, pos):
class SoS_SyntaxInspector (line 32) | class SoS_SyntaxInspector:
method __init__ (line 33) | def __init__(self, kernel):
method inspect (line 36) | def inspect(self, name, line, pos):
class SoS_Inspector (line 63) | class SoS_Inspector:
method __init__ (line 64) | def __init__(self, kernel):
method inspect (line 70) | def inspect(self, name, line, pos):
FILE: src/sos_notebook/install.py
function _is_root (line 22) | def _is_root():
function get_install_sos_kernel_spec_parser (line 29) | def get_install_sos_kernel_spec_parser():
function install_sos_kernel_spec (line 47) | def install_sos_kernel_spec(user, prefix):
function main (line 63) | def main():
FILE: src/sos_notebook/kernel.py
class FlushableStringIO (line 46) | class FlushableStringIO:
method __init__ (line 47) | def __init__(self, kernel, name, *args, **kwargs):
method write (line 51) | def write(self, content):
method flush (line 76) | def flush(self):
function make_transient_msg (line 85) | def make_transient_msg(msg_type, content):
class SoS_Kernel (line 112) | class SoS_Kernel(IPythonKernel):
method get_supported_languages (line 127) | def get_supported_languages(self):
method get_kernel_list (line 149) | def get_kernel_list(self):
method get_completer (line 158) | def get_completer(self):
method get_inspector (line 165) | def get_inspector(self):
method __init__ (line 172) | def __init__(self, **kwargs):
method sos_comm (line 275) | def sos_comm(self, comm, msg):
method notify_error (line 370) | def notify_error(self, e):
method send_frontend_msg (line 388) | def send_frontend_msg(self, msg_type, msg=None):
method redirect_sos_io (line 443) | def redirect_sos_io(self):
method get_vars_from (line 452) | async def get_vars_from(self, items, from_kernel=None, explicit=False,...
method put_vars_to (line 527) | async def put_vars_to(self, items, to_kernel=None, explicit=False, as_...
method expand_text_in (line 644) | async def expand_text_in(self, text, sigil=None, kernel="SoS"):
method do_is_complete (line 687) | def do_is_complete(self, code):
method do_inspect (line 770) | def do_inspect(self, code, cursor_pos, detail_level=0):
method do_complete (line 805) | async def do_complete(self, code, cursor_pos):
method warn (line 847) | def warn(self, message):
method run_cell (line 854) | async def run_cell(self, code, silent, store_history, on_error=None):
method get_info_of_subkernels (line 975) | def get_info_of_subkernels(self):
method switch_kernel (line 1007) | async def switch_kernel(
method shutdown_kernel (line 1108) | def shutdown_kernel(self, kernel, restart=False):
method get_response (line 1156) | def get_response(self, statement, msg_types, name=None):
method run_sos_code (line 1220) | def run_sos_code(self, code, silent):
method render_result (line 1249) | def render_result(self, res):
method send_result (line 1266) | def send_result(self, res, silent=False):
method init_metadata (line 1285) | def init_metadata(self, metadata):
method do_execute (line 1348) | async def do_execute(
method _do_execute (line 1423) | async def _do_execute(
method do_shutdown (line 1497) | def do_shutdown(self, restart):
method _atexit_shutdown (line 1512) | def _atexit_shutdown(self):
method __del__ (line 1520) | def __del__(self):
function _get_comm_manager (line 1529) | def _get_comm_manager(*args, **kwargs):
FILE: src/sos_notebook/magics.py
class SoS_Magic (line 31) | class SoS_Magic:
method __init__ (line 34) | def __init__(self, kernel):
method _interpolate_text (line 38) | def _interpolate_text(self, text, quiet=False):
method get_magic_and_code (line 58) | def get_magic_and_code(self, code, warn_remaining=False):
method match (line 76) | def match(self, code):
method run_shell_command (line 79) | def run_shell_command(self, cmd):
method apply (line 93) | async def apply(self, code, silent, store_history, user_expressions, a...
method _parse_error (line 96) | def _parse_error(self, msg):
class Command_Magic (line 100) | class Command_Magic(SoS_Magic):
method match (line 103) | def match(self, code):
method apply (line 106) | async def apply(self, code, silent, store_history, user_expressions, a...
class Capture_Magic (line 114) | class Capture_Magic(SoS_Magic):
method __init__ (line 117) | def __init__(self, kernel):
method get_parser (line 120) | def get_parser(self):
method apply (line 177) | async def apply(self, code, silent, store_history, user_expressions, a...
class Cd_Magic (line 342) | class Cd_Magic(SoS_Magic):
method __init__ (line 345) | def __init__(self, kernel):
method get_parser (line 348) | def get_parser(self):
method handle_magic_cd (line 356) | async def handle_magic_cd(self, option):
method apply (line 403) | async def apply(self, code, silent, store_history, user_expressions, a...
class Clear_Magic (line 416) | class Clear_Magic(SoS_Magic):
method __init__ (line 419) | def __init__(self, kernel):
method apply (line 422) | async def apply(self, code, silent, store_history, user_expressions, a...
class ConnectInfo_Magic (line 430) | class ConnectInfo_Magic(SoS_Magic):
method __init__ (line 433) | def __init__(self, kernel):
method apply (line 436) | async def apply(self, code, silent, store_history, user_expressions, a...
class Convert_Magic (line 454) | class Convert_Magic(SoS_Magic):
method __init__ (line 457) | def __init__(self, kernel):
method get_parser (line 460) | def get_parser(self):
method apply (line 501) | async def apply(self, code, silent, store_history, user_expressions, a...
class Debug_Magic (line 584) | class Debug_Magic(SoS_Magic):
method __init__ (line 587) | def __init__(self, kernel):
method apply (line 590) | async def apply(self, code, silent, store_history, user_expressions, a...
class Dict_Magic (line 601) | class Dict_Magic(SoS_Magic):
method __init__ (line 604) | def __init__(self, kernel):
method get_parser (line 607) | def get_parser(self):
method handle_magic_dict (line 638) | async def handle_magic_dict(self, line):
method apply (line 693) | async def apply(self, code, silent, store_history, user_expressions, a...
class Env_Magic (line 702) | class Env_Magic(SoS_Magic):
method __init__ (line 705) | def __init__(self, kernel):
method get_parser (line 708) | def get_parser(self):
method apply (line 758) | async def apply(self, code, silent, store_history, user_expressions, a...
class Expand_Magic (line 842) | class Expand_Magic(SoS_Magic):
method __init__ (line 845) | def __init__(self, kernel):
method get_parser (line 848) | def get_parser(self):
method apply (line 876) | async def apply(self, code, silent, store_history, user_expressions, a...
class Get_Magic (line 916) | class Get_Magic(SoS_Magic):
method __init__ (line 919) | def __init__(self, kernel):
method get_parser (line 922) | def get_parser(self):
method apply (line 949) | async def apply(self, code, silent, store_history, user_expressions, a...
class Matplotlib_Magic (line 967) | class Matplotlib_Magic(SoS_Magic):
method __init__ (line 970) | def __init__(self, kernel):
method get_parser (line 973) | def get_parser(self):
method apply (line 1010) | async def apply(self, code, silent, store_history, user_expressions, a...
class Paste_Magic (line 1062) | class Paste_Magic(SoS_Magic):
method __init__ (line 1065) | def __init__(self, kernel):
method apply (line 1068) | async def apply(self, code, silent, store_history, user_expressions, a...
class Preview_Magic (line 1103) | class Preview_Magic(SoS_Magic):
method __init__ (line 1106) | def __init__(self, kernel):
method preview_var (line 1110) | def preview_var(self, item, style=None):
method show_preview_result (line 1154) | def show_preview_result(self, result):
method preview_file (line 1192) | def preview_file(self, filename, style=None):
method get_parser (line 1252) | def get_parser(self):
method handle_magic_preview (line 1319) | async def handle_magic_preview(self, items, kernel=None, style=None):
method apply (line 1476) | async def apply(self, code, silent, store_history, user_expressions, a...
class Pull_Magic (line 1564) | class Pull_Magic(SoS_Magic):
method __init__ (line 1567) | def __init__(self, kernel):
method get_parser (line 1570) | def get_parser(self):
method handle_magic_pull (line 1609) | async def handle_magic_pull(self, args):
method apply (line 1635) | async def apply(self, code, silent, store_history, user_expressions, a...
class Push_Magic (line 1651) | class Push_Magic(SoS_Magic):
method __init__ (line 1654) | def __init__(self, kernel):
method get_parser (line 1657) | def get_parser(self):
method handle_magic_push (line 1694) | async def handle_magic_push(self, args):
method apply (line 1720) | async def apply(self, code, silent, store_history, user_expressions, a...
class Put_Magic (line 1736) | class Put_Magic(SoS_Magic):
method __init__ (line 1739) | def __init__(self, kernel):
method get_parser (line 1742) | def get_parser(self):
method apply (line 1769) | async def apply(self, code, silent, store_history, user_expressions, a...
class Render_Magic (line 1789) | class Render_Magic(SoS_Magic):
method __init__ (line 1792) | def __init__(self, kernel):
method get_parser (line 1795) | def get_parser(self):
method apply (line 1821) | async def apply(self, code, silent, store_history, user_expressions, a...
class Runfile_Magic (line 1863) | class Runfile_Magic(SoS_Magic):
method __init__ (line 1866) | def __init__(self, kernel):
method get_parser (line 1869) | def get_parser(self):
method apply (line 1880) | async def apply(self, code, silent, store_history, user_expressions, a...
class Revisions_Magic (line 1935) | class Revisions_Magic(SoS_Magic):
method __init__ (line 1938) | def __init__(self, kernel):
method get_parser (line 1941) | def get_parser(self):
method handle_magic_revisions (line 1971) | async def handle_magic_revisions(self, args, unknown_args):
method apply (line 2057) | async def apply(self, code, silent, store_history, user_expressions, a...
class Run_Magic (line 2073) | class Run_Magic(SoS_Magic):
method __init__ (line 2076) | def __init__(self, kernel):
method get_parser (line 2079) | def get_parser(self):
method apply (line 2089) | async def apply(self, code, silent, store_history, user_expressions, a...
class Sandbox_Magic (line 2144) | class Sandbox_Magic(SoS_Magic):
method __init__ (line 2147) | def __init__(self, kernel):
method get_parser (line 2150) | def get_parser(self):
method apply (line 2179) | async def apply(self, code, silent, store_history, user_expressions, a...
class Save_Magic (line 2227) | class Save_Magic(SoS_Magic):
method __init__ (line 2230) | def __init__(self, kernel):
method get_parser (line 2233) | def get_parser(self):
method apply (line 2269) | async def apply(self, code, silent, store_history, user_expressions, a...
class SessionInfo_Magic (line 2322) | class SessionInfo_Magic(SoS_Magic):
method __init__ (line 2325) | def __init__(self, kernel):
method get_parser (line 2328) | def get_parser(self):
method handle_sessioninfo (line 2347) | async def handle_sessioninfo(self, args):
method prepare_string (line 2428) | def prepare_string(self, item):
method apply (line 2437) | async def apply(self, code, silent, store_history, user_expressions, a...
class Set_Magic (line 2450) | class Set_Magic(SoS_Magic):
method __init__ (line 2453) | def __init__(self, kernel):
method apply (line 2456) | async def apply(self, code, silent, store_history, user_expressions, a...
class Shutdown_Magic (line 2465) | class Shutdown_Magic(SoS_Magic):
method __init__ (line 2468) | def __init__(self, kernel):
method get_parser (line 2471) | def get_parser(self):
method apply (line 2487) | async def apply(self, code, silent, store_history, user_expressions, a...
class SoSRun_Magic (line 2502) | class SoSRun_Magic(SoS_Magic):
method __init__ (line 2505) | def __init__(self, kernel):
method get_parser (line 2508) | def get_parser(self):
method apply (line 2520) | async def apply(self, code, silent, store_history, user_expressions, a...
class SoSSave_Magic (line 2561) | class SoSSave_Magic(SoS_Magic):
method __init__ (line 2564) | def __init__(self, kernel):
method get_parser (line 2567) | def get_parser(self):
method apply (line 2628) | async def apply(self, code, silent, store_history, user_expressions, a...
class Task_Magic (line 2717) | class Task_Magic(SoS_Magic):
method __init__ (line 2720) | def __init__(self, kernel):
method get_parser (line 2723) | def get_parser(self):
method status (line 2945) | def status(self, args):
method execute (line 3013) | def execute(self, args):
method kill (line 3033) | def kill(self, args):
method purge (line 3073) | def purge(self, args):
method apply (line 3123) | async def apply(self, code, silent, store_history, user_expressions, a...
class Tasks_Magic (line 3138) | class Tasks_Magic(SoS_Magic):
method __init__ (line 3141) | def __init__(self, kernel):
method get_parser (line 3144) | def get_parser(self):
method handle_tasks (line 3175) | def handle_tasks(self, tasks, queue="localhost", status=None, age=None):
method apply (line 3197) | async def apply(self, code, silent, store_history, user_expressions, a...
class Toc_Magic (line 3213) | class Toc_Magic(SoS_Magic):
method __init__ (line 3216) | def __init__(self, kernel):
method apply (line 3219) | async def apply(self, code, silent, store_history, user_expressions, a...
class Use_Magic (line 3227) | class Use_Magic(SoS_Magic):
method __init__ (line 3230) | def __init__(self, kernel):
method get_parser (line 3233) | def get_parser(self):
method apply (line 3283) | async def apply(self, code, silent, store_history, user_expressions, a...
class With_Magic (line 3314) | class With_Magic(SoS_Magic):
method __init__ (line 3317) | def __init__(self, kernel):
method get_parser (line 3320) | def get_parser(self):
method apply (line 3347) | async def apply(self, code, silent, store_history, user_expressions, a...
class SoS_Magics (line 3374) | class SoS_Magics:
method __init__ (line 3411) | def __init__(self, kernel=None):
method get (line 3414) | def get(self, name):
method values (line 3417) | def values(self):
FILE: src/sos_notebook/step_executor.py
class Interactive_Step_Executor (line 13) | class Interactive_Step_Executor(Base_Step_Executor):
method __init__ (line 14) | def __init__(self, step, mode="interactive"):
method init_input_output_vars (line 19) | def init_input_output_vars(self):
method submit_tasks (line 41) | def submit_tasks(self, tasks):
method wait_for_tasks (line 55) | def wait_for_tasks(self, tasks, all_submitted):
method run (line 96) | def run(self):
method log (line 105) | def log(self, stage=None, msg=None):
method wait_for_subworkflows (line 125) | def wait_for_subworkflows(self, workflow_results):
method handle_unknown_target (line 129) | def handle_unknown_target(self, e):
method verify_dynamic_targets (line 134) | def verify_dynamic_targets(self, targets):
FILE: src/sos_notebook/subkernel.py
class subkernel (line 7) | class subkernel:
method __init__ (line 9) | def __init__(
method __repr__ (line 27) | def __repr__(self):
class Subkernels (line 31) | class Subkernels:
method __init__ (line 33) | def __init__(self, kernel):
method kernel_list (line 125) | def kernel_list(self):
method add_or_replace (line 152) | def add_or_replace(self, kdef):
method get_background_color (line 160) | def get_background_color(self, plugin, lan):
method find (line 171) | def find(
method update (line 475) | def update(self, notebook_kernel_list):
method notify_frontend (line 493) | def notify_frontend(self):
FILE: src/sos_notebook/templates/sos-cm/parts/sos-mode.js
function findMode (line 122) | function findMode(mode) {
function findModeFromFilename (line 129) | function findModeFromFilename(filename) {
function markExpr (line 140) | function markExpr(python_mode) {
FILE: src/sos_notebook/templates/sos-lab-cm/parts/sos-mode.js
function findMode (line 122) | function findMode(mode) {
function findModeFromFilename (line 129) | function findModeFromFilename(filename) {
function markExpr (line 140) | function markExpr(python_mode) {
FILE: src/sos_notebook/test_utils.py
function start_new_kernel (line 26) | def start_new_kernel(kernel_name="python3"):
function sos_kernel (line 42) | def sos_kernel():
function flush_channels (line 52) | def flush_channels(kc=None):
function start_sos_kernel (line 64) | def start_sos_kernel():
function stop_sos_kernel (line 75) | def stop_sos_kernel():
function get_result (line 86) | def get_result(iopub):
function _async_get_result (line 91) | async def _async_get_result(iopub):
function get_display_data (line 117) | def get_display_data(iopub, data_type="text/plain"):
function _async_get_display_data (line 122) | async def _async_get_display_data(iopub, data_type):
function clear_channels (line 143) | def clear_channels(iopub):
function _async_clear_channels (line 148) | async def _async_clear_channels(iopub):
function get_std_output (line 157) | def get_std_output(iopub):
class NotebookTest (line 187) | class NotebookTest:
class Notebook (line 191) | class Notebook:
method __init__ (line 198) | def __init__(self, kernel_client=None):
method _execute_and_collect (line 202) | def _execute_and_collect(self, code):
method _switch_kernel (line 242) | def _switch_kernel(self, kernel):
method check_output (line 248) | def check_output(self, code, kernel="SoS"):
method call (line 257) | def call(self, code, kernel="SoS"):
method save (line 263) | def save(self):
method get_input_backgroundColor (line 266) | def get_input_backgroundColor(self, idx=0):
method get_cell_output (line 270) | def get_cell_output(self, idx=0):
FILE: src/sos_notebook/workflow_executor.py
class NotebookLoggingHandler (line 29) | class NotebookLoggingHandler(logging.Handler):
method __init__ (line 30) | def __init__(self, level, kernel=None, title="Log Messages"):
method setTitle (line 35) | def setTitle(self, title):
method emit (line 38) | def emit(self, record):
function start_controller (line 55) | def start_controller(kernel):
function stop_controller (line 69) | def stop_controller(controller):
function execute_scratch_cell (line 81) | def execute_scratch_cell(code, raw_args, kernel):
class Tapped_Executor (line 216) | class Tapped_Executor(mp.Process):
method __init__ (line 217) | def __init__(self, code, args, config):
method run (line 224) | def run(self):
function run_next_workflow_in_queue (line 304) | def run_next_workflow_in_queue():
function execute_pending_workflow (line 323) | def execute_pending_workflow(cell_ids, kernel):
function run_sos_workflow (line 335) | def run_sos_workflow(
function cancel_workflow (line 376) | def cancel_workflow(cell_id, kernel):
FILE: tasks.py
function check_tool (line 12) | def check_tool(ctx, tool_name):
function format (line 25) | def format(ctx, check=False):
function lint (line 51) | def lint(ctx, fix=False):
function precommit (line 78) | def precommit(ctx, install=False):
function test (line 90) | def test(ctx, path="", verbose=False, coverage=False):
function test_docker (line 130) | def test_docker(ctx):
function build (line 173) | def build(ctx, clean=False):
function clean (line 185) | def clean(ctx, all=False):
function install (line 217) | def install(ctx, dev=True, force=False):
function check (line 234) | def check(ctx):
function release_check (line 258) | def release_check(ctx):
function uv_sync (line 277) | def uv_sync(ctx):
function uv_lock (line 285) | def uv_lock(ctx):
function venv_create (line 293) | def venv_create(ctx):
function dev_setup (line 302) | def dev_setup(ctx):
function help (line 333) | def help(ctx):
FILE: test/conftest.py
function notebook (line 15) | def notebook():
function sample_scripts (line 21) | def sample_scripts():
function sample_notebook (line 51) | def sample_notebook():
function sample_papermill_notebook (line 195) | def sample_papermill_notebook():
FILE: test/test_convert.py
function test_script_to_and_from_notebook (line 13) | def test_script_to_and_from_notebook(sample_scripts):
function test_convert_html (line 27) | def test_convert_html(sample_notebook):
function test_convert_pdf (line 72) | def test_convert_pdf(sample_notebook, sample_papermill_notebook):
function test_convert_md (line 91) | def test_convert_md(sample_notebook, sample_papermill_notebook):
function test_convert_notebook (line 113) | def test_convert_notebook(sample_notebook):
function test_execute_notebook (line 137) | def test_execute_notebook(sample_papermill_notebook):
function test_execute_and_convert (line 159) | def test_execute_and_convert(sample_papermill_notebook):
function test_comments (line 181) | def test_comments(sample_notebook):
FILE: test/test_magics.py
class TestMagics (line 15) | class TestMagics(NotebookTest):
method test_magic_in_subkernel (line 16) | def test_magic_in_subkernel(self, notebook):
method test_help_messages (line 20) | def test_help_messages(self, notebook):
method test_magic_capture (line 46) | def test_magic_capture(self, notebook):
method test_magic_cd (line 182) | def test_magic_cd(self, notebook):
method test_magic_connectinfo (line 201) | def test_magic_connectinfo(self, notebook):
method test_magic_debug (line 205) | def test_magic_debug(self, notebook):
method test_magic_dict (line 215) | def test_magic_dict(self, notebook):
method test_magic_expand (line 242) | def test_magic_expand(self, notebook):
method test_magic_get (line 282) | def test_magic_get(self, notebook):
method test_magic_get_between_subkernels (line 357) | def test_magic_get_between_subkernels(self, notebook):
method test_magic_matplotlib (line 392) | def test_magic_matplotlib(self, notebook):
method test_magic_render (line 410) | def test_magic_render(self, notebook):
method test_magic_run (line 463) | def test_magic_run(self, notebook):
method test_magic_runfile (line 511) | def test_magic_runfile(self, notebook):
method test_magic_preview_dot (line 529) | def test_magic_preview_dot(self, notebook):
method test_magic_preview_in_R (line 546) | def test_magic_preview_in_R(self, notebook):
method test_magic_preview_png (line 555) | def test_magic_preview_png(self, notebook):
method test_magic_preview_jpg (line 569) | def test_magic_preview_jpg(self, notebook):
method test_magic_preview_pdf (line 585) | def test_magic_preview_pdf(self, notebook):
method test_magic_preview_pdf_as_png (line 605) | def test_magic_preview_pdf_as_png(self, notebook):
method test_magic_preview_var (line 626) | def test_magic_preview_var(self, notebook):
method test_magic_preview_var_limit (line 635) | def test_magic_preview_var_limit(self, notebook):
method test_magic_preview_csv (line 666) | def test_magic_preview_csv(self, notebook):
method test_magic_preview_txt (line 681) | def test_magic_preview_txt(self, notebook):
method test_magic_preview_zip (line 695) | def test_magic_preview_zip(self, notebook):
method test_magic_preview_tar (line 712) | def test_magic_preview_tar(self, notebook):
method test_magic_preview_tar_gz (line 726) | def test_magic_preview_tar_gz(self, notebook):
method test_magic_preview_gz (line 740) | def test_magic_preview_gz(self, notebook):
method test_magic_preview_md (line 756) | def test_magic_preview_md(self, notebook):
method test_magic_preview_html (line 772) | def test_magic_preview_html(self, notebook):
method test_magic_put (line 798) | def test_magic_put(self, notebook):
method test_magic_sandbox (line 854) | def test_magic_sandbox(self, notebook):
method test_magic_save (line 865) | def test_magic_save(self, notebook):
method test_magic_sessioninfo (line 880) | def test_magic_sessioninfo(self, notebook):
method test_magic_shell (line 918) | def test_magic_shell(self, notebook):
method test_magic_convert (line 922) | def test_magic_convert(self, notebook):
method test_magic_convert_sos (line 941) | def test_magic_convert_sos(self, notebook):
method test_magic_convert_sos_all (line 960) | def test_magic_convert_sos_all(self, notebook):
method test_magic_use (line 977) | def test_magic_use(self, notebook):
method test_magic_with (line 998) | def test_magic_with(self, notebook):
FILE: test/test_workflow.py
class TestWorkflow (line 11) | class TestWorkflow(NotebookTest):
method test_no_output (line 12) | def test_no_output(self, notebook):
method test_no_signature (line 22) | def test_no_signature(self, notebook):
method test_task (line 51) | def test_task(self, notebook):
method test_identical_task (line 78) | def test_identical_task(self, notebook):
method test_background_mode (line 101) | def test_background_mode(self, notebook):
method test_warning_from_sos (line 121) | def test_warning_from_sos(self, notebook):
Condensed preview — 115 files, each showing its path, character count, and a content snippet.
[
{
"path": ".appveyor.yml",
"chars": 1708,
"preview": "version: 1.0.{build}\n\n# docker support\n#image: Visual Studio 2017\n\n#init:\n# - ps: iex ((new-object net.webclient).Downl"
},
{
"path": ".github/linters/.python-lint",
"chars": 18985,
"preview": "[MASTER]\n# A comma-separated list of package or module names from where C extensions may\n# be loaded. Extensions are loa"
},
{
"path": ".github/workflows/pylint.yml",
"chars": 503,
"preview": "name: Pylint\n\non: [push, pull_request]\n\njobs:\n pylint:\n\n runs-on: ubuntu-latest\n\n steps:\n - uses: actions/chec"
},
{
"path": ".github/workflows/pytest.yml",
"chars": 978,
"preview": "name: Tests\n\non: [push, pull_request]\n\njobs:\n test:\n runs-on: ubuntu-latest\n strategy:\n matrix:\n pyth"
},
{
"path": ".github/workflows/python-publish.yml",
"chars": 676,
"preview": "# This workflow will upload a Python Package to PyPI when a release is created\n\nname: Upload Python Package\n\non:\n relea"
},
{
"path": ".gitignore",
"chars": 174,
"preview": ".sos\n*.swp\n__pycache__\ntmp*\nbuild\ndist\n*egg-info\n.DS_Store\n_site/\n.ipynb_checkpoints\n*.pem\n*.pyc\n.ipython/\n.jupyter/\n.vi"
},
{
"path": ".pre-commit-config.yaml",
"chars": 534,
"preview": "# See https://pre-commit.com for more information\n# See https://pre-commit.com/hooks.html for more hooks\nrepos:\n- repo"
},
{
"path": ".travis.yml",
"chars": 3416,
"preview": "dist: trusty\ngroup: edge\nos:\n - linux\n # travis does not support python on osx yet (https://github.com/travis-ci/t"
},
{
"path": "CLAUDE.md",
"chars": 4844,
"preview": "# CLAUDE.md\n\nThis file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.\n\n## "
},
{
"path": "CONTRIBUTING.md",
"chars": 10861,
"preview": "# Contributing to SoS Notebook\n\nThank you for your interest in contributing to SoS Notebook! This document provides comp"
},
{
"path": "LICENSE",
"chars": 1555,
"preview": "BSD 3-Clause License\n\nCopyright (c) Bo Peng and the University of Texas MD Anderson Cancer Center\nAll rights reserved.\n\n"
},
{
"path": "MANIFEST.in",
"chars": 286,
"preview": "include LICENSE\ninclude README.md\ninclude pyproject.toml\nrecursive-include src *.py *.js *.tpl *.png *.css *.j2 *.json\nr"
},
{
"path": "README.md",
"chars": 5099,
"preview": "[](https://anaconda.org/conda-"
},
{
"path": "development/README.md",
"chars": 169,
"preview": "1. Update docker images\n\tdocker-compose build --no-cache\n2. push docker images to dockerhub\n\tdocker push mdabioinfo/eg_"
},
{
"path": "development/docker-compose.yml",
"chars": 463,
"preview": "version: '3'\n\nservices:\n sos-notebook:\n build:\n context: ./sos_notebook_test\n image: mdabioi"
},
{
"path": "development/eg_sshd/.ssh/id_rsa",
"chars": 1679,
"preview": "-----BEGIN RSA PRIVATE KEY-----\nMIIEpAIBAAKCAQEAwOw86WfZeC8AAkhgfr0PMCqEtwTXqK4q8hOuvruQa9TN/dcG\nkUFukzfJ0XAUz4anw16PMnw"
},
{
"path": "development/eg_sshd/.ssh/id_rsa.pub",
"chars": 399,
"preview": "ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDA7DzpZ9l4LwACSGB+vQ8wKoS3BNeoriryE66+u5Br1M391waRQW6TN8nRcBTPhqfDXo8yfB866/CT3Jrk"
},
{
"path": "development/eg_sshd/.ssh/known_hosts",
"chars": 888,
"preview": "|1|ifI8r7TTuuavRoHPCWi/zRhT7Xg=|cJEU9IuPFryWuUqX7WDDGFBSwxU= ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbml"
},
{
"path": "development/eg_sshd/Dockerfile",
"chars": 1022,
"preview": "FROM python:3.6\n\nRUN apt-get update && apt-get install -y openssh-server rsync task-spooler\nRUN mkdir /var/run/sshd\nRUN "
},
{
"path": "development/install_sos_notebook.sh",
"chars": 84,
"preview": "#!/usr/bin/bash\npip install . -U\npython -m sos_notebook.install\n\n# jupyter notebook\n"
},
{
"path": "development/sos_notebook_test/.ssh/id_rsa",
"chars": 1679,
"preview": "-----BEGIN RSA PRIVATE KEY-----\nMIIEpAIBAAKCAQEAwOw86WfZeC8AAkhgfr0PMCqEtwTXqK4q8hOuvruQa9TN/dcG\nkUFukzfJ0XAUz4anw16PMnw"
},
{
"path": "development/sos_notebook_test/.ssh/id_rsa.pub",
"chars": 399,
"preview": "ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDA7DzpZ9l4LwACSGB+vQ8wKoS3BNeoriryE66+u5Br1M391waRQW6TN8nRcBTPhqfDXo8yfB866/CT3Jrk"
},
{
"path": "development/sos_notebook_test/.ssh/known_hosts",
"chars": 888,
"preview": "|1|ifI8r7TTuuavRoHPCWi/zRhT7Xg=|cJEU9IuPFryWuUqX7WDDGFBSwxU= ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbml"
},
{
"path": "development/sos_notebook_test/Dockerfile",
"chars": 1822,
"preview": "# Copyright (c) Bo Peng and the University of Texas MD Anderson Cancer Center\n# Distributed under the terms of the 3-cla"
},
{
"path": "pyproject.toml",
"chars": 3939,
"preview": "[build-system]\nrequires = [\"setuptools>=61.0\", \"wheel\"]\nbuild-backend = \"setuptools.build_meta\"\n\n[project]\nname = \"sos-n"
},
{
"path": "setup.py.old",
"chars": 3392,
"preview": "#!/usr/bin/env python\n#\n# Copyright (c) Bo Peng and the University of Texas MD Anderson Cancer Center\n# Distributed unde"
},
{
"path": "src/sos_notebook/__init__.py",
"chars": 242,
"preview": "#!/usr/bin/env python3\r\n#\r\n# Copyright (c) Bo Peng and the University of Texas MD Anderson Cancer Center\r\n# Distributed "
},
{
"path": "src/sos_notebook/_version.py",
"chars": 932,
"preview": "#!/usr/bin/env python3\n#\n# Copyright (c) Bo Peng and the University of Texas MD Anderson Cancer Center\n# Distributed und"
},
{
"path": "src/sos_notebook/comm_manager.py",
"chars": 2387,
"preview": "\"\"\"Base class to manage comms\"\"\"\n\n# Copyright (c) IPython Development Team.\n# Distributed under the terms of the Modifie"
},
{
"path": "src/sos_notebook/completer.py",
"chars": 3559,
"preview": "#!/usr/bin/env python3\n#\n# Copyright (c) Bo Peng and the University of Texas MD Anderson Cancer Center\n# Distributed und"
},
{
"path": "src/sos_notebook/converter.py",
"chars": 32279,
"preview": "#!/usr/bin/env python3\n#\n# Copyright (c) Bo Peng and the University of Texas MD Anderson Cancer Center\n# Distributed und"
},
{
"path": "src/sos_notebook/inspector.py",
"chars": 2516,
"preview": "#!/usr/bin/env python3\n#\n# Copyright (c) Bo Peng and the University of Texas MD Anderson Cancer Center\n# Distributed und"
},
{
"path": "src/sos_notebook/install.py",
"chars": 2340,
"preview": "#!/usr/bin/env python\n#\n# Copyright (c) Bo Peng and the University of Texas MD Anderson Cancer Center\n# Distributed unde"
},
{
"path": "src/sos_notebook/install_sos_notebook.sh",
"chars": 64,
"preview": "#!/usr/bin/bash\npip install . -U\npython -m sos_notebook.install\n"
},
{
"path": "src/sos_notebook/kernel.py",
"chars": 64050,
"preview": "#!/usr/bin/env python3\n#\n# Copyright (c) Bo Peng and the University of Texas MD Anderson Cancer Center\n# Distributed und"
},
{
"path": "src/sos_notebook/magics.py",
"chars": 131846,
"preview": "import argparse\nimport builtins\nimport copy\nimport fnmatch\nimport os\nimport pydoc\nimport re\nimport shlex\nimport subproce"
},
{
"path": "src/sos_notebook/step_executor.py",
"chars": 5502,
"preview": "#!/usr/bin/env python3\n#\n# Copyright (c) Bo Peng and the University of Texas MD Anderson Cancer Center\n# Distributed und"
},
{
"path": "src/sos_notebook/subkernel.py",
"chars": 20443,
"preview": "import fnmatch\nfrom importlib import metadata\n\nfrom sos.utils import env\n\n\nclass subkernel:\n # a class to information"
},
{
"path": "src/sos_notebook/templates/README.md",
"chars": 419,
"preview": "# nbconvert templates for SoS Notebooks\n\nNbconvert templates for exporting SoS Notebooks in various formats.\n\n## Basic U"
},
{
"path": "src/sos_notebook/templates/sos-cm/conf.json",
"chars": 78,
"preview": "{\n \"base_template\": \"sos-full\",\n \"mimetypes\": {\n \"text/html\": true\n }\n}\n"
},
{
"path": "src/sos_notebook/templates/sos-cm/index.html.j2",
"chars": 31,
"preview": "{% extends 'sos-cm.html.j2' %}\n"
},
{
"path": "src/sos_notebook/templates/sos-cm/parts/cm.tpl",
"chars": 3053,
"preview": "{% macro css() %}\n\n<link type=\"text/css\" rel=\"stylesheet\" href=\"https://cdnjs.cloudflare.com/ajax/libs/codemirror/5.38.0"
},
{
"path": "src/sos_notebook/templates/sos-cm/parts/sos-mode.js",
"chars": 21954,
"preview": "/**\n * Copyright (c) Bo Peng and UT MD Anderson Cancer Center\n * Distributed under the terms of the Modified BSD License"
},
{
"path": "src/sos_notebook/templates/sos-cm/sos-cm.html.j2",
"chars": 797,
"preview": "\n{% extends 'sos-full.html.j2' %}\n\n{% import 'parts/cm.tpl' as cm %}\n\n{%- block html_head -%}\n\n{{ super() | replace('<sc"
},
{
"path": "src/sos_notebook/templates/sos-cm-toc/conf.json",
"chars": 76,
"preview": "{\n \"base_template\": \"sos-cm\",\n \"mimetypes\": {\n \"text/html\": true\n }\n}\n"
},
{
"path": "src/sos_notebook/templates/sos-cm-toc/index.html.j2",
"chars": 35,
"preview": "{% extends 'sos-cm-toc.html.j2' %}\n"
},
{
"path": "src/sos_notebook/templates/sos-cm-toc/parts/toc.tpl",
"chars": 4622,
"preview": "{% macro css() %}\n\n<style>\n\n/* The Table of Contents container element */\n.toc-container {\n flex-flow: column;\n di"
},
{
"path": "src/sos_notebook/templates/sos-cm-toc/sos-cm-toc.html.j2",
"chars": 1283,
"preview": "\n{% extends 'sos-cm.html.j2' %}\n\n{% import 'parts/toc.tpl' as toc %}\n\n{% block html_head %}\n{{ super() | replace('<scrip"
},
{
"path": "src/sos_notebook/templates/sos-full/conf.json",
"chars": 77,
"preview": "{\n \"base_template\": \"classic\",\n \"mimetypes\": {\n \"text/html\": true\n }\n}\n"
},
{
"path": "src/sos_notebook/templates/sos-full/index.html.j2",
"chars": 33,
"preview": "{% extends 'sos-full.html.j2' %}\n"
},
{
"path": "src/sos_notebook/templates/sos-full/parts/preview.tpl",
"chars": 3475,
"preview": "\n{% macro css() %}\n\n<link rel=\"stylesheet\" href=\"https://cdnjs.cloudflare.com/ajax/libs/font-awesome/5.10.2/css/all.min."
},
{
"path": "src/sos_notebook/templates/sos-full/parts/sos_style.tpl",
"chars": 7239,
"preview": "\n{% macro css() %}\n\n\n<link href=\"https://cdnjs.cloudflare.com/ajax/libs/codemirror/5.42.2/codemirror.css\" rel=\"styleshee"
},
{
"path": "src/sos_notebook/templates/sos-full/sos-full.html.j2",
"chars": 1150,
"preview": "{% extends 'classic/index.html.j2' %}\n\n{% import 'parts/sos_style.tpl' as sos_style %}\n{% import 'parts/preview.tpl' as "
},
{
"path": "src/sos_notebook/templates/sos-full-toc/conf.json",
"chars": 78,
"preview": "{\n \"base_template\": \"sos-full\",\n \"mimetypes\": {\n \"text/html\": true\n }\n}\n"
},
{
"path": "src/sos_notebook/templates/sos-full-toc/index.html.j2",
"chars": 37,
"preview": "{% extends 'sos-full-toc.html.j2' %}\n"
},
{
"path": "src/sos_notebook/templates/sos-full-toc/parts/toc.tpl",
"chars": 4622,
"preview": "{% macro css() %}\n\n<style>\n\n/* The Table of Contents container element */\n.toc-container {\n flex-flow: column;\n di"
},
{
"path": "src/sos_notebook/templates/sos-full-toc/sos-full-toc.html.j2",
"chars": 1202,
"preview": "{% extends 'sos-full.html.j2' %}\n\n{% import 'parts/toc.tpl' as toc %}\n\n{% block html_head %}\n{{ super() | replace('<scri"
},
{
"path": "src/sos_notebook/templates/sos-lab-cm/conf.json",
"chars": 82,
"preview": "{\n \"base_template\": \"sos-lab-full\",\n \"mimetypes\": {\n \"text/html\": true\n }\n}\n"
},
{
"path": "src/sos_notebook/templates/sos-lab-cm/index.html.j2",
"chars": 35,
"preview": "{% extends 'sos-lab-cm.html.j2' %}\n"
},
{
"path": "src/sos_notebook/templates/sos-lab-cm/parts/cm.tpl",
"chars": 3053,
"preview": "{% macro css() %}\n\n<link type=\"text/css\" rel=\"stylesheet\" href=\"https://cdnjs.cloudflare.com/ajax/libs/codemirror/5.38.0"
},
{
"path": "src/sos_notebook/templates/sos-lab-cm/parts/sos-mode.js",
"chars": 21954,
"preview": "/**\n * Copyright (c) Bo Peng and UT MD Anderson Cancer Center\n * Distributed under the terms of the Modified BSD License"
},
{
"path": "src/sos_notebook/templates/sos-lab-cm/sos-lab-cm.html.j2",
"chars": 804,
"preview": "\n{% extends 'sos-lab-full.html.j2' %}\n\n{% import 'parts/cm.tpl' as cm %}\n\n{%- block html_head -%}\n\n{{ super() | replace("
},
{
"path": "src/sos_notebook/templates/sos-lab-full/conf.json",
"chars": 73,
"preview": "{\n \"base_template\": \"lab\",\n \"mimetypes\": {\n \"text/html\": true\n }\n}\n"
},
{
"path": "src/sos_notebook/templates/sos-lab-full/index.html.j2",
"chars": 37,
"preview": "{% extends 'sos-lab-full.html.j2' %}\n"
},
{
"path": "src/sos_notebook/templates/sos-lab-full/parts/preview.tpl",
"chars": 3475,
"preview": "\n{% macro css() %}\n\n<link rel=\"stylesheet\" href=\"https://cdnjs.cloudflare.com/ajax/libs/font-awesome/5.10.2/css/all.min."
},
{
"path": "src/sos_notebook/templates/sos-lab-full/parts/sos_style.tpl",
"chars": 6151,
"preview": "\n{% macro css() %}\n\n\n<style type=\"text/css\">\n\n/* the cell_kernel_selector will be absolute to the parent\n* of absolute "
},
{
"path": "src/sos_notebook/templates/sos-lab-full/sos-lab-full.html.j2",
"chars": 1146,
"preview": "{% extends 'lab/index.html.j2' %}\n\n{% import 'parts/sos_style.tpl' as sos_style %}\n{% import 'parts/preview.tpl' as prev"
},
{
"path": "src/sos_notebook/templates/sos-lab-full/static/index.css",
"chars": 551398,
"preview": "/*-----------------------------------------------------------------------------\n| Copyright (c) Jupyter Development Team"
},
{
"path": "src/sos_notebook/templates/sos-lab-full/static/theme-dark.css",
"chars": 15702,
"preview": "/*-----------------------------------------------------------------------------\n| Copyright (c) Jupyter Development Team"
},
{
"path": "src/sos_notebook/templates/sos-lab-full/static/theme-light.css",
"chars": 14606,
"preview": "/*-----------------------------------------------------------------------------\n| Copyright (c) Jupyter Development Team"
},
{
"path": "src/sos_notebook/templates/sos-lab-report-only/conf.json",
"chars": 80,
"preview": "{\n \"base_template\": \"sos-lab-cm\",\n \"mimetypes\": {\n \"text/html\": true\n }\n}\n"
},
{
"path": "src/sos_notebook/templates/sos-lab-report-only/index.html.j2",
"chars": 44,
"preview": "{% extends 'sos-lab-report-only.html.j2' %}\n"
},
{
"path": "src/sos_notebook/templates/sos-lab-report-only/sos-lab-report-only.html.j2",
"chars": 1804,
"preview": "{% extends 'sos-lab-cm.html.j2' %}\n\n{%- block codecell -%}\n {%- if 'scratch' in cell.metadata.get('tags', []) -%}\n {%-"
},
{
"path": "src/sos_notebook/templates/sos-markdown/conf.json",
"chars": 82,
"preview": "{\n \"base_template\": \"markdown\",\n \"mimetypes\": {\n \"text/markdown\": true\n }\n}\n"
},
{
"path": "src/sos_notebook/templates/sos-markdown/index.md.j2",
"chars": 35,
"preview": "{% extends 'sos-markdown.md.j2' %}\n"
},
{
"path": "src/sos_notebook/templates/sos-markdown/sos-markdown.md.j2",
"chars": 399,
"preview": "\n{% extends 'markdown/index.md.j2' %}\n\n\n{% block input %}\n```\n{%- if 'kernel' in cell.metadata -%}\n {{ cell.metadata."
},
{
"path": "src/sos_notebook/templates/sos-report/conf.json",
"chars": 78,
"preview": "{\n \"base_template\": \"sos-full\",\n \"mimetypes\": {\n \"text/html\": true\n }\n}\n"
},
{
"path": "src/sos_notebook/templates/sos-report/index.html.j2",
"chars": 35,
"preview": "{% extends 'sos-report.html.j2' %}\n"
},
{
"path": "src/sos_notebook/templates/sos-report/parts/control_panel.tpl",
"chars": 4322,
"preview": "{% macro css() %}\n<style type=\"text/css\">\n\n.output_stderr {\n display: none;\n}\n\n.hidden_content {\n display: none;\n}"
},
{
"path": "src/sos_notebook/templates/sos-report/sos-report.html.j2",
"chars": 1267,
"preview": "{% extends 'sos-full.html.j2' %}\n\n{% import 'parts/control_panel.tpl' as control_panel %}\n\n{% block header %}\n{{ super()"
},
{
"path": "src/sos_notebook/templates/sos-report-only/conf.json",
"chars": 76,
"preview": "{\n \"base_template\": \"sos-cm\",\n \"mimetypes\": {\n \"text/html\": true\n }\n}\n"
},
{
"path": "src/sos_notebook/templates/sos-report-only/index.html.j2",
"chars": 40,
"preview": "{% extends 'sos-report-only.html.j2' %}\n"
},
{
"path": "src/sos_notebook/templates/sos-report-only/sos-report-only.html.j2",
"chars": 1800,
"preview": "{% extends 'sos-cm.html.j2' %}\n\n{%- block codecell -%}\n {%- if 'scratch' in cell.metadata.get('tags', []) -%}\n {%- eli"
},
{
"path": "src/sos_notebook/templates/sos-report-only-toc/conf.json",
"chars": 85,
"preview": "{\n \"base_template\": \"sos-report-only\",\n \"mimetypes\": {\n \"text/html\": true\n }\n}\n"
},
{
"path": "src/sos_notebook/templates/sos-report-only-toc/index.html.j2",
"chars": 44,
"preview": "{% extends 'sos-report-only-toc.html.j2' %}\n"
},
{
"path": "src/sos_notebook/templates/sos-report-only-toc/parts/toc.tpl",
"chars": 4622,
"preview": "{% macro css() %}\n\n<style>\n\n/* The Table of Contents container element */\n.toc-container {\n flex-flow: column;\n di"
},
{
"path": "src/sos_notebook/templates/sos-report-only-toc/sos-report-only-toc.html.j2",
"chars": 1210,
"preview": "{% extends 'sos-report-only.html.j2' %}\n\n\n{% import 'parts/toc.tpl' as toc %}\n\n{% block html_head %}\n{{ super() | replac"
},
{
"path": "src/sos_notebook/templates/sos-report-toc/conf.json",
"chars": 80,
"preview": "{\n \"base_template\": \"sos-report\",\n \"mimetypes\": {\n \"text/html\": true\n }\n}\n"
},
{
"path": "src/sos_notebook/templates/sos-report-toc/index.html.j2",
"chars": 39,
"preview": "{% extends 'sos-report-toc.html.j2' %}\n"
},
{
"path": "src/sos_notebook/templates/sos-report-toc/parts/toc.tpl",
"chars": 4622,
"preview": "{% macro css() %}\n\n<style>\n\n/* The Table of Contents container element */\n.toc-container {\n flex-flow: column;\n di"
},
{
"path": "src/sos_notebook/templates/sos-report-toc/sos-report-toc.html.j2",
"chars": 1204,
"preview": "{% extends 'sos-report.html.j2' %}\n\n{% import 'parts/toc.tpl' as toc %}\n\n{% block html_head %}\n{{ super() | replace('<sc"
},
{
"path": "src/sos_notebook/templates/sos-report-toc-v2/conf.json",
"chars": 83,
"preview": "{\n \"base_template\": \"sos-report-v2\",\n \"mimetypes\": {\n \"text/html\": true\n }\n}\n"
},
{
"path": "src/sos_notebook/templates/sos-report-toc-v2/index.html.j2",
"chars": 42,
"preview": "{% extends 'sos-report-toc-v2.html.j2' %}\n"
},
{
"path": "src/sos_notebook/templates/sos-report-toc-v2/parts/toc.tpl",
"chars": 4622,
"preview": "{% macro css() %}\n\n<style>\n\n/* The Table of Contents container element */\n.toc-container {\n flex-flow: column;\n di"
},
{
"path": "src/sos_notebook/templates/sos-report-toc-v2/sos-report-toc-v2.html.j2",
"chars": 1207,
"preview": "{% extends 'sos-report-v2.html.j2' %}\n\n{% import 'parts/toc.tpl' as toc %}\n\n{% block html_head %}\n{{ super() | replace('"
},
{
"path": "src/sos_notebook/templates/sos-report-v1/conf.json",
"chars": 78,
"preview": "{\n \"base_template\": \"sos-full\",\n \"mimetypes\": {\n \"text/html\": true\n }\n}\n"
},
{
"path": "src/sos_notebook/templates/sos-report-v1/index.html.j2",
"chars": 38,
"preview": "{% extends 'sos-report-v1.html.j2' %}\n"
},
{
"path": "src/sos_notebook/templates/sos-report-v1/parts/control_panel_v1.tpl",
"chars": 2448,
"preview": "{% macro css() %}\n<style type=\"text/css\">\n\n.output_stderr {\n display: none;\n}\n\n.hidden_content {\n display: none;\n}"
},
{
"path": "src/sos_notebook/templates/sos-report-v1/sos-report-v1.html.j2",
"chars": 1270,
"preview": "{% extends 'sos-full.html.j2' %}\n\n{% import 'parts/control_panel_v1.tpl' as control_panel %}\n\n{% block header %}\n{{ supe"
},
{
"path": "src/sos_notebook/templates/sos-report-v2/conf.json",
"chars": 76,
"preview": "{\n \"base_template\": \"sos-cm\",\n \"mimetypes\": {\n \"text/html\": true\n }\n}\n"
},
{
"path": "src/sos_notebook/templates/sos-report-v2/index.html.j2",
"chars": 38,
"preview": "{% extends 'sos-report-v2.html.j2' %}\n"
},
{
"path": "src/sos_notebook/templates/sos-report-v2/parts/control_panel.tpl",
"chars": 4322,
"preview": "{% macro css() %}\n<style type=\"text/css\">\n\n.output_stderr {\n display: none;\n}\n\n.hidden_content {\n display: none;\n}"
},
{
"path": "src/sos_notebook/templates/sos-report-v2/sos-report-v2.html.j2",
"chars": 1584,
"preview": "{% extends 'sos-cm' %}\n\n{% import 'parts/control_panel.tpl' as control_panel %}\n\n{% block header %}\n{{ super() }}\n{{ con"
},
{
"path": "src/sos_notebook/test_utils.py",
"chars": 8519,
"preview": "#!/usr/bin/env python3\n#\n# Copyright (c) Bo Peng and the University of Texas MD Anderson Cancer Center\n# Distributed und"
},
{
"path": "src/sos_notebook/workflow_executor.py",
"chars": 13569,
"preview": "#!/usr/bin/env python3\n#\n# Copyright (c) Bo Peng and the University of Texas MD Anderson Cancer Center\n# Distributed und"
},
{
"path": "tasks.py",
"chars": 8685,
"preview": "\"\"\"\nDevelopment tasks for sos-notebook using invoke.\n\"\"\"\n\nimport os\nimport sys\nfrom pathlib import Path\n\nfrom invoke imp"
},
{
"path": "test/__init__.py",
"chars": 0,
"preview": ""
},
{
"path": "test/build_test_docker.sh",
"chars": 3530,
"preview": "#!/usr/bin/bash\n#\n# Copyright (c) Bo Peng and the University of Texas MD Anderson Cancer Center\n# Distributed under the "
},
{
"path": "test/conftest.py",
"chars": 5673,
"preview": "#!/usr/bin/env python3\n#\n# Copyright (c) Bo Peng and the University of Texas MD Anderson Cancer Center\n# Distributed und"
},
{
"path": "test/sample_notebook.ipynb",
"chars": 2450,
"preview": "{\n \"cells\": [\n {\n \"cell_type\": \"code\",\n \"execution_count\": 1,\n \"metadata\": {\n \"kernel\": \"SoS\"\n },\n \"output"
},
{
"path": "test/sample_papermill_notebook.ipynb",
"chars": 1887,
"preview": "{\n \"cells\": [\n {\n \"cell_type\": \"markdown\",\n \"metadata\": {\n \"kernel\": \"SoS\"\n },\n \"source\": [\n \"## Notebook"
},
{
"path": "test/sample_workflow.ipynb",
"chars": 2450,
"preview": "{\n \"cells\": [\n {\n \"cell_type\": \"code\",\n \"execution_count\": 1,\n \"metadata\": {\n \"kernel\": \"SoS\"\n },\n \"output"
},
{
"path": "test/test_convert.py",
"chars": 6458,
"preview": "#!/usr/bin/env python3\n#\n# Copyright (c) Bo Peng and the University of Texas MD Anderson Cancer Center\n# Distributed und"
},
{
"path": "test/test_magics.py",
"chars": 29354,
"preview": "#!/usr/bin/env python3\n#\n# Copyright (c) Bo Peng and the University of Texas MD Anderson Cancer Center\n# Distributed und"
},
{
"path": "test/test_workflow.py",
"chars": 3472,
"preview": "#!/usr/bin/env python3\n#\n# Copyright (c) Bo Peng and the University of Texas MD Anderson Cancer Center\n# Distributed und"
}
]
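The condensed preview above is a JSON array of objects, each with `path`, `chars`, and `preview` fields. As a minimal sketch (the sample entries below are abbreviated copies of entries from the array, not the full data), it can be loaded and sorted to decide which files are worth fetching in full:

```python
import json

# Two abbreviated entries in the same shape as the condensed preview above.
sample = '''[
  {"path": "src/sos_notebook/kernel.py", "chars": 64050, "preview": "#!/usr/bin/env python3"},
  {"path": "test/test_magics.py", "chars": 29354, "preview": "#!/usr/bin/env python3"}
]'''

entries = json.loads(sample)

# Rank files by size, largest first, to prioritize which to download in full.
largest = sorted(entries, key=lambda e: e["chars"], reverse=True)
for e in largest:
    print(f'{e["chars"]:>8}  {e["path"]}')
```

The same approach works on the full array after saving it to a `.json` file and reading it with `json.load`.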
About this extraction
This page contains the source code of the vatlab/sos-notebook GitHub repository, extracted and formatted as plain text: 115 files (1.1 MB, approximately 336.4k tokens) and a symbol index of 393 extracted functions, classes, methods, constants, and types.
Extracted by GitExtract, built by Nikandr Surkov.