pull/447/head
Tania Allard 2018-11-11 10:38:54 +00:00
commit f6a163ce9f
41 changed files with 1166 additions and 304 deletions

.gitignore vendored
View file

@ -28,3 +28,5 @@ test_file_text.txt
# Untracked artifacts from the conda script
repo2docker/buildpacks/conda/environment.py-3.5.yml
repo2docker/buildpacks/conda/environment.py-3.6.yml
\.vscode/

View file

@ -16,8 +16,8 @@ script:
# cd into tests so CWD being repo2docker does not hide
# possible issues with MANIFEST.in
- pushd tests;
if [ ${REPO_TYPE} == "r" ]; then
travis_wait pytest --cov repo2docker -v ${REPO_TYPE} || exit 1;
if [ ${REPO_TYPE} == "r" ] || [ ${REPO_TYPE} == "stencila-r" ] || [ ${REPO_TYPE} == "stencila-py" ]; then
travis_wait 30 pytest --cov repo2docker -v ${REPO_TYPE} || exit 1;
else
travis_retry pytest --cov repo2docker -v ${REPO_TYPE} || exit 1;
fi;
@ -46,9 +46,11 @@ env:
- REPO_TYPE=base
- REPO_TYPE=conda
- REPO_TYPE=venv
- REPO_TYPE=stencila
- REPO_TYPE=stencila-r
- REPO_TYPE=stencila-py
- REPO_TYPE=julia
- REPO_TYPE=r
- REPO_TYPE=nix
- REPO_TYPE=dockerfile
- REPO_TYPE=external/*
- REPO_TYPE=**/*.py

View file

@ -6,14 +6,22 @@ Release date: unknown
New features
------------
- Editable mode: allows editing a local repository from a live container [#421]
- Change log added [#426]
- Build from sub-directory: build the image based on a sub-directory of a
repository `#413`_ by `@dsludwig`_.
- Editable mode: allows editing a local repository from a live container
`#421`_ by `@evertrol`_.
- Change log added `#426`_ by `@evertrol`_.
- Documentation: improved the documentation for contributors `#453`_ by
`@choldgraf`_.
- Buildpack: added support for the nix package manager `#407`_ by
`@costrouc`_.
API changes
-----------
- Add content provider abstraction `#421`_ by `@betatim`_.
Bug fixes
---------
@ -54,3 +62,16 @@ Version 0.1
===========
Released 2017-04-14
.. _#453: https://github.com/jupyter/repo2docker/pull/453
.. _#413: https://github.com/jupyter/repo2docker/pull/413
.. _#421: https://github.com/jupyter/repo2docker/pull/421
.. _#426: https://github.com/jupyter/repo2docker/pull/426
.. _#242: https://github.com/jupyter/repo2docker/pull/242
.. _#407: https://github.com/jupyter/repo2docker/pull/407
.. _@betatim: https://github.com/betatim
.. _@choldgraf: https://github.com/choldgraf
.. _@costrouc: https://github.com/costrouc
.. _@dsludwig: https://github.com/dsludwig
.. _@evertrol: https://github.com/evertrol

View file

@ -1,260 +1,7 @@
# Contributing to repo2docker development
This document covers:
The repo2docker developer documentation can be found on these pages:
- Process for making a code contribution
- Setting up for Local Development
- Running Tests
- Updating and Freezing BuildPack Dependencies
- Updating the change log
- Creating a Release
## Process for making a code contribution
This outlines the process for getting changes to the code of
repo2docker merged. This serves as information on when a PR is "done".
Contributions should follow these guidelines:
* all changes by pull request (PR);
* please prefix the title of your pull request with `[MRG]` if the contribution
is complete and should be subjected to a detailed review;
* create a PR as early as possible, marking it with `[WIP]` while you work on
it (good to avoid duplicated work, get broad review of functionality or API,
or seek collaborators);
* a PR solves one problem (do not mix problems together in one PR) with the
minimal set of changes;
* describe why you are proposing the changes you are proposing;
* try to not rush changes (the definition of rush depends on how big your
changes are);
* someone else has to merge your PR;
* new code needs to come with a test;
* apply [PEP8](https://www.python.org/dev/peps/pep-0008/) as much
as possible, but not too much;
* no merging if travis is red;
* do use merge commits instead of merge-by-squashing/-rebasing. This makes it
easier to find all changes since the last deployment `git log --merges --pretty=format:"%h %<(10,trunc)%an %<(15)%ar %s" <deployed-revision>..`
* [when you merge do deploy to mybinder.org](http://mybinder-sre.readthedocs.io/en/latest/deployment/how.html)
These are not hard rules to be enforced by :police_car: but instead guidelines.
## Setting up for Local Development
To develop & test repo2docker locally, you need:
1. Familiarity with using a command line terminal
2. A computer running macOS / Linux
3. Some knowledge of git
4. At least python 3.6
5. Your favorite text editor
6. A recent version of [Docker Community Edition](https://www.docker.com/community-edition)
### Clone the repository
First, you need to get a copy of the repo2docker git repository on your local
disk.
```bash
git clone https://github.com/jupyter/repo2docker
```
This will clone repo2docker into a directory called `repo2docker`. You can
make that your current directory with `cd repo2docker`.
### Set up a local virtual environment
After cloning the repository (or your fork of the repository), you should set up an
isolated environment to install libraries required for running / developing
repo2docker. There are many ways to do this, and a `virtual environment` is
one of them.
```bash
python3 -m venv .
source bin/activate
pip3 install -e .
pip3 install -r dev-requirements.txt
pip3 install -r docs/doc-requirements.txt
```
This should install all the libraries required for testing & running repo2docker!
### Verify that docker is installed and running
If you do not already have [Docker](https://www.docker.com/), you should be able
to download and install it for your operating system using the links from the
[official website](https://www.docker.com/community-edition). After you have
installed it, you can verify that it is working by running the following commands:
```bash
docker version
```
It should output something like:
```
Client:
Version: 17.09.0-ce
API version: 1.32
Go version: go1.8.3
Git commit: afdb6d4
Built: Tue Sep 26 22:42:45 2017
OS/Arch: linux/amd64
Server:
Version: 17.09.0-ce
API version: 1.32 (minimum version 1.12)
Go version: go1.8.3
Git commit: afdb6d4
Built: Tue Sep 26 22:41:24 2017
OS/Arch: linux/amd64
Experimental: false
```
Then you are good to go!
## Running tests
We have a lot of tests for various cases supported by repo2docker in the `tests/`
subdirectory. If you fix a bug or add new functionality consider adding a new
test to prevent the bug from coming back. These use
[py.test](https://docs.pytest.org/).
You can run all the tests with:
```bash
py.test -s tests/*
```
If you want to run a specific test, you can do so with:
```bash
py.test -s tests/<path-to-test>
```
## Update and Freeze BuildPack Dependencies
### Updating libraries installed for all repositories
For both the `conda` and `virtualenv` (`pip`) base environments in the **Conda BuildPack** and **Python BuildPack**,
we install specific pinned versions of all dependencies. We explicitly list the dependencies
we want, then *freeze* them at commit time to explicitly list all the
transitive dependencies at current versions. This way, we know that
all dependencies will have the exact same version installed at all times.
To update one of the dependencies shared across all `repo2docker` builds, you
must follow these steps (with more detailed information in the sections below):
* Make sure you have [Docker](https://www.docker.com/) running on your computer
* Bump the version numbers of the dependencies you want to update in the `conda` environment ([link](https://github.com/jupyter/repo2docker/blob/master/CONTRIBUTING.md#conda-dependencies))
* Make a pull request with your changes ([link](https://github.com/jupyter/repo2docker/blob/master/CONTRIBUTING.md#make-a-pull-request))
See the subsections below for more detailed instructions.
### Conda dependencies
1. There are two files related to conda dependencies. Edit as needed.
- `repo2docker/buildpacks/conda/environment.yml`
Contains list of packages to install in Python3 conda environments,
which are the default. **This is where all Notebook versions &
notebook extensions (such as JupyterLab / nteract) go**.
- `repo2docker/buildpacks/conda/environment.py-2.7.yml`
Contains list of packages to install in Python2 conda environments, which
can be specifically requested by users. **This only needs `IPyKernel`
and kernel related libraries**. Notebook / Notebook Extension need
not be installed here.
2. Once you edit either of these files to add a new package / bump version on
an existing package, you should then run:
```bash
cd ./repo2docker/buildpacks/conda/
python freeze.py
```
This script will resolve dependencies and write them to the respective `.frozen.yml`
files. You will need `docker` installed to run this script.
3. After the freeze script finishes, a number of files will have been created.
Commit the following subset of files to git:
```
repo2docker/buildpacks/conda/environment.yml
repo2docker/buildpacks/conda/environment.frozen.yml
repo2docker/buildpacks/conda/environment.py-2.7.yml
repo2docker/buildpacks/conda/environment.py-2.7.frozen.yml
repo2docker/buildpacks/conda/environment.py-3.5.frozen.yml
repo2docker/buildpacks/conda/environment.py-3.6.frozen.yml
```
5. Make a pull request; see details below.
6. Once the pull request is approved (but not yet merged), Update the
change log (details below) and commit the change log, then update
the pull request.
### Change log
To add your change to the change log, find the relevant Feature/Bug
fix/API change section for the next release near the top of the file;
then add one or two sentences as a new bullet point about your
changes. Include the pull request or issue number between square
brackets at the end.
Some details:
- versioning follows the x.y.z, major.minor.bugfix numbering
- bug fixes go into the next bugfix release. If there isn't any, you
can create a new section (see point below). Don't worry if you're
not sure about that, and think it should go into a next major or
minor release: an admin will let you know, or move the change later
to the appropriate section
- API changes should preferably go into the next major release, unless
they are backward compatible (for example, a deprecated function
keyword): then they can go into the next minor release. For release
with major release 0, non-backward compatible breaking changes are
also fine for the next minor release.
- new features should go into the next minor release.
- if there is no section for the appropriate release, you can add one:
follow the versioning scheme, by simply increasing the relevant
number for one of the major /minor/bugfix numbers, appropriate for
your change (see the above bullet points); add the release
section. Then add three subsections: new features, api changes, and
bug fixes. Leave out the sections that are not appropriate for the
newlye added release section.
Release candidate versions in the change log are only temporary, and
should be superseded by either a next release candidate, or the final
release for that version (bugfix version 0).
### Make a Pull Request
Once you've made the commit, please make a Pull Request to the `jupyterhub/repo2docker`
repository, with a description of what versions were bumped / what new packages were
added and why. If you fix a bug or add new functionality consider adding a new
test to prevent the bug from coming back/the feature breaking in the future.
## Creating a Release
We try to make a release of repo2docker every few months if possible.
We follow semantic versioning.
Check hat the Change log is ready and then tag a new release on GitHub.
When the travis run completes check that the new release is available on PyPI.
* [Contributing to repo2docker](https://repo2docker.readthedocs.io/en/latest/contributing/contributing.html)
* [Our roadmap](https://repo2docker.readthedocs.io/en/latest/contributing/roadmap.html)
* [Common developer tasks and how-tos](https://repo2docker.readthedocs.io/en/latest/contributing/tasks.html)

View file

@ -7,11 +7,14 @@
the configuration files found in the repository.
See the [repo2docker documentation](http://repo2docker.readthedocs.io)
for more information.
for more information on using repo2docker.
See the [contributing guide](CONTRIBUTING.md) for information on contributing to
repo2docker.
See [our roadmap](https://repo2docker.readthedocs.io/en/latest/contributing/roadmap.html)
to learn about where the project is heading.
## Using repo2docker
### Prerequisites

View file

@ -1,4 +1,4 @@
# Architecture
# Architecture of repo2docker
This is a living document talking about the architecture of repo2docker
from various perspectives.
@ -38,8 +38,12 @@ It takes the following steps to determine this:
something that it should be used for. For example, a `BuildPack` that uses `conda` to install
libraries can check for presence of an `environment.yml` file and say 'yes, I can handle this
repository' by returning `True`. Usually buildpacks look for presence of specific files
(`requirements.txt`, `environment.yml`, `install.R`, etc) to determine if they can handle a
repository or not.
(`requirements.txt`, `environment.yml`, `install.R`, `manifest.xml`, etc.) to determine if they can handle a
repository or not. Buildpacks may also inspect specific files to determine details of the
required environment; for example, the Stencila integration extracts the required language-specific
execution contexts from an XML file (see the base `BuildPack`). More than one buildpack may use such
information, as properties can be inherited (e.g. the R buildpack uses the list of required Stencila
contexts to see if R must be installed). A minimal sketch of a `detect` method is shown after this list.
3. If no `BuildPack` returns true, then repo2docker will use the default `BuildPack` (defined in
`Repo2Docker.default_buildpack` traitlet).
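As an illustration, here is a minimal sketch of a `detect` method for a hypothetical buildpack (this is not code from repo2docker itself; the class and helper below are invented for illustration):

```python
import os


class EnvironmentYmlBuildPack:
    """Hypothetical buildpack that claims repositories containing an environment.yml."""

    def binder_path(self, path):
        # Prefer files inside a binder/ directory when one exists
        return os.path.join("binder", path) if os.path.isdir("binder") else path

    def detect(self):
        # Return True if this buildpack knows how to build the repository
        return os.path.exists(self.binder_path("environment.yml"))
```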

View file

@ -43,8 +43,15 @@ source_parsers = {
'.md': 'recommonmark.parser.CommonMarkParser',
}
from recommonmark.transform import AutoStructify
def setup(app):
app.add_stylesheet('custom.css') # may also be a URL
app.add_config_value('recommonmark_config', {
'auto_toc_tree_section': 'Contents',
}, True)
app.add_transform(AutoStructify)
# The suffix(es) of source filenames.
# You can specify multiple suffix as a list of string:
@ -110,7 +117,7 @@ html_theme_path = [alabaster_jupyterhub.get_html_theme_path()]
# relative to this directory. They are copied after the builtin static files,
# so a file named "default.css" will overwrite the builtin "default.css".
html_static_path = ['_static']
html_sidebars = { '**': ['globaltoc.html', 'relations.html', 'sourcelink.html', 'searchbox.html'] }
# -- Options for HTMLHelp output ------------------------------------------

View file

@ -107,6 +107,22 @@ You also need to have a ``runtime.txt`` file that is formatted as
``r-<YYYY>-<MM>-<DD>``, where YYYY-MM-DD is a snapshot of MRAN that will be
used for your R installation.
``manifest.xml`` - Install Stencila
===================================
`Stencila <https://stenci.la/>`_ is an open source office suite for reproducible research.
It is powered by the open file format `Dar <https://github.com/substance/dar>`_.
If your repository contains a Stencila document, repo2docker detects it based on the file ``manifest.xml``.
The required `execution contexts <https://stenci.la/learn/intro.html>`_ are extracted from a Dar article (i.e.
files named ``*.jats.xml``).
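A minimal sketch of this extraction, mirroring what the base ``BuildPack`` does with the standard library (the helper function here is hypothetical):

.. code-block:: python

    import xml.etree.ElementTree as ET

    def stencila_contexts(jats_file):
        """Return the set of execution-context languages used in a JATS article."""
        document = ET.parse(jats_file)
        # source code cells carry the language of the required execution context
        chunks = document.findall('.//code[@specific-use="source"]')
        return {chunk.get('language') for chunk in chunks}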
You may also have a ``runtime.txt`` and/or an ``install.R`` to manually configure your R installation.
To see example repositories, visit our
`Stencila with R <https://github.com/binder-examples/stencila-r/>`_ and
`Stencila with Python <https://github.com/binder-examples/stencila-py>`_ examples.
.. _postBuild:
``postBuild`` - Run code after installing the environment
@ -164,7 +180,6 @@ used for installing libraries.
To see an example R repository, visit our `R
example in binder-examples <https://github.com/binder-examples/r/blob/master/runtime.txt>`_.
``Dockerfile`` - Advanced environments
======================================
@ -179,3 +194,26 @@ With Dockerfiles, a regular Docker build will be performed.
See the `Advanced Binder Documentation <https://mybinder.readthedocs.io/en/latest/tutorials/dockerfile.html>`_ for
best-practices with Dockerfiles.
.. _default.nix:
``default.nix``
~~~~~~~~~~~~~~~
This allows you to use the `nix package manager <https://github.com/NixOS/nixpkgs>`_. It is hard to explain to new
users what nix is and why it is useful; if you are inclined, please read
more at the `nix homepage <https://nixos.org/nix/>`_. It is currently
the largest package repository, offers reproducible builds, lets multiple
versions of the same package coexist, covers both source and binary packages, and
packages many languages such as python, R, go, javascript, haskell,
ruby, etc.
A ``default.nix`` file allows you to use `nix-shell <https://nixos.org/nix/manual/#sec-nix-shell>`_
to evaluate a ``nix`` expression to define a reproducible nix environment.
The only requirement is that you expose a ``jupyter`` command within the shell
(since jupyterlab is currently what ``repo2docker`` is designed
around). While the ``nix`` environment does have ``NIX_PATH`` set with
``nixpkgs=...``, you should not rely on it; make sure to
`pin your nixpkgs <https://discourse.nixos.org/t/nixops-pinning-nixpkgs/734>`_.
By doing this you are truly producing a reproducible environment. To see an
example repository, visit the `nix binder example <https://gitlab.com/costrouc/nix-binder-example>`_.

View file

@ -0,0 +1,119 @@
# Contributing to repo2docker development
## Process for making a code contribution
This outlines the process for getting changes to the code of
repo2docker merged.
* If your change is relatively significant, **open an issue to discuss**
before spending a lot of time writing code. Getting consensus with the
community is a great way to save time later.
* Make edits in your fork of the repo2docker repository
* Submit a pull request (this is how all changes are made)
* Edit [the changelog](https://github.com/jupyter/repo2docker/blob/master/CHANGES.rst)
by appending your feature / bug fix to the development version.
* Wait for a community member to merge your changes
* (optional) Deploy a new version of repo2docker to mybinder.org by [following these steps](http://mybinder-sre.readthedocs.io/en/latest/deployment/how.html)
## Guidelines to getting a Pull Request merged
These are not hard rules to be enforced by :police_car: but instead guidelines
to help you make the most effective / efficient contribution.
* prefix the title of your pull request with `[MRG]` if the contribution
is complete and should be subjected to a detailed review;
* create a PR as early as possible, marking it with `[WIP]` while you work on
it (good to avoid duplicated work, get broad review of functionality or API,
or seek collaborators);
* a PR solves one problem (do not mix problems together in one PR) with the
minimal set of changes;
* describe why you are proposing the changes you are proposing;
* try to not rush changes (the definition of rush depends on how big your
changes are);
* someone else has to merge your PR;
* new code needs to come with a test;
* apply [PEP8](https://www.python.org/dev/peps/pep-0008/) as much
as possible, but not too much;
* no merging if travis is red;
* do use merge commits instead of merge-by-squashing/-rebasing. This makes it
easier to find all changes since the last deployment `git log --merges --pretty=format:"%h %<(10,trunc)%an %<(15)%ar %s" <deployed-revision>..`
## Setting up for Local Development
To develop & test repo2docker locally, you need:
1. Familiarity with using a command line terminal
2. A computer running macOS / Linux
3. Some knowledge of git
4. At least python 3.6
5. Your favorite text editor
6. A recent version of [Docker Community Edition](https://www.docker.com/community-edition)
### Clone the repository
First, you need to get a copy of the repo2docker git repository on your local
disk. Fork the repository on GitHub, then clone it to your computer:
```bash
git clone https://github.com/<your-username>/repo2docker
```
This will clone repo2docker into a directory called `repo2docker`. You can
make that your current directory with `cd repo2docker`.
### Set up a local virtual environment
After cloning the repository (or your fork of the repository), you should set up an
isolated environment to install libraries required for running / developing
repo2docker. There are many ways to do this, and a `virtual environment` is
one of them.
```bash
python3 -m venv .
source bin/activate
pip3 install -e .
pip3 install -r dev-requirements.txt
pip3 install -r docs/doc-requirements.txt
```
This should install all the libraries required for testing & running repo2docker!
### Verify that docker is installed and running
If you do not already have [Docker](https://www.docker.com/), you should be able
to download and install it for your operating system using the links from the
[official website](https://www.docker.com/community-edition). After you have
installed it, you can verify that it is working by running the following commands:
```bash
docker version
```
It should output something like:
```
Client:
Version: 17.09.0-ce
API version: 1.32
Go version: go1.8.3
Git commit: afdb6d4
Built: Tue Sep 26 22:42:45 2017
OS/Arch: linux/amd64
Server:
Version: 17.09.0-ce
API version: 1.32 (minimum version 1.12)
Go version: go1.8.3
Git commit: afdb6d4
Built: Tue Sep 26 22:41:24 2017
OS/Arch: linux/amd64
Experimental: false
```
Then you are good to go!

View file

@ -0,0 +1,87 @@
# The repo2docker roadmap
This roadmap collects "next steps" for repo2docker. It is about creating a
shared understanding of the project's vision and direction amongst
the community of users, contributors, and maintainers.
The goal is to communicate priorities and upcoming release plans.
It is not aimed at limiting contributions to what is listed here.
## Using the roadmap
### Sharing Feedback on the Roadmap
Everyone in the community is encouraged to provide feedback and to share new
ideas. Please do so by submitting an issue. If you want to
have an informal conversation first, use one of the other communication channels.
After submitting the issue, others from the community will probably
respond with questions or comments they have to clarify the issue. The
maintainers will help identify what a good next step is for the issue.
### What do we mean by "next step"?
When submitting an issue, think about what "next step" category best describes
your issue:
* **now**, a concrete/actionable step that is ready for someone to start work on.
These might be items that link to an issue, or something more abstract like
"decrease typos and dead links in the documentation"
* **soon**, a less concrete/actionable step that is going to happen soon;
discussions around the topic are coming close to an end, at which point it can
move into the "now" category
* **later**, abstract ideas or tasks that need a lot of discussion or
experimentation to shape the idea so that it can be executed. Can also
contain concrete/actionable steps that have been postponed on purpose
(these are steps that could be in "now" but the decision was taken to work on
them later)
### Reviewing and Updating the Roadmap
The roadmap will get updated as time passes (next review by 1st December) based
on discussions and ideas captured as issues.
This means this list should not be exhaustive; it should only represent
the "top of the stack" of ideas. It should
not function as a wish list, collection of feature requests, or todo list.
For those please create a
[new issue](https://github.com/jupyter/repo2docker/issues/new).
The roadmap should give the reader an idea of what is happening next, what needs
input and discussion before it can happen and what has been postponed.
## The roadmap proper
### Project vision
Repo2docker is a dependable tool used by humans that reduces the complexity of
creating the environment in which a piece of software can be executed.
### Now
These "Now" items are considered active areas of focus for the project:
* reduce documentation typos and syntax errors
* add Julia Manifest support (https://docs.julialang.org/en/v1/stdlib/Pkg/index.html)
* increase test coverage (see https://codecov.io/gh/jupyter/repo2docker/tree/master/repo2docker for low coverage files)
* reduce execution time of tests
* make a new release once Pipfile and nix support have been merged
### Soon
These "Soon" items are under discussion. Once an item reaches the point of an
actionable plan, the item will be moved to the "Now" section. Typically,
these will be moved at a future review of the roadmap.
* create the contributor highway, define the route from newcomer to project lead
* add support for using ZIP files as the repo (`repo2docker https://example.com/an-archive.zip`); this will give us access to several archives (like Zenodo) that expose things as ZIP files.
* add support for Zenodo (`repo2docker 10.5281/zenodo.1476680`) so Zenodo software archives can be used as the source in addition to a git repository
* tooling to make it easier to produce good `requirements.txt`, `environment.yml`, etc files. They should help users create versions of these files that have a high chance of still working in a few months
* support for running with GPU acceleration
### Later
The "Later" items are things that are at the back of the project's mind. At this
time there is no active plan for an item. The project would like to find the
resources and time to discuss these ideas.
* repo2docker in repo2docker: to reproduce an environment, users need to specify the repository and the version of repo2docker to use. Add support for repo2docker inspecting a repo and then starting a different version of itself to build the image
* support execution on a remote host (with more resources than available locally) via the command-line

View file

@ -0,0 +1,151 @@
# Common tasks
These are some common tasks to be done as a part of developing
and maintaining repo2docker. If you'd like more guidance for how
to do these things, reach out in the [JupyterHub Gitter channel](https://gitter.im/jupyterhub/jupyterhub).
## Running tests
We have a lot of tests for various cases supported by repo2docker in the `tests/`
subdirectory. If you fix a bug or add new functionality consider adding a new
test to prevent the bug from coming back. These use
[py.test](https://docs.pytest.org/).
You can run all the tests with:
```bash
py.test -s tests/*
```
If you want to run a specific test, you can do so with:
```bash
py.test -s tests/<path-to-test>
```
## Update and Freeze BuildPack Dependencies
This section covers the process by which repo2docker defines and updates the
dependencies that are installed by default for several buildpacks.
For both the `conda` and `virtualenv` (`pip`) base environments in the **Conda BuildPack** and **Python BuildPack**,
we install specific pinned versions of all dependencies. We explicitly list the dependencies
we want, then *freeze* them at commit time to explicitly list all the
transitive dependencies at current versions. This way, we know that
all dependencies will have the exact same version installed at all times.
To update one of the dependencies shared across all `repo2docker` builds, you
must follow these steps (with more detailed information in the sections below):
1. Make sure you have [Docker](https://www.docker.com/) running on your computer
2. Bump the version numbers of the dependencies you want to update in the `conda` environment ([link](https://github.com/jupyter/repo2docker/blob/master/CONTRIBUTING.md#conda-dependencies))
3. Make a pull request with your changes ([link](https://github.com/jupyter/repo2docker/blob/master/CONTRIBUTING.md#make-a-pull-request))
See the subsections below for more detailed instructions.
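Before bumping a version it can help to see what is currently pinned. A small sketch for listing the pins in a frozen file (this assumes PyYAML is available in your environment; it is not something repo2docker itself requires):

```python
import yaml

# List the package pins recorded in a frozen conda environment file
with open("repo2docker/buildpacks/conda/environment.frozen.yml") as f:
    frozen = yaml.safe_load(f)

for dep in frozen.get("dependencies", []):
    if isinstance(dep, str):  # conda pins look like "name=version=build"
        print(dep)
```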
### Conda dependencies
1. There are two files related to conda dependencies. Edit as needed.
- `repo2docker/buildpacks/conda/environment.yml`
Contains list of packages to install in Python3 conda environments,
which are the default. **This is where all Notebook versions &
notebook extensions (such as JupyterLab / nteract) go**.
- `repo2docker/buildpacks/conda/environment.py-2.7.yml`
Contains list of packages to install in Python2 conda environments, which
can be specifically requested by users. **This only needs `IPyKernel`
and kernel related libraries**. Notebook / Notebook Extension need
not be installed here.
2. Once you edit either of these files to add a new package / bump version on
an existing package, you should then run:
```bash
cd ./repo2docker/buildpacks/conda/
python freeze.py
```
This script will resolve dependencies and write them to the respective `.frozen.yml`
files. You will need `docker` installed to run this script.
3. After the freeze script finishes, a number of files will have been created.
Commit the following subset of files to git:
```
repo2docker/buildpacks/conda/environment.yml
repo2docker/buildpacks/conda/environment.frozen.yml
repo2docker/buildpacks/conda/environment.py-2.7.yml
repo2docker/buildpacks/conda/environment.py-2.7.frozen.yml
repo2docker/buildpacks/conda/environment.py-3.5.frozen.yml
repo2docker/buildpacks/conda/environment.py-3.6.frozen.yml
```
4. Make a pull request; see details below.
5. Once the pull request is approved (but not yet merged), update the
change log (details below) and commit the change log, then update
the pull request.
### Make a Pull Request
Once you've made the commit, please make a Pull Request to the `jupyterhub/repo2docker`
repository, with a description of what versions were bumped / what new packages were
added and why. If you fix a bug or add new functionality consider adding a new
test to prevent the bug from coming back/the feature breaking in the future.
## Creating a Release
We try to make a release of repo2docker every few months if possible.
We follow semantic versioning.
Check that the Change log is ready and then tag a new release on GitHub.
When the travis run completes, check that the new release is available on PyPI.
### Update the change log
To add your change to the change log, find the relevant Feature/Bug
fix/API change section for the next release near the top of the file;
then add one or two sentences as a new bullet point about your
changes. Include the pull request or issue number between square
brackets at the end.
Some details:
- versioning follows the x.y.z, major.minor.bugfix numbering
- bug fixes go into the next bugfix release. If there isn't any, you
can create a new section (see point below). Don't worry if you're
not sure about that, and think it should go into a next major or
minor release: an admin will let you know, or move the change later
to the appropriate section
- API changes should preferably go into the next major release, unless
they are backward compatible (for example, a deprecated function
keyword): then they can go into the next minor release. For releases
with major version 0, non-backward compatible breaking changes are
also fine for the next minor release.
- new features should go into the next minor release.
- if there is no section for the appropriate release, you can add one:
follow the versioning scheme by simply increasing the relevant
major/minor/bugfix number, as appropriate for
your change (see the above bullet points); add the release
section. Then add three subsections: new features, API changes, and
bug fixes. Leave out the sections that are not appropriate for the
newly added release section.
Release candidate versions in the change log are only temporary, and
should be superseded by either a next release candidate, or the final
release for that version (bugfix version 0).

View file

@ -1,4 +1,4 @@
# Design
# Design of repo2docker
The repo2docker buildpacks are inspired by
[Heroku's Build Packs](https://devcenter.heroku.com/articles/buildpacks).

View file

@ -31,10 +31,29 @@ To learn more about URLs in JupyterLab and Jupyter Notebook, visit
RStudio
-------
The RStudio user interface is automatically enabled a configuration file for
R is detected (an R version specified in ``runtime.txt``). If this is detected,
The RStudio user interface is automatically enabled if a configuration file for
R is detected (i.e. an R version specified in ``runtime.txt``). If this is detected,
RStudio will be accessible by appending ``/rstudio`` to the URL, like so:
.. code-block:: none
http(s)://<server:port>/rstudio
Stencila
--------
The Stencila user interface is automatically enabled if a Stencila document (i.e.
a file `manifest.xml`) is detected. Stencila will be accessible by appending
``/stencila`` to the URL, like so:
.. code-block:: none
http(s)://<server:port>/stencila
The editor will open the Stencila document corresponding to the last `manifest.xml`
found in the file tree. If you want to open a different document, you can configure
the path in the URL parameter `archive`:
.. code-block:: none
http(s)://<server:port>/stencila/?archive=other-dir

View file

@ -21,6 +21,7 @@ Please report `Bugs <https://github.com/jupyter/repo2docker/issues>`_,
install
usage
faq
.. toctree::
:maxdepth: 1
@ -29,6 +30,7 @@ Please report `Bugs <https://github.com/jupyter/repo2docker/issues>`_,
howto/user_interface
howto/languages
howto/jupyterhub_images
howto/deploy
.. toctree::
:maxdepth: 2
@ -37,11 +39,12 @@ Please report `Bugs <https://github.com/jupyter/repo2docker/issues>`_,
config_files
.. toctree::
:maxdepth: 1
:caption: Advanced and developer information
:maxdepth: 2
:caption: Contribute to repo2docker
faq
deploy
design
contributing/contributing
contributing/roadmap
architecture
dev_newbuildpack
design
contributing/tasks
contributing/buildpack

View file

@ -30,7 +30,8 @@ from traitlets.config import Application
from . import __version__
from .buildpacks import (
PythonBuildPack, DockerBuildPack, LegacyBinderDockerBuildPack,
CondaBuildPack, JuliaBuildPack, RBuildPack
CondaBuildPack, JuliaBuildPack, BaseImage,
RBuildPack, NixBuildPack
)
from . import contentproviders
from .utils import (
@ -77,6 +78,7 @@ class Repo2Docker(Application):
LegacyBinderDockerBuildPack,
DockerBuildPack,
JuliaBuildPack,
NixBuildPack,
RBuildPack,
CondaBuildPack,
PythonBuildPack,
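The order of this buildpacks list matters: repo2docker asks each buildpack in turn whether it recognises the repository and uses the first one that does, falling back to the default buildpack otherwise. A minimal sketch of that selection idea (not the actual `Repo2Docker` implementation):

```python
def pick_buildpack(buildpack_classes, default_class):
    # Instantiate each candidate and return the first whose detect() succeeds
    for cls in buildpack_classes:
        buildpack = cls()
        if buildpack.detect():
            return buildpack
    # No buildpack claimed the repository: fall back to the default
    return default_class()
```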

View file

@ -5,3 +5,4 @@ from .julia import JuliaBuildPack
from .docker import DockerBuildPack
from .legacy import LegacyBinderDockerBuildPack
from .r import RBuildPack
from .nix import NixBuildPack

View file

@ -7,6 +7,7 @@ import re
import logging
import docker
import sys
import xml.etree.ElementTree as ET
TEMPLATE = r"""
FROM buildpack-deps:bionic
@ -257,7 +258,6 @@ class BuildPack:
"""
return {}
@property
def stencila_manifest_dir(self):
"""Find the stencila manifest dir if it exists"""
@ -271,16 +271,57 @@ class BuildPack:
# ${STENCILA_ARCHIVE_DIR}/${STENCILA_ARCHIVE}/manifest.xml
self._stencila_manifest_dir = None
for root, dirs, files in os.walk("."):
if "manifest.xml" in files:
self.log.debug("Found a manifest.xml at %s", root)
self._stencila_manifest_dir = root.split(os.path.sep, 1)[1]
self.log.info(
"Found stencila manifest.xml in %s",
"Using stencila manifest.xml in %s",
self._stencila_manifest_dir,
)
break
return self._stencila_manifest_dir
@property
def stencila_contexts(self):
"""Find the stencila manifest contexts from file path in manifest"""
if hasattr(self, '_stencila_contexts'):
return self._stencila_contexts
# look at the content of the documents in the manifest
# to extract the required execution contexts
self._stencila_contexts = set()
# get paths to the article files from manifest
files = []
if self.stencila_manifest_dir:
manifest = ET.parse(os.path.join(self.stencila_manifest_dir,
'manifest.xml'))
documents = manifest.findall('./documents/document')
files = [os.path.join(self.stencila_manifest_dir, x.get('path'))
for x in documents]
else:
return self._stencila_contexts
for filename in files:
self.log.debug("Extracting contexts from %s", filename)
# extract code languages from file
document = ET.parse(filename)
code_chunks = document.findall('.//code[@specific-use="source"]')
languages = [x.get('language') for x in code_chunks]
self._stencila_contexts.update(languages)
self.log.info(
"Added executions contexts, now have %s",
self._stencila_contexts,
)
break
return self._stencila_contexts
def get_build_scripts(self):
"""
Ordered list of shell script snippets to build the base image.
@ -332,7 +373,7 @@ class BuildPack:
An ordered list of executable scripts to execute after build.
Is run as a non-root user, and must be executable. Used for performing
build time steps that can not be perfomed with standard tools.
build time steps that can not be performed with standard tools.
The scripts should be as deterministic as possible - running it twice
should not produce different results!
@ -341,13 +382,13 @@ class BuildPack:
def get_start_script(self):
"""
The path to a script to be executated at container start up.
The path to a script to be executed at container start up.
This script is added as the `ENTRYPOINT` to the container.
It is run as a non-root user, and must be executable. Used for performing
run time steps that can not be perfomed with standard tools. For example
setting environment variables for your repository.
It is run as a non-root user, and must be executable. Used for
performing run time steps that can not be performed with standard
tools. For example setting environment variables for your repository.
The script should be as deterministic as possible - running it twice
should not produce different results.
@ -472,9 +513,9 @@ class BaseImage(BuildPack):
env = []
if self.stencila_manifest_dir:
# manifest_dir is the path containing the manifest.xml
# archive_dir is the directory containing archive directories (one level up)
# default archive is the name of the directory in the archive_dir
# such that
# archive_dir is the directory containing archive directories
# (one level up) default archive is the name of the directory
# in the archive_dir such that
# ${STENCILA_ARCHIVE_DIR}/${STENCILA_ARCHIVE}/manifest.xml
# exists.
@ -518,14 +559,24 @@ class BaseImage(BuildPack):
))
except FileNotFoundError:
pass
if 'py' in self.stencila_contexts:
assemble_scripts.extend(
[
(
"${NB_USER}",
r"""
${KERNEL_PYTHON_PREFIX}/bin/pip install --no-cache https://github.com/stencila/py/archive/f1260796.tar.gz && \
${KERNEL_PYTHON_PREFIX}/bin/python -m stencila register
""",
)
]
)
if self.stencila_manifest_dir:
assemble_scripts.extend(
[
(
"${NB_USER}",
r"""
${KERNEL_PYTHON_PREFIX}/bin/pip install --no-cache https://github.com/stencila/py/archive/f6a245fd.tar.gz && \
${KERNEL_PYTHON_PREFIX}/bin/python -m stencila register && \
${NB_PYTHON_PREFIX}/bin/pip install --no-cache nbstencilaproxy==0.1.1 && \
jupyter serverextension enable --sys-prefix --py nbstencilaproxy && \
jupyter nbextension install --sys-prefix --py nbstencilaproxy && \

View file

@ -0,0 +1,70 @@
"""BuildPack for nixpkgs environments"""
import os
from ..base import BuildPack
class NixBuildPack(BuildPack):
"""A nix Package Manager BuildPack"""
def get_path(self):
"""Return paths to be added to PATH environemnt variable
"""
return super().get_path() + [
'/home/${NB_USER}/.nix-profile/bin'
]
def get_env(self):
"""Ordered list of environment variables to be set for this image"""
return super().get_env() + [
('NIX_PATH', "nixpkgs=/home/${NB_USER}/.nix-defexpr/channels/nixpkgs"),
('NIX_SSL_CERT_FILE', '/etc/ssl/certs/ca-certificates.crt'),
('GIT_SSL_CAINFO', '/etc/ssl/certs/ca-certificates.crt')
]
def get_build_scripts(self):
"""
Return series of build-steps common to all nix repositories.
Notice how only root privileges are needed for creating nix
directory.
- create nix directory for user nix installation
- install nix package manager for user
"""
return super().get_build_scripts() + [
("root", """
mkdir -m 0755 /nix && \
chown -R ${NB_USER}:${NB_USER} /nix /usr/local/bin/nix-shell-wrapper /home/${NB_USER}
"""),
("${NB_USER}", """
bash /home/${NB_USER}/.local/bin/install-nix.bash && \
rm /home/${NB_USER}/.local/bin/install-nix.bash
""")
]
def get_build_script_files(self):
"""Dict of files to be copied to the container image for use in building
"""
return {
"nix/install-nix.bash": "/home/${NB_USER}/.local/bin/install-nix.bash",
"nix/nix-shell-wrapper": "/usr/local/bin/nix-shell-wrapper"
}
def get_assemble_scripts(self):
"""Return series of build-steps specific to this source repository.
"""
return super().get_assemble_scripts() + [
('${NB_USER}', """
nix-channel --add https://nixos.org/channels/nixpkgs-unstable nixpkgs && \
nix-channel --update && \
nix-shell default.nix --command "command -v jupyter"
""")
]
def get_start_script(self):
"""The path to a script to be executed as ENTRYPOINT"""
return "/usr/local/bin/nix-shell-wrapper"
def detect(self):
"""Check if current repo should be built with the nix BuildPack"""
return os.path.exists(self.binder_path('default.nix'))

View file

@ -0,0 +1,12 @@
#!/bin/bash
# This downloads and installs a pinned version of nix
set -ex
NIX_VERSION="2.1.1"
NIX_SHA256="ad10b4da69035a585fe89d7330037c4a5d867a372bb0e52a1542ab95aec67999"
wget https://nixos.org/releases/nix/nix-$NIX_VERSION/nix-$NIX_VERSION-x86_64-linux.tar.bz2
echo "$NIX_SHA256 nix-2.1.1-x86_64-linux.tar.bz2" | sha256sum -c
tar xjf nix-*-x86_64-linux.tar.bz2
sh nix-*-x86_64-linux/install
rm -r nix-*-x86_64-linux*

View file

@ -0,0 +1,15 @@
#!/bin/bash
_term() {
echo "Caught SIGTERM signal!"
# kill -TERM "$PID" 2>/dev/null
exit 0
}
trap _term SIGTERM
echo "$*"
nix-shell default.nix --command "$*" &
PID=$!
wait "$PID"

View file

@ -19,8 +19,22 @@ class RBuildPack(PythonBuildPack):
date snapshot of https://mran.microsoft.com/timemachine
from which libraries are to be installed.
2. An optional `install.R` file that will be executed at build time,
and can be used for installing packages from both MRAN and GitHub.
2. A `DESCRIPTION` file signaling an R package
3. A Stencila document (*.jats.xml) with R code chunks (i.e. language="r")
If there is no `runtime.txt`, then the MRAN snapshot is set to the latest
date that is guaranteed to exist across timezones.
Additional R packages are installed if specified either
- in a file `install.R`, that will be executed at build time,
and can be used for installing packages from both MRAN and GitHub
- as dependencies in a `DESCRIPTION` file
- or if they are needed by a specific tool; for example, the package `stencila` is
installed and configured if a Stencila document is given.
The `r-base` package from Ubuntu apt repositories is used to install
R itself, rather than any of the methods from https://cran.r-project.org/.
@ -60,22 +74,22 @@ class RBuildPack(PythonBuildPack):
"""
Check if current repo should be built with the R Build pack
super().detect() is not called in this function - it would return false
unless a `requirements.txt` is present and we do not want to require the
presence of a `requirements.txt` to use R.
Instead we just check if runtime.txt contains a string of the form
`r-<YYYY>-<MM>-<DD>`
super().detect() is not called in this function - it would return
false unless a `requirements.txt` is present and we do not want
to require the presence of a `requirements.txt` to use R.
"""
# If no date is found, then self.checkpoint_date will be False
# Otherwise, it'll be a date object, which will evaluate to True
if self.checkpoint_date:
return True
description_R = 'DESCRIPTION'
if not os.path.exists('binder') and os.path.exists(description_R):
if ((not os.path.exists('binder') and os.path.exists(description_R))
or 'r' in self.stencila_contexts):
if not self.checkpoint_date:
# no R snapshot date set through runtime.txt
# set the R runtime to the latest date that is guaranteed to be on MRAN across timezones
# set the R runtime to the latest date that is guaranteed to
# be on MRAN across timezones
self._checkpoint_date = datetime.date.today() - datetime.timedelta(days=2)
self._runtime = "r-{}".format(str(self._checkpoint_date))
return True
@ -128,11 +142,14 @@ class RBuildPack(PythonBuildPack):
This sets up:
- A directory owned by non-root in ${R_LIBS_USER} for installing R packages into
- A directory owned by non-root in ${R_LIBS_USER}
for installing R packages into
- RStudio
- R's devtools package, at a particular frozen version (determined by MRAN)
- R's devtools package, at a particular frozen version
(determined by MRAN)
- IRKernel
- nbrsessionproxy (to access RStudio via Jupyter Notebook)
- stencila R package (if Stencila document with R code chunks detected)
"""
rstudio_url = 'https://download2.rstudio.org/rstudio-server-1.1.419-amd64.deb'
# This is MD5, because that is what RStudio download page provides!
@ -148,7 +165,7 @@ class RBuildPack(PythonBuildPack):
# IRKernel version - specified as a tag in the IRKernel repository
irkernel_version = '0.8.11'
return super().get_build_scripts() + [
scripts = [
(
"root",
r"""
@ -226,6 +243,21 @@ class RBuildPack(PythonBuildPack):
),
]
if "r" in self.stencila_contexts:
scripts += [
(
"${NB_USER}",
# Install and register stencila library
r"""
R --quiet -e "source('https://bioconductor.org/biocLite.R'); biocLite('graph')" && \
R --quiet -e "devtools::install_github('stencila/r', ref = '361bbf560f3f0561a8612349bca66cd8978f4f24')" && \
R --quiet -e "stencila::register()"
"""
),
]
return super().get_build_scripts() + scripts
def get_assemble_scripts(self):
"""
Return series of build-steps specific to this repository

View file

@ -0,0 +1,6 @@
Nix environment - default.nix
-----------------------------
You can install a nix shell environment using the traditional default.nix.
Documentation on the syntax and typical setup of a ``nix-shell`` environment can be found `here <https://nixos.org/nix/manual/#sec-nix-shell>`_.

View file

@ -0,0 +1,21 @@
let
# Pinning nixpkgs to specific release
# To get sha256 use "nix-prefetch-git <url> --rev <commit>"
commitRev="5574b6a152b1b3ae5f93ba37c4ffd1981f62bf5a";
nixpkgs = builtins.fetchTarball {
url = "https://github.com/NixOS/nixpkgs/archive/${commitRev}.tar.gz";
sha256 = "1pqdddp4aiz726c7qs1dwyfzixi14shp0mbzi1jhapl9hrajfsjg";
};
pkgs = import nixpkgs { config = { allowUnfree = true; }; };
in
pkgs.mkShell {
buildInputs = with pkgs; [
python36Packages.numpy
python36Packages.scipy
python36Packages.jupyterlab
];
shellHook = ''
export NIX_PATH="nixpkgs=${nixpkgs}:."
'';
}

View file

@ -0,0 +1,3 @@
#!/usr/bin/env python
import numpy
import scipy

View file

@ -0,0 +1,6 @@
<dar>
<documents>
<document id="py.ipynb" name="py.ipynb" type="article" path="py.ipynb.jats.xml" src="py.ipynb" />
</documents>
<assets/>
</dar>

View file

@ -0,0 +1,212 @@
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.1d3 20150301//EN" "JATS-archivearticle1.dtd">
<article xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink">
<front>
<article-meta>
<title-group>
<article-title>Jupyter and Stencila</article-title>
</title-group>
<contrib-group content-type="author">
<contrib contrib-type="person">
<name>
<surname>Pawlik</surname>
<given-names>Aleksandra</given-names>
</name>
</contrib>
</contrib-group>
<abstract>
<p>An example of a Jupyter notebook converted into a JATS document for editing in Stencila.</p>
</abstract>
</article-meta>
</front>
<body>
<sec id="introduction-1">
<title>Introduction</title>
<p>Jupyter notebooks (<xref ref-type="bibr" rid="ref-1">1</xref>&#x2013;<xref ref-type="bibr" rid="ref-3">3</xref>) are one of the most popular platforms for doing reproducible research. Stencila supports importing of Jupyter Notebook <monospace>.ipynb</monospace> files. This allows you to work with collegues to refine a document for final publication while still retaining the code cells, and thus reprodubility of your the work. In the future we also plan to support exporting to <monospace>.ipynb</monospace> files.</p>
</sec>
<sec id="markdown-cells-1">
<title>Markdown cells</title>
<p>Most standard Markdown should be supported by the importer including inline <monospace>code</monospace>, headings etc (although the Stencila user interface do not currently support rendering of some elements e.g.&#xA0;math and lists).</p>
</sec>
<sec id="code-cells-1">
<title>Code cells</title>
<p>Code cells in notebooks are imported without loss. Stencila&#x2019;s user interface currently differs from Jupyter in that code cells are executed on update while you are typing. This produces a very reactive user experience but is inappropriate for more compute intensive, longer running code cells. We are currently working on improving this to allowing users to decide to execute cells explicitly (e.g.&#xA0;using <monospace>Ctrl+Enter</monospace>).</p>
<code specific-use="cell"><named-content><alternatives>
<code specific-use="source" language="py" executable="yes">import sys
import time
&apos;Hello this is Python %s.%s and it is %s&apos; % (sys.version_info[0], sys.version_info[1], time.strftime(&apos;%c&apos;))</code>
<code specific-use="output" language="json">{}</code>
</alternatives>
</named-content>
</code>
<p>Stencila also support Jupyter code cells that produce plots. The cell below produces a simple plot based on the example from <ext-link ext-link-type="uri" xlink:href="https://matplotlib.org/examples/shapes_and_collections/scatter_demo.html">the Matplotlib website</ext-link>. Try changing the code below (for example, the variable <monospace>N</monospace>).</p>
<code specific-use="cell"><named-content><alternatives>
<code specific-use="source" language="py" executable="yes">import numpy as np
import matplotlib.pyplot as plt
N = 50
N = min(N, 1000) # Prevent generation of too many numbers :)
x = np.random.rand(N)
y = np.random.rand(N)
colors = np.random.rand(N)
area = np.pi * (15 * np.random.rand(N))**2 # 0 to 15 point radii
plt.scatter(x, y, s=area, c=colors, alpha=0.5)
plt.show()</code>
<code specific-use="output" language="json">{}</code>
</alternatives>
</named-content>
</code>
<p>We are currently working on supporting <ext-link ext-link-type="uri" xlink:href="http://ipython.readthedocs.io/en/stable/interactive/magics.html">Jupyter&#x2019;s magic commands</ext-link> in Stencila via a bridge to Jupyter kernels.</p>
</sec>
<sec id="metadata-1">
<title>Metadata</title>
<p>To add some metadata about the document (such as authors, title, abstract and so on), In Jupyter, select <monospace>Edit -&gt; Edit Notebook metadata</monospace> from the top menu. Add the title and abstract as JSON strings and authors and organisations metadata as <ext-link ext-link-type="uri" xlink:href="https://www.w3schools.com/js/js_json_arrays.asp">JSON arrays</ext-link>. Author <monospace>affiliation</monospace> identifiers (like <monospace>university-of-earth</monospace> below) must be unique and preferably use only lowercase characters and no spaces.</p>
<p>For example,</p>
<preformat> &quot;authors&quot;: [
{
&quot;given-names&quot;: &quot;Your first name goes here&quot;,
&quot;surname&quot;: &quot;Your last name goes here&quot;,
&quot;email&quot;: &quot;your.email@your-organisation&quot;,
&quot;corresponding&quot;: &quot;yes / no&quot;,
&quot;affiliation&quot;: &quot;university-of-earth&quot;
}
],
&quot;organisations&quot;: [
{
&quot;university-of-earth&quot;: {
&quot;institution&quot;: &quot;Your organisation name&quot;,
&quot;city&quot;: &quot;Your city&quot;,
&quot;country&quot;: &quot;Your country&quot;
}
],
&quot;title&quot;: &quot;Your title goes here&quot;,
&quot;abstract&quot;: &quot;This is a paper about lots of different interesting things&quot;,
</preformat>
</sec>
<sec id="citations-and-references-1">
<title>Citations and references</title>
<p>Stencila supports Pandoc style citations and reference lists within Jupyter notebook Markdown cells. Add a <monospace>bibliography</monospace> entry to the notebook&#x2019;s metadata which points to a file containing your list of references e.g.</p>
<code language="json">&quot;bibliography&quot;: &quot;my-bibliography.bibtex&quot;</code>
<p>Then, within Markdown cells, you can insert citations inside square brackets and separated by semicolons. Each citation is represented using the <monospace>@</monospace> symbol followed by the citation identifier from the bibliography database e.g.</p>
<code language="json">[@perez2015project; @kluyver2016jupyter]</code>
<p>The <ext-link ext-link-type="uri" xlink:href="https://github.com/takluyver/cite2c">cite2c</ext-link> Jupyter extension allows for easier, &#x201C;cite-while-you-write&#x201D; insertion of citations from a Zotero library. We&#x2019;re hoping to support conversion of cite2cstyle citations/references in the <ext-link ext-link-type="uri" xlink:href="https://github.com/stencila/convert/issues/14">future</ext-link>.</p>
</sec>
</body>
<back>
<ref-list>
<ref id="ref-1">
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Perez</surname>
<given-names>Fernando</given-names>
</name>
<name>
<surname>Granger</surname>
<given-names>Brian E</given-names>
</name>
</person-group>
<article-title>Project jupyter: Computational narratives as the engine of collaborative data science</article-title>
<source>Retrieved September</source>
<year>2015</year>
<volume>11</volume>
<fpage>207</fpage>
</element-citation>
</ref>
<ref id="ref-2">
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Kluyver</surname>
<given-names>Thomas</given-names>
</name>
<name>
<surname>Ragan-Kelley</surname>
<given-names>Benjamin</given-names>
</name>
<name>
<surname>P&#xE9;rez</surname>
<given-names>Fernando</given-names>
</name>
<name>
<surname>Granger</surname>
<given-names>Brian E</given-names>
</name>
<name>
<surname>Bussonnier</surname>
<given-names>Matthias</given-names>
</name>
<name>
<surname>Frederic</surname>
<given-names>Jonathan</given-names>
</name>
<name>
<surname>Kelley</surname>
<given-names>Kyle</given-names>
</name>
<name>
<surname>Hamrick</surname>
<given-names>Jessica B</given-names>
</name>
<name>
<surname>Grout</surname>
<given-names>Jason</given-names>
</name>
<name>
<surname>Corlay</surname>
<given-names>Sylvain</given-names>
</name>
<name>
<surname>Others</surname>
</name>
</person-group>
<article-title>Jupyter notebooks-a publishing format for reproducible computational workflows.</article-title>
<source>ELPUB</source>
<year>2016</year>
<fpage>87</fpage>
</element-citation>
</ref>
<ref id="ref-3">
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Ragan-Kelley</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Perez</surname>
<given-names>F</given-names>
</name>
<name>
<surname>Granger</surname>
<given-names>B</given-names>
</name>
<name>
<surname>Kluyver</surname>
<given-names>T</given-names>
</name>
<name>
<surname>Ivanov</surname>
<given-names>P</given-names>
</name>
<name>
<surname>Frederic</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Bussonnier</surname>
<given-names>M</given-names>
</name>
</person-group>
<article-title>The jupyter/ipython architecture: A unified view of computational research, from interactive exploration to communication and publication.</article-title>
<source>AGU Fall Meeting Abstracts</source>
<year>2014</year>
</element-citation>
</ref>
</ref-list>
</back>
</article>

View file

@ -2,3 +2,4 @@
jupyter serverextension list 2>&1 | grep nbstencilaproxy
jupyter nbextension list 2>&1 | grep nbstencilaproxy
python3 -c "import stencila"

View file

@ -0,0 +1,21 @@
@article{kluyver2016jupyter,
title={Jupyter Notebooks-a publishing format for reproducible computational workflows.},
author={Kluyver, Thomas and Ragan-Kelley, Benjamin and P{\'e}rez, Fernando and Granger, Brian E and Bussonnier, Matthias and Frederic, Jonathan and Kelley, Kyle and Hamrick, Jessica B and Grout, Jason and Corlay, Sylvain and others},
journal={ELPUB},
pages={87--90},
year={2016}
}
@article{ragan2014jupyter,
title={The Jupyter/IPython architecture: a unified view of computational research, from interactive exploration to communication and publication.},
author={Ragan-Kelley, M and Perez, F and Granger, B and Kluyver, T and Ivanov, P and Frederic, J and Bussonnier, M},
journal={AGU Fall Meeting Abstracts},
year={2014}
}
@article{perez2015project,
title={Project Jupyter: Computational narratives as the engine of collaborative data science},
author={Perez, Fernando and Granger, Brian E},
journal={Retrieved September},
volume={11},
pages={207},
year={2015}
}

File diff suppressed because one or more lines are too long

View file

@ -0,0 +1,5 @@
#!/bin/sh
jupyter serverextension list 2>&1 | grep nbstencilaproxy
jupyter nbextension list 2>&1 | grep nbstencilaproxy
python3 -c "import stencila" 2>&1 | grep ModuleNotFoundError

View file

@ -0,0 +1,6 @@
#!/bin/sh
jupyter serverextension list 2>&1 | grep nbstencilaproxy
jupyter nbextension list 2>&1 | grep nbstencilaproxy
python3 -c "import stencila" 2>&1 | grep ModuleNotFoundError
R -e "library('stencila');"