Mirror of https://github.com/micropython/micropython-lib
Compare commits
No commits in common. "master" and "v1.8.2" have entirely different histories.
Workflow "Build all packages" (present on `master` only, `@@ -1,29 +0,0 @@`):

```yaml
name: Build all packages

on: [push, pull_request]

env:
  PACKAGE_INDEX_PATH: /tmp/micropython-lib-deploy

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-python@v4
      - name: Setup environment
        run: source tools/ci.sh && ci_build_packages_setup
      - name: Check manifest files
        run: source tools/ci.sh && ci_build_packages_check_manifest
      - name: Compile package index
        run: source tools/ci.sh && ci_build_packages_compile_index
      - name: Compile package examples
        run: source tools/ci.sh && ci_build_packages_examples
      - name: Publish packages for branch
        if: vars.MICROPY_PUBLISH_MIP_INDEX && github.event_name == 'push' && ! github.event.deleted
        run: source tools/ci.sh && ci_push_package_index
      - name: Upload packages as artifact
        uses: actions/upload-artifact@v4
        with:
          name: packages-${{ github.sha }}
          path: ${{ env.PACKAGE_INDEX_PATH }}
```
Workflow "Cleanup published packages" (present on `master` only, `@@ -1,12 +0,0 @@`):

```yaml
name: Cleanup published packages

on: delete

jobs:
  cleanup:
    runs-on: ubuntu-latest
    if: vars.MICROPY_PUBLISH_MIP_INDEX
    steps:
      - uses: actions/checkout@v3
      - name: Clean up published files
        run: source tools/ci.sh && ci_cleanup_package_index ${{ github.event.ref }}
```
Workflow "Check commit message formatting" (present on `master` only, `@@ -1,18 +0,0 @@`):

```yaml
name: Check commit message formatting

on: [push, pull_request]

concurrency:
  group: ${{ github.workflow }}-${{ github.ref }}
  cancel-in-progress: true

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: '100'
      - uses: actions/setup-python@v4
      - name: Check commit message formatting
        run: source tools/ci.sh && ci_commit_formatting_run
```
Workflow "Package tests" (present on `master` only, `@@ -1,16 +0,0 @@`):

```yaml
name: Package tests

on: [push, pull_request]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-python@v4
      - name: Setup environment
        run: source tools/ci.sh && ci_package_tests_setup_micropython
      - name: Setup libraries
        run: source tools/ci.sh && ci_package_tests_setup_lib
      - name: Run tests
        run: source tools/ci.sh && ci_package_tests_run
```
@ -1,12 +0,0 @@
|
||||||
# https://docs.github.com/en/actions/automating-builds-and-tests/building-and-testing-python
|
|
||||||
name: Python code lint and formatting with ruff
|
|
||||||
on: [push, pull_request]
|
|
||||||
jobs:
|
|
||||||
ruff:
|
|
||||||
runs-on: ubuntu-latest
|
|
||||||
steps:
|
|
||||||
- uses: actions/checkout@v4
|
|
||||||
# Version should be kept in sync with .pre-commit_config.yaml & also micropython
|
|
||||||
- run: pip install --user ruff==0.11.6
|
|
||||||
- run: ruff check --output-format=github .
|
|
||||||
- run: ruff format --diff .
|
|
|
pre-commit configuration (present on `master` only, `@@ -1,15 +0,0 @@`):

```yaml
repos:
  - repo: local
    hooks:
      - id: verifygitlog
        name: MicroPython git commit message format checker
        entry: tools/verifygitlog.py --check-file --ignore-rebase
        language: python
        verbose: true
        stages: [commit-msg]
  - repo: https://github.com/charliermarsh/ruff-pre-commit
    # Version should be kept in sync with .github/workflows/ruff.yml & also micropython
    rev: v0.11.6
    hooks:
      - id: ruff
      - id: ruff-format
```
Code-of-conduct file (present on `master` only, `@@ -1 +0,0 @@`):

Please see the [MicroPython Code of Conduct](https://github.com/micropython/micropython/blob/master/CODEOFCONDUCT.md).
CONTRIBUTING.md (133 lines changed)
`@@ -1,132 +1,3 @@` (In `v1.8.2` this file was only a short pointer: "If you submit a pull request, please adhere to Contributor Guidelines: https://github.com/micropython/micropython-lib/wiki/ContributorGuidelines". The `master` version follows.)

## Contributor's Guidelines & Code Conventions

micropython-lib follows the same general conventions as the [main MicroPython
repository](https://github.com/micropython/micropython). Please see
[micropython/CONTRIBUTING.md](https://github.com/micropython/micropython/blob/master/CONTRIBUTING.md)
and [micropython/CODECONVENTIONS.md](https://github.com/micropython/micropython/blob/master/CODECONVENTIONS.md).

### Raising issues

Please include enough information for someone to reproduce the issue you are
describing. This will typically include:

* The version of MicroPython you are using (e.g. the firmware filename, git
  hash, or version info printed by the startup message).
* What board/device you are running MicroPython on.
* Which package you have installed, how you installed it, and what version.
  When installed via `mip`, all packages will have a `__version__` attribute.
* A simple code snippet that demonstrates the issue.

If you have a how-to question or are looking for help with using MicroPython
or packages from micropython-lib, please post at the
[discussion forum](https://github.com/orgs/micropython/discussions) instead.
### Pull requests

The same rules for commit messages, signing-off commits, and commit structure
apply [as for the main MicroPython repository](https://github.com/micropython/micropython/blob/master/CODECONVENTIONS.md).

All Python code is formatted using the [black](https://github.com/psf/black)
tool. You can run [`tools/codeformat.py`](tools/codeformat.py) to apply
`black` automatically before submitting a PR. The GitHub CI will also run the
[ruff](https://github.com/astral-sh/ruff) tool to apply further "linting"
checks.

Similar to the main repository, a configuration is provided for the
[pre-commit](https://pre-commit.com/) tool to apply `black` code formatting
rules and run `ruff` automatically. See the documentation on using pre-commit
in [the code conventions document](https://github.com/micropython/micropython/blob/master/CODECONVENTIONS.md#automatic-pre-commit-hooks).

In addition to the conventions from the main repository, there are some
specific conventions and guidelines for micropython-lib:

* The first line of the commit message should start with the name of the
  package, followed by a short description of the commit. Package names are
  globally unique in the micropython-lib directory structure.

  For example: `shutil: Add disk_usage function.`

* Although we encourage keeping the code short and minimal, please still use
  comments in your code. Typically, packages will be installed via
  `mip` and so they will be compiled to bytecode, where comments will
  _not_ contribute to the installed size.

* All packages must include a `manifest.py`, including a `metadata()` line
  with at least a description and a version.
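As a sketch of the manifest rule above, a minimal `manifest.py` for a hypothetical single-module package might look like this (the description, version, and file name are illustrative; `metadata()` and `module()` are functions provided by the MicroPython manifest machinery):

```py
# Hypothetical manifest.py for a single-module package.
metadata(description="Example helper module.", version="0.1.0")
module("example.py")
```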
* Prefer to break larger packages up into smaller chunks, so that just the
  required functionality can be installed. The way to do this is to have a
  base package, e.g. `mypackage` containing `mypackage/__init__.py`, and then
  an "extension" package, e.g. `mypackage-ext` containing additional files,
  e.g. `mypackage/ext.py`. See
  [`collections-defaultdict`](python-stdlib/collections-defaultdict) as an
  example.

* If you think a package might be extended in this way in the future, prefer
  to create a package directory with `package/__init__.py`, rather than a
  single `module.py`.

* Packages in the python-stdlib directory should be CPython compatible and
  implement a subset of the CPython equivalent. Avoid adding
  MicroPython-specific extensions. Please include a link to the corresponding
  CPython docs in the PR.

* Include tests (ideally using the `unittest` package) as `test_*.py`.
  Otherwise, provide examples as `example_*.py`. When porting CPython
  packages, prefer to use the existing tests rather than writing new ones
  from scratch.

* When porting an existing third-party package, please ensure that the source
  license is compatible.

* To make it easier for others to install packages directly from your PR before
  it is merged, consider opting in to automatic package publishing (see
  [Publishing packages from forks](#publishing-packages-from-forks)). If you do
  this, consider quoting the [commands to install
  packages](README.md#installing-packages-from-forks) in your Pull Request
  description.
### Publishing packages from forks

You can easily publish the packages from your micropython-lib
[fork](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/about-forks)
by opting in to a system based on [GitHub
Actions](https://docs.github.com/en/actions) and [GitHub
Pages](https://docs.github.com/en/pages):

1. Open your fork's repository in the GitHub web interface.
2. Navigate to "Settings" -> "Secrets and variables" -> "Actions" -> "Variables".
3. Click "New repository variable".
4. Create a variable named `MICROPY_PUBLISH_MIP_INDEX` with value `true` (or any
   "truthy" value).
5. The settings for GitHub Actions and GitHub Pages features should not need to
   be changed from the repository defaults, unless you've explicitly disabled
   Actions or Pages in your fork.

The next time you push commits to a branch in your fork, GitHub Actions will run
an additional step in the "Build All Packages" workflow named "Publish Packages
for branch". This step runs in *your fork*, but if you open a pull request then
this workflow is not shown in the Pull Request's "Checks"; those run in the
upstream repository. Navigate to your fork's Actions tab in order to see
the additional "Publish Packages for branch" step.

Anyone can then install these packages as described under [Installing packages
from forks](README.md#installing-packages-from-forks).

The exact command is also quoted in the GitHub Actions log in your fork's
Actions for the "Publish Packages for branch" step of "Build All Packages".

#### Opting Back Out

To opt out again, delete the `MICROPY_PUBLISH_MIP_INDEX` variable and
(optionally) delete the `gh-pages` branch from your fork.

*Note*: While enabled, all micropython-lib packages will be published each time
a change is pushed to any branch in your fork. A commit is added to the
`gh-pages` branch each time. In a busy repository, the `gh-pages` branch may
become quite large. The actual `.git` directory size on disk should still be
quite small, as most of the content will be deduplicated. If you're worried that
the `gh-pages` branch has become too large then you can always delete this
branch from GitHub. GitHub Actions will create a new `gh-pages` branch the next
time you push a change.
LICENSE (8 lines changed)
(In this diff the `-` side is `master` and the `+` side is `v1.8.2`.)

```diff
@@ -1,8 +1,8 @@
 micropython-lib consists of multiple modules from different sources and
-authors. Each module comes under its own licensing terms. The short name of
-a license can be found in a file within the module directory (usually
-metadata.txt or setup.py). The complete text of each license used is provided
-below. Files not belonging to a particular module are provided under the MIT
+authors. Each module comes under its own licensing terms. Short name of
+a license can be found in a file within a module directory (usually
+metadata.txt or setup.py). Complete text of each license used is provided
+below. Files not belonging to a particular module a provided under MIT
 license, unless explicitly stated otherwise.

 =============== MIT License ===============
```
Makefile (present in `v1.8.2` only, `@@ -0,0 +1,16 @@`):

```makefile
PREFIX = ~/.micropython/lib

all:

# Installs all modules to a lib location, for development testing
CMD="find . -maxdepth 1 -mindepth 1 \( -name '*.py' -not -name 'test_*' -not -name 'setup.py' \) -or \( -type d -not -name 'dist' -not -name '*.egg-info' -not -name '__pycache__' \)| xargs --no-run-if-empty cp -r -t $(PREFIX)"
install:
	@mkdir -p $(PREFIX)
	@if [ -n "$(MOD)" ]; then \
		(cd $(MOD); sh -c $(CMD)); \
	else \
		for d in $$(find -maxdepth 1 -type d ! -name ".*"); do \
			echo $$d; \
			(cd $$d; sh -c $(CMD)); \
		done \
	fi
```
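The quoted `CMD` is the heart of this Makefile: it selects which files get copied into `$(PREFIX)`. As a minimal sketch (the scratch directory and file names are hypothetical), its `find` expression behaves like this:

```shell
# Build a scratch package directory and run the same find expression:
demo=$(mktemp -d)
mkdir -p "$demo/__pycache__" "$demo/dist"
touch "$demo/mod.py" "$demo/test_mod.py" "$demo/setup.py"
cd "$demo"
# Keeps top-level *.py files except test_* and setup.py, and directories
# except dist, *.egg-info and __pycache__:
find . -maxdepth 1 -mindepth 1 \( -name '*.py' -not -name 'test_*' -not -name 'setup.py' \) -or \( -type d -not -name 'dist' -not -name '*.egg-info' -not -name '__pycache__' \)
# prints only ./mod.py
```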
README.md (206 lines changed)
`@@ -1,172 +1,66 @@` (The `master` version is shown first, then the `v1.8.2` version.)

# micropython-lib

This is a repository of packages designed to be useful for writing MicroPython
applications.

The packages here fall into categories corresponding to the four top-level
directories:

* **python-stdlib**: Compatible versions of modules from [The Python Standard
  Library](https://docs.python.org/3/library/). These should be drop-in
  replacements for the corresponding Python modules, although many have
  reduced functionality or missing methods or classes (which may not be an
  issue for most cases).

* **python-ecosys**: Compatible, but reduced-functionality versions of
  packages from the wider Python ecosystem. For example, a package that
  might be found in the [Python Package Index](https://pypi.org/).

* **micropython**: MicroPython-specific packages that do not have equivalents
  in other Python environments. This includes drivers for hardware
  (e.g. sensors, peripherals, or displays), libraries to work with
  embedded functionality (e.g. bluetooth), or MicroPython-specific
  packages that do not have equivalents in CPython.

* **unix-ffi**: These packages are specifically for the MicroPython Unix port
  and provide access to operating-system and third-party libraries via FFI,
  or functionality that is not useful for non-Unix ports.

## Usage

To install a micropython-lib package, there are four main options. For more
information see the [Package management documentation](https://docs.micropython.org/en/latest/reference/packages.html).

### On a network-enabled device

As of MicroPython v1.20 (and nightly builds since October 2022), boards
with WiFi and Ethernet support include the `mip` package manager.

```py
>>> import mip
>>> mip.install("package-name")
```

### Using `mpremote` from your PC

`mpremote` is the officially-supported tool for interacting with a MicroPython
device and, since v0.4.0, support for installing micropython-lib packages is
provided by using the `mip` command.

```bash
$ mpremote connect /dev/ttyUSB0 mip install package-name
```

See the [mpremote documentation](https://docs.micropython.org/en/latest/reference/mpremote.html).

### Freeze into your firmware

If you are building your own firmware, all packages in this repository include
a `manifest.py` that can be included into your board manifest via the
`require()` command. See [Manifest files](https://docs.micropython.org/en/latest/reference/manifest.html#require) for
more information.

### Copy the files manually

Many micropython-lib packages are just single-file modules, and you can
quickly get started by copying the relevant Python file to your device. For
example, to add the `base64` library, you can directly copy
`python-stdlib/base64/base64.py` to the `lib` directory on your device.

This can be done using `mpremote`, for example:

```bash
$ mpremote connect /dev/ttyUSB0 cp python-stdlib/base64/base64.py :/lib
```

For packages that are implemented as a package directory, you'll need to copy
the directory instead. For example, to add `collections.defaultdict`, copy
`collections/collections/__init__.py` and
`collections-defaultdict/collections/defaultdict.py` to a directory named
`lib/collections` on your device.

Note that unlike the other three approaches based on `mip` or `manifest.py`,
you will need to manually resolve dependencies. You can inspect the relevant
`manifest.py` file to view the list of dependencies for a given package.

## Installing packages from forks

It is possible to use the `mpremote mip install` or `mip.install()` methods to
install packages built from a
[fork](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/about-forks)
of micropython-lib, if the fork's owner has opted in.

This can be useful to install packages from a pending Pull Request, for example.

First, the owner of the fork must opt in as described under
[Publishing packages from forks](CONTRIBUTING.md#publishing-packages-from-forks).

After this has happened, each time someone pushes to a branch in that fork,
GitHub Actions will automatically publish the packages to a GitHub Pages site.

To install these packages, use commands such as:

```bash
$ mpremote connect /dev/ttyUSB0 mip install --index https://USERNAME.github.io/micropython-lib/mip/BRANCH_NAME PACKAGE_NAME
```

Or from a networked device:

```py
import mip
mip.install(PACKAGE_NAME, index="https://USERNAME.github.io/micropython-lib/mip/BRANCH_NAME")
```

(Where `USERNAME`, `BRANCH_NAME` and `PACKAGE_NAME` are replaced with the owner
of the fork, the branch the packages were built from, and the package name.)

## Contributing

We use [GitHub Discussions](https://github.com/micropython/micropython/discussions)
as our forum, and [Discord](https://micropython.org/discord) for chat. These
are great places to ask questions and get advice from the community or to
discuss your MicroPython-based projects.

The [MicroPython Wiki](https://github.com/micropython/micropython/wiki) is
also used for micropython-lib.

For bugs and feature requests, please [raise an issue](https://github.com/micropython/micropython-lib/issues/new).

We welcome pull requests to add new packages, fix bugs, or add features.
Please be sure to follow the
[Contributor's Guidelines & Code Conventions](CONTRIBUTING.md). Note that
MicroPython is licensed under the [MIT license](LICENSE) and all contributions
should follow this license.

### Future plans (and new contributor ideas)

* Develop a set of example programs using these packages.
* Develop more MicroPython packages for common tasks.
* Expand unit testing coverage.
* Add support for referencing remote/third-party repositories.

## Notes on terminology

The terms *library*, *package*, and *module* are overloaded and lead to some
confusion. The interpretation used by the MicroPython project is that:

A *library* is a collection of installable packages, e.g. [The Python Standard
Library](https://docs.python.org/3/library/), or micropython-lib.

A *package* can refer to two things. The first meaning, "library package", is
something that can be installed from a library, e.g. via `mip` (or `pip` in
CPython/PyPI). Packages provide *modules* that can be imported. The ambiguity
here is that the module provided by the package does not necessarily have to
have the same name, e.g. the `pyjwt` package provides the `jwt` module. In
CPython, the `pyserial` package providing the `serial` module is another
common example.

A *module* is something that can be imported. For example, "the *os* module".

A module can be implemented either as a single file, typically also called
a *module* or "single-file module", or as a *package* (the second meaning),
which in this context means a directory containing multiple `.py` files
(usually at least an `__init__.py`).

In micropython-lib, we also have the concept of an *extension package*, which
is a library package that extends the functionality of another package by
adding additional files to the same package directory. These packages have
hyphenated names. For example, the `collections-defaultdict` package extends
the `collections` package to add the `defaultdict` class to the `collections`
module.

For reference, the `v1.8.2` README read:

~~~~
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
micropython-lib is a highly experimental community project.

Please help to drive it to just "experimental" state by testing
provided packages with MicroPython.
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
~~~~

micropython-lib
===============
micropython-lib is a project to develop a non-monolithic standard library
for MicroPython. Each module or package is available as a separate
distribution package from PyPI. Each module is either written from scratch or
ported from CPython.

Note that the main target of micropython-lib is a "Unix" port of MicroPython
(additional ports to support are to be determined). Actual system requirements
vary per module. Though if a module is not related to I/O, the module should
work without problem on bare-metal ports too (e.g. pyboard).

Usage
-----
micropython-lib packages are published on PyPI (Python Package Index),
the standard Python community package repository: http://pypi.python.org/ .
On PyPI, you can search for MicroPython related packages and read
additional package information.

To install packages from PyPI for usage on your local system, use the
`pip-micropython` tool, which is a simple wrapper around the standard
`pip` tool, which is used to install packages for CPython.
The `pip-micropython` tool can be found in the `tools` subdirectory
of the main MicroPython repository (https://github.com/micropython/micropython).
Just install the `pip-micropython` script somewhere on your `PATH`.

Afterwards, just use `pip-micropython` in a way similar to `pip`:

~~~~
$ pip-micropython install micropython-copy
$ micropython
>>> import copy
>>> copy.copy([1, 2, 3])
[1, 2, 3]
~~~~

Review the `pip-micropython` source code for more info.

Development
-----------
To install modules during development, use `make install`. By default, all
available packages will be installed. To install a specific module, add the
`MOD=<module>` parameter to the end of the `make install` command.

Links
-----
More information is on GitHub and in the MicroPython forums:

* https://github.com/micropython/micropython/issues/405
* http://forum.micropython.org/viewtopic.php?f=5&t=70

Guidelines for packaging MicroPython modules for PyPI:

* https://github.com/micropython/micropython/issues/413
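The "Freeze into your firmware" approach from the README can be sketched with a board manifest like the following (the included board path and the chosen package names are hypothetical; `include()` and `require()` are the standard manifest commands):

```py
# Hypothetical board manifest.py: freeze micropython-lib packages into firmware.
include("$(PORT_DIR)/boards/manifest.py")
require("logging")
require("collections-defaultdict")
```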
Dummy `__future__` module diff (`-` side is `master`, `+` side is `v1.8.2`):

```diff
@@ -5,4 +5,3 @@ absolute_import = True
 with_statement = True
 print_function = True
 unicode_literals = True
-annotations = True
```
Metadata for the dummy `future` package (present in `v1.8.2` only, `@@ -0,0 +1,4 @@`):

```
srctype=dummy
type=module
version=0.0.2
dist_name=future
```
setup.py for `micropython-future` (present in `v1.8.2` only, `@@ -0,0 +1,18 @@`):

```python
import sys
# Remove current dir from sys.path, otherwise setuptools will pick up our
# module instead of the system one.
sys.path.pop(0)
from setuptools import setup


setup(name='micropython-future',
      version='0.0.2',
      description='Dummy __future__ module for MicroPython',
      long_description='This is a dummy implementation of a module for MicroPython standard library.\nIt contains zero or very little functionality, and primarily intended to\navoid import errors (using idea that even if an application imports a\nmodule, it may be not using it on every code path, so may work at least\npartially). It is expected that more complete implementation of the module\nwill be provided later. Please help with the development if you are\ninterested in this module.',
      url='https://github.com/micropython/micropython/issues/405',
      author='MicroPython Developers',
      author_email='micro-python@googlegroups.com',
      maintainer='MicroPython Developers',
      maintainer_email='micro-python@googlegroups.com',
      license='MIT',
      py_modules=['__future__'])
```
`_libc` FFI helper module diff (`-` side is `master`, `+` side is `v1.8.2`):

```diff
@@ -4,8 +4,7 @@ import sys

 _h = None

-names = ("libc.so", "libc.so.0", "libc.so.6", "libc.dylib")
-
+names = ('libc.so', 'libc.so.0', 'libc.so.6', 'libc.dylib')

 def get():
     global _h
@@ -25,7 +24,6 @@ def set_names(n):
     global names
     names = n

-
 # Find out bitness of the platform, even if long ints are not supported
 # TODO: All bitness differences should be removed from micropython-lib, and
 # this snippet too.
```
Metadata for the `libc` package (present in `v1.8.2` only, `@@ -0,0 +1,7 @@`):

```
dist_name = libc
srctype = micropython-lib
type = module
version = 0.3
author = Paul Sokolovsky
desc = MicroPython FFI helper module (deprecated)
long_desc = MicroPython FFI helper module (deprecated, replaced by micropython-ffilib).
```
setup.py for `micropython-libc` (present in `v1.8.2` only, `@@ -0,0 +1,18 @@`):

```python
import sys
# Remove current dir from sys.path, otherwise setuptools will pick up our
# module instead of the system one.
sys.path.pop(0)
from setuptools import setup


setup(name='micropython-libc',
      version='0.3',
      description='MicroPython FFI helper module (deprecated)',
      long_description='MicroPython FFI helper module (deprecated, replaced by micropython-ffilib).',
      url='https://github.com/micropython/micropython/issues/405',
      author='Paul Sokolovsky',
      author_email='micro-python@googlegroups.com',
      maintainer='MicroPython Developers',
      maintainer_email='micro-python@googlegroups.com',
      license='MIT',
      py_modules=['_libc'])
```
`_markupbase` module diff (`-` side is `master`, `+` side is `v1.8.2`):

```diff
@@ -7,15 +7,15 @@ documented public API and should not be used directly.

 import re

-_declname_match = re.compile(r"[a-zA-Z][-_.a-zA-Z0-9]*\s*").match
+_declname_match = re.compile(r'[a-zA-Z][-_.a-zA-Z0-9]*\s*').match
 _declstringlit_match = re.compile(r'(\'[^\']*\'|"[^"]*")\s*').match
-_commentclose = re.compile(r"--\s*>")
-_markedsectionclose = re.compile(r"]\s*]\s*>")
+_commentclose = re.compile(r'--\s*>')
+_markedsectionclose = re.compile(r']\s*]\s*>')

 # An analysis of the MS-Word extensions is available at
 # http://www.planetpublish.com/xmlarena/xap/Thursday/WordtoXML.pdf

-_msmarkedsectionclose = re.compile(r"]\s*>")
+_msmarkedsectionclose = re.compile(r']\s*>')

 del re
```
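The `_declname_match` pattern in this hunk (quote style is the only difference between the two sides) behaves the same under CPython's `re`, so it can be checked directly:

```python
import re

# Declaration names: a letter followed by letters, digits, '.', '-' or '_',
# plus any trailing whitespace (same pattern as _declname_match above).
declname_match = re.compile(r"[a-zA-Z][-_.a-zA-Z0-9]*\s*").match

m = declname_match("DOCTYPE  html>")
print(m.group().strip())  # prints DOCTYPE
print(declname_match("1bad") is None)  # prints True: must start with a letter
```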
```diff
@@ -26,10 +26,12 @@ class ParserBase:

     def __init__(self):
         if self.__class__ is ParserBase:
-            raise RuntimeError("_markupbase.ParserBase must be subclassed")
+            raise RuntimeError(
+                "_markupbase.ParserBase must be subclassed")

     def error(self, message):
-        raise NotImplementedError("subclasses of ParserBase must override error()")
+        raise NotImplementedError(
+            "subclasses of ParserBase must override error()")

     def reset(self):
         self.lineno = 1
```
```diff
@@ -51,12 +53,12 @@ class ParserBase:
         if nlines:
             self.lineno = self.lineno + nlines
             pos = rawdata.rindex("\n", i, j)  # Should not fail
-            self.offset = j - (pos + 1)
+            self.offset = j-(pos+1)
         else:
-            self.offset = self.offset + j - i
+            self.offset = self.offset + j-i
         return j

-    _decl_otherchars = ""
+    _decl_otherchars = ''

     # Internal -- parse declaration (for use by subclasses).
     def parse_declaration(self, i):
```
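The position bookkeeping in this hunk (only whitespace differs between the two sides) can be verified in plain Python; the names below mirror the snippet's variables:

```python
# After consuming rawdata[i:j], lineno advances by the number of newlines in
# that span, and offset becomes the distance past the last newline.
rawdata = "<!-- a\ncomment -->"
i, j = 0, len(rawdata)
nlines = rawdata.count("\n", i, j)
pos = rawdata.rindex("\n", i, j)  # index of the last newline; should not fail
offset = j - (pos + 1)
print(nlines, offset)  # prints 1 11
```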
```diff
@@ -73,35 +75,35 @@ class ParserBase:
         rawdata = self.rawdata
         j = i + 2
         assert rawdata[i:j] == "<!", "unexpected call to parse_declaration"
-        if rawdata[j : j + 1] == ">":
+        if rawdata[j:j+1] == ">":
             # the empty comment <!>
             return j + 1
-        if rawdata[j : j + 1] in ("-", ""):
+        if rawdata[j:j+1] in ("-", ""):
             # Start of comment followed by buffer boundary,
             # or just a buffer boundary.
             return -1
         # A simple, practical version could look like: ((name|stringlit) S*) + '>'
         n = len(rawdata)
-        if rawdata[j : j + 2] == "--":  # comment
+        if rawdata[j:j+2] == '--': #comment
             # Locate --.*-- as the body of the comment
             return self.parse_comment(i)
-        elif rawdata[j] == "[":  # marked section
+        elif rawdata[j] == '[': #marked section
             # Locate [statusWord [...arbitrary SGML...]] as the body of the marked section
             # Where statusWord is one of TEMP, CDATA, IGNORE, INCLUDE, RCDATA
             # Note that this is extended by Microsoft Office "Save as Web" function
             # to include [if...] and [endif].
             return self.parse_marked_section(i)
-        else:  # all other declaration elements
+        else: #all other declaration elements
             decltype, j = self._scan_name(j, i)
             if j < 0:
                 return j
             if decltype == "doctype":
```
self._decl_otherchars = ""
|
self._decl_otherchars = ''
|
||||||
while j < n:
|
while j < n:
|
||||||
c = rawdata[j]
|
c = rawdata[j]
|
||||||
if c == ">":
|
if c == ">":
|
||||||
# end of declaration syntax
|
# end of declaration syntax
|
||||||
data = rawdata[i + 2 : j]
|
data = rawdata[i+2:j]
|
||||||
if decltype == "doctype":
|
if decltype == "doctype":
|
||||||
self.handle_decl(data)
|
self.handle_decl(data)
|
||||||
else:
|
else:
|
||||||
|
@ -133,7 +135,8 @@ class ParserBase:
|
||||||
else:
|
else:
|
||||||
self.error("unexpected '[' char in declaration")
|
self.error("unexpected '[' char in declaration")
|
||||||
else:
|
else:
|
||||||
self.error("unexpected %r char in declaration" % rawdata[j])
|
self.error(
|
||||||
|
"unexpected %r char in declaration" % rawdata[j])
|
||||||
if j < 0:
|
if j < 0:
|
||||||
return j
|
return j
|
||||||
return -1 # incomplete
|
return -1 # incomplete
|
||||||
|
@ -141,37 +144,37 @@ class ParserBase:
|
||||||
# Internal -- parse a marked section
|
# Internal -- parse a marked section
|
||||||
# Override this to handle MS-word extension syntax <![if word]>content<![endif]>
|
# Override this to handle MS-word extension syntax <![if word]>content<![endif]>
|
||||||
def parse_marked_section(self, i, report=1):
|
def parse_marked_section(self, i, report=1):
|
||||||
rawdata = self.rawdata
|
rawdata= self.rawdata
|
||||||
assert rawdata[i : i + 3] == "<![", "unexpected call to parse_marked_section()"
|
assert rawdata[i:i+3] == '<![', "unexpected call to parse_marked_section()"
|
||||||
sectName, j = self._scan_name(i + 3, i)
|
sectName, j = self._scan_name( i+3, i )
|
||||||
if j < 0:
|
if j < 0:
|
||||||
return j
|
return j
|
||||||
if sectName in {"temp", "cdata", "ignore", "include", "rcdata"}:
|
if sectName in {"temp", "cdata", "ignore", "include", "rcdata"}:
|
||||||
# look for standard ]]> ending
|
# look for standard ]]> ending
|
||||||
match = _markedsectionclose.search(rawdata, i + 3)
|
match= _markedsectionclose.search(rawdata, i+3)
|
||||||
elif sectName in {"if", "else", "endif"}:
|
elif sectName in {"if", "else", "endif"}:
|
||||||
# look for MS Office ]> ending
|
# look for MS Office ]> ending
|
||||||
match = _msmarkedsectionclose.search(rawdata, i + 3)
|
match= _msmarkedsectionclose.search(rawdata, i+3)
|
||||||
else:
|
else:
|
||||||
self.error("unknown status keyword %r in marked section" % rawdata[i + 3 : j])
|
self.error('unknown status keyword %r in marked section' % rawdata[i+3:j])
|
||||||
if not match:
|
if not match:
|
||||||
return -1
|
return -1
|
||||||
if report:
|
if report:
|
||||||
j = match.start(0)
|
j = match.start(0)
|
||||||
self.unknown_decl(rawdata[i + 3 : j])
|
self.unknown_decl(rawdata[i+3: j])
|
||||||
return match.end(0)
|
return match.end(0)
|
||||||
|
|
||||||
# Internal -- parse comment, return length or -1 if not terminated
|
# Internal -- parse comment, return length or -1 if not terminated
|
||||||
def parse_comment(self, i, report=1):
|
def parse_comment(self, i, report=1):
|
||||||
rawdata = self.rawdata
|
rawdata = self.rawdata
|
||||||
if rawdata[i : i + 4] != "<!--":
|
if rawdata[i:i+4] != '<!--':
|
||||||
self.error("unexpected call to parse_comment()")
|
self.error('unexpected call to parse_comment()')
|
||||||
match = _commentclose.search(rawdata, i + 4)
|
match = _commentclose.search(rawdata, i+4)
|
||||||
if not match:
|
if not match:
|
||||||
return -1
|
return -1
|
||||||
if report:
|
if report:
|
||||||
j = match.start(0)
|
j = match.start(0)
|
||||||
self.handle_comment(rawdata[i + 4 : j])
|
self.handle_comment(rawdata[i+4: j])
|
||||||
return match.end(0)
|
return match.end(0)
|
||||||
|
|
||||||
# Internal -- scan past the internal subset in a <!DOCTYPE declaration,
|
# Internal -- scan past the internal subset in a <!DOCTYPE declaration,
|
||||||
|
@ -183,7 +186,7 @@ class ParserBase:
|
||||||
while j < n:
|
while j < n:
|
||||||
c = rawdata[j]
|
c = rawdata[j]
|
||||||
if c == "<":
|
if c == "<":
|
||||||
s = rawdata[j : j + 2]
|
s = rawdata[j:j+2]
|
||||||
if s == "<":
|
if s == "<":
|
||||||
# end of buffer; incomplete
|
# end of buffer; incomplete
|
||||||
return -1
|
return -1
|
||||||
|
@ -196,7 +199,7 @@ class ParserBase:
|
||||||
if (j + 4) > n:
|
if (j + 4) > n:
|
||||||
# end of buffer; incomplete
|
# end of buffer; incomplete
|
||||||
return -1
|
return -1
|
||||||
if rawdata[j : j + 4] == "<!--":
|
if rawdata[j:j+4] == "<!--":
|
||||||
j = self.parse_comment(j, report=0)
|
j = self.parse_comment(j, report=0)
|
||||||
if j < 0:
|
if j < 0:
|
||||||
return j
|
return j
|
||||||
|
@ -206,7 +209,8 @@ class ParserBase:
|
||||||
return -1
|
return -1
|
||||||
if name not in {"attlist", "element", "entity", "notation"}:
|
if name not in {"attlist", "element", "entity", "notation"}:
|
||||||
self.updatepos(declstartpos, j + 2)
|
self.updatepos(declstartpos, j + 2)
|
||||||
self.error("unknown declaration %r in internal subset" % name)
|
self.error(
|
||||||
|
"unknown declaration %r in internal subset" % name)
|
||||||
# handle the individual names
|
# handle the individual names
|
||||||
meth = getattr(self, "_parse_doctype_" + name)
|
meth = getattr(self, "_parse_doctype_" + name)
|
||||||
j = meth(j, declstartpos)
|
j = meth(j, declstartpos)
|
||||||
|
@ -248,7 +252,7 @@ class ParserBase:
|
||||||
return -1
|
return -1
|
||||||
# style content model; just skip until '>'
|
# style content model; just skip until '>'
|
||||||
rawdata = self.rawdata
|
rawdata = self.rawdata
|
||||||
if ">" in rawdata[j:]:
|
if '>' in rawdata[j:]:
|
||||||
return rawdata.find(">", j) + 1
|
return rawdata.find(">", j) + 1
|
||||||
return -1
|
return -1
|
||||||
|
|
||||||
|
@ -256,7 +260,7 @@ class ParserBase:
|
||||||
def _parse_doctype_attlist(self, i, declstartpos):
|
def _parse_doctype_attlist(self, i, declstartpos):
|
||||||
rawdata = self.rawdata
|
rawdata = self.rawdata
|
||||||
name, j = self._scan_name(i, declstartpos)
|
name, j = self._scan_name(i, declstartpos)
|
||||||
c = rawdata[j : j + 1]
|
c = rawdata[j:j+1]
|
||||||
if c == "":
|
if c == "":
|
||||||
return -1
|
return -1
|
||||||
if c == ">":
|
if c == ">":
|
||||||
|
@ -267,7 +271,7 @@ class ParserBase:
|
||||||
name, j = self._scan_name(j, declstartpos)
|
name, j = self._scan_name(j, declstartpos)
|
||||||
if j < 0:
|
if j < 0:
|
||||||
return j
|
return j
|
||||||
c = rawdata[j : j + 1]
|
c = rawdata[j:j+1]
|
||||||
if c == "":
|
if c == "":
|
||||||
return -1
|
return -1
|
||||||
if c == "(":
|
if c == "(":
|
||||||
|
@ -276,14 +280,14 @@ class ParserBase:
|
||||||
j = rawdata.find(")", j) + 1
|
j = rawdata.find(")", j) + 1
|
||||||
else:
|
else:
|
||||||
return -1
|
return -1
|
||||||
while rawdata[j : j + 1].isspace():
|
while rawdata[j:j+1].isspace():
|
||||||
j = j + 1
|
j = j + 1
|
||||||
if not rawdata[j:]:
|
if not rawdata[j:]:
|
||||||
# end of buffer, incomplete
|
# end of buffer, incomplete
|
||||||
return -1
|
return -1
|
||||||
else:
|
else:
|
||||||
name, j = self._scan_name(j, declstartpos)
|
name, j = self._scan_name(j, declstartpos)
|
||||||
c = rawdata[j : j + 1]
|
c = rawdata[j:j+1]
|
||||||
if not c:
|
if not c:
|
||||||
return -1
|
return -1
|
||||||
if c in "'\"":
|
if c in "'\"":
|
||||||
|
@ -292,7 +296,7 @@ class ParserBase:
|
||||||
j = m.end()
|
j = m.end()
|
||||||
else:
|
else:
|
||||||
return -1
|
return -1
|
||||||
c = rawdata[j : j + 1]
|
c = rawdata[j:j+1]
|
||||||
if not c:
|
if not c:
|
||||||
return -1
|
return -1
|
||||||
if c == "#":
|
if c == "#":
|
||||||
|
@ -302,10 +306,10 @@ class ParserBase:
|
||||||
name, j = self._scan_name(j + 1, declstartpos)
|
name, j = self._scan_name(j + 1, declstartpos)
|
||||||
if j < 0:
|
if j < 0:
|
||||||
return j
|
return j
|
||||||
c = rawdata[j : j + 1]
|
c = rawdata[j:j+1]
|
||||||
if not c:
|
if not c:
|
||||||
return -1
|
return -1
|
||||||
if c == ">":
|
if c == '>':
|
||||||
# all done
|
# all done
|
||||||
return j + 1
|
return j + 1
|
||||||
|
|
||||||
|
@ -316,11 +320,11 @@ class ParserBase:
|
||||||
return j
|
return j
|
||||||
rawdata = self.rawdata
|
rawdata = self.rawdata
|
||||||
while 1:
|
while 1:
|
||||||
c = rawdata[j : j + 1]
|
c = rawdata[j:j+1]
|
||||||
if not c:
|
if not c:
|
||||||
# end of buffer; incomplete
|
# end of buffer; incomplete
|
||||||
return -1
|
return -1
|
||||||
if c == ">":
|
if c == '>':
|
||||||
return j + 1
|
return j + 1
|
||||||
if c in "'\"":
|
if c in "'\"":
|
||||||
m = _declstringlit_match(rawdata, j)
|
m = _declstringlit_match(rawdata, j)
|
||||||
|
@ -335,10 +339,10 @@ class ParserBase:
|
||||||
# Internal -- scan past <!ENTITY declarations
|
# Internal -- scan past <!ENTITY declarations
|
||||||
def _parse_doctype_entity(self, i, declstartpos):
|
def _parse_doctype_entity(self, i, declstartpos):
|
||||||
rawdata = self.rawdata
|
rawdata = self.rawdata
|
||||||
if rawdata[i : i + 1] == "%":
|
if rawdata[i:i+1] == "%":
|
||||||
j = i + 1
|
j = i + 1
|
||||||
while 1:
|
while 1:
|
||||||
c = rawdata[j : j + 1]
|
c = rawdata[j:j+1]
|
||||||
if not c:
|
if not c:
|
||||||
return -1
|
return -1
|
||||||
if c.isspace():
|
if c.isspace():
|
||||||
|
@ -351,7 +355,7 @@ class ParserBase:
|
||||||
if j < 0:
|
if j < 0:
|
||||||
return j
|
return j
|
||||||
while 1:
|
while 1:
|
||||||
c = self.rawdata[j : j + 1]
|
c = self.rawdata[j:j+1]
|
||||||
if not c:
|
if not c:
|
||||||
return -1
|
return -1
|
||||||
if c in "'\"":
|
if c in "'\"":
|
||||||
|
@ -383,7 +387,8 @@ class ParserBase:
|
||||||
return name.lower(), m.end()
|
return name.lower(), m.end()
|
||||||
else:
|
else:
|
||||||
self.updatepos(declstartpos, i)
|
self.updatepos(declstartpos, i)
|
||||||
self.error("expected name token at %r" % rawdata[declstartpos : declstartpos + 20])
|
self.error("expected name token at %r"
|
||||||
|
% rawdata[declstartpos:declstartpos+20])
|
||||||
|
|
||||||
# To be overridden -- handlers for unknown objects
|
# To be overridden -- handlers for unknown objects
|
||||||
def unknown_decl(self, data):
|
def unknown_decl(self, data):
|
|
@@ -0,0 +1,4 @@
+srctype = cpython
+type = module
+version = 3.3.3
+depends = re-pcre
@@ -0,0 +1,19 @@
+import sys
+# Remove current dir from sys.path, otherwise setuptools will peek up our
+# module instead of system.
+sys.path.pop(0)
+from setuptools import setup
+
+
+setup(name='micropython-_markupbase',
+      version='3.3.3',
+      description='CPython _markupbase module ported to MicroPython',
+      long_description='This is a module ported from CPython standard library to be compatible with\nMicroPython interpreter. Usually, this means applying small patches for\nfeatures not supported (yet, or at all) in MicroPython. Sometimes, heavier\nchanges are required. Note that CPython modules are written with availability\nof vast resources in mind, and may not work for MicroPython ports with\nlimited heap. If you are affected by such a case, please help reimplement\nthe module from scratch.',
+      url='https://github.com/micropython/micropython/issues/405',
+      author='CPython Developers',
+      author_email='python-dev@python.org',
+      maintainer='MicroPython Developers',
+      maintainer_email='micro-python@googlegroups.com',
+      license='Python',
+      py_modules=['_markupbase'],
+      install_requires=['micropython-re-pcre'])
@@ -1,6 +1,2 @@
-class ABC:
-    pass
-
-
 def abstractmethod(f):
     return f
@@ -0,0 +1,3 @@
+srctype=dummy
+type=module
+version=0.0.0
@@ -0,0 +1,18 @@
+import sys
+# Remove current dir from sys.path, otherwise setuptools will peek up our
+# module instead of system.
+sys.path.pop(0)
+from setuptools import setup
+
+
+setup(name='micropython-abc',
+      version='0.0.0',
+      description='Dummy abc module for MicroPython',
+      long_description='This is a dummy implementation of a module for MicroPython standard library.\nIt contains zero or very little functionality, and primarily intended to\navoid import errors (using idea that even if an application imports a\nmodule, it may be not using it on every code path, so may work at least\npartially). It is expected that more complete implementation of the module\nwill be provided later. Please help with the development if you are\ninterested in this module.',
+      url='https://github.com/micropython/micropython/issues/405',
+      author='MicroPython Developers',
+      author_email='micro-python@googlegroups.com',
+      maintainer='MicroPython Developers',
+      maintainer_email='micro-python@googlegroups.com',
+      license='MIT',
+      py_modules=['abc'])
@@ -3,7 +3,7 @@ Minimal and functional version of CPython's argparse module.
 """

 import sys
-from collections import namedtuple
+from ucollections import namedtuple


 class _ArgError(BaseException):

@@ -104,16 +104,8 @@ class ArgumentParser:
         if not args:
             args = [dest]
         list.append(
-            _Arg(
-                args,
-                dest,
-                action,
-                kwargs.get("nargs", None),
-                const,
-                default,
-                kwargs.get("help", ""),
-            )
-        )
+            _Arg(args, dest, action, kwargs.get("nargs", None),
+                 const, default, kwargs.get("help", "")))

     def usage(self, full):
         # print short usage

@@ -129,9 +121,8 @@ class ArgumentParser:
                 return " %s%s" % (arg.dest, arg.nargs)
             else:
                 return ""
-
         for opt in self.opt:
-            print(" [%s%s]" % (", ".join(opt.names), render_arg(opt)), end="")
+            print(" [%s%s]" % (', '.join(opt.names), render_arg(opt)), end="")
         for pos in self.pos:
             print(render_arg(pos), end="")
         print()

@@ -150,27 +141,21 @@ class ArgumentParser:
         print("\noptional args:")
         print(" -h, --help show this message and exit")
         for opt in self.opt:
-            print(" %-16s%s" % (", ".join(opt.names) + render_arg(opt), opt.help))
+            print(" %-16s%s" % (', '.join(opt.names) + render_arg(opt), opt.help))

     def parse_args(self, args=None):
-        return self._parse_args_impl(args, False)
-
-    def parse_known_args(self, args=None):
-        return self._parse_args_impl(args, True)
-
-    def _parse_args_impl(self, args, return_unknown):
         if args is None:
             args = sys.argv[1:]
         else:
             args = args[:]
         try:
-            return self._parse_args(args, return_unknown)
+            return self._parse_args(args)
         except _ArgError as e:
             self.usage(False)
             print("error:", e)
             sys.exit(2)

-    def _parse_args(self, args, return_unknown):
+    def _parse_args(self, args):
         # add optional args with defaults
         arg_dest = []
         arg_vals = []

@@ -178,13 +163,6 @@ class ArgumentParser:
             arg_dest.append(opt.dest)
             arg_vals.append(opt.default)

-        # deal with unknown arguments, if needed
-        unknown = []
-
-        def consume_unknown():
-            while args and not args[0].startswith("-"):
-                unknown.append(args.pop(0))
-
         # parse all args
         parsed_pos = False
         while args or not parsed_pos:

@@ -201,26 +179,15 @@ class ArgumentParser:
                         found = True
                         break
                 if not found:
-                    if return_unknown:
-                        unknown.append(a)
-                        consume_unknown()
-                    else:
-                        raise _ArgError("unknown option %s" % a)
+                    raise _ArgError("unknown option %s" % a)
             else:
                 # positional arg
                 if parsed_pos:
-                    if return_unknown:
-                        unknown = unknown + args
-                        break
-                    else:
-                        raise _ArgError("extra args: %s" % " ".join(args))
+                    raise _ArgError("extra args: %s" % " ".join(args))
                 for pos in self.pos:
                     arg_dest.append(pos.dest)
                     arg_vals.append(pos.parse(pos.names[0], args))
                 parsed_pos = True
-        if return_unknown:
-            consume_unknown()

         # build and return named tuple with arg values
-        values = namedtuple("args", arg_dest)(*arg_vals)
-        return (values, unknown) if return_unknown else values
+        return namedtuple("args", arg_dest)(*arg_vals)
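The hunks above remove `parse_known_args` support, folding the parser back to a plain `parse_args`. For reference, the semantics being removed match CPython's API of the same name; a minimal sketch using CPython's own stdlib `argparse` (an assumption: run under CPython, not MicroPython):

```python
# Sketch of parse_known_args() semantics (CPython stdlib argparse; the
# master-side code above implements the same subset).
import argparse

parser = argparse.ArgumentParser()
parser.add_argument("a", nargs=2)
parser.add_argument("-b")

# Unrecognized tokens are returned as a second value instead of raising.
args, rest = parser.parse_known_args(["x", "y", "-b", "2", "--unknown"])

assert args.a == ["x", "y"] and args.b == "2"
assert rest == ["--unknown"]
```

With plain `parse_args`, the same command line would instead exit with an "unrecognized arguments" error.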
@@ -0,0 +1,4 @@
+srctype = micropython-lib
+type = module
+version = 0.3.2
+author = Damien George
@@ -0,0 +1,18 @@
+import sys
+# Remove current dir from sys.path, otherwise setuptools will peek up our
+# module instead of system.
+sys.path.pop(0)
+from setuptools import setup
+
+
+setup(name='micropython-argparse',
+      version='0.3.2',
+      description='argparse module for MicroPython',
+      long_description="This is a module reimplemented specifically for MicroPython standard library,\nwith efficient and lean design in mind. Note that this module is likely work\nin progress and likely supports just a subset of CPython's corresponding\nmodule. Please help with the development if you are interested in this\nmodule.",
+      url='https://github.com/micropython/micropython/issues/405',
+      author='Damien George',
+      author_email='micro-python@googlegroups.com',
+      maintainer='MicroPython Developers',
+      maintainer_email='micro-python@googlegroups.com',
+      license='MIT',
+      py_modules=['argparse'])
@@ -44,25 +44,3 @@ args = parser.parse_args(["a", "b"])
 assert args.files1 == ["a", "b"] and args.files2 == []
 args = parser.parse_args(["a", "b", "c"])
 assert args.files1 == ["a", "b"] and args.files2 == ["c"]
-
-parser = argparse.ArgumentParser()
-parser.add_argument("a", nargs=2)
-parser.add_argument("-b")
-args, rest = parser.parse_known_args(["a", "b", "-b", "2"])
-assert args.a == ["a", "b"] and args.b == "2"
-assert rest == []
-args, rest = parser.parse_known_args(["-b", "2", "a", "b", "c"])
-assert args.a == ["a", "b"] and args.b == "2"
-assert rest == ["c"]
-args, rest = parser.parse_known_args(["a", "b", "-b", "2", "c"])
-assert args.a == ["a", "b"] and args.b == "2"
-assert rest == ["c"]
-args, rest = parser.parse_known_args(["-b", "2", "a", "b", "-", "c"])
-assert args.a == ["a", "b"] and args.b == "2"
-assert rest == ["-", "c"]
-args, rest = parser.parse_known_args(["a", "b", "-b", "2", "-", "x", "y"])
-assert args.a == ["a", "b"] and args.b == "2"
-assert rest == ["-", "x", "y"]
-args, rest = parser.parse_known_args(["a", "b", "c", "-b", "2", "--x", "5", "1"])
-assert args.a == ["a", "b"] and args.b == "2"
-assert rest == ["c", "--x", "5", "1"]
@@ -0,0 +1,151 @@
+import time
+import logging
+
+
+log = logging.getLogger("asyncio")
+
+
+# Workaround for not being able to subclass builtin types
+class LoopStop(Exception):
+    pass
+
+class InvalidStateError(Exception):
+    pass
+
+# Object not matching any other object
+_sentinel = []
+
+
+class EventLoop:
+
+    def __init__(self):
+        self.q = []
+
+    def call_soon(self, c, *args):
+        self.q.append((c, args))
+
+    def call_later(self, delay, c, *args):
+        def _delayed(c, args, delay):
+            yield from sleep(delay)
+            self.call_soon(c, *args)
+        Task(_delayed(c, args, delay))
+
+    def run_forever(self):
+        while self.q:
+            c = self.q.pop(0)
+            try:
+                c[0](*c[1])
+            except LoopStop:
+                return
+        # I mean, forever
+        while True:
+            time.sleep(1)
+
+    def stop(self):
+        def _cb():
+            raise LoopStop
+        self.call_soon(_cb)
+
+    def run_until_complete(self, coro):
+        t = async(coro)
+        t.add_done_callback(lambda a: self.stop())
+        self.run_forever()
+
+    def close(self):
+        pass
+
+
+_def_event_loop = EventLoop()
+
+
+class Future:
+
+    def __init__(self, loop=_def_event_loop):
+        self.loop = loop
+        self.res = _sentinel
+        self.cbs = []
+
+    def result(self):
+        if self.res is _sentinel:
+            raise InvalidStateError
+        return self.res
+
+    def add_done_callback(self, fn):
+        if self.res is _sentinel:
+            self.cbs.append(fn)
+        else:
+            self.loop.call_soon(fn, self)
+
+    def set_result(self, val):
+        self.res = val
+        for f in self.cbs:
+            f(self)
+
+
+class Task(Future):
+
+    def __init__(self, coro, loop=_def_event_loop):
+        super().__init__()
+        self.loop = loop
+        self.c = coro
+        # upstream asyncio forces task to be scheduled on instantiation
+        self.loop.call_soon(self)
+
+    def __call__(self):
+        try:
+            next(self.c)
+            self.loop.call_soon(self)
+        except StopIteration as e:
+            log.debug("Coro finished: %s", self.c)
+            self.set_result(None)
+
+
+def get_event_loop():
+    return _def_event_loop
+
+
+# Decorator
+def coroutine(f):
+    return f
+
+
+def async(coro):
+    if isinstance(coro, Future):
+        return coro
+    return Task(coro)
+
+
+class _Wait(Future):
+
+    def __init__(self, n):
+        Future.__init__(self)
+        self.n = n
+
+    def _done(self):
+        self.n -= 1
+        log.debug("Wait: remaining tasks: %d", self.n)
+        if not self.n:
+            self.set_result(None)
+
+    def __call__(self):
+        pass
+
+
+def wait(coro_list, loop=_def_event_loop):
+
+    w = _Wait(len(coro_list))
+
+    for c in coro_list:
+        t = async(c)
+        t.add_done_callback(lambda val: w._done())
+
+    return w
+
+
+def sleep(secs):
+    t = time.time()
+    log.debug("Started sleep at: %s, targetting: %s", t, t + secs)
+    while time.time() < t + secs:
+        time.sleep(0.01)
+        yield
+    log.debug("Finished sleeping %ss", secs)
@@ -0,0 +1,18 @@
+#https://docs.python.org/3.4/library/asyncio-task.html#example-chain-coroutines
+#import asyncio
+import asyncio_slow as asyncio
+
+@asyncio.coroutine
+def compute(x, y):
+    print("Compute %s + %s ..." % (x, y))
+    yield from asyncio.sleep(1.0)
+    return x + y
+
+@asyncio.coroutine
+def print_sum(x, y):
+    result = yield from compute(x, y)
+    print("%s + %s = %s" % (x, y, result))
+
+loop = asyncio.get_event_loop()
+loop.run_until_complete(print_sum(1, 2))
+loop.close()
@@ -0,0 +1,15 @@
+#https://docs.python.org/3.4/library/asyncio-task.html#example-chain-coroutines
+#import asyncio
+import asyncio_slow as asyncio
+
+@asyncio.coroutine
+def slow_operation(future):
+    yield from asyncio.sleep(1)
+    future.set_result('Future is done!')
+
+loop = asyncio.get_event_loop()
+future = asyncio.Future()
+asyncio.Task(slow_operation(future))
+loop.run_until_complete(future)
+print(future.result())
+loop.close()
@@ -0,0 +1,21 @@
+#https://docs.python.org/3.4/library/asyncio-task.html#example-future-with-run-forever
+#import asyncio
+import asyncio_slow as asyncio
+
+@asyncio.coroutine
+def slow_operation(future):
+    yield from asyncio.sleep(1)
+    future.set_result('Future is done!')
+
+def got_result(future):
+    print(future.result())
+    loop.stop()
+
+loop = asyncio.get_event_loop()
+future = asyncio.Future()
+asyncio.Task(slow_operation(future))
+future.add_done_callback(got_result)
+try:
+    loop.run_forever()
+finally:
+    loop.close()
@@ -0,0 +1,12 @@
+#https://docs.python.org/3.4/library/asyncio-task.html#example-hello-world-coroutine
+#import asyncio
+import asyncio_slow as asyncio
+
+@asyncio.coroutine
+def greet_every_two_seconds():
+    while True:
+        print('Hello World')
+        yield from asyncio.sleep(2)
+
+loop = asyncio.get_event_loop()
+loop.run_until_complete(greet_every_two_seconds())
@@ -0,0 +1,12 @@
+#import asyncio
+import asyncio_slow as asyncio
+
+@asyncio.coroutine
+def greet_every_two_seconds():
+    while True:
+        print('Hello World')
+        yield from asyncio.sleep(2)
+
+loop = asyncio.get_event_loop()
+asyncio.Task(greet_every_two_seconds())
+loop.run_forever()
@@ -0,0 +1,11 @@
+# https://docs.python.org/3.4/library/asyncio-eventloop.html#example-hello-world-callback
+#import asyncio
+import asyncio_slow as asyncio
+
+def print_and_repeat(loop):
+    print('Hello World')
+    loop.call_later(2, print_and_repeat, loop)
+
+loop = asyncio.get_event_loop()
+loop.call_soon(print_and_repeat, loop)
+loop.run_forever()
@@ -0,0 +1,21 @@
+#https://docs.python.org/3.4/library/asyncio-task.html#example-parallel-execution-of-tasks
+#import asyncio
+import asyncio_slow as asyncio
+
+@asyncio.coroutine
+def factorial(name, number):
+    f = 1
+    for i in range(2, number+1):
+        print("Task %s: Compute factorial(%s)..." % (name, i))
+        yield from asyncio.sleep(1)
+        f *= i
+    print("Task %s: factorial(%s) = %s" % (name, number, f))
+
+tasks = [
+    asyncio.Task(factorial("A", 2)),
+    asyncio.Task(factorial("B", 3)),
+    asyncio.Task(factorial("C", 4))]
+
+loop = asyncio.get_event_loop()
+loop.run_until_complete(asyncio.wait(tasks))
+loop.close()
@ -13,67 +13,38 @@ import binascii

__all__ = [
    # Legacy interface exports traditional RFC 1521 Base64 encodings
    "encode",
    "decode",
    "encodebytes",
    "decodebytes",
    # Generalized interface for other encodings
    "b64encode",
    "b64decode",
    "b32encode",
    "b32decode",
    "b16encode",
    "b16decode",
    # Standard Base64 encoding
    "standard_b64encode",
    "standard_b64decode",
    # Some common Base64 alternatives.  As referenced by RFC 3458, see thread
    # starting at:
    #
    # http://zgp.org/pipermail/p2p-hackers/2001-September/000316.html
    "urlsafe_b64encode",
    "urlsafe_b64decode",
]


bytes_types = (bytes, bytearray)  # Types acceptable as binary data


def _bytes_from_decode_data(s):
    if isinstance(s, str):
        try:
            return s.encode("ascii")
        # except UnicodeEncodeError:
        except:
            raise ValueError("string argument should contain only ASCII characters")
    elif isinstance(s, bytes_types):
        return s
    else:
        raise TypeError("argument should be bytes or ASCII string, not %s" % s.__class__.__name__)


def _maketrans(f, t):
    """Re-implement bytes.maketrans() as there is no such function in micropython"""
    if len(f) != len(t):
        raise ValueError("maketrans arguments must have same length")
    translation_table = dict(zip(f, t))
    return translation_table


def _translate(input_bytes, trans_table):
    """Re-implement bytes.translate() as there is no such function in micropython"""
    result = bytearray()

    for byte in input_bytes:
        translated_byte = trans_table.get(byte, byte)
        result.append(translated_byte)

    return bytes(result)


# Base64 encoding/decoding uses binascii


def b64encode(s, altchars=None):
    """Encode a byte string using Base64.

@ -90,9 +61,10 @@ def b64encode(s, altchars=None):
    encoded = binascii.b2a_base64(s)[:-1]
    if altchars is not None:
        if not isinstance(altchars, bytes_types):
            raise TypeError("expected bytes, not %s" % altchars.__class__.__name__)
        assert len(altchars) == 2, repr(altchars)
        encoded = _translate(encoded, _maketrans(b"+/", altchars))
    return encoded

@ -114,9 +86,9 @@ def b64decode(s, altchars=None, validate=False):
    if altchars is not None:
        altchars = _bytes_from_decode_data(altchars)
        assert len(altchars) == 2, repr(altchars)
        s = _translate(s, _maketrans(altchars, b"+/"))
    if validate and not re.match(b"^[A-Za-z0-9+/]*=*$", s):
        raise binascii.Error("Non-base64 digit found")
    return binascii.a2b_base64(s)

@ -127,7 +99,6 @@ def standard_b64encode(s):
    """
    return b64encode(s)


def standard_b64decode(s):
    """Decode a byte string encoded with the standard Base64 alphabet.

@ -139,9 +110,8 @@ def standard_b64decode(s):
    return b64decode(s)


# _urlsafe_encode_translation = _maketrans(b'+/', b'-_')
# _urlsafe_decode_translation = _maketrans(b'-_', b'+/')


def urlsafe_b64encode(s):
    """Encode a byte string using a url-safe Base64 alphabet.

@ -150,9 +120,8 @@ def urlsafe_b64encode(s):
    returned.  The alphabet uses '-' instead of '+' and '_' instead of
    '/'.
    """
    # return b64encode(s).translate(_urlsafe_encode_translation)
    return b64encode(s, b"-_").rstrip(b"\n")


def urlsafe_b64decode(s):
    """Decode a byte string encoded with the standard Base64 alphabet.

@ -164,47 +133,25 @@ def urlsafe_b64decode(s):
    The alphabet uses '-' instead of '+' and '_' instead of '/'.
    """
    # s = _bytes_from_decode_data(s)
    # s = s.translate(_urlsafe_decode_translation)
    # return b64decode(s)
    raise NotImplementedError()


# Base32 encoding/decoding must be done in Python
_b32alphabet = {
    0: b"A",
    9: b"J",
    18: b"S",
    27: b"3",
    1: b"B",
    10: b"K",
    19: b"T",
    28: b"4",
    2: b"C",
    11: b"L",
    20: b"U",
    29: b"5",
    3: b"D",
    12: b"M",
    21: b"V",
    30: b"6",
    4: b"E",
    13: b"N",
    22: b"W",
    31: b"7",
    5: b"F",
    14: b"O",
    23: b"X",
    6: b"G",
    15: b"P",
    24: b"Y",
    7: b"H",
    16: b"Q",
    25: b"Z",
    8: b"I",
    17: b"R",
    26: b"2",
}

_b32tab = [v[0] for k, v in sorted(_b32alphabet.items())]
_b32rev = dict([(v[0], k) for k, v in _b32alphabet.items()])

@ -229,30 +176,27 @@ def b32encode(s):
        # leftover bit of c1 and tack it onto c2.  Then we take the 2 leftover
        # bits of c2 and tack them onto c3.  The shifts and masks are intended
        # to give us values of exactly 5 bits in width.
        c1, c2, c3 = struct.unpack("!HHB", s[i * 5 : (i + 1) * 5])
        c2 += (c1 & 1) << 16  # 17 bits wide
        c3 += (c2 & 3) << 8  # 10 bits wide
        encoded += bytes(
            [
                _b32tab[c1 >> 11],  # bits 1 - 5
                _b32tab[(c1 >> 6) & 0x1F],  # bits 6 - 10
                _b32tab[(c1 >> 1) & 0x1F],  # bits 11 - 15
                _b32tab[c2 >> 12],  # bits 16 - 20 (1 - 5)
                _b32tab[(c2 >> 7) & 0x1F],  # bits 21 - 25 (6 - 10)
                _b32tab[(c2 >> 2) & 0x1F],  # bits 26 - 30 (11 - 15)
                _b32tab[c3 >> 5],  # bits 31 - 35 (1 - 5)
                _b32tab[c3 & 0x1F],  # bits 36 - 40 (1 - 5)
            ]
        )
    # Adjust for any leftover partial quanta
    if leftover == 1:
        encoded = encoded[:-6] + b"======"
    elif leftover == 2:
        encoded = encoded[:-4] + b"===="
    elif leftover == 3:
        encoded = encoded[:-3] + b"==="
    elif leftover == 4:
        encoded = encoded[:-1] + b"="
    return bytes(encoded)

@ -278,20 +222,20 @@ def b32decode(s, casefold=False, map01=None):
    s = _bytes_from_decode_data(s)
    quanta, leftover = divmod(len(s), 8)
    if leftover:
        raise binascii.Error("Incorrect padding")
    # Handle section 2.4 zero and one mapping.  The flag map01 will be either
    # False, or the character to map the digit 1 (one) to.  It should be
    # either L (el) or I (eye).
    if map01 is not None:
        map01 = _bytes_from_decode_data(map01)
        assert len(map01) == 1, repr(map01)
        s = _translate(s, _maketrans(b"01", b"O" + map01))
    if casefold:
        s = s.upper()
    # Strip off pad characters from the right.  We need to count the pad
    # characters because this will tell us how many null bytes to remove from
    # the end of the decoded string.
    padchars = s.find(b"=")
    if padchars > 0:
        padchars = len(s) - padchars
        s = s[:-padchars]

@ -305,17 +249,17 @@ def b32decode(s, casefold=False, map01=None):
    for c in s:
        val = _b32rev.get(c)
        if val is None:
            raise binascii.Error("Non-base32 digit found")
        acc += _b32rev[c] << shift
        shift -= 5
        if shift < 0:
            parts.append(binascii.unhexlify(bytes("%010x" % acc, "ascii")))
            acc = 0
            shift = 35
    # Process the last, partial quanta
    last = binascii.unhexlify(bytes("%010x" % acc, "ascii"))
    if padchars == 0:
        last = b""  # No characters
    elif padchars == 1:
        last = last[:-1]
    elif padchars == 3:

@ -325,9 +269,10 @@ def b32decode(s, casefold=False, map01=None):
    elif padchars == 6:
        last = last[:-4]
    else:
        raise binascii.Error("Incorrect padding")
    parts.append(last)
    return b"".join(parts)


# RFC 3548, Base 16 Alphabet specifies uppercase, but hexlify() returns

@ -357,18 +302,18 @@ def b16decode(s, casefold=False):
    s = _bytes_from_decode_data(s)
    if casefold:
        s = s.upper()
    if re.search(b"[^0-9A-F]", s):
        raise binascii.Error("Non-base16 digit found")
    return binascii.unhexlify(s)


# Legacy interface.  This code could be cleaned up since I don't believe
# binascii has any line length limitations.  It just doesn't seem worth it
# though.  The files should be opened in binary mode.

MAXLINESIZE = 76  # Excluding the CRLF
MAXBINSIZE = (MAXLINESIZE // 4) * 3


def encode(input, output):
    """Encode a file; input and output are binary files."""

@ -377,7 +322,7 @@ def encode(input, output):
        if not s:
            break
        while len(s) < MAXBINSIZE:
            ns = input.read(MAXBINSIZE - len(s))
            if not ns:
                break
            s += ns

@ -406,12 +351,11 @@ def encodebytes(s):
        pieces.append(binascii.b2a_base64(chunk))
    return b"".join(pieces)


def encodestring(s):
    """Legacy alias of encodebytes()."""
    import warnings

    warnings.warn("encodestring() is a deprecated alias, use encodebytes()", DeprecationWarning, 2)
    return encodebytes(s)

@ -421,12 +365,11 @@ def decodebytes(s):
        raise TypeError("expected bytes, not %s" % s.__class__.__name__)
    return binascii.a2b_base64(s)


def decodestring(s):
    """Legacy alias of decodebytes()."""
    import warnings

    warnings.warn("decodestring() is a deprecated alias, use decodebytes()", DeprecationWarning, 2)
    return decodebytes(s)

@ -434,33 +377,24 @@ def decodestring(s):
def main():
    """Small main program"""
    import sys, getopt

    try:
        opts, args = getopt.getopt(sys.argv[1:], "deut")
    except getopt.error as msg:
        sys.stdout = sys.stderr
        print(msg)
        print(
            """usage: %s [-d|-e|-u|-t] [file|-]
        -d, -u: decode
        -e: encode (default)
        -t: encode and decode string 'Aladdin:open sesame'"""
            % sys.argv[0]
        )
        sys.exit(2)
    func = encode
    for o, a in opts:
        if o == "-e":
            func = encode
        if o == "-d":
            func = decode
        if o == "-u":
            func = decode
        if o == "-t":
            test()
            return
    if args and args[0] != "-":
        with open(args[0], "rb") as f:
            func(f, sys.stdout.buffer)
    else:
        func(sys.stdin.buffer, sys.stdout.buffer)

@ -476,5 +410,5 @@ def test():
    assert s0 == s2


if __name__ == "__main__":
    main()
@ -0,0 +1,4 @@
srctype=cpython
type=module
version=3.3.3-2
depends = struct
@ -0,0 +1,19 @@
import sys
# Remove current dir from sys.path, otherwise setuptools will peek up our
# module instead of system.
sys.path.pop(0)
from setuptools import setup


setup(name='micropython-base64',
      version='3.3.3-2',
      description='CPython base64 module ported to MicroPython',
      long_description='This is a module ported from CPython standard library to be compatible with\nMicroPython interpreter. Usually, this means applying small patches for\nfeatures not supported (yet, or at all) in MicroPython. Sometimes, heavier\nchanges are required. Note that CPython modules are written with availability\nof vast resources in mind, and may not work for MicroPython ports with\nlimited heap. If you are affected by such a case, please help reimplement\nthe module from scratch.',
      url='https://github.com/micropython/micropython/issues/405',
      author='CPython Developers',
      author_email='python-dev@python.org',
      maintainer='MicroPython Developers',
      maintainer_email='micro-python@googlegroups.com',
      license='Python',
      py_modules=['base64'],
      install_requires=['micropython-struct'])
@ -1,22 +1,22 @@
import base64

b = base64.b64encode(b"zlutoucky kun upel dabelske ody")
print(b)

if b != b"emx1dG91Y2t5IGt1biB1cGVsIGRhYmVsc2tlIG9keQ==":
    raise Exception("Error")

d = base64.b64decode(b)
print(d)

if d != b"zlutoucky kun upel dabelske ody":
    raise Exception("Error")

base64.test()

binary = b"\x99\x10\xaa"
b = base64.b64encode(binary)
if b != b"mRCq":
    raise Exception("Error")

d = base64.b64decode(b)
@ -24,13 +24,13 @@ print(d)
if d != binary:
    raise Exception("Error")

d = base64.b32encode(b"zlutoucky kun upel dabelske ody")
if d != b"PJWHK5DPOVRWW6JANN2W4IDVOBSWYIDEMFRGK3DTNNSSA33EPE======":
    raise Exception("Error")

print(d)
b = base64.b32decode(d)
if b != b"zlutoucky kun upel dabelske ody":
    raise Exception("Error")

print("OK")
@ -0,0 +1,113 @@
from ubinascii import *

if not "unhexlify" in globals():
    def unhexlify(data):
        if len(data) % 2 != 0:
            raise ValueError("Odd-length string")

        return bytes([ int(data[i:i+2], 16) for i in range(0, len(data), 2) ])

b2a_hex = hexlify
a2b_hex = unhexlify

# ____________________________________________________________

PAD = '='

table_a2b_base64 = [
    -1,-1,-1,-1, -1,-1,-1,-1, -1,-1,-1,-1, -1,-1,-1,-1,
    -1,-1,-1,-1, -1,-1,-1,-1, -1,-1,-1,-1, -1,-1,-1,-1,
    -1,-1,-1,-1, -1,-1,-1,-1, -1,-1,-1,62, -1,-1,-1,63,
    52,53,54,55, 56,57,58,59, 60,61,-1,-1, -1,-1,-1,-1, # Note PAD->-1 here
    -1, 0, 1, 2,  3, 4, 5, 6,  7, 8, 9,10, 11,12,13,14,
    15,16,17,18, 19,20,21,22, 23,24,25,-1, -1,-1,-1,-1,
    -1,26,27,28, 29,30,31,32, 33,34,35,36, 37,38,39,40,
    41,42,43,44, 45,46,47,48, 49,50,51,-1, -1,-1,-1,-1,
    -1,-1,-1,-1, -1,-1,-1,-1, -1,-1,-1,-1, -1,-1,-1,-1,
    -1,-1,-1,-1, -1,-1,-1,-1, -1,-1,-1,-1, -1,-1,-1,-1,
    -1,-1,-1,-1, -1,-1,-1,-1, -1,-1,-1,-1, -1,-1,-1,-1,
    -1,-1,-1,-1, -1,-1,-1,-1, -1,-1,-1,-1, -1,-1,-1,-1,
    -1,-1,-1,-1, -1,-1,-1,-1, -1,-1,-1,-1, -1,-1,-1,-1,
    -1,-1,-1,-1, -1,-1,-1,-1, -1,-1,-1,-1, -1,-1,-1,-1,
    -1,-1,-1,-1, -1,-1,-1,-1, -1,-1,-1,-1, -1,-1,-1,-1,
    -1,-1,-1,-1, -1,-1,-1,-1, -1,-1,-1,-1, -1,-1,-1,-1,
]

def _transform(n):
    if n == -1:
        return '\xff'
    else:
        return chr(n)
table_a2b_base64 = ''.join(map(_transform, table_a2b_base64))
assert len(table_a2b_base64) == 256

def a2b_base64(ascii):
    "Decode a line of base64 data."

    res = []
    quad_pos = 0
    leftchar = 0
    leftbits = 0
    last_char_was_a_pad = False

    for c in ascii:
        c = chr(c)
        if c == PAD:
            if quad_pos > 2 or (quad_pos == 2 and last_char_was_a_pad):
                break      # stop on 'xxx=' or on 'xx=='
            last_char_was_a_pad = True
        else:
            n = ord(table_a2b_base64[ord(c)])
            if n == 0xff:
                continue    # ignore strange characters
            #
            # Shift it in on the low end, and see if there's
            # a byte ready for output.
            quad_pos = (quad_pos + 1) & 3
            leftchar = (leftchar << 6) | n
            leftbits += 6
            #
            if leftbits >= 8:
                leftbits -= 8
                res.append((leftchar >> leftbits).to_bytes(1))
                leftchar &= ((1 << leftbits) - 1)
            #
            last_char_was_a_pad = False
    else:
        if leftbits != 0:
            raise Exception("Incorrect padding")

    return b''.join(res)

# ____________________________________________________________

table_b2a_base64 = (
    "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/")

def b2a_base64(bin):
    "Base64-code line of data."

    newlength = (len(bin) + 2) // 3
    newlength = newlength * 4 + 1
    res = []

    leftchar = 0
    leftbits = 0
    for c in bin:
        # Shift into our buffer, and output any 6bits ready
        leftchar = (leftchar << 8) | c
        leftbits += 8
        res.append(table_b2a_base64[(leftchar >> (leftbits-6)) & 0x3f])
        leftbits -= 6
        if leftbits >= 6:
            res.append(table_b2a_base64[(leftchar >> (leftbits-6)) & 0x3f])
            leftbits -= 6
    #
    if leftbits == 2:
        res.append(table_b2a_base64[(leftchar & 3) << 4])
        res.append(PAD)
        res.append(PAD)
    elif leftbits == 4:
        res.append(table_b2a_base64[(leftchar & 0xf) << 2])
        res.append(PAD)
    res.append('\n')
    return ''.join(res).encode('ascii')
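The table-driven codec above is meant to match the `binascii` contract. That contract can be sanity-checked against CPython's built-in `binascii` (used here because `ubinascii` exists only on MicroPython), with the same test string the package's test file uses:

```python
import binascii

data = b"zlutoucky kun upel dabelske ody"

# b2a_base64 appends a trailing newline, exactly as the pure-Python port does
encoded = binascii.b2a_base64(data)
assert encoded == b"emx1dG91Y2t5IGt1biB1cGVsIGRhYmVsc2tlIG9keQ==\n"

# a2b_base64 ignores the newline and restores the original bytes
assert binascii.a2b_base64(encoded) == data
```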
@ -0,0 +1,3 @@
srctype=pypy
type=module
version=2.4.0-3
@ -0,0 +1,18 @@
import sys
# Remove current dir from sys.path, otherwise setuptools will peek up our
# module instead of system.
sys.path.pop(0)
from setuptools import setup


setup(name='micropython-binascii',
      version='2.4.0-3',
      description='PyPy binascii module ported to MicroPython',
      long_description='This is a module ported from PyPy standard library to be compatible with\nMicroPython interpreter. Usually, this means applying small patches for\nfeatures not supported (yet, or at all) in MicroPython. Sometimes, heavier\nchanges are required. Note that CPython modules are written with availability\nof vast resources in mind, and may not work for MicroPython ports with\nlimited heap. If you are affected by such a case, please help reimplement\nthe module from scratch.',
      url='https://github.com/micropython/micropython/issues/405',
      author='PyPy Developers',
      author_email='pypy-dev@python.org',
      maintainer='MicroPython Developers',
      maintainer_email='micro-python@googlegroups.com',
      license='MIT',
      py_modules=['binascii'])
@ -0,0 +1,21 @@
from binascii import hexlify, unhexlify
import utime

data = b'zlutoucky kun upel dabelske ody'
h = hexlify(data)

if h != b'7a6c75746f75636b79206b756e207570656c20646162656c736b65206f6479':
    raise Exception("Error")

data2 = unhexlify(h)

if data2 != data:
    raise Exception("Error")

start = utime.time()
for x in range(100000):
    d = unhexlify(h)

print("100000 iterations in: " + str(utime.time() - start))

print("OK")
@ -0,0 +1,3 @@
srctype=dummy
type=module
version=0.0.1
@ -0,0 +1,18 @@
import sys
# Remove current dir from sys.path, otherwise setuptools will peek up our
# module instead of system.
sys.path.pop(0)
from setuptools import setup


setup(name='micropython-binhex',
      version='0.0.1',
      description='Dummy binhex module for MicroPython',
      long_description='This is a dummy implementation of a module for MicroPython standard library.\nIt contains zero or very little functionality, and primarily intended to\navoid import errors (using idea that even if an application imports a\nmodule, it may be not using it onevery code path, so may work at least\npartially). It is expected that more complete implementation of the module\nwill be provided later. Please help with the development if you are\ninterested in this module.',
      url='https://github.com/micropython/micropython/issues/405',
      author='MicroPython Developers',
      author_email='micro-python@googlegroups.com',
      maintainer='MicroPython Developers',
      maintainer_email='micro-python@googlegroups.com',
      license='MIT',
      py_modules=['binhex'])
@ -1,6 +1,5 @@
|
||||||
"""Bisection algorithms."""
|
"""Bisection algorithms."""
|
||||||
|
|
||||||
|
|
||||||
def insort_right(a, x, lo=0, hi=None):
|
def insort_right(a, x, lo=0, hi=None):
|
||||||
"""Insert item x in list a, and keep it sorted assuming a is sorted.
|
"""Insert item x in list a, and keep it sorted assuming a is sorted.
|
||||||
|
|
||||||
|
@ -11,21 +10,17 @@ def insort_right(a, x, lo=0, hi=None):
|
||||||
"""
|
"""
|
||||||
|
|
||||||
if lo < 0:
|
if lo < 0:
|
||||||
raise ValueError("lo must be non-negative")
|
raise ValueError('lo must be non-negative')
|
||||||
if hi is None:
|
if hi is None:
|
||||||
hi = len(a)
|
hi = len(a)
|
||||||
while lo < hi:
|
while lo < hi:
|
||||||
mid = (lo + hi) // 2
|
mid = (lo+hi)//2
|
||||||
if x < a[mid]:
|
if x < a[mid]: hi = mid
|
||||||
hi = mid
|
else: lo = mid+1
|
||||||
else:
|
|
||||||
lo = mid + 1
|
|
||||||
a.insert(lo, x)
|
a.insert(lo, x)
|
||||||
|
|
||||||
|
|
||||||
insort = insort_right # backward compatibility
|
insort = insort_right # backward compatibility
|
||||||
|
|
||||||
|
|
||||||
def bisect_right(a, x, lo=0, hi=None):
|
def bisect_right(a, x, lo=0, hi=None):
|
||||||
"""Return the index where to insert item x in list a, assuming a is sorted.
|
"""Return the index where to insert item x in list a, assuming a is sorted.
|
||||||
|
|
||||||
|
@@ -38,21 +33,17 @@ def bisect_right(a, x, lo=0, hi=None):
     """
 
     if lo < 0:
-        raise ValueError("lo must be non-negative")
+        raise ValueError('lo must be non-negative')
     if hi is None:
         hi = len(a)
     while lo < hi:
-        mid = (lo + hi) // 2
-        if x < a[mid]:
-            hi = mid
-        else:
-            lo = mid + 1
+        mid = (lo+hi)//2
+        if x < a[mid]: hi = mid
+        else: lo = mid+1
     return lo
 
 
 bisect = bisect_right # backward compatibility
 
 
 def insort_left(a, x, lo=0, hi=None):
     """Insert item x in list a, and keep it sorted assuming a is sorted.
 
@@ -63,15 +54,13 @@ def insort_left(a, x, lo=0, hi=None):
     """
 
     if lo < 0:
-        raise ValueError("lo must be non-negative")
+        raise ValueError('lo must be non-negative')
     if hi is None:
         hi = len(a)
     while lo < hi:
-        mid = (lo + hi) // 2
-        if a[mid] < x:
-            lo = mid + 1
-        else:
-            hi = mid
+        mid = (lo+hi)//2
+        if a[mid] < x: lo = mid+1
+        else: hi = mid
     a.insert(lo, x)
 
 
@@ -87,18 +76,15 @@ def bisect_left(a, x, lo=0, hi=None):
     """
 
     if lo < 0:
-        raise ValueError("lo must be non-negative")
+        raise ValueError('lo must be non-negative')
     if hi is None:
         hi = len(a)
     while lo < hi:
-        mid = (lo + hi) // 2
-        if a[mid] < x:
-            lo = mid + 1
-        else:
-            hi = mid
+        mid = (lo+hi)//2
+        if a[mid] < x: lo = mid+1
+        else: hi = mid
     return lo
 
 
 # Overwrite above definitions with a fast C implementation
 try:
     from _bisect import *
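The two formattings above (master's expanded style vs v1.8.2's compact one-liners) are behaviorally identical; a quick sanity check against CPython's stdlib `bisect`, which this file tracks:

```python
import bisect

a = [1, 3, 3, 5]
# bisect_right inserts to the right of equal entries, bisect_left to the left.
print(bisect.bisect_right(a, 3))  # -> 3
print(bisect.bisect_left(a, 3))   # -> 1
bisect.insort_left(a, 4)
print(a)  # -> [1, 3, 3, 4, 5]
```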
@@ -0,0 +1,23 @@
+import sys
+# Remove current dir from sys.path, otherwise distutils will peek up our
+# module instead of system.
+sys.path.pop(0)
+from setuptools import setup
+
+
+def desc_dummy(name):
+    return 'Dummy %s module to MicroPython' % name
+def desc_cpython(name):
+    return 'CPython %s module ported to MicroPython' % name
+
+NAME = 'bisect'
+
+setup(name='micropython-' + NAME,
+      version='0.5',
+      description=desc_cpython(NAME),
+      url='https://github.com/micropython/micropython/issues/405',
+      author='CPython Developers',
+      maintainer='MicroPython Developers',
+      maintainer_email='micro-python@googlegroups.com',
+      license='Python',
+      py_modules=[NAME])
@@ -0,0 +1,3 @@
+srctype=dummy
+type=module
+version=0.0.0
@@ -0,0 +1,18 @@
+import sys
+# Remove current dir from sys.path, otherwise setuptools will peek up our
+# module instead of system.
+sys.path.pop(0)
+from setuptools import setup
+
+
+setup(name='micropython-calendar',
+      version='0.0.0',
+      description='Dummy calendar module for MicroPython',
+      long_description='This is a dummy implementation of a module for MicroPython standard library.\nIt contains zero or very little functionality, and primarily intended to\navoid import errors (using idea that even if an application imports a\nmodule, it may be not using it onevery code path, so may work at least\npartially). It is expected that more complete implementation of the module\nwill be provided later. Please help with the development if you are\ninterested in this module.',
+      url='https://github.com/micropython/micropython/issues/405',
+      author='MicroPython Developers',
+      author_email='micro-python@googlegroups.com',
+      maintainer='MicroPython Developers',
+      maintainer_email='micro-python@googlegroups.com',
+      license='MIT',
+      py_modules=['calendar'])
@@ -25,6 +25,9 @@ written in Python.
 # responsible for its maintenance.
 #
 
+__version__ = "2.6"
+
+
 # Imports
 # =======
 
@@ -38,22 +41,11 @@ import html
 import locale
 import tempfile
 
-__all__ = [
-    "MiniFieldStorage",
-    "FieldStorage",
-    "parse",
-    "parse_qs",
-    "parse_qsl",
-    "parse_multipart",
-    "parse_header",
-    "print_exception",
-    "print_environ",
-    "print_form",
-    "print_directory",
-    "print_arguments",
-    "print_environ_usage",
-    "escape",
-]
+__all__ = ["MiniFieldStorage", "FieldStorage",
+           "parse", "parse_qs", "parse_qsl", "parse_multipart",
+           "parse_header", "print_exception", "print_environ",
+           "print_form", "print_directory", "print_arguments",
+           "print_environ_usage", "escape"]
 
 # Logging support
 # ===============
@@ -61,7 +53,6 @@ __all__ = [
 logfile = "" # Filename to log to, if not empty
 logfp = None # File object to log to, if not None
 
-
 def initlog(*allargs):
     """Write a log message, if there is a log file.
 
@@ -97,27 +88,23 @@ def initlog(*allargs):
     log = dolog
     log(*allargs)
 
-
 def dolog(fmt, *args):
     """Write a log message to the log file.  See initlog() for docs."""
-    logfp.write(fmt % args + "\n")
+    logfp.write(fmt%args + "\n")
 
-
 def nolog(*allargs):
     """Dummy function, assigned to log when logging is disabled."""
     pass
 
-
 def closelog():
     """Close the log file."""
     global log, logfile, logfp
-    logfile = ""
+    logfile = ''
     if logfp:
         logfp.close()
         logfp = None
     log = initlog
 
-
 log = initlog # The current logging function
 
 
@@ -128,7 +115,6 @@ log = initlog # The current logging function
 # 0 ==> unlimited input
 maxlen = 0
 
-
 def parse(fp=None, environ=os.environ, keep_blank_values=0, strict_parsing=0):
     """Parse a query in the environment or from a file (default stdin)
 
@@ -154,64 +140,62 @@ def parse(fp=None, environ=os.environ, keep_blank_values=0, strict_parsing=0):
 
     # field keys and values (except for files) are returned as strings
     # an encoding is required to decode the bytes read from self.fp
-    if hasattr(fp, "encoding"):
+    if hasattr(fp,'encoding'):
         encoding = fp.encoding
     else:
-        encoding = "latin-1"
+        encoding = 'latin-1'
 
     # fp.read() must return bytes
     if isinstance(fp, TextIOWrapper):
         fp = fp.buffer
 
-    if not "REQUEST_METHOD" in environ:
-        environ["REQUEST_METHOD"] = "GET" # For testing stand-alone
-    if environ["REQUEST_METHOD"] == "POST":
-        ctype, pdict = parse_header(environ["CONTENT_TYPE"])
-        if ctype == "multipart/form-data":
+    if not 'REQUEST_METHOD' in environ:
+        environ['REQUEST_METHOD'] = 'GET' # For testing stand-alone
+    if environ['REQUEST_METHOD'] == 'POST':
+        ctype, pdict = parse_header(environ['CONTENT_TYPE'])
+        if ctype == 'multipart/form-data':
             return parse_multipart(fp, pdict)
-        elif ctype == "application/x-www-form-urlencoded":
-            clength = int(environ["CONTENT_LENGTH"])
+        elif ctype == 'application/x-www-form-urlencoded':
+            clength = int(environ['CONTENT_LENGTH'])
             if maxlen and clength > maxlen:
-                raise ValueError("Maximum content length exceeded")
+                raise ValueError('Maximum content length exceeded')
             qs = fp.read(clength).decode(encoding)
         else:
-            qs = "" # Unknown content-type
-        if "QUERY_STRING" in environ:
-            if qs:
-                qs = qs + "&"
-            qs = qs + environ["QUERY_STRING"]
+            qs = '' # Unknown content-type
+        if 'QUERY_STRING' in environ:
+            if qs: qs = qs + '&'
+            qs = qs + environ['QUERY_STRING']
         elif sys.argv[1:]:
-            if qs:
-                qs = qs + "&"
+            if qs: qs = qs + '&'
             qs = qs + sys.argv[1]
-        environ["QUERY_STRING"] = qs # XXX Shouldn't, really
-    elif "QUERY_STRING" in environ:
-        qs = environ["QUERY_STRING"]
+        environ['QUERY_STRING'] = qs # XXX Shouldn't, really
+    elif 'QUERY_STRING' in environ:
+        qs = environ['QUERY_STRING']
     else:
         if sys.argv[1:]:
             qs = sys.argv[1]
         else:
             qs = ""
-        environ["QUERY_STRING"] = qs # XXX Shouldn't, really
-    return urllib.parse.parse_qs(qs, keep_blank_values, strict_parsing, encoding=encoding)
+        environ['QUERY_STRING'] = qs # XXX Shouldn't, really
+    return urllib.parse.parse_qs(qs, keep_blank_values, strict_parsing,
+                                 encoding=encoding)
 
 
 # parse query string function called from urlparse,
 # this is done in order to maintain backward compatiblity.
 
-
 def parse_qs(qs, keep_blank_values=0, strict_parsing=0):
     """Parse a query given as a string argument."""
-    warn("cgi.parse_qs is deprecated, use urllib.parse.parse_qs instead", DeprecationWarning, 2)
+    warn("cgi.parse_qs is deprecated, use urllib.parse.parse_qs instead",
+         DeprecationWarning, 2)
     return urllib.parse.parse_qs(qs, keep_blank_values, strict_parsing)
 
-
 def parse_qsl(qs, keep_blank_values=0, strict_parsing=0):
     """Parse a query given as a string argument."""
-    warn("cgi.parse_qsl is deprecated, use urllib.parse.parse_qsl instead", DeprecationWarning, 2)
+    warn("cgi.parse_qsl is deprecated, use urllib.parse.parse_qsl instead",
+         DeprecationWarning, 2)
     return urllib.parse.parse_qsl(qs, keep_blank_values, strict_parsing)
 
-
 def parse_multipart(fp, pdict):
     """Parse multipart input.
 
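Both `parse()` and the deprecated `parse_qs`/`parse_qsl` wrappers in this hunk delegate to `urllib.parse`; the delegated behavior in brief:

```python
from urllib.parse import parse_qs, parse_qsl

qs = "name=foo&name=bar&empty="
# Repeated keys collect into a list; blank values are dropped by default.
print(parse_qs(qs))                       # -> {'name': ['foo', 'bar']}
print(parse_qs(qs, keep_blank_values=1))  # -> {'name': ['foo', 'bar'], 'empty': ['']}
print(parse_qsl(qs))                      # -> [('name', 'foo'), ('name', 'bar')]
```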
@@ -240,10 +224,11 @@ def parse_multipart(fp, pdict):
     import http.client
 
     boundary = b""
-    if "boundary" in pdict:
-        boundary = pdict["boundary"]
+    if 'boundary' in pdict:
+        boundary = pdict['boundary']
     if not valid_boundary(boundary):
-        raise ValueError("Invalid boundary in multipart form: %r" % (boundary,))
+        raise ValueError('Invalid boundary in multipart form: %r'
+                         % (boundary,))
 
     nextpart = b"--" + boundary
     lastpart = b"--" + boundary + b"--"
@@ -256,7 +241,7 @@ def parse_multipart(fp, pdict):
         if terminator:
             # At start of next part.  Read headers first.
             headers = http.client.parse_headers(fp)
-            clength = headers.get("content-length")
+            clength = headers.get('content-length')
             if clength:
                 try:
                     bytes = int(clength)
@@ -264,7 +249,7 @@ def parse_multipart(fp, pdict):
                     pass
             if bytes > 0:
                 if maxlen and bytes > maxlen:
-                    raise ValueError("Maximum content length exceeded")
+                    raise ValueError('Maximum content length exceeded')
                 data = fp.read(bytes)
             else:
                 data = b""
@@ -293,14 +278,14 @@ def parse_multipart(fp, pdict):
                 line = line[:-1]
             lines[-1] = line
             data = b"".join(lines)
-        line = headers["content-disposition"]
+        line = headers['content-disposition']
         if not line:
             continue
         key, params = parse_header(line)
-        if key != "form-data":
+        if key != 'form-data':
             continue
-        if "name" in params:
-            name = params["name"]
+        if 'name' in params:
+            name = params['name']
         else:
             continue
         if name in partdict:
@@ -312,35 +297,34 @@ def parse_multipart(fp, pdict):
 
 
 def _parseparam(s):
-    while s[:1] == ";":
+    while s[:1] == ';':
         s = s[1:]
-        end = s.find(";")
+        end = s.find(';')
         while end > 0 and (s.count('"', 0, end) - s.count('\\"', 0, end)) % 2:
-            end = s.find(";", end + 1)
+            end = s.find(';', end + 1)
         if end < 0:
             end = len(s)
         f = s[:end]
         yield f.strip()
         s = s[end:]
 
-
 def parse_header(line):
     """Parse a Content-type like header.
 
     Return the main content-type and a dictionary of options.
 
     """
-    parts = _parseparam(";" + line)
+    parts = _parseparam(';' + line)
     key = parts.__next__()
     pdict = {}
     for p in parts:
-        i = p.find("=")
+        i = p.find('=')
         if i >= 0:
             name = p[:i].strip().lower()
-            value = p[i + 1 :].strip()
+            value = p[i+1:].strip()
             if len(value) >= 2 and value[0] == value[-1] == '"':
                 value = value[1:-1]
-                value = value.replace("\\\\", "\\").replace('\\"', '"')
+                value = value.replace('\\\\', '\\').replace('\\"', '"')
             pdict[name] = value
     return key, pdict
 
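The `_parseparam`/`parse_header` pair in this hunk is self-contained and can be run standalone; below is the master-side version of the two functions (reproduced from the hunk), exercised on an illustrative `Content-Disposition` value:

```python
def _parseparam(s):
    # Yield ";"-separated parameters, ignoring ";" inside quoted strings.
    while s[:1] == ";":
        s = s[1:]
        end = s.find(";")
        while end > 0 and (s.count('"', 0, end) - s.count('\\"', 0, end)) % 2:
            end = s.find(";", end + 1)
        if end < 0:
            end = len(s)
        f = s[:end]
        yield f.strip()
        s = s[end:]


def parse_header(line):
    # Split a Content-Type-like header into its main value and an options dict.
    parts = _parseparam(";" + line)
    key = parts.__next__()
    pdict = {}
    for p in parts:
        i = p.find("=")
        if i >= 0:
            name = p[:i].strip().lower()
            value = p[i + 1:].strip()
            if len(value) >= 2 and value[0] == value[-1] == '"':
                value = value[1:-1]
                value = value.replace("\\\\", "\\").replace('\\"', '"')
            pdict[name] = value
    return key, pdict


# The quoted ";" inside the filename is kept intact.
print(parse_header('form-data; name="files"; filename="a;b.txt"'))
# -> ('form-data', {'name': 'files', 'filename': 'a;b.txt'})
```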
@@ -348,7 +332,6 @@ def parse_header(line):
 # Classes for field storage
 # =========================
 
-
 class MiniFieldStorage:
 
     """Like FieldStorage, for use when no file uploads are possible."""
@@ -417,19 +400,9 @@ class FieldStorage:
     directory and unlinking them as soon as they have been opened.
 
     """
-
-    def __init__(
-        self,
-        fp=None,
-        headers=None,
-        outerboundary=b"",
-        environ=os.environ,
-        keep_blank_values=0,
-        strict_parsing=0,
-        limit=None,
-        encoding="utf-8",
-        errors="replace",
-    ):
+    def __init__(self, fp=None, headers=None, outerboundary=b'',
+                 environ=os.environ, keep_blank_values=0, strict_parsing=0,
+                 limit=None, encoding='utf-8', errors='replace'):
         """Constructor.  Read multipart/* until last part.
 
         Arguments, all optional:
@@ -470,34 +443,35 @@ class FieldStorage:
             header)
 
         """
-        method = "GET"
+        method = 'GET'
         self.keep_blank_values = keep_blank_values
         self.strict_parsing = strict_parsing
-        if "REQUEST_METHOD" in environ:
-            method = environ["REQUEST_METHOD"].upper()
+        if 'REQUEST_METHOD' in environ:
+            method = environ['REQUEST_METHOD'].upper()
         self.qs_on_post = None
-        if method == "GET" or method == "HEAD":
-            if "QUERY_STRING" in environ:
-                qs = environ["QUERY_STRING"]
+        if method == 'GET' or method == 'HEAD':
+            if 'QUERY_STRING' in environ:
+                qs = environ['QUERY_STRING']
             elif sys.argv[1:]:
                 qs = sys.argv[1]
             else:
                 qs = ""
-            qs = qs.encode(locale.getpreferredencoding(), "surrogateescape")
+            qs = qs.encode(locale.getpreferredencoding(), 'surrogateescape')
             fp = BytesIO(qs)
             if headers is None:
-                headers = {"content-type": "application/x-www-form-urlencoded"}
+                headers = {'content-type':
+                           "application/x-www-form-urlencoded"}
         if headers is None:
             headers = {}
-            if method == "POST":
+            if method == 'POST':
                 # Set default content-type for POST to what's traditional
-                headers["content-type"] = "application/x-www-form-urlencoded"
-            if "CONTENT_TYPE" in environ:
-                headers["content-type"] = environ["CONTENT_TYPE"]
-            if "QUERY_STRING" in environ:
-                self.qs_on_post = environ["QUERY_STRING"]
-            if "CONTENT_LENGTH" in environ:
-                headers["content-length"] = environ["CONTENT_LENGTH"]
+                headers['content-type'] = "application/x-www-form-urlencoded"
+            if 'CONTENT_TYPE' in environ:
+                headers['content-type'] = environ['CONTENT_TYPE']
+            if 'QUERY_STRING' in environ:
+                self.qs_on_post = environ['QUERY_STRING']
+            if 'CONTENT_LENGTH' in environ:
+                headers['content-length'] = environ['CONTENT_LENGTH']
         if fp is None:
             self.fp = sys.stdin.buffer
             # self.fp.read() must return bytes
@@ -511,7 +485,8 @@ class FieldStorage:
 
         self.headers = headers
         if not isinstance(outerboundary, bytes):
-            raise TypeError("outerboundary must be bytes, not %s" % type(outerboundary).__name__)
+            raise TypeError('outerboundary must be bytes, not %s'
+                            % type(outerboundary).__name__)
         self.outerboundary = outerboundary
 
         self.bytes_read = 0
@@ -519,16 +494,16 @@ class FieldStorage:
 
         # Process content-disposition header
        cdisp, pdict = "", {}
-        if "content-disposition" in self.headers:
-            cdisp, pdict = parse_header(self.headers["content-disposition"])
+        if 'content-disposition' in self.headers:
+            cdisp, pdict = parse_header(self.headers['content-disposition'])
         self.disposition = cdisp
         self.disposition_options = pdict
         self.name = None
-        if "name" in pdict:
-            self.name = pdict["name"]
+        if 'name' in pdict:
+            self.name = pdict['name']
         self.filename = None
-        if "filename" in pdict:
-            self.filename = pdict["filename"]
+        if 'filename' in pdict:
+            self.filename = pdict['filename']
         self._binary_file = self.filename is not None
 
         # Process content-type header
@@ -543,49 +518,50 @@ class FieldStorage:
         #
         # See below for what we do if there does exist a content-type header,
         # but it happens to be something we don't understand.
-        if "content-type" in self.headers:
-            ctype, pdict = parse_header(self.headers["content-type"])
-        elif self.outerboundary or method != "POST":
+        if 'content-type' in self.headers:
+            ctype, pdict = parse_header(self.headers['content-type'])
+        elif self.outerboundary or method != 'POST':
             ctype, pdict = "text/plain", {}
         else:
-            ctype, pdict = "application/x-www-form-urlencoded", {}
+            ctype, pdict = 'application/x-www-form-urlencoded', {}
         self.type = ctype
         self.type_options = pdict
-        if "boundary" in pdict:
-            self.innerboundary = pdict["boundary"].encode(self.encoding)
+        if 'boundary' in pdict:
+            self.innerboundary = pdict['boundary'].encode(self.encoding)
         else:
             self.innerboundary = b""
 
         clen = -1
-        if "content-length" in self.headers:
+        if 'content-length' in self.headers:
             try:
-                clen = int(self.headers["content-length"])
+                clen = int(self.headers['content-length'])
             except ValueError:
                 pass
             if maxlen and clen > maxlen:
-                raise ValueError("Maximum content length exceeded")
+                raise ValueError('Maximum content length exceeded')
         self.length = clen
         if self.limit is None and clen:
             self.limit = clen
 
         self.list = self.file = None
         self.done = 0
-        if ctype == "application/x-www-form-urlencoded":
+        if ctype == 'application/x-www-form-urlencoded':
             self.read_urlencoded()
-        elif ctype[:10] == "multipart/":
+        elif ctype[:10] == 'multipart/':
             self.read_multi(environ, keep_blank_values, strict_parsing)
         else:
             self.read_single()
 
     def __repr__(self):
         """Return a printable representation."""
-        return "FieldStorage(%r, %r, %r)" % (self.name, self.filename, self.value)
+        return "FieldStorage(%r, %r, %r)" % (
+            self.name, self.filename, self.value)
 
     def __iter__(self):
         return iter(self.keys())
 
     def __getattr__(self, name):
-        if name != "value":
+        if name != 'value':
             raise AttributeError(name)
         if self.file:
             self.file.seek(0)
@@ -603,8 +579,7 @@ class FieldStorage:
             raise TypeError("not indexable")
         found = []
         for item in self.list:
-            if item.name == key:
-                found.append(item)
+            if item.name == key: found.append(item)
         if not found:
             raise KeyError(key)
         if len(found) == 1:
@@ -624,7 +599,7 @@ class FieldStorage:
             return default
 
     def getfirst(self, key, default=None):
-        """Return the first value received."""
+        """ Return the first value received."""
         if key in self:
             value = self[key]
             if isinstance(value, list):
@@ -635,7 +610,7 @@ class FieldStorage:
             return default
 
     def getlist(self, key):
-        """Return list of received values."""
+        """ Return list of received values."""
         if key in self:
             value = self[key]
             if isinstance(value, list):
@@ -668,18 +643,15 @@ class FieldStorage:
         """Internal: read data in query string format."""
         qs = self.fp.read(self.length)
         if not isinstance(qs, bytes):
-            raise ValueError("%s should return bytes, got %s" % (self.fp, type(qs).__name__))
+            raise ValueError("%s should return bytes, got %s" \
+                             % (self.fp, type(qs).__name__))
         qs = qs.decode(self.encoding, self.errors)
         if self.qs_on_post:
-            qs += "&" + self.qs_on_post
+            qs += '&' + self.qs_on_post
         self.list = []
         query = urllib.parse.parse_qsl(
-            qs,
-            self.keep_blank_values,
-            self.strict_parsing,
-            encoding=self.encoding,
-            errors=self.errors,
-        )
+            qs, self.keep_blank_values, self.strict_parsing,
+            encoding=self.encoding, errors=self.errors)
         for key, value in query:
             self.list.append(MiniFieldStorage(key, value))
         self.skip_lines()
@@ -690,16 +662,12 @@ class FieldStorage:
         """Internal: read a part that is itself multipart."""
         ib = self.innerboundary
         if not valid_boundary(ib):
-            raise ValueError("Invalid boundary in multipart form: %r" % (ib,))
+            raise ValueError('Invalid boundary in multipart form: %r' % (ib,))
         self.list = []
         if self.qs_on_post:
             query = urllib.parse.parse_qsl(
-                self.qs_on_post,
-                self.keep_blank_values,
-                self.strict_parsing,
-                encoding=self.encoding,
-                errors=self.errors,
-            )
+                self.qs_on_post, self.keep_blank_values, self.strict_parsing,
+                encoding=self.encoding, errors=self.errors)
             for key, value in query:
                 self.list.append(MiniFieldStorage(key, value))
         FieldStorageClass = None
@@ -707,9 +675,8 @@ class FieldStorage:
         klass = self.FieldStorageClass or self.__class__
         first_line = self.fp.readline() # bytes
         if not isinstance(first_line, bytes):
-            raise ValueError(
-                "%s should return bytes, got %s" % (self.fp, type(first_line).__name__)
-            )
+            raise ValueError("%s should return bytes, got %s" \
+                             % (self.fp, type(first_line).__name__))
         self.bytes_read += len(first_line)
         # first line holds boundary ; ignore it, or check that
         # b"--" + ib == first_line.strip() ?
@@ -727,17 +694,9 @@ class FieldStorage:
             self.bytes_read += len(hdr_text)
             parser.feed(hdr_text.decode(self.encoding, self.errors))
             headers = parser.close()
-            part = klass(
-                self.fp,
-                headers,
-                ib,
-                environ,
-                keep_blank_values,
-                strict_parsing,
-                self.limit - self.bytes_read,
-                self.encoding,
-                self.errors,
-            )
+            part = klass(self.fp, headers, ib, environ, keep_blank_values,
+                         strict_parsing,self.limit-self.bytes_read,
+                         self.encoding, self.errors)
             self.bytes_read += part.bytes_read
             self.list.append(part)
             if part.done or self.bytes_read >= self.length > 0:
@@ -753,7 +712,7 @@ class FieldStorage:
         self.read_lines()
         self.file.seek(0)
 
-    bufsize = 8 * 1024 # I/O buffering size for copy to file
+    bufsize = 8*1024 # I/O buffering size for copy to file
 
     def read_binary(self):
         """Internal: read binary data."""
@@ -763,9 +722,8 @@ class FieldStorage:
         while todo > 0:
             data = self.fp.read(min(todo, self.bufsize)) # bytes
             if not isinstance(data, bytes):
-                raise ValueError(
-                    "%s should return bytes, got %s" % (self.fp, type(data).__name__)
-                )
+                raise ValueError("%s should return bytes, got %s"
+                                 % (self.fp, type(data).__name__))
             self.bytes_read += len(data)
             if not data:
                 self.done = -1
@@ -802,7 +760,7 @@ class FieldStorage:
     def read_lines_to_eof(self):
         """Internal: read lines until EOF."""
         while 1:
-            line = self.fp.readline(1 << 16) # bytes
+            line = self.fp.readline(1<<16) # bytes
             self.bytes_read += len(line)
             if not line:
                 self.done = -1
@@ -822,7 +780,7 @@ class FieldStorage:
         while 1:
             if _read >= self.limit:
                 break
-            line = self.fp.readline(1 << 16) # bytes
+            line = self.fp.readline(1<<16) # bytes
             self.bytes_read += len(line)
             _read += len(line)
             if not line:
@@ -866,7 +824,7 @@ class FieldStorage:
         last_boundary = next_boundary + b"--"
         last_line_lfend = True
         while True:
-            line = self.fp.readline(1 << 16)
+            line = self.fp.readline(1<<16)
             self.bytes_read += len(line)
             if not line:
                 self.done = -1
@@ -878,7 +836,7 @@ class FieldStorage:
                 if strippedline == last_boundary:
                     self.done = 1
                     break
-            last_line_lfend = line.endswith(b"\n")
+            last_line_lfend = line.endswith(b'\n')
 
     def make_file(self):
         """Overridable: return a readable & writable file.
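The two `make_file()` branches in the hunk above differ only in spool mode and text handling; assuming the standard `tempfile` module, their behavior can be sketched as:

```python
import tempfile

# Binary spool, used when content-disposition carried a filename.
with tempfile.TemporaryFile("wb+") as f:
    f.write(b"payload")
    f.seek(0)
    print(f.read())  # -> b'payload'

# Text spool with explicit encoding and newline, as in the non-binary branch.
with tempfile.TemporaryFile("w+", encoding="utf-8", newline="\n") as f:
    f.write("line\n")
    f.seek(0)
    print(f.read())  # -> line followed by a newline
```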
@@ -907,13 +865,13 @@ class FieldStorage:
         if self._binary_file:
             return tempfile.TemporaryFile("wb+")
         else:
-            return tempfile.TemporaryFile("w+", encoding=self.encoding, newline="\n")
+            return tempfile.TemporaryFile("w+",
+                    encoding=self.encoding, newline = '\n')
 
 
 # Test/debug code
 # ===============
 
-
 def test(environ=os.environ):
     """Robust test CGI script, usable as main program.
 
@@ -923,7 +881,7 @@ def test(environ=os.environ):
     """
     print("Content-type: text/html")
     print()
-    # sys.stderr = sys.stdout
+    #sys.stderr = sys.stdout
     try:
         form = FieldStorage() # Replace with other classes to test those
         print_directory()
@@ -931,13 +889,10 @@ def test(environ=os.environ):
         print_form(form)
         print_environ(environ)
         print_environ_usage()
-
         def f():
             exec("testing print_exception() -- <I>italics?</I>")
-
         def g(f=f):
             f()
-
         print("<H3>What follows is a test, not an actual exception:</H3>")
         g()
     except:
@@ -956,25 +911,20 @@
     except:
         print_exception()
 
-
 def print_exception(type=None, value=None, tb=None, limit=None):
     if type is None:
         type, value, tb = sys.exc_info()
     import traceback
print()
|
print()
|
||||||
print("<H3>Traceback (most recent call last):</H3>")
|
print("<H3>Traceback (most recent call last):</H3>")
|
||||||
list = traceback.format_tb(tb, limit) + traceback.format_exception_only(type, value)
|
list = traceback.format_tb(tb, limit) + \
|
||||||
print(
|
traceback.format_exception_only(type, value)
|
||||||
"<PRE>%s<B>%s</B></PRE>"
|
print("<PRE>%s<B>%s</B></PRE>" % (
|
||||||
% (
|
|
||||||
html.escape("".join(list[:-1])),
|
html.escape("".join(list[:-1])),
|
||||||
html.escape(list[-1]),
|
html.escape(list[-1]),
|
||||||
)
|
))
|
||||||
)
|
|
||||||
del tb
|
del tb
|
||||||
|
|
||||||
|
|
||||||
def print_environ(environ=os.environ):
|
def print_environ(environ=os.environ):
|
||||||
"""Dump the shell environment as HTML."""
|
"""Dump the shell environment as HTML."""
|
||||||
keys = sorted(environ.keys())
|
keys = sorted(environ.keys())
|
||||||
|
@ -986,7 +936,6 @@ def print_environ(environ=os.environ):
|
||||||
print("</DL>")
|
print("</DL>")
|
||||||
print()
|
print()
|
||||||
|
|
||||||
|
|
||||||
def print_form(form):
|
def print_form(form):
|
||||||
"""Dump the contents of a form as HTML."""
|
"""Dump the contents of a form as HTML."""
|
||||||
keys = sorted(form.keys())
|
keys = sorted(form.keys())
|
||||||
|
@ -996,14 +945,13 @@ def print_form(form):
|
||||||
print("<P>No form fields.")
|
print("<P>No form fields.")
|
||||||
print("<DL>")
|
print("<DL>")
|
||||||
for key in keys:
|
for key in keys:
|
||||||
print("<DT>" + html.escape(key) + ":", end=" ")
|
print("<DT>" + html.escape(key) + ":", end=' ')
|
||||||
value = form[key]
|
value = form[key]
|
||||||
print("<i>" + html.escape(repr(type(value))) + "</i>")
|
print("<i>" + html.escape(repr(type(value))) + "</i>")
|
||||||
print("<DD>" + html.escape(repr(value)))
|
print("<DD>" + html.escape(repr(value)))
|
||||||
print("</DL>")
|
print("</DL>")
|
||||||
print()
|
print()
|
||||||
|
|
||||||
|
|
||||||
def print_directory():
|
def print_directory():
|
||||||
"""Dump the current directory as HTML."""
|
"""Dump the current directory as HTML."""
|
||||||
print()
|
print()
|
||||||
|
@ -1016,7 +964,6 @@ def print_directory():
|
||||||
print(html.escape(pwd))
|
print(html.escape(pwd))
|
||||||
print()
|
print()
|
||||||
|
|
||||||
|
|
||||||
def print_arguments():
|
def print_arguments():
|
||||||
print()
|
print()
|
||||||
print("<H3>Command Line Arguments:</H3>")
|
print("<H3>Command Line Arguments:</H3>")
|
||||||
|
@ -1024,11 +971,9 @@ def print_arguments():
|
||||||
print(sys.argv)
|
print(sys.argv)
|
||||||
print()
|
print()
|
||||||
|
|
||||||
|
|
||||||
def print_environ_usage():
|
def print_environ_usage():
|
||||||
"""Dump a list of environment variables used by CGI as HTML."""
|
"""Dump a list of environment variables used by CGI as HTML."""
|
||||||
print(
|
print("""
|
||||||
"""
|
|
||||||
<H3>These environment variables could have been set:</H3>
|
<H3>These environment variables could have been set:</H3>
|
||||||
<UL>
|
<UL>
|
||||||
<LI>AUTH_TYPE
|
<LI>AUTH_TYPE
|
||||||
|
@ -1067,17 +1012,16 @@ environment as well. Here are some common variable names:
|
||||||
<LI>HTTP_REFERER
|
<LI>HTTP_REFERER
|
||||||
<LI>HTTP_USER_AGENT
|
<LI>HTTP_USER_AGENT
|
||||||
</UL>
|
</UL>
|
||||||
"""
|
""")
|
||||||
)
|
|
||||||
|
|
||||||
|
|
||||||
# Utilities
|
# Utilities
|
||||||
# =========
|
# =========
|
||||||
|
|
||||||
|
|
||||||
def escape(s, quote=None):
|
def escape(s, quote=None):
|
||||||
"""Deprecated API."""
|
"""Deprecated API."""
|
||||||
warn("cgi.escape is deprecated, use html.escape instead", DeprecationWarning, stacklevel=2)
|
warn("cgi.escape is deprecated, use html.escape instead",
|
||||||
|
DeprecationWarning, stacklevel=2)
|
||||||
s = s.replace("&", "&") # Must be done first!
|
s = s.replace("&", "&") # Must be done first!
|
||||||
s = s.replace("<", "<")
|
s = s.replace("<", "<")
|
||||||
s = s.replace(">", ">")
|
s = s.replace(">", ">")
|
||||||
|
@ -1088,17 +1032,15 @@ def escape(s, quote=None):
|
||||||
|
|
||||||
def valid_boundary(s, _vb_pattern=None):
|
def valid_boundary(s, _vb_pattern=None):
|
||||||
import re
|
import re
|
||||||
|
|
||||||
if isinstance(s, bytes):
|
if isinstance(s, bytes):
|
||||||
_vb_pattern = b"^[ -~]{0,200}[!-~]$"
|
_vb_pattern = b"^[ -~]{0,200}[!-~]$"
|
||||||
else:
|
else:
|
||||||
_vb_pattern = "^[ -~]{0,200}[!-~]$"
|
_vb_pattern = "^[ -~]{0,200}[!-~]$"
|
||||||
return re.match(_vb_pattern, s)
|
return re.match(_vb_pattern, s)
|
||||||
|
|
||||||
|
|
||||||
# Invoke mainline
|
# Invoke mainline
|
||||||
# ===============
|
# ===============
|
||||||
|
|
||||||
# Call test() when this file is run as a script (not imported as a module)
|
# Call test() when this file is run as a script (not imported as a module)
|
||||||
if __name__ == "__main__":
|
if __name__ == '__main__':
|
||||||
test()
|
test()
|
|
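The `valid_boundary` helper in the hunk above is just a bounded printable-ASCII regex. As a standalone sketch (the function wrapper and sample boundary strings are illustrative; the pattern is the one from the diff):

```python
import re

# Pattern from the diff: up to 200 printable-ASCII characters,
# the last of which must not be a space.
_VB_PATTERN = b"^[ -~]{0,200}[!-~]$"


def valid_boundary(s):
    # This sketch handles bytes boundaries only.
    return re.match(_VB_PATTERN, s) is not None


ok = valid_boundary(b"----WebKitFormBoundaryABC123")
bad = valid_boundary(b"ends with a space ")
```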
@@ -0,0 +1,3 @@
srctype=cpython
type=module
version=3.3.3-1
@@ -0,0 +1,18 @@
import sys
# Remove current dir from sys.path, otherwise setuptools will peek up our
# module instead of system.
sys.path.pop(0)
from setuptools import setup


setup(name='micropython-cgi',
      version='3.3.3-1',
      description='CPython cgi module ported to MicroPython',
      long_description='This is a module ported from CPython standard library to be compatible with\nMicroPython interpreter. Usually, this means applying small patches for\nfeatures not supported (yet, or at all) in MicroPython. Sometimes, heavier\nchanges are required. Note that CPython modules are written with availability\nof vast resources in mind, and may not work for MicroPython ports with\nlimited heap. If you are affected by such a case, please help reimplement\nthe module from scratch.',
      url='https://github.com/micropython/micropython/issues/405',
      author='CPython Developers',
      author_email='python-dev@python.org',
      maintainer='MicroPython Developers',
      maintainer_email='micro-python@googlegroups.com',
      license='Python',
      py_modules=['cgi'])
@@ -51,14 +51,14 @@ this means that that help by doc string feature doesn't work.
completions have also been stripped out.
"""

import sys

__all__ = ["Cmd"]

PROMPT = "(Cmd) "
# This is equivalent to string.ascii_letters + string.digits + '_'
IDENTCHARS = "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789_"


class Cmd:
    """A simple framework for writing line-oriented command interpreters.

@@ -72,11 +72,10 @@ class Cmd:
    in order to inherit Cmd's methods and encapsulate action methods.

    """

    prompt = PROMPT
    identchars = IDENTCHARS
    ruler = "="
    lastcmd = ""
    intro = None
    doc_leader = ""
    doc_header = "Documented commands (type help <topic>):"

@@ -115,7 +114,7 @@ class Cmd:
        if intro is not None:
            self.intro = intro
        if self.intro:
            self.stdout.write(str(self.intro) + "\n")
        stop = None
        while not stop:
            if self.cmdqueue:

@@ -125,15 +124,15 @@ class Cmd:
                    try:
                        line = input(self.prompt)
                    except EOFError:
                        line = "EOF"
                else:
                    self.stdout.write(self.prompt)
                    self.stdout.flush()
                    line = self.stdin.readline()
                    if not len(line):
                        line = "EOF"
                    else:
                        line = line.rstrip("\r\n")
                line = self.precmd(line)
                stop = self.onecmd(line)
                stop = self.postcmd(stop, line)

@@ -171,16 +170,15 @@ class Cmd:
        line = line.strip()
        if not line:
            return None, None, line
        elif line[0] == "?":
            line = "help " + line[1:]
        elif line[0] == "!":
            if hasattr(self, "do_shell"):
                line = "shell " + line[1:]
            else:
                return None, None, line
        i, n = 0, len(line)
        while i < n and line[i] in self.identchars:
            i = i + 1
        cmd, arg = line[:i], line[i:].strip()
        return cmd, arg, line

@@ -200,13 +198,13 @@ class Cmd:
        if cmd is None:
            return self.default(line)
        self.lastcmd = line
        if line == "EOF":
            self.lastcmd = ""
        if cmd == "":
            return self.default(line)
        else:
            try:
                func = getattr(self, "do_" + cmd)
            except AttributeError:
                return self.default(line)
            return func(arg)

@@ -228,7 +226,7 @@ class Cmd:
        returns.

        """
        self.stdout.write("*** Unknown syntax: %s\n" % line)

    def get_names(self):
        # This method used to pull in base class attributes

@@ -240,9 +238,9 @@ class Cmd:
        if arg:
            # XXX check arg syntax
            try:
                func = getattr(self, "help_" + arg)
            except AttributeError:
                self.stdout.write("%s\n" % str(self.nohelp % (arg,)))
                return
            func()
        else:

@@ -251,33 +249,33 @@ class Cmd:
            cmds_undoc = []
            help = {}
            for name in names:
                if name[:5] == "help_":
                    help[name[5:]] = 1
            names.sort()
            # There can be duplicates if routines overridden
            prevname = ""
            for name in names:
                if name[:3] == "do_":
                    if name == prevname:
                        continue
                    prevname = name
                    cmd = name[3:]
                    if cmd in help:
                        cmds_doc.append(cmd)
                        del help[cmd]
                    else:
                        cmds_undoc.append(cmd)
            self.stdout.write("%s\n" % str(self.doc_leader))
            self.print_topics(self.doc_header, cmds_doc, 15, 80)
            self.print_topics(self.misc_header, list(help.keys()), 15, 80)
            self.print_topics(self.undoc_header, cmds_undoc, 15, 80)

    def print_topics(self, header, cmds, cmdlen, maxcol):
        if cmds:
            self.stdout.write("%s\n" % str(header))
            if self.ruler:
                self.stdout.write("%s\n" % str(self.ruler * len(header)))
            self.columnize(cmds, maxcol - 1)
            self.stdout.write("\n")

    def columnize(self, list, displaywidth=80):

@@ -290,22 +288,24 @@ class Cmd:
            self.stdout.write("<empty>\n")
            return

        nonstrings = [i for i in range(len(list)) if not isinstance(list[i], str)]
        if nonstrings:
            raise TypeError("list[i] not a string for i in %s" % ", ".join(map(str, nonstrings)))
        size = len(list)
        if size == 1:
            self.stdout.write("%s\n" % str(list[0]))
            return
        # Try every row count from 1 upwards
        for nrows in range(1, len(list)):
            ncols = (size + nrows - 1) // nrows
            colwidths = []
            totwidth = -2
            for col in range(ncols):
                colwidth = 0
                for row in range(nrows):
                    i = row + nrows * col
                    if i >= size:
                        break
                    x = list[i]

@@ -323,7 +323,7 @@ class Cmd:
        for row in range(nrows):
            texts = []
            for col in range(ncols):
                i = row + nrows * col
                if i >= size:
                    x = ""
                else:

@@ -332,6 +332,6 @@ class Cmd:
            while texts and not texts[-1]:
                del texts[-1]
            for col in range(len(texts)):
                # texts[col] = texts[col].ljust(colwidths[col])
                texts[col] = "%-*s" % (colwidths[col], texts[col])
            self.stdout.write("%s\n" % str(" ".join(texts)))
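The `parseline` hunk above peels a leading run of identifier characters off the line to find the command word. A condensed standalone sketch of that split (the `?`/`!` shorthand handling from the diff is omitted here):

```python
IDENTCHARS = "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789_"


def parseline(line):
    # Mirrors the core of Cmd.parseline: return (cmd, arg, stripped line).
    line = line.strip()
    if not line:
        return None, None, line
    i, n = 0, len(line)
    while i < n and line[i] in IDENTCHARS:
        i = i + 1
    cmd, arg = line[:i], line[i:].strip()
    return cmd, arg, line


cmd, arg, line = parseline("greet  world ")
```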
@@ -0,0 +1,3 @@
srctype = cpython
type = module
version = 3.4.0-1
@@ -0,0 +1,18 @@
import sys
# Remove current dir from sys.path, otherwise setuptools will peek up our
# module instead of system.
sys.path.pop(0)
from setuptools import setup


setup(name='micropython-cmd',
      version='3.4.0-1',
      description='CPython cmd module ported to MicroPython',
      long_description='This is a module ported from CPython standard library to be compatible with\nMicroPython interpreter. Usually, this means applying small patches for\nfeatures not supported (yet, or at all) in MicroPython. Sometimes, heavier\nchanges are required. Note that CPython modules are written with availability\nof vast resources in mind, and may not work for MicroPython ports with\nlimited heap. If you are affected by such a case, please help reimplement\nthe module from scratch.',
      url='https://github.com/micropython/micropython/issues/405',
      author='CPython Developers',
      author_email='python-dev@python.org',
      maintainer='MicroPython Developers',
      maintainer_email='micro-python@googlegroups.com',
      license='Python',
      py_modules=['cmd'])
@@ -1,4 +1,5 @@
class defaultdict:

    @staticmethod
    def __new__(cls, default_factory=None, **kwargs):
        # Some code (e.g. urllib.urlparse) expects that basic defaultdict

@@ -26,9 +27,6 @@ class defaultdict:
    def __delitem__(self, key):
        del self.d[key]

    def __contains__(self, key):
        return key in self.d

    def __missing__(self, key):
        if self.default_factory is None:
            raise KeyError(key)
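Lookup in this port resolves missing keys through the `__missing__` hook shown in the hunk: `__getitem__` catches the `KeyError` and delegates, and `__missing__` either re-raises or inserts a fresh default. A minimal self-contained sketch of that protocol (the class name and field layout here are illustrative, not the port's exact code):

```python
class MiniDefaultDict:
    # Sketch of the defaultdict __missing__ protocol over a plain dict.
    def __init__(self, default_factory=None):
        self.default_factory = default_factory
        self.d = {}

    def __getitem__(self, key):
        try:
            return self.d[key]
        except KeyError:
            # Missing key: fall back to the __missing__ hook.
            return self.__missing__(key)

    def __setitem__(self, key, value):
        self.d[key] = value

    def __missing__(self, key):
        if self.default_factory is None:
            raise KeyError(key)
        # Insert and return a freshly built default value.
        self.d[key] = v = self.default_factory()
        return v


d = MiniDefaultDict(list)
d["xs"].append(1)  # first access creates the list, append mutates it in place
```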
@@ -0,0 +1,4 @@
srctype = micropython-lib
type = package
version = 0.2.1
author = Paul Sokolovsky
@@ -0,0 +1,18 @@
import sys
# Remove current dir from sys.path, otherwise setuptools will peek up our
# module instead of system.
sys.path.pop(0)
from setuptools import setup


setup(name='micropython-collections.defaultdict',
      version='0.2.1',
      description='collections.defaultdict module for MicroPython',
      long_description="This is a module reimplemented specifically for MicroPython standard library,\nwith efficient and lean design in mind. Note that this module is likely work\nin progress and likely supports just a subset of CPython's corresponding\nmodule. Please help with the development if you are interested in this\nmodule.",
      url='https://github.com/micropython/micropython/issues/405',
      author='Paul Sokolovsky',
      author_email='micro-python@googlegroups.com',
      maintainer='MicroPython Developers',
      maintainer_email='micro-python@googlegroups.com',
      license='MIT',
      packages=['collections'])
@@ -1,10 +1,8 @@
from collections import defaultdict

d = defaultdict.defaultdict(lambda: 42)
assert d[1] == 42
d[2] = 3
assert d[2] == 3
del d[1]
assert d[1] == 42

assert "foo" not in d
@@ -0,0 +1,37 @@
class deque:

    def __init__(self, iterable=None):
        if iterable is None:
            self.q = []
        else:
            self.q = list(iterable)

    def popleft(self):
        return self.q.pop(0)

    def popright(self):
        return self.q.pop()

    def pop(self):
        return self.q.pop()

    def append(self, a):
        self.q.append(a)

    def appendleft(self, a):
        self.q.insert(0, a)

    def extend(self, a):
        self.q.extend(a)

    def __len__(self):
        return len(self.q)

    def __bool__(self):
        return bool(self.q)

    def __iter__(self):
        yield from self.q

    def __str__(self):
        return 'deque({})'.format(self.q)
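Since this deque is backed by a plain Python list, `popleft` (via `list.pop(0)`) is O(n), unlike CPython's `collections.deque`. A condensed copy of the class above (subset of methods, re-declared so the snippet runs standalone), exercised briefly:

```python
class deque:
    # Condensed copy of the list-backed deque above (subset of methods).
    def __init__(self, iterable=None):
        self.q = list(iterable) if iterable is not None else []

    def popleft(self):
        return self.q.pop(0)  # O(n): every remaining element shifts left

    def append(self, a):
        self.q.append(a)

    def __len__(self):
        return len(self.q)


d = deque([1, 2])
d.append(3)
first = d.popleft()
```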
@@ -0,0 +1,3 @@
srctype = micropython-lib
type = package
version = 0.1.2
@@ -0,0 +1,18 @@
import sys
# Remove current dir from sys.path, otherwise setuptools will peek up our
# module instead of system.
sys.path.pop(0)
from setuptools import setup


setup(name='micropython-collections.deque',
      version='0.1.2',
      description='collections.deque module for MicroPython',
      long_description="This is a module reimplemented specifically for MicroPython standard library,\nwith efficient and lean design in mind. Note that this module is likely work\nin progress and likely supports just a subset of CPython's corresponding\nmodule. Please help with the development if you are interested in this\nmodule.",
      url='https://github.com/micropython/micropython/issues/405',
      author='MicroPython Developers',
      author_email='micro-python@googlegroups.com',
      maintainer='MicroPython Developers',
      maintainer_email='micro-python@googlegroups.com',
      license='MIT',
      packages=['collections'])
@@ -0,0 +1,16 @@
# Should be reimplemented for MicroPython
# Reason:
# CPython implementation brings in metaclasses and other bloat.
# This is going to be just import-all for other modules in a namespace package
from ucollections import *
try:
    from .defaultdict import defaultdict
except ImportError:
    pass
try:
    from .deque import deque
except ImportError:
    pass


class MutableMapping:
    pass
@@ -0,0 +1,3 @@
srctype = micropython-lib
type = package
version = 0.1.1
@@ -0,0 +1,18 @@
import sys
# Remove current dir from sys.path, otherwise setuptools will peek up our
# module instead of system.
sys.path.pop(0)
from setuptools import setup


setup(name='micropython-collections',
      version='0.1.1',
      description='collections module for MicroPython',
      long_description="This is a module reimplemented specifically for MicroPython standard library,\nwith efficient and lean design in mind. Note that this module is likely work\nin progress and likely supports just a subset of CPython's corresponding\nmodule. Please help with the development if you are interested in this\nmodule.",
      url='https://github.com/micropython/micropython/issues/405',
      author='MicroPython Developers',
      author_email='micro-python@googlegroups.com',
      maintainer='MicroPython Developers',
      maintainer_email='micro-python@googlegroups.com',
      license='MIT',
      packages=['collections'])
@@ -0,0 +1,3 @@
srctype=dummy
type=package
version=0.0.0
@@ -0,0 +1,18 @@
import sys
# Remove current dir from sys.path, otherwise setuptools will peek up our
# module instead of system.
sys.path.pop(0)
from setuptools import setup


setup(name='micropython-concurrent.futures',
      version='0.0.0',
      description='Dummy concurrent.futures module for MicroPython',
      long_description='This is a dummy implementation of a module for MicroPython standard library.\nIt contains zero or very little functionality, and primarily intended to\navoid import errors (using idea that even if an application imports a\nmodule, it may be not using it onevery code path, so may work at least\npartially). It is expected that more complete implementation of the module\nwill be provided later. Please help with the development if you are\ninterested in this module.',
      url='https://github.com/micropython/micropython/issues/405',
      author='MicroPython Developers',
      author_email='micro-python@googlegroups.com',
      maintainer='MicroPython Developers',
      maintainer_email='micro-python@googlegroups.com',
      license='MIT',
      packages=['concurrent'])
@@ -29,13 +29,10 @@ class closing(object):
        f.close()

    """

    def __init__(self, thing):
        self.thing = thing

    def __enter__(self):
        return self.thing

    def __exit__(self, *exc_info):
        self.thing.close()

@@ -69,7 +66,6 @@ class suppress:
        # See http://bugs.python.org/issue12029 for more details
        return exctype is not None and issubclass(exctype, self._exceptions)


# Inspired by discussions on http://bugs.python.org/issue13585
class ExitStack(object):
    """Context manager for dynamic management of a stack of exit callbacks

@@ -83,23 +79,20 @@ class ExitStack(object):
    # in the list raise an exception

    """

    def __init__(self):
        self._exit_callbacks = []

    def pop_all(self):
        """Preserve the context stack by transferring it to a new instance"""
        new_stack = type(self)()
        new_stack._exit_callbacks = self._exit_callbacks
        self._exit_callbacks = []
        return new_stack

    def _push_cm_exit(self, cm, cm_exit):
        """Helper to correctly register callbacks to __exit__ methods"""

        def _exit_wrapper(*exc_details):
            return cm_exit(cm, *exc_details)

        self.push(_exit_wrapper)

    def push(self, exit):

@@ -127,10 +120,8 @@ class ExitStack(object):

        Cannot suppress exceptions.
        """

        def _exit_wrapper(exc_type, exc, tb):
            callback(*args, **kwds)

        self.push(_exit_wrapper)
        return callback  # Allow use as a decorator
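The `_exit_callbacks` stack above (a plain list here, a `deque` in the older version) is unwound in LIFO order when the stack exits. CPython's own `contextlib.ExitStack` exhibits the same behaviour, which the snippet below demonstrates:

```python
from contextlib import ExitStack

order = []
with ExitStack() as stack:
    # callback() pushes onto the exit-callback stack, so on exit
    # the callbacks run in reverse registration order (LIFO).
    stack.callback(order.append, "registered first")
    stack.callback(order.append, "registered second")
```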
@@ -0,0 +1,5 @@
srctype = cpython
type = module
version = 3.4.2-3
long_desc = Port of contextlib for micropython
depends = ucontextlib, collections
@@ -0,0 +1,19 @@
import sys
# Remove current dir from sys.path, otherwise setuptools will peek up our
# module instead of system.
sys.path.pop(0)
from setuptools import setup


setup(name='micropython-contextlib',
      version='3.4.2-3',
      description='CPython contextlib module ported to MicroPython',
      long_description='This is a module ported from CPython standard library to be compatible with\nMicroPython interpreter. Usually, this means applying small patches for\nfeatures not supported (yet, or at all) in MicroPython. Sometimes, heavier\nchanges are required. Note that CPython modules are written with availability\nof vast resources in mind, and may not work for MicroPython ports with\nlimited heap. If you are affected by such a case, please help reimplement\nthe module from scratch.',
      url='https://github.com/micropython/micropython/issues/405',
      author='CPython Developers',
      author_email='python-dev@python.org',
      maintainer='MicroPython Developers',
      maintainer_email='micro-python@googlegroups.com',
      license='Python',
      py_modules=['contextlib'],
      install_requires=['micropython-ucontextlib', 'micropython-collections'])
@@ -4,6 +4,7 @@ from contextlib import closing, suppress, ExitStack


class ClosingTestCase(unittest.TestCase):

    class Closable:
        def __init__(self):
            self.closed = False

@@ -29,6 +30,7 @@ class ClosingTestCase(unittest.TestCase):


class SuppressTestCase(unittest.TestCase):

    def test_suppress(self):
        with suppress(ValueError, TypeError):
            raise ValueError()

@@ -37,6 +39,7 @@ class SuppressTestCase(unittest.TestCase):


class TestExitStack(unittest.TestCase):

    # @support.requires_docstrings
    def _test_instance_docs(self):
        # Issue 19330: ensure context manager instances have good docstrings

@@ -52,17 +55,15 @@ class TestExitStack(unittest.TestCase):
        expected = [
            ((), {}),
            ((1,), {}),
            ((1, 2), {}),
            ((), dict(example=1)),
            ((1,), dict(example=1)),
            ((1, 2), dict(example=1)),
        ]
        result = []

        def _exit(*args, **kwds):
            """Test metadata propagation"""
            result.append((args, kwds))

        with ExitStack() as stack:
            for args, kwds in reversed(expected):
                if args and kwds:

@@ -82,28 +83,21 @@ class TestExitStack(unittest.TestCase):

    def test_push(self):
        exc_raised = ZeroDivisionError

        def _expect_exc(exc_type, exc, exc_tb):
            self.assertIs(exc_type, exc_raised)

        def _suppress_exc(*exc_details):
            return True

        def _expect_ok(exc_type, exc, exc_tb):
|
def _expect_ok(exc_type, exc, exc_tb):
|
||||||
self.assertIsNone(exc_type)
|
self.assertIsNone(exc_type)
|
||||||
self.assertIsNone(exc)
|
self.assertIsNone(exc)
|
||||||
self.assertIsNone(exc_tb)
|
self.assertIsNone(exc_tb)
|
||||||
|
|
||||||
class ExitCM(object):
|
class ExitCM(object):
|
||||||
def __init__(self, check_exc):
|
def __init__(self, check_exc):
|
||||||
self.check_exc = check_exc
|
self.check_exc = check_exc
|
||||||
|
|
||||||
def __enter__(self):
|
def __enter__(self):
|
||||||
self.fail("Should not be called!")
|
self.fail("Should not be called!")
|
||||||
|
|
||||||
def __exit__(self, *exc_details):
|
def __exit__(self, *exc_details):
|
||||||
self.check_exc(*exc_details)
|
self.check_exc(*exc_details)
|
||||||
|
|
||||||
with ExitStack() as stack:
|
with ExitStack() as stack:
|
||||||
stack.push(_expect_ok)
|
stack.push(_expect_ok)
|
||||||
self.assertIs(tuple(stack._exit_callbacks)[-1], _expect_ok)
|
self.assertIs(tuple(stack._exit_callbacks)[-1], _expect_ok)
|
||||||
|
@ -119,24 +113,21 @@ class TestExitStack(unittest.TestCase):
|
||||||
self.assertIs(tuple(stack._exit_callbacks)[-1], _expect_exc)
|
self.assertIs(tuple(stack._exit_callbacks)[-1], _expect_exc)
|
||||||
stack.push(_expect_exc)
|
stack.push(_expect_exc)
|
||||||
self.assertIs(tuple(stack._exit_callbacks)[-1], _expect_exc)
|
self.assertIs(tuple(stack._exit_callbacks)[-1], _expect_exc)
|
||||||
1 / 0
|
1/0
|
||||||
|
|
||||||
def test_enter_context(self):
|
def test_enter_context(self):
|
||||||
class TestCM(object):
|
class TestCM(object):
|
||||||
def __enter__(self):
|
def __enter__(self):
|
||||||
result.append(1)
|
result.append(1)
|
||||||
|
|
||||||
def __exit__(self, *exc_details):
|
def __exit__(self, *exc_details):
|
||||||
result.append(3)
|
result.append(3)
|
||||||
|
|
||||||
result = []
|
result = []
|
||||||
cm = TestCM()
|
cm = TestCM()
|
||||||
with ExitStack() as stack:
|
with ExitStack() as stack:
|
||||||
|
|
||||||
@stack.callback # Registered first => cleaned up last
|
@stack.callback # Registered first => cleaned up last
|
||||||
def _exit():
|
def _exit():
|
||||||
result.append(4)
|
result.append(4)
|
||||||
|
|
||||||
self.assertIsNotNone(_exit)
|
self.assertIsNotNone(_exit)
|
||||||
stack.enter_context(cm)
|
stack.enter_context(cm)
|
||||||
# self.assertIs(stack._exit_callbacks[-1].__self__, cm)
|
# self.assertIs(stack._exit_callbacks[-1].__self__, cm)
|
||||||
|
@ -146,11 +137,9 @@ class TestExitStack(unittest.TestCase):
|
||||||
def test_close(self):
|
def test_close(self):
|
||||||
result = []
|
result = []
|
||||||
with ExitStack() as stack:
|
with ExitStack() as stack:
|
||||||
|
|
||||||
@stack.callback
|
@stack.callback
|
||||||
def _exit():
|
def _exit():
|
||||||
result.append(1)
|
result.append(1)
|
||||||
|
|
||||||
self.assertIsNotNone(_exit)
|
self.assertIsNotNone(_exit)
|
||||||
stack.close()
|
stack.close()
|
||||||
result.append(2)
|
result.append(2)
|
||||||
|
@ -159,11 +148,9 @@ class TestExitStack(unittest.TestCase):
|
||||||
def test_pop_all(self):
|
def test_pop_all(self):
|
||||||
result = []
|
result = []
|
||||||
with ExitStack() as stack:
|
with ExitStack() as stack:
|
||||||
|
|
||||||
@stack.callback
|
@stack.callback
|
||||||
def _exit():
|
def _exit():
|
||||||
result.append(3)
|
result.append(3)
|
||||||
|
|
||||||
self.assertIsNotNone(_exit)
|
self.assertIsNotNone(_exit)
|
||||||
new_stack = stack.pop_all()
|
new_stack = stack.pop_all()
|
||||||
result.append(1)
|
result.append(1)
|
||||||
|
@ -175,25 +162,22 @@ class TestExitStack(unittest.TestCase):
|
||||||
with self.assertRaises(ZeroDivisionError):
|
with self.assertRaises(ZeroDivisionError):
|
||||||
with ExitStack() as stack:
|
with ExitStack() as stack:
|
||||||
stack.push(lambda *exc: False)
|
stack.push(lambda *exc: False)
|
||||||
1 / 0
|
1/0
|
||||||
|
|
||||||
def test_exit_suppress(self):
|
def test_exit_suppress(self):
|
||||||
with ExitStack() as stack:
|
with ExitStack() as stack:
|
||||||
stack.push(lambda *exc: True)
|
stack.push(lambda *exc: True)
|
||||||
1 / 0
|
1/0
|
||||||
|
|
||||||
def test_exit_exception_chaining_reference(self):
|
def test_exit_exception_chaining_reference(self):
|
||||||
# Sanity check to make sure that ExitStack chaining matches
|
# Sanity check to make sure that ExitStack chaining matches
|
||||||
# actual nested with statements
|
# actual nested with statements
|
||||||
exc_chain = []
|
exc_chain = []
|
||||||
|
|
||||||
class RaiseExc:
|
class RaiseExc:
|
||||||
def __init__(self, exc):
|
def __init__(self, exc):
|
||||||
self.exc = exc
|
self.exc = exc
|
||||||
|
|
||||||
def __enter__(self):
|
def __enter__(self):
|
||||||
return self
|
return self
|
||||||
|
|
||||||
def __exit__(self, *exc_details):
|
def __exit__(self, *exc_details):
|
||||||
exc_chain.append(exc_details[0])
|
exc_chain.append(exc_details[0])
|
||||||
raise self.exc
|
raise self.exc
|
||||||
|
@ -202,10 +186,8 @@ class TestExitStack(unittest.TestCase):
|
||||||
def __init__(self, outer, inner):
|
def __init__(self, outer, inner):
|
||||||
self.outer = outer
|
self.outer = outer
|
||||||
self.inner = inner
|
self.inner = inner
|
||||||
|
|
||||||
def __enter__(self):
|
def __enter__(self):
|
||||||
return self
|
return self
|
||||||
|
|
||||||
def __exit__(self, *exc_details):
|
def __exit__(self, *exc_details):
|
||||||
try:
|
try:
|
||||||
exc_chain.append(exc_details[0])
|
exc_chain.append(exc_details[0])
|
||||||
|
@ -217,7 +199,6 @@ class TestExitStack(unittest.TestCase):
|
||||||
class SuppressExc:
|
class SuppressExc:
|
||||||
def __enter__(self):
|
def __enter__(self):
|
||||||
return self
|
return self
|
||||||
|
|
||||||
def __exit__(self, *exc_details):
|
def __exit__(self, *exc_details):
|
||||||
type(self).saved_details = exc_details
|
type(self).saved_details = exc_details
|
||||||
return True
|
return True
|
||||||
|
@ -234,13 +215,7 @@ class TestExitStack(unittest.TestCase):
|
||||||
# Inner exceptions were suppressed
|
# Inner exceptions were suppressed
|
||||||
# self.assertIsNone(exc.__context__.__context__.__context__)
|
# self.assertIsNone(exc.__context__.__context__.__context__)
|
||||||
exc_chain.append(type(exc))
|
exc_chain.append(type(exc))
|
||||||
assert tuple(exc_chain) == (
|
assert tuple(exc_chain) == (ZeroDivisionError, None, AttributeError, KeyError, IndexError)
|
||||||
ZeroDivisionError,
|
|
||||||
None,
|
|
||||||
AttributeError,
|
|
||||||
KeyError,
|
|
||||||
IndexError,
|
|
||||||
)
|
|
||||||
else:
|
else:
|
||||||
self.fail("Expected IndexError, but no exception was raised")
|
self.fail("Expected IndexError, but no exception was raised")
|
||||||
# Check the inner exceptions
|
# Check the inner exceptions
|
||||||
|
@ -251,7 +226,6 @@ class TestExitStack(unittest.TestCase):
|
||||||
def test_exit_exception_chaining(self):
|
def test_exit_exception_chaining(self):
|
||||||
# Ensure exception chaining matches the reference behaviour
|
# Ensure exception chaining matches the reference behaviour
|
||||||
exc_chain = []
|
exc_chain = []
|
||||||
|
|
||||||
def raise_exc(exc):
|
def raise_exc(exc):
|
||||||
frame_exc = sys.exc_info()[0]
|
frame_exc = sys.exc_info()[0]
|
||||||
if frame_exc is not None:
|
if frame_exc is not None:
|
||||||
|
@ -260,7 +234,6 @@ class TestExitStack(unittest.TestCase):
|
||||||
raise exc
|
raise exc
|
||||||
|
|
||||||
saved_details = None
|
saved_details = None
|
||||||
|
|
||||||
def suppress_exc(*exc_details):
|
def suppress_exc(*exc_details):
|
||||||
nonlocal saved_details
|
nonlocal saved_details
|
||||||
saved_details = exc_details
|
saved_details = exc_details
|
||||||
|
@ -281,13 +254,7 @@ class TestExitStack(unittest.TestCase):
|
||||||
# self.assertIsInstance(exc.__context__.__context__, AttributeError)
|
# self.assertIsInstance(exc.__context__.__context__, AttributeError)
|
||||||
# Inner exceptions were suppressed
|
# Inner exceptions were suppressed
|
||||||
# self.assertIsNone(exc.__context__.__context__.__context__)
|
# self.assertIsNone(exc.__context__.__context__.__context__)
|
||||||
assert tuple(exc_chain) == (
|
assert tuple(exc_chain) == (ZeroDivisionError, None, AttributeError, KeyError, IndexError)
|
||||||
ZeroDivisionError,
|
|
||||||
None,
|
|
||||||
AttributeError,
|
|
||||||
KeyError,
|
|
||||||
IndexError,
|
|
||||||
)
|
|
||||||
else:
|
else:
|
||||||
self.fail("Expected IndexError, but no exception was raised")
|
self.fail("Expected IndexError, but no exception was raised")
|
||||||
# Check the inner exceptions
|
# Check the inner exceptions
|
||||||
|
@ -350,7 +317,8 @@ class TestExitStack(unittest.TestCase):
|
||||||
self.assertIs(exc.__context__, exc3)
|
self.assertIs(exc.__context__, exc3)
|
||||||
self.assertIs(exc.__context__.__context__, exc2)
|
self.assertIs(exc.__context__.__context__, exc2)
|
||||||
self.assertIs(exc.__context__.__context__.__context__, exc1)
|
self.assertIs(exc.__context__.__context__.__context__, exc1)
|
||||||
self.assertIsNone(exc.__context__.__context__.__context__.__context__)
|
self.assertIsNone(
|
||||||
|
exc.__context__.__context__.__context__.__context__)
|
||||||
|
|
||||||
def _test_exit_exception_with_existing_context(self):
|
def _test_exit_exception_with_existing_context(self):
|
||||||
# Addresses a lack of test coverage discovered after checking in a
|
# Addresses a lack of test coverage discovered after checking in a
|
||||||
|
@ -360,7 +328,6 @@ class TestExitStack(unittest.TestCase):
|
||||||
raise inner_exc
|
raise inner_exc
|
||||||
finally:
|
finally:
|
||||||
raise outer_exc
|
raise outer_exc
|
||||||
|
|
||||||
exc1 = Exception(1)
|
exc1 = Exception(1)
|
||||||
exc2 = Exception(2)
|
exc2 = Exception(2)
|
||||||
exc3 = Exception(3)
|
exc3 = Exception(3)
|
||||||
|
@ -376,36 +343,37 @@ class TestExitStack(unittest.TestCase):
|
||||||
self.assertIs(exc.__context__, exc4)
|
self.assertIs(exc.__context__, exc4)
|
||||||
self.assertIs(exc.__context__.__context__, exc3)
|
self.assertIs(exc.__context__.__context__, exc3)
|
||||||
self.assertIs(exc.__context__.__context__.__context__, exc2)
|
self.assertIs(exc.__context__.__context__.__context__, exc2)
|
||||||
self.assertIs(exc.__context__.__context__.__context__.__context__, exc1)
|
self.assertIs(
|
||||||
self.assertIsNone(exc.__context__.__context__.__context__.__context__.__context__)
|
exc.__context__.__context__.__context__.__context__, exc1)
|
||||||
|
self.assertIsNone(
|
||||||
|
exc.__context__.__context__.__context__.__context__.__context__)
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
def test_body_exception_suppress(self):
|
def test_body_exception_suppress(self):
|
||||||
def suppress_exc(*exc_details):
|
def suppress_exc(*exc_details):
|
||||||
return True
|
return True
|
||||||
|
|
||||||
try:
|
try:
|
||||||
with ExitStack() as stack:
|
with ExitStack() as stack:
|
||||||
stack.push(suppress_exc)
|
stack.push(suppress_exc)
|
||||||
1 / 0
|
1/0
|
||||||
except IndexError as exc:
|
except IndexError as exc:
|
||||||
self.fail("Expected no exception, got IndexError")
|
self.fail("Expected no exception, got IndexError")
|
||||||
|
|
||||||
def test_exit_exception_chaining_suppress(self):
|
def test_exit_exception_chaining_suppress(self):
|
||||||
with ExitStack() as stack:
|
with ExitStack() as stack:
|
||||||
stack.push(lambda *exc: True)
|
stack.push(lambda *exc: True)
|
||||||
stack.push(lambda *exc: 1 / 0)
|
stack.push(lambda *exc: 1/0)
|
||||||
stack.push(lambda *exc: {}[1])
|
stack.push(lambda *exc: {}[1])
|
||||||
|
|
||||||
def test_excessive_nesting(self):
|
def test_excessive_nesting(self):
|
||||||
# The original implementation would die with RecursionError here
|
# The original implementation would die with RecursionError here
|
||||||
with ExitStack() as stack:
|
with ExitStack() as stack:
|
||||||
for i in range(5000):
|
for i in range(10000):
|
||||||
stack.callback(int)
|
stack.callback(int)
|
||||||
|
|
||||||
def test_instance_bypass(self):
|
def test_instance_bypass(self):
|
||||||
class Example(object):
|
class Example(object): pass
|
||||||
pass
|
|
||||||
|
|
||||||
cm = Example()
|
cm = Example()
|
||||||
cm.__exit__ = object()
|
cm.__exit__ = object()
|
||||||
stack = ExitStack()
|
stack = ExitStack()
|
||||||
|
@ -414,5 +382,5 @@ class TestExitStack(unittest.TestCase):
|
||||||
self.assertIs(tuple(stack._exit_callbacks)[-1], cm)
|
self.assertIs(tuple(stack._exit_callbacks)[-1], cm)
|
||||||
|
|
||||||
|
|
||||||
if __name__ == "__main__":
|
if __name__ == '__main__':
|
||||||
unittest.main()
|
unittest.main()
|
|
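The tests above exercise `ExitStack`'s core contract: callbacks and context managers registered on the stack are unwound in LIFO order when the `with` block exits. A minimal, runnable sketch of that behaviour (using CPython's stdlib `contextlib`, which this port mirrors; the `Res` class and `log` list are illustrative names, not part of the module):

```python
from contextlib import ExitStack

log = []

class Res:
    """Toy resource that records its enter/exit order."""
    def __init__(self, name):
        self.name = name

    def __enter__(self):
        log.append("enter " + self.name)
        return self

    def __exit__(self, *exc):
        log.append("exit " + self.name)

with ExitStack() as stack:
    for name in ("a", "b"):
        stack.enter_context(Res(name))
    # Registered last, so it runs first on unwind.
    stack.callback(lambda: log.append("callback"))

# Unwinding is LIFO: callback, then b, then a.
assert log == ["enter a", "enter b", "callback", "exit b", "exit a"]
```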
@@ -49,23 +49,14 @@ __getstate__() and __setstate__(). See the documentation for module
"""

import types

# import weakref
# from copyreg import dispatch_table
# import builtins


class Error(Exception):
    pass


error = Error  # backward compatibility

try:
    from collections import OrderedDict
except ImportError:
    OrderedDict = None

try:
    from org.python.core import PyStringMap
except ImportError:

@@ -73,7 +64,6 @@ except ImportError:

__all__ = ["Error", "copy", "deepcopy"]


def copy(x):
    """Shallow copy operation on arbitrary Python objects.

@@ -112,54 +102,33 @@ def copy(x):

_copy_dispatch = d = {}


def _copy_immutable(x):
    return x


for t in (
    type(None),
    int,
    float,
    bool,
    str,
    tuple,
    type,
    range,
    types.BuiltinFunctionType,
    type(Ellipsis),
    types.FunctionType,
):
    d[t] = _copy_immutable
t = getattr(types, "CodeType", None)
if t is not None:
    d[t] = _copy_immutable
# for name in ("complex", "unicode"):
#    t = getattr(builtins, name, None)
#    if t is not None:
#        d[t] = _copy_immutable


def _copy_with_constructor(x):
    return type(x)(x)


for t in (list, dict, set):
    d[t] = _copy_with_constructor
if OrderedDict is not None:
    d[OrderedDict] = _copy_with_constructor


def _copy_with_copy_method(x):
    return x.copy()


if PyStringMap is not None:
    d[PyStringMap] = _copy_with_copy_method

del d


def deepcopy(x, memo=None, _nil=[]):
    """Deep copy operation on arbitrary Python objects.

@@ -203,7 +172,8 @@ def deepcopy(x, memo=None, _nil=[]):
            if reductor:
                rv = reductor()
            else:
                raise Error("un(deep)copyable object of type %s" % cls)
        y = _reconstruct(x, rv, 1, memo)

    # If is its own copy, don't memoize.

@@ -212,14 +182,10 @@ def deepcopy(x, memo=None, _nil=[]):
        _keep_alive(x, memo)  # Make sure x lives at least as long as d
    return y


_deepcopy_dispatch = d = {}


def _deepcopy_atomic(x, memo):
    return x


d[type(None)] = _deepcopy_atomic
d[type(Ellipsis)] = _deepcopy_atomic
d[int] = _deepcopy_atomic

@@ -239,8 +205,7 @@ d[type] = _deepcopy_atomic
d[range] = _deepcopy_atomic
d[types.BuiltinFunctionType] = _deepcopy_atomic
d[types.FunctionType] = _deepcopy_atomic
# d[weakref.ref] = _deepcopy_atomic


def _deepcopy_list(x, memo):
    y = []

@@ -248,11 +213,8 @@ def _deepcopy_list(x, memo):
    for a in x:
        y.append(deepcopy(a, memo))
    return y


d[list] = _deepcopy_list


def _deepcopy_tuple(x, memo):
    y = []
    for a in x:

@@ -270,33 +232,22 @@ def _deepcopy_tuple(x, memo):
    else:
        y = x
    return y


d[tuple] = _deepcopy_tuple


def _deepcopy_dict(x, memo):
    y = type(x)()
    memo[id(x)] = y
    for key, value in x.items():
        y[deepcopy(key, memo)] = deepcopy(value, memo)
    return y


d[dict] = _deepcopy_dict
if OrderedDict is not None:
    d[OrderedDict] = _deepcopy_dict
if PyStringMap is not None:
    d[PyStringMap] = _deepcopy_dict


def _deepcopy_method(x, memo):  # Copy instance methods
    return type(x)(x.__func__, deepcopy(x.__self__, memo))


_deepcopy_dispatch[types.MethodType] = _deepcopy_method


def _keep_alive(x, memo):
    """Keeps a reference to the object x in the memo.

@@ -311,8 +262,7 @@ def _keep_alive(x, memo):
        memo[id(memo)].append(x)
    except KeyError:
        # aha, this is the first one :-)
        memo[id(memo)] = [x]


def _reconstruct(x, info, deep, memo=None):
    if isinstance(info, str):

@@ -343,7 +293,7 @@ def _reconstruct(x, info, deep, memo=None):
    if state:
        if deep:
            state = deepcopy(state, memo)
        if hasattr(y, "__setstate__"):
            y.__setstate__(state)
        else:
            if isinstance(state, tuple) and len(state) == 2:

@@ -369,12 +319,10 @@ def _reconstruct(x, info, deep, memo=None):
            y[key] = value
    return y


del d

del types


# Helper for instance creation without calling __init__
class _EmptyClass:
    pass
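The module above ports CPython's `copy`: `copy()` makes a one-level copy via the `_copy_dispatch` table, while `deepcopy()` recurses and uses the `memo` dict (keyed by `id()`) so shared and self-referential objects are copied once. A short sketch of both behaviours, runnable against CPython's stdlib `copy` which this port follows:

```python
import copy

a = [[1, 2], [3, 4]]
shallow = copy.copy(a)      # new outer list, shared inner lists
deep = copy.deepcopy(a)     # inner lists copied recursively

a[0].append(99)
assert shallow[0] == [1, 2, 99]   # shallow copy sees the mutation
assert deep[0] == [1, 2]          # deep copy does not

# The memo dict is what makes cycles safe:
b = []
b.append(b)                 # self-referential list
c = copy.deepcopy(b)        # no infinite recursion
assert c[0] is c            # the cycle is reproduced in the copy
```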
@@ -0,0 +1,16 @@
import sys
# Remove current dir from sys.path, otherwise distutils will pick up our
# copy module instead of the system one.
sys.path.pop(0)
from setuptools import setup

setup(name='micropython-copy',
      version='0.0.2',
      description='CPython copy module ported to MicroPython',
      url='https://github.com/micropython/micropython/issues/405',
      author='CPython Developers',
      maintainer='MicroPython Developers',
      maintainer_email='micro-python@googlegroups.com',
      license='Python',
      install_requires=['micropython-types'],
      py_modules=['copy'])
@@ -0,0 +1,21 @@
import uasyncio as asyncio


def run1():
    for i in range(1):
        print('Hello World')
        yield from asyncio.sleep(2)
    print("run1 finished")

def run2():
    for i in range(3):
        print('bar')
        yield run1()
        yield from asyncio.sleep(1)


import logging
logging.basicConfig(level=logging.INFO)
loop = asyncio.get_event_loop()
loop.create_task(run2())
loop.run_forever()
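The example above relies on a uasyncio convention: a bare `yield <generator>` from inside a task spawns that generator as a new task, while the yielding task keeps running. A toy round-robin scheduler (not uasyncio itself; `run`, `parent`, `child`, and `log` are illustrative names) showing that convention in plain Python:

```python
from collections import deque

log = []

def child():
    log.append("child")
    yield

def parent():
    log.append("parent")
    yield child()            # bare yield of a generator: spawn it as a task
    log.append("parent again")
    yield

def run(*tasks):
    """Round-robin: step each task once, spawning any yielded generator."""
    queue = deque(tasks)
    while queue:
        task = queue.popleft()
        try:
            result = next(task)
        except StopIteration:
            continue                 # task finished, drop it
        if hasattr(result, "send"):  # a generator was yielded: new task
            queue.append(result)
        queue.append(task)           # reschedule the current task

run(parent())
assert log == ["parent", "child", "parent again"]
```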
@@ -0,0 +1,3 @@
srctype = cpython-backport
type = module
version = 0.2
@@ -0,0 +1,27 @@
This patch shows changes done to asyncio.tasks.Task._step() from CPython 3.4.2.

--- tasks.py	2015-01-01 10:51:40.707114866 +0200
+++ uasyncio.py	2015-01-01 10:54:20.172402890 +0200
@@ -46,13 +55,16 @@
                 # Bare yield relinquishes control for one event loop iteration.
                 self._loop.call_soon(self._step)
             elif inspect.isgenerator(result):
+                #print("Scheduling", result)
+                self._loop.create_task(result)
+                self._loop.call_soon(self._step)
                 # Yielding a generator is just wrong.
-                self._loop.call_soon(
-                    self._step, None,
-                    RuntimeError(
-                        'yield was used instead of yield from for '
-                        'generator in task {!r} with {}'.format(
-                            self, result)))
+#                self._loop.call_soon(
+#                    self._step, None,
+#                    RuntimeError(
+#                        'yield was used instead of yield from for '
+#                        'generator in task {!r} with {}'.format(
+#                            self, result)))
             else:
                 # Yielding something else is an error.
                 self._loop.call_soon(
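The distinction the patch hinges on is what the event loop actually receives: `yield from gen()` delegates, so the inner generator's values flow through to the loop one by one, while a bare `yield gen()` hands the loop the generator object itself (which stock CPython 3.4 asyncio rejected, and this patch instead schedules as a task). A minimal illustration in plain generators:

```python
def inner():
    yield 1
    yield 2

def delegating():
    # yield from: inner's yielded values pass through transparently
    yield from inner()

def bare():
    # bare yield: the caller receives the generator object itself
    yield inner()

assert list(delegating()) == [1, 2]

vals = list(bare())
assert len(vals) == 1 and hasattr(vals[0], "send")  # a generator object
```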
@@ -0,0 +1,18 @@
import sys
# Remove current dir from sys.path, otherwise setuptools will pick up our
# module instead of the system one.
sys.path.pop(0)
from setuptools import setup


setup(name='micropython-cpython-uasyncio',
      version='0.2',
      description='MicroPython module uasyncio ported to CPython',
      long_description='This is MicroPython compatibility module, allowing applications using\nMicroPython-specific features to run on CPython.\n',
      url='https://github.com/micropython/micropython/issues/405',
      author='MicroPython Developers',
      author_email='micro-python@googlegroups.com',
      maintainer='MicroPython Developers',
      maintainer_email='micro-python@googlegroups.com',
      license='Python',
      py_modules=['uasyncio'])
@@ -0,0 +1,99 @@
import inspect
import asyncio
import asyncio.futures as futures
from asyncio import *


OrgTask = Task

class Task(OrgTask):

    def _step(self, value=None, exc=None):
        assert not self.done(), \
            '_step(): already done: {!r}, {!r}, {!r}'.format(self, value, exc)
        if self._must_cancel:
            if not isinstance(exc, futures.CancelledError):
                exc = futures.CancelledError()
            self._must_cancel = False
        coro = self._coro
        self._fut_waiter = None

        self.__class__._current_tasks[self._loop] = self
        # Call either coro.throw(exc) or coro.send(value).
        try:
            if exc is not None:
                result = coro.throw(exc)
            elif value is not None:
                result = coro.send(value)
            else:
                result = next(coro)
        except StopIteration as exc:
            self.set_result(exc.value)
        except futures.CancelledError as exc:
            super().cancel()  # I.e., Future.cancel(self).
        except Exception as exc:
            self.set_exception(exc)
        except BaseException as exc:
            self.set_exception(exc)
            raise
        else:
            if isinstance(result, futures.Future):
                # Yielded Future must come from Future.__iter__().
                if result._blocking:
                    result._blocking = False
                    result.add_done_callback(self._wakeup)
                    self._fut_waiter = result
                    if self._must_cancel:
                        if self._fut_waiter.cancel():
                            self._must_cancel = False
                else:
                    self._loop.call_soon(
                        self._step, None,
                        RuntimeError(
                            'yield was used instead of yield from '
                            'in task {!r} with {!r}'.format(self, result)))
            elif result is None:
                # Bare yield relinquishes control for one event loop iteration.
                self._loop.call_soon(self._step)
            elif inspect.isgenerator(result):
                #print("Scheduling", result)
                self._loop.create_task(result)
                self._loop.call_soon(self._step)
                # Yielding a generator is just wrong.
#                self._loop.call_soon(
#                    self._step, None,
#                    RuntimeError(
#                        'yield was used instead of yield from for '
#                        'generator in task {!r} with {}'.format(
#                            self, result)))
            else:
                # Yielding something else is an error.
                self._loop.call_soon(
                    self._step, None,
                    RuntimeError(
                        'Task got bad yield: {!r}'.format(result)))
        finally:
            self.__class__._current_tasks.pop(self._loop)
            self = None  # Needed to break cycles when an exception occurs.


asyncio.tasks.Task = Task


OrgStreamWriter = StreamWriter

class StreamWriter(OrgStreamWriter):

    def awrite(self, data):
        if isinstance(data, str):
            data = data.encode("utf-8")
        self.write(data)
        yield from self.drain()

    def aclose(self):
        self.close()
        return
        yield


asyncio.streams.StreamWriter = StreamWriter
@@ -0,0 +1,3 @@
srctype = dummy
type = module
version = 0.0.0
@@ -0,0 +1,18 @@
import sys
# Remove current dir from sys.path, otherwise setuptools will pick up our
# module instead of the system one.
sys.path.pop(0)
from setuptools import setup


setup(name='micropython-csv',
      version='0.0.0',
      description='Dummy csv module for MicroPython',
      long_description='This is a dummy implementation of a module for MicroPython standard library.\nIt contains zero or very little functionality, and primarily intended to\navoid import errors (using idea that even if an application imports a\nmodule, it may be not using it on every code path, so may work at least\npartially). It is expected that more complete implementation of the module\nwill be provided later. Please help with the development if you are\ninterested in this module.',
      url='https://github.com/micropython/micropython/issues/405',
      author='MicroPython Developers',
      author_email='micro-python@googlegroups.com',
      maintainer='MicroPython Developers',
      maintainer_email='micro-python@googlegroups.com',
      license='MIT',
      py_modules=['csv'])
@@ -0,0 +1,99 @@
"""Constants and membership tests for ASCII characters"""

NUL = 0x00  # ^@
SOH = 0x01  # ^A
STX = 0x02  # ^B
ETX = 0x03  # ^C
EOT = 0x04  # ^D
ENQ = 0x05  # ^E
ACK = 0x06  # ^F
BEL = 0x07  # ^G
BS = 0x08  # ^H
TAB = 0x09  # ^I
HT = 0x09  # ^I
LF = 0x0a  # ^J
NL = 0x0a  # ^J
VT = 0x0b  # ^K
FF = 0x0c  # ^L
CR = 0x0d  # ^M
SO = 0x0e  # ^N
SI = 0x0f  # ^O
DLE = 0x10  # ^P
DC1 = 0x11  # ^Q
DC2 = 0x12  # ^R
DC3 = 0x13  # ^S
DC4 = 0x14  # ^T
NAK = 0x15  # ^U
SYN = 0x16  # ^V
ETB = 0x17  # ^W
CAN = 0x18  # ^X
EM = 0x19  # ^Y
SUB = 0x1a  # ^Z
ESC = 0x1b  # ^[
FS = 0x1c  # ^\
GS = 0x1d  # ^]
RS = 0x1e  # ^^
US = 0x1f  # ^_
SP = 0x20  # space
DEL = 0x7f  # delete

controlnames = [
    "NUL", "SOH", "STX", "ETX", "EOT", "ENQ", "ACK", "BEL",
    "BS", "HT", "LF", "VT", "FF", "CR", "SO", "SI",
    "DLE", "DC1", "DC2", "DC3", "DC4", "NAK", "SYN", "ETB",
    "CAN", "EM", "SUB", "ESC", "FS", "GS", "RS", "US",
    "SP"
]

def _ctoi(c):
    if type(c) == type(""):
        return ord(c)
    else:
        return c

def isalnum(c): return isalpha(c) or isdigit(c)
def isalpha(c): return isupper(c) or islower(c)
def isascii(c): return _ctoi(c) <= 127          # ?
def isblank(c): return _ctoi(c) in (9, 32)      # tab or space
def iscntrl(c): return _ctoi(c) <= 31
def isdigit(c): return _ctoi(c) >= 48 and _ctoi(c) <= 57
def isgraph(c): return _ctoi(c) >= 33 and _ctoi(c) <= 126
def islower(c): return _ctoi(c) >= 97 and _ctoi(c) <= 122
def isprint(c): return _ctoi(c) >= 32 and _ctoi(c) <= 126
def ispunct(c): return _ctoi(c) != 32 and not isalnum(c)
def isspace(c): return _ctoi(c) in (9, 10, 11, 12, 13, 32)
def isupper(c): return _ctoi(c) >= 65 and _ctoi(c) <= 90
def isxdigit(c): return isdigit(c) or \
    (_ctoi(c) >= 65 and _ctoi(c) <= 70) or (_ctoi(c) >= 97 and _ctoi(c) <= 102)
def isctrl(c): return _ctoi(c) < 32
def ismeta(c): return _ctoi(c) > 127

def ascii(c):
    if type(c) == type(""):
        return chr(_ctoi(c) & 0x7f)
    else:
        return _ctoi(c) & 0x7f

def ctrl(c):
    if type(c) == type(""):
        return chr(_ctoi(c) & 0x1f)
    else:
        return _ctoi(c) & 0x1f

def alt(c):
    if type(c) == type(""):
        return chr(_ctoi(c) | 0x80)
    else:
        return _ctoi(c) | 0x80

def unctrl(c):
    bits = _ctoi(c)
    if bits == 0x7f:
        rep = "^?"
    elif isprint(bits & 0x7f):
        rep = chr(bits & 0x7f)
    else:
        rep = "^" + chr(((bits & 0x7f) | 0x20) + 0x20)
    if bits & 0x80:
        return "!" + rep
    return rep
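A quick sanity check of the predicates and caret-notation helper above. The relevant definitions are re-declared here (in simplified, 7-bit-only form) so the snippet runs standalone; on a target board you would `import curses.ascii` instead:

```python
def _ctoi(c):
    # accept either a one-character string or an integer code point
    return ord(c) if isinstance(c, str) else c

def isupper(c):
    return 65 <= _ctoi(c) <= 90

def islower(c):
    return 97 <= _ctoi(c) <= 122

def isalpha(c):
    return isupper(c) or islower(c)

def unctrl(c):
    # caret notation for 7-bit control characters, e.g. 0x01 -> "^A"
    bits = _ctoi(c)
    if bits == 0x7f:
        return "^?"
    if 32 <= bits <= 126:
        return chr(bits)
    return "^" + chr((bits | 0x20) + 0x20)

print(isalpha("Q"), isalpha("7"))  # True False
print(unctrl(0x01), unctrl("\n"))  # ^A ^J
```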
@@ -0,0 +1,3 @@
srctype = cpython
type = package
version = 3.4.2
@@ -0,0 +1,18 @@
import sys
# Remove current dir from sys.path, otherwise setuptools will pick up our
# module instead of the system one.
sys.path.pop(0)
from setuptools import setup


setup(name='micropython-curses.ascii',
      version='3.4.2',
      description='CPython curses.ascii module ported to MicroPython',
      long_description='This is a module ported from the CPython standard library to be compatible with\nthe MicroPython interpreter. Usually, this means applying small patches for\nfeatures not supported (yet, or at all) in MicroPython. Sometimes, heavier\nchanges are required. Note that CPython modules are written with the availability\nof vast resources in mind, and may not work for MicroPython ports with a\nlimited heap. If you are affected by such a case, please help reimplement\nthe module from scratch.',
      url='https://github.com/micropython/micropython/issues/405',
      author='CPython Developers',
      author_email='python-dev@python.org',
      maintainer='MicroPython Developers',
      maintainer_email='micro-python@googlegroups.com',
      license='Python',
      packages=['curses'])
@@ -0,0 +1,3 @@
srctype=dummy
type=module
version=0.0.1
@@ -0,0 +1,18 @@
import sys
# Remove current dir from sys.path, otherwise setuptools will pick up our
# module instead of the system one.
sys.path.pop(0)
from setuptools import setup


setup(name='micropython-datetime',
      version='0.0.1',
      description='Dummy datetime module for MicroPython',
      long_description='This is a dummy implementation of a module for the MicroPython standard library.\nIt contains zero or very little functionality, and is primarily intended to\navoid import errors (on the idea that even if an application imports a\nmodule, it may not use it on every code path, and so may work at least\npartially). It is expected that a more complete implementation of the module\nwill be provided later. Please help with the development if you are\ninterested in this module.',
      url='https://github.com/micropython/micropython/issues/405',
      author='MicroPython Developers',
      author_email='micro-python@googlegroups.com',
      maintainer='MicroPython Developers',
      maintainer_email='micro-python@googlegroups.com',
      license='MIT',
      py_modules=['datetime'])
@@ -0,0 +1,3 @@
srctype=dummy
type=module
version=0.0.1
@@ -0,0 +1,18 @@
import sys
# Remove current dir from sys.path, otherwise setuptools will pick up our
# module instead of the system one.
sys.path.pop(0)
from setuptools import setup


setup(name='micropython-dbm',
      version='0.0.1',
      description='Dummy dbm module for MicroPython',
      long_description='This is a dummy implementation of a module for the MicroPython standard library.\nIt contains zero or very little functionality, and is primarily intended to\navoid import errors (on the idea that even if an application imports a\nmodule, it may not use it on every code path, and so may work at least\npartially). It is expected that a more complete implementation of the module\nwill be provided later. Please help with the development if you are\ninterested in this module.',
      url='https://github.com/micropython/micropython/issues/405',
      author='MicroPython Developers',
      author_email='micro-python@googlegroups.com',
      maintainer='MicroPython Developers',
      maintainer_email='micro-python@googlegroups.com',
      license='MIT',
      py_modules=['dbm'])
@@ -0,0 +1,3 @@
srctype=dummy
type=module
version=0.0.1
@@ -0,0 +1,18 @@
import sys
# Remove current dir from sys.path, otherwise setuptools will pick up our
# module instead of the system one.
sys.path.pop(0)
from setuptools import setup


setup(name='micropython-decimal',
      version='0.0.1',
      description='Dummy decimal module for MicroPython',
      long_description='This is a dummy implementation of a module for the MicroPython standard library.\nIt contains zero or very little functionality, and is primarily intended to\navoid import errors (on the idea that even if an application imports a\nmodule, it may not use it on every code path, and so may work at least\npartially). It is expected that a more complete implementation of the module\nwill be provided later. Please help with the development if you are\ninterested in this module.',
      url='https://github.com/micropython/micropython/issues/405',
      author='MicroPython Developers',
      author_email='micro-python@googlegroups.com',
      maintainer='MicroPython Developers',
      maintainer_email='micro-python@googlegroups.com',
      license='MIT',
      py_modules=['decimal'])
@@ -0,0 +1,3 @@
srctype=dummy
type=module
version=0.0.1
@@ -0,0 +1,18 @@
import sys
# Remove current dir from sys.path, otherwise setuptools will pick up our
# module instead of the system one.
sys.path.pop(0)
from setuptools import setup


setup(name='micropython-difflib',
      version='0.0.1',
      description='Dummy difflib module for MicroPython',
      long_description='This is a dummy implementation of a module for the MicroPython standard library.\nIt contains zero or very little functionality, and is primarily intended to\navoid import errors (on the idea that even if an application imports a\nmodule, it may not use it on every code path, and so may work at least\npartially). It is expected that a more complete implementation of the module\nwill be provided later. Please help with the development if you are\ninterested in this module.',
      url='https://github.com/micropython/micropython/issues/405',
      author='MicroPython Developers',
      author_email='micro-python@googlegroups.com',
      maintainer='MicroPython Developers',
      maintainer_email='micro-python@googlegroups.com',
      license='MIT',
      py_modules=['difflib'])
@@ -3,11 +3,11 @@
 # Contact: email-sig@python.org

 __all__ = [
-    "Charset",
-    "add_alias",
-    "add_charset",
-    "add_codec",
+    'Charset',
+    'add_alias',
+    'add_charset',
+    'add_codec',
 ]

 from functools import partial

@@ -18,6 +18,7 @@ from email import errors
 from email.encoders import encode_7or8bit


+
 # Flags for types of header encodings
 QP = 1  # Quoted-Printable
 BASE64 = 2  # Base64
@@ -26,82 +27,84 @@ SHORTEST = 3 # the shorter of QP and base64, but only for headers
 # In "=?charset?q?hello_world?=", the =?, ?q?, and ?= add up to 7
 RFC2047_CHROME_LEN = 7

-DEFAULT_CHARSET = "us-ascii"
-UNKNOWN8BIT = "unknown-8bit"
-EMPTYSTRING = ""
+DEFAULT_CHARSET = 'us-ascii'
+UNKNOWN8BIT = 'unknown-8bit'
+EMPTYSTRING = ''


 # Defaults
 CHARSETS = {
     # input header enc body enc output conv
-    "iso-8859-1": (QP, QP, None),
-    "iso-8859-2": (QP, QP, None),
-    "iso-8859-3": (QP, QP, None),
-    "iso-8859-4": (QP, QP, None),
+    'iso-8859-1': (QP, QP, None),
+    'iso-8859-2': (QP, QP, None),
+    'iso-8859-3': (QP, QP, None),
+    'iso-8859-4': (QP, QP, None),
     # iso-8859-5 is Cyrillic, and not especially used
     # iso-8859-6 is Arabic, also not particularly used
     # iso-8859-7 is Greek, QP will not make it readable
     # iso-8859-8 is Hebrew, QP will not make it readable
-    "iso-8859-9": (QP, QP, None),
-    "iso-8859-10": (QP, QP, None),
+    'iso-8859-9': (QP, QP, None),
+    'iso-8859-10': (QP, QP, None),
     # iso-8859-11 is Thai, QP will not make it readable
-    "iso-8859-13": (QP, QP, None),
-    "iso-8859-14": (QP, QP, None),
-    "iso-8859-15": (QP, QP, None),
-    "iso-8859-16": (QP, QP, None),
-    "windows-1252": (QP, QP, None),
-    "viscii": (QP, QP, None),
-    "us-ascii": (None, None, None),
-    "big5": (BASE64, BASE64, None),
-    "gb2312": (BASE64, BASE64, None),
-    "euc-jp": (BASE64, None, "iso-2022-jp"),
-    "shift_jis": (BASE64, None, "iso-2022-jp"),
-    "iso-2022-jp": (BASE64, None, None),
-    "koi8-r": (BASE64, BASE64, None),
-    "utf-8": (SHORTEST, BASE64, "utf-8"),
+    'iso-8859-13': (QP, QP, None),
+    'iso-8859-14': (QP, QP, None),
+    'iso-8859-15': (QP, QP, None),
+    'iso-8859-16': (QP, QP, None),
+    'windows-1252':(QP, QP, None),
+    'viscii': (QP, QP, None),
+    'us-ascii': (None, None, None),
+    'big5': (BASE64, BASE64, None),
+    'gb2312': (BASE64, BASE64, None),
+    'euc-jp': (BASE64, None, 'iso-2022-jp'),
+    'shift_jis': (BASE64, None, 'iso-2022-jp'),
+    'iso-2022-jp': (BASE64, None, None),
+    'koi8-r': (BASE64, BASE64, None),
+    'utf-8': (SHORTEST, BASE64, 'utf-8'),
 }

 # Aliases for other commonly-used names for character sets. Map
 # them to the real ones used in email.
 ALIASES = {
-    "latin_1": "iso-8859-1",
-    "latin-1": "iso-8859-1",
-    "latin_2": "iso-8859-2",
-    "latin-2": "iso-8859-2",
-    "latin_3": "iso-8859-3",
-    "latin-3": "iso-8859-3",
-    "latin_4": "iso-8859-4",
-    "latin-4": "iso-8859-4",
-    "latin_5": "iso-8859-9",
-    "latin-5": "iso-8859-9",
-    "latin_6": "iso-8859-10",
-    "latin-6": "iso-8859-10",
-    "latin_7": "iso-8859-13",
-    "latin-7": "iso-8859-13",
-    "latin_8": "iso-8859-14",
-    "latin-8": "iso-8859-14",
-    "latin_9": "iso-8859-15",
-    "latin-9": "iso-8859-15",
-    "latin_10": "iso-8859-16",
-    "latin-10": "iso-8859-16",
-    "cp949": "ks_c_5601-1987",
-    "euc_jp": "euc-jp",
-    "euc_kr": "euc-kr",
-    "ascii": "us-ascii",
+    'latin_1': 'iso-8859-1',
+    'latin-1': 'iso-8859-1',
+    'latin_2': 'iso-8859-2',
+    'latin-2': 'iso-8859-2',
+    'latin_3': 'iso-8859-3',
+    'latin-3': 'iso-8859-3',
+    'latin_4': 'iso-8859-4',
+    'latin-4': 'iso-8859-4',
+    'latin_5': 'iso-8859-9',
+    'latin-5': 'iso-8859-9',
+    'latin_6': 'iso-8859-10',
+    'latin-6': 'iso-8859-10',
+    'latin_7': 'iso-8859-13',
+    'latin-7': 'iso-8859-13',
+    'latin_8': 'iso-8859-14',
+    'latin-8': 'iso-8859-14',
+    'latin_9': 'iso-8859-15',
+    'latin-9': 'iso-8859-15',
+    'latin_10':'iso-8859-16',
+    'latin-10':'iso-8859-16',
+    'cp949': 'ks_c_5601-1987',
+    'euc_jp': 'euc-jp',
+    'euc_kr': 'euc-kr',
+    'ascii': 'us-ascii',
 }


 # Map charsets to their Unicode codec strings.
 CODEC_MAP = {
-    "gb2312": "eucgb2312_cn",
-    "big5": "big5_tw",
+    'gb2312': 'eucgb2312_cn',
+    'big5': 'big5_tw',
     # Hack: We don't want *any* conversion for stuff marked us-ascii, as all
     # sorts of garbage might be sent to us in the guise of 7-bit us-ascii.
     # Let that stuff pass through without conversion to/from Unicode.
-    "us-ascii": None,
+    'us-ascii': None,
 }


 # Convenience functions for extending the above mappings
 def add_charset(charset, header_enc=None, body_enc=None, output_charset=None):
     """Add character set properties to the global registry.
@@ -127,7 +130,7 @@ def add_charset(charset, header_enc=None, body_enc=None, output_charset=None):
     documentation for more information.
     """
     if body_enc == SHORTEST:
-        raise ValueError("SHORTEST not allowed for body_enc")
+        raise ValueError('SHORTEST not allowed for body_enc')
     CHARSETS[charset] = (header_enc, body_enc, output_charset)


@@ -150,15 +153,17 @@ def add_codec(charset, codecname):
     CODEC_MAP[charset] = codecname


+
+
 # Convenience function for encoding strings, taking into account
 # that they might be unknown-8bit (ie: have surrogate-escaped bytes)
 def _encode(string, codec):
     if codec == UNKNOWN8BIT:
-        return string.encode("ascii", "surrogateescape")
+        return string.encode('ascii', 'surrogateescape')
     else:
         return string.encode(codec)


 class Charset:
     """Map character sets to their email properties.

@@ -203,7 +208,6 @@ class Charset:
     to the output_charset.  If no conversion codec is necessary,
     this attribute will have the same value as the input_codec.
     """
-

     def __init__(self, input_charset=DEFAULT_CHARSET):
         # RFC 2046, $4.1.2 says charsets are not case sensitive.  We coerce to
         # unicode because its .lower() is locale insensitive.  If the argument
@@ -211,9 +215,9 @@ class Charset:
         # charset is ASCII, as the standard (RFC XXX) requires.
         try:
             if isinstance(input_charset, str):
-                input_charset.encode("ascii")
+                input_charset.encode('ascii')
             else:
-                input_charset = str(input_charset, "ascii")
+                input_charset = str(input_charset, 'ascii')
         except UnicodeError:
             raise errors.CharsetError(input_charset)
         input_charset = input_charset.lower()
@@ -222,7 +226,8 @@ class Charset:
         # We can try to guess which encoding and conversion to use by the
         # charset_map dictionary.  Try that first, but let the user override
         # it.
-        henc, benc, conv = CHARSETS.get(self.input_charset, (SHORTEST, BASE64, None))
+        henc, benc, conv = CHARSETS.get(self.input_charset,
+                                        (SHORTEST, BASE64, None))
         if not conv:
             conv = self.input_charset
         # Set the attributes, allowing the arguments to override the default.
@@ -231,8 +236,10 @@ class Charset:
         self.output_charset = ALIASES.get(conv, conv)
         # Now set the codecs.  If one isn't defined for input_charset,
         # guess and try a Unicode codec with the same name as input_codec.
-        self.input_codec = CODEC_MAP.get(self.input_charset, self.input_charset)
-        self.output_codec = CODEC_MAP.get(self.output_charset, self.output_charset)
+        self.input_codec = CODEC_MAP.get(self.input_charset,
+                                         self.input_charset)
+        self.output_codec = CODEC_MAP.get(self.output_charset,
+                                          self.output_charset)

     def __str__(self):
         return self.input_charset.lower()
@@ -260,9 +267,9 @@ class Charset:
         """
         assert self.body_encoding != SHORTEST
         if self.body_encoding == QP:
-            return "quoted-printable"
+            return 'quoted-printable'
         elif self.body_encoding == BASE64:
-            return "base64"
+            return 'base64'
         else:
             return encode_7or8bit

@@ -285,7 +292,7 @@ class Charset:
         output codec.
         :return: The encoded string, with RFC 2047 chrome.
         """
-        codec = self.output_codec or "us-ascii"
+        codec = self.output_codec or 'us-ascii'
         header_bytes = _encode(string, codec)
         # 7bit/8bit encodings return the string unchanged (modulo conversions)
         encoder_module = self._get_encoder(header_bytes)
@@ -311,7 +318,7 @@ class Charset:
         :return: Lines of encoded strings, each with RFC 2047 chrome.
         """
         # See which encoding we should use.
-        codec = self.output_codec or "us-ascii"
+        codec = self.output_codec or 'us-ascii'
         header_bytes = _encode(string, codec)
         encoder_module = self._get_encoder(header_bytes)
         encoder = partial(encoder_module.header_encode, charset=codec)
@@ -344,7 +351,7 @@ class Charset:
             if not lines and not current_line:
                 lines.append(None)
             else:
-                separator = " " if lines else ""
+                separator = (' ' if lines else '')
                 joined_line = EMPTYSTRING.join(current_line)
                 header_bytes = _encode(joined_line, codec)
                 lines.append(encoder(header_bytes))
@@ -397,9 +404,9 @@ class Charset:
             # being bytes has never been nailed down, so fixing that is a
             # longer term TODO.
             if isinstance(string, str):
-                string = string.encode(self.output_charset).decode("latin1")
+                string = string.encode(self.output_charset).decode('latin1')
             return email.quoprimime.body_encode(string)
         else:
             if isinstance(string, str):
-                string = string.encode(self.output_charset).decode("ascii")
+                string = string.encode(self.output_charset).decode('ascii')
             return string