https://pypi.org/project/black/
cli.py was getting a bit untidy due to all of the heavily annotated
click command functions, so I used black to clean it up and make it
easier to read.
* New --plugins-dir=plugins/ option
This new option causes Datasette to load and evaluate all of the Python files
in the specified directory and register any plugins that are defined in those
files (see the example plugin sketch after the commands below).
This new option is available for the following commands:
datasette serve mydb.db --plugins-dir=plugins/
datasette publish now/heroku mydb.db --plugins-dir=plugins/
datasette package mydb.db --plugins-dir=plugins/
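For illustration, a minimal sketch of what a plugin file in plugins/ could
look like, assuming the prepare_connection hook (the hello_world function is
made up for this example):

from datasette import hookimpl

@hookimpl
def prepare_connection(conn):
    # Register a custom SQL function on each database connection
    conn.create_function("hello_world", 0, lambda: "Hello world!")

Any .py file in the directory that registers hooks this way gets picked up
when Datasette starts.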
* Unit tests for --plugins-dir=plugins/
Closes #211
Example usage:
datasette package --static css:extra-css/ --static js:extra-js/ \
sf-trees.db --template-dir templates/ --tag sf-trees --branch master
This creates a local Docker image that includes copies of the templates/,
extra-css/ and extra-js/ directories. You can then run it like this:
docker run -p 8001:8001 sf-trees
For publishing to Zeit Now:
datasette publish now --static css:extra-css/ --static js:extra-js/ \
sf-trees.db --template-dir templates/ --name sf-trees --branch master
Example: https://sf-trees-wbihszoazc.now.sh/sf-trees-02c8ef1/Street_Tree_List
For publishing to Heroku:
datasette publish heroku --static css:extra-css/ --static js:extra-js/ \
sf-trees.db --template-dir templates/ --branch master
Closes #157, #160
You can now tell Datasette to serve static files from a specific location at a
specific mountpoint.
For example:
datasette serve mydb.db --static extra-css:/tmp/static/css
Now if you visit this URL:
http://localhost:8001/extra-css/blah.css
The following file will be served:
/tmp/static/css/blah.css
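Under the hood this kind of mount corresponds to a static route; a rough
sketch of the idea using Sanic (the mountpoint parsing shown here is
illustrative, not necessarily Datasette's exact code):

from sanic import Sanic

app = Sanic(__name__)

# "extra-css:/tmp/static/css" splits into a mountpoint and a directory
mountpoint, path = "extra-css:/tmp/static/css".split(":", 1)
# Serve files from /tmp/static/css/ under /extra-css/
app.static("/{}/".format(mountpoint), path)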
Refs #160
You can now pass an additional argument specifying a directory to look for
custom templates in.
Datasette will fall back on the default templates if a template is not
found in that directory.
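That fallback maps naturally onto Jinja2's ChoiceLoader; a sketch of the
general technique (not necessarily Datasette's exact code):

from jinja2 import ChoiceLoader, Environment, FileSystemLoader, PackageLoader

# Try the custom template directory first, then fall back to the
# default templates bundled with the datasette package
env = Environment(loader=ChoiceLoader([
    FileSystemLoader("templates/"),
    PackageLoader("datasette", "templates"),
]))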
Refs #12, #153
The `datasette publish` and `datasette package` commands both now accept an
optional `--branch` argument. If provided, this can be used to specify a branch
published to GitHub that should be installed into the container.
This makes it easier to test code that has not yet been officially released to
PyPI, e.g.:
datasette publish now mydb.db --branch=master
I tested this by first building and running a container using the new
Dockerfile from #114:
docker build .
docker run -it -p 8001:8001 6c9ca7e29181 /bin/sh
Then I ran this inside the container itself:
apt update && apt-get install wget -y \
&& wget http://www.gaia-gis.it/spatialite-2.3.1/test-2.3.sqlite.gz \
&& gunzip test-2.3.sqlite.gz \
&& mv test-2.3.sqlite test23.sqlite \
&& datasette -h 0.0.0.0 test23.sqlite
I visited this URL to confirm I got an error due to spatialite not being
loaded:
http://localhost:8001/test23-c88bc35?sql=select+ST_AsText%28Geometry%29+from+HighWays+limit+1
Then I checked that loading it with `--load-extension` worked correctly:
datasette -h 0.0.0.0 test23.sqlite \
--load-extension=/usr/lib/x86_64-linux-gnu/mod_spatialite.so
Then, finally, I tested it with the new environment variable option:
SQLITE_EXTENSIONS=/usr/lib/x86_64-linux-gnu/mod_spatialite.so \
datasette -h 0.0.0.0 test23.sqlite
Running it with an invalid environment variable option shows an error:
$ SQLITE_EXTENSIONS=/usr/lib/x86_64-linux-gnu/blah.so datasette \
-h 0.0.0.0 test23.sqlite
Usage: datasette -h [OPTIONS] [FILES]...
Error: Invalid value for "--load-extension": Path "/usr/lib/x86_64-linux-gnu/blah.so" does not exist.
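Both forms work because click options can read their default from an
environment variable, and click's Path type is what produces the "does not
exist" error above; a sketch of how the option could be declared
(illustrative, not necessarily the exact code):

import click

@click.command()
@click.option(
    "--load-extension",
    envvar="SQLITE_EXTENSIONS",
    multiple=True,
    type=click.Path(exists=True),
    help="Path to a SQLite extension to load",
)
def serve(load_extension):
    ...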
Closes #112
You can now run these commands like so:
datasette publish now mydb.db \
--title="My Title" \
--source="Source" \
--source_url="http://www.example.com/" \
--license="CC0" \
--license_url="https://creativecommons.org/publicdomain/zero/1.0/"
This will write those values into the metadata.json that is packaged with the
app. If you also pass --metadata=, that file will be updated with the extra
values before being written into the Docker image.
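A sketch of that merge step (merge_metadata is a hypothetical helper, not
Datasette's actual function):

import json

def merge_metadata(metadata_path, **extras):
    # Start from the user-supplied metadata.json, if one was given
    metadata = {}
    if metadata_path:
        with open(metadata_path) as fp:
            metadata = json.load(fp)
    # Values passed on the command line are layered on top
    for key, value in extras.items():
        if value:
            metadata[key] = value
    return metadata

# e.g. merge_metadata("metadata.json", title="My Title", license="CC0")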
Closes #92
The serve command now accepts --sql_time_limit_ms for customizing the SQL time
limit.
The publish and package commands now accept --extra-options which can be used
to specify additional options to be passed to the datasette serve command when
it executes inside the resulting Docker containers.
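For example (values here are illustrative):

datasette serve mydb.db --sql_time_limit_ms=2500

datasette publish now mydb.db --extra-options="--sql_time_limit_ms=2500"

The second command bakes the extra options into the container, so the
datasette serve process inside it runs with a 2.5 second SQL time limit.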
If someone executes 'select * from table' against a table with a million rows
in it, we could run into problems: just serializing that much data as JSON is
likely to lock up the server.
Solution: we now have a hard limit on the maximum number of rows that can be
returned by a query. If that limit is exceeded, the response includes a
`"truncated": true` field in its JSON.
This limit can be optionally controlled by the new `--max_returned_rows`
option. Setting that option to 0 disables the limit entirely.
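A sketch of the truncation check (names are illustrative, not Datasette's
internals):

import sqlite3

def run_query(conn, sql, max_returned_rows=1000):
    cursor = conn.execute(sql)
    if not max_returned_rows:
        # 0 disables the limit entirely
        return {"rows": cursor.fetchall(), "truncated": False}
    # Fetch one more row than the limit so truncation can be detected
    rows = cursor.fetchmany(max_returned_rows + 1)
    truncated = len(rows) > max_returned_rows
    return {"rows": rows[:max_returned_rows], "truncated": truncated}

# conn = sqlite3.connect("fivethirtyeight.db")
# run_query(conn, "select * from mytable")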
Closes #69
Example usage:
datasette package fivethirtyeight.db \
--tag fivethirtyeight \
--metadata=538-metadata.json
This will create a temporary directory, generate a Dockerfile, copy in the
SQLite database and metadata file, then build a new Docker image and tag it
in your local Docker repository as fivethirtyeight:latest.
You can then run the image like so:
docker run -p 8006:8001 fivethirtyeight
This will expose port 8001 in the container (the default) as port 8006 on your
host.
Closes #67
If provided, the --metadata option is the path to a JSON file containing
metadata that should be displayed alongside the dataset.
datasette /tmp/fivethirtyeight.db --metadata /tmp/metadata.json
Currently that metadata format looks like this:
{
    "title": "Five Thirty Eight",
    "license": "CC Attribution 4.0 License",
    "license_url": "http://creativecommons.org/licenses/by/4.0/",
    "source": "fivethirtyeight/data on GitHub",
    "source_url": "https://github.com/fivethirtyeight/data"
}
If provided, this will be used by the index template and to populate the
common footer.
The publish command also accepts this argument, and will package any provided
metadata up and include it with the resulting Docker container.
datasette publish --metadata /tmp/metadata.json /tmp/fivethirtyeight.db
Closes #68
Building metadata is now optional. If you want to do it, do this:
datasette build *.db --metadata=metadata.json
Then when you run the server you can tell it to read from metadata:
datasette serve *.db --metadata=metadata.json
The Dockerfile generated by datasette publish now uses this mechanism.
Closes #60