Commits graph

141 commits (1d64c9a8dac45b9a3452acf8e76dfadea2b0bc49)

Author SHA1 Message Date
Simon Willison 2b344f6a34
Ran black against datasette/cli.py
https://pypi.org/project/black/

cli.py was getting a bit untidy due to all of the heavily annotated
click functions, so I used black to clean it up and make it easier
to read.
2018-04-18 07:52:17 -07:00
Simon Willison b2955d9065
New --plugins-dir=plugins/ option (#212)
* New --plugins-dir=plugins/ option

This new option causes Datasette to load and evaluate all of the Python files
in the specified directory and register any plugins defined in those files.

This new option is available for the following commands:

    datasette serve mydb.db --plugins-dir=plugins/
    datasette publish now/heroku mydb.db --plugins-dir=plugins/
    datasette package mydb.db --plugins-dir=plugins/

* Unit tests for --plugins-dir=plugins/

Closes #211
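The loading behaviour this commit describes can be sketched in plain Python. This is a hypothetical illustration using importlib, not Datasette's actual plugin registration code; `load_plugins`, the `my_plugin.py` file, and the `render_cell` function are all invented for the example:

```python
# Hypothetical sketch of --plugins-dir: load and execute every .py file
# found in a directory, collecting the resulting modules.
import importlib.util
import os
import tempfile

def load_plugins(plugins_dir):
    modules = []
    for name in sorted(os.listdir(plugins_dir)):
        if not name.endswith(".py"):
            continue
        path = os.path.join(plugins_dir, name)
        spec = importlib.util.spec_from_file_location(name[:-3], path)
        module = importlib.util.module_from_spec(spec)
        spec.loader.exec_module(module)
        modules.append(module)
    return modules

# Demo: a plugins/ directory containing a single plugin file
with tempfile.TemporaryDirectory() as plugins_dir:
    with open(os.path.join(plugins_dir, "my_plugin.py"), "w") as f:
        f.write("def render_cell(value):\n    return value.upper()\n")
    plugins = load_plugins(plugins_dir)

print(plugins[0].render_cell("hello"))  # HELLO
```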
2018-04-15 22:22:01 -07:00
Russ Garrett 8bfeb98478
Tidy up units support
* Add units to exported JSON
* Units key in metadata skeleton
* Docs
2018-04-14 11:43:34 +01:00
Simon Willison c195ee4d46
package and publish commands now accept --static and --template-dir
Example usage:

    datasette package --static css:extra-css/ --static js:extra-js/ \
        sf-trees.db --template-dir templates/ --tag sf-trees --branch master

This creates a local Docker image that includes copies of the templates/,
extra-css/ and extra-js/ directories. You can then run it like this:

    docker run -p 8001:8001 sf-trees

For publishing to Zeit now:

    datasette publish now --static css:extra-css/ --static js:extra-js/ \
        sf-trees.db --template-dir templates/ --name sf-trees --branch master

Example: https://sf-trees-wbihszoazc.now.sh/sf-trees-02c8ef1/Street_Tree_List

For publishing to Heroku:

    datasette publish heroku --static css:extra-css/ --static js:extra-js/ \
        sf-trees.db --template-dir templates/ --branch master

Closes #157, #160
2017-12-09 10:28:49 -08:00
Simon Willison 2cc14a236c
Ditched short form options for --static and --template-dir
The -t clashes with the package --tag option
2017-12-08 19:47:50 -08:00
Simon Willison 0539905806
Renamed "datasette build" command to "datasette inspect"
Closes #130
2017-12-07 08:57:31 -08:00
Simon Willison 515eaa8ccb
--reload now reloads on metadata changes too 2017-12-07 08:42:28 -08:00
Simon Willison 32cf5a4a72
New datasette skeleton command for generating metadata.json
Closes #164
2017-12-06 22:20:37 -08:00
Simon Willison e981ac7d4d
--static option for datasette serve
You can now tell Datasette to serve static files from a specific location at a
specific mountpoint.

For example:

    datasette serve mydb.db --static extra-css:/tmp/static/css

Now if you visit this URL:

    http://localhost:8001/extra-css/blah.css

The following file will be served:

    /tmp/static/css/blah.css
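The mountpoint-to-directory mapping above can be sketched like this; `parse_static_option` and `resolve` are illustrative names invented for this example, not Datasette's routing code:

```python
# Hypothetical sketch of --static: "extra-css:/tmp/static/css" maps URLs
# under /extra-css/ to files under /tmp/static/css.
import os

def parse_static_option(value):
    # "extra-css:/tmp/static/css" -> ("extra-css", "/tmp/static/css")
    mount, _, root = value.partition(":")
    return mount, root

def resolve(url_path, mounts):
    for mount, root in mounts:
        prefix = "/" + mount + "/"
        if url_path.startswith(prefix):
            return os.path.join(root, url_path[len(prefix):])
    return None  # not a static URL

mounts = [parse_static_option("extra-css:/tmp/static/css")]
print(resolve("/extra-css/blah.css", mounts))  # /tmp/static/css/blah.css
```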

Refs #160
2017-12-03 08:33:36 -08:00
Simon Willison 7ff51598c4
git commit -m "datasette --template-dir=mytemplates/" argument
You can now pass an additional argument specifying a directory to look for
custom templates in.

Datasette will fall back on the default templates if a template is not
found in that directory.
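The fallback behaviour can be sketched as a simple directory search: check the custom directory first, then the default templates directory. `find_template` is an invented helper for illustration, not Datasette's template machinery:

```python
# Hypothetical sketch of --template-dir fallback: a template in the custom
# directory wins; otherwise the default template is used.
import os
import tempfile

def find_template(name, custom_dir, default_dir):
    for directory in (custom_dir, default_dir):
        candidate = os.path.join(directory, name)
        if os.path.exists(candidate):
            return candidate
    raise FileNotFoundError(name)

with tempfile.TemporaryDirectory() as custom_dir, \
        tempfile.TemporaryDirectory() as default_dir:
    # Only the default template exists, so it is the one found
    with open(os.path.join(default_dir, "table.html"), "w") as f:
        f.write("default")
    fallback = find_template("table.html", custom_dir, default_dir)
    # Adding a custom template overrides the default
    with open(os.path.join(custom_dir, "table.html"), "w") as f:
        f.write("custom")
    override = find_template("table.html", custom_dir, default_dir)
```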

Refs #12, #153
2017-11-30 08:05:01 -08:00
Simon Willison 36701c8592
datasette build now takes --load-extension 2017-11-26 15:02:01 -08:00
Simon Willison fb505de11c
Back-ported format strings for compatibility with Py 3.5
Also fixed an encoding error where the Heroku --json output needs to be treated as UTF-8
2017-11-22 09:42:29 -08:00
Jacob Kaplan-Moss de42240afd Some bug fixes. 2017-11-21 10:51:58 -08:00
Jacob Kaplan-Moss 75450abbe8 Merge branch 'master' into publish-heroku 2017-11-21 10:19:42 -08:00
Jacob Kaplan-Moss 1f79be7e4e More error checking and docs 2017-11-21 10:10:48 -08:00
Simon Willison ddc808f387
Added --build=master option to datasette publish and package
The `datasette publish` and `datasette package` commands both now accept an
optional `--build` argument. If provided, this can be used to specify a branch
published to GitHub that should be built into the container.

This makes it easier to test code that has not yet been officially released to
PyPI, e.g.:

    datasette publish now mydb.db --branch=master
2017-11-19 10:20:17 -08:00
Simon Willison 80ada4dbb3
Added 'datasette --version' support
Using http://click.pocoo.org/5/api/#click.version_option
2017-11-18 21:59:16 -08:00
Jacob Kaplan-Moss 54d58ef690 Merge branch 'master' into publish-heroku 2017-11-17 13:36:50 -08:00
Jacob Kaplan-Moss 6eb23d2143 Moved `datasette build` to a post_compile hook. 2017-11-17 12:09:01 -08:00
Simon Willison 03572ae355 Allow --load-extension to be set via environment variable
I tested this by first building and running a container using the new
Dockerfile from #114:

    docker build .
    docker run -it -p 8001:8001 6c9ca7e29181 /bin/sh

Then I ran this inside the container itself:

    apt update && apt-get install wget -y \
        && wget http://www.gaia-gis.it/spatialite-2.3.1/test-2.3.sqlite.gz \
        && gunzip test-2.3.sqlite.gz \
        && mv test-2.3.sqlite test23.sqlite \
        && datasette -h 0.0.0.0 test23.sqlite

I visited this URL to confirm I got an error due to spatialite not being
loaded:

    http://localhost:8001/test23-c88bc35?sql=select+ST_AsText%28Geometry%29+from+HighWays+limit+1

Then I checked that loading it with `--load-extension` worked correctly:

    datasette -h 0.0.0.0 test23.sqlite \
        --load-extension=/usr/lib/x86_64-linux-gnu/mod_spatialite.so

Then, finally, I tested it with the new environment variable option:

    SQLITE_EXTENSIONS=/usr/lib/x86_64-linux-gnu/mod_spatialite.so \
        datasette -h 0.0.0.0 test23.sqlite

Running it with an invalid environment variable option shows an error:

    $ SQLITE_EXTENSIONS=/usr/lib/x86_64-linux-gnu/blah.so datasette \
        -h 0.0.0.0 test23.sqlite
    Usage: datasette -h [OPTIONS] [FILES]...

    Error: Invalid value for "--load-extension": Path "/usr/lib/x86_64-linux-gnu/blah.so" does not exist.
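The precedence shown in this test run can be sketched as a simple fallback; `get_load_extensions` is an invented helper for illustration, not the actual option handling:

```python
# Hypothetical sketch: an explicit --load-extension value wins; otherwise
# fall back to the SQLITE_EXTENSIONS environment variable.
import os

def get_load_extensions(cli_value=None):
    return cli_value or os.environ.get("SQLITE_EXTENSIONS")

os.environ["SQLITE_EXTENSIONS"] = "/usr/lib/x86_64-linux-gnu/mod_spatialite.so"
print(get_load_extensions())             # /usr/lib/x86_64-linux-gnu/mod_spatialite.so
print(get_load_extensions("/tmp/a.so"))  # /tmp/a.so
```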

Closes #112
2017-11-17 06:13:35 -08:00
Simon Willison b7c4165346 Added --load-extension argument to datasette serve
Allows loading of SQLite extensions. Refs #110.
2017-11-16 08:48:49 -08:00
Jacob Kaplan-Moss f48cb705d8 Initial cut at `datasette publish heroku`
Rather gross, but proves that it works.
2017-11-15 11:53:00 -08:00
Simon Willison ea183b2ae3 Default to 127.0.0.1 not 0.0.0.0
Closes #98
2017-11-14 21:08:46 -08:00
Simon Willison 7fe1e8b482 Added extra metadata options to publish and package commands
You can now run these commands like so:

    datasette publish now mydb.db \
        --title="My Title" \
        --source="Source" \
        --source_url="http://www.example.com/" \
        --license="CC0" \
        --license_url="https://creativecommons.org/publicdomain/zero/1.0/"

This will write those values into the metadata.json that is packaged with the
app. If you also pass --metadata= that file will be updated with the extra
values before being written into the Docker image.
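The merge described above can be sketched as follows; `build_metadata` is an invented name, not Datasette's internal API, shown only to illustrate starting from the --metadata file and overlaying the extra command-line values:

```python
# Hypothetical sketch: load the --metadata file (if any), then overlay
# any extra values supplied on the command line.
import json
import tempfile

def build_metadata(metadata_path=None, **extras):
    metadata = {}
    if metadata_path:
        with open(metadata_path) as f:
            metadata = json.load(f)
    metadata.update({k: v for k, v in extras.items() if v is not None})
    return metadata

with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
    json.dump({"title": "Old Title", "source": "Source"}, f)

metadata = build_metadata(f.name, title="My Title", license="CC0")
print(metadata)  # {'title': 'My Title', 'source': 'Source', 'license': 'CC0'}
```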

Closes #92
2017-11-14 21:02:11 -08:00
Simon Willison fc7c04fe0b Added 'datasette publish now --force' option
This calls now with --force, which is useful because it means you get a fresh
copy of Datasette even if now has already cached that Docker layer.
2017-11-13 17:48:03 -08:00
Simon Willison 1e698787a4 Added --sql_time_limit_ms and --extra-options
The serve command now accepts --sql_time_limit_ms for customizing the SQL time
limit.

The publish and package commands now accept --extra-options, which can be used
to specify additional options to be passed to the datasette serve command when
it executes inside the resulting Docker containers.
2017-11-13 14:00:53 -08:00
Simon Willison 8252e71da4 Limit on max rows returned, controlled by --max_returned_rows option
If someone executes 'select * from table' against a table with a million rows
in it, we could run into problems: just serializing that much data as JSON is
likely to lock up the server.

Solution: we now have a hard limit on the maximum number of rows that can be
returned by a query. If that limit is exceeded, the server will return a
`"truncated": true` field in the JSON.

This limit can be optionally controlled by the new `--max_returned_rows`
option. Setting that option to 0 disables the limit entirely.
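The limit behaviour described above can be sketched against an in-memory SQLite database. This is a hedged illustration of the limit-plus-one detection trick, not Datasette's actual query code; `run_query` is an invented helper:

```python
# Hypothetical sketch: fetch up to max_returned_rows + 1 rows and use the
# presence of the extra row to detect that the result set was truncated.
import sqlite3

def run_query(conn, sql, max_returned_rows=1000):
    cursor = conn.execute(sql)
    if max_returned_rows == 0:  # 0 disables the limit entirely
        return {"rows": cursor.fetchall(), "truncated": False}
    rows = cursor.fetchmany(max_returned_rows + 1)
    truncated = len(rows) > max_returned_rows
    return {"rows": rows[:max_returned_rows], "truncated": truncated}

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (n INTEGER)")
conn.executemany("INSERT INTO t VALUES (?)", [(i,) for i in range(10)])
result = run_query(conn, "SELECT n FROM t", max_returned_rows=5)
print(len(result["rows"]), result["truncated"])  # 5 True
```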

Closes #69
2017-11-13 11:33:01 -08:00
Simon Willison e838bd743d Added README and improved help for 'datasette serve' 2017-11-13 10:41:59 -08:00
Simon Willison 20d41c8e8e publish now takes a required publisher argument - only current option is 'now'
Closes #76
2017-11-13 10:40:51 -08:00
Simon Willison 97c4bf4271 Added --cors argument to enable CORS
Closes #75
2017-11-13 10:17:42 -08:00
Simon Willison 495407acef Force initial .inspect() before starting server
Otherwise there is a long pause on the first request made.
2017-11-13 10:03:52 -08:00
Simon Willison 4143e3b45c New command: datasette package - packages a docker container
Example usage:

    datasette package fivethirtyeight.db \
        --tag fivethirtyeight \
        --metadata=538-metadata.json

This will create a temporary directory, generate a Dockerfile, copy in the
SQLite database and metadata file, then build that as a new Docker image and
tag it in your local Docker repository as fivethirtyeight:latest.

You can then run the image like so:

    docker run -p 8006:8001 fivethirtyeight

This will expose port 8001 in the container (the default) as port 8006 on your
host.

Closes #67
2017-11-13 08:17:35 -08:00
Simon Willison 3ef35ca8b4 serve and publish commands now take a --metadata option
If provided, the --metadata option is the path to a JSON file containing
metadata that should be displayed alongside the dataset.

    datasette /tmp/fivethirtyeight.db --metadata /tmp/metadata.json

Currently that metadata format looks like this:

    {
        "title": "Five Thirty Eight",
        "license": "CC Attribution 4.0 License",
        "license_url": "http://creativecommons.org/licenses/by/4.0/",
        "source": "fivethirtyeight/data on GitHub",
        "source_url": "https://github.com/fivethirtyeight/data"
    }

If provided, this will be used by the index template and to populate the
common footer.

The publish command also accepts this argument, and will package any provided
metadata up and include it with the resulting Docker container.

    datasette publish --metadata /tmp/metadata.json /tmp/fivethirtyeight.db

Closes #68
2017-11-13 07:20:02 -08:00
Simon Willison ff2fb573cd datasette publish --name=now-accepts-name
Fixes #72
2017-11-12 18:12:21 -08:00
Simon Willison 4c66097d58 datasette publish now works with full paths
e.g. datasette publish /tmp/blah/database.db now does the right thing
2017-11-12 15:16:24 -08:00
Simon Willison 40a563ebac Reworked metadata building options
Building metadata is now optional. If you want to do it, do this:

    datasette build *.db --metadata=metadata.json

Then when you run the server you can tell it to read from metadata:

    datasette serve *.db --metadata=metadata.json

The Dockerfile generated by datasette publish now uses this mechanism.

Closes #60
2017-11-11 12:11:51 -08:00
Simon Willison 3863a30b5d publish command checks 'now' is installed
Closes #58
2017-11-11 08:00:00 -08:00
Simon Willison 65e350ca2a Implemented 'datasette publish one.db two.db' command
Closes #26
2017-11-10 23:25:22 -08:00
Simon Willison e9fce44195 Don't serve cache headers in debug or refresh modes 2017-11-10 12:26:37 -08:00
Simon Willison 1c57bd202f Replaced app_factory with new Datasette class
This should make it easier to add unit tests.
2017-11-10 11:05:57 -08:00
Simon Willison e7e50875d3 Renamed to 'datasette' 2017-11-10 10:38:35 -08:00