The index page now only shows row counts for immutable databases, or for
databases with fewer than 30 tables where a count for each of those tables
could be obtained in less than 10ms.
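Roughly, the counting logic works like this sketch, which uses SQLite's
progress handler to abandon counts that exceed the budget (parameter names
here are illustrative, not the actual implementation):

    import sqlite3
    import time

    def table_counts(conn, tables, time_limit_ms=10):
        # Count rows in each table, giving up entirely if any single count
        # exceeds the time limit.
        counts = {}
        for table in tables:
            deadline = time.monotonic() + time_limit_ms / 1000
            # The handler runs every N SQLite VM instructions; returning a
            # non-zero value interrupts the current query.
            conn.set_progress_handler(
                lambda: 1 if time.monotonic() > deadline else 0, 1000
            )
            try:
                counts[table] = conn.execute(
                    "select count(*) from [{}]".format(table)
                ).fetchone()[0]
            except sqlite3.OperationalError:
                return None  # a count timed out, so don't show counts at all
            finally:
                conn.set_progress_handler(None, 1000)
        return counts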
Closes #467, Refs #460
If we have fewer than 5 tables we now also show one or more views in the
summary on the homepage.
Also corrected the logic for the row counts - we now count hidden and
visible tables separately.
Closes #373, Refs #460
Moved the VirtualSpatialIndex check into a new mechanism that should allow
us to add further sanity checks in the future.
To test this I've had to commit a binary sample SpatiaLite database to
the repository. I included a build script for creating that database.
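For illustration, the check might look something like this sketch (the
function name and error message are assumptions, not the actual code):

    import sqlite3

    def check_spatialite(path):
        # Detect tables that use the VirtualSpatialIndex module, which only
        # works once the SpatiaLite extension has been loaded.
        conn = sqlite3.connect(path)
        uses_spatial_index = conn.execute(
            "select count(*) from sqlite_master"
            " where sql like '%VirtualSpatialIndex%'"
        ).fetchone()[0]
        if uses_spatial_index:
            raise RuntimeError(
                "{} requires the SpatiaLite extension"
                " (it uses VirtualSpatialIndex)".format(path)
            )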
Closes #466
Refs #462
* inspect command now just outputs table counts
* test_inspect.py now only contains tests for that CLI command
* Updated some relevant documentation
* Removed docs for /-/inspect since that is about to change
Fix for this error:
$ now alias --token=$NOW_TOKEN
> WARN! The `now alias` command (no arguments) was deprecated in favour of `now --target production`.
> Error! Couldn't find a deployment to alias. Please provide one as an argument.
The command "now alias --token=$NOW_TOKEN" exited with 1.
https://travis-ci.org/simonw/datasette/jobs/530597261
This command:
python tests/fixtures.py \
fixtures.db \
metadata.json \
fixtures-plugins/
will now create the fixtures.db and metadata.json files, AND create
a folder called fixtures-plugins/ containing two test plugins.
You can then run it like this:
datasette fixtures.db \
-m metadata.json --plugins-dir=fixtures-plugins/
ASGI cannot differentiate between / and %2F in a URL, so we need an
alternative scheme for encoding the names of tables that contain special
characters such as /.
For background, see
https://github.com/django/asgiref/issues/51#issuecomment-450603464
Some examples:
"table/and/slashes" => "tableU+002FandU+002Fslashes"
"~table" => "U+007Etable"
"+bobcats!" => "U+002Bbobcats!"
"U+007Etable" => "UU+002B007Etable"
I've run the black code formatting tool against everything:
black tests datasette setup.py
I also added a new unit test, in tests/test_black.py, which will fail if the code does not
conform to black's exacting standards.
This unit test only runs on Python 3.6 or higher, because black itself doesn't run on 3.5.
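The test is roughly along these lines (a sketch of the idea, not the exact
contents of tests/test_black.py):

    import subprocess
    import sys
    from pathlib import Path

    import pytest

    code_root = Path(__file__).parent.parent

    @pytest.mark.skipif(
        sys.version_info[:2] < (3, 6), reason="black requires Python 3.6+"
    )
    def test_black():
        # black --check exits non-zero if any file would be reformatted
        result = subprocess.run(
            [sys.executable, "-m", "black", "--check",
             "tests", "datasette", "setup.py"],
            cwd=str(code_root),
        )
        assert result.returncode == 0, "Run: black tests datasette setup.py"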
Binary columns (including SpatiaLite geographies) get shown as ugly
binary strings in the HTML by default. Nobody wants to see that mess.
We now show the size of the binary value in bytes instead. If you want to
decode the binary data, you can use a plugin to do it.
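For example, a plugin could use the render_cell hook to decode binary values
itself - a quick sketch (the hook arguments shown are approximate, and the
hex decoding is just illustrative):

    from datasette import hookimpl

    @hookimpl
    def render_cell(value, column, table, database, datasette):
        # Only handle binary values; returning None falls back to the
        # default rendering (which now shows the size in bytes).
        if isinstance(value, bytes):
            return value.hex()
        return None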
At the moment it's not easy to tell whether the hook is being called
in (for example) the row or table view, as in both cases the
`database` and `table` parameters are provided.
This passes the `view_name` added in #441 to the `extra_body_script`
hook.
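A plugin can then branch on that value - something like this sketch (the
argument list here is my assumption of the hook's current signature):

    from datasette import hookimpl

    @hookimpl
    def extra_body_script(template, database, table, view_name, datasette):
        # Only inject extra JavaScript on the table view, not the row view
        if view_name == "table":
            return "console.log('table view');"
        return ""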
This stops my automatic editor linting from flagging lines which are too
long. It's been lingering in my checkout for ages.
160 is an arbitrarily chosen large number - we could alter it if we have any
opinions (but I find the line length limit to be my least favourite part
of PEP 8).
Datasette previously only supported one type of faceting: exact column value counting.
With this change, faceting logic is extracted into one or more separate classes which can implement other patterns of faceting. This is discussed in #427; potential upcoming facet types include facet-by-date, facet-by-JSON-array, facet-by-many-2-many and more.
A new plugin hook, register_facet_classes, can be used by plugins to add in additional facet classes.
Each class must implement two methods: suggest(), which scans columns in the table to decide if they might be worth suggesting for faceting, and facet_results(), which executes the facet operation and returns results ready to be displayed in the UI.
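In outline, a custom facet class registered through the new hook might look
like this sketch (the Facet base class import path and the return shapes are
assumptions based on the description above):

    from datasette import hookimpl
    from datasette.facets import Facet  # assumed import path

    class ExampleFacet(Facet):
        type = "example"

        async def suggest(self):
            # Return suggestions for columns that look worth faceting,
            # e.g. a list of {"name": ..., "toggle_url": ...} dictionaries
            return []

        async def facet_results(self):
            # Execute the facet queries and return results ready for the UI,
            # plus a list of any facets that timed out
            facet_results = []
            timed_out = []
            return facet_results, timed_out

    @hookimpl
    def register_facet_classes():
        return [ExampleFacet]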
Also introduced a mechanism whereby table counts are calculated against a time limit,
while immutable databases have their table counts calculated once on server startup.
Thanks @russss!
* Add register_output_renderer hook
This changeset refactors out the JSON renderer and then adds a hook and
dispatcher system to allow custom output renderers to be registered.
The CSV output renderer is untouched because supporting streaming
renderers through this system would be significantly more complex, and
probably not worthwhile.
We can't simply allow hooks to be called at request time because we need
a list of supported file extensions when the request is being routed in
order to resolve ambiguous database/table names. So, renderers need to
be registered at startup.
I've tried to make this API independent of Sanic's request/response
objects so that this can remain stable during the switch to ASGI. I'm
using dictionaries to keep it simple and to make adding additional
options in the future easy.
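A registered renderer then looks something like this sketch (the dictionary
keys and callback arguments are my reading of the API described above, so
treat them as assumptions):

    from datasette import hookimpl

    def render_text(args, data, view_name):
        # data is a dictionary of query results; return a dictionary
        # describing the response
        body = "\n".join(
            "\t".join(str(value) for value in row) for row in data["rows"]
        )
        return {
            "body": body,
            "content_type": "text/plain; charset=utf-8",
            "status_code": 200,
        }

    @hookimpl
    def register_output_renderer(datasette):
        return {
            "extension": "txt",
            "callback": render_text,
        }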
Fixes #440